fix: max tokens #820

Draft · wants to merge 1 commit into main
Conversation

himself65 (Member)

vercel bot commented May 8, 2024

The latest updates on your projects:

llama-index-ts-docs: ✅ Ready · Visit Preview · Updated May 9, 2024 3:15am (UTC)


changeset-bot bot commented May 8, 2024

⚠️ No Changeset found

Latest commit: f1906bf

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types.


```diff
   super();
   this.messages = init?.messages ?? [];
   this.messagesBefore = this.messages.length;
   this.summaryPrompt = init?.summaryPrompt ?? defaultSummaryPrompt;
   this.llm = init?.llm ?? new OpenAI();
-  if (!this.llm.metadata.maxTokens) {
+  if (!this.llm.metadata.maxTokens || !init?.maxTokens) {
```
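For context, a hedged reconstruction of how this guard presumably continues; the thrown error and the token-budget computation are inferred from the review thread below, not shown in the diff:

```ts
// Hedged sketch, not the PR's verbatim code: the error message and the
// tokensToSummarize computation are assumptions based on the review thread,
// which calls a missing maxTokens "a configuration error".
if (!this.llm.metadata.maxTokens || !init?.maxTokens) {
  throw new Error("LLM maxTokens is not set");
}
// Assumed: the summarizer's budget is whatever the context window leaves
// after reserving room for the model's completion tokens.
this.tokensToSummarize =
  this.llm.metadata.contextWindow - this.llm.metadata.maxTokens;
```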
himself65 (Member, Author):
Should we use Infinity if it's undefined?

Collaborator:
The code below won't work if maxTokens is not specified, so I think it's OK to throw this error; it's basically a configuration error.

This class also needs a tokenizer to work correctly (and with Ollama it could be any LLM with any tokenizer).

So I think we should set tokenizer to globalsHelper.defaultTokenizer.encode only if the LLM is OpenAI, and throw an error if the tokenizer is not explicitly specified otherwise.
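A minimal sketch of that proposal as a standalone helper; the Tokenizer type, the function shape, and the error message are illustrative assumptions, while globalsHelper.defaultTokenizer.encode and the OpenAI check come from the comment above:

```ts
// Hypothetical helper sketching the proposed tokenizer selection.
// Assumes `LLM` and `OpenAI` are the package's own types, and that
// globalsHelper.defaultTokenizer.encode has this signature.
type Tokenizer = (text: string) => Uint32Array;

function resolveTokenizer(llm: LLM, explicit?: Tokenizer): Tokenizer {
  // An explicitly supplied tokenizer always wins.
  if (explicit) return explicit;
  // Only OpenAI models are known to match the default tokenizer.
  if (llm instanceof OpenAI) return globalsHelper.defaultTokenizer.encode;
  // Any other LLM (e.g. one served by Ollama) may use a different
  // tokenizer; counting tokens with the wrong one would mis-size the
  // summarization budget, so treat it as a configuration error.
  throw new Error(
    "Please specify a tokenizer; the default tokenizer only matches OpenAI models.",
  );
}
```

The constructor could then call this helper instead of unconditionally defaulting, so non-OpenAI setups fail fast rather than miscounting tokens.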

Labels: none yet
Projects: none yet
Issues closed by merging: none yet
Participants: 2