
feat: init @llamaindex/core #938

Merged: 17 commits into run-llama:main from llamaindex/core, Jun 26, 2024
Conversation

himself65 (Member) commented Jun 17, 2024

Background

  1. Cloudflare Workers / Vercel Edge runtimes have only limited API support (network only; no fs and no long-lived TCP connections)
  2. APIs are incompatible across JS runtimes (Bun, Deno, Node.js)
  3. Most third-party integrations are used by only a few people (if someone uses OpenAI, every other LLM package should be excluded from their bundle)
  4. Web-based LLMs will land in the future (Chrome built-in AI)

I hope LlamaIndex.TS can be used in a web app (to load your frontend data in the editor and get value from it), as a serverless middleware function for processing documents, or in an Electron app to load and process documents from a local directory.

Proposal

Introduce an actual @llamaindex/core package that includes only the essential, third-party-free parts of each module, such as (see the sketch after this list):

  • Type definitions for modules (compile time via TypeScript, runtime via Zod)
  • Abstractions of some classes (interfaces only)
  • Minimal implementations of modules (data stored in memory, so they are available cross-platform)
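
As a rough illustration of the pattern (a hedged sketch; the class names here are hypothetical, not the actual @llamaindex/core API): the abstraction defines the interface only, and the minimal implementation is a plain in-memory structure that runs in any JS runtime.

```ts
// Hypothetical sketch, not the shipped @llamaindex/core API.

// Abstraction lives in core: interface only, no I/O, no runtime-specific APIs.
abstract class BaseDocumentStore {
  abstract put(id: string, text: string): Promise<void>;
  abstract get(id: string): Promise<string | undefined>;
}

// Minimal implementation also lives in core: a plain in-memory Map,
// so it works unchanged in browsers, Node.js, Bun, Deno, and edge runtimes.
class SimpleDocumentStore extends BaseDocumentStore {
  private docs = new Map<string, string>();

  async put(id: string, text: string): Promise<void> {
    this.docs.set(id, text);
  }

  async get(id: string): Promise<string | undefined> {
    return this.docs.get(id);
  }
}
```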

Targeting support for all JS environments:

  • browser
  • Node.js
  • Bun
  • Deno

In the future, the packages/llamaindex code should include only environment-specific code and libraries such as MongoDBVectorStore (Node.js only)...

llamaindex -> llamaindex/core
           -> llamaindex/env
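
One way such an env layer could look (a hedged sketch under assumptions; the real @llamaindex/env may be organized quite differently): core code imports a single interface, and only the env package touches runtime-specific globals.

```ts
// Hypothetical sketch of an environment shim, not the actual @llamaindex/env.

export interface Env {
  // Expose only capabilities that every target runtime provides.
  fetch: typeof fetch;
  randomUUID: () => string;
}

export function getEnv(): Env {
  // fetch and crypto.randomUUID exist in modern browsers, Node.js >= 18,
  // Bun, Deno, and edge runtimes, so no fs or TCP APIs are needed here.
  return {
    fetch: globalThis.fetch.bind(globalThis),
    randomUUID: () => globalThis.crypto.randomUUID(),
  };
}
```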

llamaindex should also account for bundlers like Vite and Webpack, and for frameworks that use them (like Next.js and waku.gg), for the best compatibility.

At some point I will bring the bundler back: once the modules are isolated correctly, the bundler can emit the right output for each JS environment. Conditional package exports are one standard way to express this; see the sketch below.
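
A hedged sketch of conditional exports (the file paths and package layout are illustrative, not this repo's actual config; "workerd" and "edge-light" are the export conditions used by Cloudflare Workers and Vercel Edge):

```json
{
  "name": "@llamaindex/env",
  "exports": {
    ".": {
      "workerd": "./dist/index.workerd.js",
      "edge-light": "./dist/index.edge.js",
      "node": "./dist/index.node.js",
      "default": "./dist/index.browser.js"
    }
  }
}
```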

Current steps

  1. Migrate Node.ts to llamaindex/core, following https://docs.llamaindex.ai/en/stable/api_reference/schema/
  2. Add some Zod schemas (see the sketch after this list)
  3. Add vitest tests for decorators
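
For step 2, the point of Zod is that one schema yields both a compile-time type and a runtime validator. A minimal sketch with a hypothetical node shape (the field names are assumptions, not copied from the schema reference):

```ts
import { z } from "zod";

// Hypothetical node schema; the real fields follow the schema reference
// at https://docs.llamaindex.ai/en/stable/api_reference/schema/
const textNodeSchema = z.object({
  id_: z.string(),
  text: z.string(),
  metadata: z.record(z.string(), z.unknown()).default({}),
});

// Compile-time type derived from the runtime schema.
type TextNode = z.infer<typeof textNodeSchema>;

// Runtime validation, e.g., for data loaded in a browser or edge runtime.
const node: TextNode = textNodeSchema.parse({
  id_: "node-1",
  text: "hello world",
});
```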

Some miscellaneous notes:

I removed JSR.json in this PR since I don't think JSR is a good package manager, at least at this point.

Nit: waku bundling for the CJS module is still crashing, maybe related to dai-shi/waku#709

changeset-bot commented Jun 17, 2024

⚠️ No Changeset found

Latest commit: c59f07d

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.


himself65 marked this pull request as ready for review on Jun 26, 2024
himself65 merged commit 22ff083 into run-llama:main on Jun 26, 2024 (15 checks passed)
himself65 deleted the llamaindex/core branch on Jun 26, 2024