
@llamaindex/tool

⚠️ Not published yet; you can try it locally.

Transforms plain JS functions into tool definitions for LLM tool calling.

  • ✅ OpenAI
  • 🚧 ClaudeAI
  • ✅ LlamaIndexTS
  • 🚧 LangChainJS

(✅ supported, 🚧 work in progress)

Usage

In your code

```ts
// @file: index.llama.ts

// You can write JSDoc to improve the LLM's tool selection.
/**
 * @name getWeather
 * @description Get the weather of a city
 * @param city City name
 * @returns The weather in the city
 */
export function getWeather(city: string) {
  return `The weather in ${city} is sunny.`
}

// You don't need to worry about the schema differences between LLM tools.
export function getTemperature(city: string) {
  return `The temperature in ${city} is 25°C.`
}

export function getCurrentCity() {
  return 'New York'
}
```
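For orientation, the converted OpenAI tool definition for `getWeather` would look roughly like this. This is a sketch of the standard OpenAI function-calling schema, not the verbatim output of `convertTools`:

```ts
// Hypothetical converted shape for getWeather (OpenAI function-calling format);
// the exact output of convertTools('openai') may differ.
const getWeatherTool = {
  type: 'function',
  function: {
    name: 'getWeather',
    description: 'Get the weather of a city',
    parameters: {
      type: 'object',
      properties: {
        city: { type: 'string', description: 'City name' }
      },
      required: ['city']
    }
  }
} as const
```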
Then wire the tools into your app:

```ts
// @file: app.ts
import Tools from './index.llama'
import { registerTools, convertTools } from '@llamaindex/tool'
// Register the tools at the top level.
registerTools(Tools)

import { OpenAI } from 'openai'

const openai = new OpenAI()
await openai.chat.completions.create({
  model: 'gpt-4o', // any model that supports tool calling
  messages: [
    {
      role: 'user',
      content: 'What is the weather in the current city?'
    }
  ],
  tools: convertTools('openai')
})

// Or use the LlamaIndex OpenAI agent:
import { OpenAIAgent } from 'llamaindex'

const agent = new OpenAIAgent({
  tools: convertTools('llamaindex')
})
const { response } = await agent.chat({
  message: 'What is the temperature in the current city?'
})
console.log('Response:', response)
```
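With the raw OpenAI client, the completion only returns the model's requested tool calls; executing them is still up to you. Here is a minimal sketch of that dispatch step, assuming only the standard OpenAI response shape (everything besides `convertTools` and the `index.llama` exports is illustrative):

```ts
// Illustrative dispatch loop; not part of @llamaindex/tool.
import { OpenAI } from 'openai'
import { convertTools } from '@llamaindex/tool'
import { getWeather, getTemperature, getCurrentCity } from './index.llama'

const openai = new OpenAI()
const completion = await openai.chat.completions.create({
  model: 'gpt-4o', // any model that supports tool calling
  messages: [{ role: 'user', content: 'What is the weather in Tokyo?' }],
  tools: convertTools('openai')
})

// Route each tool call the model requested back to the matching export.
for (const call of completion.choices[0].message.tool_calls ?? []) {
  if (call.type !== 'function') continue
  const args = JSON.parse(call.function.arguments) as { city?: string }
  const result =
    call.function.name === 'getWeather' ? getWeather(args.city ?? '') :
    call.function.name === 'getTemperature' ? getTemperature(args.city ?? '') :
    getCurrentCity()
  console.log(`${call.function.name} ->`, result)
}
```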

Run with Node.js

```bash
node --import tsx --import @llamaindex/tool/register ./app.ts
```

This relies on Node's `--import` flag (Node.js 20.6+): `tsx` provides TypeScript support, and `@llamaindex/tool/register` hooks the tool transform into module loading.

Vite (WIP)

```ts
// @file: vite.config.ts
import { defineConfig } from 'vite'
import tool from '@llamaindex/tool/vite'

export default defineConfig({
  plugins: [
    tool()
  ]
})
```

Next.js (WIP)

```js
// @file: next.config.js
const withTool = require('@llamaindex/tool/next')

const config = {
  // Your Next.js config
}

module.exports = withTool(config)
```
