tddworks/openai-kotlin

openai-kotlin, powered by Kotlin Multiplatform

Getting Started:

To get started, add the following dependency to your Kotlin project:

OpenAI API

implementation("com.tddworks:openai-client-jvm:0.2")
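In a Gradle Kotlin DSL build, the dependency line above goes in the `dependencies` block; a minimal sketch, assuming the artifact is resolvable from Maven Central:

```kotlin
// build.gradle.kts — minimal setup sketch
// (assumes the com.tddworks artifact is published to Maven Central)
repositories {
    mavenCentral()
}

dependencies {
    implementation("com.tddworks:openai-client-jvm:0.2")
}
```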

Then, configure the OpenAI client with your API key and settings:

  • Default values are provided for the baseUrl, but you can override them with your own values.
  • OpenAI
    • default baseUrl is api.openai.com

Example:

import com.tddworks.openai.api.OpenAIConfig
import com.tddworks.openai.api.chat.api.ChatCompletionRequest
import com.tddworks.openai.api.chat.api.ChatMessage
import com.tddworks.openai.api.chat.api.Model
import com.tddworks.openai.di.initOpenAI

val openAI = initOpenAI(OpenAIConfig(
   baseUrl = { "YOUR_BASE_URL" },
   apiKey = { "YOUR_API_KEY" }
))

// stream completions
openAI.streamCompletions(
   ChatCompletionRequest(
      messages = listOf(ChatMessage.UserMessage("hello")),
      maxTokens = 1024,
      model = Model.GPT_3_5_TURBO
   )
).collect {
   println(it)
}

// chat completions
val chatCompletion = openAI.completions(
   ChatCompletionRequest(
      messages = listOf(ChatMessage.UserMessage("hello")),
      maxTokens = 1024,
      model = Model.GPT_3_5_TURBO
   )
)
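`streamCompletions` returns a Flow and `completions` is a suspending call, so both need a coroutine scope. A minimal sketch wrapping the streaming call above in `runBlocking`, assuming `openAI` is the client configured earlier:

```kotlin
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // Collect streamed chunks as they arrive; collect suspends
    // until the stream completes.
    openAI.streamCompletions(
        ChatCompletionRequest(
            messages = listOf(ChatMessage.UserMessage("hello")),
            maxTokens = 1024,
            model = Model.GPT_3_5_TURBO
        )
    ).collect { chunk ->
        println(chunk)
    }
}
```

In a server or UI application you would use the surrounding coroutine scope (e.g. a Ktor handler or `viewModelScope`) instead of `runBlocking`.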

OpenAI Gateway

implementation("com.tddworks:openai-gateway-jvm:0.2")

Then, configure the OpenAIGateway with your API keys and settings:

  • Default values are provided for the baseUrl, but you can override them with your own values.
  • OpenAI
    • default baseUrl is api.openai.com
  • Anthropic
    • default baseUrl is api.anthropic.com
    • default anthropicVersion is 2023-06-01
  • Ollama
    • default baseUrl is localhost
    • default protocol is http
    • default port is 11434
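Because defaults are provided for everything except the API keys, a configuration can omit them; a hedged sketch, assuming each config's constructor defaults any omitted parameter to the values listed above:

```kotlin
// Sketch: rely on the documented defaults (api.openai.com,
// api.anthropic.com, http://localhost:11434).
// Assumes omitted constructor parameters fall back to those defaults.
val gateway = initOpenAIGateway(
    OpenAIConfig(apiKey = { "YOUR_OPENAI_API_KEY" }),
    AnthropicConfig(apiKey = { "YOUR_ANTHROPIC_API_KEY" }),
    OllamaConfig()
)
```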

Example:

import com.tddworks.anthropic.api.AnthropicConfig
import com.tddworks.ollama.api.OllamaConfig
import com.tddworks.ollama.api.OllamaModel
import com.tddworks.openai.api.chat.api.ChatCompletionRequest
import com.tddworks.openai.api.OpenAIConfig
import com.tddworks.openai.api.chat.api.ChatMessage
import com.tddworks.openai.api.chat.api.Model
import com.tddworks.openai.gateway.api.OpenAIGateway
import com.tddworks.openai.gateway.di.initOpenAIGateway

val openAIGateway = initOpenAIGateway(
   OpenAIConfig(
      baseUrl = { "YOUR_OPENAI_BASE_URL" },
      apiKey = { "YOUR_OPENAI_API_KEY" }
   ),
   AnthropicConfig(
      baseUrl = { "YOUR_ANTHROPIC_BASE_URL" },
      apiKey = { "YOUR_ANTHROPIC_API_KEY" },
      anthropicVersion = { "YOUR_ANTHROPIC_VERSION" }
   ),
   OllamaConfig(
      baseUrl = { "YOUR_OLLAMA_BASE_URL" },
      protocol = { "YOUR_OLLAMA_PROTOCOL" },
      port = { "YOUR_OLLAMA_PORT" }
   )
)

// stream completions
openAIGateway.streamCompletions(
   ChatCompletionRequest(
      messages = listOf(ChatMessage.UserMessage("hello")),
      maxTokens = 1024,
      model = Model(OllamaModel.LLAMA2.value)
   )
).collect {
   println(it)
}

// chat completions
val chatCompletion = openAIGateway.completions(
   ChatCompletionRequest(
      messages = listOf(ChatMessage.UserMessage("hello")),
      maxTokens = 1024,
      model = Model.GPT_3_5_TURBO
   )
)
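A streamed gateway response can fail mid-stream (network issues, provider errors). A sketch adding `Flow.catch` from kotlinx.coroutines, assuming `openAIGateway` is the gateway configured above:

```kotlin
import kotlinx.coroutines.flow.catch

openAIGateway.streamCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("hello")),
        maxTokens = 1024,
        model = Model(OllamaModel.LLAMA2.value)
    )
).catch { e ->
    // Handle provider or network failures without
    // crashing the collecting coroutine.
    println("stream failed: ${e.message}")
}.collect {
    println(it)
}
```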