I would like to customize an AiServices instance with a local LLM. Do I just need to modify `AiServices.builder`, and can AiServices handle the process the same way as with `OpenAiChatModel`? Additionally, I'd like to know how to use a custom output parser. Could you please provide guidance on this?

Replies: 1 comment
Hey, you can use AiServices with any LLM that is supported by LangChain4j.

```java
interface Assistant {

    String chat(String userMessage);
}

ChatLanguageModel chatLanguageModel = OpenAiChatModel.withApiKey(...);
// or
ChatLanguageModel chatLanguageModel = OllamaChatModel.builder() // for local models
        .baseUrl("http://localhost:11434")
        .modelName("llama3")
        .build();

Assistant assistant = AiServices.create(Assistant.class, chatLanguageModel);
```
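To answer the builder question directly: yes, `AiServices.builder` works the same way with a local model; you pass it whatever `ChatLanguageModel` you built. A minimal sketch reusing the Ollama model from above (the chat-memory line is only an illustration of optional extra configuration; `MessageWindowChatMemory` comes from `dev.langchain4j.memory.chat`):

```java
// Wiring the same local model through the builder instead of create().
Assistant assistant = AiServices.builder(Assistant.class)
        .chatLanguageModel(chatLanguageModel)
        .chatMemory(MessageWindowChatMemory.withMaxMessages(10)) // optional
        .build();

String answer = assistant.chat("Hello!"); // identical usage, regardless of the underlying model
```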
To automatically parse the LLM response into a custom POJO, just define it as the return type:

```java
class Person {

    private String firstName;
    private String lastName;
    private LocalDate birthDate;
}

interface PersonExtractor {

    @UserMessage("Extract information about a person from {{text}}")
    Person extractPersonFrom(@V("text") String text);
}

PersonExtractor extractor = AiServices.create(PersonExtractor.class, model);

String text = "In 1968, amidst the fading echoes of Independence Day, "
        + "a child named John arrived under the calm evening sky. "
        + "This newborn, bearing the surname Doe, marked the start of a new journey.";

Person person = extractor.extractPersonFrom(text);
// Person { firstName = "John", lastName = "Doe", birthDate = 1968-07-04 }
```

You can find more info in the documentation.
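As for the custom output parser part of the question: the POJO parsing above is built in and driven purely by the declared return type. If you want parsing fully under your own control, one simple pattern (a sketch, not a LangChain4j-specific API, assuming Jackson with the jsr310 module on the classpath; `RawPersonExtractor` is a hypothetical name) is to declare the service method to return the raw `String` and parse it yourself:

```java
// Hypothetical service that returns the raw model output instead of a parsed POJO.
interface RawPersonExtractor {

    @UserMessage("Extract information about a person from {{text}}. Respond with JSON only.")
    String extractJsonFrom(@V("text") String text);
}

// Jackson setup: JavaTimeModule for LocalDate, field visibility for the private fields above.
ObjectMapper mapper = new ObjectMapper()
        .registerModule(new JavaTimeModule())
        .setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);

RawPersonExtractor rawExtractor = AiServices.create(RawPersonExtractor.class, chatLanguageModel);
String json = rawExtractor.extractJsonFrom(text);
Person person = mapper.readValue(json, Person.class); // your custom parsing step happens here
```

Since the service just hands back a `String`, the same pattern covers any format you can parse, not only JSON.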