🤖 Hey @NILICK, fancy seeing you here again! Hope you're doing well amidst the coding adventures. Based on the error message you provided, it seems like there's an issue with the server that's hosting the LLM (large language model). Here are a few things you could try to resolve the issue:

- Make sure the Ollama server is actually running and reachable at `http://localhost:11434` (for example, by starting it with `ollama serve`).
- Verify that the model you are requesting has been pulled locally (`ollama pull <model>`) and that the model name in your code matches it exactly.
- Check the Ollama server logs for the underlying error behind the 500 response, since the server returns a 500 when it fails internally rather than when your request is malformed.
If none of these suggestions resolve the issue, it would help to share more detailed error messages or logs, and to note at which point in the code the error occurs; that would make it easier to pinpoint the exact cause. For more information, you can refer to the LlamaIndex codebase.
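As a quick sanity check before involving LlamaIndex at all, you can talk to the Ollama HTTP API directly with the standard library; if the server is unreachable or the model is missing, that explains the 500 before you debug any LlamaIndex code. This is a minimal sketch: the base URL comes from the error message in this thread, and the default model name `llama2` is an assumption, so substitute whatever model your code requests.

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # from the error message in this thread


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build a minimal, non-streaming /api/chat request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def check_ollama(model: str = "llama2") -> str:
    """Return a short diagnosis of the local Ollama server."""
    try:
        # /api/tags lists the models Ollama has pulled locally.
        with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
            names = [m["name"] for m in json.load(resp)["models"]]
    except urllib.error.URLError:
        return "server unreachable: is `ollama serve` running?"
    if not any(n.split(":")[0] == model for n in names):
        return f"model {model!r} not pulled: run `ollama pull {model}`"
    return "server up and model available"
```

If `check_ollama()` reports the server up and the model available, the next step is the server-side Ollama log, since the 500 originates there rather than in the client.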
I'm trying to run the QuestionGeneration example using Ollama, but it returns
HTTPStatusError: Server error '500 Internal Server Error' for url 'http://localhost:11434/api/chat'
This is my complete code:

This code returns this error: