No result prompt in local with open_llm #1227
Comments
Make sure the open_llm service port accepts connections, for example by testing it with telnet.
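As a cross-platform alternative to telnet, the same reachability check can be sketched in Python with the standard library. The port 11434 in the comment is Ollama's default API port and is only an assumption here; substitute the port your open_llm service actually listens on:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        # create_connection resolves the address and attempts a TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable
        return False

if __name__ == "__main__":
    # Assumed example: Ollama's default port; adjust for your service
    print(port_open("127.0.0.1", 11434))
```

If this returns `False`, the LLM service is not reachable from where MetaGPT runs, which would explain connection errors regardless of the model configured.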
Hello, in fact it wasn't running, and I failed to get it started. So I moved to Ollama with the llama3 model. This works, but it takes 2-3 minutes to respond (hardware resource problem). I modified the config2 file based on config2.example. Execution is faster and the folders are created, but it stalls on a method. Here is the result: `(metagpt-p311) c:\metagpt-p311\MetaGPT>python c:\metagpt-p311\Metagpt\metagpt\software_company.py "write a cli snake game based on pygame" The above exception was the direct cause of the following exception: Traceback (most recent call last): During handling of the above exception, another exception occurred: Traceback (most recent call last): The above exception was the direct cause of the following exception: Traceback (most recent call last):`
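For reference, pointing MetaGPT at a local Ollama server is typically done in `config2.yaml`, following the layout of `config2.example.yaml`. The sketch below reflects that layout; the `base_url` assumes Ollama's default local address and should be adjusted to your setup:

```yaml
llm:
  api_type: "ollama"
  base_url: "http://127.0.0.1:11434/api"
  model: "llama3"
```

A wrong `base_url` or `api_type` here commonly produces the chained connection exceptions shown in the traceback above.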
@better629 Can you look at this question?
Bug description
Hello,
I installed and configured MetaGPT on a VM under VMware. However, I end up with the errors below.
I saw issue 838, which ended with the same errors, except that here I am running locally, not with an external AI.
Environment information
2 vCPU & 6 GB
Miniconda prompt
Screenshots or logs