
Using Mixtral as Local LLM Fails #1401

Open
1 task done
CharlesMod opened this issue Feb 10, 2024 · 3 comments
Comments


CharlesMod commented Feb 10, 2024

⚠️ Check for existing issues before proceeding. ⚠️

  • I have searched the existing issues, and there is no existing issue for my problem

Where are you using SuperAGI?

Linux

Which branch of SuperAGI are you using?

Main

Do you use OpenAI GPT-3.5 or GPT-4?

GPT-3.5

Which area covers your issue best?

Agents

Describe your issue.

Attempting to use nous-hermes-2-mixtral-8x7b-sft.Q4_K_M.gguf from TheBloke with the standard Local LLM loader, as shown in the YouTube video released this January, fails.

How to replicate your Issue?

Edit docker-compose-gpu.yml to mount the volume containing the local LLM model.

Then attempt to run the model with a new agent. This results in "Model not found". (The Docker log in the CLI gives more detail; the error occurs immediately after the model is loaded when the agent runs.)
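For reference, the mount might look like the following sketch. The service name, host path, and container path here are assumptions for illustration, not taken from the actual repository file:

```yaml
# docker-compose-gpu.yml (excerpt) -- hypothetical sketch.
# The host directory and container mount point are placeholders;
# adjust them to wherever your .gguf file lives and wherever
# SuperAGI is configured to look for local models.
services:
  backend:
    volumes:
      - /path/on/host/models:/app/local_model_path
```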

Upload Error Log Content

```
backend-1 | error loading model: create_tensor: tensor 'blk.0.ffn_gate.weight' not found
backend-1 | llama_load_model_from_file: failed to load model
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - ERROR - [/app/superagi/helper/llm_loader.py:27] -
backend-1 | from_string grammar:
backend-1 |
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - ERROR - [/app/superagi/controllers/models_controller.py:185] - Model not found.
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - INFO - [/app/superagi/controllers/models_controller.py:203] - Error:
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - INFO - [/app/superagi/controllers/models_controller.py:203] -
```

CharlesMod (Author) commented:

Here is a more complete error log:
errorLogsDocker.txt


memamun commented Apr 1, 2024

I am also facing the same issue. Have you been able to get it fixed?

rounak610 (Collaborator) commented:

@memamun @CharlesMod could you try running Mixtral with the "fixes_for_mixtral" branch instead of the main branch, and let me know if you face any errors?

Labels: none yet
Projects: none yet
Development: no branches or pull requests
3 participants