Running CogVLM and CogAgent on MPS #432
Comments
@1049451037 Hi, I can see that you've commented on the other macOS-related issues here: #248. Were you able to successfully use these models on macOS?
Hi @zRzRzRzRzRzRzR, I see you assigned this to yourself. Any updates on this?
@Allisterlim did you manage to make it run on your macOS?
Following
For now this is only supported on Linux, because we use xformers and triton, which are Linux-only; running on MPS is not available at the moment
System Info / 系統信息
Using Mac M3 Pro 18GB unified memory
Who can help? / 谁可以帮助到您?
No response
Information / 问题信息
Reproduction / 复现过程
I am attempting to run CogVLM and CogAgent (https://huggingface.co/THUDM/cogagent-chat-hf) using the script provided in the README.md.
I have changed DEVICE = 'mps'.
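For reference, hard-coding DEVICE = 'mps' will fail on machines where the Metal backend isn't available; a guarded selection (a minimal sketch, not specific to the CogVLM demo script) looks like:

```python
import torch

# Prefer Apple's Metal (MPS) backend when PyTorch reports it as usable,
# otherwise fall back to CPU. CUDA is not an option on Apple Silicon.
if torch.backends.mps.is_available():
    DEVICE = "mps"
else:
    DEVICE = "cpu"
```

This only guards device selection; it does not resolve the xformers import error described below.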
I get the error 'ModuleNotFoundError: No module named 'xformers'', which comes from the visual.py file in both cogagent and cogvlm.
Based on previous issues I've found in the xformers GitHub repo, it doesn't look like xformers is supported on macOS (AUTOMATIC1111/stable-diffusion-webui#8188). Has anyone managed to get CogVLM or CogAgent working on macOS, and if so, how did they get around this xformers dependency issue?
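One workaround people have used in similar projects is to shim xformers' memory_efficient_attention with PyTorch's native scaled_dot_product_attention (available since torch 2.0). This is a sketch under assumptions, not a tested CogVLM patch: the function name and the (batch, seq_len, heads, head_dim) layout follow xformers' documented API, but dropout and attention-bias corner cases may behave differently.

```python
import torch
import torch.nn.functional as F

def memory_efficient_attention(query, key, value, attn_bias=None, p=0.0, scale=None):
    """Stand-in for xformers.ops.memory_efficient_attention.

    xformers expects tensors shaped (batch, seq_len, heads, head_dim), while
    F.scaled_dot_product_attention expects (batch, heads, seq_len, head_dim),
    so we transpose on the way in and back on the way out.
    """
    q, k, v = (t.transpose(1, 2) for t in (query, key, value))
    if scale is not None:
        # Emulate a custom softmax scale by pre-scaling q, so we don't rely on
        # the `scale` kwarg, which only exists in newer torch versions.
        q = q * (scale * q.shape[-1] ** 0.5)
    out = F.scaled_dot_product_attention(q, k, v, attn_mask=attn_bias, dropout_p=p)
    return out.transpose(1, 2)
```

Replacing the xformers call in visual.py with this function (or monkeypatching the import) would at least get past the ModuleNotFoundError; whether the models then run correctly on MPS is a separate question.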
Expected behavior / 期待表现
To run model inference on MPS