
Wheel releases all say "File is too big to commit. Rebuild the *** wheel." #6

Open
swcrazyfan opened this issue Dec 3, 2022 · 5 comments

Comments

@swcrazyfan

What should I do? I've tried to use these wheels on Paperspace, but nothing works. They all seem to be corrupt. Does the note mean I'm supposed to rebuild them before using them, or is it a note for yourself?

Thanks so much for building all these!

@Cyberes
Owner

Cyberes commented Dec 3, 2022

Make sure to save the file under the standard wheel filename (with the GPU prefix removed), or else pip will fail to install it.

For example, if I were installing the wheel for the RTX 5000, I would rename rtx5000-xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl to xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl and then run pip install xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl
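
If it helps, here is the same rename-and-install step as a small Python sketch. The filenames are just the example above; swap in whichever wheel you actually downloaded.

```python
# Minimal sketch: rename a downloaded per-GPU wheel to the standard wheel
# filename and install it with pip. Filenames are illustrative.
import subprocess
import sys
from pathlib import Path

downloaded = Path("rtx5000-xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl")
standard = Path("xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl")

# pip reads the package name, version, and tags from the filename,
# so the GPU prefix has to go before installing.
downloaded.rename(standard)

# Install the renamed wheel with the same interpreter running this script.
subprocess.run([sys.executable, "-m", "pip", "install", str(standard)], check=True)
```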

@Cyberes Cyberes closed this as completed Dec 3, 2022
@swcrazyfan
Author

So, I figured out my issue: it seems Gradient updated their containers. I built my own new wheels, and everything works perfectly. Would you mind if I submitted pull requests? I have A6000 and A100 so far.

@Cyberes
Owner

Cyberes commented Dec 7, 2022

Sure. If the files are over 50 MB, just send a link and I'll upload the wheels to releases.

@Cyberes Cyberes reopened this Dec 7, 2022
@swcrazyfan
Author

I'll send over a link when I get the chance. If I have time, I'll build the others.

@anonderpling

anonderpling commented May 14, 2023

I'm using huggingface.co to host wheel files for SD. While HF doesn't have a repo type for this kind of content, I've read through their ToS and content guidelines, and I can't find anything that restricts what type of content can be uploaded (other than licensing and SFW/NSFW distinctions).

If you use huggingface_hub to upload a folder, it should only upload files that have actually changed (I've used both Colab and HF Spaces this way). Alternatively, you can upload with git LFS, but you'll still need huggingface_hub for the LFS uploader, and any image files over 1 MB will need to be tracked manually because Hugging Face's LFS setup doesn't do that automatically...
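
For reference, a minimal sketch of the huggingface_hub upload path; the repo ID and folder path below are placeholders, and it assumes you've already logged in with huggingface-cli login:

```python
# Minimal sketch: push a local folder of wheels to a Hugging Face repo.
# "your-username/xformers-wheels" and "wheels/" are placeholders.
from huggingface_hub import HfApi

api = HfApi()  # uses the token saved by `huggingface-cli login`

# Create the repo if it doesn't exist yet (no-op otherwise).
api.create_repo(repo_id="your-username/xformers-wheels", exist_ok=True)

# Upload the folder; unchanged large files are not re-uploaded.
api.upload_folder(
    repo_id="your-username/xformers-wheels",
    folder_path="wheels/",
)
```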

You can download the repo with git LFS enabled, or with huggingface_hub's snapshot_download function, or you can download individual files with the /resolve/ link.¹
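
A minimal sketch of the download side with huggingface_hub; the repo ID and filename are placeholders, and the /resolve/ URL follows the pattern https://huggingface.co/<repo_id>/resolve/main/<filename>:

```python
# Minimal sketch: two ways to pull wheels back down with huggingface_hub.
# The repo ID and filename below are placeholders.
from huggingface_hub import hf_hub_download, snapshot_download

# Grab the whole repo; returns the path to the local snapshot.
repo_dir = snapshot_download(repo_id="your-username/xformers-wheels")

# Or fetch a single wheel file; returns the path to the cached file.
wheel_path = hf_hub_download(
    repo_id="your-username/xformers-wheels",
    filename="xformers-0.0.14.dev0-cp39-cp39-linux_x86_64.whl",
)

print(repo_dir)
print(wheel_path)
```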

The main advantage is that Hugging Face doesn't complain about large files being uploaded, and they also don't seem to limit bandwidth.

Footnotes

  1. If you use aria2c, it won't name the files properly because the server is broken, so you'll need to fix the filename with --out= (use an indented out= on the next line when using an input file). See aria2/aria2#1118, “Check the header ‘Content-Disposition’ to determine the filename to store”, for the bug report on this. wget and other tools work fine.
