pyreadr.custom_errors.LibrdataError: Unable to read from file for large RDS files #99
Is there an upper limit on the size of RDS files that can be loaded using pyreadr?

When reading an RDS file of a small matrix, the code works well, but when reading large matrices (>10 GB in size), I get the following error:

pyreadr.custom_errors.LibrdataError: Unable to read from file

Comments

I don't think there should be such a limit. In addition, you should probably get a memory error rather than an "unable to read from file" error, so I suspect something else is happening with that file. Is the file something you created yourself with R, or was it generated by somebody else? If it came from somebody else, then, as mentioned before, the problem is likely something other than the size. If you did create it, please share simplified code to reproduce the issue.

This is a matrix that I generated from my data. It is a 31595 by 39643 matrix saved using

I can confirm this bug exists. I submitted a fix to
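For context, a minimal sketch of how the error in this issue typically surfaces when reading an RDS file with pyreadr. The helper name and file path are illustrative, not from the issue; `pyreadr.read_r` and `pyreadr.custom_errors.LibrdataError` are the library's real names.

```python
# Hypothetical sketch: read a single object from an RDS file with pyreadr
# and surface the LibrdataError discussed in this issue.
try:
    import pyreadr
except ImportError:  # pyreadr may not be installed in every environment
    pyreadr = None


def read_rds(path):
    """Return the single object stored in an RDS file as a pandas DataFrame."""
    if pyreadr is None:
        raise RuntimeError("pyreadr is not installed")
    try:
        # read_r returns an OrderedDict mapping object name -> DataFrame;
        # an RDS file holds one unnamed object, stored under the key None.
        result = pyreadr.read_r(path)
        return result[None]
    except pyreadr.custom_errors.LibrdataError as exc:
        # For some large files this is where "Unable to read from file" appears.
        raise RuntimeError(f"librdata could not read {path}: {exc}") from exc
```

Note that the traceback in the report points at librdata (the C library pyreadr wraps) rather than at a Python-level memory error, which is consistent with the maintainer's suggestion that the failure is not a simple size limit.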