Hi, we have an .egg file of 1.7 GiB (please don't ask) that could not be uploaded to devpi with PostgreSQL.
We already saw #985.
We are getting the following error:
54000 out of memory Cannot enlarge string buffer containing 0 bytes by 1391196735 more bytes.
After digging a bit, I found out that we are running into a hard limit of PostgreSQL (see https://stackoverflow.com/questions/56714274/postgrest-postgresql-cannot-enlarge-string-buffer-message). The current devpi-postgresql implementation uses bytea as the column type for storing blobs, and a single bytea value cannot exceed roughly 1 GB.
It might be better to use large objects (LOBs) for storing large binary data, which also come with streaming APIs (https://wiki.postgresql.org/wiki/BinaryFilesInDB). Currently the whole blob is also copied into memory before being pushed to PostgreSQL, instead of using streaming or at least some kind of chunking (https://github.com/devpi/devpi/blob/main/postgresql/devpi_postgresql/main.py#L466). The same applies on the way out.
But: that change would also require a data migration.
Are you using the latest released version?
Provide the output of pip list from the virtual environment you are using.
pip list
Provide the Python and operating system versions under which the issue occurs.
Python 3.12, Ubuntu 22.04 (Linux)
If possible, provide a minimal example to reproduce the issue.
Upload a file > 1 GiB
There will be big changes to the backend for the next major devpi-server release anyway, so that would be a good opportunity to add this as well.