Is the entire file loaded into memory to be queried? Would prefer reading line by line, not loading in the whole file (10GB file) #313
Hi, sorry for the late reply. q does load the data into memory for processing, but it does contain an automatic caching feature which might help for large files. If you run q with caching enabled, a cache file will be created next to the original data file. In order to create the cache file, you could run any simple query over the file once with caching turned on. After this preparation step (which will take a considerable time for a 10GB file), you will be able to do either of the following: keep cache reading enabled on subsequent q runs, or query the generated cache file directly.
Hope that will help. I'd appreciate it if you could write down your impressions of the speedup here after testing it.
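As a concrete sketch of that workflow, assuming q 3.x where the documented `-C` caching mode (`readwrite`/`read`) writes a `.qsql` cache file next to the source, and using `my_file.tsv` and the column names as placeholders (requires the q CLI to be installed):

```shell
# One-time preparation: run any query with caching enabled so that q
# writes a my_file.tsv.qsql cache file next to the original TSV.
# -t = tab-delimited input, -H = first row is a header with column names.
q -C readwrite -t -H "SELECT COUNT(*) FROM my_file.tsv"

# Later queries can read from the cache instead of re-parsing the TSV:
q -C read -t -H "SELECT name, score FROM my_file.tsv WHERE score > 10"

# Or query the generated cache file directly:
q -t -H "SELECT name, score FROM my_file.tsv.qsql WHERE score > 10"
```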
thank you @harelba
I will let you know! Thank you!
I have a 10GB TSV file that I'd like to query using SQL commands.
As a TSV (tab-separated values) file, it is spreadsheet-like in that it has heading columns and rows; it's effectively a single-table database.
At 10GB, I'd prefer not to read the entire file into memory at once, due to the time that takes and the memory limits of my machines (though I do have 16GB, 24GB and 32GB machines).
Can you advise whether, when running queries on the TSV, the file is loaded into memory entirely all at once?
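For context, the line-by-line approach asked about here can be sketched outside of q: stream the TSV row by row into an on-disk SQLite database, then run SQL against that. This is a minimal illustration using only the Python standard library (the file name, table name, and columns are hypothetical, and a small in-memory sample stands in for the 10GB file), not q's own mechanism:

```python
import csv
import io
import sqlite3

def load_tsv_into_sqlite(tsv_file, db_path, table="data"):
    """Stream a TSV into SQLite so only one row is in memory at a time."""
    conn = sqlite3.connect(db_path)
    reader = csv.reader(tsv_file, delimiter="\t")
    header = next(reader)  # first row holds the column names
    cols = ", ".join(f'"{c}"' for c in header)
    placeholders = ", ".join("?" for _ in header)
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} ({cols})")
    with conn:  # one transaction keeps the bulk insert fast
        conn.executemany(
            f"INSERT INTO {table} VALUES ({placeholders})", reader
        )
    return conn

# Usage with a tiny in-memory sample standing in for the large file:
sample = io.StringIO("name\tscore\nalice\t10\nbob\t20\n")
conn = load_tsv_into_sqlite(sample, ":memory:")
total = conn.execute("SELECT SUM(score) FROM data").fetchone()[0]
print(total)  # 30
```

Because `csv.reader` is an iterator and `executemany` consumes it lazily, the whole file never needs to fit in RAM; the resulting database file can then be queried repeatedly.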