Are there practical limits to file size when I’m uploading files to data tables?
Please see the pricing page, plus the docs with time limit and RAM info.
I’m considering allowing people to upload log files so I can automate analysis; some of these files could be substantially larger than 1GB.
I am familiar with server background tasks, but the dilemma I found while testing is this:
1. In client code, a file_loader is used to select (let’s say) five files totaling 5GB
2. The client calls a server function with anvil.server.call, passing the five files
3. The server function launches a background task with anvil.server.launch_background_task, passing the five files
4. The background task processes the five files
I think I’m missing something, because this is timing out at step 2.
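Roughly, the code looks like this (a stripped-down sketch, not my exact app; form, component, and function names are illustrative):

```python
# ---- client code (Form) ----
from ._anvil_designer import LogUploadFormTemplate  # designer-generated base class
import anvil.server

class LogUploadForm(LogUploadFormTemplate):
    def upload_button_click(self, **event_args):
        # Step 1: file_loader_1 has "multiple" enabled, so .files is a list of Media objects
        files = list(self.file_loader_1.files)
        # Step 2: this blocking call is where the timeout hits with ~5GB of uploads
        anvil.server.call('start_processing', files)


# ---- server module ----
import anvil.server

@anvil.server.callable
def start_processing(files):
    # Step 3: hand the files to a background task and return straight away
    anvil.server.launch_background_task('process_logs', files)

@anvil.server.background_task
def process_logs(files):
    # Step 4: the heavy analysis runs here, outside the server-call time limit
    for f in files:
        ...  # parse/analyse each log (placeholder)
```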
Would it help to compress the files, e.g., with zip? A smaller file takes less time to transfer, so it should be less likely to time out.
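For example, something like this on the machine producing the logs, before upload (just a sketch; the file names are placeholders):

```python
import zipfile

# Bundle and compress the logs before uploading them.
with zipfile.ZipFile("logs.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for name in ["app1.log", "app2.log", "app3.log"]:
        zf.write(name)
```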
Well, yes, compressing the files does help. However, as an example, I have one .zip file containing 5 log files, and the .zip file itself is 20GB.
Hmm, you may have to consider processing your data in smaller chunks and/or rethinking where those files are uploaded to (e.g., perhaps you could transfer them directly to Google Drive).
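For the chunking side, something along these lines in the background task might work (a sketch only; anvil.media.TempFile spools the Media object to disk, and analyse_chunk is a placeholder):

```python
import anvil.server
import anvil.media

@anvil.server.background_task
def process_log(media_file):
    # Write the uploaded Media object to a temp file on disk, then read it in
    # fixed-size pieces so the whole log never has to sit in RAM at once.
    with anvil.media.TempFile(media_file) as path:
        with open(path, "rb") as f:
            while True:
                chunk = f.read(1024 * 1024)  # 1 MB per read
                if not chunk:
                    break
                analyse_chunk(chunk)  # placeholder for the real analysis
```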
Agreed… when handling data this large, I need to think carefully about minimizing copying and movement.
That may also mean placing the processing power (and code) on the same machine as the data. Otherwise, you end up copying it again.