I’m planning a project and have a question about the options for transferring larger quantities of data between client and server.
In previous apps, performance has generally been great, but transmission of data between the Uplink server and the client has tended to break down beyond a certain size: the connection is lost and the app errors out. This has been the case even when a Media object was used. I’m not sure of the exact cause yet, because pinning it down would require a lot of testing time given the nature of my app.
I want to find a surefire way of passing larger amounts of data from the Uplink server to the client for user download, without the risk of a disconnection (e.g. error 1006) or a ‘payload too big’ error. I am familiar with Google Firebase – does anyone know whether Firebase could be a viable alternative for transmitting larger amounts of data from an Uplink server to the client, or of any other methods besides the usual function return?
(I will need to use an Uplink server, since my server code needs to be ‘always on’, and I’m not sure that server code hosted with Anvil can currently offer this.)
The server code returns a Media object, a list of dictionaries, or both; it doesn’t use a table.
(I’m not using Anvil Data Tables at this time because of their speed and functionality limits for the needs of my app – unless they have improved; I haven’t tried them for about six months.)
The issue is this: if my server code needs to generate an Excel file with 600,000 rows of data, and takes an hour to do so before sending it back to the client, I want to find a way to make sure the file can be transmitted to the client without the user seeing an error message like ‘payload too big’ or ‘server disconnected 1006’. This has happened quite a few times.
I am asking whether there is an alternative way to send these large files back to the client from Uplink server code after a long processing wait – just in case anyone has encountered the same problem and found a better approach, e.g. using a Google database connection instead.
I understand you don’t use Anvil Data Tables due to their limitations, but in this case I think they are the correct tool. You can use a table just for this:
1. The Uplink script or background task works for an hour and stores the result in the data column of a Data Table row.
2. When the job is completed, it writes "done" to the status column of the same row.
3. A Timer on the client pings the server to check the status, and downloads the data when it’s ready.
I haven’t tried this with very large file sizes, but I think it should work.
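The flow above can be sketched in plain Python. This is a minimal simulation, not real Anvil code: an in-memory dict stands in for the Data Table row, a thread stands in for the Uplink/background task, and a polling loop stands in for the client Timer’s tick event (in a real app these would be something like an `app_tables.jobs` row with a Media column and a Timer component; all names here are illustrative assumptions):

```python
import threading
import time

# Stand-in for a Data Table row with 'status' and 'data' columns.
job_row = {"status": "running", "data": None}

def long_job():
    """Uplink/background-task side: do the long-running work, then
    store the result and flip the status flag (0.2 s stands in for the hour)."""
    time.sleep(0.2)                        # pretend this is the 1-hour build
    result = b"fake 600,000-row Excel file"
    # Write the data BEFORE the status, so a poller that sees "done"
    # is guaranteed to find the data already in place.
    job_row["data"] = result               # in Anvil: row['data'] = anvil.BlobMedia(...)
    job_row["status"] = "done"             # in Anvil: row['status'] = "done"

def poll_for_result(interval=0.05, timeout=5.0):
    """Client side: what the Timer's tick handler would do - a cheap
    status check on every tick, one heavy download when finished."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if job_row["status"] == "done":
            return job_row["data"]
        time.sleep(interval)
    raise TimeoutError("job did not finish in time")

threading.Thread(target=long_job).start()
payload = poll_for_result()
print(len(payload), "bytes received")
```

The point of the design is that the expensive transfer only happens once, after the job has finished, so no single round trip has to survive the full hour of processing.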
So I just tried it: I uploaded a 194 MB file from a FileLoader to a row in a client-write-enabled table, using the browser/client. It uploaded and then downloaded without any trouble; there was just the normal expected slowdown from all of the I/O going on.
The table editor in the IDE appears to cap uploads to a table row at a maximum of 8 MB.
I did not try using a regular server module call; the upload took a little over a minute, which is obviously longer than the 30-second limit.
I did not check background tasks either.
So I’m assuming that if the row can store that much data as a Media object, this approach can probably still work for the required task.