Sending large file to server causes timeout

I’m working on an app that lets the user make a recording and then sends it for transcription via an API call.
Everything works fine when I run the program from the Anvil My Apps space, but when I publish the app and run it from the generated URL, I get a server timeout error.

I figured out that the timeout occurs when I pass the media blob to the server side for the API call. I’m pretty sure this is due to the file being too large (about 30-60 MB, depending on the length of the recording).

# Client call with the media blob file passed as a parameter
get_transcript = anvil.server.call_s('transcribe', audio_input)  # this is where the server timeout error occurs

# Server function that passes the media blob to a background task to get it transcribed
@anvil.server.callable
def transcribe(audio_file):
    background_task_id = anvil.server.launch_background_task('transcribe_audio', audio_file)
    return background_task_id

Can anyone advise how to handle this? The media blob file only needs to be stored temporarily and gets deleted after the API call is made. I found some info suggesting the file could be broken up and sent in chunks, but that sounds like a project in itself.

There’s a technique of using client-writable views that’s supposed to allow large file uploads. Here’s a clone link that demonstrates the technique: Anvil | Login

Basically you’re not passing the file via a server function; you’re adding it through the client-writable view.
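
The gist of it looks roughly like this (a minimal sketch, assuming a Data Table called `uploads` with a Media column `file`; the table and function names are just placeholders, and the clone link has a complete working version):

# Server module
import anvil.server
from anvil.tables import app_tables

@anvil.server.callable
def get_upload_view():
    # Hand the client a client-writable view of the table; rows added to this
    # view go straight into the Data Table without passing through a server call.
    return app_tables.uploads.client_writable()

@anvil.server.callable
def transcribe(row_id):
    # The client sends only the row id, not the file itself
    row = app_tables.uploads.get_by_id(row_id)
    return anvil.server.launch_background_task('transcribe_audio', row['file'])

# Client code, with audio_input being the recording's Media object
view = anvil.server.call('get_upload_view')
row = view.add_row(file=audio_input)          # the upload happens here
get_transcript = anvil.server.call_s('transcribe', row.get_id())

The difference from your current code is that the Media object itself never goes through anvil.server.call; only a short row id does.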


Thank you for the quick reply. I did see your post about it already and will give it a try. Although, for a beginner programmer, this method does look somewhat complex and a bit of a hack to implement what is pretty basic functionality for working with files. Do you know if there is maybe a way to use background tasks for this, or to increase the timeout on passing the files?

Background tasks are for functions that take longer than 30 seconds to run on the server itself. Background tasks do not communicate directly with the client, so they cannot be passed files except through the server function that starts the background task.

So no, background tasks are not useful for getting around the file upload timeout.

The client-writable view is a bit of a hack, but it’s not really a complex hack.


After implementing this technique, I am able to load bigger files than before, but anything above 150 MB still gives an error.


Is there anything you can suggest that I could add to the code to run the upload in the “background” so that it doesn’t freeze up the UI?

That error is your web browser warning you that the page is taking a long time to complete its task. If you were to click Wait every time that came up, the upload should complete.

As far as I know, to avoid that, you need to let the Javascript event loop get control now and then, which apparently isn’t happening while the client-writable view is transferring the file.

You might be able to spin the upload off into a background thread in Javascript, or chunk the file and upload individual chunks. I have zero experience with any of that, though, so can’t offer advice on the best way to go about it.

If you’re using some sort of AI-based transcription API, they might have Javascript libraries available too, which would allow you to pass the file to them in the client and not need to involve the server in the file transfer.

Thanks. I’m new to programming, but my understanding is that any option to call the API from the client has the potential to expose the API key, which is a no-no in my case. If there are options to call an API from the client side in Anvil using JavaScript without exposing the key, I would love to know more.
For now, I will try to go down the route of chunking the file and sending it over to the server side in chunks.
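
Roughly what I have in mind is something like this (completely untested, and the table name, column names, and 4 MB chunk size are just placeholders):

# Client code: send the recording to the server in ~4 MB pieces
import anvil
import anvil.server

def upload_in_chunks(media, chunk_size=4 * 1024 * 1024):
    data = media.get_bytes()
    upload_id = anvil.server.call('start_upload')
    for seq, start in enumerate(range(0, len(data), chunk_size)):
        piece = anvil.BlobMedia('application/octet-stream', data[start:start + chunk_size])
        anvil.server.call_s('upload_chunk', upload_id, seq, piece)
    anvil.server.call('finish_upload', upload_id)  # server reassembles and starts the transcription
    return upload_id

# Server code, assuming a Data Table 'chunks' with columns upload_id (text), seq (number), data (media)
import uuid
import anvil
import anvil.server
from anvil.tables import app_tables

@anvil.server.callable
def start_upload():
    return str(uuid.uuid4())

@anvil.server.callable
def upload_chunk(upload_id, seq, piece):
    app_tables.chunks.add_row(upload_id=upload_id, seq=seq, data=piece)

@anvil.server.callable
def finish_upload(upload_id):
    rows = sorted(app_tables.chunks.search(upload_id=upload_id), key=lambda r: r['seq'])
    audio = b''.join(r['data'].get_bytes() for r in rows)
    for r in rows:
        r.delete()
    # adjust the content type to match the actual recording format
    return anvil.server.launch_background_task(
        'transcribe_audio', anvil.BlobMedia('audio/webm', audio, name='recording.webm'))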

Very often APIs like that will have a way to request a session API key that’s good for one session only. You’d request that from an Anvil server function and send it back to the client, and the client would then use that key with their Javascript library. Exposing that key doesn’t matter, since it’s time-limited.

I don’t know if the service you’re using does that or not.
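
If it does, the Anvil side of the pattern would look roughly like this (the endpoint URL, request parameters, and response field here are entirely made up; check the provider’s docs for the real ones):

# Server code: exchange the permanent secret for a short-lived token
import anvil.server
import anvil.secrets
import requests

@anvil.server.callable
def get_temporary_token():
    # The permanent key stays on the server, stored in Anvil's App Secrets
    api_key = anvil.secrets.get_secret('transcription_api_key')
    resp = requests.post('https://api.example-transcriber.com/v1/token',  # hypothetical endpoint
                         headers={'Authorization': 'Bearer ' + api_key},
                         json={'expires_in': 600})                        # hypothetical parameter
    return resp.json()['token']                                           # hypothetical response field

The client calls get_temporary_token and hands the result to the provider’s Javascript library; the real secret never reaches the browser.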

Other options for file uploads include Google Drive, which has a built-in integration with Anvil, AWS S3 buckets, or similar cloud services. Again, Anvil’s built-in file handling has always been fine for my needs, so I can’t comment on those, but there have been forum threads about each over the years.
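
The general shape of the S3 approach, as I understand it from those threads (untested by me; the bucket name, region, and secret names are placeholders), is to have a server function generate a short-lived presigned URL and let the client upload straight to the bucket:

# Server code, assuming boto3 is available in the server environment
import anvil.server
import anvil.secrets
import boto3

@anvil.server.callable
def get_presigned_upload_url(filename):
    s3 = boto3.client('s3',
                      aws_access_key_id=anvil.secrets.get_secret('aws_access_key_id'),
                      aws_secret_access_key=anvil.secrets.get_secret('aws_secret_access_key'),
                      region_name='us-east-1')                              # placeholder region
    return s3.generate_presigned_url('put_object',
                                     Params={'Bucket': 'my-upload-bucket',  # placeholder bucket
                                             'Key': filename},
                                     ExpiresIn=600)

The client would then PUT the file to that URL (e.g. with anvil.http.request), and the server or a background task would pull it back out of the bucket for the transcription call.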