I’ve noticed that launching a background task in the server code is quite slow. What I mean is simply launching the background task, not waiting for it to finish.
So I’m wondering whether there is a bug that makes the launch call wait for the background task to execute.
This is my code for the background task:
@anvil.server.background_task
def add_or_update_user_row(user_row, user_data):
    if user_row is None:
        app_tables.users.add_row(**user_data)
    else:
        user_row.update(**user_data)
And this is how I launch it (from a function in the same server module):
print(f"Saving to database START: {datetime.now().time()}")
anvil.server.launch_background_task('add_or_update_user_row', user_row=user_row, user_data=user_data)
print(f"Saving to database END: {datetime.now().time()}")
Simply launching the background task is taking more than 10 seconds in some cases, and as a result I often run into server timeouts.
For example, here are the logs from two calls to the server function:
Saving to database START: 15:23:15.000270
Saving to database END: 15:23:27.000532
Saving to database START: 15:24:49.000612
Saving to database END: 15:24:58.000503
Is anvil.server.launch_background_task working correctly, is this a potential bug, or am I doing something wrong?
A background task is just another instance of your server-side code, spun up independently. The whole server-side program has to start, including any unconditional imports and anything else it does when launched. Only after it has finished initializing does Anvil call the specific function you asked for.
So, how long does your server-side app take to start and stop on its own, i.e., not as a background task? To find out, you might add the same print calls inside add_or_update_user_row. (The output will appear in the background task’s log.)
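For example, something like this (a sketch; it assumes datetime is already imported in your server module):

@anvil.server.background_task
def add_or_update_user_row(user_row, user_data):
    # By the time this first line runs, the background task's server instance
    # has already finished importing and starting up.
    print(f"Background task START: {datetime.now().time()}")
    if user_row is None:
        app_tables.users.add_row(**user_data)
    else:
        user_row.update(**user_data)
    print(f"Background task END: {datetime.now().time()}")

Comparing the launch-side timestamps with these will tell you how much of the delay is spin-up time versus the launch call itself.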
So, it actually had nothing to do with the imports or the time it takes to spin up an instance of the server code. Spinning up an instance took ~50-100 ms in my case, but launching the background task was still slow.
The issue was the size of the arguments (up to a few MB) being passed to the background task, which I was able to address with some compression.
So I’m leaving this as a note for anyone who might stumble across the same issue in the future: if your background tasks are slow to launch, the reason might be the size of your arguments.
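For illustration, the idea was roughly this (a sketch, not my exact code; it assumes user_data is JSON-serializable, and the compressed_user_data parameter name is just for the example):

import anvil.server
from anvil.tables import app_tables
import base64
import json
import zlib


def compress_payload(data):
    # dict -> JSON -> zlib -> base64 string, so the launch call only ships a short string
    return base64.b64encode(zlib.compress(json.dumps(data).encode("utf-8"))).decode("ascii")


def decompress_payload(blob):
    # Reverse of compress_payload
    return json.loads(zlib.decompress(base64.b64decode(blob)).decode("utf-8"))


@anvil.server.background_task
def add_or_update_user_row(user_row, compressed_user_data):
    user_data = decompress_payload(compressed_user_data)
    if user_row is None:
        app_tables.users.add_row(**user_data)
    else:
        user_row.update(**user_data)


# At the launch site, compress before handing the data to the task:
anvil.server.launch_background_task(
    'add_or_update_user_row',
    user_row=user_row,
    compressed_user_data=compress_payload(user_data),
)

How much this helps depends on how compressible your data is; base64 adds roughly a third back on top of the compressed size, so it only pays off when the compression ratio is good.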