Batching of Server Calls

I created a simple package that allows batching of server calls. If you want to avoid making multiple server calls in your app, but also avoid writing a single complicated server function, this package will do the job for you.

There are three ways of using it:

1. ‘with’ block with .value

import anvil.server
from Server_Batching import Server_Batching

with Server_Batching.BatchServerCall():
    call_1 = anvil.server.call('test1')  # use server calls normally
    call_2 = anvil.server.call('test2')

print(call_1.value)  # the return value of the server call is accessible via .value
print(call_2.value)

In this example, a single server call will execute both test1 and test2. The return values of test1 and test2 are accessible via .value after the with block. If you try to access .value inside the with block itself, it will return None.
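To make that concrete, here is a minimal sketch using the same ‘test1’ function as above:

import anvil.server
from Server_Batching import Server_Batching

with Server_Batching.BatchServerCall():
    call_1 = anvil.server.call('test1')
    print(call_1.value)  # prints None - the batch has not executed yet

print(call_1.value)  # prints the actual return value of 'test1'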

2. ‘with’ block with callback

import anvil.server
from Server_Batching import Server_Batching

def handle_test_1(value):
    print(value)

def handle_test_2(value):
    print(value)

with Server_Batching.BatchServerCall():
    anvil.server.call('test1').on_complete(handle_test_1)
    anvil.server.call('test2').on_complete(handle_test_2)
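As with .value in example 1, it is safest to assume that the handlers only fire once the with block exits and the batch executes; don't rely on them running inside the block. A minimal sketch of that assumption:

import anvil.server
from Server_Batching import Server_Batching

with Server_Batching.BatchServerCall():
    anvil.server.call('test1').on_complete(lambda v: print(v))
    # Assumption: nothing has been printed yet at this point

# By here the batch has run and the lambda has been called with the result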

3. Global Batching with on_complete

For more complex app structures, you can use a global batching system. This gives you complete control over when to start batching and when to execute the batch, which is useful if you are embedding multiple forms that make server calls of their own.

For example (assuming that SubForm1 and SubForm2 have their own server calls in form_show):

from ._anvil_designer import Form1Template
from ..SubForm1 import SubForm1
from ..SubForm2 import SubForm2
from Server_Batching import Server_Batching

class Form1(Form1Template):
    def __init__(self, **properties):
        self.init_components(**properties)
        Server_Batching.start_global_batching()

        self.add_component(SubForm1())
        self.add_component(SubForm2())

    def form_show(self, **event_args):
        # 'show' event handler (bound in the designer); all queued calls run here
        Server_Batching.execute_global_batch()

Until you call execute_global_batch, any server calls made anywhere in the app (after start_global_batching) will be queued. With global batching, it is usually best to use the on_complete callback, since the code that queues a call may run long before the batch executes.
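For illustration, SubForm1 might look something like this. This is a hypothetical sketch: ‘get_subform1_data’ and self.label_1 are made-up names, and it uses the callback pattern from example 2.

from ._anvil_designer import SubForm1Template
import anvil.server

class SubForm1(SubForm1Template):
    def __init__(self, **properties):
        self.init_components(**properties)

    def form_show(self, **event_args):
        # Queued by the global batch; runs when Form1 calls execute_global_batch()
        anvil.server.call('get_subform1_data').on_complete(self.show_data)

    def show_data(self, value):
        self.label_1.text = value  # hypothetical Label component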

Note: I haven’t tested all the functionality of this yet, since it was made in a hurry for my own use case. If anything comes up, please let me know.

8 Likes

Thanks for this interesting approach to reducing immediate server calls. As I am working on optimising server calls (‘as few as possible’) for a larger app, I have a few questions to better understand how it works:

  1. What happens if one of the batched calls raises an exception?
  2. What if clients queue hundreds of calls? Could that overwhelm the server or hit other limits?
  3. Have you compared ‘no batching’ vs. ‘batching’ to highlight the performance gains?

Thanks again for this example!
3 Likes

Looking only at these forum messages, it looks like each client instance (an individual browser tab) has its own queue, so getting to “hundreds” per queue would likely be a bug on the developer’s part.

1 Like
  1. Good catch. Right now, the exception won’t be handled, so all of the calls will just fail. I suppose the better way would be to return None as the value for the call that raised the error and print a warning to the console?

  2. Usually, the only concern I would worry about is the 30-second timeout issue. It shouldn’t cause any severe burden otherwise.

  3. I’ll do proper testing in a while and share the results, although you should see performance gains roughly proportional to the number of server calls batched into each round trip.

2 Likes

Updated the dependency. A single failed call will no longer affect the rest of the calls: the value of the failed call will be None, and a warning will be printed to the console.
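With that behaviour, client code can guard against a failed call by checking for None. A minimal sketch (note the caveat: a server function that legitimately returns None is indistinguishable from a failed call):

import anvil.server
from Server_Batching import Server_Batching

with Server_Batching.BatchServerCall():
    call_1 = anvil.server.call('test1')

if call_1.value is None:
    # 'test1' raised an exception (a warning was printed to the console),
    # or it genuinely returned None
    print("test1 did not return a result")
else:
    print(call_1.value)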

1 Like

@divyeshlakhotia: Cool! Thank you for this extension, the “if not callable” check, and the try/except block.
Recently I encountered the ‘30 second timeout issue’ for the first time (it could be solved by creating a ‘background task’).
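For anyone who hits the same limit: the standard Anvil pattern is to move the slow work into a background task on the server, roughly like this (‘slow_work’ is a made-up function name):

import anvil.server

@anvil.server.background_task
def slow_work():
    ...  # long-running work here; not subject to the 30-second call timeout

@anvil.server.callable
def start_slow_work():
    # Returns to the client quickly; the task keeps running on the server
    return anvil.server.launch_background_task('slow_work')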

@p.colbert: You got me. :sweat_smile:
I didn’t mean to exaggerate; I was just thinking of possible stumbling blocks, but I see your point. Thx!

1 Like