Hey Jay,
I was thinking more about the discussions around what people have found or done. As an example:
I have code that works fine when I write it, push it, and test it. But if I hop onto a video meeting, it fails with a timeout error.
I’m using this approach so I only have to do the transformations once and cache the result: Simple Caching approach
And this is an example of a function that fails during meetings. It’s just more data than expected.
import json

import anvil.server
from anvil.tables import app_tables

import global_funcs  # the caching helpers from the linked approach


@anvil.server.callable
def single_search_server_get_data(force_rebuild=False):
    # Serve the cached JSON unless a rebuild is forced
    c_data = global_funcs.cache_get('single_search_server_get_data')
    if c_data and not force_rebuild:
        return json.loads(c_data)

    # Otherwise rebuild: pull every row and strip the keys we don't need
    overview_data = app_tables.imported_view_overview.search()
    out = []
    keys_to_remove = [
        'example1',
        'example2',
    ]
    for item in overview_data:
        datum = dict(item)
        for key in keys_to_remove:
            del datum[key]
        out.append(datum)

    # Cache the serialized result for next time
    c_data = json.dumps(out)
    global_funcs.cache_set('single_search_server_get_data', c_data)
    return out
I’m using this with an Anvil Extras pivot: GitHub - anvilistas/anvil-extras
This is probably one of the biggest datasets I’m working with. The JSON is an array of objects; the array has 5,691 objects, and each object has 30 keys. Some values are small numbers, but others are names of organizations (which can be around 70 characters).
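For a rough sense of scale, here’s a back-of-envelope sketch I could run server-side. It just measures the serialized size of the cached JSON rather than guessing; the 1 Mbps figure in the comment is a hypothetical connection speed, not a measurement:

import json

def report_payload_size(out):
    # Measure the actual serialized size of the cached payload
    c_data = json.dumps(out)
    size_bytes = len(c_data.encode('utf-8'))
    print(f"{len(out)} objects -> {size_bytes / 1_000_000:.1f} MB serialized")
    # Hypothetical example: on an effective 1 Mbps link, transfer alone
    # takes size_bytes * 8 / 1_000_000 seconds, before any server run time.
    print(f"~{size_bytes * 8 / 1_000_000:.0f} s at 1 Mbps")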
The data will be updated via an uplink connection.
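For reference, the update side is a minimal Uplink script along these lines (YOUR_UPLINK_KEY and update_overview_rows are placeholders, not my actual code):

import anvil.server

anvil.server.connect("YOUR_UPLINK_KEY")  # placeholder key

def update_overview_rows():
    # ... write the new rows into imported_view_overview here ...
    pass

update_overview_rows()
# Force the cache to rebuild so the next client call gets fresh data:
anvil.server.call("single_search_server_get_data", force_rebuild=True)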
And while I can look at optimizing the function and the data, my understanding is that transfer time is a factor, not just the time it takes to run the function. If someone is on a slow enough connection, they’ll still have issues with this approach.
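One way I could check that split (a sketch, run from a full-CPython Uplink session; the key is again a placeholder): time the round trip for the call, and compare it against the server-side run time from the logs. The gap approximates serialization plus transfer overhead.

import time
import anvil.server

anvil.server.connect("YOUR_UPLINK_KEY")  # placeholder key

t0 = time.perf_counter()
data = anvil.server.call("single_search_server_get_data")
round_trip = time.perf_counter() - t0
print(f"round trip: {round_trip:.2f} s for {len(data)} rows")
# round_trip minus the server function's own run time is roughly
# the serialization + transfer cost the client is paying.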