Combining multiple tables to be shown in a table

Hi there, this is my first post here, so have mercy! I ran into a problem, and I don't know whether the solution I've reached could be considered a best practice or not.

Disclaimer: English is not my native language, so sorry for any silly mistakes…

The problem: combine data from three tables with different columns to populate a single repeating panel. More explicitly:

  • Having three tables (T1, T2 and T3) with columns:
  1. A, B, C, D for T1
  2. A, B, C for T2 and
  3. A, B, C, E for T3
  • I need to generate a table like:
          A     |    B    |    C    |
row 1   ...
row 2   ...
...

The Question: Is my solution in the end a best practice? (You can jump to Attempt 3 if you want to see the result)

Attempt 1 - First error
Following the usual logic for populating a repeating panel, I did the following:

server side

@anvil.server.callable
def compile_tables():
  table_t1 = app_tables.table_one.search(something)
  table_t2 = app_tables.table_two.search(something)
  table_t3 = app_tables.table_three.search(something)
  return [table_t1, table_t2, table_t3]

client side

tables = anvil.server.call('compile_tables')
self.repeating_elements.items = tables

This won't work: it simply populates three empty rows, because each item is a whole search result rather than a single row. Populating the panel table by table also seemed worthless and inefficient.

Attempt 2: Works but really slow
server side

@anvil.server.callable
def compile_tables():
  table_t1 = app_tables.table_one.search(something)
  table_t2 = app_tables.table_two.search(something)
  table_t3 = app_tables.table_three.search(something)
  data = {'table_t1': table_t1,
          'table_t2': table_t2,
          'table_t3': table_t3}
  return data

client side

tables = anvil.server.call('compile_tables')
# Populating the repeating panel
self.repeating_elements.items = []
for k, v in tables.items():
  if v:
    self.repeating_elements.items += v

This works but, as I said, it was slow (maybe because my connection is a bit slow here in Argentina during these COVID days due to the quarantine).

Attempt 3 - Final result
server side

import itertools

@anvil.server.callable
def compile_tables():
  table_t1 = app_tables.table_one.search(something)
  table_t2 = app_tables.table_two.search(something)
  table_t3 = app_tables.table_three.search(something)
  combine_1 = itertools.chain(table_t1, table_t2)
  combine_2 = itertools.chain(combine_1, table_t3)  # Maybe I could've combined these
                                                    # in one line, but the idea is the same
  result = [x for x in combine_2]  # Returning combine_2 raises a serialization error
  return result

client side

tables = anvil.server.call('compile_tables')
# Populating the repeating panel
self.repeating_elements.items = tables

This was about 5 seconds faster than the previous attempt and much cleaner on the client side.
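
For reference, the one-line version hinted at in the comment above would look something like this:

# itertools.chain accepts any number of iterables, so both chain calls
# collapse into one, and list() replaces the comprehension
result = list(itertools.chain(table_t1, table_t2, table_t3))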

PS… If this isn't clear enough, I'll try to write a small app to show it. Thanks anyway for taking the time to read this!

hi @federivadeneira and welcome to the forum,

looks good to me.

you may also find this useful.

worth noting the difference between sending a list of table rows (version 3) vs a search iterator (version 2):
when you send the search iterator, anvil will cache the first 100 values or so and make a new server call every 100 iterations. This can be slow if you're loading things into a repeating panel.
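
a minimal sketch of the two return styles (the function names here are just examples):

@anvil.server.callable
def get_rows_lazy():
  # returns a SearchIterator - rows travel to the client lazily in batches of ~100,
  # so iterating over them there can trigger extra round trips
  return app_tables.table_one.search()

@anvil.server.callable
def get_rows_eager():
  # returns a plain list of rows - everything is sent in one go
  return list(app_tables.table_one.search())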

Other musings:
if you have a table row with a linked row that is used to populate the data table, this can be slow because anvil will need to look up the linked row's data on the server.

Sometimes sending a list of dicts rather than a list of table rows can be quicker, especially if you have a simple object column. Loading a simple object from a table row on the client creates a server call; converting the table row to a dict before sending it to the client mitigates this.
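
a rough sketch of that conversion (the 'owner' linked-row column and the 'settings' simple object column are just examples):

@anvil.server.callable
def get_rows_as_dicts():
  result = []
  for row in app_tables.table_one.search():
    result.append({
      'A': row['A'],
      'B': row['B'],
      # pull only what you need out of the linked row while still on the server
      'owner_name': row['owner']['name'] if row['owner'] is not None else None,
      # a simple object column is just a plain Python value on the server
      'settings': row['settings'],
    })
  return result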


Thanks @stucork for the detailed answer!
So:

Are you suggesting something like this?

server side

all_queries = []
query_table_1 = app_tables.table_1.search(sth)
query_table_2 = app_tables.table_2.search(sth)
query_table_3 = app_tables.table_3.search(sth)
for query in [query_table_1, query_table_2, query_table_3]:
  for row in query:
    all_queries.append({'A': row['A'],
                        'B': row['B'],
                        # ... other columns
                       })

I'd surely be sending less data through the server, and there'd be no linked rows there… But I'd lose something that's usually really practical: deleting or updating the row through a simple

self.item.delete()     # or
self.item.update(**kwargs)

I think I ran into this limitation before, when I was starting to use Anvil. Now I've found this workaround, which I'll copy here because someone might find it useful:

  1. Add the id to the row item returned from the DB (modifying the server function copied above):
for query in [query_table_1, query_table_2, query_table_3]:
  for row in query:
    all_queries.append({'A': row['A'],
                        'B': row['B'],
                        'id': row.get_id(),
                        # ... other columns
                       })
  2. Add a new server function that looks explicitly in the target tables to fetch the row to be updated/deleted later by the client.

server side

@anvil.server.callable
def get_row_element(row_id):
  table_1_query = app_tables.table_1.get_by_id(row_id)
  if table_1_query:
    return table_1_query, 'table_1'  # this str is just to know which
                                     # table returned the row

  table_2_query = app_tables.table_2.get_by_id(row_id)
  if table_2_query:
    return table_2_query, 'table_2'

And later, from the client side, trigger it in a click event, for example from a link/button in a RowTemplate:
client side

def delete_button_click(self, **event_args):
  this_row, table_name = anvil.server.call('get_row_element', self.item['id'])
  this_row.delete()
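
An update would work the same way; a quick sketch (the column name and text box here are just placeholders):

def save_button_click(self, **event_args):
  this_row, table_name = anvil.server.call('get_row_element', self.item['id'])
  this_row.update(A=self.text_box_a.text)  # hypothetical column and component names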

Thanks again @stucork! I've reduced the time by about 20%!


or something like…


def row_as_dict(row):
  d = dict(row)
  d['id'] = row.get_id()
  return d

tables = ('table_1', 'table_2', 'table_3')
list_of_dicts = []
for table in tables:
  rows = getattr(app_tables, table).search(sth)
  list_of_dicts += [row_as_dict(row) for row in rows]