Fast way to query a Table

I have a relatively large (but very sparse) table (1000 rows, 30 columns) and would like to get all data in a dictionary.
If I do it column by column I get a timeout error; it takes a few seconds per column. Does anyone know a faster way to get the data?

  columns = app_tables.survey.list_columns()
  survey_data = {}
  for column in columns:
    program_name = column['name']
    program_data = [r[program_name] for r in app_tables.survey.search()]
    survey_data[program_name] = program_data
  
  return survey_data

In theory I could try to reduce the sparseness of the table, but I would rather stick with this many-column design if I don't have to change it (since it is easy to download and visualize in Excel).

Ok, so I found a "bug": in hindsight it is obvious that I should NOT run `.search()` on each iteration — that's what was time-consuming. This version runs much faster:

  columns = app_tables.survey.list_columns()
  table = app_tables.survey.search()
  survey_data = {column['name']: [] for column in columns}
  for row in table:
    for column in columns:
      column_name = column['name']
      survey_data[column_name].append(row[column_name])
  
  return survey_data
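The win in the second version is that `search()` is called once and the rows are walked in a single pass. Outside Anvil the same pattern can be sketched with plain dicts standing in for rows (the `rows_to_columns` helper and sample data below are illustrative, not part of the original code):

```python
def rows_to_columns(rows, column_names):
    """Build a dict-of-lists from an iterable of row mappings in one pass."""
    data = {name: [] for name in column_names}
    for row in rows:
        for name in column_names:
            # .get() so sparse cells come back as None instead of raising
            data[name].append(row.get(name))
    return data

rows = [
    {"program": "A", "score": 3},
    {"program": "B"},  # sparse row: no "score" cell
]
print(rows_to_columns(rows, ["program", "score"]))
# {'program': ['A', 'B'], 'score': [3, None]}
```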

This forum post might be helpful.

Using pandas with Anvil

You might also consider going to a list of dicts rather than a dict of lists:

my_table = list(dict(list(row)) for row in app_tables.table.search())

You could also then switch to a dict of lists from there; Stack Overflow will have some suggestions for 'from list of dicts to dict of lists'.
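For that conversion, a minimal sketch (the sample `rows` are invented, and it assumes every dict has the same keys — pad sparse rows first if they don't):

```python
rows = [{"a": 1, "b": 2}, {"a": 3, "b": 4}]

# list of dicts -> dict of lists (column names taken from the first row)
columns = {key: [row[key] for row in rows] for key in rows[0]}
# {'a': [1, 3], 'b': [2, 4]}

# dict of lists -> list of dicts (round-trips back to the original shape)
back = [dict(zip(columns, values)) for values in zip(*columns.values())]
```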

Just tried that approach (thanks), but it was roughly 20X slower than the one above. Worth a try though, so thanks.

That’s interesting. Is it client side or server side code?