I have a relatively large but very sparse table (1,000 rows, 30 columns) and would like to get all of its data into a dictionary.
If I fetch it column by column I get a timeout error; each column takes a few seconds. Does anyone know a faster way to get the data?
columns = app_tables.survey.list_columns()
survey_data = {}
for column in columns:
    program_name = column['name']
    program_data = [r[program_name] for r in app_tables.survey.search()]
    survey_data[program_name] = program_data
return survey_data
In theory I could try to reduce the sparseness of the table, but I'd rather stick with this many-column design if I don't have to change it, since it is easy to download and visualize in Excel.
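For context, once the data is in a dictionary of columns, getting it into a spreadsheet is short with pandas. This is just a sketch assuming pandas (and openpyxl for the .xlsx writer) is available in your server environment; the file name is only an example:

import pandas as pd

# survey_data maps column name -> list of values, one entry per row,
# which is exactly the shape pandas.DataFrame expects
df = pd.DataFrame(survey_data)
df.to_excel("survey.xlsx", index=False)  # or df.to_csv("survey.csv", index=False)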
Ok, so I found a "bug": obviously I should NOT run .search() on every iteration; re-querying the table once per column is what was so time-consuming. Running the search once and iterating the rows is much faster:
columns = app_tables.survey.list_columns()
rows = app_tables.survey.search()  # one search over the whole table
survey_data = {column['name']: [] for column in columns}
for row in rows:
    for column in columns:
        column_name = column['name']
        survey_data[column_name].append(row[column_name])
return survey_data
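If anyone wants to shave this down further, converting each row to a plain dict up front avoids one table-row lookup per cell. This is a sketch assuming Anvil's Row objects support dict(row) conversion; using .get() keeps empty cells in the sparse table as None:

survey_data = {column['name']: [] for column in app_tables.survey.list_columns()}
for row in app_tables.survey.search():
    row_dict = dict(row)  # assumption: Row objects convert to a plain dict
    for name, values in survey_data.items():
        values.append(row_dict.get(name))  # sparse/empty cells come back as None
return survey_data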