Adding multiple rows to a table in one call

Hello,

What I’m trying to do:
I am performing an operation that can add anywhere from 1 to 2000 rows to a table. I would like to add all the needed rows in a single call (or a few calls) to the database/data table.

What I’ve tried and what’s not working:
I used a for loop to create one row at a time, but in SQL I would normally do this in one call, or fewer than ten. Is it possible to batch-create rows, where I provide something like add_rows() with an array of dicts?

I have accelerated tables turned on.

Yes, by using add_rows with an array of dicts :upside_down_face:
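A minimal sketch of what that looks like (the table name `my_table` and the columns `name` and `qty` are made up — adapt them to your schema):

```python
# Build the rows as plain dicts, one dict per table row (hypothetical columns).
rows = [{"name": f"item {i}", "qty": i} for i in range(2000)]

def bulk_add(records):
    # In an Anvil server module with Accelerated Tables enabled,
    # add_rows() inserts the whole batch in one call.
    from anvil.tables import app_tables  # requires the Anvil runtime
    app_tables.my_table.add_rows(records)
```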


I’m not sure what I’m doing wrong but add_rows() and all other accelerated table methods do not show up in auto-complete for me nor do they work if I put them in anyway. I enabled accelerated tables in my app’s settings.

I was wondering if it was possible without Accelerated Tables but it seems like it’s not.
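For what it's worth, even without Accelerated Tables you can cut the round trips to one from the client's point of view, by looping add_row() inside a single server function (a sketch — the function and table names are illustrative):

```python
# Sketch of a server-side loop: the client makes one anvil.server.call(),
# and the per-row inserts all happen on the server.
def add_many(records):
    # In a real app this function would be decorated with @anvil.server.callable.
    from anvil.tables import app_tables  # requires the Anvil runtime
    for record in records:
        app_tables.my_table.add_row(**record)
```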

These are the docs that I looked at when I tried to get it to work with Accelerated Tables:
Anvil Docs | Accelerated Tables Beta → Batch add, update and delete rows

If you used the method with arguments of the type @stefano.menci mentioned, then we need a code snippet or clone to help further.


This is an HTTP endpoint that I often use to upload tables from Excel VBA (it could be much simpler; most of the code is there to make sure the shapes of the incoming and destination tables match):

import json

import anvil.server
import anvil.tables as tables
from anvil.tables import app_tables


@anvil.server.http_endpoint('/update_table/:table_name/:delete_content', methods=['POST'])
def update_table(table_name, delete_content=False):
    print_debug('update_table', 'Call')

    json_data = json.loads(anvil.server.request.body.get_bytes())

    column_headers = [header.lower().replace(' ', '_') for header in json_data['headers']]
    rows = json_data['rows']

    table = getattr(app_tables, table_name)

    columns_only_on_the_table = set(c['name'] for c in table.list_columns()) - set(column_headers)
    columns_only_on_the_input = set(column_headers) - set(c['name'] for c in table.list_columns())
    error = ''
    if columns_only_on_the_table:
        error += f'ERROR: The following columns have not been provided: {", ".join(columns_only_on_the_table)}\n'
    if columns_only_on_the_input:
        error += f'ERROR: The following columns do not exist in "{table_name}": {", ".join(columns_only_on_the_input)}\n'
    if error:
        return error

    with tables.Transaction() as tx:
        if delete_content:
            table.delete_all_rows()

        columns = table.list_columns()

        rows2 = []
        for row in rows:
            row2 = {}
            rows2.append(row2)
            for column in columns:
                column_name = column['name']
                column_type = column['type']
                value = row[column_headers.index(column_name)]
                if value is not None:
                    try:
                        if column_type == 'bool':
                            row2[column_name] = bool(value)
                        elif column_type == 'string':
                            row2[column_name] = str(value)
                        elif column_type == 'number':
                            # float_or_int() is a small helper (not shown) that parses numbers
                            row2[column_name] = float_or_int(value)
                        elif column_type == 'simpleObject':
                            row2[column_name] = json.loads(value)
                        else:
                            tx.abort()
                            return f'ERROR: Unexpected column type: "{column_type}"'
                    except Exception as e:
                        tx.abort()
                        return f'ERROR in column "{column_name}"\nError: {e}\nRow: {row}'
        
        table.add_rows(rows2)

    return f'Added {len(rows)} rows'

The Excel VBA macro will either upload the full table in one shot, or if it’s very large, upload it in chunks, asking to delete the old table only after the first chunk.
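The chunking logic itself is simple; sketched in Python terms (chunk size and payload shape here are illustrative, not what the VBA macro literally sends):

```python
def chunks(rows, size):
    # Yield successive slices of `rows`, each at most `size` rows long.
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# The first chunk asks the endpoint to delete the old content;
# later chunks just append.
payloads = [
    {"delete_content": i == 0, "rows": chunk}
    for i, chunk in enumerate(chunks(list(range(2500)), 1000))
]
```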

I created a new blank app, enabled Accelerated Tables, and it works there.
Thank you. add_rows() is working fine in the new app.