Insert rows slow?

Hi,

I tested the speed of an Anvil app by inserting 20,000 rows, and it takes more than 30 seconds to process, on the Individual plan as well as on the open source server.

It seems that a background task should be used, but isn't this too slow compared to PHP, which takes about 2 seconds for the same job?

What is your view?
Thank you

Did you wrap those inserts within a transaction?

I just use the standard

add_row

Is that the transaction you mentioned?
https://anvil.works/docs/data-tables/transactions#transactions

If you don’t wrap the inserts, the server will make 20,000 separate database calls - very slow!

So yes, that link is for the correct docs which describe how to use a transaction so the inserts are done in a single database call.
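The effect of batching can be illustrated outside Anvil with the standard-library `sqlite3` module (this is a hypothetical stand-in, not Anvil's API): doing all the inserts inside one transaction means a single commit at the end, instead of per-statement overhead for each of the 20,000 rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (status TEXT)")

# One transaction for all rows: the connection context manager
# opens a transaction and commits once on exit.
with conn:
    for _ in range(20000):
        conn.execute("INSERT INTO test (status) VALUES (?)", ("20000",))

count = conn.execute("SELECT COUNT(*) FROM test").fetchone()[0]
print(count)  # 20000
```

The same idea applies to Anvil's data tables: `@anvil.tables.in_transaction` groups the row operations so they are committed together rather than one call at a time.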

Thanks @owen.campbell, it makes sense. However, it still doesn’t work

def button_1_click(self, **event_args):
    """This method is called when the button is clicked"""
    self.timer_1.interval = 0.01
    self.label_2.text = anvil.server.call('add')
    self.timer_1.interval = 0

import anvil.server
import anvil.tables
from anvil.tables import app_tables

@anvil.server.callable
@anvil.tables.in_transaction
def add():
  for x in range(20000):
    app_tables.test.add_row(status="20000")
  return len(app_tables.test.search(status="20000"))

In what way does it not work? What error do you get?

(I think you may have the two decorators in the wrong order. Check the docs)

It still takes more than 30 seconds to complete, even though I followed the docs:

Make sure to specify @anvil.tables.in_transaction after @anvil.server.callable. 

# Server code only
import anvil.server
import anvil.tables
from anvil.tables import app_tables

@anvil.server.callable
@anvil.tables.in_transaction
def do_update():
  jane_smith = app_tables.people.get(Name="Jane Smith")
  jane_smith['age'] = 21
  app_tables.notes.add_row(Person=jane_smith,
                           Text="She's old enough to fly an aeroplane.")
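The decorator order matters because Python applies decorators bottom-up: `@anvil.tables.in_transaction` wraps the function first, and then `@anvil.server.callable` registers that already-wrapped function, so every remote call runs inside a transaction. A minimal sketch with hypothetical stand-in decorators (plain Python, not Anvil's implementation) shows the mechanism:

```python
def callable_marker(f):
    # Stand-in for @anvil.server.callable: "registers" the function.
    f.is_callable = True
    return f

def in_transaction(f):
    # Stand-in for @anvil.tables.in_transaction: wraps the call.
    def wrapper(*args, **kwargs):
        # A real implementation would open a transaction here
        # and commit (or roll back) afterwards.
        return f(*args, **kwargs)
    wrapper.__name__ = f.__name__
    return wrapper

@callable_marker     # applied second: registers the wrapped function
@in_transaction      # applied first: wraps the body in a "transaction"
def do_update():
    return "done"

print(do_update())            # done
print(do_update.is_callable)  # True
```

If the order were reversed, the transaction wrapper would wrap the registration marker instead, which is why the docs insist on `@anvil.server.callable` first.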

Anvil bulk inserts are slow.
Wrapping the function in a transaction helps speed it up a little, but thousands of Python objects are still being created and thousands of SQL statements executed.

Depending on your requirements, you can:

  • Split the insert into chunks of 1,000 rows. It will still take longer than 30 seconds, but it will work. However, you won't be able to wrap the whole operation in one transaction; you will have one transaction per 1,000-row chunk
  • Use SQL and build the SQL string yourself, bypassing the slow ORM. This is the fastest option, but you need a dedicated plan
  • Use an Uplink script. It will still be slow, but it will run in one single transaction, if that's what you need
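The chunking option above can be sketched with a plain-Python helper (the `chunked` name and the row shape are assumptions for illustration; in Anvil each chunk would be passed to its own server function decorated with `@anvil.tables.in_transaction`):

```python
def chunked(rows, size=1000):
    """Yield successive fixed-size chunks from a list of rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# 20,000 rows split into 20 chunks of 1,000.
rows = [{"status": "20000"}] * 20000
chunks = list(chunked(rows))
print(len(chunks))  # 20

# In Anvil, the client (or a background task) would then loop:
#   for chunk in chunks:
#       anvil.server.call('insert_chunk', chunk)
# where insert_chunk is a server function wrapped in its own transaction.
```

Keeping each transaction to about 1,000 rows trades atomicity for reliability: a single huge transaction is more likely to hit timeouts, while per-chunk transactions make progress incrementally.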

Thanks for your suggestion. Indeed, the ORM is quite slow.