How do I speed up interaction with Google Drive sheets?

I am trying to import some data from a few Google Drive sheets into an Anvil table, and every call to ws[r, c] is very slow.

Is there something like ws.cache_all() or ws.cache_range(range)?

Unfortunately there isn’t - the Google Sheets API is just painfully slow. A one-time import is definitely your best bet.

I have never used them, but it looks like the Google APIs let you get a whole range of values at once with the get() function.
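For reference, here is a minimal sketch of fetching an entire range in one call with the Google Sheets v4 API (google-api-python-client), plus a small wrapper that caches the returned 2-D list so repeated [r, c] lookups stay in memory. The spreadsheet ID and range are placeholders, and building `service` requires your own credentials:

```python
# Sketch: fetch a whole range in one API call, then index it locally.
# Assumes google-api-python-client and valid credentials; the sheet ID
# and A1 range below are placeholders, not real values.

class CachedRange:
    """In-memory wrapper over a 2-D list of cell values."""
    def __init__(self, values):
        self._values = values

    def __getitem__(self, rc):
        r, c = rc
        row = self._values[r]
        # The API omits trailing empty cells, so guard against short rows.
        return row[c] if c < len(row) else ""

def fetch_range(service, spreadsheet_id, a1_range):
    # One HTTP round trip for the whole range, instead of one per cell.
    result = service.spreadsheets().values().get(
        spreadsheetId=spreadsheet_id, range=a1_range).execute()
    return CachedRange(result.get("values", []))

# Usage sketch (service comes from
# googleapiclient.discovery.build("sheets", "v4", credentials=...)):
# ws = fetch_range(service, "YOUR_SHEET_ID", "Sheet1!A1:Z5000")
# print(ws[0, 0])
```

After the single fetch, every `ws[r, c]` lookup is a local list access rather than an API call.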

I tried using the Google APIs because I need to run a weekly import, about 30 rows per week.

Since this is very slow, I thought about doing the weekly import incrementally (I don’t know if I can; I will try) and importing the existing data with a one-time import, but I failed here too.

I tried uploading a CSV file with about 5000 rows with the FileLoader and adding the rows to the table, but it timed out after about 1300 rows. Wrapping the loop in a transaction got me to about 2600 rows before timing out.

Is there a better way?


Hey,

We’re working on creating a “bulk add” operation for when you’re calling add_row() lots of times like this! In the meantime, I can suggest breaking up the data to be imported into chunks, or running the import operation on the Uplink, which does not have timeouts.
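Breaking the import into chunks can look like this - a sketch, where `import_chunk` stands in for a hypothetical server function that adds its batch of rows (ideally inside one transaction), and the batch size of 500 is a guess to tune against the timeout:

```python
def chunks(rows, size):
    """Split a list of rows into lists of at most `size` items."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# Usage sketch from client code (hypothetical server function name):
# for batch in chunks(all_rows, 500):
#     anvil.server.call('import_chunk', batch)  # each call stays under the timeout
```

The same chunking helper works unchanged from an Uplink script, where there is no timeout but smaller batches still keep each round trip responsive.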
