I had hit this limit before I ever heard of Anvil.
The easiest way to get around a Google call limit is to construct all of the data you want to add on your side as one object that is digestible by a Google API (using Google's API library on your side), and then send it all at once in one call.
They never seem to care about the amount of data transmitted, only about a high frequency of calls.
Just from anecdotal experience, it seems like they (Google) will throttle you faster the faster you send them calls.
In every iteration of the loop, it's trying to send an object as fast as Python can send it, which means one API call per iteration.
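If the data is already in memory, a single batched call usually stays well under the quota. Here is a minimal sketch, assuming the gspread library (which the code below appears to use) and an already-open worksheet object gsheet; the sample rows are made up:

rows = [
    ["0012345", "widget", "7"],
    ["0012346", "gadget", "3"],
]
# one API request for all the rows, instead of one request per row;
# "RAW" keeps the strings exactly as sent (no reformatting)
gsheet.append_rows(rows, value_input_option="RAW")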
Here is some code I use to upload 1k, 2k, 5k, etc. rows multiple times a day, every day, from an inventory program:
if row_list_data:
    # row_list_data is a list of rows, each row itself a list of cell values
    for i, _ in enumerate(row_list_data[0]):
        # grab all the cells in column i + 1 as one range
        cell_list = gsheet.range(1, i + 1, len(row_list_data), i + 1)
        for cell, row_list_data_cell in zip(cell_list, row_list_data):
            cell.value = str(row_list_data_cell[i])
        # one update_cells call per column, instead of one call per cell
        response = gsheet.update_cells(cell_list, value_input_option="RAW")
        # I have to send this as RAW only because I need Google Sheets to
        # not auto-format leading zeros on a barcode scan
        # print(response)
else:
    print("Empty Data Sent to Function: 'upload_as_gsheet' !!!!")
    return False
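On the value_input_option: "RAW" tells Sheets to store the values exactly as sent, while "USER_ENTERED" parses them as if they were typed into the UI, which is what strips the leading zeros. A quick illustration with a made-up barcode value:

cell_list[0].value = "0012345"  # scanned barcode with leading zeros
gsheet.update_cells(cell_list, value_input_option="RAW")  # cell keeps "0012345"
# with value_input_option="USER_ENTERED" the same cell would be coerced to the number 12345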
Edit: for this to work in one shot, you have to make sure the sheet is already exactly the size of, or larger than, the data you want to update, using:
gsheet.resize(len(row_list_data), len(row_list_data[0]))
where gsheet is a gspread worksheet object.
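For completeness, here is a rough sketch of how everything around that snippet could fit together, assuming gspread with a service-account credential file; credentials.json, "Inventory" and the upload_as_gsheet wrapper are just placeholders for illustration:

import gspread

def upload_as_gsheet(gsheet, row_list_data):
    # gsheet: gspread worksheet, row_list_data: list of rows as lists
    if not row_list_data:
        print("Empty Data Sent to Function: 'upload_as_gsheet' !!!!")
        return False
    # make the sheet exactly as big as the data before updating
    gsheet.resize(len(row_list_data), len(row_list_data[0]))
    for i, _ in enumerate(row_list_data[0]):
        cell_list = gsheet.range(1, i + 1, len(row_list_data), i + 1)
        for cell, row in zip(cell_list, row_list_data):
            cell.value = str(row[i])
        gsheet.update_cells(cell_list, value_input_option="RAW")
    return True

gc = gspread.service_account(filename="credentials.json")  # placeholder credentials path
ws = gc.open("Inventory").sheet1                            # placeholder spreadsheet name
upload_as_gsheet(ws, [["0012345", "widget", "7"]])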