The VS Code terminal output seemed to freeze up after the columns were created…
Connecting to wss://anvil.works/uplink
Anvil websocket open
Connected to "Default Environment" as SERVER
Anvil server output: Automatically creating column "unique id" (string)
Anvil server output: Automatically creating column "User First Name" (string)
Anvil server output: Automatically creating column "Training" (string)
Anvil server output: Automatically creating column "Firm Name" (string)
Anvil server output: Automatically creating column "User" (string)
Anvil server output: Automatically creating column "Training Completion Date" (string)
Anvil server output: Automatically creating column "Renewal Date" (string)
Anvil server output: Automatically creating column "Reminders On" (string)
Anvil server output: Automatically creating column "User Last Name" (string)
That said, when I go to the table view in Anvil, it does seem that the rows/records were properly imported.
I’d like to confirm that all the records were indeed imported, but there doesn’t seem to be a way for me to see how many rows/records are in that table.
How can I get a count of how many records are in an Anvil database table?
Just realized that you can download a CSV file of any Anvil database table, so that was one way for me to see the number of rows/records in the table.
For some reason, 2,370 records did not make it over into Anvil. My guess is that the import froze up, either on Anvil’s end or on my laptop’s end. I’ll try the import again and report back.
Turns out that the terminal didn’t freeze up. It was just taking a long time to get the 30k records into Anvil.
I didn’t expect it to take so long, so I assumed something was wrong, but that wasn’t the case…the terminal eventually showed that everything executed properly.
I downloaded a CSV of the database table from Anvil to confirm that all the records/rows were successfully imported.
The example in the documentation is very slow because it uploads one row at a time.
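For reference, the one-row-at-a-time pattern looks roughly like this (a sketch; the Uplink key, file name, and table name are placeholders):

```python
import pandas as pd
import anvil.server
from anvil.tables import app_tables

anvil.server.connect("YOUR-UPLINK-KEY")  # placeholder Uplink key

df = pd.read_csv("users.csv")  # hypothetical file name
for record in df.to_dict(orient="records"):
    # one server round trip per row -- this loop is the bottleneck
    app_tables.users.add_row(**record)
```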
I’m in the US, and with my persistent server on my dedicated plan, a round trip takes 0.2 seconds.
30,000 rows at 0.2 seconds per row would be 6,000 seconds, or 100 minutes. Without a persistent server or a dedicated plan, or if I moved to the West Coast, or if those calls actually did some work like writing to the database, that would go up to hours.
Instead, I have Excel macros that run nightly and upload thousands of rows in seconds.
They use an HTTP endpoint instead of an Uplink connection, because HTTP endpoints can be used from any language, including VBA, but that has nothing to do with the speed.
They are fast because they read a few thousand lines, convert them to JSON (that’s the slow part in VBA), and send them in one shot as the payload of an HTTP call. The server adds them all to the database; rinse and repeat.
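The server half of that pattern might look something like this in Anvil (a sketch, assuming the payload is a JSON list of dicts whose keys match the column names; the endpoint path and table name are mine):

```python
import anvil.server
from anvil.tables import app_tables

@anvil.server.http_endpoint("/import_rows", methods=["POST"])
def import_rows():
    # the whole chunk arrives as a single JSON payload
    rows = anvil.server.request.body_json
    for row in rows:
        app_tables.users.add_row(**row)
    return {"imported": len(rows)}
```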
My macros were calibrated to take about 20 seconds per call: tables with a few columns and little data would load in chunks of 5,000 rows, while tables with dozens of columns or a larger amount of data per row would load in chunks of a few hundred. Then Accelerated Tables came along, and the 20 seconds per chunk went down to 2 seconds. The client side of that pattern is sketched below.
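Here is the same idea translated from VBA into Python for illustration (the URL and chunk size are placeholders; in practice you’d calibrate the chunk size per table, as described above):

```python
import requests

ENDPOINT = "https://my-app.anvil.app/_/api/import_rows"  # placeholder URL

def upload_in_chunks(rows, chunk_size=5000):
    """POST rows (a list of dicts) to the endpoint one chunk at a time."""
    for i in range(0, len(rows), chunk_size):
        resp = requests.post(ENDPOINT, json=rows[i:i + chunk_size])
        resp.raise_for_status()
```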
Agreed, Accelerated Tables help immensely. @eddie, if you’re going to use Uplink to put data in, consider updating your code to use the Bulk Add feature described in New Features.
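If I’m reading the New Features post right, bulk add means passing all the rows in a single call instead of looping over add_row; something along these lines (the Uplink key, file name, and table name are placeholders):

```python
import pandas as pd
import anvil.server
from anvil.tables import app_tables

anvil.server.connect("YOUR-UPLINK-KEY")  # placeholder Uplink key

df = pd.read_csv("users.csv")  # hypothetical file name
# one bulk call instead of one round trip per row
app_tables.users.add_rows(df.to_dict(orient="records"))
```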
You would think taking len() of a search result would take forever, but both Python’s built-in len() and the len operation as handled by Anvil’s App Tables service are heavily optimized, so it shouldn’t take more than a second even with millions of rows.
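Concretely, from Uplink that’s just (table name is a placeholder):

```python
import anvil.server
from anvil.tables import app_tables

anvil.server.connect("YOUR-UPLINK-KEY")  # placeholder Uplink key

# the count is computed server-side, not by pulling every row down
print(len(app_tables.users.search()))
```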
Also, I made this for people who are starting out trying to upload data to Anvil and want to test it out: it focuses on getting it working first, so you can worry about optimizing it yourself later: