I know Anvil is taking care of backing up our databases and restoring them in case of catastrophic failures. That’s the most important thing. It feels good!
But I also would like to be able to dig and see what was in the database last week or last month or last year.
Hence this request: please allow us to download a complete backup of the database. The frequency (weekly/monthly/quarterly) and the scope (per account/per app/per table) should be configurable.
A simple way to do this could be to put all the tables into one zip file, share it on some file-sharing platform, and send a notification that the backup is ready to be downloaded. Before the next backup, the file would be deleted to make room for the next round.
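The zip-of-all-tables idea can be sketched in a few lines. This is a minimal illustration, not Anvil's API: the `backup_tables_to_zip` helper and the sample table data are hypothetical, and in a real app the rows would come from the Data Tables service or a SQL query per table.

```python
import csv
import io
import zipfile

def backup_tables_to_zip(tables, zip_path):
    """Write each table (a list of dict rows) as a CSV file inside one zip.

    `tables` maps a table name to its rows. Opening the zip in "w" mode
    overwrites any previous backup, matching the delete-before-next-round idea.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, rows in tables.items():
            buf = io.StringIO()
            if rows:
                writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
                writer.writeheader()
                writer.writerows(rows)
            zf.writestr(f"{name}.csv", buf.getvalue())

# Hypothetical sample data standing in for real database tables:
tables = {
    "users": [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}],
    "orders": [{"id": 10, "user_id": 1, "total": "9.99"}],
}
backup_tables_to_zip(tables, "backup.zip")
```

The notification step would then just point the user at the finished `backup.zip`.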
When I run into problems, I do some postmortem analysis. I can travel back in time with the app itself by cloning it and digging through the git history. I would like to do something similar with the data tables.
This could be a free service (yummy!), or offered only on certain plans, or a la carte, pay per usage, priced proportionally to the frequency and size of the backup.
@jshaffstall was kind enough to update my backup routine and post it here:
Edit: Nowhere near as nice as what you are suggesting!
SQLite has some advantages over zip files, e.g., the ability to embed the schema (included above) and to query the data directly.
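To illustrate both advantages: a `.sqlite` backup carries its own schema and can be queried in place, with no restore step. A minimal sketch using Python's built-in `sqlite3` module (the `dump_to_sqlite` helper and the sample rows are made up for illustration):

```python
import sqlite3

def dump_to_sqlite(db_path, table_name, columns, rows):
    """Write rows into a SQLite file; the schema travels with the data."""
    conn = sqlite3.connect(db_path)
    col_defs = ", ".join(f"{c} TEXT" for c in columns)
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table_name} ({col_defs})")
    placeholders = ", ".join("?" for _ in columns)
    conn.executemany(f"INSERT INTO {table_name} VALUES ({placeholders})", rows)
    conn.commit()
    return conn

conn = dump_to_sqlite("tables_backup.sqlite", "users", ["id", "name"],
                      [("1", "Ada"), ("2", "Grace")])

# The backup file can be queried directly, no unzipping or restore needed:
print(conn.execute("SELECT name FROM users WHERE id = '1'").fetchone()[0])
# prints Ada
```

The embedded schema is itself queryable via the `sqlite_master` table, which is what makes a `.sqlite` file self-describing in a way a zip of CSVs is not.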
A decent hierarchical table-naming convention would embed the scope (appname__dbname__tablename) in the table name, to disambiguate.
Alternatively, a separate table could be used to map the hierarchy onto some lower-level table name. This would be handy if, for example, you have a huge table shared by multiple databases and/or apps, and want it represented accurately as a single table in the backup as well, avoiding needless duplication.
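The mapping-table alternative can be sketched like this. Everything here (the `table_map` schema, the app/db names, the `resolve` helper) is invented for illustration; the point is just that two logical tables can resolve to one physical copy in the backup:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hierarchical names like appname__dbname__tablename disambiguate, but a
# mapping table lets one physical table back several logical ones:
conn.execute("""CREATE TABLE table_map (
    app TEXT, db TEXT, logical_name TEXT, physical_name TEXT)""")
conn.executemany(
    "INSERT INTO table_map VALUES (?, ?, ?, ?)",
    [
        ("crm",  "main", "customers", "shared_customers"),
        ("shop", "main", "customers", "shared_customers"),  # same data, stored once
        ("shop", "main", "orders",    "shop__main__orders"),
    ],
)
conn.execute("CREATE TABLE shared_customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO shared_customers VALUES (1, 'Ada')")

def resolve(app, db, logical):
    """Look up the physical table backing a (app, db, logical_name) triple."""
    row = conn.execute(
        "SELECT physical_name FROM table_map "
        "WHERE app = ? AND db = ? AND logical_name = ?",
        (app, db, logical)).fetchone()
    return row[0] if row else None

# Both apps resolve to the single physical copy:
print(resolve("crm", "main", "customers"))   # shared_customers
print(resolve("shop", "main", "customers"))  # shared_customers
```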
Conceivably, you could get the best of both worlds by zipping the .sqlite file.
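A quick sketch of that combination, using `sqlite3`'s `Connection.backup()` to snapshot a live database to a file and then compressing it (the table and file names are placeholders):

```python
import sqlite3
import zipfile

# A live (here: in-memory) database standing in for the real one...
live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE users (id INTEGER, name TEXT)")
live.execute("INSERT INTO users VALUES (1, 'Ada')")

# ...snapshotted to a file with sqlite3's backup API, then zipped.
snapshot = sqlite3.connect("snapshot.sqlite")
live.backup(snapshot)
snapshot.close()

with zipfile.ZipFile("snapshot.sqlite.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("snapshot.sqlite")
```

You keep the queryable, schema-carrying `.sqlite` file, and the zip layer gives the smaller download the original request asked for.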
Currently, bits and pieces of this could be programmed per app as a background task. However, they would not be deduplicated.
I remember when I saw it I thought “This is cool! I will create my own version that uses SQL to access all the tables and run it automatically every weekend” (with SQL you can access any table without linking it to the app).
This really doesn’t help anyone without direct SQL access, but PostgreSQL has many optimized built-in tools for backing up entire databases and tables, and there are plenty of Python-specific backup tools for whatever flavor of SQL you run, over a JDBC/ODBC connection.
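For instance, PostgreSQL's standard `pg_dump` tool can be driven from Python via `subprocess`. The sketch below only assembles the command (actually running it needs a reachable server and credentials); the database name, output path, and helper function are placeholders:

```python
import subprocess

def pg_dump_command(dbname, out_path, tables=None):
    """Assemble a pg_dump invocation. The custom format (-F custom)
    produces an archive that pg_restore can restore selectively."""
    cmd = ["pg_dump", "--format=custom", f"--file={out_path}"]
    for t in tables or []:
        cmd.append(f"--table={t}")
    cmd.append(dbname)
    return cmd

cmd = pg_dump_command("myapp", "myapp.dump", tables=["users", "orders"])
# subprocess.run(cmd, check=True)  # uncomment with a real server/credentials
```

Scheduling that from a weekend cron job (or a background task) gets you most of the way to a programmatic backup, with the database doing the heavy lifting.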
With direct SQL access, the place I used to work could whip up a programmatic backup of all the tables and schemas, and a way to restore them from scratch, in a few days** — using PHP, no less. (The database will do most of the heavy lifting, if you can tell it what to do.)
** Edit: a few days of programming. The backups themselves took about 30 minutes to run, mostly because of I/O.