I am up and running with a server setup: a Mac Pro hosts the PostgreSQL database (via Postgres.app) and the PSu client application.
I also have a MacBook Pro running just the PSu client software, which connects to the Mac Pro database server.
Great. Now I need to come up with a backup strategy. In the single-user edition this was fairly simple: just back up (make copies of) the Catalog folder, or even just the two catalog files: photosupreme.cat.db and photosupreme.thumbs.db. Easy peasy.
This was automated for me since I use Apple's Time Machine (backup #1). The catalog was also replicated onto my Synology NAS (backup #2) and Dropbox (backup #3), and finally captured in a bootable system backup made with SuperDuper (backup #4). Yes, I'm that paranoid.
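For reference, that single-user copy step can be sketched as a tiny shell script. The Catalog path here is an assumption, not PSu's actual default; adjust it to your installation:

```shell
#!/bin/sh
# Hypothetical Catalog location -- substitute your actual folder.
CATALOG="$HOME/Pictures/PhotoSupreme/Catalog"
DEST="$HOME/Backups/psu-catalog-$(date +%Y%m%d)"

mkdir -p "$DEST"

# The whole single-user catalog is just these two files.
cp "$CATALOG/photosupreme.cat.db" "$CATALOG/photosupreme.thumbs.db" "$DEST/"
```

Run it from cron or a launchd job and you have a dated copy alongside whatever Time Machine is already doing.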
Now in the server edition, things seem a bit more complicated.
- 1. Is the Backup function in PSu not enough in the server edition?
It seems to generate the same files as in the single-user edition, except the actual db file is much smaller: my single-user catalog backup was around 1.67 GB, while the server-edition catalog backup is now 181 MB, about 10x smaller. I believe the difference is attributable to the fact that the server backup is actually a plain-text SQL dump, while the single-user file was a SQLite database file.
- 2. What would be the difference between using the PSu Backup function versus a tool like the pgAdmin Backup dialog or the native pg_dump command on the PostgreSQL command line?
- 3. Any thoughts on other tools? I have seen SQL database management and backup tools on the net such as Barman, SQLBak, and others. They seem like overkill to me, designed for much larger corporate database installations, and I don't want to complicate the issue.
- 4. Anyone using the PostgreSQL server version willing to share their backup procedure/strategy?
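For what it's worth, the pg_dump route from question 2 would look roughly like this. Database name, user, and paths are assumptions on my part; substitute whatever PSu created on your server:

```shell
#!/bin/sh
# Hypothetical names -- adjust PGUSER, DBNAME, and BACKUP_DIR to your setup.
PGUSER=postgres
DBNAME=photosupreme
BACKUP_DIR="$HOME/Backups/psu"
STAMP=$(date +%Y%m%d-%H%M%S)

mkdir -p "$BACKUP_DIR"

# Custom-format dump (-Fc): compressed, and restorable (even selectively)
# with pg_restore. A plain "pg_dump DBNAME > file.sql" instead produces
# the text-format SQL dump.
pg_dump -U "$PGUSER" -Fc -f "$BACKUP_DIR/psu-$STAMP.dump" "$DBNAME"
```

You would restore a custom-format dump with pg_restore into a fresh database, e.g. `pg_restore -U postgres -d photosupreme psu-YYYYMMDD-HHMMSS.dump`. Time Machine copying the dump files afterwards slots into the existing multi-tier scheme; copying the live PostgreSQL data directory while the server is running, by contrast, is not a safe backup.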