Backing up constantly changing files?

Username
Posts: 207
Joined: 18 Feb 18 22:21

Backing up constantly changing files?

Post by Username » 30 Dec 18 7:59

How do you guys manage your backups of photos and sidecars?
They change often, and with "automatically write out to file" enabled, a new jpg/dng et al. is created that needs to be backed up again.

I'm not sure writing every XMP change out to the jpg is such a good solution, as the backup sets grow immensely fast.

Write out XMP to sidecars only, but not into the jpgs?
Split the backups, so the jpgs and the .xmp/.DOP files can be backed up on their own schedules?

Any ideas?
PSu Server 5 & Postgres 10 on macOS 10.14
- I'm the user

Username
Posts: 207
Joined: 18 Feb 18 22:21

Re: Backing up constantly changing files?

Post by Username » 31 Dec 18 14:15

This might be a question for Hert.

When PSu writes XMP out to the image files, how does that process work?
Does PSu only inject the XMP data into that specific section of the image file, leaving the rest of the file bit-identical, so that only a few kB of data are changed/added?

Or is the photo rewritten in its entirety with the new XMP data added?

Depending on how this is done I might rethink my backup strategies.
PSu Server 5 & Postgres 10 on macOS 10.14
- I'm the user

Hert
Posts: 5857
Joined: 13 Sep 03 7:24

Re: Backing up constantly changing files?

Post by Hert » 31 Dec 18 14:22

Only the metadata sections of the file are updated (XMP, IPTC, Exif, GPS where applicable). The image data remains untouched, so there is no recompression.
Not sure how that changes your backup strategy, though.
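
For anyone curious what "metadata sections" means concretely: in a JPEG, Exif and XMP live in their own APP1 segments near the start of the file, before the compressed image data. That is a property of the JPEG format, not anything PSu-specific. A minimal sketch (Python 3; pass the image path on the command line) that lists those segments:

```python
# list_jpeg_segments.py - print the segment layout of a JPEG up to the
# start-of-scan marker. Exif and XMP metadata live in APP1 segments
# (marker FFE1) near the start of the file, before the image data, so
# a metadata rewrite can leave the compressed image bytes untouched.
import struct
import sys

def list_segments(path):
    with open(path, "rb") as f:
        if f.read(2) != b"\xff\xd8":          # SOI: start of image
            raise ValueError(f"{path} is not a JPEG")
        while True:
            offset = f.tell()
            marker = f.read(2)
            if len(marker) < 2 or marker[0] != 0xFF:
                break                          # malformed or EOF
            if marker[1] == 0xDA:              # SOS: image data follows
                print(f"offset {offset:>8}: FFDA (start of scan, image data)")
                break
            (length,) = struct.unpack(">H", f.read(2))
            print(f"offset {offset:>8}: FF{marker[1]:02X}, {length} bytes")
            f.seek(length - 2, 1)              # skip the segment payload

if __name__ == "__main__":
    list_segments(sys.argv[1])
```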
This is a User-to-User forum which means that users post questions here for other users.
Feature requests, change suggestions, or bugs can be logged in the ticketing system

snowman1
Posts: 250
Joined: 01 Jan 07 3:13
Location: UK

Re: Backing up constantly changing files?

Post by snowman1 » 31 Dec 18 14:40

I take it you are talking about backups of the image/sidecar files rather than PSu catalogue backups.

If you are exchanging images as part of your activities, then you will almost certainly want to write the metadata to the image/sidecar file, either automatically or manually at your convenience. If you aren't, there may well not be much reason to do so, but then you must be certain you have catalogue backups. Automatic writing also ensures that, in the event of a catalogue loss, you can roll the catalogue backup forward with any work done since the backup (assuming your write parameters are set correctly).

Writing out metadata for lots of files in one big job rather than automatically - and then having a correspondingly large backup job - may or may not help, depending on whether it suits you to work that way. It can mean you back up a file once rather than multiple times. Of course, you have to remember to run the sync!

Using sidecars means only the sidecar file gets updated, and these are a fraction of the size of the image itself. But if you are exchanging files you probably don't want to use a sidecar file, as it can easily be ignored, separated, or lost at the other end.

A reliable blockwise (as opposed to filewise) backup program is probably what you need.
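
To make "blockwise" concrete: such a tool splits each file into fixed-size chunks, hashes them, and only stores chunks it hasn't seen before, so a small in-place change costs one chunk instead of the whole file. A toy sketch of the idea (not any particular product's implementation; the block size and file name are made up):

```python
# blockwise_dedup.py - toy illustration of block-level deduplication:
# split a file into fixed-size chunks, hash each one, and "store" only
# the chunks whose hash has not been kept by a previous run.
import hashlib

BLOCK_SIZE = 10 * 1024   # made-up block size for the illustration
stored = {}              # hash -> chunk, stands in for the backend

def backup(path):
    """Back up one file; return how many new chunks had to be stored."""
    new_chunks = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(BLOCK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in stored:   # unseen block: store it
                stored[digest] = chunk
                new_chunks += 1
    return new_chunks

if __name__ == "__main__":
    print("first run: ", backup("IMG_0001.jpg"), "new chunks")
    # edit a few bytes of metadata in place, then run again:
    print("second run:", backup("IMG_0001.jpg"), "new chunks")
```

A filewise tool would re-store the whole image either way.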

I wrote a post a while back here viewtopic.php?f=57&t=24237&p=113153&hil ... up#p113153 that touched on these questions, if it helps. Everyone's needs are different; the trick is to work through what your needs are.
Snowman1
http://www.flickr.com/photos/snowman-1/
--------------------------------------

Username
Posts: 207
Joined: 18 Feb 18 22:21

Re: Backing up constantly changing files?

Post by Username » 31 Dec 18 20:45

I'm running Duplicati as my backup engine.

Images are backed up in their own set, where I save a full set for each week, each month, and each year.
The Postgres db is backed up as well, both as a live backup and as an offline pg_dump backup.
All of this is done both locally and to two different off-site locations.
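
In case it helps anyone, the pg_dump side is a small script; something like this (database name, user, and output path are placeholders; -Fc writes pg_dump's custom compressed format, restorable with pg_restore):

```python
# dump_catalog.py - script a dated offline pg_dump backup.
# Database name, user, and output path are placeholder examples.
import datetime
import subprocess

stamp = datetime.date.today().isoformat()
subprocess.run(
    ["pg_dump", "-Fc",                 # custom compressed dump format
     "-U", "psu",                      # placeholder database user
     "-f", f"/backups/psu-{stamp}.dump",
     "psu_catalog"],                   # placeholder database name
    check=True,                        # fail loudly if pg_dump fails
)
```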

Duplicati does 10KB block incrementals, but how well that works depends on how PSu writes the updated XMP data to the files.
Here's the Duplicati thread about this.

https://forum.duplicati.com/t/recommend ... les/5826/2
Here is the reply from one of the mods:

If the image / video labels and ratings always live in the same blocks of the files (so they do not grow or shrink, no matter the values), then Duplicati will only back up the small changed blocks.
However, if the metadata is stored before the end of the file and causes subsequent data to change "location", then the updated data AND the following moved-but-not-changed content will be backed up, causing longer backups and more backend storage needs.


I'll do some tests and see what the outcome is.
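Something like this (Python 3; the file names are placeholders, and the "before" copy has to be made before PSu writes the metadata):

```python
# xmp_write_test.py - compare two versions of an image block by block
# to see how many 10KB blocks a metadata write actually touched.
# File names are placeholders; copy the image before the write happens.
BLOCK_SIZE = 10 * 1024              # match the backup tool's block size

def changed_blocks(before, after):
    """Yield the index of every block that differs between the files."""
    with open(before, "rb") as fa, open(after, "rb") as fb:
        index = 0
        while True:
            a = fa.read(BLOCK_SIZE)
            b = fb.read(BLOCK_SIZE)
            if not a and not b:     # both files exhausted
                return
            if a != b:              # changed, shifted, or added block
                yield index
            index += 1

diffs = list(changed_blocks("IMG_0001_before.jpg", "IMG_0001_after.jpg"))
print(f"{len(diffs)} block(s) differ: {diffs}")
```

If only a few low-numbered blocks show up, the write was in place; if every block from some index onward differs, the metadata grew and shifted the rest of the file, which is the bad case from the quote above.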

I'll get back next year :)

Happy New Year!
PSu Server 5 & Postgres 10 on macOS 10.14
- I'm the user
