Seeking That Nearly Perfect Workflow

chandler
Posts: 3
Joined: 02 Feb 19 19:50

Seeking That Nearly Perfect Workflow

Post by chandler » 05 Oct 19 20:45

I’ve spent a lot of time in the past reading through posts looking for workflow suggestions for my personal/family image collection. Each time I find some good ideas, but it still seems that I spend more time looking for the “perfect” workflow than I do getting to use it.
This would be an easy problem if there were just one tool to rule them all. But my reality is more like (in general order of importance):

- PSU (which I really like as an overall DAM. I'd like this to be my overall catalog manager.)
- Lightroom (used for editing and restoration) and Photoshop (for when Lightroom just isn’t quite enough)
- Luminar (which I’ve about given up on… sadly. Awful support for a potentially powerful editor.)
- TagThatPhoto (really good at facial identification – important to me with about 85K images, mostly scanned with no existing metadata. Not really part of my workflow, but just a tool I can employ as needed. I wrote a tool to import TTP metadata to PSU.)

I would like to eventually eliminate the Adobe subscription. But in the meantime I have thousands upon thousands of edits in the LR database that I really don’t want to just render to JPEGs. Much of my collection is TIF or CR2 format images at relatively high resolutions.

My primary goals include:
1) Preservation of original image quality. If I have camera raw data I don’t want to lose that.
2) Creation of synchronized JPEG viewing copies, stored in a parallel folder structure.
3) Ability to selectively push images to a synchronized SmugMug site.
4) Access to very good editing capabilities for technical correction of images (not content enhancement – I like to preserve the original image. If someone needs a copy with Uncle Jim removed, that shouldn’t be part of my master collection.)
5) Deep metadata – people, places, dates, context – for every image. (A picture of a mountain or lake, with no faces in it, is useless 30 years later if no one knows where it was or why it was deemed photo worthy.)
6) Easy ingest of folders of images as family members show up on my doorstep with their own image sets.

Can someone suggest a reasonable workflow that integrates at least PSU and LR, and allows integration of metadata from other tools as well? If nothing else, perhaps this thread can serve as a collection point for a variety of workflow descriptions that newbies (or even oldies like me) can refer to for ideas.

Thanks!

Ralf
Posts: 36
Joined: 19 Jan 19 14:37

Re: Seeking That Nearly Perfect Workflow

Post by Ralf » 06 Oct 19 13:27

Hi,

I can't suggest a complete workflow, but maybe it helps to see how others have implemented theirs.
Personally, I have changed tools many times, mostly involuntarily: companies were sold or changed direction, suddenly required a subscription, or errors crept into the applications.
One thing has always remained: my filing of images at the directory level. My children and I can always use it, whether sharing to the TV, on a mobile phone, or when searching for something with other tools.

I add images to PSU after I have first stored them in a specific structure (see below) and processed the filing/metadata via various "scripts"/"snippets", so that at least the most important information is embedded in the images themselves.

If I get pictures from friends or acquaintances that arrive in complete disorder, I first import them via PSU with a script and the variable %Eventname, which files the data according to the structure below.
 
Here is an example of my filing structure for pictures:

- For a single event (one day):
  YYYYMMDD-Title

  - YYYYMMDD -> written to the various date fields (important for scans)
  - Title -> xmp:photoshop:Headline

- For multi-day events (trips):
  YYYYMMDD to YYYYMMDD-Title -> e.g. YYYYMMDD to YYYYMMDD-Bike Tour Austria
  YYYYMMDD-01.Title2
  YYYYMMDD-02.Title3

  - YYYYMMDD to YYYYMMDD-Title -> xmp:photoshop:Headline
  - Title2 -> xmp:dc:title (or YYYYMMDD-Title2 -> xmp:dc:title)
  - Title3 -> xmp:dc:title
  ...

I use two more fields as well, especially for old shots (scans) or pictures from friends and acquaintances, and I read these in too:
  YYYYMMDD-Title-Remark-Copyright -> e.g. 19100101-Biketour-scan date not exact-Max Mustermann
 
This has made me more independent, and I can switch tools easily if necessary. Sure, a little time was lost on the set-up, but I now always do it in PSU; formerly it was exiftool and batch files in Windows. Now even Windows Explorer finds my metadata ;)
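Below is a minimal Python sketch of how that folder-name-to-metadata mapping can be automated around exiftool. It is not Ralf's actual script (he offers his own above); the archive root, the single-day "YYYYMMDD-Title" pattern and the use of -AllDates for the scan dates are assumptions for illustration.

# Hedged sketch: derive XMP fields from folder names like "20190914-Bike Tour Austria"
# and write them into every image in that folder with exiftool (must be on PATH).
import re
import subprocess
from pathlib import Path

FOLDER_RE = re.compile(r"^(\d{8})-(.+)$")  # YYYYMMDD-Title

def tag_folder(folder: Path) -> None:
    match = FOLDER_RE.match(folder.name)
    if not match:
        return  # folder does not follow the naming scheme
    date, title = match.groups()
    exif_date = f"{date[:4]}:{date[4:6]}:{date[6:]} 00:00:00"
    subprocess.run(
        [
            "exiftool",
            f"-XMP-photoshop:Headline={title}",  # event title -> Headline
            f"-AllDates={exif_date}",            # fill the date fields (useful for scans)
            "-overwrite_original",
            str(folder),                          # exiftool processes the files inside
        ],
        check=True,
    )

if __name__ == "__main__":
    root = Path("D:/Pictures")  # hypothetical archive root, adjust to taste
    for event_folder in root.iterdir():
        if event_folder.is_dir():
            tag_folder(event_folder)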

When I come back from a holiday or get pictures from friends, they are quickly sorted and labelled. All I have to do is check the GPS data, look up locations, set categories and tag faces. That goes quickly with a glass of red wine ;)
With this number of pictures I make no claim to completeness (it's unlikely the children will ever sift through all 100,000 of them), but at least they will always find something. My pictures range from 1920 to today ...
I hope this little insight has helped, despite my not so great English (SORRY).

I can gladly provide scripts, snippets or profiles.
Ralf
---------------------------------------------------------------------------------
Hobby photographer with many pictures (> 100000) of the family over generations.
(Excuse my english)

Mke
Posts: 455
Joined: 15 Jun 14 15:39

Re: Seeking That Nearly Perfect Workflow

Post by Mke » 06 Oct 19 15:45

Here are a few ideas that may help: viewtopic.php?f=57&t=23786

chandler
Posts: 3
Joined: 02 Feb 19 19:50

Re: Seeking That Nearly Perfect Workflow

Post by chandler » 07 Oct 19 2:47

Thanks, Ralf, for the details above, and Mke for the pointer to the collection of workflows from others.

Many of the points mentioned are already part of my own process. I use a single folder structure for all my (master) images, structured as:

|_yyyy
.........|_yyyy-mm-dd (folder of all images created on a particular day)

I copy images from the camera or other source directly into this structure, then add them to the catalog in PSU. I keep the "best" format I have of each image intact -- be it CR2, DNG, TIF, or even JPEG if that's the best available.
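For what it's worth, here is a rough Python sketch of that ingest step, under assumptions the post doesn't spell out: a master root at D:/Masters, incoming files in D:/Incoming, and exiftool available for reading DateTimeOriginal from raw files.

# Hedged sketch: file incoming images into a yyyy/yyyy-mm-dd master structure
# keyed on the EXIF capture date.
import shutil
import subprocess
from pathlib import Path

MASTERS = Path("D:/Masters")    # hypothetical master root
INCOMING = Path("D:/Incoming")  # hypothetical card/ingest folder

def capture_date(image: Path) -> str:
    """Return the EXIF DateTimeOriginal ('yyyy:mm:dd hh:mm:ss'), or '' if absent."""
    result = subprocess.run(
        ["exiftool", "-s3", "-DateTimeOriginal", str(image)],
        capture_output=True, text=True,
    )
    return result.stdout.strip()

def ingest(image: Path) -> None:
    stamp = capture_date(image)
    if not stamp:
        return  # leave undated files for manual filing
    day = stamp.split(" ")[0].replace(":", "-")   # yyyy-mm-dd
    dest = MASTERS / day[:4] / day                # yyyy/yyyy-mm-dd
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copy2(image, dest / image.name)        # copy, never move, the originals

if __name__ == "__main__":
    for f in INCOMING.rglob("*"):
        if f.suffix.lower() in {".cr2", ".dng", ".tif", ".tiff", ".jpg", ".jpeg"}:
            ingest(f)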

My culling of the original images is limited to obviously unusable shots -- out-of-focus frames, blown-out exposures, etc. Storage is cheap, I keep telling myself. The images of relatives past are priceless, even if imperfect.

I do geotagging and try to apply labels soon after ingest to quickly arrive at a searchable repository to work from in future phases.

For recently-acquired images I perform face identification at this point; but I also have an ongoing process of identifying the people in scans of family pictures going back to the 1930s. For this I resort to tools such as TagThatPhoto, which don't write their metadata back to the image file. I created a piece of Python code to read the face names and bounding boxes and allow them to be imported into PSU. (This still requires some manual examination to avoid occasional unexplained glitches.)
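In the same spirit, here is a hedged sketch of such a face-metadata bridge. The CSV layout (file, name, x, y, w, h) is an assumption, not TagThatPhoto's actual export format, and writing MWG face regions into the files via exiftool is just one way of making the data visible to a DAM; it is not necessarily how the poster's PSU import works.

# Hedged sketch: read face names and bounding boxes from an exported CSV and
# write them as XMP-mwg-rs face regions with exiftool.
import csv
import subprocess
from collections import defaultdict

def load_faces(csv_path):
    """Group face rows by image path. Coordinates are assumed normalised (0..1),
    with x,y the region centre, as in the MWG regions schema."""
    faces = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            faces[row["file"]].append(row)
    return faces

def region_list(rows):
    """Build exiftool's structured-value syntax for a list of face regions.
    (Names containing commas or braces would need escaping in a real script.)"""
    return ",".join(
        "{{Name={name},Type=Face,Area={{X={x},Y={y},W={w},H={h},Unit=normalized}}}}".format(**row)
        for row in rows
    )

def write_regions(image, rows):
    # AppliedToDimensions (pixel width/height) is omitted for brevity; a production
    # script would include it so viewers can place the rectangles correctly.
    value = f"{{RegionList=[{region_list(rows)}]}}"
    subprocess.run(
        ["exiftool", f"-XMP-mwg-rs:RegionInfo={value}", "-overwrite_original", image],
        check=True,
    )

if __name__ == "__main__":
    for image, rows in load_faces("ttp_export.csv").items():  # hypothetical export file
        write_regions(image, rows)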

When I export JPEG files for viewing purposes they land in a parallel folder structure that mirrors the one used for masters. I'm currently implementing a scheme for tagging images that should NOT have viewing copies created.
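A small sketch of that viewing-copy step follows, again under assumptions the post doesn't spell out: masters in D:/Masters, viewing copies in D:/Viewing, and a hypothetical "NoViewingCopy" keyword marking files to skip. Pillow handles the TIFF/JPEG masters here; CR2/DNG masters would need a raw converter (or a Lightroom export) instead.

# Hedged sketch: mirror the master folder tree and create JPEG viewing copies.
import subprocess
from pathlib import Path
from PIL import Image

MASTERS = Path("D:/Masters")   # hypothetical master root
VIEWING = Path("D:/Viewing")   # hypothetical parallel viewing-copy root

def is_excluded(master: Path) -> bool:
    """Skip files tagged with the (hypothetical) NoViewingCopy keyword."""
    keywords = subprocess.run(
        ["exiftool", "-s3", "-XMP-dc:Subject", str(master)],
        capture_output=True, text=True,
    ).stdout
    return "NoViewingCopy" in keywords

def make_viewing_copy(master: Path) -> None:
    target = VIEWING / master.relative_to(MASTERS).with_suffix(".jpg")
    if target.exists() or is_excluded(master):
        return
    target.parent.mkdir(parents=True, exist_ok=True)
    with Image.open(master) as im:
        im.convert("RGB").save(target, "JPEG", quality=90)

if __name__ == "__main__":
    for f in MASTERS.rglob("*"):
        if f.suffix.lower() in {".tif", ".tiff", ".jpg", ".jpeg"}:
            make_viewing_copy(f)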

The central "theme" of my workflow, I suppose, is to add to a central repository of master images, and to use an array of tools to read those images and generate additional metadata that then comes back to the DAM (PSU) for association and persistence. The part I don't feel is handled well is association of edited master images (perhaps restored, cropped, or simply color-corrected) with the original master. I need to better manage the "family tree" of images so that, from any viewing copy, for example, I can easily get back to the original master file prior to any editing. That is probably the crux of my discomfort right now.
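One way to make that "family tree" explicit (a sketch under assumptions, not something the thread prescribes) is to record each derivative's master in XMP-xmpMM:DerivedFrom, the standard XMP field for derivation lineage. The flattened exiftool tag name used below (DerivedFromFilePath) is how exiftool typically exposes that struct field; verify it against your exiftool version before relying on it.

# Hedged sketch: link a derivative (viewing copy, restored scan, crop) back to
# its master via XMP, so the lineage survives outside any one catalog.
import subprocess
from pathlib import Path

def link_to_master(derived: Path, master: Path, masters_root: Path) -> None:
    """Store the master's path (relative to the archive root) in the derivative."""
    rel = master.relative_to(masters_root).as_posix()
    subprocess.run(
        ["exiftool", f"-XMP-xmpMM:DerivedFromFilePath={rel}",
         "-overwrite_original", str(derived)],
        check=True,
    )

def master_of(derived: Path) -> str:
    """Read the link back, e.g. when starting from a viewing copy."""
    return subprocess.run(
        ["exiftool", "-s3", "-XMP-xmpMM:DerivedFromFilePath", str(derived)],
        capture_output=True, text=True,
    ).stdout.strip()

# Example (hypothetical paths):
# link_to_master(Path("D:/Viewing/2019/2019-10-05/IMG_0001.jpg"),
#                Path("D:/Masters/2019/2019-10-05/IMG_0001.CR2"),
#                Path("D:/Masters"))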

Thanks, and I'm sure there's more to come in the future.

Mke
Posts: 455
Joined: 15 Jun 14 15:39

Re: Seeking That Nearly Perfect Workflow

Post by Mke » 07 Oct 19 15:56

chandler wrote:
07 Oct 19 2:47
The part I don't feel is handled well is association of edited master images (perhaps restored, cropped, or simply color-corrected) with the original master. I need to better manage the "family tree" of images so that, from any viewing copy, for example, I can easily get back to the original master file prior to any editing. That is probably the crux of my discomfort right now.
Could that be because your 'parallel folder structure' is isolated from your processed images? At least some of us - perhaps most - have a folder structure that integrates original and processed images, such as:

|_Event/Project Folder (Processed files in here)
.........|_RAW (RAW files in here)

or

|_Event/Project Folder (nothing in here)
.........|_RAW (RAW files in here)
.........|_Processed (JPGs / TIFF files in here)

...together with the use of versioning to link the files.
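As a rough illustration of what that versioning can mean mechanically with the second layout above, the sketch below pairs each processed file with the RAW of the same base name inside one event/project folder. The "RAW" and "Processed" folder names follow the example layout, not any particular DAM's convention.

# Hedged sketch: within an event/project folder, match processed files to the
# RAW files they were developed from by shared base name.
from pathlib import Path

def pair_versions(project: Path):
    raws = {f.stem: f for f in (project / "RAW").glob("*") if f.is_file()}
    pairs = []
    for processed in (project / "Processed").glob("*"):
        if processed.is_file() and processed.stem in raws:
            pairs.append((raws[processed.stem], processed))
    return pairs

# Example (hypothetical path):
# for raw, jpg in pair_versions(Path("D:/Pictures/20190914-Bike Tour Austria")):
#     print(raw.name, "->", jpg.name)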
