Best practice for large imports?


Post by Username » 22 Feb 18 13:21

My PS Server with Postgres is up and running, but I have some questions about the initial imports.
I have close to 2 TB of photos and videos and could use some help with how to begin in the best and most efficient way.

My thought is that the best and fastest way to import everything into the DB would be to run the PS app on the physical Mac server rather than on a Mac client over Ethernet. But importing “locally” creates a different local mount path to all images compared to the shared path my clients would use.
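To make the path difference concrete, here is a rough sketch of what I mean (just an illustration using the mount points from my setup further down, not anything PS does itself, and the file name is made up):

SERVER_PREFIX = "/Volumes/RAID1/Folder1"   # physical path on the Mac server
CLIENT_PREFIX = "/Volumes/Folder1"         # the same folder as mounted on a Mac client

def server_to_client(path: str) -> str:
    """Rewrite a server-local path into the path a client would use for the same file."""
    if not path.startswith(SERVER_PREFIX + "/"):
        raise ValueError(f"not under the server prefix: {path}")
    return CLIENT_PREFIX + path[len(SERVER_PREFIX):]

# hypothetical file name, for illustration only
print(server_to_client("/Volumes/RAID1/Folder1/Media1/Photos/2004/example.jpg"))
# -> /Volumes/Folder1/Media1/Photos/2004/example.jpg

So every path recorded during a server-side import would need a rewrite of that kind before a client can resolve it, which is why I'm asking about the mapping below.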

I’ve done some tests with 2,500 photos, imported first from a client and then from the server; each time, after running “Map to the correct physical folder”, it ended in Access violation (0) errors and crashes of the PS app.


What would the best practice for an initial import of 1TB of photos be?
- Locally on the Mac server, where both the DB and the photos are stored?
- From a Mac client over gigabit Ethernet, with the server volume mounted?

What would be the best way to change the paths afterwards so I can continue to work from my Mac clients?
- Using “Map to the correct physical folder”?
- Are there any problems with using it often?

I have not yet tested using both Mac and Windows clients connected to the same Samba share and running PS to browse the libraries.
- Can I run both client types, or do the forward and backward slashes in the paths cause any problems? (A small illustration of what I mean follows.)
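For the slash question, this is the kind of difference I have in mind (a quick Python illustration; the Windows UNC path is only my guess at how a Windows client would see the share):

from pathlib import PurePosixPath, PureWindowsPath

# The same folder on the Samba share, as each client type might see it
mac_path = PurePosixPath("/Volumes/Folder1/Media1/Photos/2004")
win_path = PureWindowsPath(r"\\server\Folder1\Media1\Photos\2004")   # hypothetical UNC path

# Strip the mount/share prefix so only the catalog-relative part remains
mac_rel = mac_path.relative_to("/Volumes/Folder1")
win_rel = win_path.relative_to(r"\\server\Folder1")

print(mac_rel.parts)   # ('Media1', 'Photos', '2004')
print(win_rel.parts)   # ('Media1', 'Photos', '2004')
print(str(mac_rel), str(win_rel))   # Media1/Photos/2004  Media1\Photos\2004

The folder components are identical; only the separators and the mount prefix differ, and that is exactly the part I'm not sure PS handles cleanly across both client types.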



My system includes:
macOS Server 10.12.6
Quad-core i7 with 16 GB RAM and some internal SSDs
Several shared RAID volumes
Postgres 10.2


Volumes mounted on server as:
/Volumes/RAID1/
/Volumes/RAID2/


/Volumes/RAID1/Folder1/Media1/Photos/2004/...
/Volumes/RAID1/Folder1/Media1/Photos/2005/…



Where the shared folders are:
/Folder1/

Which clients mount as:
/Volumes/Folder1



/username
- I'm the user
