Rohit's Realm

July 27, 2006

iPhoto No More, Part 2

A couple weeks ago, I wrote about how iPhoto's painful inefficiency forced me to abandon it altogether as a photo processing tool. Today, I will discuss how I have replaced what was once a vital tool in my workflow.

My first step after deciding to purge iPhoto from my life was to search for a similar replacement that would work either on my FreeBSD machines at home or, perhaps, on my PowerBook. A quick survey of the available open source products, however, yielded no clear winner that could replicate a photo processing workflow end to end, and I came to the conclusion that I would have to cobble together my own solution, leveraging existing tools where I could and writing my own programs where none were to be found.

Below are the technical details for implementing the process I outlined in my last post on the subject.

Step 1

The challenge of getting photos from the camera to my computer was easily resolved: gPhoto2 is an excellent open source tool (and library) for image capture from a wide range of digital cameras. (It definitely works with my point-and-shoot Canon PowerShot SD400; I am not as certain of the support for my Nikon D70, as I have not had time recently to test it, but I believe the D70 can be mounted directly if need be.) Running the following commands does the trick for me:

rohit@tyrant /export/pictures/incoming/paris05 % gphoto2 --auto-detect
rohit@tyrant /export/pictures/incoming/paris05 % gphoto2 --get-all-files

gPhoto2 takes a few minutes to run and puts all the files from your camera onto your computer. You can optionally delete the files from the camera afterwards as well, although I generally like to verify the transfer before doing so.

Step 2

Cataloging photographs in an application-independent manner was a crucial goal of mine; re-entering captions for different photo services is a pain in the ass! I decided to design my own basic SQL schema for this and then wrote a (relatively) simple command-line Perl script to perform the major cataloging tasks (sketched after the list below), including:

  1. Entering basic information about the photograph (i.e., Title, Summary or Caption, Description, Keywords, Copyright, etc.) and storing it in the EXIF/IPTC metadata tags of the photograph itself (thus, ultimate portability!);
  2. Extracting relevant technical data about the photograph (e.g., shutter speed, ISO, focal length) and storing this information, along with the semantic information gathered in item 1, in my relational database;
  3. Generating a standardized directory structure (i.e., one directory for thumbnails and one for full images);
  4. Renaming all photographs using a globally unique identifier (e.g., a numerical sequence); and
  5. Generating thumbnails for quick, portable viewing of the album.
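
To give a flavor of it, a stripped-down sketch of the tagging and cataloging idea, using the Image::ExifTool and DBI modules from CPAN, might look something like this; the DSN, table, and column names are placeholders for illustration rather than my actual schema:

#!/usr/bin/perl
# Rough sketch: write IPTC metadata into one photo and record it in the catalog.
use strict;
use warnings;
use Image::ExifTool;
use DBI;

my ($file, $title, $caption, $copyright) = @ARGV;

# Embed the semantic information in the image itself for portability.
my $exif = Image::ExifTool->new;
$exif->SetNewValue('IPTC:ObjectName',       $title);
$exif->SetNewValue('IPTC:Caption-Abstract', $caption);
$exif->SetNewValue('IPTC:CopyrightNotice',  $copyright);
$exif->WriteInfo($file) or die "could not write metadata to $file\n";

# Pull the technical details back out of the EXIF tags.
my $info = $exif->ImageInfo($file, qw(ExposureTime ISO FocalLength));

# Store both the semantic and the technical data in the catalog database.
# The DSN, table, and column names below are placeholders.
my $dbh = DBI->connect('dbi:mysql:database=photos', 'user', 'password',
                       { RaiseError => 1 });
$dbh->do('INSERT INTO photos (file, title, caption, copyright,
                              shutter_speed, iso, focal_length)
          VALUES (?, ?, ?, ?, ?, ?, ?)',
         undef, $file, $title, $caption, $copyright,
         $info->{ExposureTime}, $info->{ISO}, $info->{FocalLength});
$dbh->disconnect;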

At the end of this script run, I would have the following example directory structure for an album paris05 with two pictures:

/export/pictures/paris05/thumbs/000001.jpg
/export/pictures/paris05/thumbs/000002.jpg
/export/pictures/paris05/images/000001.jpg
/export/pictures/paris05/images/000002.jpg

Each image and thumbnail would have all the semantic metadata embedded within its EXIF/IPTC tags, and the thumbnails would have a nice 3px border, courtesy of ImageMagick.
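
A similarly stripped-down sketch of the layout, renaming, and thumbnail generation (items 3 through 5 above), using the PerlMagick (Image::Magick) bindings, might look something like this; the 160x160 thumbnail size and black border color here are arbitrary choices for illustration:

#!/usr/bin/perl
# Rough sketch: lay out an album, rename photos to a zero-padded sequence,
# and build thumbnails with a 3px border using PerlMagick.
use strict;
use warnings;
use File::Copy qw(copy);
use File::Path qw(mkpath);
use Image::Magick;

my ($album_dir, @originals) = @ARGV;   # e.g. /export/pictures/paris05 DSC_*.JPG
mkpath(["$album_dir/images", "$album_dir/thumbs"]);

my $seq = 1;
for my $original (@originals) {
    my $name = sprintf '%06d.jpg', $seq++;

    # Copy the full-size image into place under its new sequential name.
    copy($original, "$album_dir/images/$name")
        or die "copy of $original failed: $!\n";

    # Shrink a copy for the thumbnail and add the 3px border.
    my $img = Image::Magick->new;
    my $err = $img->Read("$album_dir/images/$name");
    die $err if $err;
    $img->Resize(geometry => '160x160');
    $img->Set(bordercolor => 'black');
    $img->Border(geometry => '3x3');
    $img->Write("$album_dir/thumbs/$name");
}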

Step 3

To facilitate portability, I decided to generate a simple HTML index of the album (again using a small Perl script) to allow quick browsing of thumbnails. The intention here was not to replicate Gallery-type functionality; rather, I just wanted a fast means of surveying all the pictures in an album, and nothing beats the ubiquity of HTML.

My directory structure for the example paris05 album would now look like this:

/export/pictures/paris05/index.html
/export/pictures/paris05/thumbs/000001.jpg
/export/pictures/paris05/thumbs/000002.jpg
/export/pictures/paris05/images/000001.jpg
/export/pictures/paris05/images/000002.jpg

With the completion of this step, I would now have a self-contained unit of photographs. Each photograph carries its own semantic information and the HTML page allows for quick viewing of the album as a whole. An example of this HTML is here.
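
The index generator might be sketched along these lines; the markup is deliberately bare-bones:

#!/usr/bin/perl
# Rough sketch: write a bare-bones index.html of thumbnails linked to full images.
use strict;
use warnings;

my $album_dir = shift or die "usage: $0 album_dir\n";   # e.g. /export/pictures/paris05

opendir my $dh, "$album_dir/thumbs" or die "$album_dir/thumbs: $!\n";
my @photos = sort grep { /\.jpg$/i } readdir $dh;
closedir $dh;

open my $out, '>', "$album_dir/index.html" or die "index.html: $!\n";
print {$out} "<html><head><title>$album_dir</title></head><body>\n";
for my $photo (@photos) {
    # Each thumbnail links to its corresponding full-size image.
    print {$out} qq{<a href="images/$photo"><img src="thumbs/$photo" alt="$photo"></a>\n};
}
print {$out} "</body></html>\n";
close $out;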

Step 4

The next step, of course, is to upload the photographs of moderate quality or better to my gallery for public consumption. I will probably eventually write a script for this using the Gallery Remote API, but as that is still not mature, my current process simply involves using scp to copy the entire images directory to my web server and then using the Gallery web interface to add the photographs. I have configured Gallery to automatically populate the summary, description, and keywords fields upon upload, although this mapping of EXIF tags to Gallery metadata is not entirely what I want. Eventually, I will have to take a look at the API and see if I can modify it to populate the title as well, for instance.
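
In the meantime, the upload portion amounts to little more than a glorified scp; wrapped in Perl, it might look something like this (the host and destination path are placeholders, not my actual server layout):

#!/usr/bin/perl
# Rough sketch: push an album's full-size images to the web server with scp.
use strict;
use warnings;

my $album = shift or die "usage: $0 album_name\n";   # e.g. paris05

# The host and destination path are placeholders.
my @scp = ('scp', '-r', "/export/pictures/$album/images",
           "rohit\@example.com:/var/www/albums/$album");
system(@scp) == 0 or die "scp failed: $?\n";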

Step 5

Finally, the most tedious step of all (in my humble opinion): archival. Like most folks, I am not as religious about my file system backups as I probably should be, but I have been making a sustained effort over the past few months to remedy that. As of now, almost all my systems implement RAID in some capacity, and I do off-site backups to a machine that is not in my house. However, in the event that something takes out San Francisco, like the next big one, I would like to have some removable-media backups stored far, far away.

Given the portable nature of my albums and their self-contained semantic data, backup is rather trivial. I simply tar and gzip each album and burn it onto two CD-Rs. Currently, both copies sit in my apartment, but the goal is for one of them to go elsewhere soon. Eventually, once I get around to buying a DVD burner, I will probably re-archive everything on DVD for the longer shelf life.
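
A sketch of that step, wrapped in Perl, might look like this (the pictures directory is, again, just my layout):

#!/usr/bin/perl
# Rough sketch: roll one album into a gzipped tarball ready for burning.
use strict;
use warnings;

my $album = shift or die "usage: $0 album_name\n";   # e.g. paris05

chdir '/export/pictures' or die "chdir: $!\n";
system('tar', 'czf', "$album.tar.gz", $album) == 0
    or die "tar failed: $?\n";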

So, this brings to an end my rather long and pointless discourse on how I purged iPhoto. Since its unlamented disappearance from my life, my anger has rapidly subsided, so this exercise was more for the sake of completeness than anything else. However, perhaps for the poor soul still slaving away under the draconian rule of iPhoto, this methodology will provide some solace. If any of you want to take a look at those scripts I mentioned, I'll probably clean them up and post them on the web this weekend.

Comments

Pretty nice workflow. I've always wanted to use embedded metadata, too. I would be interested in seeing your script for step 2. As for the globally unique identifier, I use date and time, e.g., 20060727_1430_59, also making sure to account for time zone changes.

It's funny that, just as you mention it, some people at LifeHacker are taking a look at iPhoto alternatives.

I'll clean it up so it doesn't choke on arbitrary input and then e-mail it to you. It uses fairly standard Perl modules available via CPAN, so you should probably be able to modify it to suit your needs fairly easily.

I'm also interested in the scripts you use for step 2. Once I get that all done, I'll probably work on integrating them into Gallery Remote. I imagine that wouldn't be too hard. It also looks like there's already a Perl script at http://codex.gallery2.org/Other_Clients that does it.
