CNET reported recently on a court case that involved image authentication software as well as human experts, both seeking to distinguish unretouched photographs from those created or altered using digital tools. After disallowing the software, written by Hany Farid & his team at Dartmouth, the judge ultimately disallowed a human witness, ruling that neither one could adequately distinguish between real & synthetic images. The story includes some short excerpts from the judge’s rulings, offering some insight into the legal issues at play (e.g. "Protected speech"–manmade imagery–"does not become unprotected merely because it resembles the latter"–illegal pornography, etc.).
As I’ve mentioned previously, Adobe has been collaborating with Dr. Farid & his team for a few years, so we wanted to know his take on the ruling. He replied,
The news story didn’t quite get it right. Our program correctly classifies about 70% of photographic images while correctly classifying 99.5% of computer-generated images — that is, an error rate of only 0.5% on CG images. We configured the classifier this way so as to give the benefit of the doubt to the defendant. The prosecutor decided not to use our testimony for other reasons, not because of a high error rate.
The defense argues that the lay person cannot tell the difference between photographic and CG images. Following this ruling by Judge Gertner, we performed a study to see just how good human subjects are at distinguishing the two. They turn out to be surprisingly good. Here is a short abstract describing our results. [Observers correctly classified 83% of the photographic images and 82% of the CG images.]
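Dr. Farid's point about "giving the benefit of the doubt to the defendant" is, in effect, a choice of decision threshold: the classifier only calls an image computer-generated when it is very confident, trading away some CG detections to keep false accusations rare. Here's a minimal sketch of that idea; the score scale and the 0.9 cutoff are hypothetical, not from his actual system:

```python
def classify(score: float, threshold: float = 0.9) -> str:
    """Label an image 'CG' only when the CG-likelihood score clears a
    high bar; everything else gets the benefit of the doubt as 'photo'.
    The threshold value here is illustrative, not Farid's actual setting."""
    return "CG" if score >= threshold else "photo"

# A confidently CG-scored image is flagged...
print(classify(0.95))  # → CG
# ...but a borderline one is let through as a photo, which is why
# CG accuracy (99.5%) and photo accuracy (70%) can differ so much.
print(classify(0.70))  # → photo
```

Raising the threshold lowers the chance a real photograph is mislabeled as CG, at the cost of missing more genuine CG images — exactly the asymmetry in the 99.5%/70% figures above.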
Elsewhere in the world of "Fauxtography" and image authenticity:
- In the wake of last summer’s digital manipulation blow-up, Reuters has posted guidelines on what is–and is not–acceptable to do to an image in Photoshop. [Via]
- Calling it "’The Most Culturally Significant Feature’ of Canon’s new 1D MkIII," Micah Marty heralds "the embedding of inviolable GPS coordinates into ‘data-verifiable’ raw files."
- Sort of the Ur-Photoshop: This page depicts disappearing commissars and the like from Russia, documenting the Soviet government’s notorious practice of doctoring photos to remove those who’d fallen from favor. [Via]
- These practices know no borders, as apparently evidenced by a current Iranian controversy, complete with Flash demo. [Via Tom Hogarty]
- Of course, if you really want to fake people out, just take a half-naked photo of yourself, mail it to the newspaper, and tell them that it’s a Gucci ad. Seems to work like a charm. [Via]
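As for Canon's "data-verifiable" raw files: the general idea behind such schemes is that the camera cryptographically signs the image bytes (and any embedded metadata, such as GPS coordinates) with a key it alone holds, so that any later edit breaks the signature. Canon hasn't published the details referenced above; the HMAC-based sketch below is just one hypothetical way the concept can work, with a made-up key:

```python
import hashlib
import hmac

# Hypothetical secret key burned into the camera hardware (illustrative only).
CAMERA_KEY = b"secret-key-known-only-to-the-camera"

def sign(image_bytes: bytes, gps: str) -> str:
    """Produce an authentication tag over the image data plus its GPS string."""
    return hmac.new(CAMERA_KEY, image_bytes + gps.encode(), hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, gps: str, tag: str) -> bool:
    """Check that neither the pixels nor the coordinates were altered."""
    return hmac.compare_digest(sign(image_bytes, gps), tag)

raw = b"\x00raw sensor data"
tag = sign(raw, "41.40338,2.17403")
print(verify(raw, "41.40338,2.17403", tag))             # True: untouched
print(verify(raw + b"edit", "41.40338,2.17403", tag))   # False: tampering detected
```

The catch, of course, is key management: anyone who extracts the camera's key can forge "verified" images, which is why real systems guard it in dedicated hardware.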
[Update: PS–Not imaging but audio: Hart Shafer reports on Adobe Audition being used to confirm musical plagiarism.]
Thoughts on “Digital imaging goes to court”
Now this is a fascinating and important issue. I offer a solution which I think is potentially viable. Every image is already tagged with information such as camera, shutter speed, etc. So why can’t Adobe tag modifications made in Photoshop with a watermark of sorts, embedded as part of the raw color data itself, to identify the image as edited? There must be a way to do so without ‘dropping’ the information after a save and resave. Could you put it into the RGB values themselves?
[Photoshop does offer the ability to record a history log inside a file’s metadata. By necessity that’s a voluntary process, so it functions as a way to show one’s work, rather than to defeat tampering. –J.]
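The commenter's "put it into the RGB values themselves" suggestion is essentially least-significant-bit (LSB) steganography. A minimal sketch of the idea, purely as illustration (this is not something Photoshop does, and it highlights the weakness: a lossy re-save, e.g. to JPEG, scrambles those low bits and destroys the mark):

```python
def embed_bit(channel_value: int, bit: int) -> int:
    """Overwrite the least-significant bit of an 8-bit color channel.
    Changes the value by at most 1, so the edit is visually invisible."""
    return (channel_value & ~1) | bit

def read_bit(channel_value: int) -> int:
    """Recover the hidden bit from a channel value."""
    return channel_value & 1

# Hide a 3-bit "edited" flag (1, 0, 1) in one RGB pixel.
pixel = (200, 117, 54)
marked = tuple(embed_bit(c, b) for c, b in zip(pixel, (1, 0, 1)))
print(marked)                          # (201, 116, 55)
print([read_bit(c) for c in marked])   # [1, 0, 1]
```

Because the scheme is fragile and trivially stripped by anyone who knows it's there, it works more like Photoshop's voluntary history log than like a tamper-proof seal — robust, resave-surviving watermarks are a much harder problem.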
I was once told that a disadvantage of DNG was that it didn’t support such verification, and so could never be used for forensic work, etc.
Is this true (or could sufficient information be stored in DNGPrivateData), and are changes being considered?
[Good question; I’ll ask. –J.]
This Wired article looks interesting, and related: http://www.wired.com/news/technology/0,72883-0.html?tw=wn_index_1
Can you comment on it at all, or are they just rehashing what you’ve already said?
[At the moment I don’t have anything new to offer. –J.]