Over on PhotoshopNews.com, Martin Evening provides a nice overview of the Photoshop CS3 beta’s new Clone Source palette. In a nutshell, you can now clone and heal more precisely by using a translucent overlay of your source pixels–either temporarily (hold down Opt/Alt+Shift after setting your source) or persistently (via the "Show Overlay" option on the palette). Building on what Martin wrote, here are some useful keyboard shortcuts:
- Opt/Alt + Shift temporarily shows the clone overlay, plus it lets you drag it around and ‘tack’ it down at the desired location.
- Opt/Alt + Shift + the arrow keys nudge the overlay up, down, and side to side.
- Opt/Alt + Shift + [ or ] rotates the source.
- Opt/Alt + Shift + < or > scales the source.
To adjust rotation, position, or scale, you can also use "scrubby sliders": hover over the label on each field (H, W, etc.), then drag left or right. As with all scrubby sliders, holding Opt/Alt while dragging will make the values change 10X slower, and holding Shift will make them change 10X faster.
On a related note, retouchers will be happy to learn that it’s now possible to have cloning/healing ignore adjustment layers. Let’s say in CS2 you had an image on the background layer, then added a layer above it to do some cloning (so as not to affect the original pixels), and above that you put a Hue/Saturation layer. If you used the clone tool set to sample all layers & didn’t turn off the Hue/Sat layer, the results would be screwy, as Hue/Sat would be double-applied. Now via a couple of new options (screenshot), you can elect to make cloning/healing ignore adjustment layers, and/or ignore all layers above the current one. It’s a really tweaky little change, but it’s one that’s been requested for ages.
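The double-application problem is easy to see with numbers. Here's a toy sketch of the idea (the +20% multiplier and pixel values are made up for illustration; a real Hue/Saturation adjustment is of course more involved):

```python
# Toy illustration of why cloning with "sample all layers" used to
# double-apply adjustment layers (values are hypothetical).

def hue_sat_boost(pixel):
    """Stand-in for a Hue/Saturation adjustment layer (+20%)."""
    return pixel * 1.2

base = 100.0                      # original pixel on the background layer

# CS2 behavior: the clone tool samples the flattened composite,
# which already includes the adjustment layer...
sampled = hue_sat_boost(base)     # 120.0 lands on the clone layer

# ...and then the adjustment layer above is applied again on output.
displayed = hue_sat_boost(sampled)
print(displayed)                  # 144.0 -- boosted twice, not once

# CS3 behavior: with "Ignore Adjustment Layers" on, cloning samples the
# unadjusted 100.0, and the adjustment is applied exactly once on output.
print(hue_sat_boost(base))        # 120.0
```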
Photoshop CS2 introduced the application’s first support for 32-bit high dynamic range (HDR) imaging. The support was pretty limited, consisting of the Merge to HDR command (for combining bracketed shots into a single image) and some basic imaging functions (cropping, cloning, converting from 32 to 8 or 16 bits per channel). Even so, about a year ago examples of HDR experiments started popping up (not solely connected to Photoshop, of course, but helped along by CS2). In the time since, more good resources on the subject have emerged.
The Photoshop CS3 beta includes some improvements in the HDR realm. Some more functions (e.g. Levels) are enabled for 32-bit images, and the Merge to HDR command, although superficially similar to the one in CS2, contains a variety of improvements. It benefits from the new image alignment code; preserves a more complete set of source data; and uses improved algorithms for merging the data.
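For the curious, the basic idea behind merging bracketed exposures is to weight each frame’s pixels by how trustworthy they are (mid-tones good, clipped shadows/highlights bad) and divide out the exposure time to recover scene radiance. Here’s a minimal sketch of that classic weighted-average approach; to be clear, this is not Adobe’s actual algorithm, and the hat-shaped weighting function and sample values are illustrative assumptions:

```python
# Simplified Debevec-style merge of bracketed exposures into an HDR
# radiance value, shown for a single pixel. Illustrative only -- not
# the algorithm used by Photoshop's Merge to HDR.

def merge_to_hdr(exposures):
    """exposures: list of (pixel_value_0_to_1, shutter_seconds) pairs
    for one pixel across the bracketed frames."""
    def weight(v):
        # Trust mid-tones most; near-black and near-white pixels
        # carry little reliable information.
        return 1.0 - abs(2.0 * v - 1.0)

    num = sum(weight(v) * (v / t) for v, t in exposures)
    den = sum(weight(v) for v, t in exposures)
    return num / den if den else 0.0

# One pixel captured at three shutter speeds (made-up values):
frames = [(0.9, 1.0), (0.5, 1 / 8), (0.1, 1 / 64)]
radiance = merge_to_hdr(frames)   # a single linear radiance estimate
```

The mid-tone frame dominates the result because its weight is highest, which is exactly why a wide bracket (like the 12-stop one below) helps: every part of the scene is well exposed in at least one frame.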
Trevor Morris has kindly supplied an HDR photo created with the CS3 beta, as well as the source frames. "I could never get it to work in CS2," he says, "but it worked flawlessly in CS3, and I was quite pleased with the results." He writes:
This photo was shot inside the Christ Church Cathedral, located in Fredericton, New Brunswick, Canada.
For this particular shot, I used a tripod and remote to capture 12 exposures, from 1/125s to 20s, with a Nikon D70 @ f/16, ISO 200, FL 18mm. I combined the exposures using Merge to HDR, increased the local contrast, and gave the image a slight saturation boost.
Give it a whirl with your bracketed shots, and please let us know whether it works well for you.
Yesterday’s discussion of Smart Filters made me think that it’s worth writing up some thoughts on Smart Objects & the future of compositing in Photoshop in general.
I have a hypothesis, at least as regards Photoshop: flexibility generally breeds indirectness. That is, when you step away from the familiar world of applying pixel tools directly to a plane of pixels, you introduce complexity. Whether or not that complexity is worth accepting depends on bang for the buck.
Smart Filters–i.e. those that can be adjusted or removed at any time, leaving the underlying pixels unaffected–address what is probably the single longest-standing feature request in Photoshop. Customers’ response to them has been quite good, but the details of how & why they work as they do may be a little subtle. For example:
- Why can’t you paint directly onto a surface that has a Smart Filter applied?
- Why are you limited to having one filter mask per layer (instead of having one per filter)?
- Why do Smart Filters add file size?
If you’re interested in the story of how and why Smart Filters came to be as they are, read on. I find the whole topic of how Photoshop is evolving from a simple "a pixel is a pixel" app into a dramatically more powerful editing pipeline fascinating, but I recognize it’s not everyone’s cup of tea. 🙂
Interesting design/photography bits:
[See also previous bits]
Much has been written about the fact that the speed of individual CPU cores isn’t increasing at the rate it did from 1980 through 2004 or so. Instead, chip makers are now turning to multi-core designs to boost performance. (See this brief primer from Jason Snell at Macworld.) Thus a lot of people have been asking whether Photoshop takes advantage of these new systems. The short answer is yes, Photoshop has included optimizations for multi-processor machines (of which multi-core systems are a type) for many years.
What may not be obvious to a non-engineer like me, however, is that not all operations can or should be split among multiple cores; doing so can actually make them slower. Because memory bandwidth hasn’t kept pace with CPU speed (see Scott Byer’s 64-bit article for more info), the cost of moving data to and from each CPU can be significant. To borrow a factory metaphor from Photoshop co-architect Russell Williams, "The workers run out of materials & end up standing around." The memory bottleneck means that multi-core can’t make everything faster, and we’ll need to think about new kinds of processing specifically geared toward heavy computation with low memory traffic.
Because Russell has forgotten more than I will ever know about this stuff, I’ve asked him to share some info and insights in the extended entry. Read on for more.
Sometimes it’s the smallest, weirdest things that drive feature development. In the case of the new Quick Selection Tool & Refine Edge command*, hair loss played a key role.
As of this past summer, Photoshop engineers Jeff Chien and Gregg Wilensky had been cranking away on these tools for a while & had them working well for hard-edge selections. As luck would have it, Jeff’s mane is a little thin on top, and Gregg is more follicularly challenged. So, when Jeff returned from a vacation in Taiwan, he was rather unhappy to find that Quick Selection was selecting only his head, missing the wispy bits of hair on top. As he proclaimed while making a quick whiteboard self-portrait, "I need to keep all the hair I’ve got!"
The desire to do a better job with irregular edges like hair got the guys thinking about new solutions, resulting in new algorithms we’ve been calling TrueEdge. You can see the kind of refinement possible via the Radius & Contrast controls in this screenshot. Pretty cool, eh? Viva Mother Nature (sorry, guys!). 😉
*For a video intro to the tools, you can consult Deke McClelland or Dave Cross.
Bill Perry, who manages global developer relations for mobile and devices at Adobe, has posted a quick walk-through of creating & previewing artwork using the Photoshop CS3 beta together with the new Adobe Device Central. ADC lets you browse among device profiles*, then preview your artwork on the devices while simulating screen glare, changes to backlighting, and more.
I have to say, I’m really glad to see mobile authoring get some love and attention at Adobe. It’s not that the company didn’t have ideas and tools prior to the Macromedia integration; in fact, GoLive included a variety of mobile emulators & authoring tools. It’s just that we could not, for the life of us, adequately get customers’ attention. I always envied Macromedia’s resolve to make its work known & would rant to fellow PMs: "Look at these guys: they’re giving away a plasma TV to whoever creates the best content with their tools. What do we offer–an ’82 Dodge Rampage and half a can of Schlitz?"
Now, however, we have a chance to bring the tools together to form an end-to-end solution. I’m very curious to see what people will create, and where we can take these tools going forward.
*A short list during the beta, but that’s temporary