Thanks for all the feedback regarding the just-announced Adobe Nav, Eazel, and Color Lava. A few quick thoughts:
- Please remember that these efforts are just part of a bigger picture that has yet to be revealed. I’ve seen comments along the lines of “Nice, but I want Lightroom for tablets”; “Why are you doing these apps instead of making improvement X to Photoshop?”; “I’d like to see more support for Android”; etc. The feedback is welcome, and none of these things are mutually exclusive.
- “Nav is one of the most exciting of our three new applications IF you think beyond Nav itself,” writes Photoshop PM Bryan O’Neil Hughes. “We’re showcasing one of the most powerful pieces of the new Photoshop SDK – the ability to drive Photoshop from a device.” See the rest of his comment for more perspective.
- You can indeed watch these videos via HTML5 on an iPad. Here’s a link to all of them plus a few I haven’t yet gotten to blog. For some reason embedded Adobe TV vids don’t work on iOS devices, but I’m told a fix is in progress.
Here I am again, the dumb guy who doesn’t get it.
“the ability to drive Photoshop from a device.”
I want to be closer to Photoshop, not farther away. I want something like a Cintiq that can really display colors and has an imperceptible separation between the stylus and the image, not an on-screen keyboard that controls a physical keyboard in the other room or some such thing.
When I say I want to do more, I mean getting the current limitations out of the way, not having a crippled finger painting experience that involves putting another piece of hardware between me and my work.
I guess I’ll have to take your word for it when you say that people like me “just don’t get it”, but I have to say that I can’t remember the last time someone said that and it turned out to be true. I’m ready. Knock my socks off.
With respect, SBG, the idea of using an iPad as a controller is indeed getting you closer. It’s a less expensive way to get tactile with Photoshop than a Cintiq, which, I believe, still requires a pen. The ‘real’ distance between you and PS may increase in some way (pick your system of measurement), but the ‘virtual’ distance is closing.
Nav is not just about being a keyboard, but about utilizing something far more complex than Ctrl+Alt+Shift+E. Access to menu items can be a lot faster, you can open up more shortcuts and make them contextual, etc. Given what I’ve seen and John’s penchant for “Great! What else you got?” questions to the dev team, I’d say you are looking at the edge of a far more intuitive way to work with your images.
I’ve seen nothing that implies a ‘crippled finger painting experience’. Instead, I see a way to burn through common tasks and open up new methods of interaction that were not previously available. And as for another piece of hardware between you and your software, well that’s just patently wrong – it’s not like the iPad is motorizing your mouse to jump on the keyboard. Rethink that argument, because it simply doesn’t make sense.
But here’s the kicker: you don’t have to like it. You don’t have to use it. If your current workflow is fine, stick with it. If you have problems, state them clearly and help BOH and the Adobe engineers find a solution. From your kvetching, I have no idea what limitations you are encountering, but I’d genuinely love to hear them. It sounds like a robust conversation with the community could help get you some of the changes you want. It’s your responsibility to articulate them clearly, though.
Let me be another one who just didn’t get it 😀 I don’t know how you guys work with Photoshop on a daily basis, but I have a stylus in one hand while the other is on the keyboard, ready to fire off shortcuts. Eyes on the screen. You want me to have another device to look at, jumping between it and the screen – how will that make work more intuitive? Shortcuts, you say? Cool idea, but here’s the problem that comes into play – it’s a virtual keyboard. You don’t FEEL it – using it for shortcuts will be just as much fun as using it for writing long texts. Trying to use it without looking at it will double that fun – like changing gears in your car on an iPad screen instead of holding a gear lever. The only way I see it being really useful is gestures, so you don’t have to take your eyes off the screen to look at the iPad.

But really and truly, all this iPad hype is becoming ridiculous. The multi-touch screen was supposed to change everything; Steve Jobs was ecstatic in his show for the Apple fans. What is it actually useful for? Scrolling websites with your finger, browsing through a phone menu, and scaling pictures (or moving them around on a virtual table, which is about as useful as a wheel attached to a fish). That’s about it. Oh, sorry – there are also a billion ultra-useful iPhone/iPad applications like… games. Also, you can use your finger to draw, right? Too bad that even a really good artist who tries to use it for commercial purposes comes out with results reminding me of my three-year-old son’s drawings. And now the iPad has to be EVERYWHERE. It is the FUTURE once again. Yeah, right. Just because you can doesn’t mean you have to. I’m sure you won’t be convinced, and that plenty of effort will go into this idea, but I’ll bet you any time that I will finish my job faster with a Wacom stylus and a hand on the keyboard than you will with this fancy-pants, costly device and its virtual, floating, multi-touch UI.
Sorry for the spelling mistakes – it’s 3:30 am here and I was rushing to get this comment in before falling asleep. Maybe I could use some external, iPad, multi-touch spell checker 😉
Yes, the first demo applications are on the iPad. That doesn’t mean the capability is limited to the iPad.
The SDK lets you write applications that control Photoshop via scripting (and a few new commands) across a network – with a password and encryption, of course.
These could be written on a tablet device, a phone, a TiVo, another desktop, an Air/Flash app, etc.
This allows you to write things like:
- Remote kiosks for specific tasks
- Tutorial apps that can drive Photoshop for you (and not cover your screen)
- Render farm controllers
- Teacher/classroom apps that distribute input files, monitor progress, and receive final results
- Remote color / data input from embedded devices
- Remote devices for slide shows / presentation of comps
- Secure inter-application scripting on the same machine, as well as between machines
- Smarter tutorials that respond to your current activity in Photoshop
- etc.
Yeah, this SDK isn’t for everyone. It’s for people who want to connect Photoshop to something else to help make their work easier or faster.
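To make the “drive Photoshop from a device” idea concrete, here’s a minimal sketch in Python of what such a remote client might look like. The payload is ordinary Photoshop ExtendScript (the same `app.activeDocument` DOM you’d use in a .jsx file); the transport shown is a hypothetical plain TCP helper – the real Connection SDK layers password-based encryption on top of the socket, and the host and port here are assumptions for illustration:

```python
import socket

def build_rotate_script(degrees):
    """Compose an ExtendScript snippet that rotates the current canvas.

    rotateCanvas() is part of the standard Photoshop scripting DOM;
    the SDK transports script strings like this one over the network.
    """
    return "app.activeDocument.rotateCanvas(%d);" % degrees

def send_to_photoshop(script, host="192.168.1.10", port=49494):
    """Hypothetical transport: open a TCP connection to the machine
    running Photoshop and send the script bytes. The real SDK also
    derives an encryption key from the shared password; that step
    is omitted in this sketch.
    """
    with socket.create_connection((host, port)) as conn:
        conn.sendall(script.encode("utf-8"))

# A tablet app responding to a two-finger twist gesture might call:
# send_to_photoshop(build_rotate_script(90))
```

The point being: the device-side app only needs to build strings and speak sockets; all the heavy lifting still happens in Photoshop on the desktop.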
The point is – I don’t see how this will make anybody’s WORK faster and easier. OK, your classroom/teacher app examples are good. And I’m not saying innovation shouldn’t go this way. I just don’t like all those empty slogans like “this will make your experience more personal and more intuitive.” It’s like those toilet paper commercials where somebody’s life-changing experience took place in the toilet because the paper was so soft. No one is saying there’s no place for toilet paper – we need it at times. Just keep it real – it will be of enormous help in the toilet. That’s it.
I think the main thing to remember is that we are seeing the very beginning of this whole process of extending capability to a multi-touch screen. Maybe these apps are a little *too* early in their development for many (or most) professional users to see the benefits. But I still foresee some pretty great ways to interact with Photoshop via multi-touch.
Two quick examples of future potential uses:
1) Tell PS to deliver all large parametric dialog boxes (such as Liquify, Vanishing Point, filters, etc.) to the touch screen and let your monitor preview the results
2) Puppet Warp. Not used by everyone, but can you imagine how much faster it could go with multi-touch?