Shapr3D is an iPad app that lets you create 3D models without needing a desktop computer or CAD software. Designs created in this “pro-level” tool are compatible with major CAD file formats and support instant export for 3D printing.
This is kinda inside-baseball, but I’m really happy that friends from my previous team will now have their work distributed on hundreds of millions, if not billions, of devices:
[A] face contours model — which can detect over 100 points in and around a user’s face and overlay masks and beautification elements atop them — has been added to the list of APIs shipped through Google Play Services…
Lastly, two new APIs are now available as part of the ML Kit early access program: entity extraction and pose detection… Pose detection supports 33 skeletal points, including hands and feet tracking.
Let’s see what rad stuff the world can build with these foundational components. Here’s an example of folks putting an earlier version to use, and you can find a ton more in my Body Tracking category:
TBH I’m a little underwhelmed by the specific effects shown here, but I remain intrigued by the idea of a highly accessible, results-oriented app that could also generate layered imagery for further tweaking in Photoshop and other more flexible tools.
The main goal of apps like this might simply be to introduce more people to the Adobe ecosystem. Adobe CTO Abhay Parasnis said as much in an interview with The Verge, in which he calls Photoshop Camera “the next one in that journey for us.” Photoshop Camera could act as the “gateway drug” to a Creative Cloud subscription for anybody who discovers a dormant love of photo editing.
I’ve long joked-not-joked that I want better parental controls on devices, not so that I can control my kids but so that I can help my parents. How great would it be to be able to configure something like this, then push it to the devices of those who need it (parents, kids, etc.)?
I’ve always been part of that weird little slice of the Adobe user population that gets really hyped about offbeat painting tools—from stretching vectors along splines & spraying out fish in Illustrator (yes, they’re both in your copy right now; no, you’ve never used them), to painting with slick features that got pulled from Photoshop before release & somehow have never returned. I still wish we’d been able to shoehorn GPU-powered watercolor into Photoshop’s, er, venerable compositing engine, but so it goes. (A 15-year-old demo still lives at one of my best URLs ever, jnack.com/BlowingYourMindClearOutYourAss )
[Please note: I don’t work on the Pixel team, and these opinions are just those of a guy with a couple of phones in hand, literally shooting in the dark.]
In Yosemite Valley on Friday night, I did some quick & unscientific but illuminating (oh jeez) tests shooting with a Pixel 4 & iPhone 11 Pro Max. I’d had fleeting notions of trying some proper astrophotography (side note: see these great tips from Pixel engineer & ILM vet Florian Kainz), but between the moon & the clouds, I couldn’t see a ton of stars. Therefore I mostly held up both phones, pressed the shutter button, and held my breath.
Check out the results in this album. You can see which camera produced which images by tapping each image, then tapping the little comment icon. I haven’t applied any adjustments.
Overall I’m amazed at what both devices can produce, but I preferred the Pixel’s interpretations. They were darker, but truer to what my eyes perceived, and very unlike the otherworldly, day-for-night iPhone renderings (which persisted despite a few attempts I made to set focus, then drag down the exposure before shooting).
Check out the results, judge for yourself, and let me know what you think.
Oh, and for a much more eye-popping Pixel 4 result, check out this post from Adobe’s Russell Brown:
Boy, what I wouldn’t have given to have had this tech in Photoshop Touch, where Scribble Selection was the hotness du jour. Pam Clark writes,
This feature on the iPad works exactly the same as on Photoshop on the desktop and produces the same results, vastly enhancing selection capabilities and speed available on the iPad. With cloud documents, you can make a selection on the desktop or the iPad and continue your work seamlessly using Photoshop on another device with no loss of fidelity; no imports or exports required.
We originally released Select Subject in Photoshop on the desktop in 2018. The 2019 version now runs on both the desktop and the iPad and produces cleaner selection edges on the mask and delivers massively faster performance (almost instantaneous), even on the iPad.
The feature is rolling out today; I was able to try it on my Pixel 4 without a hitch. It works across 44 languages and is available on both Android and iOS. Google Assistant is built into Android phones, so no separate app is required; on iOS, simply download the Google Assistant app to try it out.