The new RAW tool opens automatically when Snapseed detects a RAW file and works seamlessly with other Snapseed tools, such as Healing, Brushes, Frames, Text, HDR, and Details. Editing changes can be saved non-destructively, or exported as JPG in high quality. Some of the available adjustments for RAW include Structure, Tint, Shadow control, Exposure (-4.0 to 4.0 f-stops), and Temperature (1,700K to over 8,000K). Anyone using Snapseed 2.9 and an Apple USB SD card photo adapter or Wi-Fi SD card can now work with RAW images.
Here’s a RAW-vs.-JPEG comparison:
Meanwhile the new Face filter lets you brighten faces, smooth skin, and make eyes pop.
Want more? Okay, here’s more:
Also on Android: new Perspective and White Balance tools. Perspective straightens lines in your image by removing perspective distortion from the original shot. White Balance offers precise color control via an eyedropper tool.
Lastly, you now have greater control over image saving:
In addition to UI improvements and bug fixes on both Android and iOS, you can now set the preferred JPG compression rate, or even save lossless (PNG) when exporting.
Enjoy, and as always, please let us know what you think!
Watch a camera-wearing eagle named Darshan plunge off the top of the Burj Khalifa—and just wait till he turns on the jets:
FREEDOM is delivering its message to urban audiences by capturing never-before-seen footage taken from the backs of eagles flying from iconic city landmarks. Flights have taken place from the Eiffel Tower in Paris, Tower Bridge, St Paul’s Cathedral, and the ArcelorMittal Orbit in London. … As a partnership FREEDOM is a pioneering movement for the conservation community. It not only seeks to reintroduce threatened birds of prey, but it also serves an ambassador role for all species, highlighting ways for people and organisations to become involved in supporting effective frontline nature conservation programmes.
The only thing that could improve the footage is Danzig pumping out a little Where Eagles Dare.
I’m not sure about all this “Darshan” talk, though. You know damn well it’s Donald.
(Cue Goat Boy braying, “Hey, remember the niiiiineties?”)
This entertaining short piece covers not just “@ Cafe,” an East Village internet cafe that launched in 1995, but also touches on the history of St. Mark’s Place, gun-toting Germans, “Mr. Zero,” T1 lines, CUSeeMe, clueless Bryant Gumbel, and more. Enjoy!
jnack.com/BlowingYourMindClearOutYourAss—that’s the URL I picked, back circa 2005 (when men were men & we had to self-host all our videos!), to express my admiration for Nelson Chu’s Moxi watercolor app. Harnessing graphics processors to create realtime natural-media simulations has been a passion of his for the better part of 20 years. We hosted him as a summer intern on Photoshop, but we could never quite manage to marry this tech to the, ah, somewhat vintage underpinnings of PS. Now Nelson has launched the latest incarnation, Expresii ($59 for Windows). Check it out:
Movies, Android apps, magazines & more are ferried past government censors & past the lack of connectivity via “El Paquete Semanal” (the weekly packet), a 1TB flash drive that functions as the island’s backdoor content distribution network. Check out this short, enlightening documentary from Vox, and see John Graham-Cumming’s article for more details.
A bowling ball as puppeteering UI? Garbage bags as shimmering water? Who knew!
Check out the fascinating little documentary below, as well as The Verge’s overview (turns out that Laika head Travis Knight is the son of Nike founder Phil Knight) and an interesting NPR interview with Travis.
“What Google’s Tilt Brush is to Adobe Illustrator, SoundStage is to GarageBand,” writes The Verge. “It’s a VR music application that lets you arrange synthesizers, drums, speakers, and other equipment within the boundaries of your room, so you have a custom-built studio to make your own tunes.” Check it out:
“Using a GoPro Hero4 Session,” PetaPixel writes, “the clip puts you in the driver’s seat of a Hot Wheels car as it hurtles down 8 expertly crafted pieces of track connected by ‘teleporting tunnels…’ In all the car traversed about 200ft of track, most of it in the 4th section. As for the underwater drive, it’s real too.”
A little story of perseverance & defiance from my past…
When I got out of school with a History degree, I wasn’t exactly highly employable, so I used my self-taught design & coding skills to talk my way into an internship in NY. Most people at the agency were cool, but the admins were kind of petty tyrants who liked to deny things to the interns just to keep us in our (unpaid) places. That included denying us phones and nametags for our desks.
Not digging the indignity, one weekend I came into the office, stuck a full-time colleague’s nametag into a scanner, then brought it into Illustrator & generated a template from which I could bang out my own versions. I proceeded to create a ton of absurd variations—e.g. “Unmoved Mover,” “HMFIC,” etc.—that I then cycled through displaying. One variation said “Johnny Folk Hero,” and I ended up leaving it up for a while.
Later, a woman came back from a meeting laughing: “Someone was in there asking whether a graphics intern could do a project. ‘Isn’t there a Johnny something?’ she asked. ‘Oh, John Nack?’ ‘No, it’s like… something Native American, I think—Johnny Folk Hero or something??’”
This, as you might imagine, kind of made my life. 🙂
“Using traditional cameras and algorithms,” MIT News reports, “[Interactive Dynamic Video] looks at the tiny, almost invisible vibrations of an object to create video simulations that users can virtually interact with.” Check it out:
“This technique lets us capture the physical behavior of objects, which gives us a way to play with them in virtual space,” says CSAIL PhD student Abe Davis, who will be publishing the work this month for his final dissertation. “By making videos interactive, we can predict how objects will respond to unknown forces and explore new ways to engage with videos.”
Back in 2014, right after we’d both joined Google, my friend Alex Powell (who used to lead animation tools at DreamWorks) and I did a lot of exploration of what it would mean to turn photos and videos into paintings. Some of that work paid off later in work like Halloweenify face painting.
Now it’s exciting to see how the industry has evolved, using machine learning to style images as paintings. Evidently the band Drive Like Maria is veeeery patient and dedicated to getting this result. PetaPixel writes,
“We figured that if we’d process 600 pictures each (Bert, Bjorn, and myself) it would take about 5 hours per person to process everything at 30 seconds per picture,” the band’s guitarist Nitzan Hoffmann told us. “By the time we started processing the Android version of Prisma was also available so we could use iPhones and Android phones at the same time.”
Despite some issues transferring JPEGs to and from the phones while keeping them in some kind of order, they eventually managed to convert every frame—all 1,828 of them—into a “painting” by hand before plugging them back into Premiere Pro. A little bit of fiddling later, they had their music video!
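For the curious, the band’s back-of-envelope estimate holds up. Here’s a quick sanity check of their arithmetic (the variable names are mine, just for illustration):

```python
# Sanity-check the band's estimate: ~600 frames per person at
# ~30 seconds each, versus the actual total of 1,828 frames.
frames_per_person = 600
seconds_per_frame = 30
people = 3
total_frames = 1828

# Per-person workload: 600 * 30 s = 18,000 s = 5 hours, as quoted.
hours_per_person = frames_per_person * seconds_per_frame / 3600

# Whole-video workload at the same pace: roughly 15 hours combined.
total_hours = total_frames * seconds_per_frame / 3600

print(hours_per_person)        # 5.0
print(round(total_hours, 1))   # 15.2
```

So splitting the job three ways (plus running Prisma on iOS and Android phones in parallel) is what kept the hand-conversion from being a multi-day slog.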
The favelas of Rio aren’t well-known to many outsiders, partly because there’s limited information about these areas to include on maps. We partnered with the local Brazilian nonprofit Grupo Cultural AfroReggae on a project called “Tá No Mapa” (“It’s On the Map” in English). Together with AfroReggae we trained 150 favela residents on digital mapping skills and in just two years they’ve mapped 26 favelas and gotten more than 3,000 businesses on the map. Not only does this allow locals to find businesses like Bar do David—an award-winning restaurant in the favela Chapeu Mangueira—it’s helped some local residents get a mailing address for the first time. [Read more]
“Why isn’t using Photoshop more like dancing?” I used to ask teammates—making them look at me like I’d just ripped a giant bongload in the stairwell. I meant of course that I wanted to offer a more fluid, immersive, physically expressive way of working, not one dominated by clicks & keystrokes.
Progress can take a while, but Google’s Tilt Brush (see recent) & now Oculus’s Medium let you synthesize art in 3D space. “Built specifically for virtual reality,” they say, “Medium lets you sculpt, model, paint, and create solid-feeling objects in a VR environment.” Check it out:
Man, they are gonna crush it with the Micronaxx demographic. 🙂
Verne: The Himalayas is a new experimental Android app that invites you to explore Google Maps’ 3D imagery of the Himalayas alongside a 500-foot Yeti named Verne. As Verne, you can run up Mt. Everest in seconds, skate across icy lakes, chase yaks, discover bits of information, ride a jetpack, play Himalayan instruments, and more.
Perfect for adding watermarks & punching up images, the new Text filter debuts today in Snapseed 2.8 for iOS & Android:
The style options are endless: invert the text, change the opacity, or even combine the Text filter with the stack brush to create one-of-a-kind designs.
That is, after creating text, you can tap the little numeral icon at the top of the home screen in order to open up the layer stack, then tap the text filter, and then adjust its blending options and/or paint a mask.
Also included in the 2.8 update is the ability to configure Snapseed to resize photos when sharing or exporting, as well as various UI adjustments and bug fixes.
I think you’ll dig it. Let us know what you think!
Tilt Brush lets you paint in 3D space with virtual reality. Now, with audio reactive brushes, your sketches will bounce, sway, move and pulse to the beat. Just play audio on your computer from any source, enable Audio Reactor mode, and create your own VR music visualizer.
I can’t promise it’ll make golf exciting, and the mental image of my dad & uncles jumping up from their recliners to fly around an AR golf course makes me chuckle. Having said that, this demo of Microsoft’s HoloLens suggests some interesting possibilities for the future.
“He seems kinda bossy,” says Bob Sabiston, creator of Rotoshop, the app used to create Waking Life (under a giant poster of which I’m now typing). In this eight-minute piece he talks about 15 years of turning his back on what was probably untold wealth in order to pursue his own vision while retaining his independence. (Sorry to hear that A Scanner Darkly was such an alienating shit-show.)