Category Archives: Photography

The Google Photos editor gets smarter & more powerful

“Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.” – Antoine de Saint-Exupéry


When I joined the Google Photos team, they’d just integrated Snapseed into Google+ (the predecessor of Photos). As I hope is obvious, I’m a huge Snapseed fan, but when we looked at what most users actually did in G+ (crop, rotate, tweak brightness, and maybe apply a filter), it became clear that Snapseed was dramatically more complex & powerful than they needed.

Therefore we made the hard decision to reset & build a new editor from scratch. We aimed to deliver great results in a single tap, offer just a few powerful sliders (which under the hood adjusted numerous parameters), and keep Snapseed just one extra tap away (via the overflow menu) for nerds like me.

The vision was always to keep learning from users’ behavior, then thoughtfully enable just the controls needed to deliver extra power when needed. I’m delighted to say that Photos now does just that: the update released Tuesday on iOS, Android, and Web (try it here) manages to keep a simple top-level UI while revealing a lot more of the power under the hood.

The filters UI applies Auto (which can now produce more accurate results) as part of every filter:

These unique looks make edits based on the individual photo and its brightness, darkness, warmth, or saturation, before applying the style. All looks use machine intelligence to complement the content of your photo.


In the adjustments section, in addition to the Light, Color, and Pop sliders:

  • Light opens to reveal Exposure, Contrast, Highlights, Shadows, Whites, Blacks, and Vignette
  • Color opens to reveal Saturation, Warmth, Tint, Skin Tone, and Deep Blue

I continue to find Auto to be highly effective for the bulk of my images, but I like being able to pop the hood when needed.

Please take the new features for a spin & let us know what you think!

Oh, and since you’ve been kind enough to read this far, here are some useful shortcuts for use on desktop:

  • E to enter & exit the editor
  • R to enter & exit crop/rotate
  • Shift-R to rotate 90º
  • A to Auto Enhance
  • O (press & hold) to see original
  • Z to zoom
  • Left/right arrows to move among images
  • Cmd-C/V to copy/paste edits among images
  • After clicking a slider, use arrow keys to adjust it & press Tab to put focus onto the next slider

Introducing Google PhotoScan

“Photos from the past, meet scanner from the future.” I think you’re gonna dig this (available now on Android & iOS). 🙂

Don’t just take a picture of a picture. Create enhanced digital scans, with automatic edge detection, perspective correction, and smart rotation.

PhotoScan stitches multiple images together to remove glare and improve the quality of your scans.
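The real pipeline aligns frames and blends them far more carefully than this, but the core trick can be sketched in a few lines: glare is a bright specular highlight that lands in a different spot in each capture, so a per-pixel minimum across pre-aligned frames tends to suppress it. (A toy Python sketch on tiny pre-aligned grayscale frames; everything here is illustrative, not PhotoScan's actual algorithm.)

```python
def remove_glare(frames):
    """Per-pixel minimum across pre-aligned grayscale frames: glare is a
    bright highlight that appears in different places in different shots,
    so the darkest observation of each pixel tends to be the glare-free one."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[min(f[r][c] for f in frames) for c in range(cols)]
            for r in range(rows)]

# Two tiny "captures" of the same print, each with glare in a different corner.
shot_a = [[200, 90], [80, 70]]   # glare at top-left
shot_b = [[100, 90], [80, 255]]  # glare at bottom-right
print(remove_glare([shot_a, shot_b]))  # -> [[100, 90], [80, 70]]
```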

Check it out:

So, how does it work? Let’s hear right from the team:

Enjoy, and as always please let us know what you think!


[YouTube 1 & 2]

Check out Google RAISR: Sharp images with machine learning

If you share a picture of a tree in a forest, but no one can see it, did you really share it?

Working at Google, where teams aspire to “three-comma moments” (i.e. reaching 1,000,000,000 users), it’s become overwhelmingly clear to me that all the fancy features in the world don’t mean squat if people can’t access them. And traveling in Nepal, I got a taste of just how slow & expensive connectivity can be. Anything that helps deliver content faster & more cheaply means more democratic access to ideas & inspiration.

That’s why I’ve been really excited about RAISR (“Rapid and Accurate Image Super-Resolution”). The Google Research team writes,

[It’s] a technique that incorporates machine learning in order to produce high-quality versions of low-resolution images. RAISR produces results that are comparable to or better than the currently available super-resolution methods, and does so roughly 10 to 100 times faster, allowing it to be run on a typical mobile device in real-time.
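As a rough intuition for how that works: RAISR starts from a cheap upscale, hashes each pixel's local gradient into a bucket, and applies the small filter learned for that bucket. Here's a toy 1-D sketch of the idea; the bucket test and filter values below are made up for illustration (real RAISR hashes 2-D gradient angle, strength, and coherence, and learns much larger filters from image pairs).

```python
def cheap_upscale(signal, factor=2):
    # Fast, blurry first pass: nearest-neighbour upscaling.
    return [v for v in signal for _ in range(factor)]

def gradient_bucket(up, i):
    # "Hash" the local structure. Real RAISR buckets by the angle,
    # strength, and coherence of the 2-D gradient; this toy uses a
    # single flat-vs-edge test on the 1-D neighbour difference.
    left = up[max(i - 1, 0)]
    right = up[min(i + 1, len(up) - 1)]
    return "edge" if abs(right - left) > 10 else "flat"

def raisr_sketch(signal, filters, factor=2):
    up = cheap_upscale(signal, factor)
    out = []
    for i, v in enumerate(up):
        left = up[max(i - 1, 0)]
        right = up[min(i + 1, len(up) - 1)]
        a, b, c = filters[gradient_bucket(up, i)]
        out.append(a * left + b * v + c * right)
    return out

# Stand-ins for the learned filters: identity in flat areas,
# mild sharpening (with the usual overshoot) across edges.
filters = {"flat": (0.0, 1.0, 0.0), "edge": (-0.25, 1.5, -0.25)}
print(raisr_sketch([10, 10, 100, 100], filters))
```

Because the filters are just looked up and applied, the expensive learning happens offline; at run time the work per pixel is tiny, which is why it can run in real time on a phone.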


I’ve been championing this tech within the company and—because the research paper is public—encouraging friends at Facebook, Adobe, Apple, and elsewhere to check it out. Fast, affordable access is good for everyone.

It’s funny: I came here to “teach Google Photoshop” (i.e. to make computers see & create like artists), yet if I do my job right here, you’ll never spot a thing. I’ve come to prioritize access far ahead of synthesis. Funny ol’ world.

PS—Obligatory (?) Old Man Nack remark: “In my day, Genuine Fractals, blah blah…”

Adobe demos automatic sky-swapping

My old Photoshop boss Kevin used to show a chart that nicely depicted the march of tools from simple & broad (think Clone Stamp) to sharp & purposeful (Healing Brush), smartly tailored to specific needs. I love to see how computer vision is helping to extend that arc, as demonstrated here:

Adobe says SkyReplace uses deep learning to automatically figure out the boundary lines between the sky and the rest of the shot (e.g. buildings and ground). It can then not only swap out the old sky and insert a completely new one, but it can adjust the rest of the photo to take on the same look and feel as the new sky, creating a more realistic look.


The N-up UI reminds me of Photoshop’s early-’90s Variations dialog. Maybe graphically, as well as politically, everything old is new again.


[Via]

Visual storytelling: Harry Caray calls the Cubs’ final out

I couldn’t take it; I just couldn’t take it.

In 1984, for the first time since 1945 (when my lifelong-fan dad was 5 years old), the Cubs won their division championship & reached the postseason. I remember sitting in the living room with my dad watching Harry Caray call Rick Sutcliffe’s winning pitch on WGN. Holy Cow!!

The Cubs proceeded to take a two-games-to-none lead in the best-of-five series with the San Diego Padres. No problem! Just win one game—one of three!—and they were through to the Series. Then… then they went full-on Cub, and during that fateful game 5 my cousin Andy & I headed upstairs to play Lego boats in the tub while the team blew their lead, the series, their shot. I just couldn’t take it.

And then… this past Wednesday. You know what happened. We went insane.

I’ll be honest, though, and confess just a little melancholy. It’s not 1984, and I’m not a kid watching the game with my dad. I’m thrilled for the team, but I can still tell you more players’ names from ’84 than from ’16. I’ve long lived in California and won’t tell you I follow the game or the team as I once did. The victory is great—amazing!—but… you can’t quite go home again.

Ah, but what if you could, just a little? Thanks to some clever & lightning-fast editing, Budweiser brought back to life that iconic voice of my youth, Harry “Cub fan, Bud man” Caray. Just watch. I can’t stop.

https://youtu.be/nApTGkLd2hs

And Harry, wherever you are, thanks. This Bud’s for you.


[YouTube]

An expressive new camera app coming from Facebook

Snapchat often describes itself as being “a camera company,” and Facebook (owning Instagram, and having bought MSQRD) is clearly taking a similar path:

The new camera will be accessible by simply swiping right while you’re looking at your news feed, making it easy to quickly snap a filtered moment to share.

 TechCrunch reports that there will be filters for your face (selfie masks, overlaid graphics, and geofilters), art-themed filters, and filters that “respond to your body’s movements.” 

We’ll have to see, though: a lot of the appeal of Snapchat’s filters has been in their social context: here today, gone today—transient fun. As I’ve often said, the genius of Instagram was in helping regular people be a bit better, and the genius of Snapchat has been in letting them not care. Posting lasting stuff to FB generally equals caring. But who knows: perhaps, like Instagram, FB will introduce ephemeral “here today, gone today” stories. (I’d certainly use them.) Interesting times.


[Via]

You can now edit your panoramas in Snapseed

I was really annoyed & frankly embarrassed to find not long ago that when I used Snapseed to edit a panorama (captured via either my iPhone’s built-in pano mode or Google Street View’s 360º capture) and then posted it to Facebook, its magical pano-ness would be lost: the image was rendered as a flat JPEG instead of as an interactive pano.

Happily this has been fixed, and if you install the latest update to Snapseed, you should be able to edit panos, then upload them in interactive form. (This works for spheres shown via photos.google.com, too.) Take it for a oh God don’t let me say spin and let me know if you hit any snags.


Photography: A fun aerial tour of Google Chicago

Okay, I hesitated to share this as I’m allergic to corporate self-congratulation, but A) it’s some pretty amazing aerial filmmaking (including in thunderstorms!), and B) the chase of the Androids is just so weird, and it gets only weirder/funnier as it progresses. That detail reminds me of Khoi Vinh’s smart observation from a couple years back:

Apple fans like myself often criticize Google for doing things that Apple would never do, and Smarty Pins is a prime example of that. Aside from being an unfair criticism, it’s pointless. The fact that Google endeavors to produce silly things like this is on the whole a positive thing, I believe. It’s acting according to its own compass, which is what every company should be doing.

Props to Joey Helms & crew.


[Via Alex Osterloh]

Google’s trippy VR 360º Sprayscape lets you paint to create VR spheres

Wow:

We love VR. We love taking pictures. So we figured, why not try smashing the two together?

Sprayscape is a quick hack using the phone’s gyroscope to take pictures on the inside of a 360-degree sphere. Just point your phone and tap the screen to spray faces, places, or anything else onto your canvas. Like what you’ve captured? Share your creations via a link and your friends can jump into your scapes and have a look around using their phones or even Google Cardboard.

https://youtu.be/6o3_m4-YH9U

 Nerdy details from the team:

 Sprayscape is built in Unity with native Android support. Sprayscape maps the camera feed on a 360 degree sphere, using the Cardboard SDK to handle gyroscope data and the NatCam Unity plugin for precise camera control.

The GPU makes it all possible. On user tap or touch, the camera feed is rendered to a texture at a rate of 60 frames per second. That texture is then composited with any existing textures by a fragment shader on the GPU. That same shader also creates the scape you see in app, handling the projection from 2D camera to a 360 sphere.

When a user saves a scape, a flat panorama image is stored in the app data. When a user shares a scape, the three.js web viewer takes that flat image and wraps it to a sphere, making it navigable on mobile web by panning, tilting, and moving your device.
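For the curious, the “wraps it to a sphere” step is just the standard equirectangular mapping. Here's a small Python sketch of the math (assuming the usual convention of u as longitude and v as latitude; the actual viewer performs the equivalent texture lookup on the GPU via three.js):

```python
from math import cos, sin, pi

def equirect_to_direction(u, v):
    """Map a point (u, v) in [0,1] x [0,1] of a flat equirectangular
    panorama to the 3-D unit direction the viewer looks along when the
    image is textured onto the inside of a sphere.
    u sweeps longitude around the horizon; v sweeps latitude pole to pole."""
    lon = (u - 0.5) * 2 * pi   # -pi .. pi around the horizon
    lat = (0.5 - v) * pi       # +pi/2 at the top edge, -pi/2 at the bottom
    return (cos(lat) * sin(lon), sin(lat), cos(lat) * cos(lon))

# Centre of the image looks straight ahead; the top edge looks straight up.
print(equirect_to_direction(0.5, 0.5))  # -> (0.0, 0.0, 1.0)
```

Panning and tilting the phone just moves the virtual camera among these directions; the flat saved image never changes.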


[YouTube] [Via]

Photography: Werner Herzog’s “Inferno”

Chuck that Dan Brown shite into some molten rock & peep this Inferno instead. (I mean, I’d listen to Werner read the phone book, and here he is talking volcanoes, for God’s sake.)

Werner Herzog’s latest documentary, Into the Inferno, heads just where its title suggests: into the red-hot magma-filled craters of some of the world’s most active and astonishing volcanoes—taking the filmmaker on one of the most extreme tours of his long career. From North Korea to Ethiopia to Iceland to the Vanuatu Archipelago, humans have created narratives to make sense of volcanoes; as stated by Herzog, “volcanoes could not care less what we are doing up here.” Into the Inferno teams Herzog with esteemed volcanologist Clive Oppenheimer to offer not only an in-depth exploration of volcanoes across the globe but also an examination of the belief systems that human beings have created around the fiery phenomena.


[YouTube] [Via]

Computational photography: Inside the Google Pixel

Many years ago the Photoshop team collaborated with Stanford professor Marc Levoy & his team. We were especially interested in their work to create a programmable device—charmingly known as the “Frankencamera”—that could run emerging algorithms to guide both capture & processing.

Fast forward to today, and Marc is leading a team of researchers at Google who just helped ship the new Pixel phone. As Marc notes, “The French agency DxO recently gave the Pixel the highest rating ever given to a smartphone camera.” On the Verge he provides lots of interesting details about how the camera works. For instance,

The Hexagon digital signal processor in Qualcomm’s Snapdragon 821 chip gives Google the bandwidth to capture RAW imagery with zero shutter lag from a continuous stream that starts as soon as you open the app. “The moment you press the shutter it’s not actually taking a shot — it already took the shot,” says Levoy. “It took lots of shots! What happens when you press the shutter button is it just marks the time when you pressed it, uses the images it’s already captured, and combines them together.”
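In other words, the camera is a ring buffer. Here's a toy sketch of that zero-shutter-lag pattern (hypothetical names throughout, with a per-pixel mean standing in for the Pixel's far more sophisticated RAW merge):

```python
from collections import deque
import time

class ZslCamera:
    """Zero-shutter-lag sketch: frames stream into a fixed-size ring
    buffer from the moment the app opens, so pressing the shutter just
    marks the moment and merges frames that were already captured."""

    def __init__(self, depth=8):
        self.ring = deque(maxlen=depth)  # oldest frames fall off automatically

    def on_frame(self, frame):
        # Called continuously by the sensor pipeline, shutter or not.
        self.ring.append((time.time(), frame))

    def press_shutter(self):
        # No capture happens here: grab the frames that already exist
        # and combine them (toy merge: per-pixel mean to cut noise).
        frames = [f for _, f in self.ring]
        return [sum(px) / len(frames) for px in zip(*frames)]

cam = ZslCamera(depth=4)
for noisy_frame in ([10, 20], [12, 18], [14, 22], [12, 20], [12, 20]):
    cam.on_frame(noisy_frame)   # five frames arrive; buffer keeps the last four
print(cam.press_shutter())      # -> [12.5, 20.0]
```

The `deque(maxlen=...)` does the real work: old frames are discarded automatically, so memory stays bounded no matter how long the viewfinder is open.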

Read on for more—or if you just want some quick highlights, check out this two-minute tour shot entirely with a Pixel:


[YouTube]

Google Photos now turns videos into weirdly charming little animations

“We’ve always made animations from photos,” the team writes, “but now we make animations from your videos, too. And not just any videos. We look for segments that capture activity — a jump into the pool, or even just an adorable smile — and create short animations that are easy to share.”

Here’s one it generated of the Micronaxx:

And it made another from the luau we attended last week:

As before, you don’t need to do anything: just let Photos back up your vids, then watch for Assistant notifications.


Join the Worldwide Photo Walk this Saturday

From Boston to Benghazi, photogs are headed out to create beauty & raise money for a good cause. Scott Kelby writes:

If you haven’t signed up head over to the official site and see if there’s a photo walk set-up near you (if you haven’t checked in a while, it’s worth checking again – we have photo walks organized in over 1,000 cities with over 20,000 photographers walking around the world. – It’s not too late to sign up!)

Find a walk near you (it’s free).

Can’t walk? You can still help support the Photo Walk by supporting The Springs Of Hope Kenya Orphanage.

Don’t forget to share the event with your friends on Facebook or Twitter and use the event hashtag #WWPW2016.


Google Photos introduces concept movies

 “POV: Preserve, Organize, Visualize”—that’s always been my mnemonic for Google Photos: your intelligent assistant should keep everything safe, help you navigate it through intelligent auto-organization, and then make it delightful to see & share the results.

Things are taking a big step forward with the introduction of concept movies:

Google Photos has always made movies for you using your recently uploaded photos. Now we’re going further, with new movies that are based on creative concepts — the kinds of movies you might make yourself, if you just had the time. And they’re not only limited to your most recent uploads. One of the first concepts is designed to show your child growing up right before your eyes.

Here’s a little capsule of my son Henry, made with zero work from me:

https://youtu.be/aNiu5mkyBns

My fam has been loving these. When I sent her the Henry video, my wife said,

Oh my God, honestly? I’m all smiles. It’s so cute!!! The music is really perfect and the timing of shots against the music is also great. I’m feeling all mushy now.

And this is just the beginning:

We’re rolling out a couple more concepts this week, with more coming soon. Look out for a concept to commemorate the good times from this summer, and another one for formal events like weddings. And you don’t need to do a thing — these movies get made automatically for you.

As always, your feedback is most appreciated!


[YouTube]

Lightroom Mobile adds DNG capture on iOS

Catching up with the Android version launched in February (sorry, couldn’t resist :-)), LrM can now capture raw images in DNG format on iOS 10 (via an iPhone 6s or 7). The team blog provides a bunch of good perspective, including this useful side-by-side JPEG-vs-DNG comparison:

In order to capture shadow detail, this image was metered from the shadows, resulting in blown-out highlights. The DNG version on the right enabled the highlights to be recovered without issue.

Nice work, guys!


Motion Stills adds text, gets faster, more

Following its tech’s inclusion in Google Photos, is the standalone stabilization tool going away? Nope: it remains a great way to combine multiple Live Photos into short movies, and the app has just gotten some nice enhancements.

  • You can add text in a variety of styles
  • You can make clips ping-pong back and forth
  • The app runs roughly 2x faster & uses less local storage

Enjoy! Let us know what else you’d like to see, and please share your creations with #MotionStills.


Motion Stills tech arrives in Google Photos

This is rad: Your Live Photos are now automatically stabilized & made more sharable in Google Photos for iOS thanks to direct integration of the Motion Stills app technology that debuted this summer:

Using advanced stabilization and rendering originally used in the Motion Stills app, Google Photos can freeze the background in your Live Photos or create sweeping cinematic pans, turning your Live Photos into beautiful, captivating moments. Easily save it as a looping video and share it with anyone.

This update also includes the ability to sort photos in albums chronologically or by recently added (fear not – this is coming soon to Android and web as well). And, based on your feedback, you can now choose a new thumbnail for faces in People.

Enjoy!

[Via]

Workflow tips: Getting RAW files into Snapseed

 Now that 144 camera models (see list) are supported in Snapseed on iOS, how can you get images from them into the app? Ah, glad you asked. Here’s some info from Snapseed Help:

  • Lightning to SD Card Camera Reader: will read all supported RAW files and allow the user to import them to the Camera Roll. Note: Some DNG files may appear blank in the interface and Camera Roll but will be shown correctly in Snapseed. Check it out in the Apple Store.
  • Lightning to USB Camera Adapter: can be used in combination with a camera’s USB port or even a USB SD card reader to read all supported RAW files and allow the user to import them to the Camera Roll. Check it out in the Apple Store.
  • EyeFi MobiPro: RAW files can be transferred to an iOS device via WiFi using the “Eyefi Mobi” app and selecting share/save photo. Photos will be saved as RAW files to the Camera Roll. Note: This requires iOS 9.3.4.
  • Google Drive: Select a photo in Drive, tap on the dot dot dot icon, then select “Send copy.” Drive will download the file. Select “Save Photo” to save it to the Camera Roll, or “Open in” to open it directly in Snapseed. Note: This requires Drive v4.12 and iOS 9.3.4.
  • Apple Mail: Email a RAW file, fully download it in Mail, then open the photo preview and tap the “share” icon. Select “Save Photo” to save it to the Camera Roll, or “Copy to Snapseed” to open it directly in Snapseed.


New Snapseed delivers RAW, Face filter, White Balance, Perspective, and more

I’m delighted to say that Snapseed now supports raw image editing on both iOS & Android:

The new RAW tool opens automatically when Snapseed detects a RAW file and works seamlessly with other Snapseed tools, such as Healing, Brushes, Frames, Text, HDR, and Details. Editing changes can be saved non-destructively, or exported as JPG in high quality. Some of the available adjustments for RAW include Structure, Tint, Shadow control, Exposure (-4.0 to 4.0 f-stops), and Temperature (1,700K to over 8,000K). Anyone using Snapseed 2.9 and an Apple USB SD card photo adapter or WiFi SD card can now work with RAW images.

Here’s a RAW-vs.-JPEG comparison:


 Meanwhile the new Face filter lets you brighten faces, smooth skin, and make eyes pop.

Want more? Okay, here’s more:

Also on Android: new Perspective and White Balance tools. Perspective straightens lines in your image by removing the perspective effect from the original image. White Balance offers fine color balance control with pinpoint precision via an eye dropper tool.

Lastly, you now have greater control over image saving:

In addition to UI improvements and bug fixes on both Android and iOS, you can now set the preferred JPG compression rate, or even save lossless (PNG) when exporting.

Enjoy, and as always, please let us know what you think!

Photography: Where Eagles Dare

Watch a camera-wearing eagle named Darshan plunge off the top of the Burj Khalifa—and just wait til he turns on the jets:

FREEDOM is delivering its message to urban audiences by capturing never-before-seen footage taken from the backs of eagles flying from iconic city landmarks. Flights have taken place from the Eiffel Tower in Paris, Tower Bridge, St Paul’s Cathedral, and the ArcelorMittal Orbit in London. … As a partnership FREEDOM is a pioneering movement for the conservation community. It not only seeks to reintroduce threatened birds of prey, but it also serves an ambassador role for all species, highlighting ways for people and organisations to become involved in supporting effective frontline nature conservation programmes.

The only thing that could improve the footage is Danzig pumping out a little Where Eagles Dare.


I’m not sure about all this “Darshan” talk, though. You know damn well it’s Donald.


[YouTube] [Via]

Google & Rio’s favela residents team up to map in 360º

You can explore in VR & 360º video thanks to a very cool partnership:

The favelas of Rio aren’t well-known to many outsiders, partly because there’s limited information about these areas to include on maps. We partnered with the local Brazilian nonprofit Grupo Cultural AfroReggae on a project called “Tá No Mapa” (“It’s On the Map” in English). Together with AfroReggae we trained 150 favela residents on digital mapping skills and in just two years they’ve mapped 26 favelas and gotten more than 3,000 businesses on the map. Not only does this allow locals to find businesses like Bar do David—an award-winning restaurant in the favela Chapeu Mangueira—it’s helped some local residents get a mailing address for the first time. [Read more]


[YouTube 1 & 2]

Snapseed introduces text support

Perfect for adding watermarks & punching up images, the new Text filter debuts today in Snapseed 2.8 for iOS & Android:

The style options are endless with the ability to invert the text, change the opacity, or even use the Text filter in combination with the stack brush to create one-of-a-kind designs.

That is, after creating text, you can tap the little numeral icon at the top of the home screen in order to open up the layer stack, then tap the text filter, and then adjust its blending options and/or paint a mask.

Also included in the 2.8 update is the ability to configure Snapseed to resize photos when sharing or exporting, as well as various UI adjustments and bug fixes.

I think you’ll dig it. Let us know what you think!


[Via]

New software predicts how you look with different hairstyles, colors, and appearances

Check this out: 

A new personalized image search engine called Dreambit, developed by a University of Washington computer vision researcher, lets a person imagine how they would look with a different hairstyle or color, or in a different time period, age, country, or anything else that can be queried in an image search engine.

After uploading an input photo, you type in a search term — such as “curly hair,” “India” or “1930s.” The software’s algorithms mine Internet photo collections for similar images in that category and seamlessly map the person’s face onto the results.

Beyond obvious fun & beauty applications, tech like this could be amazing for the age-progression work of the National Center for Missing & Exploited Children.


[YouTube] [Via]

What do you think of Polaroid Swing?

Blurring the line between photo & video, Polaroid Swing lets you capture 1-second clips that play as users scroll, scrub, or tilt their phones.

I find myself kinda nonplussed. It’s beautifully executed, and I’ve long wondered why Instagram has so steadfastly failed to take advantage of device characteristics like gyroscopes. On the other hand, this feels more like a feature than a product (see also Fyuse), and it’s hard for me to imagine frequently nailing 1-second captures.

Thoughts?


[YouTube]

Runcible, a funky, circular wearable camera/phone

Hmm—I’m not quite sure what to make of this thing, but I’m intrigued by its form factor & materials:

Circular & palm sized. As powerful as a smartphone, but designed with a sense of quiet serenity and longevity. This anti-smartphone can do “smartphone things” like make calls, type, take pictures & video, explore the web and get directions. The rest of the time, Runcible is quiet, beautiful, and truly yours.


[Vimeo] [Via Dan Rubin]

No Google? No problem. Try sheep.

Tired of waiting for Street View cars to capture the beautiful, winding roads of the Faroe Islands, local residents have devised SheepView360! Resident Durita Dahl Andreassen writes,

I gently placed a 360˚ camera, powered by a solar panel, on the back of a sheep that would take photographs as the animal freely grazed the open hillsides of the Faroe Islands. Photos are then transmitted back to my mobile phone so that I can upload them to Google Street View myself, finally putting the Faroes on the map in a very unique way!

Sometimes, in an often dark & sad world, someone—and some sheep—go and make it less so.


Check out Snapseed’s new built-in tutorial stream

Apply a saved look to your image just by tapping it in a tutorial (stored in the “Insights” drawer at the bottom of the home screen).

Snapseed 2.7 is rolling out today and we’re excited to introduce Snapseed’s new Insights stream on your iOS device! Insights offers helpful editing tips directly within Snapseed: quick tutorials, pro editing tips, and inspiration from great photographers are now at your fingertips, with new content published often.

In addition, both updates on Android and iOS have minor bug fixes and adjustments.

Feedback is, as always, most welcome!

[Via]

Photography tutorial: Creating a rain of sparks with steel wool

Cool stuff from GoPro, though I’d have liked a bit more detail on the actual camera/lighting settings: 


Please note PetaPixel’s important note of caution:

Warning: This photography technique can be very dangerous and can cause serious bodily injuries and damage to a place if not done in a safe and controlled manner. In 2016 alone, steel wool photography has been blamed for burning down an iconic shipwreck and a 1920s landmark. Thus, it should only be done with appropriate care for surroundings, adequate safety equipment, and permission to use the specific location.

[YouTube]

Choreography: A Kaleidoscopic Chimera

I dig the sharp, split-screen editing of these bodies in motion:

Inspired by the mythological chimera—a fierce hybrid of a lion and a goat with a snake’s head for a tail—director Steven Briand’s balletic short sees dancers merge three unique movement styles through a single sequence, all choreographed by MIA collaborator Cathy Ematchoua.

[Vimeo] [Via]