Fifty billion photos & videos backed up since the product launched in May; yeah, that’ll happen when you say “free & unlimited.” 🙂
Here’s a concise demo of how Photos will help you pool photos with friends & family, let people sign up for email updates, label people, and display your images via Chromecast. (Oh, and I really hope that “Dutch Thunder on the beach in Cancun” becomes a thing.)
Sorry for the short notice, but if you’re around Mountain View tonight and, like me, are seeking ways to make more social impact with your life, come check out this event:
The Board Match offers a unique opportunity for Bay Area residents to become stronger leaders by serving on the boards of directors of local nonprofit organizations. Board service is for everyone: whether you’re just starting out, a mid-career professional, or a seasoned philanthropist, there is a nonprofit that will value your talents. Nonprofit board service offers young and mid-career professionals opportunities to become organizational and community leaders, with benefits for their own professional growth, as well as an entrée into philanthropy and civic stewardship that inspires others and can become a pattern for life. It offers seasoned professionals approaching retirement a vital next step in a lifelong career: the opportunity to put well-honed skills to use, build new networks, and foster the growth of other leaders.
Want to pool your kid photos with your partner so that your parents can always stay up to date? Or search for a person by name, or share your photos on a big screen via Chromecast? It’s all rolling out—some now, some coming soon—across Android, iOS, and web.
Label the people in your photos by what you call them: name or nickname.
This week in the U.S. you’ll be able to label the people in your photos however you want – Mom can be “Mom”, “Juliana”, or “Cat Lady” – whatever you choose. These labels are completely private to you and are not associated with a Google account or profile. Once the people in your photos are labeled, you can run advanced searches that combine people with things, places, or other people, such as “Mom at the beach” or “Juliana and Marco in Hawaii.”
People labeling is rolling out in the U.S. this week on Android and is coming soon to iOS and the web.
Gather all your photos and videos from friends and family in one spot, and know as soon as new moments are added.
We’re introducing shared albums later this year – a new, easy way to pool photos and videos with whomever you want, and get updates when new moments are added. There’s no setup involved, and you can use shared albums on any device – Android, iOS, Mac, Windows and Chrome OS.
I’ve been testing these features for a while & think you’ll really like ‘em.
As groundbreaking as its capture experience (and mere existence!) was, the 2-megapixel cam in the original iPhone was, we can now admit, really godawful; hence the heavy, pancake-makeup approach of the image filters of the day. I remember training myself to compensate for the profound shutter lag as if I were Luke Skywalker donning a blast helmet. It wasn’t, “Hey kid, look at me [press shutter],” but rather, “Hey kid, [press shutter] look at me.”
But progress has been swift & amazing, and I can instantly visually carbon-date pics of my kids by the quality of the phone-captured shots. Now photographer Lisa Bettany has produced beautiful interactive side-by-side comparisons of every generation of iPhone camera. Talk about night-and-day differences (to say nothing of burst mode, HDR, going from zero video to optically stabilized 4k, and more).
As for the future, let’s hope that next year we’re raving about 3D depth sensing enabling SLR-like background separation. Staying tuned…
aka the thing we got screwed out of seeing; thanks, incredibly rare & ill-timed California cloud cover! Anyway, this is really crisp, interesting, and informative—oh, and according to NASA, we’ll apparently have jetpacks by the time we see the phenomenon again:
If you just got a new iPhone & see Google Photos seeming to upload tons of images you already uploaded, don’t worry: it’s just double checking that everything is backed up. We’ll work on making this interface clearer.
Sidestepping the privacy & fashion concerns that have bedeviled systems like Google Glass, Daqri targets industrial applications. The Verge writes,
Daqri is an augmented reality (AR) company based out of Los Angeles. It has developed an AR headset and the software which powers it. Technicians wearing its unit out in the field can see additional information, get step-by-step instructions, and easily relay what they are seeing to a support team connected remotely to their headset.
“There’s no replacement for displacement,” and there’s no substitute for physical optical stabilization (as featured on the iPhone 6s Plus but not the regular 6s). Check out side-by-side results recorded at 4k res:
Yet another reason there’s zero chance I’d consider choosing the smaller device. Zero.
Google Photos uses “deep learning” to help you find your images—but what does that entail, really? Spend five minutes with smart peeps & learn interesting things. (I find the erroneous barbell-arm thing pretty charming.)
Mitch Martinez arrayed 48 DSLRs, a RED Epic, and a Panasonic GH4 to capture a pair of fire artists practicing their craft. I love the CGI-free results, though I wish he’d found a way to vary the capture times just slightly in order to preserve a touch of motion while shifting perspective.
The beautiful Paper ad I blogged on Sunday is just the latest installment in Honda’s rich creative history. It’s worth taking a look back at some terrific ads from the last decade—and these are just the ones I’ve blogged!
Do you live in a world where every blemish, random bird, stray pedestrian, and telephone wire is perfectly round? Me neither!
Therefore I think you’ll really like Snapseed’s new ability to heal arbitrary-shaped regions. Just tap the filter selector, tap Healing, and then paint away the bits you’d like to omit. And of course these operations are, like everything else in the new Snapseed, non-destructive, meaning that you can go back and re-edit them and/or copy/paste them among images.
The update (2.0.4) should now be live on the App Store & Play Store. It also squashes some bugs & adds support for Traditional Chinese (Hong Kong) and Canadian French.
Photographer Christopher Herwig has been hunting bus stops in remote corners of the former Soviet Union since he stumbled upon them while biking to St. Petersburg in 2002. He has covered more than 30,000 km by car, bus and taxi in 13 countries discovering and documenting these strange works of art created behind the Iron Curtain. From the shores of the Black Sea to the endless Kazakh steppe, the bus stops show the range of public art from the Soviet era and give a rare glimpse into the creative minds of the time. Herwig’s series attracted considerable media interest around the world, and now with the project complete, the full collection will be presented in Soviet Bus Stops as a deluxe, limited edition, hard cover photo book. The book represents the most comprehensive and diverse collection of Soviet bus stop design ever assembled.
Got a case of vemödalen (“the frustration of photographing something amazing when thousands of identical photos already exist”)? Or perhaps you’ve just wanted a camera that sounds like a Geiger counter while blurting “NEIN” at you in big red letters?
“Camera Restricta introduces new limitations to prevent an overflow of digital imagery,” its designer says. “As a byproduct, these limitations also bring about new sensations like the thrill of being the first or last person to photograph a certain place.”
Adios, bothersome fences, reflections, etc. That’s presuming that normal users would be sufficiently motivated to move their devices during capture. Time will hopefully tell.
The video accompanying our SIGGRAPH 2015 paper, “A Computational Approach for Obstruction-Free Photography.” We present a unified computational approach for taking photos through reflecting or occluding elements such as windows and fences. Rather than capturing a single image, we instruct the user to take a short image sequence while slightly moving the camera. Differences that often exist in the relative position of the background and the obstructing elements from the camera allow us to separate them based on their motions, and to recover the desired background scene as if the visual obstructions were not there. We show results on controlled experiments and many real and practical scenarios, including shooting through reflections, fences, and raindrop-covered windows.
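To make the parallax idea concrete, here’s a drastically simplified sketch (my own toy illustration, not the authors’ actual pipeline): once the frames are aligned to the background, a near obstruction lands on different pixels in each frame, so a per-pixel median across the sequence rejects it.

```python
# Toy stand-in for the paper's motion-based layer separation: after aligning
# frames to the background, the obstruction shifts between frames while the
# background stays put, so a per-pixel median recovers the unobstructed scene.
import numpy as np

def recover_background(frames):
    """Per-pixel median over background-aligned frames."""
    return np.median(np.stack(frames), axis=0)

# Demo: a flat gray background with a bright "fence post" that sits at a
# different column in each background-aligned frame.
background = np.full((8, 8), 0.5)
frames = []
for col in range(5):                 # parallax: the post moves one column per frame
    f = background.copy()
    f[:, col] = 1.0                  # the occluding element
    frames.append(f)

restored = recover_background(frames)
print(np.allclose(restored, background))  # median rejects the outlier column
```

The real method is far more sophisticated (it estimates dense motion fields and separates reflective as well as opaque layers), but the same intuition drives it: different depths move differently when the camera moves.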
I had a ball sitting down with my ex-Photoshop/current Google Photos friend Aravind Krishnaswamy to chat with Andy Ihnatko, Russell Ivanovic, and Yasmine Evjen for this week’s Material Podcast. We talked about computer vision, the future of memory keeping, my wife hypothetically getting bum-rushed by a lady from the Clinique counter, and much more. (Oh, and the jury’s still out on whether there were snakes in the wall. You’ll see.)
DC presents Robbie “Maddo” Maddison’s “Pipe Dream,” giving the world a chance to witness history being made as Maddo rides his dirt bike on the powerful and iconic waves of Tahiti. From his helmet to motocross boots, Maddo was dressed for FMX when he took his dirt bike into the uncharted saltwater terrain of the Pacific Ocean in French Polynesia.
Adobe & Microsoft on stage in an Apple keynote, dogs & cats living together, mass hysteria! And continuing the mash-up madness, some of Painter’s famous brushes are now available inside Photoshop via a $49 plug-in:
Explore an array of 11 imaginative brushes, including Debris, Fabric, Fine Art, Fur, Hair, Light, Space, Smoke and Storm… Enjoy infinite inspiration with our extra brush packs available for purchase.
Kiva was born out of a desire to combine real human connections (vs. just transactions) with scalable, measurable impact. Earlier Jessica had been working at Stanford by day (where people talked in really ambitious but slightly impersonal terms about world-changing enterprises) and by night working with young mothers in East Palo Alto (where she made deep personal connections but questioned what change was resulting). Kiva is meant to foster real connections between entrepreneurs (many of whose stories she tells in the book) & lenders like you & me.
Sometimes you have to “dump the quarterback.” In high school she was asked out by Johnny Football Hero, and of course she had to say yes (as one does). But the guy was kind of a bore, and she dumped him (sacrilege!). It’s tough, but when the inside doesn’t match the outside (be it in a relationship, an ostensible dream job, etc.), something has to change.
I think you’ll find both the talk & the book rewarding, and if you’d like to get started lending via Kiva, check out my lender page and jump in!
Heh—folks worrying about the imminent & inevitable robopocalypse might want to check this out. Kottke writes,
[T]he system hadn’t seen much space imagery before, so it didn’t do such a great job. For the red ringed planet, it guessed “HAIR SLIDE, CHOCOLATE SAUCE, WAFFLE IRON” and the Enterprise was initially “COMBINATION LOCK, ODOMETER, MAGNETIC COMPASS” before it finally made a halfway decent guess with “SUBMARINE, AIRCRAFT CARRIER, OCEAN LINER”.
I love capturing panos via The App Formerly Known As Photo Sphere, now significantly updated & renamed Street View (download for iOS, Android). PetaPixel writes,
Users can quickly browse all available traditional Street View content in addition to the newer 360-degree photospheres. Simply input a location, zoom in, and you are ready to start walking the streets of your favorite city. You can also explore beautiful photography through a pull-up tab that displays presorted collections and the ‘Explore’ tab. If you want to create your own photosphere you can do so, but will need a smartphone that contains a gyroscope sensor.
I particularly enjoy uploading my spheres to Google Maps to help other people explore the places I’ve visited.
Coincidentally, Ricoh just introduced the Theta S, a new version of their spherical 360º camera that generates Street View-compatible images. Check out this 360º video that you can spin around while streaming from YouTube:
The scientists first gathered together thousands of photos and asked people (through Amazon’s Mechanical Turk) to manually mark distracting regions in them… That set of annotated images was then used to train a computer to recognize areas of photos people might want to remove in random photos presented to it.
Okay, but I’d like to see this run in reverse, slyly inserting weird little elements (garden gnome, rune, cursed tiki, etc.) into the periphery of your shots—not unlike the PhotoBomb Tool parody that I got in trouble for blogging at Adobe. :-p
Back in 2003 we blew a lot of minds by showing Photoshop’s Match Color feature sucking up the color palette of one photo or painting, then depositing it onto another. This kind of thing kept getting love as it evolved (see 2010 demo), eventually matching lighting among images. As far as I know no one has ended up using such functionality in practice (and yes, Match Color is still sitting in Photoshop on your hard drive right now), but it’s still cool.
Now the tech has taken another leap forward. Per PetaPixel,
In a newly published research paper titled “A Neural Algorithm of Artistic Style,” scientists at the University of Tübingen in Germany describe how their deep neural network can create new artistic images when provided with a random photo and a painting to learn style from.
“Here we introduce an artificial system based on a Deep Neural Network that creates artistic images of high perceptual quality,” the paper says. “The system uses neural representations to separate and recombine content and style of arbitrary images, providing a neural algorithm for the creation of artistic images.”
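The paper’s central trick is how it measures “style” separately from “content”: content is compared feature-to-feature at a given network layer, while style is compared via Gram matrices, which capture which features co-occur while discarding where they occur. Here’s a hedged sketch of those two losses using random arrays in place of real CNN feature maps (the actual paper extracts them from a pretrained VGG network):

```python
# Minimal sketch of style-transfer losses. Random arrays stand in for CNN
# feature maps of shape (channels, height*width); in the paper these come
# from a pretrained network, and the generated image is optimized to
# minimize the combined loss.
import numpy as np

def gram_matrix(features):
    """Feature co-occurrence statistics: the paper's style representation."""
    return features @ features.T / features.shape[1]

def content_loss(gen, content):
    return 0.5 * np.sum((gen - content) ** 2)

def style_loss(gen, style):
    g_gen, g_style = gram_matrix(gen), gram_matrix(style)
    return np.sum((g_gen - g_style) ** 2) / (4 * gen.shape[0] ** 2)

# Stand-ins for feature maps of the generated image, the photo, and the painting
rng = np.random.default_rng(0)
gen, photo, painting = (rng.standard_normal((16, 64)) for _ in range(3))

# The style weight (1000 here) trades off "looks like the painting" against
# "depicts the photo"; the paper exposes exactly this ratio as a dial.
total = content_loss(gen, photo) + 1000 * style_loss(gen, painting)
print(total > 0)
```

That separation is why the same photo can be re-rendered in the style of any painting: you swap in a different style target and re-optimize.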
Props to Bjorn Jonsson for assembling NASA photos into this animation:
The time covered is 09:35 to 13:35 (closest approach occurred near 11:50). Pluto’s atmosphere is included and should be fairly realistic from about 10 seconds into the animation to the end. Earlier it is largely just guesswork that can be improved in the future once all data has been downlinked from the spacecraft. Light from Pluto’s satellite Charon illuminates Pluto’s night side but is exaggerated here; in reality it would be only barely visible or not visible at all.