I’m excited for my teammates & their new launches. The team writes,
- Storyboard (Android) transforms your videos into single-page comic layouts, entirely on device. Simply shoot a video and load it in Storyboard. The app automatically selects interesting video frames, lays them out, and applies one of six visual styles.
- Selfissimo! (iOS, Android) is an automated selfie photographer that snaps a stylish black and white photo each time you pose. Tap the screen to start a photoshoot. The app encourages you to pose and captures a photo whenever you stop moving.
- Scrubbies (iOS) lets you easily manipulate the speed and direction of video playback to produce delightful video loops… Shoot a video in the app and then remix it by scratching it like a DJ. Scrubbing with one finger plays the video. Scrubbing with two fingers captures the playback so you can save or share it.
Please take ‘em for a spin, then tell us what you think using the in-app feedback links.
My new teammates from Belarus are honored & grateful to have their Fabby (iOS, Android) and Fabby Look (iOS, Android) apps listed among Apple’s Best of 2017. I can’t wait to show you what’s in store for 2018 and beyond!
I’m eager to try this out with our lads:
We’ve redesigned Science Journal as a digital science notebook, and it’s available today on Android and iOS.
With this new version of Science Journal, each experiment is a blank page that you can fill with notes and photos as you observe the world around you. Over time, we’ll be adding new note-taking tools… We’ve added three new sensors for you to play with, along with the ability to take a “snapshot” of your sensor data at a single moment in time.
Honestly, I hope that my friends making imaging tools see things like MugLife (as well as automatic image selection & extraction, etc.) and say “Holy shit, it’s not the 90’s anymore; time to up our game.”
If TensorFlow, PDAF pixels, and semantic segmentation sound like your kind of jam, check out this deep dive into mobile imaging from Google research lead Marc Levoy. He goes into some detail about how the team behind the new Pixel 2 trains neural networks, detects depth, and synthesizes pleasing, realistic bokeh even with a single-lens device. [Update: There’s a higher-level, less technical version of the post if you’d prefer.]
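Segmentation-driven blur is the core idea behind single-lens portrait mode: mask out the subject, blur everything else, and composite. Here’s a toy NumPy sketch of that compositing step (my own illustration, not Google’s pipeline; a naive box blur stands in for a real depth-dependent lens kernel, and the mask is hard-coded rather than predicted by a network):

```python
# Toy segmentation-driven "bokeh": blur the whole frame, then
# composite the sharp, mask-selected subject back on top.
import numpy as np

def box_blur(img, radius=1):
    """Naive box blur via shifted sums; edges clamp to border values."""
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img, dtype=float)
    k = 2 * radius + 1
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def synthetic_bokeh(img, mask, radius=1):
    """Keep mask==1 pixels sharp; replace everything else with a blur."""
    blurred = box_blur(img, radius)
    return np.where(mask == 1, img, blurred)

# Usage: a bright "subject" stripe stays sharp, the background softens.
img = np.zeros((5, 5)); img[:, 2] = 100.0
mask = np.zeros((5, 5)); mask[:, 2] = 1
out = synthetic_bokeh(img, mask)
```

In the real feature the mask comes from a learned person-segmentation model and the blur varies with estimated depth; this sketch only shows how the two layers get combined.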
Motion Stills lets you make stabilized multi-clip movies, animated collages, loops, and more from Live Photos. Now version 2.0 for iOS adds
- Capture Motion Stills right inside the app.
- Capture and save Live Photos on any device.
- Swipe left to delete Motion Stills in the stream.
- Export collages as GIFs.
The app’s available on Android, too. Android Police writes, “It’s essentially a GIF camera, but the app stabilizes the video while you’re recording. You can record for a few seconds, or use the fast-forward mode to speed up and stabilize longer videos.”
Not to be outdone, Google Photos on Web, iOS, and Android now displays Live Photos as well as Motion Photos from the new Pixel 2, giving you a choice of whether to display the still or moving portion of the capture. Here’s a quick sample on the Web. Note the Motion On/Off toggle up top.
I’m thrilled to have joined the team behind Motion Stills, so please let us know what you think & what else you’d like to see!
Fun insights from my new teammates, including:
- “You essentially have the space of a blueberry for the camera to squeeze into.”
- The lens is actually six lenses.
- Each pixel is split into two—useful for sensing depth.
- The whole thing weighs 0.003 pounds, about the same as a paperclip.
- HDR+ compares tile-by-tile across a range of captures shot in quick succession, shifting chunks as needed to align them. This is good for scaring away “ghosts” (misalignment artifacts).
- A neural network trained on 1 million images built a model for what’s person-like and should be kept in focus while blurring the background.
- A hexapod rig is used to generate (and thus find ways to combat) various kinds of shakiness.
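The tile-by-tile align-and-merge idea in the HDR+ bullet above can be sketched in a few lines. This is my own toy simplification, not Google’s implementation: for each tile of a grayscale reference frame, search a small window of offsets in each alternate frame for the best sum-of-squared-differences match, then average that match in.

```python
# Toy burst align-and-merge: per-tile translational search (SSD),
# then averaging the best-matching patches into the reference.
import numpy as np

def align_and_merge(reference, alternates, tile=8, search=2):
    """Average each reference tile with its best-aligned match
    from every alternate frame (2-D grayscale arrays)."""
    h, w = reference.shape
    merged = reference.astype(np.float64)
    count = np.ones_like(merged)
    for alt in alternates:
        for y in range(0, h - tile + 1, tile):
            for x in range(0, w - tile + 1, tile):
                ref_tile = reference[y:y + tile, x:x + tile].astype(float)
                best, best_err = None, np.inf
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy <= h - tile and 0 <= xx <= w - tile:
                            cand = alt[yy:yy + tile, xx:xx + tile]
                            err = np.sum((ref_tile - cand) ** 2)
                            if err < best_err:
                                best_err, best = err, cand
                merged[y:y + tile, x:x + tile] += best
                count[y:y + tile, x:x + tile] += 1
    return merged / count

# Usage: the alternate frame is the scene shifted by one pixel;
# interior tiles find an exact match, so the merge reproduces them.
scene = np.arange(32 * 32, dtype=float).reshape(32, 32)
shifted = np.roll(scene, (1, 1), axis=(0, 1))
out = align_and_merge(scene, [shifted])
```

The real pipeline works on raw bursts with coarse-to-fine search and robust merging; the point here is just why per-tile shifts (rather than one global shift) keep moving subjects from ghosting.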
“Any sufficiently advanced technology…”
Watch as the all-new Pixel 2 heads up the mountains in India to test out the new Fused Video Stabilization. The left side of the video has no stabilization at all, with optical image stabilization (OIS) and electronic image stabilization (EIS) turned off. The right side is the Pixel 2 with Fused Video Stabilization enabled.
The Pixel 2 has a feature called “frame look ahead” which analyzes each individual frame of a saved video for movement. Machine learning compares dominant movements from one frame to another and stabilizes accordingly.
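A common way to implement this kind of look-ahead stabilization is to treat the per-frame motion estimates as a camera path, smooth that path using both past and future frames, and warp each frame by the difference between the smoothed and measured positions. A toy 1-D sketch (my own illustration under that assumption; the moving-average smoother and names are mine, not the Pixel 2’s actual algorithm):

```python
# Toy look-ahead stabilization in 1-D: smooth the camera path with a
# window that includes future frames, return per-frame corrections.
import numpy as np

def smooth_path(translations, lookahead=2):
    """Given per-frame translations, smooth the cumulative camera
    path with up to `lookahead` frames on either side and return
    the corrective shift to apply to each frame."""
    path = np.cumsum(translations)           # camera position per frame
    smoothed = np.empty_like(path, dtype=float)
    for i in range(len(path)):
        lo = max(0, i - lookahead)
        hi = min(len(path), i + lookahead + 1)
        smoothed[i] = path[lo:hi].mean()
    return smoothed - path                   # correction per frame

# Usage: jittery frame-to-frame motion; applying the corrections
# leaves a much steadier (smoothed) camera path.
jitter = np.array([1.0, 3.0, -1.0, 2.0, 0.0, 2.0])
corrections = smooth_path(jitter)
```

Because each frame’s correction depends on frames that haven’t been displayed yet, this only works on a buffered or saved video, which is exactly why the feature analyzes the clip rather than stabilizing purely live.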
CNET’s got details:
The Google Pixel 2 is the top-performing mobile device camera we’ve tested, with a record-setting overall score of 98. Impressively, it manages this despite having “only” a single-camera design for its main camera. Its top scores in most of our traditional photo and video categories put it ahead of our previous (tied) leaders, the Apple iPhone 8 Plus and the Samsung Galaxy Note 8.
Read on for tons of details.
Austin Mann returned from India with some amazing images in hand, shot on an iPhone 8. Here he presents a brief overview of the new portrait modes available on the camera (er, phone—no, camera):