I’m eager to try this out with our lads:
We’ve redesigned Science Journal as a digital science notebook, and it’s available today on Android and iOS.
With this new version of Science Journal, each experiment is a blank page that you can fill with notes and photos as you observe the world around you. Over time, we’ll be adding new note-taking tools… We’ve added three new sensors for you to play with along with the ability to take a “snapshot” of your sensor data at a single moment in time.
Honestly, I hope that my friends making imaging tools see things like MugLife (as well as automatic image selection & extraction, etc.) and say “Holy shit, it’s not the 90’s anymore; time to up our game.”
If TensorFlow, PDAF pixels, and semantic segmentation sound like your kind of jam, check out this deep dive into mobile imaging from Google research lead Marc Levoy. He goes into some detail about how the team behind the new Pixel 2 trains neural networks, detects depth, and synthesizes pleasing, realistic bokeh even with a single-lens device. [Update: There’s a higher-level, less technical version of the post if you’d prefer.]
Motion Stills lets you make stabilized multi-clip movies, animated collages, loops, and more from Live Photos. Now version 2.0 for iOS adds:
- Capture Motion Stills right inside the app.
- Capture and save Live Photos on any device.
- Swipe left to delete Motion Stills in the stream.
- Export collages as GIFs.
The app’s available on Android, too. Android Police writes, “It’s essentially a GIF camera, but the app stabilizes the video while you’re recording. You can record for a few seconds, or use the fast-forward mode to speed up and stabilize longer videos.”
Not to be outdone, Google Photos on Web, iOS, and Android now displays Live Photos as well as Motion Photos from the new Pixel 2, giving you a choice of whether to display the still or moving portion of the capture. Here’s a quick sample on the Web. Note the Motion On/Off toggle up top.
I’m thrilled to have joined the team behind Motion Stills, so please let us know what you think & what else you’d like to see!
Fun insights from my new teammates, including:
- “You essentially have the space of a blueberry for the camera to squeeze into.”
- The lens is actually six lenses.
- Each pixel is split into two—useful for sensing depth.
- The whole thing weighs 0.003 pounds, about the same as a paperclip.
- HDR+ looks tile-by-tile across a range of captures shot in quick succession, moving chunks as needed to align them. This is good for scaring away “ghosts” (artifacts left by subjects that move between frames).
- A neural network trained on 1 million images built a model for what’s person-like and should be kept in focus while blurring the background.
- A hexapod rig is used to generate (and thus find ways to combat) various kinds of shakiness.
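The tile-by-tile alignment idea above can be sketched in a few lines. This is an illustrative toy, not Google’s actual HDR+ pipeline (which uses a coarse-to-fine pyramid and subpixel refinement); the function name and parameters here are my own:

```python
import numpy as np

def align_tile(ref, alt, ty, tx, tile=16, search=4):
    """Find the (dy, dx) offset, within +/- `search` pixels, that best
    matches one tile of an alternate frame to the reference frame,
    using sum-of-squared-differences as the error metric."""
    ref_tile = ref[ty:ty + tile, tx:tx + tile].astype(np.float64)
    best_err, best_off = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = ty + dy, tx + dx
            if y < 0 or x < 0 or y + tile > alt.shape[0] or x + tile > alt.shape[1]:
                continue  # candidate tile would fall off the frame
            err = np.sum((ref_tile - alt[y:y + tile, x:x + tile]) ** 2)
            if err < best_err:
                best_err, best_off = err, (dy, dx)
    return best_off

# A burst frame shifted by (2, 1) pixels should be recovered exactly:
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
alt = np.roll(ref, shift=(2, 1), axis=(0, 1))
print(align_tile(ref, alt, 16, 16))  # (2, 1)
```

Running this search independently for each tile is what lets HDR+ move “chunks” by different amounts, so a subject that shifted between shots doesn’t leave a ghost behind.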
“Any sufficiently advanced technology…”
Watch as the all-new Pixel 2 heads up the mountains in India to test out the new Fused Video Stabilization. The left side of the video has no stabilization at all, with optical image stabilization (OIS) and electronic image stabilization (EIS) turned off. The right side is the Pixel 2 with Fused Video Stabilization enabled.
The Pixel 2 has a feature called “frame look ahead” which analyzes each individual frame of a saved video for movement. Machine learning compares dominant movements from one frame to another and stabilizes accordingly.
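To make the look-ahead idea concrete, here’s a minimal sketch of trajectory smoothing: given estimated per-frame camera translations, average over a window that includes future frames, then shift each frame toward the smoothed path. This is a hypothetical simplification; the real Fused Video Stabilization also folds in OIS and gyroscope data:

```python
import numpy as np

def stabilize_offsets(raw_shifts, window=5):
    """Given raw per-frame (y, x) camera translations, return the
    correction to apply to each frame: the difference between a
    moving-average trajectory (looking `window` frames ahead and
    behind) and the raw trajectory."""
    raw = np.asarray(raw_shifts, dtype=float)
    n = len(raw)
    smooth = np.empty_like(raw)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)  # includes future frames
        smooth[i] = raw[lo:hi].mean(axis=0)
    return smooth - raw  # warp each frame by this amount

# A perfectly steady camera needs no correction:
steady = np.ones((10, 2))
print(np.allclose(stabilize_offsets(steady), 0))  # True
```

The “look ahead” is the `hi` bound of the window: because the smoothing averages over frames that haven’t been displayed yet, the output path anticipates upcoming motion instead of lagging behind it, which is why this works on saved video.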
CNET’s got details:
The Google Pixel 2 is the top-performing mobile device camera we’ve tested, with a record-setting overall score of 98. Impressively, it manages this despite having “only” a single-camera design for its main camera. Its top scores in most of our traditional photo and video categories put it ahead of our previous (tied) leaders, the Apple iPhone 8 Plus and the Samsung Galaxy Note 8.
Read on for tons of details.
Austin Mann returned from India with some amazing images in hand, shot on an iPhone 8. Here he presents a brief overview of the new portrait modes available on the camera (er, phone—no, camera):
I’m eager to try this out:
When framing a subject, you’ll have a number of different lighting options to choose from for giving your portrait different looks — things like Contour Light, Natural Light, Studio Light, Stage Light, and Stage Light Mono.
These “aren’t filters,” Apple says. Instead, the phone is actually studying your subject’s face and calculating the look based on light that’s actually in the scene using machine learning.
Check out PetaPixel or Apple’s site for larger sample images.
Our friends in Google Research just dropped some new fun for iOS users:
- Beautiful video collages with 20 fun layouts.
- Favorite the Motion Stills you love.
- Export multiple clips at once for faster sharing.
- Export Live Photos with double the sharpness.
Enjoy, and as always, feedback is most welcome!