With the latest version of the image.canon app (available on Android or iOS) and a compatible Canon camera, you can choose to automatically transfer original-quality photos to Google Photos, eliminating the hassle of using your computer or phone to back them up.
In addition to a compatible Canon camera and the image.canon app, you’ll also need a Google One membership to use this feature. To help them get started, Canon users will get one month of Google One free, providing access to up to 100 GB of cloud storage, as well as other member benefits, such as premium support from Google experts and family sharing.
Awesome work by the team. Come grab a copy & build something great!
The ML Kit Pose Detection API is a lightweight, versatile solution for app developers to detect the pose of a subject’s body in real time from a continuous video or static image. A pose describes the body’s position at one moment in time with a set of x,y skeletal landmark points. The landmarks correspond to different body parts such as the shoulders and hips. The relative positions of landmarks can be used to distinguish one pose from another.
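To make the last point concrete, here’s a tiny sketch (plain Python, not the ML Kit API itself; the landmark coordinates and the 120° threshold are invented for illustration) of how relative landmark positions can distinguish poses, using the angle at the elbow to tell a bent arm from an extended one:

```python
import math

def angle(a, b, c):
    """Angle in degrees at vertex b, formed by points a-b-c (each an (x, y) tuple)."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0]) - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang  # normalize to [0, 180]

def classify_arm(shoulder, elbow, wrist):
    """Toy classifier: call the arm 'bent' if the elbow angle is under 120 degrees."""
    return "bent" if angle(shoulder, elbow, wrist) < 120 else "extended"

# Hypothetical landmark coordinates (in pixels) from two frames:
print(classify_arm((0, 0), (10, 0), (20, 0)))    # straight line -> "extended"
print(classify_arm((0, 0), (10, 0), (10, -10)))  # right angle at elbow -> "bent"
```

The same joint-angle idea generalizes: a handful of angles between landmark triples (shoulders, hips, knees) is often enough to tell one pose from another.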
Some of the creatures include the Aegirocassis, a sea creature that existed 480 million years ago; a creepy-looking ancient crustacean; and a digital remodel of the whale skeleton currently on view in the Natural History Museum’s Hintze Hall.
Exceedingly tangentially: who doesn’t love a good coelacanth reference?
I’m not sure what’s most bonkers: the existence of this vehicle at the turn of the last century; its continued existence & operation ~120 years and two world wars later; or the advances in machine learning that allow this level of film restoration & enhancement:
Denis Shiryaev of Neural Love then took the original footage and used a neural network to upscale it to 4K. He also colorized it, stabilized it, slowed it down to better represent real-time, and boosted the frame rate to 60fps.
I often say there’s “working at Google” and then there’s “WORKING AT GOOGLE.” I of course just “work at Google,” but folks like this are doing the latter. With so many Google & Adobe friends directly affected & evacuating, I love seeing smart folks putting their talents & resources to work like this:
Check out the Google blog for lots of interesting info on how all this actually works. It’s now showing up in specific new features:
Today we’re launching a new wildfire boundary map in Search and Maps SOS alerts in the U.S. to provide deeper insights for areas impacted by an ongoing wildfire. In moments like a growing wildfire, knowing exactly where a blaze is underway and how to avoid it is critical. Using satellite data, we create a wildfire boundary map so people can now see a fire’s approximate size and location right on their phone or desktop.
When people look for things like “wildfire in California” or a specific fire like “Kincade fire” in Search, they will be able to see a wildfire’s approximate boundary, name and location, as well as news articles and helpful resources from local emergency agencies in the SOS alert.
On Google Maps, people will have access to the same details, including the fire boundary, and receive warnings if they’re approaching an active blaze. If someone is exploring an area near a wildfire on Google Maps, they’ll get an ambient alert that will point them to the latest information.
Just in time for our boys as they level up their math skills:
When they’re stuck on a homework problem, students and parents can use Socratic, and soon Google Lens, to take a photo of a problem or equation they need help with. Socratic and Lens provide quick access to helpful results, such as step-by-step guides to solve the problem and detailed explainers to help you better understand key concepts.
Meanwhile, 3D in Search now covers a bunch of STEM-related topics:
Longtime VFX stud Fernando Livschitz (see previous) has turned to 2D, making spray-painted cutouts derived from a real dancer in order to create this delightful little animation. It’s only 30s long, but the subsequent making-of minute is just as cool:
The stop-motion dancers remind me of the brilliant MacPaint animations (e.g. of Childish Gambino) from Pinot Ichwandardi, who happened to say this about low-fi tech:
I know 2020 sucks a whole lot of ass (just this morning we learned that the beloved Swanton Pacific Railroad for kids may have burned up, JFC…), but it’s good to remember the amazing bits of human progress that sometimes come to life—like this one:
Building on the helpfulness of Pixel Buds’ conversation mode translate feature, which helps when you’re talking back and forth with another person, the new transcribe mode lets you follow along as translated speech is read directly into your ear, helping you understand the gist of what’s being said during longer listening experiences.
Launching initially for French, German, Italian and Spanish speakers to translate English speech, transcribe mode can help you stay present in the moment and focus on the person speaking.
And your headphones can even detect a crying baby (!) & lower volume:
If your dog barks, baby cries or an emergency vehicle drives by with sirens blaring, Attention Alerts—an experimental feature that notifies you of important things happening around you—lowers the volume of your content momentarily to alert you to what’s going on.
The Android Earthquake Alerts System turns your Android phone into a mini seismometer to detect earthquakes when they start. And starting in California, an integration of ShakeAlert in the Android OS enables phones to deliver earthquake alerts, giving you added seconds to drop, cover and hold on.