Check out five new Google Expeditions created in partnership with MLB.com:
These include virtual tours of Citi Field in New York and Oriole Park at Camden Yards in Baltimore, both of which are narrated by MLB Network’s Heidi Watney. You can also get behind-the-scenes access with career tours that showcase the lives of a baseball beat reporter and television broadcasters. We’re also bringing you a Statcast tour, so you can geek out Moneyball-style with the math and physics behind the game.
“So, what would you say you… do here?” Well, I get to hang around these folks and try to variously augment your reality:
Research in Machine Perception tackles the hard problems of understanding images, sounds, music and video, as well as providing more powerful tools for image capture, compression, processing, creative expression, and augmented reality.
Our technology powers products across Alphabet, including image understanding in Search and Google Photos, camera enhancements for the Pixel Phone, handwriting interfaces for Android, optical character recognition for Google Drive, video understanding and summarization for YouTube, Google Cloud, Google Photos and Nest, as well as mobile apps including Motion Stills, PhotoScan and Allo.
We actively contribute to the open source and research communities. Our pioneering deep learning advances, such as Inception and Batch Normalization, are available in TensorFlow. Further, we have released several large-scale datasets for machine learning, including: AudioSet (audio event detection); AVA (human action understanding in video); Open Images (image classification and object detection); and YouTube-8M (video labeling).
[Via Peyman Milanfar]
I’m eager to try this out with our lads:
We’ve redesigned Science Journal as a digital science notebook, and it’s available today on Android and iOS.
With this new version of Science Journal, each experiment is a blank page that you can fill with notes and photos as you observe the world around you. Over time, we’ll be adding new note-taking tools… We’ve added three new sensors for you to play with along with the ability to take a “snapshot” of your sensor data at a single moment in time.
Honestly, I hope that my friends making imaging tools see things like MugLife (as well as automatic image selection & extraction, etc.) and say “Holy shit, it’s not the 90’s anymore; time to up our game.”
Paul Asente is an OG of the graphics world, having been responsible for (if I recall correctly) everything from Illustrator’s vector meshes & art brushes to variable-width strokes. Now he’s back with new Adobe illustration tech to drop some millefleurs science:
PhysicsPak automatically fills a shape with copies of elements, growing, stretching, and distorting them to fill the space. It uses a physics simulation to do this and to control the amount of distortion.
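To get a feel for the idea, here’s a toy sketch of physics-based packing — my own simplification, not Adobe’s actual algorithm: circles grow inside a circular container, a repulsion step separates overlapping neighbors, and circles that hit the boundary stop growing (a crude stand-in for PhysicsPak’s distortion control).

```python
# Toy physics-style packing: grow circles inside a unit circle, pushing
# overlapping ones apart. (A simplified illustration, not PhysicsPak itself.)
import numpy as np

rng = np.random.default_rng(0)

def pack_circles(n=20, container_r=1.0, steps=300, growth=0.002):
    pts = rng.uniform(-0.3, 0.3, size=(n, 2))  # initial positions near center
    radii = np.full(n, 0.02)                   # start tiny, grow each step
    for _ in range(steps):
        radii += growth
        # Pairwise repulsion: separate any overlapping pair symmetrically.
        for i in range(n):
            for j in range(i + 1, n):
                d = pts[j] - pts[i]
                dist = np.linalg.norm(d) + 1e-9
                overlap = radii[i] + radii[j] - dist
                if overlap > 0:
                    push = (overlap / 2) * d / dist
                    pts[i] -= push
                    pts[j] += push
        # Containment: pull circles that poke outside back to the boundary,
        # and undo this step's growth for them ("stop distorting" control).
        r_from_center = np.linalg.norm(pts, axis=1)
        outside = r_from_center + radii > container_r
        scale = np.where(outside,
                         (container_r - radii) / (r_from_center + 1e-9), 1.0)
        pts *= scale[:, None]
        radii[outside] -= growth
    return pts, radii

pts, radii = pack_circles()
# Every circle ends up inside the container.
print(np.all(np.linalg.norm(pts, axis=1) + radii <= 1.0 + 1e-6))  # → True
```

The real system presumably fills arbitrary shapes with arbitrary elements and deforms them, but the core loop — grow, resolve collisions, constrain — is the same family of trick.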
The new video for rock band Spoon’s “Do I Have to Talk You Into It” consists of lead singer Britt Daniel being rapidly morphed, deformed, beautified, clone-stamped, liquified, and peeled apart in Photoshop. At one point, Daniel is transformed into a coyote, the Photoshop interface drops away for a split second, and we just see some video of a snarling coyote in the woods. Why not?
Gizmodo goes behind the scenes (and skin!) with director Brook Linder & team.
I am delighted to share this news:
DxO plans to continue development of the Nik Collection. The current version will remain available for free on DxO’s dedicated website, while a new “Nik Collection 2018 Edition” is planned for mid-next year.
“The Nik Collection gives photographers tools to create photos they absolutely love,” said Aravind Krishnaswamy, an Engineering Director with Google. “We’re thrilled to have DxO, a company dedicated to high-quality photography solutions, acquire and continue to develop it.”
DxO is already integrating Nik tech into their apps:
The new version of our flagship software DxO OpticsPro, which is available as of now under its new name DxO PhotoLab, is the first embodiment of this thrilling acquisition with built-in U Point technology (video).
Having known them as Photoshop developers, I was always a big fan of the Nik crew & their tech. (In fact, their acquisition by Google was instrumental in making me consider working here.) I wanted to acquire them at Adobe, and I was always afraid that Apple would do so & put U Point into Aperture! ¯\_(ツ)_/¯
The desktop plug-ins, however, were never a great fit for Google’s mobile/cloud photo strategy, and other than Analog Efex, none had been improved since 2011 (more than a year before Google acquired them). I know that Aravind Krishnaswamy (badass photog, Photoshop vet, eng manager for Google Photos) and others went many extra miles to find a good new home for the Nik Collection, and I’m really excited to see what DxO can do with it. On behalf of photographers everywhere, thanks guys!
When they’re not savagely trolling me (“Hey Google, play Justin Bieber!”—then running away), the Micronaxx really enjoy playing the “I’m Feeling Lucky” trivia app with us. Therefore I was charmed to get invited to brainstorm with my Toontastic friends & others from Google’s kid-focused group, coming up with all kinds of ideas for other family-oriented audio apps. Now that work is starting to come to fruition, enabling 50+ new games & activities on Google Home:
Google says the Assistant is now better at recognizing kids’ voices; and as with adults, it’ll be able to distinguish between them so that it can customize responses to each person. To do this, kids will need a Family Link account, which is a Google account for kids under 13 that allows for parental supervision.
Check it out:
Hey kids, remember Bob Dole?
The last time I recall charting features in Adobe Illustrator getting an update, Bob was running for president—in 1996. Later (c. 2000), Illustrator & ImageReady (later Photoshop) added the ability to bind text objects & shapes to variables. That would have been a godsend in my old graphics production life, but the world didn’t seem to take much notice.
Figuring that we were never going to get around to doing something natively in the apps, I proposed enabling HTML or Flash layers right on the canvas of Adobe design apps. That way a single HTML or SWF GUI could run right in Illustrator, Photoshop, InDesign, etc.—and remain alive & dynamic when exported. You could argue that I was on crack, or you could argue that had we gone that way, we’d have had great charting a decade ago—or both.
But may the future bury the past: it looks like Adobe is at last getting serious about delivering great infographic-making tools. Check out this sneak of “Project Lincoln”:
Running Content-Aware Fill over time has traditionally produced results that are, um, more artistic than useful. Check out this old weirdness:
The trick (well, one of many, I’m sure) is to make the results temporally coherent, so that elements line up across frames. Looks like Adobe’s on their way to licking that problem in “Project Cloak”:
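To illustrate why temporal coherence matters, here’s a deliberately tiny sketch of my own (Project Cloak is surely doing something far more sophisticated): instead of re-synthesizing each frame’s hole independently — which is what makes results flicker — borrow each hole pixel from the most recent frame where it was visible, assuming a static camera so pixels line up across time.

```python
# Toy temporally-coherent hole fill: maintain a running "clean plate" and
# fill each frame's hole from it. (My simplification, not Adobe's method.)
import numpy as np

def fill_holes_temporally(frames, masks):
    """frames: (T, H, W) float array; masks: (T, H, W) bool, True = hole.
    Assumes a static camera (zero optical flow), so pixels align across time."""
    frames = frames.copy()
    last_known = frames[0].copy()  # running "clean plate"
    for t in range(len(frames)):
        # Fill this frame's hole from the most recently seen values...
        frames[t][masks[t]] = last_known[masks[t]]
        # ...then refresh the clean plate with this frame's visible pixels.
        last_known[~masks[t]] = frames[t][~masks[t]]
    return frames

# Static scene, drifting hole: the fill agrees across frames wherever the
# background was visible at some earlier time.
scene = np.arange(16.0).reshape(4, 4)
frames = np.stack([scene] * 3)
masks = np.zeros_like(frames, dtype=bool)
masks[1, 1:3, 1:3] = True   # hole appears in frame 1
masks[2, 2:4, 2:4] = True   # and drifts in frame 2
frames[masks] = np.nan      # pretend the removed object occupied these pixels
out = fill_holes_temporally(frames, masks)
print(np.allclose(out, scene))  # → True: every frame matches the clean plate
```

With a moving camera you’d warp `last_known` by optical flow before borrowing from it — and that, plus hallucinating regions never seen in any frame, is presumably where the hard research lives.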