ILM have provided a taste of some of the 1,750 visual effects shots that went into Rogue One:
This looks clever & promising:
What may appeal to most people here is the use of the OverCapture built into the GoPro app… It allows you to manually select an area of the film, following your subject, to pick out the best angles and export it as a normal, non-360° 1080p video.
This means that you can shoot a 360 video and turn it into something like this:
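Reframing like this is essentially a virtual camera panning inside the equirectangular 360° frame: each output pixel casts a ray, the ray gets rotated by the chosen yaw/pitch, and the ray's longitude/latitude indexes back into the source frame. A minimal sketch of that per-pixel mapping, just the math rather than anything from GoPro's actual app (all names here are mine):

```python
import math

def reframe_pixel(px, py, out_w, out_h, fov_deg, yaw_deg, pitch_deg, src_w, src_h):
    """Map one pixel of a flat output frame to its equirectangular source pixel."""
    # Ray through the output pixel in camera space (z points forward).
    f = (out_w / 2) / math.tan(math.radians(fov_deg) / 2)
    x, y, z = px - out_w / 2, py - out_h / 2, f
    # Rotate the ray: pitch around the x-axis, then yaw around the y-axis.
    cp, sp = math.cos(math.radians(pitch_deg)), math.sin(math.radians(pitch_deg))
    y, z = y * cp - z * sp, y * sp + z * cp
    cy, sy = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    x, z = x * cy + z * sy, -x * sy + z * cy
    # Direction -> longitude/latitude -> equirectangular source coordinates.
    lon = math.atan2(x, z)                    # -pi .. pi across the full 360°
    lat = math.atan2(y, math.hypot(x, z))     # -pi/2 .. pi/2 pole to pole
    return ((lon / math.pi + 1) / 2 * src_w,
            (lat / (math.pi / 2) + 1) / 2 * src_h)

# Looking straight ahead, the center of the output lands at the center
# of the 360° source frame, as you'd expect.
print(reframe_pixel(640, 360, 1280, 720, 90, 0, 0, 3840, 1920))  # (1920.0, 960.0)
```

Running that mapping for every output pixel (and every frame, with yaw/pitch keyframed to follow your subject) is the gist of exporting a flat 1080p video from a 360° clip.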
Where phones have led, cars will follow—first becoming malleable, software-centric platforms (e.g. Tesla rolling out Autopilot, improving cars’ acceleration, etc.), integrating voice assistants, and now adopting contactless charging. Sure, poking a plug into my car takes all of 15 seconds, but I’ll admit to being a touch jealous:
When approaching the pad, the 530e’s sensors and internal screen will guide the driver to the necessary point above the charging pad. Once in position, the pad’s integrated coil and a secondary coil inside the car generate an alternating magnetic field that charges the car’s 9.4 kWh battery in 3.5 hours at 3.2 kW. The entire process can even be monitored via an app.
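Those numbers roughly check out: 9.4 kWh delivered at 3.2 kW would take just under three hours with zero losses, so the quoted 3.5 hours implies wireless-transfer efficiency in the mid-80s percent. A quick back-of-envelope check (capacity, power, and time are from the quote; the efficiency figure is only what those numbers imply, not a BMW spec):

```python
battery_kwh = 9.4     # 530e battery capacity (from the article)
power_kw = 3.2        # wireless pad output (from the article)
quoted_hours = 3.5    # BMW's quoted charge time

ideal_hours = battery_kwh / power_kw            # lossless charge time
implied_efficiency = ideal_hours / quoted_hours # what the quoted time implies

print(f"ideal: {ideal_hours:.2f} h, implied efficiency: {implied_efficiency:.0%}")
# ideal: 2.94 h, implied efficiency: 84%
```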
You made a grown-ass man cry like a baby by automatically making a video titled “They grow up so fast”… which has about 45 clips of videos with my daughter in them… aged around 4–5 months to 22 months (current).
I have watched that 3-minute-long video 3 times so far… first time while I cried like a baby… next two times with my jaw dropped due to the technology that made this possible.
I got one of these myself on Saturday, and now my mom & wife can’t stop watching our Henry Seamus grow from cooing blob to fun-sized weirdo. Cue gratuitous showing!
Remember when the yellow first-down line was the height of game-time badassery? Details here are scant, but this looks like a trip:
The system would require a number of high-resolution cameras mounted in various places around the stadium, each connected to a network and controlled by software. The video viewpoints are then fed into an image-processing engine that turns them into high-resolution 3D spatial data.
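The core step in a system like that, recovering 3D points from multiple synchronized camera views, is classic multi-view triangulation. Here's a hedged sketch of the idea (standard DLT triangulation with toy cameras I made up, not whatever the actual broadcast pipeline does): given each camera's 3×4 projection matrix and the pixel where it sees the same point, a least-squares solve pins down the 3D position.

```python
import numpy as np

def triangulate(projections, pixels):
    """DLT triangulation: recover a 3D point from two or more calibrated views.

    projections: list of 3x4 camera projection matrices P = K [R|t]
    pixels:      list of (u, v) observations of the same physical point
    """
    rows = []
    for P, (u, v) in zip(projections, pixels):
        # Each view adds two linear constraints on the homogeneous point X:
        # u * (P[2] @ X) = P[0] @ X  and  v * (P[2] @ X) = P[1] @ X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # The smallest singular vector minimizes ||A @ X|| subject to ||X|| = 1.
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras looking down the z-axis from different x offsets.
K = np.diag([800.0, 800.0, 1.0])
P1 = K @ np.hstack([np.eye(3), [[0.0], [0.0], [0.0]]])
P2 = K @ np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])

point = np.array([0.5, 0.2, 4.0])
est = triangulate([P1, P2], [project(P1, point), project(P2, point)])
print(np.allclose(est, point))  # True: two views recover the 3D point
```

Do that for every visible point across dozens of cameras, every frame, and you get the kind of 3D spatial data that lets a broadcaster spin the replay around a frozen moment.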
“For about a decade, from 1975 to 1985,” Vice writes, “if you witnessed moving animation on television, it was either shot one frame at a time, or made using a Scanimate machine. Only ten of the devices were ever built.”
Here they drop in on engineer Dave Sieg, who has spent the last 20 years preserving the only working Scanimate. Dave discusses the technical and cultural impact of the Scanimate and what the future holds for this iconic machine.
[YouTube] [Via Margot]
Tommy played piano like a kid out in the rain
then he lost his leg in Dallas he was dancin’ with a train
Man, I thought that my flying a drone off the back of a boat on the Mississippi was risky—but that seems laughably sane compared to Paul Nurkkala flying his drone onto, next to, inside, and under a moving freight train.
If you can’t take the queasy-making camera moves, jump to 3:20 to go underneath & 3:30 to go inside:
Nurkkala specializes in flying camera drones from a first-person point of view, via a live feed to goggles. His custom-assembled drone was equipped with a GoPro HERO5 Session action camera, which is light enough to keep the craft fast and nimble.
“I recognize that this isn’t the most ‘flowy’ video or anything, but all of the things were all in the same flight, so I wanted to show that off,” Nurkkala writes.
Austin Mann returned from India with some amazing images in hand, shot on an iPhone 8. Here he presents a brief overview of the new portrait modes available on the camera (er, phone—no, camera):
I’m stupid-excited to say that I’ve just joined Google’s Skynet Machine Perception team to build kickass creative, expressive experiences, delivering augmented reality to (let’s hope) a billion+ people. I told you sh*t just got real. 🙂
Now, the following career bits may be of interest only to me (and possibly my mom), but in case you’re wondering, “Wait, don’t you work on Google Photos…?”
Well, like SNL’s Stefon, “I’ve had a weird couple of years…”
The greatly smoothed version goes basically like this:
So now we’ve come full circle, and to capture my feelings, I’ll cite SNL yet again. Wish me luck. 🙂
More power & speed for the millions of people who use Snapseed every day:
We’re excited to announce that Snapseed 2.18 has started rolling out today to users on Android and iOS. This update includes a fresh new UI, designed for faster editing with more efficient access to your favorite features.
You’ll find Looks are now available from the main screen, making it easier than ever to apply your customized filters to your photos. Looks are a powerful way to save your favorite combinations of edits and apply them to multiple images. We’ve added 11 beautiful new presets (handcrafted by the Snapseed team) to help you get started – give them a try!
We’re also bringing the Perspective tool to iOS to allow you to easily adjust skewed lines and perfect the geometry of horizons or buildings.