Living in often bone-dry California, I can’t say that I’d thought of trying to capture waterfalls from a drone, but it’s a neat idea that Stewart Carroll covers nicely in this short overview. Meanwhile I’d like to learn more about pairing a slow shutter with device motion to freeze a subject (e.g. a moving train) while blurring the background.
The Glambot itself is the well-known Bolt high-speed cinebot by Camera Control, holding up a Phantom Flex4K camera with a Leica Summilux lens mounted on it. […]
“The pressure is on because you only ever have ONE take, and this is a dangerous rig that can knock you out,” Walliser writes. “I get good at explaining things, but sometimes the environment is so frenetic you can’t really hear me or focus.”
The key component to Oppo’s system is a periscope setup inside the phone: light comes in through one lens, gets reflected by a mirror into an array of additional lenses, and then arrives at the image sensor, which sits perpendicular to the body of the phone. That’s responsible for the telephoto lens in Oppo’s array, which has a 35mm equivalence of 160mm. Between that lens, a regular wide-angle lens, and a superwide-angle that’s 16mm-equivalent, you get the full 10x range that Oppo promises.
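As a sanity check, the claimed 10x follows directly from those equivalent focal lengths: optical zoom range is just the ratio of the longest to the shortest 35mm-equivalent focal length.

```python
# Zoom range between two 35mm-equivalent focal lengths:
# 16mm superwide to 160mm periscope telephoto, per the specs above.
def zoom_ratio(wide_mm: float, tele_mm: float) -> float:
    """Optical zoom ratio implied by the two ends of the lens array."""
    return tele_mm / wide_mm

print(zoom_ratio(16, 160))  # → 10.0
```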
Bruce Berry (not Neil Young’s late roadie) created some beautiful time lapse imagery from images captured aboard the International Space Station:
On Vimeo he writes,
All footage has been edited, color graded, denoised, deflickered, and stabilized by myself. Some of the 4K video clips were shot at 24 frames/sec, reflecting the actual speed of the space station over the earth. Shots taken at wider angles were sped up a bit to match the flow of the video.
Some interesting facts about the ISS: The ISS maintains an orbit above the earth at an altitude of between 330 and 435 km (205 and 270 miles). The ISS completes 15.54 orbits per day around the earth and travels at a speed of 27,600 km/h (17,100 mph).
The yellow line that you see over the earth is Airglow/Nightglow, a layer of nighttime light emissions caused by chemical reactions high in Earth’s atmosphere. A variety of reactions involving oxygen, sodium, ozone, and nitrogen result in the production of a very faint amount of light (Keck A and Miller S et al. 2013).
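Those quoted numbers hang together, by the way. Assuming an altitude of roughly 400 km (inside the quoted 330–435 km band), the orbital period implied by 15.54 orbits/day matches the quoted speed almost exactly:

```python
import math

# Back-of-envelope check of the quoted ISS figures. The 400 km altitude
# is an assumed value within the 330-435 km band mentioned above.
EARTH_RADIUS_KM = 6371
altitude_km = 400

period_min = 24 * 60 / 15.54                              # minutes per orbit
circumference_km = 2 * math.pi * (EARTH_RADIUS_KM + altitude_km)
speed_kmh = circumference_km / (period_min / 60)

print(round(period_min, 1))  # ≈ 92.7 minutes per orbit
print(round(speed_kmh))      # within ~0.2% of the quoted 27,600 km/h
```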
I love the choice of music & wondered whether it comes from Dunkirk. Close: that somewhat anxious tock-tock undertone is indeed a Hans Zimmer jam, but from 20 years earlier (The Thin Red Line).
I’m bemused/amused to see this once-obscure (to non-speakers of Japanese) term now getting verbed in an Apple ad that’s racked up nearly 20 million views since Friday. (“This is the strangest life I’ve ever known…”)
Johnny Schaer (Johnny FPV) is a pro drone racer. His drones are designed to be light, quick, and nimble, flying upside down and through all kinds of crazy flightpaths that DJI’s drones could never achieve. And when somebody with the skill of Johnny turns on the camera, that’s when you get results like the video above.
To shoot the footage, Johnny used a drone built around the AstroX X5 Freestyle frame (JohnnyFPV edition, obviously) with a GoPro Hero 7. It has no GPS, no gimbal, no stabilisation, no collision avoidance, none of those safety features that make more commercial drones predictable and easy to fly.
The composite red, green, and blue value of every pixel in a digital photo is created through a process called demosaicing.
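To make that concrete, here’s a deliberately naive sketch in plain Python with made-up values: each 2x2 RGGB block of raw Bayer sensor values collapses into one RGB pixel, averaging the two green photosites. Real demosaicers (including CNN-based ones) interpolate the missing channels at full resolution instead of halving it.

```python
# Naive half-resolution demosaic: each 2x2 RGGB Bayer block becomes one
# RGB pixel. Sensor values below are hypothetical, for illustration only.
def demosaic_half(bayer):
    """bayer: 2D list of raw values laid out as an RGGB mosaic."""
    rgb = []
    for y in range(0, len(bayer) - 1, 2):
        row = []
        for x in range(0, len(bayer[0]) - 1, 2):
            r = bayer[y][x]                              # red photosite
            g = (bayer[y][x + 1] + bayer[y + 1][x]) / 2  # average of two greens
            b = bayer[y + 1][x + 1]                      # blue photosite
            row.append((r, g, b))
        rgb.append(row)
    return rgb

mosaic = [[200, 100, 210, 110],   # R G R G
          [ 90,  40,  95,  45]]   # G B G B
print(demosaic_half(mosaic))  # → [[(200, 95.0, 40), (210, 102.5, 45)]]
```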
Enhance Details uses an extensively trained convolutional neural net (CNN) to optimize for maximum image quality. We trained a neural network to demosaic raw images using problematic examples […] As a result, Enhance Details will deliver stunning results including higher resolution and more accurate rendering of edges and details, with fewer artifacts like false colors and moiré patterns. […]
We calculate that Enhance Details can give you up to 30% higher resolution on both Bayer and X-Trans raw files using Siemens Star resolution charts.
Hmm—I’m having a hard time wrapping my head around the resolution claim, at least based on the results shown (which depict an appreciable but not earth-shattering change). Having said that, I haven’t put the tech to the test, but I look forward to doing so.
For more info check out the related help doc plus some deep nerdery on how it all works.
Almost looks deceptively pleasant & prosperous in these lovely aerials:
Pyongyang is by far the weirdest and strangest place I have ever been to. At the same time it’s also one of the most interesting and intriguing places, unlike anywhere else I have ever been. You go there with 100 questions and you return with 1000!
Heh—I love the fun that Cuban fashion brand Clandestina is having with the Chrome “no internet” dino. Here he dodges palm trees, pineapples, and old Chevys before finally colliding with his nemesis, connectivity (“3G”).
I am, as the kids would say, there for this documentary:
The film is composed entirely of archival footage and audio:
Miller and his team collaborated with NASA and the National Archives (NARA) to locate all of the existing footage from the Apollo 11 mission. In the course of sourcing all of the known imagery, NARA staff members made a discovery that changed the course of the project — an unprocessed collection of 65mm footage, never before seen by the public. Unbeknownst to even the NARA archivists, the reels contained wide format scenes of the Saturn V launch, the inside of the Launch Control Center and post-mission activities aboard the USS Hornet aircraft carrier.
The find resulted in the project evolving from one of filmmaking alone to one of film curation and historic preservation as well. The resulting transfer — from which the documentary was cut — is the highest resolution, highest quality digital collection of Apollo 11 footage in existence.
I also loved this music video made using mission audio & imagery:
Terrific work from Tarsicio Sañudo, who according to PetaPixel “shot thousands of RAW photos with his DJI Mavic 2 Pro over the course of two months.” He mentions using After Effects for post-capture stabilization.
Being able to preset one’s flight path on a map seems like a great way to set up shots that transition from day to night—especially cool when done with hyperlapses. Now to find a sufficiently interesting area in which to try it. See below for a demo/tutorial.
Oh, and there’s a really significant (for me, anyway) tweak hanging out in the corresponding firmware update: “Fixed issue: could not open Sphere panorama photos in Facebook.” The absence of the correct metadata was an ongoing pain that prevented me from seeing panos as interactive in Google Photos or making them interactive on Facebook. I haven’t yet installed the update, but I have my fingers crossed. [Update: It works!]
Whoa—apparently Irish Wonder Twin Powers involve an insane work ethic for finding interesting earthly patterns:
I was getting a sense of deja vu watching this, and PetaPixel helpfully writes,
If this project reminds you of “Arena” by Páraic McGloughlin, there’s a good reason for that: Páraic is Kevin’s twin brother, and the two had originally planned to create a single collaborative video before splitting up to work independently on two separate videos in the same office.
Just a little drone fun my Mini Me & I had in Barstow, CA, at New Years (with a big hat tip to Mr. Johnny Cash):
And on the very off chance you’re interested in having a very rail-savvy 9-year-old tour you around the Western America Railroad Museum, well, you’re in luck. 😌
The lads and I are just back from an overnight visit to the USS Hornet, a decorated World War II-era carrier we last visited some 7 years ago. This time around we spent the night with our Cub Scout pack & several hundred other scouts & parents from around the area. On the whole we had a ball touring the ship, and I had a little fun flying my drone over the Hornet & her adjacent Navy ships:
And here’s an interactive 360º panorama from overhead. (Obligatory nerdy sidenote: This is the JPEG version stitched on the fly by the drone, and although I was able to stitch the raw source images in Camera Raw & get better color/tone, I’ll be damned if I can figure out how to inject the proper metadata to make it display right. As usual I used EXIF Fixer to make the JPEG interactive.)
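For the curious: what tools like EXIF Fixer do is embed Google’s GPano XMP metadata in the JPEG so viewers know to render it as an equirectangular 360º image. Here’s a minimal sketch of that injection — the pared-down XMP packet and APP1 placement below are illustrative assumptions, not EXIF Fixer’s actual implementation:

```python
import struct

# Minimal GPano XMP packet marking a JPEG as an equirectangular panorama.
# A real tool would also record pano dimensions, heading, etc.
GPANO_XMP = """<?xpacket begin="\ufeff" id="W5M0MpCehiHzreSzNTczkc9d"?>
<x:xmpmeta xmlns:x="adobe:ns:meta/">
 <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <rdf:Description xmlns:GPano="http://ns.google.com/photos/1.0/panorama/"
   GPano:ProjectionType="equirectangular"
   GPano:UsePanoramaViewer="True"/>
 </rdf:RDF>
</x:xmpmeta>
<?xpacket end="w"?>"""

# Standard namespace header that identifies an APP1 segment as XMP.
XMP_HEADER = b"http://ns.adobe.com/xap/1.0/\x00"

def add_gpano(jpeg_bytes: bytes) -> bytes:
    """Insert an APP1/XMP segment right after the JPEG SOI marker."""
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG")
    payload = XMP_HEADER + GPANO_XMP.encode("utf-8")
    # APP1 marker (0xFFE1) + big-endian length (includes the 2 length bytes)
    segment = b"\xff\xe1" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]
```

(The command-line tool exiftool can write the same XMP-GPano tags without any code, which is the saner route in practice.)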
“It’s a hell of a lot easier to sit on your ass in a vehicle for thousands of miles than it is to carry 80 lbs of gear on your back into the wilderness for dozens of miles,” writes Nicolaus Wegner, explaining his interest in capturing storm time lapses. “Plus, I think supercells and other forms of severe weather are just about the coolest events our planet manifests.” Agreed:
I have no idea whether this thing is worth a damn—but I’d sure like to find out (well, with the caveat that if it’s awesome, it’d be one more piece of bulky kit to schlepp around):
Using an astronaut’s perspective on intuitive motion through space, we have patented a unique and intuitive drone controller that anyone, whether they’re eight or eighty, can pick up and begin using immediately.
The FT Aviator is designed to incorporate the relevant 4 degrees of freedom of movement (x, y, z, and yaw) to drone flying, eliminating the awkward interface and steeper learning curve of existing dual thumb-controlled drones. It intuitively unlocks human potential to fly and capture stunning imagery.
We just passed the 50th anniversary of the creation of “the most influential environmental photograph ever taken,” captured on Christmas Eve, 1968, by astronaut Bill Anders during the Apollo 8 mission. Here’s a great short look into this photographic history:
What a gorgeous way to ring in the still-new year:
Pilot/photographer Sigurður Þór Helgason writes,
Happy New Year 2019. This is Reykjavik city on New Year’s Eve. Most households shoot their own fireworks just before midnight, so the outcome is a spectacular fireworks show, unlike any other. Music by Adi Goldstein.
Note: I shot this with my Mavic 2 Pro. I used D-Log M, ISO 1600, shutter 1/25, frame rate 25, and used the LUT from DJI to bring the colours back. No other adjustments, and no, there are no special effects in this video or post production. Hope you enjoy this.
This is a watershed moment for me: After 11+ years of shooting on iPhones & Canon DSLRs, this is the first time I’ve shot on an Android device that plainly outshines them both at something. Night Sight on Pixel 3 blows me away.
First, some important disclaimers:
I work at Google & get to collaborate with the folks responsible for this tech, but I can take no credit for it, and these are just my opinions & non-scientific findings.
I’m not here to rain on anybody’s parade. My iPhone X is great, and the 70D has been a loyal workhorse. I have no plans to ditch either.
The 70D came out in 2013, and it’s obviously possible to get both a newer DSLR & a lens faster than my 24-70mm f/2.8.
It’s likewise possible to know a lot more about manual exposure than I do. I went only as far as to choose aperture priority, open the aperture all the way, and set ISO to Auto.
Having said all that, I think my results reasonably represent what a normal-to-semi-savvy person would get from the various devices. Here’s what I saw:
Pixel 3 vs. 70D shots (set one, set two), all unedited. CR2 files from the 70D got converted to JPEG using default processing in Lightroom. In many cases the 70D struggled to focus (whereas the Pixel never did), so some of its shots are soft as well as dark.
Pixel 3 vs. iPhone X on a separate evening. With a few subjects (e.g. this one) I tried taking an iPhone shot with default (auto) exposure, then one with exposure manually cranked up, and finally one with Pixel 3 Night Sight. Here’s another triplet. Regrettably I didn’t think to try shooting raw on either phone.
By the way, Happy New Year! Here’s an animation created last night by shooting a series of Night Sight images, then combining them in Google Photos & finally cropping the output in Photoshop.
PS—I love the Queen-powered “Flash!” ad showing Night Sight:
I find this kind of thing endlessly, eerily fascinating:
A collection of high-quality remastered prints from the dawn of film, taken in Belle Époque-era Paris, France, from 1896-1900. The footage has been slowed to a natural rate, with sound added for ambiance. These films were taken by the Lumière company.
0:08 – Notre-Dame Cathedral (1896)
0:58 – Alma Bridge (1900)
1:37 – Avenue des Champs-Élysées (1899)
2:33 – Place de la Concorde (1897)
3:24 – Passing of a fire brigade (1897)
3:58 – Tuileries Garden (1896)
4:48 – Moving walkway at the Paris Exposition (1900)
5:24 – The Eiffel Tower from the Rives de la Seine à Paris (1897)
If that’s up your horse-trodden alley, check out a similar piece from NYC that I posted earlier in the year:
Trent Mitchell shares his incredible devotion to capturing ethereal, ephemeral moments in this terrific short film by Robert Sherwood:
“My aim to render the true essence of the human condition and a mirror of one’s self could only be captured within the moment and in a single breath,” writes Mitchell. “In a space that moved the subject and the viewer with equal pull.” [Via]
Illuminating stuff as always from Stewart & Drone Film Guide:
Of course, I’m reminded that before I even bother with this stuff, I need to move myself off the absolutely shite color tools in iMovie and onto… what, exactly? As I mentioned the other day, the new Adobe Rush’s tools are really anemic; learning Premiere Pro seems like no joke; and I don’t care to pay for Final Cut. Hmm—to be continued.
The distinct H-shaped yoke determined both roll and pitch. Airspeed was controlled by the number of rocket chambers—up to four—fired by the silver thumb-switch to the left of the yoke; there was no throttle.
The Mach indicator above goes to Mach 1.5; it was most likely installed after Yeager’s first transonic flight. It’s flanked by a conventional altimeter and airspeed indicator. The fastest Glamorous Glennis ever flew was Mach 1.45.
Yeager signed his name in the cockpit of Glamorous Glennis on many occasions over the decades. (He piloted 33 of the aircraft’s 78 career test flights, including its last, on May 12, 1950.) Can you find all his signatures?
Just a quick bit of flying Thanksgiving weekend near Pismo Beach. A few thoughts:
Color grading in iMovie is for the birds, but somehow it’s no better in Adobe Rush (which lacks an Auto button (!), much less key framing), and learning Premiere Pro always seems like too big a hill to climb.
I likewise find it hard to cut on the beats—a problem compounded when I share the output to YouTube and Facebook (where, I swear to God, somehow the audio & video get differently out of sync).
I’ve gotta learn how to avoid (or later compensate for) the gross propeller shadows that appear in a few shots here.
No, the soundtrack doesn’t really fit (an assessment my 9-year-old Henry cheerfully volunteered 🙄), but, eh, I found the juxtaposition oddly fun. YMMV.
ESA astronaut Alexander Gerst… shot the photos in the timelapse by mounting a camera in the Cupola module and using an intervalometer to snap photos at regular intervals. Played back at 8 to 16 times normal speed, the timelapse above shows around 15 minutes of the rocket’s launch.
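The math behind that speedup is simple: an intervalometer timelapse accelerates time by the shooting interval multiplied by the playback frame rate. The interval below is an assumed example for illustration, not Gerst’s actual setting.

```python
# Speedup factor of an intervalometer timelapse: seconds between frames
# times playback frames per second. Values here are hypothetical.
def speedup(interval_s: float, playback_fps: float) -> float:
    """How many times faster than real life the timelapse plays."""
    return interval_s * playback_fps

# e.g. one frame every 0.5 s, played back at 24 fps → 12x real time,
# squarely in the 8-16x range mentioned above
print(speedup(0.5, 24))  # → 12.0
```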
Pretty much like it says on the tin. PetaPixel writes,
There isn’t a filter in the app that lets you selectively see only Portrait mode photos, but the new option in the Edit menu will be present for any Portrait shot.
Download the latest version of Google Photos for iOS to get started with this new feature. Depth editing is already available on the Pixel 3, Pixel 2, and Moto phones that have depth photo support. Google says it’ll also be adding more Android devices soon.
This is easily the most awe-inspiring and jaw-dropping thing I’ve seen in months. In its low Earth orbit ~250 miles above our planet, the International Space Station takes about 90 minutes to complete one orbit. Fewer than 600 people have ever orbited our planet, but with this realtime video by Seán Doran, you can experience what it looks like from the vantage point of the ISS for the full 90 minutes.
After filming the band performing the song, director Johnny Jansen spent $680 on printing out 2,250 of the frames on regular paper with a laser printer. With a crew of 6 people, Jansen then painstakingly photographed each print in a new photo to create the stop-motion video.
Visiting the NY Times was always among the real treats of my time working on Photoshop. I was always struck by the thoughtfulness & professionalism of the staff, but also by the gritty, brass-tacks considerations of cranking through thousands of images daily, often using some pretty dated infrastructure.
The morgue contains photos from as far back as the late 19th century, and many of its contents have tremendous historical value—some that are not stored anywhere else in the world. In 2015, a broken pipe flooded the archival library, putting the entire collection at risk. Luckily, only minor damage was done, but the event raised the question: How can some of the company’s most precious physical assets be safely stored?
I believe we can’t abandon our sense of adventure because we lose our ability to see it, and it has become my goal to help people who live with similar challenges, and show them that anything is possible.
In 2013, I became the first blind person to kayak the entire 226 miles of the Colorado River through the Grand Canyon. But I always felt it didn’t mean anything unless I found a way to pay it forward. So I joined up with the good folks at Team River Runner, a nonprofit dedicated to providing all veterans and their families an opportunity to find health, healing, community, and purpose. Together we had the audacious goal of supporting four other blind veterans on a trip down the Grand Canyon.
Facebook’s 3D photos (generated from portrait-mode images) have quickly proven to be my favorite feature added to that platform in years. Hover or drag over this example:
The academic research they’ve shared, however, promises to go farther, enabling VR-friendly panoramas with parallax. The promise is basically “Take 30 seconds to shoot a series of images, then allow another 30 seconds for processing.” The first portion might well be automated, enabling the user to simply pan slowly across a scene.
This teaser vid shows how scenes are preserved in 3D, enabling post-capture effects like submerging them in water:
Will we see this ship in FB, and if so when? Your guess is as good as mine, but I find the progress exciting.
Time, they say, has the nice property of keeping everything from happening at once. But what would it look like if everything did happen at once?
Photographer Páraic McGloughlin hung out on a bridge in Sligo, Ireland, for 19 hours to create a single, day-long shot that he then manipulated. Colossal writes,
“Using a fundamental image (a time lapse) to mask and cut into, I tried to show the variable possibilities within a limited time span, maintaining the integrity of each individual photograph while dissecting and rearranging the overall image.” The visual content was matched with each layer of audio created by Cooper to form the song, which stacks up to over one hundred layers.
My colleague Richard is in charge of Google News, and in addition to doing a million other interesting things, he’s an accomplished aerial photographer. I enjoy the perspectives—literal & figurative—he shares in this meditative piece:
Man, apparently some of the biker scouts from Endor lived on & now thread drones through forest gaps over crazy mountain biking trails, as seen in this bonkers project from Cinematic Flow:
“We’ve been designing and refining FPV drones for five years now. When Kilian spoke about his idea of putting a GoPro Fusion on one of our drones, we were intrigued and thrilled by this new challenge. The design and the flying of this setup are so different from what we’re used to; there were loads of crashes, but the end result is so refreshing and pushes the drone shot to the next level.” —Pierre, engineer at Cinematic Flow
Now, how about a look behind the scenes?
We trust the cam, we ensure a light flying and tighten the buttocks to export!
Wow—check out this amazing sneak peek from Adobe’s Long Mai (see paper):
Enables any photograph to be turned into a live photo, animating the image in 3D and simulating the realistic effect of flying through the scene.
This is especially dear to my heart.
As a brand new Photoshop PM (in 2002—gah!), one of my first trips was back to NYC to visit motion graphics artists. Touring one shop I was amazed to glimpse a technique I’d never seen, using Photoshop to break 2D photos into layers, fill in gaps, and then animate the results in After Effects. Later that year the work came to the big screen in The Kid Stays in the Picture, the documentary that now lends its name to this ubiquitous parallax effect.
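The core of that 2.5D effect is simple once the photo is cut into layers: a virtual camera move shifts each layer in inverse proportion to its depth, so foreground elements slide past the background. A toy sketch, where the 1/depth falloff and the depth values are illustrative assumptions:

```python
# Toy 2.5D parallax: for a given horizontal camera pan, each cut-out layer
# shifts inversely with its depth (nearer layers move more). The 1/depth
# model and depth values are assumptions for illustration.
def layer_offsets(depths, camera_shift_px):
    """Per-layer horizontal pixel shift for a virtual camera pan."""
    return [camera_shift_px / d for d in depths]

# foreground at depth 1, midground at 2, background at 4
print(layer_offsets([1, 2, 4], 40))  # → [40.0, 20.0, 10.0]
```

Compositing each shifted layer over the next (with gaps behind the foreground filled in, as the Photoshop workflow above did by hand) yields the familiar floating-photo look.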
Here Yorgo Alexopoulos talks about how he developed the technique & how he’s leveraged it in later works:
So, while we wait for Adobe’s new tech to ship, how could one do this by hand? Below, artist Joe Fellows gives a brief, highly watchable demo of how it’s done (although it physically pains me to see him using the Pen tool to make selections & no Content-Aware Fill to at least block in the gaps):
Man, I used to hate demoing alongside After Effects during internal Adobe events: We had Photoshop, sure—but they were Photoshop on wheels. You could just pencil them in for the Top Gun trophy nearly every time.
Making Content-Aware Fill work at all is hard—but making it effective over multiple frames (“temporally coherent,” in our nerdy parlance)? Well, that requires FM technology—F’ing Magic. Here’s a naive implementation (not from Adobe):
Cool, artsy—but generally not so useful. And here (at 1:50:44) it is as the After Effects team intends to ship it next year (first sneak-peeked last year as Project Cloak):
Special props to Jason Levine for vamping through the calculation phase & then going full “When Harry Met Sally deli scene” at the conclusion. As a friend noted, “I’ll have what he’s having.” 😝
[Y]ou just take a photo in Portrait mode using your compatible dual-lens smartphone, then share as a 3D photo on Facebook where you can scroll, pan and tilt to see the photo in realistic 3D… Everyone will be able to see 3D photos in News Feed and VR today, while the ability to create and share 3D photos begins to roll out today and will be available to everyone in the coming weeks.
Check out their post for tips on composing a 3D-friendly image (e.g. include lots of foreground/background separation; avoid transparent objects like drinking glasses).
“Been waiting to build this since the beginning of Google Photos :)” tweeted Dave Lieb, product lead for Google Photos. As TechCrunch writes,
[U]sing A.I. technologies and facial recognition is a next step, and one that makes Google Photos an even more compelling app. In practice, it means that you wouldn’t have to manually share photos with certain people ever again – you can just set up a Live Album once, and then allow the automation to take over.
Oh, and with the newly announced Google Home Hub, people (e.g. my folks) can have an auto-updating picture frame showing specific people (e.g. our kids).