Category Archives: Photography

Long-exposure drone photography

Living in often bone-dry California, I can’t say that I’d thought of trying to capture waterfalls from a drone, but it’s a neat idea that Stewart Carroll covers nicely in this short overview. Meanwhile I’d like to learn more about pairing a slow shutter with device motion to freeze a subject (e.g. a moving train) while blurring the background.

[YouTube]

Photography: Red carpet robot @ 1,000fps

I’m intrigued by the Glambot:

The Glambot itself is the well-known Bolt high-speed cinebot by Camera Control, holding up a Phantom 4K Flex camera with a Leica Summilux lens mounted on it. […]

“The pressure is on because you only ever have ONE take, and this is a dangerous rig that can knock you out,” Walliser writes. “I get good at explaining things, but sometimes the environment is so frenetic you can’t really hear me or focus.”

Check it out in action:

[YouTube]

A peek at Oppo’s new 10x optical zoom for phones

Looks pretty nifty, though it’s interesting that it doesn’t (at least currently) work for capturing video or macro shots:

The Verge explains,

The key component to Oppo’s system is a periscope setup inside the phone: light comes in through one lens, gets reflected by a mirror into an array of additional lenses, and then arrives at the image sensor, which sits perpendicular to the body of the phone. That’s responsible for the telephoto lens in Oppo’s array, which has a 35mm equivalence of 160mm. Between that lens, a regular wide-angle lens, and a superwide-angle that’s 16mm-equivalent, you get the full 10x range that Oppo promises.
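The “10x” figure is just the ratio of the longest to the shortest focal length in the array; a quick sanity check using the 35mm equivalences quoted above:

```python
# 35mm-equivalent focal lengths quoted for Oppo's three-camera array
superwide_mm = 16   # superwide-angle lens
telephoto_mm = 160  # periscope telephoto lens

zoom_range = telephoto_mm / superwide_mm
print(zoom_range)  # → 10.0, the "10x" Oppo advertises
```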

[YouTube]

Photography: Beautiful orbital shots of “The World Below”

Bruce Berry (not Neil Young’s late roadie) created some beautiful time lapse imagery from images captured aboard the International Space Station:

On Vimeo he writes,

All footage has been edited, color graded, denoised, deflickered, and stabilized by myself. Some of the 4K video clips were shot at 24 frames/sec, reflecting the actual speed of the space station over the earth. Shots taken at wider angles were sped up a bit to match the flow of the video.

Some interesting facts about the ISS: The ISS maintains an orbit above the earth with an altitude of between 330 and 435 km (205 and 270 miles). The ISS completes 15.54 orbits per day around the earth and travels at a speed of 27,600 km/h (17,100 mph).

The yellow line that you see over the earth is Airglow/Nightglow. Airglow/Nightglow is a layer of nighttime light emissions caused by chemical reactions high in Earth’s atmosphere. A variety of reactions involving oxygen, sodium, ozone, and nitrogen result in the production of a very faint amount of light (Keck A and Miller S et al. 2013).
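Those orbital figures hang together nicely, by the way; here’s a quick back-of-the-envelope check (assuming a roughly circular orbit and Earth’s mean radius of 6,371 km):

```python
import math

orbits_per_day = 15.54
speed_kmh = 27_600
altitude_km = (330 + 435) / 2   # midpoint of the quoted altitude range
earth_radius_km = 6_371         # Earth's mean radius

period_h = 24 / orbits_per_day             # ≈ 1.54 h, i.e. ~93 minutes per orbit
distance_per_orbit = speed_kmh * period_h  # distance covered in one orbit
circumference = 2 * math.pi * (earth_radius_km + altitude_km)  # orbit circumference

# The two independent estimates agree to within about half a percent.
print(round(period_h * 60), round(distance_per_orbit), round(circumference))
```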

I love the choice of music & wondered whether it comes from Dunkirk. Close: that somewhat anxious tock-tock undertone is indeed a Hans Zimmer jam, but from 20 years earlier (The Thin Red Line).

[YouTube]

Eye-popping racing drone photography

Holy crap! Now my stuff looks positively lethargic ¯\_(ツ)_/¯, but what the heck, strap in & enjoy:

DIY Photography writes,

Johnny Schaer (Johnny FPV) is a pro drone racer. His drones are designed to be light, quick, nimble, fly upside down and through all kinds of crazy flightpaths that DJI’s drones could never achieve. And when somebody with the skill of Johnny turns on the camera, that’s when you get results like the video above.

To shoot the footage, Johnny used a drone built around the AstroX X5 Freestyle Frame (JohnnyFPV edition, obviously) frame with a GoPro Hero 7. It has no GPS, no gimbal, no stabilisation, no collision avoidance, none of those safety features that make more commercial drones predictable and easy to fly. 

[YouTube]

Adobe’s “Enhance Details” promises higher res, fewer artifacts

“Enhance!” The latest changes in Camera Raw & Lightroom promise to improve the foundational step in raw processing:

The composite red, green, and blue value of every pixel in a digital photo is created through a process called demosaicing.

Enhance Details uses an extensively trained convolutional neural net (CNN) to optimize for maximum image quality. We trained a neural network to demosaic raw images using problematic examples […] As a result, Enhance Details will deliver stunning results including higher resolution and more accurate rendering of edges and details, with fewer artifacts like false colors and moiré patterns. […]

We calculate that Enhance Details can give you up to 30% higher resolution on both Bayer and X-Trans raw files using Siemens Star resolution charts.

Hmm—I’m having a hard time wrapping my head around the resolution claim, at least based on the results shown (which depict an appreciable but not earth-shattering change). Having said that, I haven’t put the tech to the test, but I look forward to doing so.
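For context on what’s being improved: each photosite on a Bayer sensor records only one of red, green, or blue, and demosaicing interpolates the two missing channels at every pixel. Here’s a toy sketch of classical neighbor averaging (the kind of approach Adobe’s CNN replaces, not Adobe’s method itself):

```python
# One 2x2 RGGB Bayer tile: each photosite records a single channel.
bayer = {
    (0, 0): ("R", 200),
    (0, 1): ("G", 120),
    (1, 0): ("G", 118),
    (1, 1): ("B", 60),
}

# Naive demosaic of the top-left (red) photosite: keep the measured red,
# average the adjacent greens, and borrow the nearest blue.
r = bayer[(0, 0)][1]
g = (bayer[(0, 1)][1] + bayer[(1, 0)][1]) / 2
b = bayer[(1, 1)][1]
print((r, g, b))  # → (200, 119.0, 60)
```

Simple interpolation like this is exactly what produces the false colors and moiré patterns the quote mentions, especially around fine edges.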

For more info check out the related help doc plus some deep nerdery on how it all works.

New 65mm “Apollo 11” looks amazing

I am, as the kids would say, there for this documentary:

The film is comprised entirely of archival footage and audio:

Miller and his team collaborated with NASA and the National Archives (NARA) to locate all of the existing footage from the Apollo 11 mission. In the course of sourcing all of the known imagery, NARA staff members made a discovery that changed the course of the project — an unprocessed collection of 65mm footage, never before seen by the public. Unbeknownst to even the NARA archivists, the reels contained wide format scenes of the Saturn V launch, the inside of the Launch Control Center and post-mission activities aboard the USS Hornet aircraft carrier.

The find resulted in the project evolving from one of only filmmaking to one of also film curation and historic preservation. The resulting transfer — from which the documentary was cut — is the highest resolution, highest quality digital collection of Apollo 11 footage in existence.

I also loved this music video made using mission audio & imagery:

[YouTube 1 & 2] [Via]

Mavic 2 gains waypoint support, pano fix

Being able to preset one’s flight path on a map seems like a great way to set up shots that transition from day to night—especially cool when done with hyperlapses. Now to find a sufficiently interesting area in which to try it. See below for a demo/tutorial.

Oh, and there’s a really significant (for me, anyway) tweak hanging out in the corresponding firmware update: “Fixed issue: could not open Sphere panorama photos in Facebook.” The absence of the correct metadata was an ongoing pain that prevented me from seeing panos as interactive in Google Photos or making them interactive on Facebook. I haven’t yet installed the update, but I have my fingers crossed. [Update: It works!]

[YouTube]

“Epoch”: A warp-speed tour powered by Google Earth imagery

Whoa—apparently Irish Wonder Twin Powers involve an insane work ethic for finding interesting earthly patterns:

I was getting a sense of deja vu watching this, and PetaPixel helpfully writes,

If the project reminds you of “Arena” by Páraic McGloughlin, there’s a good reason for that: Páraic is Kevin’s twin brother, and the two had originally planned to create a single collaborative video before splitting and working independently on two separate videos while working in the same office.

[Vimeo]

Hey Porter!

Just a little drone fun my Mini Me & I had in Barstow, CA, at New Years (with a big hat tip to Mr. Johnny Cash):

And on the very off chance you’re interested in having a very rail-savvy 9-year-old tour you around the Western America Railroad Museum, well, you’re in luck. 😌

[YouTube 1 & 2]

A little maritime drone fun

The lads and I are just back from an overnight visit to the USS Hornet, a decorated World War II-era carrier we last visited some 7 years ago. This time around we spent the night with our Cub Scout pack & several hundred other scouts & parents from around the area. On the whole we had a ball touring the ship, and I had a little fun flying my drone over the Hornet & her adjacent Navy ships:

And here’s an interactive 360º panorama from overhead. (Obligatory nerdy sidenote: This is the JPEG version stitched on the fly by the drone, and although I was able to stitch the raw source images in Camera Raw & get better color/tone, I’ll be damned if I can figure out how to inject the proper metadata to make it display right. As usual I used EXIF Fixer to make the JPEG interactive.)
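For the record, the metadata in question is Google’s GPano XMP block; tools like EXIF Fixer (or exiftool) write it for you, but here’s a minimal hand-rolled sketch of what they do, assuming a standard JPEG that starts with an SOI marker (the pixel dimensions you’d pass in are whatever your stitched pano measures):

```python
# Inject a Google Photo Sphere (GPano) XMP packet into a JPEG so viewers
# treat it as an interactive equirectangular panorama.
XMP_TEMPLATE = (
    '<x:xmpmeta xmlns:x="adobe:ns:meta/">'
    '<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">'
    '<rdf:Description xmlns:GPano="http://ns.google.com/photos/1.0/panorama/" '
    'GPano:ProjectionType="equirectangular" '
    'GPano:UsePanoramaViewer="True" '
    'GPano:FullPanoWidthPixels="{w}" GPano:FullPanoHeightPixels="{h}"/>'
    "</rdf:RDF></x:xmpmeta>"
)

def add_gpano(jpeg_bytes: bytes, width: int, height: int) -> bytes:
    xmp = XMP_TEMPLATE.format(w=width, h=height).encode("utf-8")
    # The standard XMP-in-JPEG signature precedes the packet in an APP1 segment.
    payload = b"http://ns.adobe.com/xap/1.0/\x00" + xmp
    # APP1 marker + 2-byte big-endian length (the length field counts itself).
    segment = b"\xff\xe1" + (len(payload) + 2).to_bytes(2, "big") + payload
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]  # splice in right after SOI
```

Feeding the result to Facebook or Google Photos is what flips them into the interactive 360º viewer, at least in principle; in practice a dedicated tool also handles pre-existing XMP segments, which this sketch ignores.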

[YouTube]

“Fluidity” drone controller

I have no idea whether this thing is worth a damn—but I’d sure like to find out (well, with the caveat that if it’s awesome, it’d be one more piece of bulky kit to schlepp around):

Using an astronaut’s perspective on intuitive motion through space, we have patented a unique and intuitive drone controller that anyone, whether they’re eight or eighty, can pick up and begin using immediately.

The FT Aviator is designed to incorporate the relevant 4 degrees of freedom of movement (x, y, z, and yaw) to drone flying, eliminating the awkward interface and steeper learning curve of existing dual thumb-controlled drones. It intuitively unlocks human potential to fly and capture stunning imagery.

[YouTube]

Photography: A New Year’s flight through Icelandic fireworks

What a gorgeous way to ring in the still-new year:

Pilot/photographer Sigurður Þór Helgason writes,

Happy New Year 2019. This is Reykjavik city on New Year’s Eve. Most households shoot their own fireworks just before midnight, so the outcome is a spectacular fireworks show, unlike any other. Music by Adi Goldstein.

Note: I shot this with my Mavic 2 Pro. I used D-log M, ISO 1600, shutter 1/25, frame rate 25, and used the LUT from DJI to bring the colours back. No other adjustments, and no, there are no special effects in this video or post production. Hope you enjoy this.

Night Sight is outta sight!

This is a watershed moment for me: After 11+ years of shooting on iPhones & Canon DSLRs, this is the first time I’ve shot on an Android device that plainly outshines them both at something. Night Sight on Pixel 3 blows me away.

First, some important disclaimers:

  • I work at Google & get to collaborate with the folks responsible for this tech, but I can take no credit for it, and these are just my opinions & non-scientific findings.
  • I’m not here to rain on anybody’s parade. My iPhone X is great, and the 70D has been a loyal workhorse. I have no plans to ditch either.
  • The 70D came out in 2013, and it’s obviously possible to get both a newer DSLR & a lens faster than my 24-70mm f/2.8.
  • It’s likewise possible to know a lot more about manual exposure than I do. I went only as far as to choose aperture priority, crank the exposure wide open, and set ISO to Auto.

Having said all that, I think my results reasonably represent what a normal-to-semi-savvy person would get from the various devices. Here’s what I saw:

What do you think?

By the way, Happy New Year! Here’s an animation created last night by shooting a series of Night Sight images, then combining them in Google Photos & finally cropping the output in Photoshop.

PS—I love the Queen-powered “Flash!” ad showing Night Sight:

[YouTube]

Restored footage shows 19th century Parisian street life in motion

I find this kind of thing endlessly, eerily fascinating:

A collection of high quality remastered prints from the dawn of film taken in Belle Époque-era Paris, France from 1896–1900. Slowed down footage to a natural rate and added in sound for ambiance. These films were taken by the Lumière company.

0:08 – Notre-Dame Cathedral (1896)
0:58 – Alma Bridge (1900)
1:37 – Avenue des Champs-Élysées (1899)
2:33 – Place de la Concorde (1897)
3:24 – Passing of a fire brigade (1897)
3:58 – Tuileries Garden (1896)
4:48 – Moving walkway at the Paris Exposition (1900)
5:24 – The Eiffel Tower from the Rives de la Seine à Paris (1897)

If that’s up your horse-trodden alley, check out a similar piece from NYC that I posted earlier in the year:

[YouTube 1 & 2]

Tutorial: The benefits of shooting 10-bit color

Illuminating stuff as always from Stewart & Drone Film Guide:

Of course, I’m reminded that before I even bother with this stuff, I need to move myself off the absolutely shite color tools in iMovie and onto… what, exactly? As I mentioned the other day, the new Adobe Rush’s tools are really anemic; learning Premiere Pro seems like no joke; and I don’t care to pay for Final Cut. Hmm—to be continued.

[YouTube]

A 360º view from inside Chuck Yeager’s cockpit

History? Check.
Photography? Check.
Aerospace? Check.

I am there for this:

Smithsonian notes:

  • The distinct H-shaped yoke determined both roll and pitch. Airspeed was controlled by the number of rocket chambers—up to four—fired by the silver thumb-switch to the left of the yoke; there was no throttle.
  • The Mach indicator above goes to Mach 1.5; it was most likely installed after Yeager’s first transonic flight. It’s flanked by a conventional altimeter and airspeed indicator. The fastest Glamorous Glennis ever flew was Mach 1.45.
  • Yeager signed his name in the cockpit of Glamorous Glennis on many occasions over the decades. (He piloted 33 of the aircraft’s 78 career test flights, including its last, on May 12, 1950.) Can you find all his signatures?

[Via Bryan O’Neil Hughes]

Droning from a *van* down by the *river*

Just a quick bit of flying Thanksgiving weekend near Pismo Beach. A few thoughts:

  • Color grading in iMovie is for the birds, but somehow it’s no better in Adobe Rush (which lacks an Auto button (!), much less key framing), and learning Premiere Pro always seems like too big a hill to climb.
  • I likewise find it hard to cut on the beats—a problem compounded when I share the output to YouTube and Facebook (where, I swear to God, somehow the audio & video get differently out of sync).
  • I’ve gotta learn how to avoid (or later compensate for) the gross propeller shadows that appear in a few shots here.
  • No, the soundtrack doesn’t really fit (an assessment my 9-year-old Henry cheerfully volunteered 🙄), but, eh, I found the juxtaposition oddly fun. YMMV.

Google Photos brings depth editing to iOS

Pretty much like it says on the tin. PetaPixel writes,

There isn’t a filter in the app that lets you selectively see only Portrait mode photos, but the new option in the Edit menu will be present for any Portrait shot.

Download the latest version of Google Photos for iOS to get started with this new feature. Depth editing is already available on the Pixel 3, Pixel 2, and Moto phones that have depth photo support. Google says it’ll also be adding more Android devices soon.

Photography: Orbiting the Earth in 90 minutes

Thanks, NASA, for these minutes of Zen:

Kottke writes,

This is easily the most awe-inspiring and jaw-dropping thing I’ve seen in months. In its low Earth orbit ~250 miles above our planet, the International Space Station takes about 90 minutes to complete one orbit of the Earth. Fewer than 600 people have ever orbited our planet, but with this realtime video by Seán Doran, you can experience what it looks like from the vantage point of the ISS for the full 90 minutes.

Happy Monday.

[YouTube]

Using computer vision to unlock a wealth of photographic history

Visiting the NY Times was always among the real treats of my time working on Photoshop. I was always struck by the thoughtfulness & professionalism of the staff, but also by the gritty, brass-tacks considerations of cranking through thousands of images daily, often using some pretty dated infrastructure.

Now Google’s Cloud Vision tools are helping to tap into that infrastructure—specifically, bringing treasures of “The Morgue” back into the light by making their patchwork annotations searchable.

The morgue contains photos from as far back as the late 19th century, and many of its contents have tremendous historical value—some that are not stored anywhere else in the world. In 2015, a broken pipe flooded the archival library, putting the entire collection at risk. Luckily, only minor damage was done, but the event raised the question: How can some of the company’s most precious physical assets be safely stored?

Check it out:

[YouTube] [Via]

Blind veterans kayak the Grand Canyon, taking Street View along for the ride

This completely blows my mind. Have a happy, reflective, and grateful Veterans Day, everyone.

Check out their 360º captures on Google Street View. Blind Navy vet & expedition leader Lonnie Bedwell writes,

I believe we can’t abandon our sense of adventure because we lose our ability to see it, and it has become my goal to help people who live with similar challenges, and show them that anything is possible.

In 2013, I became the first blind person to kayak the entire 226 miles of the Colorado River through the Grand Canyon. But I always felt it didn’t mean anything unless I found a way to pay it forward. So I joined up with the good folks at Team River Runner, a nonprofit dedicated to providing all veterans and their families an opportunity to find health, healing, community, and purpose. Together we had the audacious goal of supporting four other blind veterans on a trip down the Grand Canyon.

[YouTube]

What might be next for Facebook 3D photos?

Facebook’s 3D photos (generated from portrait-mode images) have quickly proven to be my favorite feature added to that platform in years. Hover or drag over this example:

My crazy three! 😝😍 #007 #HappyHalloween

Posted by John Nack on Wednesday, October 31, 2018

The academic research they’ve shared, however, promises to go farther, enabling VR-friendly panoramas with parallax. The promise is basically “Take 30 seconds to shoot a series of images, then allow another 30 seconds for processing.” The first portion might well be automated, enabling the user to simply pan slowly across a scene.
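Mechanically, depth-driven parallax like this comes down to shifting each pixel in proportion to its inverse depth as the virtual camera moves; nearby things slide a lot, distant things barely budge. A toy sketch (the function name and numbers are illustrative, not from the research):

```python
# Toy parallax: nearer points (smaller depth) shift more as the view pans,
# which is what sells the 3D effect in these photos.
def parallax_x(x: float, depth_m: float, pan: float = 4.0) -> float:
    # Horizontal shift is inversely proportional to depth.
    return x + pan / depth_m

# Three points at the same screen position but different depths (metres):
points = [(100.0, 1.0), (100.0, 4.0), (100.0, 50.0)]
print([parallax_x(x, d) for x, d in points])  # foreground shifts most
```

The hard part, of course, isn’t the shift itself but filling in the scene content that gets revealed behind foreground objects, which is where the depth-map quality and inpainting come in.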

This teaser vid shows how scenes are preserved in 3D, enabling post-capture effects like submerging them in water:

Will we see this tech ship in FB, and if so, when? Your guess is as good as mine, but I find the progress exciting.

[YouTube]

Animation: Trippy photography plays with time in Ireland

Time, they say, has the nice property of keeping everything from happening at once. But what would it look like if everything did happen at once?

Photographer Páraic McGloughlin hung out on a bridge in Sligo, Ireland, for 19 hours to create a single, day-long shot that he then manipulated. Colossal writes,

“Using a fundamental image (a time lapse) to mask and cut into, I tried to show the variable possibilities within a limited time span, maintaining the integrity of each individual photograph while dissecting and rearranging the overall image.” The visual content was matched with each layer of audio created by Cooper to form the song, which stacks up to over one hundred layers. 

Check it out:

Eye-popping racing drone footage of mountain bike action

Man, apparently some of the biker scouts from Endor lived on & now thread drones through forest gaps over crazy mountain biking trails, as seen in this bonkers project from Cinematic Flow:

“We designed and refined FPV drones since 5 years now. When Kilian spoke about his idea of putting a GoPro Fusion on one of our drones, we were intrigued but thrilled about this new challenge. The design and the flying of this set up are so different than what we are used to, there were loads of crashes but the end result is so refreshing and pushes the drone shot to the next level.” Pierre, engineer at Cinematic Flow

Now, how about a look behind the scenes?

We trust the cam, we ensure a light flying and tighten the buttocks to export!

¯\_(ツ)_/¯ 

[YouTube 1 & 2] [Via Luke Wroblewski]

Seriously cool parallax-generation tech from Adobe

Wow—check out this amazing sneak peek from Adobe’s Long Mai (see paper):

Enables any photograph to be turned into a live photo, animating the image in 3D, simulating the realistic effect of flying through the scene.

This is especially dear to my heart.

As a brand new Photoshop PM (in 2002—gah!), one of my first trips was back to NYC to visit motion graphics artists. Touring one shop I was amazed to glimpse a technique I’d never seen, using Photoshop to break 2D photos into layers, fill in gaps, and then animate the results in After Effects. Later that year the work came to the big screen in The Kid Stays in the Picture, the documentary that now lends its name to this ubiquitous parallax effect.

Here Yorgo Alexopoulos talks about how he developed the technique & how he’s leveraged it in later works:

So, while we wait for Adobe’s new tech to ship, how could one do this by hand? Below, artist Joe Fellows gives a brief, highly watchable demo of how it’s done (although it physically pains me to see him using the Pen tool to make selections & no Content-Aware Fill to at least block in the gaps):

[YouTube 1, 2, and 3]

Content-Aware Fueled: After Effects

Man, I used to hate demoing alongside After Effects during internal Adobe events: We had Photoshop, sure—but they had Photoshop on wheels. You could just pencil them in for the Top Gun trophy nearly every time.

Making Content-Aware Fill work at all is hard—but making it effective over multiple frames (“temporally coherent,” in our nerdy parlance)? Well, that requires FM technology—F’ing Magic. Here’s a naive implementation (not from Adobe):

Cool, artsy—but generally not so useful. And here (at 1:50:44) it is as the After Effects team intends to ship it next year (first sneak-peeked last year as Project Cloak):

Special props to Jason Levine for vamping through the calculation phase & then going full “When Harry Met Sally deli scene” at the conclusion. As a friend noted, “I’ll have what he’s having.” 😝

[YouTube]

See your Portrait images in 3D via Facebook

Very cool:

FB writes,

[Y]ou just take a photo in Portrait mode using your compatible dual-lens smartphone, then share as a 3D photo on Facebook where you can scroll, pan and tilt to see the photo in realistic 3D… Everyone will be able to see 3D photos in News Feed and VR today, while the ability to create and share 3D photos begins to roll out today and will be available to everyone in the coming weeks.

Check out their post for tips on composing a 3D-friendly image (e.g. include lots of foreground/background separation; avoid transparent objects like drinking glasses).

Automatically share kid & pet pics with Google Photos Live Albums

“Been waiting to build this since the beginning of Google Photos :)” tweeted Dave Lieb, product lead for Google Photos. As TechCrunch writes,

[U]sing A.I. technologies and facial recognition is a next step, and one that makes Google Photos an even more compelling app. In practice, it means that you wouldn’t have to manually share photos with certain people ever again – you can just set up a Live Album once, and then allow the automation to take over.

Oh, and with the newly announced Google Home Hub, people (e.g. my folks) can have an auto-updating picture frame showing specific people (e.g. our kids).

Try Live Albums right now on Web, iOS, or Android.