Category Archives: Photography

The tiny Insta360 GO looks clever

Back in 2013 I was really taken with how Microsoft’s Photosynth technology could generate interactive hyperlapses for reliving walks, bike rides, etc., and I was sad when the tech died pretty soon after. Wearable cameras just weren’t ubiquitous, affordable, and high quality at the time.

Are those times a-changin’? Maybe: The incredibly tiny, albeit not incredibly cheap, Insta360 GO wearable cam promises to capture stabilized hyperlapses, as shown in the demo below. It seems that most commentators are focusing on the device’s 30-second video limit, but that doesn’t bother me. Honestly, as much as I really love the DJI Osmo I got at the end of last year, I’ve barely put it to use: I just haven’t needed a handheld, non-phone, non-360º way to capture video. The GO, by contrast, promises ultra lightweight wearability, photo capture, and a slick AirPods-style case for both recharging & data transfer. Check it out:


[YouTube]

Luminar invents Google+ 2013

This “AI Structure” feature looks neat, but I take slight exception to the claim to being “the first-ever content-aware tool to improve details only where needed.” The Auto Enhance feature built into Google+ Photos circa 2013 used to do this kind of thing (treating skin one way, skies another, etc.) to half a billion photos per day.

Of course, as I learned right after joining the team, Google excels at doing incredibly hard stuff & making no one notice. I observed back then that we could switch the whole thing off & no one would notice or care—and that’s exactly what happened. Why? Because A) G+ didn’t show you before/after (so people would never know what difference it had made) and B) most people are to photography as I am to wine (“Is it total shite? No? Then good enough for me”). Here at least the tech is going to that tiny fraction of us who actually care—so good on ‘em.

[YouTube]

Photography: Spin me right ’round, Milky Way edition

Whoa:

Creator Aryeh Nirenberg writes,

A timelapse of the Milky Way that was recorded using an equatorial tracking mount over a period of around 3 hours to show Earth’s rotation relative to the Milky Way.

I used a Sony a7SII with the Canon 24-70mm f/2.8 lens and recorded 1100 10″ exposures at a 12-second interval. All the frames were captured at f/2.8 and ISO 16000.
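The capture math checks out. Here's a quick sanity check of the numbers in the quote; the 24 fps playback rate is my assumption, since the caption doesn't state it:

```python
# Back-of-the-envelope math for the Milky Way timelapse quoted above:
# 1100 exposures captured at a 12-second interval, played back at an
# assumed standard 24 fps (the playback rate isn't stated in the source).

EXPOSURES = 1100
INTERVAL_S = 12       # seconds between shots (from the quote)
PLAYBACK_FPS = 24     # assumption, not from the source

capture_hours = EXPOSURES * INTERVAL_S / 3600
playback_seconds = EXPOSURES / PLAYBACK_FPS

print(f"Capture time: {capture_hours:.1f} h")        # ~3.7 h ("around 3 hours")
print(f"Playback length: {playback_seconds:.0f} s")  # ~46 s of final video
```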

Kinda reminds me of “Turn Down For Spock”:

[YouTube 1 & 2] [Via]

New Snap Spectacles put 3D capture onto your face (!)

Wow:

Per The Verge:

The glasses’ marquee feature is a second camera, which enables Spectacles to capture depth for the first time. Snap has built a suite of new 3D effects that take advantage of the device’s new depth perception ability. They will be exclusive to Spectacles, and the company plans to let third-party developers design depth effects starting later this year.

This time around, Snap is offering a new way to view snaps taken through Spectacles: an included 3D viewer resembling Google Cardboard. (The Spectacles 3D viewer is made of cardboard as well.)


[YouTube]

Camera Raw now seamlessly adjusts 360º panos

(May as well keep this Adobe-week content train rolling, amirite?)

If you’d asked me the odds of getting a tweak this deeply nerdy into Camera Raw, I’d probably have put it around 1 in 100—but dang, here we are! This is a godsend for those of us who like to apply area-based adjustments like Clarity & Dehaze to panoramas. Russell Brown shows the benefit below.

A note of caution, though: to my partial disappointment, this doesn’t (yet) work when applying Camera Raw as a filter, so if you want to use it on JPEGs, you’ll need to open them into ACR via Bridge (Cmd-R). And yes, my little Obi-Wan brain just said, “Now that’s a workflow I haven’t heard of in a long time…” Or, if you’re coming from Lightroom Classic, you’ll need to open the image as a Smart Object in Photoshop—clunky (though temporary, I’m told), but it beats the heck out of trying to fix seams manually.

[Vimeo]

Gone fishing… and feeling grateful

Hey gang—I know I greatly flatter myself in thinking that my voice here will be much missed if I go quiet for a bit, especially without notice, but for what it’s worth I’m enjoying some very welcome digital downtime with family and friends in Minnesota.

Being minutes away from wrapping up the celebration of my 44th (!) solar orbit, I wanted to say thanks for being one of those still crazy enough to traipse over here periodically & browse my random finds. Fourteen (!!) years after I started this racket, it still remains largely fun & rewarding. I hope you agree, and I’m grateful for your readership.

Now please excuse me for just a few more days while I get back to swamping my hard drive with a crushing backlog of drone, GoPro, Insta360, iPhone, and Osmo shots. 🙃


Oh, and for some dumb reason Google Maps insists on starting this pano (showing where we’re staying) pointed straight down into the pitch-black lake. You can drag it upwards and/or zoom out while I go file a bug/feature request. The work is never done—another possible source of gratitude.

Animation: Trippy Osaka bends before our eyes

🤯

Colossal writes,

In this fantastic short titled Spatial Bodies, actual footage of the Osaka skyline is morphed into a physics-defying world of architecture where apartment buildings twist and curve like vines, suspended in the sky without regard for gravity. The film was created by AUJIK, a collaborative of artists and filmmakers that refers to itself as a “mysterious nature/tech cult.”

Begone, lame skies!

Does anyone else remember when Adobe demoed automatic sky-swapping ~3 years ago, but then never shipped it… because, big companies? (No, just me?)

Anyway, Xiaomi is now offering a similar feature. Here’s a quick peek:

And here’s a more in-depth demo:

Coincidentally, “Skylum Announces Luminar 4 with AI-Powered Automatic Sky Replacement”:

It removes issues like halos and artifacts at the edges and horizon, allows you to adjust depth of field, tone, exposure and color after the new sky has been dropped in, correctly detects the horizon line and the orientation of the sky to replace, and intelligently “relights” the rest of your photo to match the new sky you just dropped in “so they appear they were taken during the same conditions.”

Check out the article link to see some pretty compelling-looking examples.


[YouTube 1 & 2]

Set Drone Controls For The Heart Of The Sun

“If you want to be a better photographer, [fly] in front of more interesting things…” This eclipse hyperlapse is rad:

“I wasn’t sure if it was going to work but I didn’t want to use it manually because I wanted to watch what was my first-ever eclipse,” [photographer Matt] Robinson tells PetaPixel. “Around 10 minutes before totality, the drone was sent up above our camp and programmed to fly along and above the spectacular Elqui Valley in Chile.”

[YouTube]

Moon Reunion: Fun lunar stuff to check out

As we roll up on the 50th (!) anniversary of humanity visiting our biggest satellite:


Photography: A hyperkinetic maelstrom of patterns

Irish photographer Páraic Mc Gloughlin has a real knack for finding patterns among huge corpora of data (e.g. from Google Earth; see previous). Now he’s making music videos:

Mc Gloughlin’s latest work is for the band Weval’s track “Someday,” and features the filmmaker’s signature fusion of geometric shapes found in historical domes, skyscraper facades, and farmland irrigation systems. The tightly edited video shows quickly-passing frames that shift in time with the music, visually quaking or smoothly transitioning depending on the percussive and melodic elements of the song.

Brace yourself:

Inside Apple’s charming new “Bounce” commercial

I think you’ll enjoy this:

AdAge writes,

The team shot outdoor scenes in Kiev, Ukraine, before recreating the entire town on a set inside the country’s largest airplane hangar. The “ground,” however, was built six feet off the floor, to allow space for trampolines built into the sidewalks. […]

For a scene where he falls sideways beside a woman on a bench, two practical shots were merged into a single one. The actor bounces off a specially-crafted surface, and the camera was turned 90 degrees to film the woman, who was strapped into a bench built into a wall. The entire production was shot in just 12 days, a feat that required 200 artists and technicians.

[YouTube]

Remove.bg plugin comes to Photoshop

I haven’t yet tried it, but sample results look impressive:


It’s free to download, but usage carries a somewhat funky pricing structure. PetaPixel explains,

You’ll need to sign up for an API key through the website and be connected to the Internet while using it. You’ll be able to do 50 background removals in a small size (625×400, or 0.25 megapixels) through the plugin every month for free (and unlimited removals through the website at that size). If you work with larger volumes or higher resolutions (up to 4000×2500, or 10 megapixels), you’ll need to buy credits.

Awesome new portrait lighting tech from Google

The rockstar crew behind Night Sight have created a neural network that takes a standard RGB image from a cellphone & produces a relit image, displaying the subject as though s/he were illuminated via a different environment map. Check out the results:

I spent years wanting & trying to get capabilities like this into Photoshop—and now it’s close to running in realtime on your telephone (!). Days of miracles and… well, you know.

Our method is trained on a small database of 18 individuals captured under different directional light sources in a controlled light stage setup consisting of a densely sampled sphere of lights. Our proposed technique produces quantitatively superior results on our dataset’s validation set compared to prior works, and produces convincing qualitative relighting results on a dataset of hundreds of real-world cellphone portraits. Because our technique can produce a 640 × 640 image in only 160 milliseconds, it may enable interactive user-facing photographic applications in the future.

[YouTube]

AI: 30fps → 300fps

Chinese 360°/VR camera company Kandao is promising 10x interpolation to create super slow-mo effects. I find the results impressive, although there are some (probably unintentionally) charming artifacts visible on the squirrel close-up below. It’d be fun to compare it to work from my teammate Aseem as well as more recent efforts from NVIDIA.
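For intuition, here's what the naive baseline for 10x interpolation looks like: simple linear cross-fading between consecutive frames. This sketch is my illustration, not Kandao's method—their learned approach (like optical-flow techniques) exists precisely because blending like this ghosts moving objects instead of tracking them along their motion paths:

```python
import numpy as np

def interpolate_10x(frames):
    """Naive 10x frame interpolation by linear cross-fading.

    `frames` is a list of H x W x 3 float arrays in [0, 1]. Each pair of
    source frames yields 10 output frames, blending from one to the next.
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(10):        # 10 in-between steps per source pair
            t = i / 10
            out.append((1 - t) * a + t * b)
    out.append(frames[-1])         # keep the final source frame
    return out

# 30 source frames -> roughly 10x as many output frames
src = [np.full((4, 4, 3), i / 29) for i in range(30)]
result = interpolate_10x(src)
print(len(result))  # 291 frames: (30 - 1) * 10 + 1
```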

PetaPixel notes,

AI Slow-Motion will first appear in Kandao’s Obsidian and QooCam 360/VR cameras, but Kandao is planning to open up the tech to other cameras down the road. For now, if you own a Kandao camera, you can find the new feature in the latest Qoocam Studio and in the upcoming Kandao Studio v3.0 (coming April 23rd).

[YouTube 1 & 2]

Drones helped track & stop the Notre Dame fire

Even without purpose-made mods like optional thermal cameras, DJI drones helped minimize damage to Notre Dame. According to The Verge,

Fire brigade spokesman Gabriel Plus told local media that the drones were instrumental in saving the cathedral’s structure. “The drones allowed us to correctly use what we had at our disposal,” Plus said in comments translated from French. Firefighters also relied on the Mavic Pro’s visible light camera and optical and electronic zoom… 

“The mission was delicate and they intelligently called for the Parisian Police Drone Unit cell, which is a dedicated team of professional drone pilots ready to intervene in critical missions,” the spokesperson added. The drones were borrowed from France’s culture and interior ministries, as firefighters still don’t have their own drones.

Deliriously awesome drone + drifting

Amazing flying, shooting, and drifting on what appear to be the roads above Silicon Valley:

See also this short racetrack clip. Seems that the footage comes courtesy of one of these little goblins:

[YouTube]

Check out my drone shot making the grade

…or at least getting the grade, courtesy of Stewart Carroll of the oft-linked Drone Film Guide. A few weeks back he solicited viewer contributions of content we’d like to see expertly adjusted. And voila, check out the clip below! (My wife didn’t even complain that he’s using FCP. 😌) 

Maybe this will finally, six+ months later, get me motivated to grade & post footage from that awesome beach in Cabo. ¯\_(ツ)_/¯


[YouTube]

Long-exposure drone photography

Living in often bone-dry California, I can’t say that I’d thought of trying to capture waterfalls from a drone, but it’s a neat idea that Stewart Carroll covers nicely in this short overview. Meanwhile I’d like to learn more about pairing a slow shutter with device motion to freeze a subject (e.g. a moving train) while blurring the background.

[YouTube]

Photography: Red carpet robot @ 1,000fps

I’m intrigued by the Glambot:

The Glambot itself is a well-known Bolt high-speed cinebot by Camera Control holding up a Phantom 4K Flex camera with a Leica Summilux lens mounted on it. […]

“The pressure is on because you only ever have ONE take, and this is a dangerous rig that can knock you out,” Walliser writes. “I get good at explaining things, but sometimes the environment is so frenetic you can’t really hear me or focus.”

Check it out in action:

[YouTube]

A peek at Oppo’s new 10x optical zoom for phones

Looks pretty nifty, though it’s interesting that it doesn’t (at least currently) work for capturing video or macro shots:

The Verge explains,

The key component to Oppo’s system is a periscope setup inside the phone: light comes in through one lens, gets reflected by a mirror into an array of additional lenses, and then arrives at the image sensor, which sits perpendicular to the body of the phone. That’s responsible for the telephoto lens in Oppo’s array, which has a 35mm equivalence of 160mm. Between that lens, a regular wide-angle lens, and a superwide-angle that’s 16mm-equivalent, you get the full 10x range that Oppo promises.

[YouTube]

Photography: Beautiful orbital shots of “The World Below”

Bruce Berry (not Neil Young’s late roadie) created some beautiful time-lapse sequences from images captured aboard the International Space Station:

On Vimeo he writes,

All footage has been edited, color graded, denoised, deflickered, and stabilized by myself. Some of the 4K video clips were shot at 24 frames/sec, reflecting the actual speed of the space station over the earth. Shots taken at wider angles were sped up a bit to match the flow of the video.

Some interesting facts about the ISS: The ISS maintains an orbit above the earth with an altitude of between 330 and 435 km (205 and 270 miles). The ISS completes 15.54 orbits per day around the earth and travels at a speed of 27,600 km/h (17,100 mph).

The yellow line that you see over the earth is Airglow/Nightglow. Airglow/Nightglow is a layer of nighttime light emissions caused by chemical reactions high in Earth’s atmosphere. A variety of reactions involving oxygen, sodium, ozone, and nitrogen result in the production of a very faint amount of light (Keck A and Miller S et al. 2013).
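Those orbital figures are self-consistent, as a quick check shows: dividing a day by 15.54 orbits gives a roughly 93-minute period, and at the quoted speed that works out to about one Earth circumference (at altitude) per orbit. The Earth-radius and ~400 km mean-altitude inputs below are my assumptions, not from the quote:

```python
import math

# Sanity-check the quoted ISS figures: 15.54 orbits/day at 27,600 km/h.
# Earth's mean radius (6371 km) and a ~400 km altitude are assumed here,
# the latter roughly the middle of the 330-435 km range quoted above.

ORBITS_PER_DAY = 15.54
SPEED_KMH = 27_600
EARTH_RADIUS_KM = 6371
ALTITUDE_KM = 400

period_min = 24 * 60 / ORBITS_PER_DAY            # minutes per orbit
dist_per_orbit = SPEED_KMH * (period_min / 60)   # km traveled per orbit
circumference = 2 * math.pi * (EARTH_RADIUS_KM + ALTITUDE_KM)

print(f"Orbital period: {period_min:.1f} min")                   # ~92.7 min
print(f"Distance per orbit: {dist_per_orbit:,.0f} km vs. "
      f"orbit circumference: {circumference:,.0f} km")           # agree within ~0.2%
```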

I love the choice of music & wondered whether it comes from Dunkirk. Close: that somewhat anxious tock-tock undertone is indeed a Hans Zimmer jam, but from 20 years earlier (The Thin Red Line).

[YouTube]

Eye-popping racing drone photography

Holy crap! Now my stuff looks positively lethargic ¯\_(ツ)_/¯, but what the heck, strap in & enjoy:

DIY Photography writes,

Johnny Schaer (Johnny FPV) is a pro drone racer. His drones are designed to be light, quick, nimble, fly upside down and through all kinds of crazy flightpaths that DJI’s drones could never achieve. And when somebody with the skill of Johnny turns on the camera, that’s when you get results like the video above.

To shoot the footage, Johnny used a drone built around the AstroX X5 Freestyle frame (JohnnyFPV edition, obviously) with a GoPro Hero 7. It has no GPS, no gimbal, no stabilisation, no collision avoidance, none of those safety features that make more commercial drones predictable and easy to fly.

[YouTube]

Adobe’s “Enhance Details” promises higher res, fewer artifacts

“Enhance!” The latest changes in Camera Raw & Lightroom promise to improve the foundational step in raw processing:

The composite red, green, and blue value of every pixel in a digital photo is created through a process called demosaicing.

Enhance Details uses an extensively trained convolutional neural net (CNN) to optimize for maximum image quality. We trained a neural network to demosaic raw images using problematic examples […] As a result, Enhance Details will deliver stunning results including higher resolution and more accurate rendering of edges and details, with fewer artifacts like false colors and moiré patterns. […]

We calculate that Enhance Details can give you up to 30% higher resolution on both Bayer and X-Trans raw files using Siemens Star resolution charts.
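For context, here's roughly what classical demosaicing does. This bilinear sketch is my illustration (not Adobe's algorithm): each sensor pixel records only one of R, G, or B behind a Bayer filter, and the two missing channels are filled in by averaging nearby samples. Adobe's point is that a trained CNN replaces exactly this kind of neighbor averaging, which is what produces the false colors and moiré near fine detail:

```python
import numpy as np

def demosaic_bilinear(raw):
    """Minimal bilinear demosaic of an RGGB Bayer mosaic (illustrative only).

    `raw` is a 2-D float array; each pixel holds one channel's sample:
    R at (even, even), G at (even, odd) and (odd, even), B at (odd, odd).
    Missing values are filled by averaging the known 3x3 neighbors.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    # Which (row % 2, col % 2) offsets hold which channel, for RGGB.
    offsets = {0: [(0, 0)], 1: [(0, 1), (1, 0)], 2: [(1, 1)]}
    for c, locs in offsets.items():
        sparse = np.zeros((h, w))   # known samples for this channel
        mask = np.zeros((h, w))     # 1 where a sample exists
        for dy, dx in locs:
            sparse[dy::2, dx::2] = raw[dy::2, dx::2]
            mask[dy::2, dx::2] = 1.0
        # Sum each pixel's 3x3 neighborhood, then divide by the sample count.
        pad_s = np.pad(sparse, 1)
        pad_m = np.pad(mask, 1)
        num = sum(pad_s[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        den = sum(pad_m[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        rgb[:, :, c] = num / np.maximum(den, 1e-9)
    return rgb

# A flat gray mosaic should demosaic to flat gray in all three channels.
flat = demosaic_bilinear(np.full((8, 8), 0.5))
print(flat.shape)  # (8, 8, 3)
```

Averaging blindly across edges like this is exactly where halos and color fringes come from, which is why a model trained to recognize edge structure can do better.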

Hmm—I’m having a hard time wrapping my head around the resolution claim, at least based on the results shown (which depict an appreciable but not earth-shattering change). Having said that, I haven’t put the tech to the test, but I look forward to doing so.

For more info check out the related help doc plus some deep nerdery on how it all works.

New 65mm “Apollo 11” looks amazing

I am, as the kids would say, there for this documentary:

The film is comprised entirely of archival footage and audio:

Miller and his team collaborated with NASA and the National Archives (NARA) to locate all of the existing footage from the Apollo 11 mission. In the course of sourcing all of the known imagery, NARA staff members made a discovery that changed the course of the project — an unprocessed collection of 65mm footage, never before seen by the public. Unbeknownst to even the NARA archivists, the reels contained wide format scenes of the Saturn V launch, the inside of the Launch Control Center and post-mission activities aboard the USS Hornet aircraft carrier.

The find resulted in the project evolving from one of only filmmaking to one of also film curation and historic preservation. The resulting transfer — from which the documentary was cut — is the highest resolution, highest quality digital collection of Apollo 11 footage in existence.

I also loved this music video made using mission audio & imagery:

[YouTube 1 & 2] [Via]