Just a taste:
Introducing HandSpace: A collection of hand tracking oddities.
Get it here: https://t.co/ZUinIQ9FAj pic.twitter.com/L04IbsEbSv
— Daniel Beauchamp (@pushmatrix) February 27, 2020
Lots more where that came from:
[YouTube]
Cool!
A modular, creative, unique super-puzzle with magnetic bits that attach seamlessly to your fridge, whiteboard, locker, car… you get it.
Eye-opening comparisons by Alvaro Gracia Montoya:
Colossal notes,
2008 TC3 is the smallest shown with a mean diameter of about 4.1 meters, while the largest is 1 Ceres, which has a mean diameter of about 939 kilometers.
Related/previous: Visualizing relative tree sizes.
https://www.youtube.com/watch?v=l3qpcbMV_fM&feature=youtu.be
This little dude looks nifty as heck:
The Looking Glass is powered by our proprietary 45-element light field technology, generating 45 distinct and simultaneous perspectives of three-dimensional content of any sort.
This means multiple people around a Looking Glass are shown different perspectives of that three-dimensional content—whether that’s a 3D animation, DICOM medical imaging data, or a Unity project – in super-stereoscopic 3D, in the real world without any VR or AR headgear.
[Vimeo]
Lovely work from Theodore John:
All this stretching brings back fond late-’90s memories of Gmunk:
Crafty Rube Goldberg-ing for social good (making tech more accessible):
Control your Mac using head movements. Rotate your head to move the cursor and make facial expressions to click, drag, and scroll. Powered by your iPhone’s TrueDepth camera.
[YouTube]
Back in 2014, Action Movie Dad posted a delightful vid of his niño evading the hot foot:
But now, instead of needing an hour-long tutorial on how to create this effect, you can do it in realtime, with zero effort, on your friggin’ telephone. (Old Man Nack does wonder just how much this cheapens the VFX coin—but progress charges on.)
https://twitter.com/tomemrich/status/1230535407609057281
[YouTube]
Man, who knew just how much cultural identity could be wrapped up in a style of printing?
This excellent 99% Invisible episode covers the origins of blackletter printing (faster & more reliable for medieval scribes), the culture wars (from Luther to Napoleon) in which it battled Roman faces, its association with (and revilement by!) Nazis, and more.
Bonus: stick around for a discussion of revanchist, Trumpian mandates around government architecture, featuring that delightful term of art, CHUD. *chef’s kiss*
Good lord… seems like I was just posting about the 25th anniversary (see fun links from then), and before that the 20th (do I even dare to listen to whatever presumable ear poison I recorded back then? (“A toast to Photoshop and 20 years of pain and pleasure! A toast to our guests and the spontaneous creation of a drinking game based on every time Bryan says ‘Scott Kelby’ or John says ‘Configurator!’”))… and yet here we are, old friends.
The team is marking the occasion by releasing new features for desktop & mobile versions of the app, including ML-powered object selection in both:
As for mobile, it sounds like things are going in the right direction. Per TechCrunch:
It’s no secret that the original iPad app wasn’t exactly a hit with users as it lacked a number of features Photoshop users wanted to see on mobile. Since then, the company made a few changes to the app and explained some of its decisions in greater detail. Today, Adobe notes, 50% of reviews give the app five stars and the app has been downloaded more than 1 million times since November.
Back in 2010 Russell Brown & friends got Photoshop 1.07 running on an iPhone; a decade later he’s showing the iPad version running machine learning:
[YouTube]
Heh—here’s a super fun application of body tracking tech (see whole category here for previous news) that shows off how folks have been working to redefine what’s possible with realtime machine learning on the Web (!):
I present to you my latest #prototype. Real-time person removal from complex backgrounds in #JavaScript using #TensorFlowJS in the browser. Tell me what you think! Star on Github + try it yourself right now: https://t.co/Ox5YhJu50y #MadeWithTFJS #WebDev #Web #Experiment #JS pic.twitter.com/ikDlftpxHT
— Jason Mayes (@jason_mayes) February 17, 2020
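For the curious, here’s the gist in a few lines of JavaScript. This is just my own minimal sketch (not Jason’s actual code, which lives at the Github link in the tweet), and it assumes the TensorFlow.js BodyPix segmentation model plus a hypothetical removePeople() helper: keep a persistent background canvas and only refresh the pixels the model says aren’t part of a person, so the person effectively paints themselves out as they move.

```js
import '@tensorflow/tfjs';
import * as bodyPix from '@tensorflow-models/body-pix';

// Hypothetical helper: call once `video` is playing; draws a person-free
// version of the feed onto `outputCanvas`. (The person's starting position
// stays blank until they move and reveal the background behind them.)
async function removePeople(video, outputCanvas) {
  const net = await bodyPix.load();              // load the BodyPix segmentation model
  const out = outputCanvas.getContext('2d');
  const scratch = document.createElement('canvas');
  scratch.width = outputCanvas.width = video.videoWidth;
  scratch.height = outputCanvas.height = video.videoHeight;
  const sctx = scratch.getContext('2d');

  async function frame() {
    const seg = await net.segmentPerson(video);  // per-pixel mask: 1 = person, 0 = background
    sctx.drawImage(video, 0, 0);
    const live = sctx.getImageData(0, 0, scratch.width, scratch.height);
    const bg = out.getImageData(0, 0, scratch.width, scratch.height);
    for (let i = 0; i < seg.data.length; i++) {
      if (seg.data[i] === 0) {
        // Only copy pixels that aren't a person, so the clean background
        // accumulates over time while people never get drawn.
        bg.data.set(live.data.subarray(i * 4, i * 4 + 4), i * 4);
      }
    }
    out.putImageData(bg, 0, 0);
    requestAnimationFrame(frame);
  }
  frame();
}
```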
I’ve long heard that 19th-century audiences would faint or jump out of their seats upon seeing gripping, O.G. content like “Train Enters Station.” If that’s true, imagine the blown minds that would result from this upgraded footage. Colossal writes,
Shiryaev first used Topaz Labs’ Gigapixel AI to upgrade the film’s resolution to 4K, followed by Google’s DAIN, which he used to create and add frames to the original file, bringing it to 60 frames per second.
Check out the original…
…and the enhanced version:
Update: Conceptually related:
there’s an app called remini that enhances your blurry pictures & i’m never taking anything but macbook selfies ever again pic.twitter.com/KrRbcO4Ndp
— zahra (@zhashx) February 16, 2020
[YouTube]
Try as I might, I can’t conceive of a reason I need this thing, but it still looks so cool:
[YouTube]
As my sons would likely say (generally after dropping some sick joke on me), “You got ROASTED!” 🚁🔥
According to Popular Mechanics, the Drone Dome system “can detect objects as small as 0.002 square meters at 3.5 kilometers (2.1 miles).”
I’m pleased to say researchers have built on my team’s open-source MediaPipe framework to create AutoFlip, helping make video look good across a range of screens.
Taking a video (casually shot or professionally edited) and a target dimension (landscape, square, portrait, etc.) as inputs, AutoFlip analyzes the video content, develops optimal tracking and cropping strategies, and produces an output video with the same duration in the desired aspect ratio.
Check out the team’s post for more details. I don’t know how the results compare to Adobe’s Auto Reframe tech, but hopefully this release will help more people collaborate in delivering great tools.
And I think to myself
What a Giger-ian world… 🎶
Thomas Blanchard used an 8K RED Helium camera & various macro lenses to take us up close (way close) to our future insect overlords:
[Via]
Jon Rolph treats us to some stop-motion cleverness:
[YouTube] [Via Alex Powell]
I’m a longtime superfan of (now longtime!) Adobe designer Dave Werner. Just as the one good thing to come of my first two futile years at Adobe was meeting my wife, the clear good that came from my final project was hiring Dave. Our project mostly crashed & burned, but Dave hung in there & smartly made his way onto Character Animator. Here he provides a delightful inside look at how the app’s little red mascot came to be.
[YouTube]
Rather glorious work from Thomas Blanchard:
PetaPixel notes,
To achieve this level of detail he shot the whole thing in 8K on a $25,000 RED Helium camera using a Canon 100mm f/2.8L Macro and a Canon MP-E 65mm f/2.8 1-5X Macro, and then edited the final product down to 4K resolution.
[Vimeo]
The LA Times is showcasing the best Oscar dresses from the past five decades in augmented reality, commissioning artists to draw them in 3D via Google’s Tilt Brush app. You can try out the interactive experience in their app or just get a taste here:
[YouTube]
Somehow I missed this paper when it debuted, and beyond its interesting Photoshop-style applications, now I’m thinking of a contemplative Dr. Phil as Philosoraptor. Er, anyway, check out what machine learning can now do for content synthesis:
Relish this delightful short, illustrated by Katie Scott & animated by James Paulley.
Hmm—AR glasses + smart watch (or FitBit) + ring? 🧐
VentureBeat writes,
[A] finger could be used to write legibly in the air without a touch surface, as well as providing input taps, flick gestures, and potentially pinches that could control a screened device from afar. Thanks to the magnetic sensing implementation, researchers suggest that even a visually obscured finger could be used to send text messages, interact with device UIs, and play games. Moreover, AuraRing has been designed to work on multiple finger and hand sizes.
[They] include photos of a sketchbook, hanging folders, metallic packaging, a booklet covered in bubble wrap… Some files have layers that are labeled yellow in the Layers panel. Those yellow layers are Smart Objects; double-click them to open a separate file, in which you’ll add text or create a design.
[Via]
…at least in this SNL parody of Rudy. 😌 Happy football day:
[YouTube]
Pretty amazing to be able to compare a phone to a multi-thousand-dollar DSLR.
Ian talks about his experience shooting astrophotos with the Google Pixel 4 XL in the dark skies of the California deserts. Ian also makes some comparisons between the results from the Pixel Astrophotography mode, and his full-sized camera, the Sony a7S.
[YouTube]