The original Glass will be to AR wearables what the Apple Newton was to smartphones—ambitious, groundbreaking, unfocused, premature. After that first… well, learning experience… Google didn’t give up, and folks have cranked away quietly to find product-market fit. Check out the new device—dramatically faster, more extensible, and focused on professionals in medicine, manufacturing, and other fields:
My team has been collaborating with TensorFlow Lite & researchers working on human-pose estimation (see many previous posts) to accelerate on-device machine learning & enable things like the fun “Dance Like” app on iOS & Android:
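For the curious: pose-estimation models of this flavor typically output one heatmap per body keypoint, and decoding each keypoint's most likely location is just an argmax over its channel. Here's a minimal sketch of that decoding step — the shapes and toy data are hypothetical, not taken from any shipping model:

```python
import numpy as np

def decode_keypoints(heatmaps):
    """Given heatmaps of shape (H, W, K) — one channel per body keypoint —
    return the (row, col) of the highest-scoring cell for each keypoint."""
    h, w, k = heatmaps.shape
    flat = heatmaps.reshape(-1, k)   # flatten the grid: (H*W, K)
    idx = flat.argmax(axis=0)        # best cell index per keypoint
    return [(int(i // w), int(i % w)) for i in idx]

# Toy example: 2 keypoints on a 4x4 grid.
hm = np.zeros((4, 4, 2))
hm[1, 2, 0] = 0.9   # keypoint 0 peaks at row 1, col 2
hm[3, 0, 1] = 0.8   # keypoint 1 peaks at row 3, col 0
print(decode_keypoints(hm))  # [(1, 2), (3, 0)]
```

Real models add refinement offsets on top of the argmax, but this is the gist of turning network output into screen-space joints.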
Hmm… am I a size 10.5 or 11 in this brand? These questions are notoriously tough to answer without trying on physical goods, and cracking the code for reliable size estimation promises to enable more online shoe buying with fewer returns.
Now Nike seems to have cracked said code. The Verge writes,
With this new AR feature, Nike says it can measure each foot individually — the size, shape, and volume — with accuracy within 2 millimeters and then suggest the specific size of Nike shoe for the style that you’re looking at. It does this by matching your measurements to the internal volume already known for each of its shoes, and the purchase data of people with similar-sized feet.
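To make the matching idea concrete: once you know a foot's measured volume and the internal volume of each size of a style, recommendation reduces to picking the smallest size that leaves comfortable room. Nike hasn't published its algorithm, so everything below — names, numbers, and the fit-margin heuristic — is an invented illustration:

```python
# Hypothetical internal volumes (cubic cm) for each size of one shoe style.
# All numbers are made up for illustration.
STYLE_VOLUMES = {
    9.0: 820.0,
    9.5: 845.0,
    10.0: 870.0,
    10.5: 895.0,
    11.0: 920.0,
}

def recommend_size(foot_volume_cc, fit_margin_cc=25.0):
    """Pick the smallest size whose internal volume leaves at least
    fit_margin_cc of room around the measured foot."""
    for size in sorted(STYLE_VOLUMES):
        if STYLE_VOLUMES[size] - foot_volume_cc >= fit_margin_cc:
            return size
    return max(STYLE_VOLUMES)  # fall back to the largest available size

print(recommend_size(855.0))  # a measured foot of 855 cc maps to 10.5 here
```

The purchase-data angle The Verge mentions would presumably adjust that fit margin per shopper, but the core lookup is this simple.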
My team has been accelerating machine learning on devices and enabling AR face effects for developers (via ARCore & ML Kit). In recent months we’ve worked with Care OS, makers of smart mirror technology, to enable virtual try-ons via their hardware. Here’s a quick demo from Google I/O:
The app consists of two modes — a cutout mode and a collage mode.
The idea is that you walk around and collect a bunch of different materials from the world through your camera’s viewfinder while in cutout mode. These images are cut into shapes that you then assemble when you switch to collage mode. To do so, you arrange your cutouts in 3D space by moving around and tapping on the phone’s screen.
You can also adjust the shapes by holding down a finger and dragging up, down, left, or right — for example, to rotate and scale your “weird cuts” collage shapes.
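That press-and-drag interaction might map screen deltas to transforms roughly like this — a hypothetical sketch (the app's actual gesture constants and axis mapping are unknown; here horizontal drag rotates, vertical drag scales):

```python
import math

ROTATE_PER_PIXEL = math.radians(0.5)  # half a degree of rotation per pixel
SCALE_PER_PIXEL = 0.005               # 0.5% size change per pixel

def apply_drag(shape, dx, dy):
    """Update a shape's transform in place given drag deltas in pixels:
    horizontal motion (dx) rotates, vertical motion (dy) scales."""
    shape["rotation"] += dx * ROTATE_PER_PIXEL
    # Clamp scale so the shape can't shrink away entirely.
    shape["scale"] = max(0.1, shape["scale"] * (1.0 + dy * SCALE_PER_PIXEL))
    return shape

shape = {"rotation": 0.0, "scale": 1.0}
apply_drag(shape, dx=40, dy=-20)  # drag right 40 px and up 20 px
```

One design note: scaling multiplicatively (rather than adding a fixed amount) keeps the gesture feeling consistent whether the shape is tiny or huge.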
Unrelated (AFAIK), this little app lets you sketch in 2D, then put the results into AR space. (Adobe Capture should do this!)