Google’s wearable ML helps blind runners

Call it AI, ML, FM (F’ing Magic), whatever: tech like this warms the heart and can free body & soul. Google’s Project Guideline helps people with impaired vision navigate the world independently & at speed. Runner Thomas Panek, CEO of Guiding Eyes for the Blind and blind himself, writes,

In the fall of 2019, I asked that question to a group of designers and technologists at a Google hackathon. I wasn’t anticipating much more than an interesting conversation, but by the end of the day they’d built a rough demo […].

I’d wear a phone on a waistband, and bone-conducting headphones. The phone’s camera would look for a physical guideline on the ground and send audio signals depending on my position. If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear. If I drifted to the right, the same thing would happen, but in my right ear. Within a few months, we were ready to test it on an indoor oval track. […] It was the first unguided mile I had run in decades.
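
To make that left/right cue concrete, here’s a minimal Python sketch of the idea, assuming a single hypothetical signal: the runner’s lateral offset from the detected line, in metres. The function name, the 1 m normalisation, and the detune amount are all illustrative assumptions, not Project Guideline’s actual code; in the real system the offset presumably comes from an on-device model tracking the painted line in each camera frame.

```python
# Hypothetical illustration of the left/right audio cue described above.
# The function name, the offset signal, and the 1 m normalisation are
# assumptions for this sketch, not Project Guideline's actual code.

def guideline_feedback(offset_m: float, max_offset_m: float = 1.0) -> dict:
    """Map lateral drift from the guideline to per-ear (gain, detune) cues.

    offset_m < 0: runner has drifted LEFT of the line  -> cue in the left ear.
    offset_m > 0: runner has drifted RIGHT of the line -> cue in the right ear.
    The cue gets louder (gain) and more dissonant (detune) the further you drift.
    """
    drift = min(abs(offset_m) / max_offset_m, 1.0)  # normalise drift to [0, 1]
    gain = drift                  # volume rises with drift
    detune = 0.5 * drift          # detune a warning tone by up to half a semitone
    silent = (0.0, 0.0)
    if offset_m < 0:
        return {"left": (gain, detune), "right": silent}
    if offset_m > 0:
        return {"left": silent, "right": (gain, detune)}
    return {"left": silent, "right": silent}  # centred on the line: near silence


if __name__ == "__main__":
    for offset in (-0.8, -0.2, 0.0, 0.4):
        print(f"{offset:+.1f} m -> {guideline_feedback(offset)}")
```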

Check out the journey. (Side note: how great is “Blaze” as a name for a speedy canine running companion? ☺️)
