Google researchers unveil better panorama stitching

Mike Krainin & Ce Liu go into detail about how optical-flow techniques are helping Google Street View produce panoramas that are not only freer of artifacts but also easier for machines to read (producing a better understanding of business names, hours, etc.).

I wonder whether these techniques might be useful for pano-stitching in apps like Photoshop & Lightroom. I’ve passed the info their way.
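For readers curious how optical flow applies to stitching in principle: a flow field tells you how far each region of one image must move to line up with the overlapping region of another, so seams can be warped into agreement instead of simply cross-faded. Here is a minimal toy sketch of dense flow estimation via exhaustive block matching, one of the simplest optical-flow methods. This is my own illustration in NumPy, not Google's implementation; all function names are made up for the example.

```python
import numpy as np

def block_matching_flow(a, b, block=8, search=4):
    """Estimate integer per-block motion from image a to image b
    by exhaustive block matching (a toy dense optical-flow method).

    Returns an array of shape (h//block, w//block, 2) holding the
    (dy, dx) displacement that best aligns each block of a with b.
    """
    h, w = a.shape
    flow = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = a[y:y + block, x:x + block]
            best_err, best_dv = np.inf, (0, 0)
            # Search a small window of candidate displacements.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue  # candidate block falls outside image b
                    cand = b[yy:yy + block, xx:xx + block]
                    err = np.sum((patch - cand) ** 2)
                    if err < best_err:
                        best_err, best_dv = err, (dy, dx)
            flow[by, bx] = best_dv
    return flow

# Demo: shift a random image 2 pixels to the right, then recover that
# motion as a flow field of (dy, dx) = (0, 2) for interior blocks.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
shifted = np.roll(img, 2, axis=1)
flow = block_matching_flow(img, shifted)
```

Production stitchers refine this idea heavily (sub-pixel estimates, smoothness constraints, coarse-to-fine pyramids), but the core operation is the same: find where each patch went, then warp along the recovered field so seams align.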



2 thoughts on “Google researchers unveil better panorama stitching”

  1. Hi John,
    I was impressed, but it also gave me a chuckle. I was unaware at the time that your cameras were on me as I was watering my hedge in the front garden of 2 Squires Road, so I appear in Street View in the latest iteration. The car did not continue down my road, though, so there was not a stitch in time; there was a hitch in time, because an earlier visit, when the cameras did go down the road, shows the same house before my arrival. Look up MK43 0QL and you will see the anomalies. It’s fun to go slightly back in time to spot the schism! Thanks for the laugh!

  2. Hi John, do you think those extensive algorithms will someday make it into the Street View app? I was not really convinced by what I got from the “human centered/held camera.”

    I’m also looking for a way to get all the unstitched shots so I could import them into Ps Photomerge, because I love the dots telling me where to take the next image 🙂
