I’m guessing the average GoPro buyer’s life cycle goes like this:
- Watch amazing GoPro-shot footage.
- Wow, look at these adventures! If I get one, I’ll have adventures!
- Go shoot a bunch of footage, sit down to watch it.
- Wow, this is… dull. But hey, I’m gonna get around to watching it & editing down to the good parts… [never].
- Also, I don’t have a lot of adventures.
- These new cams are cool, but I didn’t really use my current one, so…
And thus, to address their “Achilles’ heel,” GoPro just spent $100M+ buying Stupeflix, makers of the excellent Replay movie-making app (Apple’s 2014 App of the Year), as well as the video editor Splice. Congrats to the makers of these excellent tools. I’m really eager to see what they can do together & with GoPro.
Meanwhile, the TomTom Bandit, a GoPro competitor, uses sensor data (speed, G-force, even max heart rate) to annotate what it captures (see below). As someone who knows just how hard it is for software to discern the really important moments in a video (cf. the movies feature in Google Photos), I’m excited to see richer data sets captured & surfaced to users. Check out The Verge’s review for details.
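To make the idea concrete, here’s a minimal sketch of how sensor annotations could help find highlights: flag timestamps where G-force spikes past a threshold, spaced far enough apart to count as separate moments. The function name, threshold, and data are all invented for illustration; this is not how the Bandit actually works.

```python
# Hypothetical sketch: pick "highlight" timestamps from accelerometer
# samples by flagging G-force spikes, the kind of signal a Bandit-style
# camera could surface to an editing app.

def highlight_times(samples, threshold_g=2.0, min_gap_s=5.0):
    """samples: list of (time_s, g_force) pairs, sorted by time.
    Returns timestamps of spikes at least min_gap_s apart."""
    picks = []
    for t, g in samples:
        # Keep a spike only if it clears the threshold and isn't
        # within min_gap_s of the previously kept spike.
        if g >= threshold_g and (not picks or t - picks[-1] >= min_gap_s):
            picks.append(t)
    return picks

samples = [(0.0, 1.0), (3.2, 2.4), (3.4, 2.8), (12.0, 1.1), (20.5, 3.1)]
print(highlight_times(samples))  # → [3.2, 20.5]
```

A real implementation would obviously need smoothing, multiple signals (speed, heart rate), and per-sport tuning, but even this crude pass hints at why sensor metadata beats analyzing pixels alone.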
One thought on “Using sensor data to help edit action-cam video”
no stabilization based on gyroscopic data … fail !!!
i can’t understand why devices such as iPhones and Google phones don’t provide such a feature.