Back when I worked in Google Research, my teammates developed fast models that divide images & video into segments (people, animals, sky, etc.). I’m delighted that they’ve now brought this tech to Snapseed:
The new Object Brush in Snapseed on iOS, accessible in the “Adjust” tool, lets you edit objects intuitively: simply draw a stroke on the object you want to edit, then adjust how you want it to look, separately from the rest of the image.
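To make the idea of selective editing concrete, here is a minimal sketch of how a stroke-guided segmentation mask can gate an adjustment so it only touches the selected object. The function names (`segment_from_stroke`, `apply_selective_brightness`) and the brightness adjustment are my own illustrative assumptions, not Snapseed’s actual pipeline:

```python
# Illustrative sketch only: the function names and the brightness adjustment
# are hypothetical, not Snapseed's implementation. The point is the pattern:
# a soft segmentation mask blends an adjusted image with the original so the
# edit affects only the selected object.
import numpy as np


def segment_from_stroke(image: np.ndarray, stroke: np.ndarray) -> np.ndarray:
    """Placeholder for a stroke-guided segmentation model.

    Returns a soft mask in [0, 1] with the same height/width as `image`,
    where values near 1.0 mark the selected object.
    """
    raise NotImplementedError("stand-in for the real segmentation model")


def apply_selective_brightness(image: np.ndarray, mask: np.ndarray,
                               gain: float = 1.3) -> np.ndarray:
    """Brighten only the masked region; the soft mask feathers the edges."""
    image_f = image.astype(np.float32)
    adjusted = np.clip(image_f * gain, 0, 255)   # global adjustment
    mask3 = mask[..., None]                      # broadcast mask over RGB
    blended = mask3 * adjusted + (1.0 - mask3) * image_f
    return blended.astype(np.uint8)


# Usage (hypothetical):
#   mask = segment_from_stroke(img, user_stroke)
#   out  = apply_selective_brightness(img, mask, gain=1.2)
```

The same masked-blend pattern works for any adjustment (saturation, contrast, color temperature); only the `adjusted` image changes.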
Check out the team’s blog post for lots of technical details on how the model was trained.
The underlying model powers a wide range of image editing and manipulation tasks and serves as a foundational technology for intuitive selective editing. It has also been shipped in the new Chromebook Plus 14 to power AI image editing in the Gallery app. Next, we plan to integrate it across more image and creative editing products at Google.