{"id":5741,"date":"2017-08-02T10:24:34","date_gmt":"2017-08-02T17:24:34","guid":{"rendered":"http:\/\/jnack.com\/blog\/?p=5741"},"modified":"2017-08-02T16:59:40","modified_gmt":"2017-08-02T23:59:40","slug":"google-mit-unveil-realtime-image-retouching-on-mobile-devices","status":"publish","type":"post","link":"http:\/\/jnack.com\/blog\/2017\/08\/02\/google-mit-unveil-realtime-image-retouching-on-mobile-devices\/","title":{"rendered":"Google &#038; MIT unveil realtime image retouching on mobile devices"},"content":{"rendered":"<p>\u201cTeaching Google Photoshop.\u201d That\u2019s the three-word mission statement I chose upon joining Photos. I meant it as shorthand for \u201cgetting computers to see &amp; think like artists.&#8221; Now researchers are enabling that kind of <a href=\"http:\/\/news.mit.edu\/2017\/automatic-image-retouching-phone-0802\">human-savvy adjustment to run in realtime<\/a>, even on handheld devices:<\/p>\n<blockquote>\n<p>Researchers from MIT\u2019s Computer Science and Artificial Intelligence Laboratory and Google are presenting a new system that can automatically retouch images in the style of a professional photographer. 
It\u2019s so energy-efficient, however, that it can run on a cellphone, and it\u2019s so fast that it can display retouched images in real-time, so that the photographer can see the final version of the image while still framing the shot.<\/p>\n<\/blockquote>\n<p>And yes, it\u2019s a small world: \u201cThe researchers trained their system on a data set created by Durand\u2019s group and Adobe Systems\u201d; and Jiawen interned at Adobe; and then-Adobe researcher Aseem Agarwala <a href=\"http:\/\/blogs.adobe.com\/jnack\/2010\/07\/computational-rephotography-helps-marry-new-old.html\">collaborated<\/a> with Fr\u00e9do before joining Google.<\/p>\n<p><iframe loading=\"lazy\" width=\"604\" height=\"340\" src=\"https:\/\/www.youtube.com\/embed\/GAe0qKKQY_I?feature=oembed\" frameborder=\"0\" allow=\"autoplay; encrypted-media\" allowfullscreen><\/iframe><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" title=\"NewImage.png\" src=\"http:\/\/jnack.com\/blog\/wp-content\/uploads\/2017\/08\/NewImage-2.png\" alt=\"NewImage\" width=\"599\" height=\"342\" border=\"0\" \/><\/p>\n<p>[<a href=\"https:\/\/www.youtube.com\/watch?v=GAe0qKKQY_I\">YouTube<\/a>]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>\u201cTeaching Google Photoshop.\u201d That\u2019s the three-word mission statement I chose upon joining Photos. 
I meant it as shorthand for \u201cgetting computers to see &amp; think like artists.&#8221; Now researchers are enabling that kind of human-savvy adjustment to run in realtime, even on handheld devices: Researchers from MIT\u2019s Computer Science and Artificial Intelligence Laboratory and Google [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[6,3],"tags":[],"_links":{"self":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts\/5741"}],"collection":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/comments?post=5741"}],"version-history":[{"count":2,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts\/5741\/revisions"}],"predecessor-version":[{"id":5743,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts\/5741\/revisions\/5743"}],"wp:attachment":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/media?parent=5741"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/categories?post=5741"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/tags?post=5741"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}