{"id":7121,"date":"2018-06-24T20:44:58","date_gmt":"2018-06-25T03:44:58","guid":{"rendered":"http:\/\/jnack.com\/blog\/?p=7121"},"modified":"2018-06-24T20:44:58","modified_gmt":"2018-06-25T03:44:58","slug":"adobes-working-to-detect-photoshopping","status":"publish","type":"post","link":"http:\/\/jnack.com\/blog\/2018\/06\/24\/adobes-working-to-detect-photoshopping\/","title":{"rendered":"Adobe&#8217;s working to detect Photoshopping"},"content":{"rendered":"<p>Who better to sell radar detectors than the people who make radar guns?<\/p>\n<p>From <a href=\"https:\/\/www.youtube.com\/watch?v=7XchCsYtYMQ&amp;start=0&amp;autoplay=1\">DeepFakes<\/a> (changing faces in photos &amp; videos) to <a href=\"https:\/\/lyrebird.ai\/\">Lyrebird<\/a> (synthesizing voices) to video <a href=\"http:\/\/jnack.com\/blog\/2016\/04\/15\/realtime-face-puppeting-tech\/\">puppetry<\/a>, a host of emerging tech threatens to further undermine trust in what\u2019s recorded &amp; transmitted. With that in mind, the US government\u2019s DARPA has <a href=\"https:\/\/www.darpa.mil\/program\/media-forensics\">gotten involved<\/a>:<\/p>\n<blockquote>\n<p>DARPA\u2019s MediFor program brings together world-class researchers to attempt to <strong>level the digital imagery playing field<\/strong>, which currently favors the manipulator, by developing technologies for the automated assessment of the integrity of an image or video and integrating these in an <strong>end-to-end media forensics platform<\/strong>.<\/p>\n<\/blockquote>\n<p>Against that backdrop, I like seeing that Adobe\u2019s <a href=\"https:\/\/theblog.adobe.com\/spotting-image-manipulation-ai\/\">jumping in<\/a> to detect the work of its &amp; others\u2019 tools:<\/p>\n<p><iframe loading=\"lazy\" width=\"604\" height=\"340\" src=\"https:\/\/www.youtube.com\/embed\/7e5Q0TgPR54?feature=oembed\" frameborder=\"0\" allow=\"autoplay; encrypted-media\" allowfullscreen><\/iframe><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" 
title=\"NewImage.png\" src=\"http:\/\/jnack.com\/blog\/wp-content\/uploads\/2018\/06\/NewImage-24.png\" alt=\"NewImage\" width=\"599\" height=\"306\" border=\"0\" \/><\/p>\n<p>[<a href=\"https:\/\/youtu.be\/7e5Q0TgPR54\">YouTube<\/a>] [<a href=\"https:\/\/petapixel.com\/2018\/06\/23\/adobe-using-ai-to-spot-photoshopped-photos\/?mc_cid=cd8bc22ff7&amp;mc_eid=53d8d48922\">Via<\/a>]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Who better to sell radar detectors than the people who make radar guns? From DeepFakes (changing faces in photos &amp; videos) to Lyrebird (synthesizing voices) to video puppetry, a host of emerging tech threatens to further undermine trust in what\u2019s recorded &amp; transmitted. With that in mind, the US government\u2019s DARPA has gotten involved: DARPA\u2019s [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts\/7121"}],"collection":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/comments?post=7121"}],"version-history":[{"count":1,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts\/7121\/revisions"}],"predecessor-version":[{"id":7122,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/posts\/7121\/revisions\/7122"}],"wp:attachment":[{"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/media?parent=7121"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/categories?post=7121"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/jnack.com\/blog\/wp-json\/wp\/v2\/tags?post=7121"}],"curies":[{"nam
e":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}