I believe strongly that creative tools must honor the wishes & rights of creative people. Hopefully that sounds thuddingly obvious, but it’s been less obvious how to get to a better state than the one we now inhabit, where a lot of folks are (quite reasonably, IMHO) up in arms about AI models having been trained on their work, without their consent. People broadly agree that we need solutions, but getting to them—especially via big companies—hasn’t been quick.
Thus it’s great to see folks like Mat Dryhurst & Holly Herndon driving things forward, working with Stability.ai and others to define opt-out/-in tools & get buy-in from model trainers. Check out the news:
Here’s a concise explainer vid from Mat:
The issue is that OpenAI is using images without the permission of the copyright holders. In what other industry would “we are breaking the law, but may stop on a case-by-case basis if an artist requests it” be acceptable?
Name one other industry where the solution to theft is “but you can opt out after we’ve done it.”
If OpenAI is illegally using images, then the solution is for OpenAI to start over with a program that allows copyright holders to opt in.