I think it’s fair to require disclosure. However, they don’t go into much detail about what counts as AI. For example, if I use Photoshop’s ‘Content-Aware Fill’ tool on a piece of artwork, would I need to disclose that the project used AI? What if I simply upscale a piece of artwork that I drew myself? As tools involving some form of machine learning become more common, it will eventually just be a given that AI has been used at some point in the workflow.
Very good point. I can imagine they are currently just trying to prevent a deluge of fully AI-generated content with this policy and will then adjust accordingly.
Let me predict the response:
“Photoshop… oh, that’s just a standard piece of software! Everybody uses Photoshop!”
“Stable Diffusion?! Commie open-source AI detected! OPEN REVOLT! TEY TAK ER JERBS!”
Remember, it’s not about protecting the consumer from AI. It’s about the illusion that they are doing something about it.
In my opinion, something like content-aware fill or upscaling resembles a tool more than a black box with minimal interaction, which is only good for spamming.