While AI generation tools like DALL-E and ChatGPT are producing amazing results, and sparking whole new types of business opportunities, many questions have been raised about the legality of such processes, and about how they source the work of human creators for digital re-purposing.
Various artists, for example, are angry that DALL-E can use work that they charge for as the source material for new images, with the original creators holding no legal rights over the output. At least, they don’t right now, which is something that a collective of artists is now looking to rectify in a new case.
As per The Verge:
“A trio of artists have launched a lawsuit against Stability AI and Midjourney, creators of AI art generators Stable Diffusion and Midjourney, and artist portfolio platform DeviantArt, which recently created its own AI art generator, DreamUp. The artists allege that these organizations have infringed the rights of ‘millions of artists’ by training their AI tools on five billion images scraped from the web ‘without the consent of the original artists’.”
The suit claims that several AI image generators have effectively been stealing original art, which then enables their users to create similar-looking work by using specific prompts and guides.
And those prompts can be totally overt – for example, in the DreamStudio guide to writing better AI prompts, it explains:
“To make your style more specific, or the image more coherent, you can use artists’ names in your prompt. For instance, if you want a very abstract image, you can add “in the style of Pablo Picasso” or just simply, “Picasso”.”
So in some cases it’s not just coincidence: these tools are actively prompting users to replicate the styles of specific artists by guiding them in this manner.
That’s a significant concern for working artists, and one of several key points likely to be raised in the legal proceedings of this new case.
It’s not the first lawsuit relating to AI generators, and it certainly won’t be the last. Another group is suing Microsoft, GitHub, and OpenAI over an AI programming tool called ‘Copilot’, which produces code based on examples sourced from the web, while various photographers are also exploring their legal rights over images used in the ‘training’ of these AI models.
The concern around future litigation relating to such tools is why Getty Images is refusing to list artificial intelligence-generated art for sale on its website, while Google has published a new blog post which outlines why it’s not releasing its own AI generation tools to the public at this stage.
As per Google:
“We believe that getting AI right – which to us involves innovating and delivering widely accessible benefits to people and society, while mitigating its risks – must be a collective effort involving us and others, including researchers, developers, users (individuals, businesses, and other organizations), governments, regulators and citizens. It is critical that we collectively earn public trust if AI is to deliver on its potential for people and society. As a company, we embrace the opportunity to work with others to get AI right.”
Google has also noted that AI-generated content is in violation of its Search guidelines, and will not be indexed if detected.
So there’s a range of risks and legal challenges that could derail the rise of these tools. But they’re unlikely to go away entirely – and with Microsoft also looking to take a controlling stake in OpenAI, the company behind DALL-E and ChatGPT, it seems just as possible that these tools will become more mainstream, as opposed to being restricted.
In essence, the most likely outcome is that these AI companies will need to come to terms on certain usage restrictions (e.g. artists being able to register their names to stop people using them in prompts), or arrange a form of payment to their source providers. But AI generative tools will remain, and will remain highly accessible in various applications, moving forward.
But there are risks, and it’s worth staying aware of them in your own usage, especially as more and more people look to these tools to save time and money in various forms of content creation.
As we’ve noted previously, AI generation tools should be used as complementary elements, not as apps that wholly replace human creation or process. They can be extremely helpful in that context – but be aware that leaning too heavily on them could have negative impacts, now and in the future, depending on the legal steps to come.
Source: www.socialmediatoday.com, originally published on 2023-01-16 17:37:54