The AI software Stable Diffusion has a remarkable ability to turn text into images. When I asked the software to draw "Mickey Mouse in front of a McDonald's sign," for example, it generated the picture you see above.
Stable Diffusion can do this because it was trained on hundreds of millions of example images harvested from across the web. Some of these images were in the public domain or had been published under permissive licenses such as Creative Commons. Many others were not—and the world's artists and photographers aren't happy about it.
In January, three visual artists filed a class-action copyright lawsuit against Stability AI, the startup that created Stable Diffusion. In February, the image-licensing giant Getty filed a lawsuit of its own.
[...]
The plaintiffs in the class-action lawsuit describe Stable Diffusion as a "complex collage tool" that contains "compressed copies" of its training images. If this were true, the case would be a slam dunk for the plaintiffs. But experts say it's not true. Erik Wallace, a computer scientist at the University of California, Berkeley, told me in a phone interview that the lawsuit had "technical inaccuracies" and was "stretching the truth a lot." Wallace pointed out that Stable Diffusion is only a few gigabytes in size—far too small to contain compressed copies of all or even very many of its training images.
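Wallace's size argument is easy to check with back-of-envelope arithmetic. The sketch below assumes a model of roughly 4 GB and a training set of a couple hundred million images; both numbers are illustrative stand-ins for the article's "a few gigabytes" and "hundreds of millions," not exact counts.

```python
# Back-of-envelope check of the "compressed copies" claim.
# Both figures below are assumptions for illustration, not exact values.

training_images = 200_000_000    # assumed: a couple hundred million training images
model_size_bytes = 4 * 10**9     # assumed: roughly 4 GB of model weights

bytes_per_image = model_size_bytes / training_images
print(f"Storage budget per training image: {bytes_per_image:.1f} bytes")
# -> about 20 bytes per image, versus tens of kilobytes for even a heavily
#    compressed JPEG, so the weights cannot plausibly be an image archive.
```

Even if the real numbers are off by a factor of ten in either direction, the budget per image stays orders of magnitude below what any compressed copy of a photograph would require.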
Related:
Ethical AI art generation? Adobe Firefly may be the answer. (20230324)
Paper: Stable Diffusion "Memorizes" Some Images, Sparking Privacy Concerns (20230206)
Getty Images Targets AI Firm For 'Copying' Photos (20230117)
Pixel Art Comes to Life: Fan Upgrades Classic MS-DOS Games With AI (20220904)
A Startup Wants to Democratize the Tech Behind DALL-E 2, Consequences be Damned (20220817)
(Score: 2) by VLM on Tuesday April 04 2023, @08:51PM
This is the strategy going forward for long-term human involvement, or for keeping AI out of the workplace.
Some random idiot human who gets hired on upwork for $1/hr to draw a picture assumes all the legal risk, and corporations like it that way.
If a billion-dollar company automates the process, they're a lawsuit magnet, and corporations don't like that.
It's just like self-driving cars and legal liability. Some random idiot out for a drive makes a mistake, and no corporation loses money. Some random idiot programmer at Tesla makes a mistake, and infinite legal liability follows; Tesla has money, and lawyers are like sharks sniffing blood in the water.
No one has a working techno-legalistic solution to AI legal liability in the real world.