Ethical AI Art Generation? Adobe Firefly May be the Answer

posted by hubie on Saturday March 25 2023, @12:03AM
from the artificial-artificial-intelligence dept.

https://arstechnica.com/information-technology/2023/03/ethical-ai-art-generation-adobe-firefly-may-be-the-answer/

On Tuesday, Adobe unveiled Firefly, its new AI image synthesis generator. Unlike other AI art models such as Stable Diffusion and DALL-E, Adobe says its Firefly engine, which can generate new images from text descriptions, has been trained solely on legal and ethical sources, making its output clear for use by commercial artists. It will be integrated directly into Creative Cloud, but for now, it is only available as a beta.

Since the mainstream debut of image synthesis models last year, the field has been fraught with issues around ethics and copyright. For example, the AI art generator called Stable Diffusion gained its ability to generate images from text descriptions after researchers trained an AI model to analyze hundreds of millions of images scraped from the Internet. Many (probably most) of those images were copyrighted and obtained without the consent of their rights holders, which led to lawsuits and protests from artists.

Related:
Paper: Stable Diffusion "Memorizes" Some Images, Sparking Privacy Concerns
90% of Online Content Could be 'Generated by AI by 2025,' Expert Says
Getty Images Targets AI Firm For 'Copying' Photos
Adobe Stock Begins Selling AI-Generated Artwork
A Startup Wants to Democratize the Tech Behind DALL-E 2, Consequences be Damned
Adobe Creative Cloud Experience Makes It Easier to Run Malware
Adobe Goes After 27-Year Old 'Pirated' Copy of Acrobat Reader 1.0 for MS-DOS
Adobe Critical Code-Execution Flaws Plague Windows Users
When Adobe Stopped Flash Content from Running it Also Stopped a Chinese Railroad
Adobe Has Finally and Formally Killed Flash
Adobe Lightroom iOS Update Permanently Deleted Users' Photos


Original Submission

Related Stories

Adobe Lightroom iOS Update Permanently Deleted Users’ Photos 65 comments

Adobe Lightroom iOS update permanently deleted users' photos:

A recent update to the Adobe Lightroom app permanently deleted some iOS users' photos and presets, an Adobe rep confirmed on the Photoshop feedback forums. Adobe has since corrected the issue, which was first spotted by PetaPixel, but not before drawing the ire of many disappointed users.

[...] Needless to say, users who had just lost photos and presets were not happy. "Rikk, we understand the announcement, however this doesn't solve the problem," wrote Ewelina Wojtyczka. "People lost months/years of their work. Apologies will not bring it back."

Adobe hasn't commented further on the bug outside of Flohr's post. [...] While Adobe shouldn't be let off the hook for this error, perhaps the importance of multiple backups is the hard lesson we can learn from this.


Original Submission

Adobe Has Finally and Formally Killed Flash 36 comments

That's it. It's over. It's really over. From today, Adobe Flash Player no longer works. We're free. We can just leave:

The Photoshop giant promised Flash would die on January 12, 2021. Thanks to the International Date Line, The Register's Asia-Pacific bureau, like other parts of the world, is already living in a sweet, sweet post-Flash future, and can report that if you try to access content in Adobe's Flash Player in this cyber-utopia, you'll see the following:

[...] Adobe's page also explains why you'll see the Flash Death Notice depicted above, rather than Flash content:

Since Adobe is no longer supporting Flash Player after the EOL Date, Adobe will block Flash content from running in Flash Player beginning January 12, 2021 to help secure users' systems. Flash Player may remain on the user's system unless the user uninstalls it.

More specifically, what's happened is that Adobe snuck a logic bomb into its Flash software some releases ago that activates on January 12, and causes the code to refuse to render any more content from that date. Adobe has also removed previous versions from its site, and "strongly recommends all users immediately uninstall Flash Player to help protect their systems."
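
As an illustration of how such a date-based kill switch can work, here is a minimal, hypothetical C sketch; it shows the general pattern only and is not Adobe's actual implementation.

    #include <stdbool.h>
    #include <time.h>

    /* Hypothetical sketch of a date-based kill switch like the one described
     * above (not Adobe's actual code): content is rendered only while the
     * system clock is still before the hard-coded end-of-life date. */
    static bool flash_content_allowed(void)
    {
        struct tm eol = {0};
        eol.tm_year = 2021 - 1900;   /* tm_year counts years since 1900 */
        eol.tm_mon  = 0;             /* January */
        eol.tm_mday = 12;            /* i.e., January 12, 2021 */

        return difftime(mktime(&eol), time(NULL)) > 0;
    }

Because a check like this ships inside the player itself, it also explains the Dalian workaround described below: installing an older or unofficial build simply sidesteps the cutoff.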

[...] Thus ends Flash, which started life in 1993 as a vector drawing product named SmartSketch, from long-dead company FutureWave Software. FutureWave turned SmartSketch into an animation tool called FutureSplash Animator. FutureWave was acquired by Macromedia in 1996, occasioning a name change to Macromedia Flash 1.0.


Original Submission

When Adobe Stopped Flash Content from Running it Also Stopped a Chinese Railroad 13 comments

When Adobe Stopped Flash Content From Running It Also Stopped A Chinese Railroad:

Adobe's Flash, the web browser plug-in that powered so very many crappy games, confusing interfaces, and animated icons of the early web like Homestar Runner, is now finally gone, after a long, slow, protracted death. For most of us, this just means that some goofy webgame you searched for out of misplaced nostalgia will no longer run. For a select few in China, though, the death of Flash meant being late to work, because the city of Dalian in northern China was running its railroad system on it. Yes, a railroad, run on Flash, the same thing used to run "free online casinos" and knockoff Breakout games in mortgage re-fi ads.

[...] So, when Adobe finally stopped Flash-based content from running this Tuesday, Dalian's railroad network found itself ground to a halt for 20 hours.

The railroad's technicians did get everything back up and running, but the way they did this is fascinating, too. They didn't switch the rail management system to some other, more modern codebase or software installation; instead, they installed a pirated version of Flash that was still operational. The knockoff version seems to be known as "Ghost Version."

This, along with installing an older version of the Flash player to work with the knockoff Flash server setup, "solved" the problem, and the railroad was back up and running.

(Emphasis preserved from original.)

Has anything like this ever happened where you work or worked?

Also at: Ars Technica; official's account (in Chinese).

Related/Previously:
Flash is Back in South Africa.


Original Submission

Adobe Critical Code-Execution Flaws Plague Windows Users 25 comments

Adobe Critical Code-Execution Flaws Plague Windows Users:

Adobe has issued patches for a slew of critical security vulnerabilities, which, if exploited, could allow for arbitrary code execution on vulnerable Windows systems.

Affected products include Adobe's FrameMaker document processor, designed for writing and editing large or complex documents; Adobe's Connect software, used for remote web conferencing; and the Adobe Creative Cloud software suite for video editing.

"Adobe is not aware of any exploits in the wild for any of the issues addressed in these updates," according to an Adobe spokesperson.

Adobe fixed a critical flaw (CVE-2021-21056) in FrameMaker, which could allow for arbitrary code execution if exploited. The vulnerability is an out-of-bounds read error, a type of buffer-overflow flaw where the software reads data past the end of the intended buffer. An attacker who can read out-of-bounds memory might be able to obtain "secret values" (like memory addresses) that could ultimately allow them to achieve code execution or denial of service.
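
For readers unfamiliar with the bug class, here is a minimal, hypothetical C sketch of an out-of-bounds read; the buffer names and the attacker-controlled length are illustrative assumptions, not details of the FrameMaker flaw.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical sketch of an out-of-bounds read (not the FrameMaker code):
     * an unvalidated, attacker-supplied length lets the loop read past the end
     * of an 8-byte buffer, leaking whatever happens to sit next to it in memory
     * (here, possibly `secret`, depending on stack layout). Reading past a
     * buffer is undefined behavior. */
    int main(void)
    {
        char secret[8] = "hunter2";     /* adjacent data an attacker should never see */
        char buffer[8];
        memcpy(buffer, "AAAAAAA", 8);   /* 7 bytes of data plus the terminating NUL */

        size_t attacker_len = 16;       /* untrusted; a correct program would clamp
                                           this to sizeof(buffer) */

        for (size_t i = 0; i < attacker_len; i++)   /* BUG: also reads bytes 8..15 */
            putchar(buffer[i]);
        putchar('\n');

        (void)secret;                   /* only ever reached "accidentally" via the overread */
        return 0;
    }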

[...] Adobe also fixed three critical vulnerabilities in the desktop application version of Adobe Creative Cloud for Windows users.

Adobe Goes After 27-Year Old 'Pirated' Copy of Acrobat Reader 1.0 for MS-DOS 31 comments

Adobe Goes After 27-Year Old 'Pirated' Copy of Acrobat Reader 1.0 for MS-DOS – TorrentFreak:

Today, there are many popular PDF readers available, but Adobe’s original ‘Acrobat Reader’ is still the go-to software for many. Needless to say, Adobe doesn’t want third parties to pirate its software, so the company regularly sends out DMCA notices to remove infringing copies.

[...] While this is totally understandable when it comes to newer releases, F-Secure researcher Mikko Hyppönen found out that Adobe’s takedown efforts go far beyond that.

In a recent tweet, Hyppönen mentioned that the software company removed one of his tweets that linked to an old copy of Acrobat Reader for MS-DOS. This software, hosted on WinWorld, came out more than 27 years ago, shortly after the PDF format was invented.

The security researcher posted the tweet five years ago and at the time there were no issues. The message was copied a few weeks ago by his own Twitter bot, which reposts all his original tweets five years later.

“They sent a DMCA notice to my bot (@mikko__2016) when it posted that tweet on the tweet’s 5th anniversary. The original tweet is fine,” Hyppönen notes.

While the original tweet is still up, the reposted message was swiftly removed by Twitter. Not just that, the bot’s account was locked as well, which is standard practice nowadays.

Looking more closely at the takedown notice, we see that it was sent by the “brand protection analyst” at Incopro, which is one of Adobe’s anti-piracy partners. It doesn’t provide any further details on the reasons for taking it down, other than an alleged copyright infringement.


Original Submission

Adobe Creative Cloud Experience Makes It Easier to Run Malware 7 comments

Bundled version of Node.js simplifies executing downloaded code

Adobe Creative Cloud Experience, a service installed via the Creative Cloud installer for Windows, includes a Node.js executable that can be abused to infect and compromise a victim's PC.

Michael Taggart, a security researcher, recently demonstrated that the node.exe instance accompanying Adobe's service could be exploited by writing a simple proof-of-concept JavaScript file that spawns the Windows Calculator app.

"I have confirmed that the node.exe packaged with the Adobe Customer Experience service can run any JavaScript you point it to," he explained to The Register.

[...] Security researchers commenting on Taggart's finding said they'd been under the impression the bundled Node runtime would only execute files signed by Adobe, but evidently that's not the case.

[...] "Because the JavaScript is getting invoked by path in C:\Program Files, it would be extremely difficult to detect from a monitoring/threat hunting perspective," explained Taggart, who added that he was able to get his own custom file dropper to run and execute a command-and-control agent without any warning from Windows Defender.


Original Submission

A Startup Wants to Democratize the Tech Behind DALL-E 2, Consequences be Damned 14 comments

A startup wants to democratize the tech behind DALL-E 2, consequences be damned – TechCrunch:

DALL-E 2, OpenAI's powerful text-to-image AI system, can create photos in the style of cartoonists, 19th century daguerreotypists, stop-motion animators and more. But it has an important, artificial limitation: a filter that prevents it from creating images depicting public figures and content deemed too toxic.

Now an open source alternative to DALL-E 2 is on the cusp of being released, and it'll have few — if any — such content filters.

London- and Los Altos-based startup Stability AI this week announced the release of a DALL-E 2-like system, Stable Diffusion, to just over a thousand researchers ahead of a public launch in the coming weeks. A collaboration between Stability AI, media creation company RunwayML, Heidelberg University researchers and the research groups EleutherAI and LAION, Stable Diffusion is designed to run on most high-end consumer hardware, generating 512×512-pixel images in just a few seconds given any text prompt.

"Stable Diffusion will allow both researchers and soon the public to run this under a range of conditions, democratizing image generation," Stability AI CEO and founder Emad Mostaque wrote in a blog post. "We look forward to the open ecosystem that will emerge around this and further models to truly explore the boundaries of latent space."

But Stable Diffusion's lack of safeguards compared to systems like DALL-E 2 poses tricky ethical questions for the AI community. Even if the results aren't perfectly convincing yet, making fake images of public figures opens a large can of worms. And making the raw components of the system freely available leaves the door open to bad actors who could train them on subjectively inappropriate content, like pornography and graphic violence.

Adobe Stock Begins Selling AI-Generated Artwork 15 comments

https://arstechnica.com/information-technology/2022/12/adobe-stock-begins-selling-ai-generated-artwork/

On Monday, Adobe announced that its stock photography service, Adobe Stock, would begin allowing artists to submit AI-generated imagery for sale, Axios reports. The move comes during Adobe's embrace of image synthesis and also during industry-wide efforts to deal with the rapidly growing field of AI artwork in the stock art business, including earlier announcements from Shutterstock and Getty Images.

Submitting AI-generated imagery to Adobe Stock comes with a few restrictions. The artist must own (or have the rights to use) the image, AI-synthesized artwork must be submitted as an illustration (even if photorealistic), and it must be labeled with "Generative AI" in the title.

Further, each AI artwork must adhere to Adobe's new Generative AI Content Guidelines, which require the artist to include a model release for any real person depicted realistically in the artwork. Artworks that incorporate illustrations of people or fictional brands, characters, or properties require a property release that attests the artist owns all necessary rights to license the content to Adobe Stock.
[...]
AI-generated artwork has proven ethically problematic among artists. Some criticized the ability of image synthesis models to reproduce artwork in the styles of living artists, especially since the AI models gained that ability from unauthorized scrapes of websites.


Original Submission

Getty Images Targets AI Firm For 'Copying' Photos 19 comments

US firm Getty Images on Tuesday threatened to sue a tech company it accuses of illegally copying millions of photos for use in an artificial intelligence (AI) art tool:

Getty, which distributes stock images and news photos including those of AFP, accused Stability AI of profiting from its pictures and those of its partners. Stability AI runs a tool called Stable Diffusion that allows users to generate mash-up images from a few words of text, but the firm uses material it scrapes from the web often without permission.

The question of copyright is still in dispute, with creators and artists arguing that the tools infringe their intellectual property and AI firms claiming they are protected under "fair use" rules.

Tools like Stable Diffusion and Dall-E 2 exploded in popularity last year, quickly becoming a global sensation with absurd images in the style of famous artists flooding social media.



Original Submission

90% of Online Content Could be ‘Generated by AI by 2025,’ Expert Says 35 comments

Generative AI, like OpenAI's ChatGPT, could completely revamp how digital content is developed, Nina Schick, adviser, speaker, and A.I. thought leader, told Yahoo Finance Live:

"I think we might reach 90% of online content generated by AI by 2025, so this technology is exponential," she said. "I believe that the majority of digital content is going to start to be produced by AI. You see ChatGPT... but there are a whole plethora of other platforms and applications that are coming up."

The surge of interest in OpenAI's DALL-E and ChatGPT has facilitated a wide-ranging public discussion about AI and its expanding role in our world, particularly generative AI.

[...] Though the extent to which ChatGPT in its current form is a viable Google competitor is a complicated question, there's little doubt of the possibilities. Meanwhile, Microsoft already has invested $1 billion in OpenAI, and there's talk of further investment from the enterprise tech giant, which owns search engine Bing. The company is reportedly looking to invest another $10 billion in OpenAI.



Original Submission

Paper: Stable Diffusion “Memorizes” Some Images, Sparking Privacy Concerns 8 comments

But out of 300,000 high-probability images tested, researchers found a 0.03% memorization rate:

On Monday, a group of AI researchers from Google, DeepMind, UC Berkeley, Princeton, and ETH Zurich released a paper outlining an adversarial attack that can extract a small percentage of training images from latent diffusion AI image synthesis models like Stable Diffusion. It challenges views that image synthesis models do not memorize their training data and that training data might remain private if not disclosed.

Recently, AI image synthesis models have been the subject of intense ethical debate and even legal action. Proponents and opponents of generative AI tools regularly argue over the privacy and copyright implications of these new technologies. Adding fuel to either side of the argument could dramatically affect potential legal regulation of the technology, and as a result, this latest paper, authored by Nicholas Carlini et al., has perked up ears in AI circles.

Related:
Getty Images Targets AI Firm For 'Copying' Photos


Original Submission

Stable Diffusion Copyright Lawsuits Could be a Legal Earthquake for AI 15 comments

https://arstechnica.com/tech-policy/2023/04/stable-diffusion-copyright-lawsuits-could-be-a-legal-earthquake-for-ai/

The AI software Stable Diffusion has a remarkable ability to turn text into images. When I asked the software to draw "Mickey Mouse in front of a McDonald's sign," for example, it generated the picture you see above.

Stable Diffusion can do this because it was trained on hundreds of millions of example images harvested from across the web. Some of these images were in the public domain or had been published under permissive licenses such as Creative Commons. Many others were not—and the world's artists and photographers aren't happy about it.

In January, three visual artists filed a class-action copyright lawsuit against Stability AI, the startup that created Stable Diffusion. In February, the image-licensing giant Getty filed a lawsuit of its own.
[...]
The plaintiffs in the class-action lawsuit describe Stable Diffusion as a "complex collage tool" that contains "compressed copies" of its training images. If this were true, the case would be a slam dunk for the plaintiffs.

But experts say it's not true. Erik Wallace, a computer scientist at the University of California, Berkeley, told me in a phone interview that the lawsuit had "technical inaccuracies" and was "stretching the truth a lot." Wallace pointed out that Stable Diffusion is only a few gigabytes in size—far too small to contain compressed copies of all or even very many of its training images.

Related:
Ethical AI art generation? Adobe Firefly may be the answer. (20230324)
Paper: Stable Diffusion "Memorizes" Some Images, Sparking Privacy Concerns (20230206)
Getty Images Targets AI Firm For 'Copying' Photos (20230117)
Pixel Art Comes to Life: Fan Upgrades Classic MS-DOS Games With AI (20220904)
A Startup Wants to Democratize the Tech Behind DALL-E 2, Consequences be Damned (20220817)


Original Submission

This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 1, Insightful) by Anonymous Coward on Saturday March 25 2023, @12:19AM (3 children)

    by Anonymous Coward on Saturday March 25 2023, @12:19AM (#1298074)

    meaning it's highly censored

    • (Score: 2) by ikanreed on Saturday March 25 2023, @01:23AM

      by ikanreed (3164) Subscriber Badge on Saturday March 25 2023, @01:23AM (#1298079) Journal

      No, I'm quite sure the censorship is a separate engine in most of these systems.

    • (Score: 2) by DannyB on Sunday March 26 2023, @04:01PM (1 child)

      by DannyB (5839) Subscriber Badge on Sunday March 26 2023, @04:01PM (#1298239) Journal

      I think it was trained only on free range artwork.

      --
      Don't put a mindless tool of corporations in the white house; vote ChatGPT for 2024!
      • (Score: 2) by Freeman on Monday March 27 2023, @02:31PM

        by Freeman (732) on Monday March 27 2023, @02:31PM (#1298339) Journal

        No, that was the issue with Stable Diffusion, etc.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
  • (Score: 0) by Anonymous Coward on Saturday March 25 2023, @01:12AM

    by Anonymous Coward on Saturday March 25 2023, @01:12AM (#1298078)

    Where did the licensed images come from? Button mashing.

  • (Score: 5, Insightful) by darkfeline on Saturday March 25 2023, @02:31AM (4 children)

    by darkfeline (1030) on Saturday March 25 2023, @02:31AM (#1298083) Homepage

    Copyright is not natural. It is a purely artificial, legal limitation implemented inconsistently in a number of specific jurisdictions under the unproven premises that both 1. artists (and other creators) will not create as much without it and 2. that this would be significantly detrimental to society. (Personally, I would argue that both premises are blatantly false; lots of people create art as an uncompensated hobby and/or receive funding from willing supporters, and we now have a surplus of art, to the extent of being detrimental to society.)

    There is nothing "ethical" or "unethical" about AI art generation. It is merely a legal question.

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 5, Interesting) by HiThere on Saturday March 25 2023, @03:16AM (3 children)

      by HiThere (866) Subscriber Badge on Saturday March 25 2023, @03:16AM (#1298085) Journal

      There are other things, though, that could involve ethics. E.g. the use of images of people.
      I would definitely agree that copyright is separate from ethics, but that doesn't mean that ethics isn't involved. And there is definitely a large area where the ethics are unclear, if only because different groups of people view them differently. (This, of course, makes the claim that the images are ethical either dubious, or implies that the images are extremely highly censored. Consider the various opinions that a hunter, a bullfighter, a member of PETA, and a vegan would have.)

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 2) by guest reader on Saturday March 25 2023, @04:19PM (2 children)

        by guest reader (26132) on Saturday March 25 2023, @04:19PM (#1298122)

        I would definitely agree that copyright is separate from ethics

        Copyright is not separate from ethics. Copyright is a law. Laws are ethically right (laws against robbery) or ethically wrong (discriminatory laws).

        • (Score: 2) by HiThere on Saturday March 25 2023, @05:40PM (1 child)

          by HiThere (866) Subscriber Badge on Saturday March 25 2023, @05:40PM (#1298128) Journal

          I think we're using language differently, as that seemed an example of law being separate from ethics. Would you prefer if I said "independent of"? But I feel that's too strong a statement, as there are correlations.

          --
          Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
          • (Score: 2) by guest reader on Saturday March 25 2023, @07:29PM

            by guest reader (26132) on Saturday March 25 2023, @07:29PM (#1298137)

            Yes, "independent of" makes it easier to understand for me. I would still argue that copyright law has connections to ethics. For example "authors can benefit from their work". At least the authors who published their work under some attribution, non-commercial or commercial license.

  • (Score: 3, Funny) by Ingar on Saturday March 25 2023, @10:19AM (2 children)

    by Ingar (801) on Saturday March 25 2023, @10:19AM (#1298104) Homepage Journal

    Question: Ethical?
    Answer: Adobe!

    --
    Understanding is a three-edged sword: your side, their side, and the truth.
    • (Score: 4, Touché) by Gaaark on Saturday March 25 2023, @10:43AM

      by Gaaark (41) on Saturday March 25 2023, @10:43AM (#1298107) Journal

      and with the added trust of Microsoft!

      --
      --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
    • (Score: 2) by ElizabethGreene on Sunday March 26 2023, @11:14PM

      by ElizabethGreene (6748) Subscriber Badge on Sunday March 26 2023, @11:14PM (#1298270) Journal

      It's a bit jarring to see the words in the same sentence.
