posted by janrinok on Friday March 21, @09:37PM

Academics accuse AI startups of co-opting peer review for publicity:

There's a controversy brewing over "AI-generated" studies submitted to this year's ICLR, a long-running academic conference focused on AI.

At least three AI labs — Sakana, Intology, and Autoscience — claim to have used AI to generate studies that were accepted to ICLR workshops. At conferences like ICLR, workshop organizers typically review studies for publication in the conference's workshop track.

Sakana informed ICLR leaders before it submitted its AI-generated papers and obtained the peer reviewers' consent. The other two labs — Intology and Autoscience — did not, an ICLR spokesperson confirmed to TechCrunch.

Several AI academics took to social media to criticize Intology and Autoscience's stunts as a co-opting of the scientific peer review process.

"All these AI scientist papers are using peer-reviewed venues as their human evals, but no one consented to providing this free labor," wrote Prithviraj Ammanabrolu, an assistant computer science professor at UC San Diego, in an X post. "It makes me lose respect for all those involved regardless of how impressive the system is. Please disclose this to the editors."

As the critics noted, peer review is a time-consuming, labor-intensive, and mostly volunteer ordeal. According to one recent Nature survey, 40% of academics spend two to four hours reviewing a single study. That work has been escalating. The number of papers submitted to the largest AI conference, NeurIPS, grew to 17,491 last year, up 41% from 12,345 in 2023.

Academia already had an AI-generated copy problem. One analysis found that between 6.5% and 16.9% of papers submitted to AI conferences in 2023 likely contained synthetic text. But AI companies using peer review to effectively benchmark and advertise their tech is a relatively new occurrence.

"[Intology's] papers received unanimously positive reviews," Intology wrote in a post on X touting its ICLR results. In the same post, the company went on to claim that workshop reviewers praised one of its AI-generated study's "clever idea[s]."

Academics didn't look kindly on this.

Ashwinee Panda, a postdoctoral fellow at the University of Maryland, said in an X post that submitting AI-generated papers without giving workshop organizers the right to refuse them showed a "lack of respect for human reviewers' time."

"Sakana reached out asking whether we would be willing to participate in their experiment for the workshop I'm organizing at ICLR," Panda added, "and I (we) said no [...] I think submitting AI papers to a venue without contacting the [reviewers] is bad."

Not for nothing, many researchers are skeptical that AI-generated papers are worth the peer review effort.

Sakana itself admitted that its AI made "embarrassing" citation errors, and that only one out of the three AI-generated papers the company chose to submit would've met the bar for conference acceptance. Sakana withdrew its ICLR paper before it could be published in the interest of transparency and respect for ICLR convention, the company said.

Alexander Doria, the co-founder of AI startup Pleias, said that the raft of surreptitious synthetic ICLR submissions pointed to the need for a "regulated company/public agency" to perform "high-quality" AI-generated study evaluations for a price.

"Evals [should be] done by researchers fully compensated for their time," Doria said in a series of posts on X. "Academia is not there to outsource free [AI] evals."


Original Submission

This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Insightful) by khallow on Friday March 21, @10:42PM

    by khallow (3766) Subscriber Badge on Friday March 21, @10:42PM (#1397463) Journal

    Alexander Doria, the co-founder of AI startup Pleias, said that the raft of surreptitious synthetic ICLR submissions pointed to the need for a "regulated company/public agency" to perform "high-quality" AI-generated study evaluations for a price.

    Sure sounded like "pay to publish" to me. And I wonder who will provide this vital service?

    It'd be one thing if an AI company provided such a service using its own resources. It's another to tap public funds for something that hasn't been shown to have positive value.

  • (Score: 3, Touché) by corey on Friday March 21, @10:54PM (3 children)

    by corey (2202) on Friday March 21, @10:54PM (#1397464)

    Slop generator reviewing slop. What a great conference to go to.

    • (Score: 3, Touché) by looorg on Friday March 21, @11:28PM

      by looorg (578) on Friday March 21, @11:28PM (#1397472)

      As long as the conference is paid for by someone else and is someplace nice, a lot of people will go. They won't attend any of the seminars tho. Or a bare minimum of them to still claim they attended; it might be a bit of a miss if you fail to turn up at the ones run by people you know. Paid-work-vacation.

    • (Score: 2) by Username on Saturday March 22, @12:17AM (1 child)

      by Username (4557) on Saturday March 22, @12:17AM (#1397478)

      This is my issue with academia in general. People are paid to qualify other people's work; the monotony of the task leads to simplification and automation of the process on both ends, leading to AI being used to pass/fail submissions generated... by AI. It's like the plot from the movie Inception, except everyone is paid to be lazy.

      • (Score: 0) by Anonymous Coward on Saturday March 22, @12:57PM

        by Anonymous Coward on Saturday March 22, @12:57PM (#1397521)

        People are paid to qualify other people's work

        Which journals are paying their referees?

  • (Score: 4, Funny) by deimtee on Saturday March 22, @12:12AM (1 child)

    by deimtee (3272) on Saturday March 22, @12:12AM (#1397477) Journal

    Obvious answer is obvious. Get an AI to do the peer reviewing.

    --
    One job constant is that good employers have low turnover, so opportunities to join good employers are relatively rare.
    • (Score: 3, Funny) by Tork on Saturday March 22, @01:15AM

      by Tork (3914) Subscriber Badge on Saturday March 22, @01:15AM (#1397482) Journal
      I had an AI review my screenplay and, good news, I got three thumbs up!
      --
      🏳️‍🌈 Proud Ally 🏳️‍🌈