
posted by martyb on Friday August 14 2020, @10:01AM
from the Voiced-by-Majel-Barrett-Roddenberry? dept.

OpenAI's new language generator GPT-3 is shockingly good (archive):

GPT-3 is the most powerful language model ever. Its predecessor, GPT-2, released last year, was already able to spit out convincing streams of text in a range of different styles when prompted with an opening sentence. But GPT-3 is a big leap forward. The model has 175 billion parameters (the values that a neural network tries to optimize during training), compared with GPT-2's already vast 1.5 billion. And with language models, size really does matter.

Sabeti linked to a blog post where he showed off short stories, songs, press releases, technical manuals, and more that he had used the AI to generate. GPT-3 can also produce pastiches of particular writers. Mario Klingemann, an artist who works with machine learning, shared a short story called "The importance of being on Twitter," written in the style of Jerome K. Jerome, which starts: "It is a curious fact that the last remaining form of social life in which the people of London are still interested is Twitter. I was struck with this curious fact when I went on one of my periodical holidays to the sea-side, and found the whole place twittering like a starling-cage." Klingemann says all he gave the AI was the title, the author's name and the initial "It." There is even a reasonably informative article about GPT-3 written entirely by GPT-3.
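
The prompting described here is just text completion: hand the model an opening and let it continue. A minimal sketch of what such a call might look like with the OpenAI Python client is below; note that GPT-3 was only reachable through a private API beta at the time, so the engine name, parameters, and key handling are assumptions, not Klingemann's actual setup.

    # Illustrative sketch only; engine name and parameters are assumptions.
    import openai

    openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder for a beta API key

    # The prompt mirrors what the article says Klingemann supplied:
    # the title, the author's name, and the initial "It".
    prompt = (
        "The importance of being on Twitter\n"
        "by Jerome K. Jerome\n"
        "It"
    )

    response = openai.Completion.create(
        engine="davinci",    # assumed name for the base GPT-3 engine
        prompt=prompt,
        max_tokens=300,      # length of the continuation to generate
        temperature=0.7,     # some randomness so the pastiche is not rote
    )

    print(prompt + response["choices"][0]["text"])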

[...] Others have found that GPT-3 can generate any kind of text, including guitar tabs or computer code. For example, by tweaking GPT-3 so that it produced HTML rather than natural language, web developer Sharif Shameem showed that he could make it create web-page layouts by giving it prompts like "a button that looks like a watermelon" or "large text in red that says WELCOME TO MY NEWSLETTER and a blue button that says Subscribe." Even legendary coder John Carmack, who pioneered 3D computer graphics in early video games like Doom and is now consulting CTO at Oculus VR, was unnerved: "The recent, almost accidental, discovery that GPT-3 can sort of write code does generate a slight shiver."
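
The article does not detail how Shameem "tweaked" GPT-3, but the general technique it describes is few-shot prompting: show the model a couple of description-to-HTML pairs and let it complete the next one. A rough sketch under that assumption, with hypothetical example pairs:

    # Rough sketch of few-shot prompting for HTML generation; the example pairs,
    # engine name, and parameters are assumptions, not Shameem's actual demo.
    import openai

    few_shot = (
        "description: a green button that says Submit\n"
        "html: <button style=\"background-color: green\">Submit</button>\n\n"
        "description: large text in red that says WELCOME TO MY NEWSLETTER "
        "and a blue button that says Subscribe\n"
        "html:"
    )

    response = openai.Completion.create(
        engine="davinci",
        prompt=few_shot,
        max_tokens=120,
        temperature=0.2,     # low temperature for more deterministic markup
        stop=["\n\n"],       # stop at the end of the generated snippet
    )

    print(response["choices"][0]["text"].strip())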

[...] Yet despite its new tricks, GPT-3 is still prone to spewing hateful, sexist, and racist language. Fine-tuning the model helped limit this kind of output in GPT-2.


Original Submission

  • (Score: 1, Interesting) by Anonymous Coward on Friday August 14 2020, @03:50PM (#1036575)

    Here's an author using a different text engine, in a recent Tech Review,
        https://wp.technologyreview.com/wp-content/uploads/2020/06/MIT-Technology-Review-2020-07.pdf [technologyreview.com]
    First, read the short story on PDF pages 75-78. I've read a lot of SF, and to me this was a unique take on creating an intelligent machine:
    Algostory 1.7 (Robot Story): “Krishna and Arjuna”

    Then, on PDF pages 80-81, the author describes how he collaborated with the "AI"--thus my cyborg subject line. Here are the first few paragraphs:

    A few years ago I used an algorithm to help me write a science fiction story. Adam Hammond, an English professor, and Julian Brooke, a computer scientist, had created a program called SciFiQ, and I provided them with 50 of my favorite pieces of science fiction to feed into their algorithm. In return, SciFiQ gave me a set of instructions on the story’s plot. As I typed into its web-based interface, the program showed how closely my writing measured up against the 50 stories according to various criteria.

    Our goal in that first experiment was modest: to see if algorithms could be an aid to creativity. Would the process make stories that were just generically consistent? Could an algorithm generate its own distinct style or narrative ideas? Would the resulting story be recognizable as science fiction at all?

    The answer to all these questions was yes. The resulting story “Twinkle Twinkle,” published in Wired, not only looked and felt like a science fiction story. It also, to my surprise, contained an original narrative idea.
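
    The excerpt never spells out SciFiQ's criteria, but the idea of scoring a draft against a reference corpus is easy to illustrate. The sketch below is purely hypothetical: it compares two crude stylometric features (average sentence length and vocabulary richness) against the averages of a set of reference stories, and is not a description of how SciFiQ actually works.

        # Hypothetical illustration only: SciFiQ's real criteria aren't described here.
        # Scores a draft against a reference corpus on two crude stylometric features.
        import re
        import statistics

        def features(text):
            sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
            words = re.findall(r"[A-Za-z']+", text.lower())
            avg_sentence_len = len(words) / max(len(sentences), 1)
            type_token_ratio = len(set(words)) / max(len(words), 1)
            return (avg_sentence_len, type_token_ratio)

        def distance(draft, corpus_texts):
            corpus = [features(t) for t in corpus_texts]
            target = tuple(statistics.mean(f[i] for f in corpus) for i in range(2))
            return sum(abs(d - t) for d, t in zip(features(draft), target))

        # Lower is "closer" to the reference stories on these two features, e.g.:
        # print(distance(my_draft, fifty_favourite_stories))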
