
posted by martyb on Friday August 14 2020, @10:01AM
from the Voiced-by-Majel-Barrett-Roddenberry? dept.

OpenAI's new language generator GPT-3 is shockingly good (archive):

GPT-3 is the most powerful language model ever. Its predecessor, GPT-2, released last year, was already able to spit out convincing streams of text in a range of different styles when prompted with an opening sentence. But GPT-3 is a big leap forward. The model has 175 billion parameters (the values that a neural network tries to optimize during training), compared with GPT-2's already vast 1.5 billion. And with language models, size really does matter.

[Arram] Sabeti linked to a blog post where he showed off short stories, songs, press releases, technical manuals, and more that he had used the AI to generate. GPT-3 can also produce pastiches of particular writers. Mario Klingemann, an artist who works with machine learning, shared a short story called "The importance of being on Twitter," written in the style of Jerome K. Jerome, which starts: "It is a curious fact that the last remaining form of social life in which the people of London are still interested is Twitter. I was struck with this curious fact when I went on one of my periodical holidays to the sea-side, and found the whole place twittering like a starling-cage." Klingemann says all he gave the AI was the title, the author's name and the initial "It." There is even a reasonably informative article about GPT-3 written entirely by GPT-3.
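For readers curious what that kind of prompt looks like in practice, here is a minimal sketch using the OpenAI Python client as it existed around GPT-3's launch. The engine name, sampling settings, and exact prompt formatting are illustrative assumptions, not Klingemann's actual setup.

    # A minimal sketch of the setup Klingemann describes: the model sees only
    # a title, an author line, and the word "It", then continues from there.
    # Assumes the 2020-era openai Python client with an API key configured.
    import openai

    prompt = (
        "The importance of being on Twitter\n"
        "by Jerome K. Jerome\n"
        "\n"
        "It"
    )

    response = openai.Completion.create(
        engine="davinci",   # base GPT-3 completion engine at launch (assumed)
        prompt=prompt,
        max_tokens=300,     # length of the generated continuation
        temperature=0.8,    # some randomness suits creative writing
    )

    print(prompt + response.choices[0].text)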

[...] Others have found that GPT-3 can generate any kind of text, including guitar tabs or computer code. For example, by tweaking GPT-3 so that it produced HTML rather than natural language, web developer Sharif Shameem showed that he could make it create web-page layouts by giving it prompts like "a button that looks like a watermelon" or "large text in red that says WELCOME TO MY NEWSLETTER and a blue button that says Subscribe." Even legendary coder John Carmack, who pioneered 3D computer graphics in early video games like Doom and is now consulting CTO at Oculus VR, was unnerved: "The recent, almost accidental, discovery that GPT-3 can sort of write code does generate a slight shiver."
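The "tweaking" here amounts to prompting: showing the model a few description-to-HTML pairs and letting it complete the pattern for a new description. Below is a hedged sketch of that idea with made-up example pairs; it is not Shameem's actual prompt or tooling.

    # Illustrative few-shot prompt: two description -> html examples teach the
    # pattern, then a new description is appended for the model to complete.
    # The examples and stop sequence are assumptions, not Shameem's real setup.
    import openai

    FEW_SHOT = (
        "description: a green button that says Go\n"
        'html: <button style="background: green; color: white;">Go</button>\n'
        "\n"
        "description: a heading that says Hello World\n"
        "html: <h1>Hello World</h1>\n"
        "\n"
        "description: {request}\n"
        "html:"
    )

    def describe_to_html(request):
        response = openai.Completion.create(
            engine="davinci",
            prompt=FEW_SHOT.format(request=request),
            max_tokens=120,
            temperature=0.2,    # low temperature favours predictable markup
            stop=["\n\n"],      # stop before the model invents another example
        )
        return response.choices[0].text.strip()

    print(describe_to_html(
        "large text in red that says WELCOME TO MY NEWSLETTER "
        "and a blue button that says Subscribe"
    ))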

[...] Yet despite its new tricks, GPT-3 is still prone to spewing hateful, sexist, and racist language. Fine-tuning the model helped limit this kind of output in GPT-2.


Original Submission

 
  • (Score: 2) by HiThere (866) Subscriber Badge on Friday August 14 2020, @01:43PM (#1036523) Journal

    IIUC, GPT-3 is considered pretty much the end of the line for this particular line of development. It needs to be merged with other lines of development to improve, because it basically has no model of the external universe. But those other lines *are* being developed. Robots, for example, must have a model of the external universe. It may not be a very sophisticated model, but it's there. But they have a hard time talking about it... which is where GPT-3 could come in.

    So, yeah, we don't have a general intelligence AI yet. But we've got lots of the pieces of one. There's the robot models of the universe, there's GPT language models, there's various models of problem solving and logic, etc. They need to be combined to get a general AI.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.