posted by hubie on Wednesday January 11 2023, @11:13PM   Printer-friendly
from the GPT-3-is-so-last-month dept.

Although ChatGPT can write about anything, it is also easily confused:

As 2022 came to a close, OpenAI released an automatic writing system called ChatGPT that rapidly became an Internet sensation; less than two weeks after its release, more than a million people had signed up to try it online. As every reader surely knows by now, you type in text, and immediately get back paragraphs and paragraphs of uncannily human-like writing, stories, poems and more. Some of what it writes is so good that some people are using it to pick up dates on Tinder ("Do you mind if I take a seat? Because watching you do those hip thrusts is making my legs feel a little weak.") Others, to the considerable consternation of educators everywhere, are using it to write term papers. Still others are using it to try to reinvent search engines. I have never seen anything like this much buzz.

Still, we should not be entirely impressed.

As I told NYT columnist Farhad Manjoo, ChatGPT, like earlier, related systems, is "still not reliable, still doesn't understand the physical world, still doesn't understand the psychological world and still hallucinates."

[...] What Silicon Valley, and indeed the world, is waiting for, is GPT-4.

I guarantee that minds will be blown. I know several people who have actually tried GPT-4, and all were impressed. It truly is coming soon (Spring of 2023, according to some rumors). When it comes out, it will totally eclipse ChatGPT; it's a safe bet that even more people will be talking about it.

[...] In technical terms, GPT-4 will have more parameters inside of it, requiring more processors and memory to be tied together, and be trained on more data. GPT-1 was trained on 4.6 gigabytes of data, GPT-2 was trained on 46 gigabytes, GPT-3 was trained on 750 gigabytes. GPT-4 will be trained on considerably more, a significant fraction of the internet as a whole. As OpenAI has learned, bigger in many ways means better, with outputs more and more humanlike with each iteration. GPT-4 is going to be a monster.
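For a rough sense of the scaling curve quoted above, here is a small illustrative sketch (not from the article) that just computes the generation-over-generation growth in training data from the gigabyte figures cited; GPT-4's corpus size is unknown and deliberately left out.

# Illustrative only: training-data growth across GPT generations,
# using the gigabyte figures quoted in the excerpt above.
training_data_gb = {"GPT-1": 4.6, "GPT-2": 46, "GPT-3": 750}

previous = None
for model, size in training_data_gb.items():
    if previous is None:
        print(f"{model}: {size} GB")
    else:
        print(f"{model}: {size} GB (~{size / previous:.0f}x the previous model)")
    previous = size

Running it shows roughly an order of magnitude more data per generation, which is the trend the author extrapolates for GPT-4.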

The article goes on to list 7 "dark predictions" that, if realized, may signal it's time to move on.

Original Submission

 
This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Interesting) by RamiK on Thursday January 12 2023, @01:10AM

    by RamiK (1813) on Thursday January 12 2023, @01:10AM (#1286431)

    It's just parroting text back that sometimes looks like genius and sometimes makes no sense and feels like it was written by an alien... It's like a gazillion little monkeys with typewriters -- eventually they'll produce something solid among all the crap.

    It doesn't work like that, but I won't waste your time explaining it, since you can get a good idea of the iterative creative process involved in working with these types of systems by playing a few rounds of AI Dungeon: https://play.aidungeon.io/ [aidungeon.io]

    Once you do, you'll understand how CNET shaved the "creative writing" requirement off their financial pieces by having an AI do the parroting and a human do the fact checking and editing: https://futurism.com/the-byte/cnet-publishing-articles-by-ai [futurism.com]

    --
    compiling...