SoylentNews is people

Submission Preview

What to Expect When You're Expecting ... GPT-4

Accepted submission by fliptop at 2023-01-05 22:58:48
Software

Although ChatGPT can write about anything, it is also easily confused [acm.org]:

As 2022 came to a close, OpenAI released an automatic writing system called ChatGPT that rapidly became an Internet sensation; less than two weeks after its release, more than a million people had signed up to try it online. As every reader surely knows by now, you type in text, and immediately get back paragraphs and paragraphs of uncannily human-like writing: stories, poems and more. Some of what it writes is so good that some people are using it to pick up dates on Tinder [mashable.com] ("Do you mind if I take a seat? Because watching you do those hip thrusts is making my legs feel a little weak.") Others, to the considerable consternation of educators everywhere [slate.com], are using it to write term papers. Still others are using it to try to reinvent search engines [perplexity.ai]. I have never seen anything like this much buzz.

Still, we should not be entirely impressed.

As I told NYT columnist Farhad Manjoo [nytimes.com], ChatGPT, like earlier, related systems, is "still not reliable, still doesn't understand the physical world, still doesn't understand the psychological world and still hallucinates."

[...] What Silicon Valley, and indeed the world, is waiting for, is GPT-4.

I guarantee that minds will be blown. I know several people who have actually tried GPT-4, and all were impressed. It truly is coming soon [towardsdatascience.com] (Spring of 2023, according to some rumors). When it comes out, it will totally eclipse ChatGPT; it's a safe bet that even more people will be talking about it.

[...] In technical terms, GPT-4 will have more parameters inside of it, requiring more processors and memory to be tied together, and be trained on more data. GPT-1 was trained on 4.6 gigabytes of data, GPT-2 was trained on 46 gigabytes, GPT-3 was trained on 750 gigabytes [s10251.pcdn.co]. GPT-4 will be trained on considerably more, a significant fraction of the Internet as a whole. As OpenAI has learned, bigger in many ways means better, with outputs more and more humanlike with each iteration. GPT-4 is going to be a monster.
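The roughly tenfold growth in training data per generation quoted above can be checked with quick arithmetic. A minimal sketch (the gigabyte figures are the ones the article cites; GPT-4's size is left out because the article gives no number for it):

```python
# Training-data sizes quoted in the article, in gigabytes.
# GPT-4 is omitted: the article only says "considerably more".
sizes_gb = {"GPT-1": 4.6, "GPT-2": 46, "GPT-3": 750}

gens = list(sizes_gb)
for prev, cur in zip(gens, gens[1:]):
    factor = sizes_gb[cur] / sizes_gb[prev]
    print(f"{prev} -> {cur}: ~{factor:.0f}x more training data")
```

Running it shows GPT-2 at ~10x GPT-1's data and GPT-3 at ~16x GPT-2's, which is the "bigger means better" scaling curve the article describes.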

The article goes on to list seven "dark predictions" that, if realized, may signal it's time to move on.

Original Submission