SoylentNews is people

Submission Preview

Dark Web ChatGPT Unleashed: Meet DarkBERT

Accepted submission by fliptop at 2023-05-19 22:36:09 from the sounds-more-sinister-than-DarkERNIE-I-suppose dept.

A language model trained on the fringes of the dark web... for science []:

We're still early in the snowball effect unleashed by the release of Large Language Models (LLMs) like ChatGPT into the wild. Paired with the open-sourcing of other GPT (Generative Pre-trained Transformer) models, the number of applications employing AI is exploding; as we already know, ChatGPT itself can be used to create highly advanced malware.

As time passes, the number of applied LLMs will only grow, each specializing in its own area, trained on carefully curated data for a specific purpose. One such application just dropped: a model trained on data from the dark web itself. DarkBERT, as its South Korean creators call it [], has arrived; follow that link for the release paper, which also gives a general introduction to the dark web.

DarkBERT is based on the RoBERTa architecture, an AI approach developed back in 2019. It has since seen a renaissance of sorts, with researchers discovering it had more performance to give than was extracted from it at release. It seems the model was severely undertrained when it shipped in 2019 [], performing far below its maximum efficiency.
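For readers unfamiliar with how RoBERTa-family models are pretrained: they learn by predicting deliberately hidden tokens (masked language modelling, with RoBERTa adding dynamic masking). The sketch below is a rough stdlib-only illustration of that masking step, not code from the DarkBERT paper; the token list, vocabulary, and probabilities are the standard BERT-style defaults, used here purely as an example.

```python
import random

def mask_tokens(tokens, vocab, mask_token="<mask>", mask_prob=0.15, seed=0):
    """BERT/RoBERTa-style masking sketch: select ~15% of positions as
    prediction targets; of those, 80% are replaced with the mask token,
    10% with a random vocabulary token, and 10% are left unchanged
    (but still predicted). Returns (masked_tokens, target_positions)."""
    rng = random.Random(seed)
    masked = list(tokens)
    targets = []
    for i in range(len(tokens)):
        if rng.random() < mask_prob:
            targets.append(i)
            roll = rng.random()
            if roll < 0.8:
                masked[i] = mask_token          # 80%: mask it
            elif roll < 0.9:
                masked[i] = rng.choice(vocab)   # 10%: random token
            # else 10%: keep the original token unchanged
    return masked, targets

# Hypothetical example sentence; RoBERTa re-rolls this masking every
# epoch ("dynamic masking") rather than fixing it once at preprocessing.
tokens = "the onion service hosts a hidden forum".split()
masked, targets = mask_tokens(tokens, vocab=tokens)
```

The model is then trained to recover the original tokens at the target positions, which is why sheer quantity of (in DarkBERT's case, dark-web) text matters so much: every sentence yields training signal without any manual labelling.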

Originally spotted on The Eponymous Pickle [].

Related: People are Already Trying to Get ChatGPT to Write Malware []

Original Submission