Some People Just Want to Watch the Internet Burn
A Blast from the Past:
Arpanet.
Internet.
The Net. Not such an arrogant label, back when one was all they had.
Cyberspace lasted a bit longer— but space implies great empty vistas, a luminous galaxy of icons and avatars, a hallucinogenic dreamworld in 48-bit color. No sense of the meatgrinder in cyberspace. No hint of pestilence or predation, creatures with split-second lifespans tearing endlessly at each others’ throats. Cyberspace was a wistful fantasy-word, like hobbit or biodiversity, by the time Achilles Desjardins came onto the scene.
Onion and metabase were more current. New layers were forever being laid atop the old, each free—for a while—from the congestion and static that saturated its predecessors. Orders of magnitude accrued with each generation: more speed, more storage, more power. Information raced down conduits of fiberop, of rotazane, of quantum stuff so sheer its very existence was in doubt. Every decade saw a new backbone grafted onto the beast; then every few years. Every few months. The endless ascent of power and economy proceeded apace, not as steep a climb as during the fabled days of Moore, but steep enough.
And coming up from behind, racing after the expanding frontier, ran the progeny of laws much older than Moore’s.
It’s the pattern that matters, you see. Not the choice of building materials. Life is information, shaped by natural selection. Carbon’s just fashion, nucleic acids mere optional accessories. Electrons can do all that stuff, if they’re coded the right way.
It’s all just Pattern.
And so viruses begat filters; filters begat polymorphic counteragents; polymorphic counteragents begat an arms race. Not to mention the worms and the ‘bots and the single-minded autonomous datahounds—so essential for legitimate commerce, so vital to the well-being of every institution, but so needy, so demanding of access to protected memory. And way over there in left field, the Artificial Life geeks were busy with their Core Wars and their Tierra models and their genetic algorithms. It was only a matter of time before everyone got tired of endlessly reprogramming their minions against each other. Why not just build in some genes, a random number generator or two for variation, and let natural selection do the work?
The problem with natural selection, of course, is that it changes things.
The problem with natural selection in networks is that things change fast.
By the time Achilles Desjardins became a ‘Lawbreaker, Onion was a name in decline. One look inside would tell you why. If you could watch the fornication and predation and speciation without going grand mal from the rate-of-change, you knew there was only one word that really fit: Maelstrom.
Of course, people still went there all the time. What else could they do? Civilization’s central nervous system had been living inside a Gordian knot for over a century. No one was going to pull the plug over a case of pinworms.
—Me, Maelstrom, 2001
*
Ah, Maelstrom. My second novel. Hard to believe I wrote it almost a quarter-century ago.
Maelstrom combined cool prognostications with my usual failure of imagination. I envisioned programs that were literally alive— according to the Dawkinsian definition of Life as “Information shaped by natural selection”—and I patted myself on the back for applying Darwinian principles to electronic environments. (It was a different time. The phrase “genetic algorithm” was still shiny-new and largely unknown outside academic circles).
I confess to being a bit surprised—even disappointed—that things haven’t turned out that way (not yet, anyway). I’ll grant that Maelstrom’s predictions hinge on code being let off the leash to evolve in its own direction, and that coders of malware won’t generally let that happen. You want your botnets and phishers to be reliably obedient; you’re not gonna steal many identities or get much credit card info from something that’s decided reproductive fitness is where it’s at. Still, as Michael Caine put it in The Dark Knight: some people just want to watch the world burn. You’d think that somewhere, someone would have brought their code to life precisely because it could indiscriminately fuck things up.
Some folks took Maelstrom’s premise and ran with it. In fact, Maelstrom seems to have been more influential amongst those involved in AI and computer science (about which I know next to nothing) than Starfish ever was among those who worked in marine biology (a field in which I have a PhD). But my origin story for Maelstrom’s wildlife was essentially supernatural. It was the hand of some godlike being that brought it to life. We were the ones who gave mutable genes to our creations; they only took off after we imbued them with that divine spark.
It never even occurred to me that code might learn to do that all on its own.
Apparently it never occurred to anyone. Simulation models back then were generating all sorts of interesting results (including the spontaneous emergence of parasitism, followed shortly thereafter by the emergence of sex), but none of that A-Life had to figure out how to breed; their capacity for self-replication was built in at the outset.
Now Blaise Agüera y Arcas and his buddies at Google have rubbed our faces in our own lack of vision. Starting with a programming language called (I kid you not) Brainfuck, they built a digital “primordial soup” of random bytes, ran it under various platforms, and, well…read the money shot for yourself, straight from the (non-peer-reviewed) ArXiv preprint “Computational Life: How Well-formed, Self-replicating Programs Emerge from Simple Interaction [arxiv.org]”[1] [rifters.com]:
“when random, non self-replicating programs are placed in an environment lacking any explicit fitness landscape, self-replicators tend to arise. … increasingly complex dynamics continue to emerge following the rise of self-replicators.”
Apparently, self-replicators don’t even need random mutation to evolve. The code’s own self-modification is enough to do the trick. Furthermore, while
“…there is no explicit fitness function that drives complexification or self-replicators to arise. Nevertheless, complex dynamics happen due to the implicit competition for scarce resources (space, execution time, and sometimes energy).”
For those of us who glaze over whenever we see an integral sign, Arcas provides a lay-friendly summary over at Nautilus [nautil.us], placed within a historical context running back to Turing and von Neumann.
But you’re not really interested in that, are you? You stopped being interested the moment you learned there was a computer language called Brainfuck: that’s what you want to hear about. Fine: Brainfuck is a rudimentary coding language whose only mathematical operations are “add 1” and “subtract 1”. (In a classic case of understatement, Arcas et al. describe it as “onerous for humans to program with”.) The entire language contains a total of ten commands (eleven if you count a “true zero” that’s used to exit loops). All other values in the 256-character byte set are interpreted as data.
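If you want it spelled out, here’s my best reconstruction of that command set: two read/write heads crawling over a shared tape, with copy operations standing in where classic Brainfuck has I/O. The specific opcode-to-meaning assignments below are my reading of the preprint, not anything official:

```python
# Reconstructed extended-Brainfuck ("BFF"-style) command set, as I understand it
# from the preprint: two read/write heads (head0, head1) sharing one tape, with
# copy operations in place of classic Brainfuck's I/O. Treat the exact
# opcode-to-meaning mapping as my guess, not the paper's canonical spec.
BFF_COMMANDS = {
    "<": "head0 -= 1",
    ">": "head0 += 1",
    "{": "head1 -= 1",
    "}": "head1 += 1",
    "-": "tape[head0] -= 1",           # the language's entire "subtract" capability
    "+": "tape[head0] += 1",           # ...and its entire "add" capability
    ".": "tape[head1] = tape[head0]",  # copy the byte under head0 to head1
    ",": "tape[head0] = tape[head1]",  # copy the byte under head1 to head0
    "[": "if tape[head0] == 0: jump forward past the matching ']'",
    "]": "if tape[head0] != 0: jump back to the matching '['",
}
# Every other byte value is inert as an instruction, but perfectly good raw
# material for the copy commands to shovel around.
```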
So. Imagine two contiguous 64-byte strings of RAM, seeded with random bytes. Each functions as a Brainfuck program, each byte interpreted as either a command or a data point. Arcas et al speak of
“the interaction between any two programs (A and B) as an irreversible chemical reaction where order matters. This can be described as having a uniform distribution of catalysts a and b that interact with A and B as follows:”
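The reaction itself, rendered as best I can in my own notation (so don’t hold the authors to my symbols):

$$ a:\quad A + B \;\longrightarrow\; \operatorname{split}\big[\operatorname{exec}(AB)\big] $$

$$ b:\quad A + B \;\longrightarrow\; \operatorname{split}\big[\operatorname{exec}(BA)\big] $$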
Which, as far as I can tell, boils down to this: “a” catalyzes the smushing of programs A and B into a single long-string program, which executes and alters itself in the process; then the “split” part of the equation cuts the resulting string back into two segments of the initial A and B lengths.
You know what this looks like? This looks like autocatalysis: the process whereby the product of a chemical reaction catalyzes the reaction itself. A bootstrap thing. Because this program reads and writes to itself, the execution of the code rewrites the code. Do this often enough, and one of those 64-byte strings turns into a self-replicator.
It doesn’t happen immediately; most of the time, the code just sits there, reading and writing over itself. It generally takes thousands, millions of interactions before anything interesting happens. Let it run long enough, though, and some of that code coalesces into something that breeds, something that exchanges information with other programs (fucks, in other words). And when that happens, things really take off: self-replicators take over the soup in no time.
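For the morbidly curious, here’s a toy version of that loop: seed a soup of random 64-byte tapes, then pick pairs, concatenate, execute, split, repeat. Every name, constant, and interpreter detail below is my own sketch of the setup the preprint describes (the opcode semantics are the same guesswork as the table above), not the authors’ actual code:

```python
import random

TAPE_LEN = 64          # each "program" in the soup is one 64-byte string
MAX_STEPS = 8192       # cap on execution: the "scarce execution time" resource

def run_bff(tape, max_steps=MAX_STEPS):
    """Execute a self-modifying, BFF-style tape and return the rewritten bytes."""
    tape = bytearray(tape)
    ip = head0 = head1 = 0
    steps = 0
    while ip < len(tape) and steps < max_steps:
        steps += 1
        op = chr(tape[ip])
        if   op == "<": head0 = (head0 - 1) % len(tape)
        elif op == ">": head0 = (head0 + 1) % len(tape)
        elif op == "{": head1 = (head1 - 1) % len(tape)
        elif op == "}": head1 = (head1 + 1) % len(tape)
        elif op == "-": tape[head0] = (tape[head0] - 1) % 256
        elif op == "+": tape[head0] = (tape[head0] + 1) % 256
        elif op == ".": tape[head1] = tape[head0]
        elif op == ",": tape[head0] = tape[head1]
        elif op == "[" and tape[head0] == 0:      # skip forward past the matching ]
            depth = 1
            while depth and ip + 1 < len(tape):
                ip += 1
                depth += {ord("["): 1, ord("]"): -1}.get(tape[ip], 0)
        elif op == "]" and tape[head0] != 0:      # jump back to the matching [
            depth = 1
            while depth and ip > 0:
                ip -= 1
                depth += {ord("]"): 1, ord("["): -1}.get(tape[ip], 0)
        ip += 1                                   # any other byte is inert data
    return bytes(tape)

def soup_step(soup):
    """One 'reaction': pick A and B, smush into AB, execute, split back in place."""
    i, j = random.sample(range(len(soup)), 2)
    combined = run_bff(soup[i] + soup[j])         # exec(AB), rewriting itself as it goes
    soup[i], soup[j] = combined[:TAPE_LEN], combined[TAPE_LEN:]

# Primordial soup: nothing but random bytes. No fitness function anywhere.
soup = [bytes(random.randrange(256) for _ in range(TAPE_LEN)) for _ in range(1024)]
for _ in range(100_000):   # a token stew; the real runs go for millions of interactions
    soup_step(soup)
```

Note that there’s no fitness function anywhere in that loop; the only “selection” comes from tapes overwriting each other while competing for space and execution time.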
What’s that? You don’t see why that should happen? Don’t worry about it; neither do the authors:
“we do not yet have a general theory to determine what makes a language and environment amenable to the rise of self-replicators”
They explored the hell out of it, though. They ran their primordial soups in a whole family of “extended Brainfuck languages”; they ran them under Forth; they tried them out under the classic 8-bit Z80 architecture that people hold in such nostalgic regard, and under the (almost as ancient) 8080 instruction set. They built environments in 0, 1, and 2 dimensions. They measured the rise of diversity and complexity, using a custom metric— “High-Order Entropy”— describing the difference between “Shannon Entropy” and “normalized Kolmogorov Complexity” (which seems to describe the complexity of a system that remains once you strip out the amount due to sheer randomness[2]).
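A quick aside on that metric, because it’s less exotic than it sounds: Shannon entropy is cheap to compute over the bytes in the soup, while Kolmogorov complexity isn’t computable at all, so in practice you approximate it with a compressor. Here’s a back-of-envelope version that is entirely my own construction (zlib standing in for Kolmogorov, crude per-byte normalization), not the authors’ estimator:

```python
import math
import zlib
from collections import Counter

def shannon_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte-frequency distribution, in bits per byte (0..8)."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def kolmogorov_bits_per_byte(data: bytes) -> float:
    """Uncomputable in principle; a compressor's output size is the usual cheap proxy."""
    return 8.0 * len(zlib.compress(data, 9)) / len(data)

def high_order_entropy(data: bytes) -> float:
    """Shannon entropy minus the (approximated, normalized) Kolmogorov complexity.
    Near zero for pure noise (both terms ~8) and for dead-trivial soups (both ~0);
    it climbs when the soup looks diverse byte-by-byte yet compresses well,
    i.e. when the apparent randomness is really structure."""
    return shannon_bits_per_byte(data) - kolmogorov_bits_per_byte(data)

if __name__ == "__main__":
    import random
    noise = bytes(random.randrange(256) for _ in range(4096))
    one_replicator_many_copies = bytes(random.randrange(256) for _ in range(64)) * 64
    print(high_order_entropy(noise))                       # ~0: random, no hidden order
    print(high_order_entropy(one_replicator_many_copies))  # higher: diverse bytes, deep repetition
```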
They did all this, under different architectures, different languages, different dimensionalities—with mutations and without—and they kept getting replicators. More, they got different kinds of replicators, virtual ecosystems almost, competing for resources. They got reproductive strategies changing over time. Darwinian solutions to execution issues, like “junk DNA” which turns out to serve a real function:
“emergent replicators … tend to consist of a fairly long non-functional head followed by a relatively short functional replicating tail. The explanation for this is likely that beginning to execute partway through a replicator will generally lead to an error, so adding non-functional code before the replicator decreases the probability of that occurrence. It also decreases the number of copies that can be made and hence the efficiency of the replicator, resulting in a trade-off between the two pressures.”
I mean, that looks like a classic evolutionary process to me. And again, this is not a fragile phenomenon; it’s robust across a variety of architectures and environments.
But they’re still not sure why or how.
They do report one computing platform (something called SUBLEQ) in which replicators didn’t arise. They suggest that any replicators which could theoretically arise in SUBLEQ would have to be much larger than those observed in other environments, and that this could be a starting point towards developing “a theory that predicts what languages and environments could harbor life”. I find that intriguing. But they’re not even close to developing such a theory at the moment.
Self-replication just—happens.
It’s not an airtight case. The authors admit that it would make more sense to drill down on an analysis of substrings within the soup (since most replicators are shorter than the 64-byte chunks of code they used), but because that’s “computationally intractable” they settle for “a mixture of anecdotal evidence and graphs”—which, if not exactly sus, doesn’t seem especially rigorous. At one point they claim that mutations speed up the rise of self-replicators, which doesn’t seem to jibe with other results suggesting that higher mutation rates are associated with a slower emergence of complexity. (Granted, “complexity” and “self-replicator” are not the same thing, but you’d still expect a positive correlation.) As of this writing, the work hasn’t yet been peer-reviewed. And finally, a limitation not of the work but of the messenger: you’re getting all this filtered through the inexpert brain of a midlist science fiction writer with no real expertise in computer science. It’s possible I got something completely wrong along the way.
Still, I’m excited. Folks more expert than I seem to be taking this seriously [www.artificiality.world]. Hell, it even inspired Sabine Hossenfelder [youtube.com] (not known for her credulous nature) to speculate about Maelstromy scenarios in which wildlife emerges from Internet noise, “climbs the complexity ladder”, and runs rampant. Because that’s what we’re talking about here: digital life emerging not from pre-existing malware, not from anarchosyndicalist script kiddies—but from simple, ubiquitous, random noise.
So I’m hopeful.
Maybe the Internet will burn after all.
The paper cites Tierra and Core Wars prominently; it’s nice to see that work published back in the nineties is still relevant in such a fast-moving field. It’s even nicer to be able to point to those same call-outs in Maelstrom to burnish my street cred. ↑
This is a bit counterintuitive to those of us who grew up thinking of entropy as a measure of disorganization. The information required to describe a system of randomly-bumping gas molecules is huge because you have to describe each particle individually; more structured systems—crystals, fractals—have lower entropy because their structure can be described formulaically. The value of “High-Order Entropy”, in contrast, is due entirely to structural, not random, complexity; so a high score means more organizational complexity, not less. Unless I’m completely misreading this thing. ↑