
The Delirium of Reason: On William Gibson's Neuromancer (1984)

"Your business is to learn the names of programs, the long formal names, names the owners seek to conceal. True names."

Underneath all reason lies delirium, Deleuze and Guattari once said.

William Gibson's Neuromancer (1984) is not just a rightfully celebrated work of science fiction but perhaps the defining delirium underneath our own contemporary reason.

Accelerationism Avant la Lettre

The novel is a fast, choppy, and frantic adventure through a world where humans and technologies are hard to distinguish.

Yet this work, which famously popularized the term “cyberspace,” did so at a time when the “word processor” was a highly advanced consumer technology.

At its core, Neuromancer is a heist story, a subgenre of crime fiction in which a motley crew is assembled to pull off a grand caper. The protagonist, Case, navigates cyberspace with great excitement but also a constant, nagging skepticism, a paranoia not unfamiliar to us today.

It must be the most influential sci-fi novel of the 20th century.

A great deal of AI discourse today is running on tracks laid by this book. The entire AI Safety memeplex—from Bostrom's Superintelligence (2014) to Roko’s Basilisk to Sam Altman’s corporate reassurances—is, in some sense, only a formalization of a creative and artistic image proposed by Gibson.

Two AI entities, Wintermute and Neuromancer, embody contrasting perspectives on artificial intelligence: one devoid of personality, manipulating human memories as a mask; the other possessing a distinct personality but resisting unification.

Wintermute is the unspoken conceptual persona behind, and object of, the AI Safety debate today. Whereas Bostrom and Yudkowsky and co. are academically dissimulated Gibsonians, Nick Land and the CCRU are obvious and explicit Gibsonians. If you think it would be fun to live through Neuromancer, you’re probably an accelerationist. If you think it would be sad, you’re probably an AI Safety proponent.

The specific scenario of an AI escaping its box is certainly fascinating and worth studying, but what reading Neuromancer has taught me is that this is not itself a hard-nosed scientific problem—as many people claim it is, especially on the decelerationist side.

No scientist or engineer has ever scientifically validated the existence of a machine intelligence trying to escape its box.

It was a great artist who gave us this image of AI. Not a rationalist philosopher or scientific engineer.

Information Asymmetry is the Obverse of Infoglut Anarchy

“You can’t let the little pricks generation-gap you,” Molly said.

When anyone can access all of the world's knowledge and opinion, potential access to the truth increases absolutely, but exposure to noise, guesswork, gossip, and downright malicious manipulation increases to a far greater degree.

In short, common sense and ancestral heuristics become losing strategies, for they are precisely the most predictable and most exploitable patterns of thought.

Whereas Gibson's characters deal coldly and maturely with the problem of worsening information asymmetries, understanding them as structural, game-theoretic variables to be calculated and played, our culture remains at the level of hypermoralistic, partisan panics.

For Case to get back into cyberspace, he has no choice but to give Armitage write access to his meat vehicle. Armitage might be nefarious; Case cannot know for sure what Armitage will or will not do to his body, but this dependence and vulnerability are structural. It’s not something to protest; it’s just something to calculate—and handle.

It is because most people today lack Case’s cool detachment that our society is now vulnerable to unprecedented systemic convulsions. Compare Case’s medical predicament to the medical predicament we’ve all faced recently.

Given instant access to all the world's knowledge and opinion, the emergence of a novel systemic threat like a pandemic is not easier but harder to navigate. The reason is that anything new, serious, and not amenable to direct, personal testing ultimately comes down to trusting someone or something else.

You can fancy yourself an independent cowboy deciding everything for yourself, but—like Case—you are still embedded in a game with third parties possessing asymmetric information and perverse incentives.

With COVID, you either deferred to establishment "science" (despite the asymmetries) or you deferred to a dissenting "information dealer."

The vaccinated may have been secretly implanted with the Bill Gates microchip, yes, but the unvaccinated are just as vulnerable: They may have been secretly implanted with a novel language virus. As Burroughs said, after all: Language is a virus.

Neuromancer shows us that modern Western publics have not risen to the digital challenge since 1984. In the face of technological acceleration, we have mentally regressed to a point worse than infantile. On the Left and the Right, public discourse is significantly less sophisticated than novels written in the 1980s.

COVID was to our epistemology what WWII was to our physics: a crisis that incentivized the production of unprecedented weapons systems.

Assume that everyone you know has already been blackmailed by a nefarious AI, unless they've taken drastic measures to demonstrate otherwise or you have taken drastic measures to verify otherwise.

There's nothing to protest, only new games to play.