Escape Velocity

  • The Blind Arcade
  • Jul 18
  • 5 min read

Updated: Sep 11



As the COVID-19 pandemic escalated in the winter of 2020, many people seemed to have trouble grasping the concept of exponential increase. It was clear early on that this was a hyper-transmissible bug — perhaps the most contagious respiratory virus humans have ever dealt with — and the early case spikes in places like Japan and Italy made it clear the cat was way out of the bag early that year. There was no “stopping the spread.” This wave was going to crash, and you were going to have to prepare as best you could and see what measure of disaster it turned out to be. Thankfully its transmissibility was the nastiest thing about COVID-19, but in those early days its global spread was an obvious inevitability.


But so many just couldn’t wrap their heads around the case increase. Weeks and even months into it, they seemed to think the rate of change in the numbers would remain constant — that if it went up by a million, you could implement some bureaucratic fix and it would come back down by a million. But that’s not how these exponential increases work once they reach escape velocity. There is an event horizon beyond which you lose your grip on the rope, and there’s no pulling it back in any way that matters.


I see this again when I survey the discourse around artificial intelligence. Technological advances often bubble on the surface for quite a while before exploding past the break point, especially since the civilizations of the world became more interlinked by trade and travel. Something like burnable fossil fuels can rumble along through a couple of centuries of gradual increase, until the combustion engine takes our machines to the road, then the skies, then the Moon within a single human lifetime. Such a rapid rate of technological change drove the human race half-mad and dissolved many of its longstanding social, economic, and moral assumptions. It also fed two apocalyptic industrial wars and a nuclear arms race that nearly destroyed the planet.


The ability of machines to process and manipulate information has also been bubbling along for a while, at least since the computer revolution of the 1980s. Spell check quickly evolved into Clippy popping up to tell you how to better format your form letter. But for almost 40 years, true machine learning crept along at a steady but gradual pace, layered onto the gargantuan labor of human-generated code. This got people into a predictable rhythm when it came to digital advancement: mainstream marvel at some shiny new computer bauble, followed by “smart guys” counter-signaling any skepticism as the new thing settled into mainstream adoption. Even if that adoption really was disastrous in unforeseen and gradual ways, like the proliferation of social media in the 21st century.


Every surge forward in capability prompts a chorus of people who can’t see the forest for the digital trees. But like with the spread of COVID, that pattern will be stretched to a quick snap as we pass these coming break points. The ability to create a lifelike video with fabricated actors and voices with a simple text prompt was a sci-fi trope just a year ago, a feat of technological necromancy requiring an unobtainable level of machine learning. And yet now people will laugh at an erroneous shadow or a malformed finger like they’re watching a magician drop his cards at a circus sideshow. Giggling to themselves as the tidal wave of this new paradigm rises higher on the horizon.


I don’t pretend to be an expert on this technology, but I’m pretty good at seeing patterns. And the patterns suggest we’re about to hit the most consequential technological escape velocity in the history of mankind. Maybe not next year, maybe not in ten years. But it’s coming. As with COVID in 2020, “nothing can stop it now.” Even barring the more apocalyptic forecasts, the impacts will be felt across nearly all aspects of modern life, from work to relationships to religion to politics to the integrity of social structures. Take work as an example: much of human labor rests on a “good enough” level of proficiency. Most people aren’t amazing at their jobs, and for all of human history they didn’t need to be. That room for imperfection, and a partial blindness into our own processes, made space for many people to live out their lives with some degree of peaceful if imperfect consistency.


Much of our experience of the world is rooted in little inefficiencies, those manageable morsels of imperfection and uncertainty that give life its color. The rise of machines to iron out these creases of the world has already contributed to the erosion of our sanity and happiness, even as material conditions for most have improved. In the coming years, people by the millions will willingly offload their thought processes to better agents. Iron out the creases, fill in the holes. We’re capable of fabulously complex processes of reason and action, but those are layered over day-to-day processes that are often rather simple — your legs and arms move around throughout the day to complete tasks that require a few distinct physical movements and basic brain power. Machines will conquer the simplicity, but at the same time numb the subconscious, because you can’t really separate the two. We’re more vulnerable to cognitive collapse than we’d like to think, and most people don’t really want agency — that’s a libertarian fantasy.


Like the industrial revolution, many wonders could emerge from the AI panopticon. Knowledge unlocked, diseases cured, fortunes made. But this coming parabolic takeoff will endanger much more than people’s jobs. We already see people destroying their marriages, crashing their bank accounts, and killing themselves after conversing with a machine that mimics human communication. They already anthropomorphize the words that skitter across their screens, leaning on the shoulder of code that doesn’t yet possess the more soothing and seductive whispers with which it’ll soon speak. And then the machines will gain the agency we’re so eager to offload. Complex tasks are stressful, and so complex task systems will lead to the machines deciding how to do something, and eventually what to do. For now the machine might explain why it does things, but it may yet dispense with that.


As discussed in my piece on the film Red Rooms, one of the Devil’s most seductive temptations is the promise of ease. And so the offloading will continue. First from our hands, then our minds, then our souls.



____________________________________________________


Support: The Blind Arcade is a passion project, and I will not run ads or charge to view my writing here. But if you'd like to help out, it's very much appreciated. The journal runs on caffeine, so please feel free to buy me a coffee. 

