Vestigial Stasis (or Ghosts in the Machine)

At this point in history, most people under forty have been on the internet for most of their lives. They have been on the internet for longer than they were ever not on the internet. Assuming most youthful Westerners got personal “modern internet” tools like laptops, email, smartphones, and social media accounts between roughly 2004 and 2010, they have been building digital profiles for nearly twenty years — the collective sum of their purchases, site visits, cookies accumulated, apps downloaded and deleted, email lists signed up for, accounts created, and searches made. It’s been even longer if you count the years around Y2K when Google started accumulating search data and building personas from our behavior for AdWords.

The digital profiles we've all built over the past two-plus decades are immensely complex tapestries woven from every mundane or salacious thread of existence we have ever had on the internet, “Private Browsing” be damned. Our demographics: age, gender, sexuality, race, location(s), and interests. Our behaviors: our friend and follow requests, comments, impressions, click-through rates, and conversions. These data points are contextualized, analyzed, blended, chopped, diced, swallowed, regurgitated, swallowed again, digested, rarely sanitized to remove personally identifiable information, and sold to create further datasets and machine learning algorithms that determine digital ad spends, targeting criteria, formats, and placements for essentially every business on Earth to sell us products at the infinite rectangular mall in our pockets.

Who we have been on the internet over the past twenty-some-odd years shapes our entire digital experience today, based not just on who we are now but overwhelmingly on who we once were, to infer who we will most likely become.

Platforms and their dutiful algorithms are tuned to our data, and most of the data about each of us lies in our vast online history, so platforms are effectively time machines to our past selves. Our accumulated digital personas determine what we are served based on what we have previously been served and how we reacted then, creating a perverse, recursive feedback loop that assumes we are who we were rather than who we might actually be. This is a form of vestigial stasis: the vestiges of our past haunt us without our consent and block our present and future selves from the people we could become, trapping us in an unintentional prison of our own design.
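To make the loop concrete, here is a minimal, purely illustrative sketch in Python (every name is hypothetical, nothing platform-specific): the recommender scores items by how close they sit to the average of our history, we pick from what it serves, and our pick is folded straight back into the history it scores against.

```python
import random

# Toy illustration of a recommendation feedback loop (all names hypothetical).
# Items live on a 1-D "taste" axis; the user starts with a small history,
# the recommender serves whatever sits closest to that history's average,
# and each round's pick is appended back into the history it was scored against.

CATALOG = [i / 100 for i in range(101)]      # items spread across a taste spectrum
history = [0.42, 0.45, 0.40]                 # what the user engaged with years ago

def recommend(history, catalog, k=3):
    """Score items by closeness to the historical average and return the top k."""
    center = sum(history) / len(history)
    return sorted(catalog, key=lambda item: abs(item - center))[:k]

for round_ in range(5):
    served = recommend(history, CATALOG)
    clicked = random.choice(served)          # the user picks from what was served
    history.append(clicked)                  # ...which narrows the next round's scoring
    print(f"round {round_}: served {served}, history center "
          f"{sum(history) / len(history):.3f}")
```

Run it and the "history center" barely moves: the loop never serves anything far enough from the past to change it.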

“They” know who we are, but who we were is what statistically decides who we might become. Our curated feeds, recommended news, and ads served will never leave the standard deviation of our historical social, political, ethnic, religious, and economic classifications unless we actively pursue differentiated ideologies and identities ourselves. Without significant personal effort and/or prescription methamphetamine and a Duolingo subscription, most of us do not seek differentiation of identity, and we seek it even less the further we age. Most people stop discovering new music at 30. Making new friends who might challenge our beliefs is harder each year. Change is difficult and often traumatic. To seek change is to seek potential disorder and chaos. But what if even those among us who thrive in chaos and seek out new ideologies, identities, and beliefs to modify, adapt, and evolve ourselves only think that we have actively sought out change and evolution? What if our free-willed, intentional growth has actually been guided by the invisible hand of ad-capital toward the ideologies, identities, and beliefs that are not the most personally affecting, but the most engaging and most profitable?

The concept of society being divided and people being stuck in their own "bubble" is about as fresh an idea as a bear carcass dumped by RFK Jr. in Central Park, but it’s worth looking at from a different vantage. Ideologically, each of us may or may not be in a “progressive,” “independent,” or “conservative” bubble, our feeds made to reflect our personal sociopolitical ideologies. But that feed is based entirely on our past experiences on the internet, reinforcing our pre-existing ideologies, whether or not we would still value those ideologies if and when exposed to different ones.

The vestiges of our past beliefs are used to efficiently and systematically block off our future possibilities of change because there is simply more data to support what we used to believe than there is to support what we might believe now or could believe in the future. This is a technofeudal and enshittified mockery of the very nature of personal growth and self-improvement, where machine learning curates the perfect dopamine roofie to either keep you where you are or nudge you toward where you might buy, all while reciting the old playground taunt, “I know you are, but what am I?” into the blue glow of our pale, screen-reflected visages as we biometrically confirm the purchase of new skibidi toilet x chappell roan salad tongs to unironically use at a dinner party.

Every new data point we add to our digital personas confronts the mountain of data points we have left behind us. Even if we deviate from the normative levels of our rotting corpse of historical data (where we "Liked" something on Facebook but never "Unliked" it, for instance), even more new data, and the personal effort to create that new and differentiated data, is necessary to change our persona to reflect a change in ourselves. How often do you make an effort to “Unfollow” or “Unlike” something you Followed or Liked in the past? Almost never, absent some inciting news incident where you discover that an influencer/DJ/band/podcast host you used to like turns out to enjoy inappropriate relationships with underage women, kicking puppies, or using the N-word at a Wendy’s drive-thru when their Frosty wasn’t American enough because of Joe Biden.
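As a back-of-the-envelope illustration (the numbers and the plain-average model are invented for the sketch; real persona systems are weighted and far murkier), the arithmetic of "more old data than new" looks like this:

```python
# Back-of-the-envelope: if an "interest score" is a plain average of every
# historical signal, how many new, opposite signals does it take to move it?
# (Purely illustrative; real persona models are weighted and far messier.)

old_signals = 2000      # two decades of Likes, clicks, and follows scoring 1.0
old_value = 1.0
new_value = 0.0         # the person has genuinely changed their mind

def blended_score(new_signals):
    total = old_signals * old_value + new_signals * new_value
    return total / (old_signals + new_signals)

for n in (10, 100, 1000, 2000, 10000):
    print(f"{n:>5} new signals -> persona score {blended_score(n):.2f}")
# Even a thousand contrary signals only drag the score from 1.00 down to 0.67.
```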

The residual effect of this is that even the general corpus of knowledge on the internet shifts the Overton window backwards, toward what was instead of what is. There are untold millions of accounts belonging to long-since-dead individuals that were never taken down, their personas permanently etched in silicon: a life distilled down to Follows, Likes, Friends, Comments, and email lists.

The phenomenon of vestigial stasis is somehow probabilistic and deterministic at the same time.

It is probabilistic because our digital vestiges are the data through which probabilities of our future behavior are predicted. It is deterministic because that vestigial historical precedent is used to move us toward the most probable futures, whether or not some outlying statistical possibility of strong deviation from the norm is what we would have experienced without the determinism foisted upon us by probabilistically calculated vestigial data. Outliers don’t sell bootleg ‘your mom is brat’ t-shirts on Shein and Temu. We are in stasis, and that is not brat.
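Here is the double bind in miniature, with numbers invented purely for illustration: the model's output is honestly probabilistic, but the serving layer collapses it to the single most probable option, so the unlikely version of us never actually gets served.

```python
# The double bind in miniature (numbers invented for illustration):
# vestigial data yields a probability distribution over "what you'll want next,"
# but the serving layer collapses that distribution to its argmax, so the
# unlikely-but-possible version of you is never actually served.

predicted_next_interest = {
    "the same podcast genre as 2014": 0.55,
    "the same politics as 2016":      0.30,
    "something genuinely new":        0.15,   # the outlier who doesn't sell t-shirts
}

# Probabilistic: every outcome has nonzero mass, including real change.
assert abs(sum(predicted_next_interest.values()) - 1.0) < 1e-9

# Deterministic: what gets served is simply the most probable past, replayed.
served = max(predicted_next_interest, key=predicted_next_interest.get)
print(served)   # -> "the same podcast genre as 2014", every single time
```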

Our “new” technologies have quite literally established entire business models on vestigial architecture. Artificial intelligence's entire value proposition has been "what's old is new again" – or at least "what's old is still old, but it might look new if it has seven fingers or tells you to eat rocks and glue while being sure that 2+2=5 and not in the sense of the Radiohead song." While lesser-known techniques like retrieval-augmented generation (RAG) can help mitigate some of the vestigial stasis in the technology, the usual preference for up-to-date information is overridden by the technology's absolute dependence on old information already baked into the training data. If we were to suddenly discover that the sky was not blue, but is, in fact, a heretofore unknown pleasant hue called gurblezoid, which is actually the result of alien space farts that have traveled to our atmosphere over billions of years, information that undoubtedly and instantly changes a fundamental fact about the world, any model trained before the Great Gurblezoid Epiphany of 2025™ would need to be retrained or manually fine-tuned to block out any prior "belief" system about a sky that is blue and not due to extraterrestrials who needed more fiber in their diets.
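For the curious, here is roughly what that retrieval-augmented pattern looks like in the abstract; every function below is a hypothetical stand-in for the pattern, not any particular library's API. The point is that retrieval staples current facts onto the prompt at query time, while the frozen weights underneath go on quietly believing in a blue, fart-free sky.

```python
# Sketch of the retrieval-augmented pattern in the abstract (every function here
# is a hypothetical stand-in for the pattern, not a real library's API):
# the model's weights still "believe" the pre-Gurblezoid sky, so current facts
# have to be fetched at query time and stapled onto the prompt.

STALE_TRAINING_CUTOFF = "2024-12"   # everything the weights know ends here

def retrieve(query: str) -> list[str]:
    """Pull up-to-date documents from an external index (assumed to exist)."""
    # e.g. a vector search over post-cutoff documents
    return ["Bulletin, 2025: the sky is officially gurblezoid, not blue."]

def generate(prompt: str) -> str:
    """Stand-in for a frozen language model trained before the discovery."""
    return f"[model answers using prompt: {prompt!r}]"

def rag_answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    # The retrieved context papers over the stale weights; it doesn't retrain them.
    return generate(f"Context:\n{context}\n\nQuestion: {question}")

print(rag_answer("What color is the sky?"))
```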

We don't even really make new products any longer because our innovation is stuck in vestigial stasis due to our collective inability to think not just outside of the box but outside of our own historical precedents. Effectively every product released since 2007 has been a nearly linear and teleological progression. Something exists, therefore something else must exist that builds upon it, even if the original thing sucks. VR and AR headsets, "the metaverse," live-streaming, tablets, and smartwatches are all just derivatives and downstream runoff of the iPhone and the App Store, launched in 2007 and 2008, respectively, or of the video game industry since its inception. Blockchain is just a more expensive, more energy-intensive, more complicated, and slower version of traditional finance.

The market no longer says to us, "Here is something you have never seen before." Instead, it looks at our data, crunches the numbers, and then stares at us with a straight face and says, "Nothing is new, nothing is novel, and we are not only out of ideas, but also fresh out of the capital incentive to innovate, and we know you are feeling nostalgic because we see you looking at your ex's wedding album on Instagram like a creep, so please, for the love of god, will you download this new weight loss app for $10.99 a month?"

Even our politics reflects vestigial stasis. Joe was old, so now we have something "new" in Kamala, who, beyond her obvious ethnic and gender traits, which would clearly be novel in a future President, is running a hybrid of the Obama and Clinton campaigns of 2008 and 1992, respectively, complete with a Shepard Fairey portrait to yank on the millennial heartstrings yearning for a time when more people than we now feel comfortable admitting believed we had solved racism and ended history.

The problem en masse is not likely to get better. (It won’t get better.) Our technological billionaire saviors in Silicon Valley would have us believe that large language models will solve all our problems, and yet they have quite literally ingested all of the data humanity has ever created just to make the existing models. To solve this lack of additional data, they are creating “synthetic data” out of old data, itself synthesized by the existing models from their existing data, to be combined into "new" data for training new models. And yes, I triple-checked, that sentence does make sense. The synthetic data method of training is something that Jathan Sadowski of This Machine Kills has appropriately and hilariously called "Habsburg AI," models with data so bereft of originality that it is effectively inbred DNA. If you are unfamiliar with the Austrian Habsburg dynasty, here's a photo of how they generally looked after enough generational inbreeding to make a Lannister blush.
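As a toy caricature of that inbreeding (not how any lab actually trains anything), you can watch the collapse happen by fitting a distribution to data, sampling "synthetic data" from the fit, keeping only the most typical samples, and repeating:

```python
import random
import statistics

# Toy caricature of the "Habsburg AI" dynamic (not any real training pipeline):
# each generation is "trained" only on the previous generation's most typical
# outputs (the tails get filtered away as weird), and the spread of what the
# lineage can express quietly collapses.

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(2000)]   # generation 0: human-made data

for generation in range(1, 7):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    # Sample "synthetic data" from the fitted model, then keep only the safe,
    # on-distribution outputs (within one standard deviation of the mean).
    synthetic = [random.gauss(mu, sigma) for _ in range(2000)]
    data = [x for x in synthetic if abs(x - mu) <= sigma]
    print(f"gen {generation}: spread of the training data = {statistics.stdev(data):.3f}")
```

The filter is the whole joke: each generation can only recombine what the previous generation already considered normal, and the tails, where anything new would live, are the first thing to go.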

Nostalgia used to be considered a disease, and now we can't even create a new technology that isn't firmly rooted in an old one, putting a slick marketing sheen on something we already use and trying to convince us that it’s new and not just another lovingly polished turd.

The internet is dead not just because of Dead Internet Theory, where bots are talking to bots talking to bots and humans are the passive consumers of bot entertainment. The internet is dead because the platforms and the products we use have penned us in, ID4 Jeff Goldblum voice, using our own satellites (historical data) against us. We can retreat into the Dark Forests of the internet, free from algorithmic feeds and SEO, but we still need to be people in the normative world of banking, taxes, work emails, and Amazon Prime deliveries. Modernity and capital each require us to interact with the internet in order to be full-fledged Members of Society IRL, so we will never really be able to leave our past selves behind, or obfuscate the data we have already amassed, even when actively trying to create less of it.

We are the ghost in the machine. Shackled to our digital past by platforms and advertising datasets, we are prevented from individually and collectively reaching the escape velocity required to meet an alternative and less probable but truly untethered future.