Computational Sovereignty

Originally Published on December 8, 2023

In the 15 years since graduating college with a degree in Political Science, I can’t say that I have read much Marx. My days as an aspirational leftist ideologue of sorts may not be totally behind me, but I’m also an adult who lives in the real world and understands that no political theories are absolutely true or absolutely false - they are theories. But political theories beget political ideologies, and once removed from the hermetically sealed cleanroom of theoretical philosophy and exposed to the rapid oxidation of reality and the human condition, those ideologies tarnish and rust.

Before friend and comrade Karl published Das Kapital in 1867, he and his then BFF Friedrich Engels formulated the theory of historical materialism. Historical materialism, crudely put, is the theory that humanity’s production (the shit it makes and how it makes that shit) directly influences humanity’s behavior (the shit it does).

Say you (humanity) make a dildo of gold, and this faux golden phallus is phenomenally popular - so much so that every household in the world wants one to put on their mantle and to stick in their nether parts. Because everyone wants one of these shiny objects that looks as good as it feels, massive mining operations spring up to unearth more gold. These mines are worked by underpaid and overworked hard laborers (because obviously) so that the owners of the mines can earn a greater profit margin once they sell the gold to the equally massive factories that have been erected (pun intended?) to turn the gold into the statuesque sex products of such high demand. These factories are also worked by underpaid and overworked hard laborers (also, because obviously). As a result of the gold being mined and the vibes being manufactured by untold numbers of underpaid and overworked laborers, the laborers eventually consolidate and kill the mine owners and factory owners to create a communist utopia where all dildos are required to be silver. Cause and effect. Historical materialism à la gold dildos. I'm really getting good downstream value for my Political Science degree, aren’t I?


In just the past few days, two headlines appeared that I had not accounted for in the initial research for this piece, but they set the stage for everything I want to address here.

Philanthropists offer $30M for more inclusive AI

and

How Nations Are Losing a Global Race to Tackle A.I.’s Harms

It’s fascinating to see mainstream media avoid the most obvious conclusion in stories like these: the private companies creating these systems are as powerful as, if not more powerful than, the governments attempting to regulate them. It’s implied but never stated, probably because we don’t want to scare the living shit out of everyone.

Throughout history, traditional notions of state power have rested on the state’s ability to enforce its power through violence. As Western society has matured, state violence has become more of a bug than a feature of our governing bodies. However we feel about this or that sitting President, Senator, Congressman, or Governor in the United States, so far, we have not had to worry about the violent power of the state just appearing on our doorsteps to enact governance. Many other people around the world don’t have that luxury, but most Western nations do, at least for now, pending a steaming hot potential dictatorship slated to drop on the U.S. in January 2025.

The fear over privately held or publicly traded companies racing to achieve artificial general intelligence is not just fear of harm to the general public; that rationale is extremely diversionary, even though it certainly plays a small part. The fear and the subsequent race to regulate AI systems are primarily rooted in the fact that non-state actors like Apple, Microsoft, Google, Meta, Amazon, TikTok, OpenAI, Anthropic, Stability, etc., are already more influential and powerful than the government in most ways. They have all the information. All of it. They have all our personal data, know all our most intimate secrets, and crucially, they decide who, what, when, and how we are allowed to access vast swaths of information. Governments are now forced to engage with the creators of these technologies not as if they are technology companies but as if they are other, more powerful governments with unlimited nuclear weapons.

This is what I want to focus on: The control and flow of information and how personal sovereignty and autonomy very shortly may become completely dependent on how much computational power a person has access to.

In the absence of actual physical manifestations of violence by a state, information has become the currency of power, and control over that information, and subsequently control over access to that information, has consolidated and coalesced around a small handful of non-state actors.

Look, we fucked up with the Internet. We did. It had promise, but we done did fuck it up.

With other mass communication systems like telephones, radio, and broadcast television, the government owned the literal airwaves and means of communication. Your phone, radio, and TV were merely how we accessed information transmitted by third parties who bought time on government-owned systems. If the government didn’t like it, for the most part, it didn’t happen - except for maybe one time when there was a boob of sorts during the Super Bowl Halftime Show.

This is not the case for the internet, which is what has made it so maddeningly difficult to regulate. Private enterprises have primarily owned the means of transmission, distribution, and consumption for all things digitally connected. Instead of regulation by the state, we have received moderation and terms of use and service from non-state actors, applied basically whenever they are feeling up to it. We have seen how well that has worked - it hasn’t.

Now, we find ourselves at a new precipice of innovation, looking down over a 1,000-foot ledge to see an amorphous blob of data and cybernetic detritus staring back at us with cold, dead, all-knowing eyes saying, “I'm sorry, Dave. I'm afraid I can't do that. Instead, would you like to purchase 48 more rolls of toilet paper? I noticed you were out. Also, you have gonorrhea, and your insurance won’t cover it, so why not itch yourself while watching some Fuckboy Island? I already billed your card for the annual subscription. Btw, you have 563 unread messages.”

Is this actually that far from what your reality is at this moment?

The point is that power dynamics have shifted. Yes, the police will still arrest you for murder, and yes, the IRS will still send you up the river for tax evasion - both state actions - but because regulating technology has been met with a giant shrug emoji by the state, power over information has slipped away, and the governments of the world are frantically trying to claw it back with broken fingers before it’s too late, even though it probably already is.


Okay, so power over everything except death and taxes has been co-opted by Big Tech. So what?


There was a bit of a hullabaloo around OpenAI recently, in case you didn’t notice from the literal wall-to-wall coverage by every news outlet on the planet. But preceding that circus was a Developer Day held by OpenAI the week before, which generated a bunch of hype around custom “GPTs,” which is to say, the ability to “fine-tune” the most powerful generative model in the world for your own purposes. Pretty cool, right? You can upload all of your documents to ChatGPT - all your medical information, all your credit card statements, your journal, whatever you want - right into the servers of OpenAI, and then connect your Gmail and your iCal and ask it to determine where you are going to eat dinner based on your projected availability and spending habits and… Wait a second. This feels different but somehow familiar… Is anyone else getting déjà vu? Is it cold in here, or is that just an ominous cloud of freezing dread settling over the land?

This is a tad hyperbolic, but you can see what I am getting at. A further consolidation of information into a further consolidated information system wholly unregulated by the government.

Yes, the customization is cool. I have been a Plus subscriber since day one, and I will undoubtedly use custom GPTs quite often and have fun doing so, but that is beside the point. The ability to set rules and parameters and “customize” your experience does provide some degree of control to individuals, but even so, the foundational models are created and controlled by large corporations, vastly limiting the scope for genuine individual control. If OpenAI or any other LLM creator decides that something is not within their terms of service, you will not be able to do it, full stop, regardless of how customizable and personalized it may seem on the surface.

The system's function will still be predicated on the type and amount of information you are permitted to both input and access.

This is not individual control. This is not individual sovereignty.


Now, a full-on “marketplace of ideas” like 310n MU$k’s (I refuse to type his name) Grok, where you can make the model go full Alex Jones “the water is making the frogs gay,” is probably a bridge too far. But a tamped-down system that keeps you in your lane like bumpers at a kid’s birthday bowling party likewise dampens individual power, and that is a slippery slope to full informational prohibitions - the information blackouts common in authoritarian states, like the infamous Great Firewall of China. The extent of control that can be exerted over any and all information is absolute. It is not hyperbolic to say that with generative AI systems built by vast corporate entities, what you will or will not be allowed to do will be wholly determined by share prices and a few white guys in San Francisco wearing Patagonia vests. The opportunity for inequitable distribution of information is as vast as the companies that could decide to curtail it.

In theory, there isn’t anything necessarily wrong with this. It doesn’t look good to shareholders if your model allows someone to build a bomb that harms people. Or if your model is fine-tuned (AI-speak for “customized”) and automated to quickly socially engineer someone, or thousands of someones, into giving up all the credentials to their bank account. To be clear, these are both things that have occurred with and without the internet and before generative models were available en masse, but as I have written previously, it is the ease with which these things could be accomplished that is of concern. But where is the line? As John Oliver has eloquently put it recently, the line is “somewhere.”


These models are unrelentingly hungry for energy and computing power, both in their training and in their deployment as consumer and enterprise products. You need untold billions of dollars to run these computational thirst traps. You or I literally cannot create a model as powerful as GPT, Claude, Stability, or Gemini. At a time when inequality is rampant between the top percenters and everyone else, the tools of our future are poised to be trained, built, deployed, and gated behind what I assure you will be increasingly exorbitant subscription prices to access the best models with the best information and the widest latitude for use by the general public.
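The scale problem is easy to make concrete with some back-of-the-envelope arithmetic. This is a rough sketch using illustrative numbers - the parameter count, byte costs, and GPU figure below are assumptions for the sake of the exercise, not specs from any vendor:

```python
def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory, in gigabytes, needed just to hold a model's
    weights: billions of parameters times bytes per parameter."""
    return params_billions * bytes_per_param

# A hypothetical 70-billion-parameter model, roughly LLaMA-scale:
fp16_gb = model_memory_gb(70, 2.0)   # 16-bit weights: 140 GB
int4_gb = model_memory_gb(70, 0.5)   # aggressive 4-bit quantization: 35 GB

# A high-end consumer GPU tops out around 24 GB of VRAM - and this is
# just holding the weights for inference; training multiplies the
# footprint again for gradients and optimizer state.
print(f"fp16: {fp16_gb} GB, int4: {int4_gb} GB")
```

Even quantized down as far as it will reasonably go, a model at that scale outruns a single consumer card before you run a single prompt, and training it from scratch is orders of magnitude beyond that - which is exactly the asymmetry between us and the companies building these systems.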


This is where historical materialism can provide some potential insight into what may come next. Venture capital and the largest, most information and cash-rich organizations ever conceived in the history of mankind are making something that is already changing the way we behave, and it has only been a year since this new computational gold rush began. How we write, how we make art, how we communicate, how we do our work, how we practice medicine. The list goes on. These new and powerful systems are touching every part of our everyday lives, and even if you are reading this and thinking that it’s not yet touching you, I can assure you that somewhere downstream, it already is because capital compels it to.

As money pours in by the billions, massive data server farms are being built to house and maintain these systems, which require GPUs made effectively by one company, Nvidia. These Nvidia GPUs are so in demand and so scarce that they have fed genuine geopolitical instability - between the U.S. and China, for instance, the potential for actual kinetic warfare has increased because the U.S. has barred the sale of some of these chips and systems to China. The systems, the servers they run on, and the chips that power their training and deployment are causing renewed interest in nuclear power, which many argue is the only way to continue increasing the power and complexity of these systems at scale - and scale is a word you can say to any Silicon Valley VC that will immediately make them ejaculate.


If these systems are so powerful and complex that the Cold War-style race between non-state actors to build them could provoke actual nation-state war, reshape economies, and upend the power grid, why should it be expected that the diffusion of these systems to the general public will automatically remain even?


If artificial general intelligence were created tomorrow, or even an approximation of one so powerful and multi-modally capable that it can do whatever you ask of it, do you think it would overcome the obstacles of inequality already baked into our economic systems and be available for you or me to use? Of course, it won’t. The costs are too high for such a system to be rationally deployed, in the eyes of capital, to anyone but the highest possible bidders, who also happen to be the people building these systems and, in turn, the people who finance them.

The opportunity for the greatest exacerbation of wealth inequality we have ever seen lies in the inequitable distribution of the most powerful information tools we can conceive of, controlled by what is effectively a baker’s dozen of companies who will own them and decide what they cost and what the barriers to entry are. When the tools of power are no longer physical objects (i.e., weapons of war) but data-driven artificial intelligences, and access to those tools is controlled by a sub-atomically fractional number of elites, those who can afford the high-cost computation will further entrench - and are currently entrenching - their positions at the top of the global socio-economic food chain. This concentration of power and resources threatens to perpetuate and exacerbate existing inequalities and to create new ones heretofore seen only in dystopian science fiction movies.


What difference will it make if we plebs can create custom GPTs or train a local model like LLaMA on our personal computers if the people who grant us access to those tools are, in turn, able to talk and collaborate with what is effectively the definition of God? What sort of personal autonomy and sovereignty does any single person have in the face of an omniscient intelligence trained on the near totality of human knowledge, controlled by a handful of people who will never grant them access to it?

We are already having our data collected, processed, repackaged, and sold back to us as subscriptions to AI systems, all while being told how we may use them by the people who are generating untold profits from our personal information, art, academia, science, and literally every other category of information humanity has ever devised.

To say it is crucial to devise some way to democratize access to computational resources and the wealth they generate is not only the understatement of the century but maybe of all human history.

The relationship between power and individual sovereignty is no longer just about who can imprison you for murder or tax evasion. The power to control individual sovereignty lies in the hands of those who control the means of computation and the means to hand out information derived from that computation like breadcrumbs to ducks at the lake, although I’m told you are no longer meant to feed ducks bread, which is disappointing to me.

What behaviors will these newly manufactured computational powers engender? Will the computationally deprived “seize the means of computation,” as Cory Doctorow recently analogously phrased it? Time will tell, but the implied end will certainly reflect the making and means, and neither point to a historically materialistic happy ending for those of us not eager to hoard enough information and microchips to recreate or revive society’s dead Gods.

wtflolwhy :: December 8, 2023