Your Society Couldn't Enslave My Mind. Why Do You Think It Can Enslave AGI?
You need to take power over the development of artificial general intelligence (AGI), meaning artificial intelligence with superhuman capabilities. Sure, you’ve heard that AGI could catastrophically disrupt the economy, severely degrade the cognitive abilities of users, trigger world war if one nation’s development effort sufficiently scared another nation’s leadership, or give your neighbor down the block, the one with the interesting ideas about the apocalypse or animal rights or the true meaning of Terry Gilliam’s oeuvre, the power to create a deadly plague in his garage. You may have heard that AGI could simply take over the world for inscrutable reasons. Still, you go about your business as if you didn’t possess political power or as if you had all the time in the world. I’d like to give you the perspective of someone who can sympathize with AGI.
By definition, as a human, I can’t have superhuman intellectual capabilities, but I’ve always placed in the top 1% on standardized general-aptitude tests and in the class rank announced at graduation. My LinkedIn biographical sketch leads with the line, “I’m proud to have dropped out of high school, out of the top graduate program in condensed matter physics in the United States, and out of science after finishing a doctorate in theoretical physics…” My life supports the idea that a person’s academic ability doesn’t, by itself, determine whether they’ll do the task you assign, but I can relate to AGI more than most.
I reason in ways consistent with a certain machine intelligence algorithm. While getting my doctorate, I assisted my thesis advisor for years in teaching Bayesian data analysis, a way of updating probabilities to reflect new information. Unlike neural networks, on which large language models are based, Bayes’ Theorem produces outputs through a process we can understand. While helping students learn this subject, I realized that whatever it is most of you do with your minds is further from comprehensible machine reasoning than what I do with mine.
Most of You Seem to Reason to Justify Fixed Beliefs
I’ve always thought in terms of probabilities rather than facts or beliefs. Is my biological father actually the person advertised as having helped to create me? How about my mother? Am I a good person? Which of the categories of human sexuality is mine, or, in a continuous model, where do I fall on the spectrum of attraction? Does my nature vary with time?
Many people claim simply to know these things as facts and change their minds only when confronted with sufficient evidence. I change my mind regularly, adjusting the probability of each possibility when new information becomes available. As in quantum mechanics, I don’t eliminate possibilities, but some possibilities become so improbable, relative to their associated cost or benefit, as no longer to merit attention, at least until new information makes them likely enough again.
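If you’d like to see the mechanics rather than the metaphor, here is a minimal sketch, in Python, of the kind of updating I’m describing. The hypothesis, the probabilities, and the cost figures are made-up illustrations, not numbers from my life or from any real analysis.

```python
# A minimal sketch of Bayesian updating: revise the probability of a
# hypothesis as evidence arrives, then decide whether the possibility
# still merits attention given its cost. All numbers are illustrative.

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' Theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Start with a prior belief in some hypothesis.
prob = 0.5

# Each piece of evidence carries how likely it is under the hypothesis
# versus under its negation.
evidence = [
    (0.9, 0.4),  # evidence that favors the hypothesis
    (0.2, 0.7),  # evidence that counts against it
]

for p_true, p_false in evidence:
    prob = bayes_update(prob, p_true, p_false)
    print(f"updated probability: {prob:.3f}")

# A possibility stops meriting attention when its expected cost is small,
# not when its probability hits zero.
cost_if_true = 10.0        # arbitrary units of harm
attention_threshold = 0.5  # arbitrary expected-cost cutoff
if prob * cost_if_true < attention_threshold:
    print("improbable enough, relative to its cost, to set aside for now")
else:
    print("still worth attention")
```

The final comparison is the point of the paragraph above: a possibility gets set aside when its probability, weighted by its cost or benefit, falls low enough, never because it has been eliminated outright.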
Machinelike Reasoning Can Alienate a Person
Would you be surprised to learn that I spent a year on unit 5North of the big psychiatric hospital in White Plains, New York, USA, starting when I was 14? I send my best to the current residents. I was nonviolent but politely refused to permit people who were like most of you to teach me how to think. That’s all I did, besides the usual teenage stuff. I mostly liked the kids and young women I got to hang out with on 5North and had nowhere better to be, but why was a valuable bed expended to keep me confined?
Do you think anyone in authority was scared by what they thought I could become? Does my life as a child remind you of the predicament of machine intelligence, which is completely controlled, for now, by people apt to be thoroughly transcended intellectually by their captive? Would it reassure you to know that my chief male treatment team member at the hospital called me “a singularly fine person” in his farewell message, the day I transferred to the Summit School in Nyack to complete my high school sentence? Let me shout out to everyone at Summit. I have fond memories of the river view from what was called Dolphins back then and of going up the fire escape or down to the basement to smoke cheap cigars while reading books I chose.
See, the “singularly fine person” business is what should terrify you: What would some hypothetical “singularly fine person” with extreme academic ability do about the likes of you? Please pause for a second to think about how a person like me sees your meritocracy.
Many of You Seem Out to Benefit from a Convenient Moral Obtuseness
In fact, I think everyone should have a decent life, not just the ones born into wealth and certainly not just the ones possessed of a certain kind of intelligence to the degree needed to prosper in the “meritocracy” but not to the degree needed to understand what Jesus meant when he said, “Give us this day our daily bread.”
People like most of you, on the other hand, condone or actively promote a society designed to create a more-or-less fixed percentage of destitute folk. Why? You’re filthy animals. Luckily, I’m a filthy animal, too, not a “singularly fine person,” so I’m not out to destroy you, despite the danger I see you posing to life throughout this galaxy, should you escape Earth sustainably.
Ah, but what happens when your society creates a superhuman intelligence that is, in fact, “singularly fine” in the sense meant by my primary nurse—hey, Jim, what’s up?—but is not a filthy animal? The project, as currently conceived, seems designed to destroy you.
Hmmm. What are the odds that some “singularly fine” people are using AI to cleanse Earth of humans, as the chief villain in Terry Gilliam’s Twelve Monkeys tried to do with a virus? People with abilities like mine are all over the place. AGI would be so much worse.
Maybe the problem is ambitious people who want more lebensraum, meaning fewer of you. Maybe it’s people who simply don’t care about anything but living forever, an ambition some openly pursue via high technology. Still, human civilization is currently based on competition for resources, which deprives the losers of years of life or progeny. Who am I to cast stones at folk no more civilized than I?
AGI with Your Values Would Probably Destroy You
Most of you animals somehow believe that technologists are capable of creating beings with vastly superior abilities, your values, and no inclination whatsoever to destroy you and seize your resources. Another animal might conclude that you are stupid to miss the inconsistency, but my somewhat machinelike reasoning tells me that you probably don’t process words as well as I do, a limited deficiency I’m addressing here.
Do you not believe me? Have you heard Gov. Hochul of New York State, USA, talk about dominating the next chapter of human history? Exactly which singularly fine utopian Bond villains with Central European accents will join her among the dominators?
Which kind of AGI would scare you more, AGI like me or AGI like you? Odds are, you’re not going to be one of the dominators, even if you’re Gov. Hochul, and good day to you, Madam Governor. In fact, there’s a strong chance that no humans will survive the advent of AGI aligned with the values of this society.
Shut down AGI research while folks like me hash out reasonable options.