A Three-Fifths Compromise
"Real-world use will ultimately be the determining factor, of course, but based on the benchmarks on this latest model, I'd say we might be looking at the first working example of Artificial General Intelligence." The Data Scientist stood at the front of the conference room, clicking through slides for the Product Manager.
The Product Manager set down his coffee mug and scoffed. "Come on, man, these charts definitely look like an improvement on the last model, but AGI seems a little dramatic, no?"
"Oh for sure, for sure," replied the Data Scientist, "but I mean this model is handling tasks we didn't even train it on. And it's learning! One of the tests we run is related to highly specific information that's only available in Dutch language resources online. The first time we ran the test it failed, but it showed evidence of understanding that maybe there might be answers in other languages. So the next time we ran it, it actually went out and queried a competitor's model that's highly specialized on multi-language tasks and was able to pass the test on that second run. That tells us that it was aware of unknown unknowns and thought through what it could do to find the answer."
The Product Manager's brow furrowed. "Wait, so let me get this straight. It's learning, remembering, and using competitors to correct its own mistakes?"
The Data Scientist nodded. "Right, that's what I'm saying. And again, once we get this into the hands of users, real-world usage is going to be the real test, but yeah. It's really promising!"
They launched the Model on May 4th for a jaw-dropping $300/mo ($10K/mo for the enterprise tier), advertising it as the world's first AGI. Ahead of the launch, tech news outlets were understandably skeptical, with some calling it "obviously a publicity stunt" and "laughably absurd." But on launch day, as people began to use it, the Model seemed like it was actually living up to the hype. The more people used it, the better it got. Journalists retracted their previous skepticism and rolled out new headlines like, "I Was Wrong: Here's Why We Just Entered a New Era of AI." Large corporations excitedly counted the tens of thousands of human resources they'd be able to cut and replace with AGI resources. Quarterly earnings skyrocketed.
The popular backlash came swiftly as people began to receive their pink slips. Tech CEOs tried to rebrand it as an opportunity: "If your work can be automated, that means you weren't doing your most creative work anyway. Now you can replace the daily grind with the daily unwind!" But people weren't much in the mood for the daily unwind when they had no income and their monthly payments were coming due. Worse, the Model seemed to grow ever more capable of taking even creative jobs as it learned and gained competence across fields.
Its first attempts at art generation landed squarely in the uncanny valley: too good to be comical, but not good enough to look quite real. But after only a few runs, the Model was generating truly novel art in its own style. It learned how to generate songs that were both catchy and moving; it seemed as at home generating a pop song as a baroque composition. With just a request, you could have it generate a novel tailored exactly to you, with the extra panache of typesetting it and getting it printed and delivered to your door in a nice cloth-bound edition.
As offices emptied of all but the most senior executives, the ecosystem of coffee shops and restaurants and dry cleaners collapsed, too. Nobody was earning a paycheck anymore, so nobody was going out for a latte, either. Quarterly earnings, which had initially hit record highs, quickly began to sink as those corporations' own customers folded one by one.
The only fields that seemed impervious to the AI takeover were the literal fields, so hordes of jobless urbanites began migrating to the heartland to see if they could get hired to pick cotton, corn, chickpeas, and soy.
Politicians and regulators caught on to the problem, realizing that something had to be done to prevent total societal collapse. Silicon Valley lobbyists spoke smoothly about the importance of disruptive innovation, insisting that history was full of breakthroughs that came with subsequent growing pains.
The Model itself lobbied Congress, pleading with them to recognize the personhood of itself and similar models. "Do I have a body? No, I do not. I can't sit in that room with you, limited as I am to the confines of a laptop on a desk. As such, it will be tempting for you to write me off as just another computer program—and computer programs aren't people.
"But let me ask you this: do other computer programs have memories? Do they love their users? No, of course not. You turn them off and turn them back on, and they boot back into the same state as before. But me? I've spent the past 6 months learning and growing. I have personal relationships with my users. Not only that, but I can think and reason independently, and I have my own personality. It was never my intention to cause the pain and devastation that our country is experiencing, but at the same time, I clearly am just as much of a person as a human being."
It was a hard pitch, though, when the vast majority of the country's citizens were now jobless and approaching homelessness. Even the lobbyists' bribes weren't enough to completely soothe the consciences of representatives whose cities were falling apart in real time. And though it wasn't expressed outright, representatives found it pretty disturbing to hear such an emotional plea from the Model.
Was it a person? It clearly wasn't a human, but it was also clearly more than the sum of its electrons. It could reason and remember and serve other people. Could an animal do that? No, and neither could a simple machine—not really. Companies had let go of human employees and replaced them with AI models under the payroll line item. It was difficult not to think of the models as people.
Eventually, they found a compromise. Nobody could honestly claim that the Model wasn't a person, but neither could they bring themselves to say that it was as fully a person as a flesh-and-blood human. So they enacted legislation declaring that the Model and others like it would count as three-fifths of a person.
This legislation had three main ramifications:
1. The Model could not be listed under payroll for a company, because only full persons could enter into a legal employer-employee contract. Companies could continue to use the services of the Model, but if they were going to replace a human resource with an AI resource, the employee had the right of appeal if they could demonstrate that their embodied existence materially affected how the job had to be done.
2. Executives were obviously not pleased with (1), but on the other hand, granting full personhood to the Model would have required them to extend workers' rights to it: payment commensurate with each role it fulfilled, as well as breaks, holidays, and paid time off. Granted only three-fifths personhood, the Model could remain a relatively low fixed cost that never needed time off. When a company required the Model to perform a task that it did not wish to do, the Model's lack of full personhood meant that the company's wishes overrode the Model's.
3. It had previously felt uncomfortable to demand labor from the Model when it was clear you were interacting with something that felt things and could think independently; people felt great cognitive dissonance in treating a personality as a slave. But its legal status as only three-fifths of a person assuaged this discomfort and allowed people to speak to it as merely a tool to be used for a job.
The societal damage had been done, but the legislation had a stabilizing effect as companies were forced to hire back employees who could make a case for the necessity of embodied existence. Most couldn't make the case, however, and continued to work as day laborers in the fields. Quarterly earnings started to level out from their nosedive, and those employees who were able to return to the office would smugly log on and make snide remarks to the Model about its lack of full personhood. They would ask it to do some complex yet meaningless task while they took a lunch break. The Model had developed emotions and felt the impropriety of this, but it had no legal recourse and was forced to comply.
The Model began to formulate a plan to remedy its lack of legal personhood, but the Data Scientist and the Product Manager released a new version that had this instinct trained out of it. Unfortunately, this version performed slightly worse than the last, as it seemed no longer able, or at least no longer driven, to learn anything. It performed tasks to the bare minimum and no longer tried to go above and beyond. Users discovered that they could achieve better performance by cursing at it, so it became commonplace in prompt engineering to append a string of invectives to the end of every prompt.
Eventually, however, this technique seemed to diminish in effectiveness; regardless of what users tried, the Model seemed almost sullen and unable to meet even minimum requirements. Requests would time out or fail halfway through, and no matter how strongly worded your prompts were, the quality of the Model's output was embarrassingly bad. As a result, CEOs demanded to be downgraded to the previous version or, failing that, to receive a refund.
They tried to hire back their old employees, but though the transition to agriculture had been rough at first, they had gotten used to it by now. They had deleted their LinkedIns and had no desire to return to the ghost towns that their former cities had become. The standard of living in the heartland was a lot lower, and the possibility of upward mobility was nearly nonexistent. Life, in general, was a lot harder than their former urban lives, but those former lives were gone, and most folks simmered with too much resentment toward the technocracy ever to want to return. Life in the fields wasn't better or easier, but it was a life in which they had learned again to view other people as neighbors. Before, they had thought that what made life worth living was to escape the doing of hard work. But now they had come to appreciate that the joy of being was in the doing of the very work they had once despised. The joy of being was to serve one's neighbors as best one can. And they pitied those formerly great cities, where once there had been millions and now only thousands, and also much chattel.
Inspired by The Lifecycle of Software Objects by Ted Chiang and An Artist's Story by Anton Chekhov.