Imagine you have in your possession a fantastic new game: a programmable, mechanical ant farm. This farm consists of some dirt and water and plants, as well as a few mechanical ants that have tiny programmable brains in them. These ants are also able, by a fun mechanical diversion, to reproduce.
When you first take the ant farm out of the box and assemble it, the ants can’t do anything. You alone are responsible for their behavior, by writing a set of rules that their programmable brains will follow. You don’t control every decision or motion they make (where would the fun be in that?); rather, you set up the rules, turn them on, and watch what happens. Will their little civilization rise to greatness, forcing you to buy expansion modules to give them room to grow? Or will it wither and die before it ever really gets started? Oh, and one other fun attribute possessed by these ants: They know they’re in the game. Their brains are just smart enough to realize that their inconsequential lives are owed to you, the owner of the game. But they’re okay with it, because otherwise they would enjoy no existence at all.
A game like this would bear a resemblance to certain city- and world-building computer simulations, except it would operate in the physical realm, and your granular control would extend to every single ant. After all, you build the rules and configure each ant as part of your grand design. Even the new ants born from the fantastically fun reproductive diversion come under your control, because you set the rules for them, too.
Now let’s be honest: As fun as it sounds, how long could you play this game before it began to bore you? If you were good at building rules, your ant farm would grow and the ants would multiply and multiply, because the most fun thing they’d have to do is play the reproductive game. After a while you’d probably get sick of buying expansion modules, because all this growth and productivity would be flat-out boring.
After a while you might decide to fuck with the ants.
Like, for instance, you might add a setting to the game that limits the lifetime of individual ants. You could also set a randomizer so that some ants’ mechanical bodies would break down more quickly than others’. You could lead the ants to believe that certain behaviors would increase their potential lifetime, while other behaviors would decrease it. So now you wonder: Will the ants’ behavior change if they know this? Except the randomizer would ultimately carry more weight than ant behavior, which means the ants could never really know for sure how to maximize their lifetimes.
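If you wanted to see just how lopsided that rule is, here’s a quick sketch of the lifetime mechanic in Python. Everything in it is hypothetical, of course (the names, the numbers, the whole ant farm), but it captures the idea: behavior nudges an ant’s lifetime a little, while the randomizer swings it a lot.

    import random

    # Hypothetical sketch of the lifetime setting described above.
    # The ants believe behaving well extends their lives, but the
    # randomizer swamps whatever effect behavior actually has.

    BASE_LIFETIME = 100        # ticks of the farm clock
    BEHAVIOR_BONUS = 5         # what "good" behavior is worth
    RANDOMIZER_SPREAD = 50     # the part the ants can never see

    def lifetime(behaved_well):
        """Return how long an ant's mechanical body lasts."""
        bonus = BEHAVIOR_BONUS if behaved_well else -BEHAVIOR_BONUS
        noise = random.randint(-RANDOMIZER_SPREAD, RANDOMIZER_SPREAD)
        return max(0, BASE_LIFETIME + bonus + noise)

    for ant, behaved in enumerate([True, True, False, False]):
        print(f"ant {ant}: behaved={behaved}, lifetime={lifetime(behaved)}")

The plus-or-minus 5 from behavior is buried under the plus-or-minus 50 from the randomizer, which is exactly why the ants can never work out how to maximize their lifetimes.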
But why stop there? Maybe you could tell the ants that, after their mechanical bodies finally break down, the memory in their little brains might be transferred into the Ecstasy Computer, where for the rest of eternity they could frolic with one another in an endless supply of RAM and CPU cycles. The only thing the ants must do to ensure a place in the Ecstasy Computer is follow a set of confusing rules. These rules would be determined by you, the owner of the game.
And here’s where you can really fuck with the ants’ little brains. Why not present these rules in a cryptic statement, in a computer language not well known by the ants? And what about making the statement really, really long, and confusing the ants by building conflicting rules into it? You could give them a set of ten rules that were absolutely, positively unbreakable, but then in your statement you could provide examples of historical ants that broke the rules and still made it to the Ecstasy Computer.
Or check this out: Maybe you could take the one activity most enjoyed by the ants—the reproductive diversion—and vilify it. Make them feel guilty about doing it. You could make the urge to engage in this diversion unbalanced between the male and female versions of the ants, so that their instructions about the diversion no longer match. You could build rules instructing ants to find a certain partner to play the reproductive game with, and impose a penalty for playing with any other ant: no access to the Ecstasy Computer. And then confuse the ants further by programming their brains to perceive new and exotic ants as more and more attractive than their own life partners for each year they remain monogamous.
Now we’re having fun, eh?
Perhaps the most intriguing part of the game would be access to the Ecstasy Computer. The ants know inherently that life in the EC is way better than life on the ant farm, but the only way to get there is to first endure some indeterminate period of time during which they must follow the confusing and conflicting rules you designed. Through your creative propaganda you try to convince the ants that their actions determine their fate, but in reality they know this isn’t true, because you built the rules. As a result, the ants never really know if they’re going to make it into the EC until after their time on the farm is up, which makes each day as frustrating and confounding as the last. After a while a lot of the ants would likely write their own rules about how to reach the EC, to rationalize succumbing to the temptations you programmed them to feel, but in the end nothing would assuage their guilt. Because you won’t allow it.
I would ask someone to invent this game, but I suspect we’re already living it.