TIA Tuesday: The game of life

It’s time for TIA Tuesday again, though it’s a rather abrupt shifting of gears to go from William Lane Craig to Vox Day, especially when they’re both trying to address the problem of evil. Vox’s approach is a good deal less philosophical than Craig’s: he begins with the premise that maybe bad things happen because God “possesses” knowledge and power only in the sense that He possesses a capacity for both, a capacity He chooses not to exercise, because... well, perhaps an illustration would help.

Vox describes for us a demo of a combat video game that his game-programming partner, “Big Chilly,” was running.

During the demo, Big Chilly and the three AI-controlled members of his fireteam had successfully taken out both the wide patrol and the guards, and they were just beginning to lay the explosives to blow the door that held the prisoners captive when there was a sudden burst of bright laser fire that caused him to jump in his seat and emit a startled shriek loud enough to make everyone else in the room jump, too. While his AI squadmates shot down the intruder before anyone’s battlesuits took too much damage, what shocked Big Chilly was that for the first time in hundreds of playings, an enemy AI had taken it upon itself to circle around behind the rescue force and attack it from an unexpected direction.

But how could this happen? How could a lowly artificial intelligence surprise a lead programmer who was demonstrably omniscient and omnipotent in the AI’s world? How can the created do what the creator did not will? The answer, when viewed in this context, should be obvious.

Indeed it is: Big Chilly didn’t know his own game as well as he thought he did. But can the same be said of God?

Vox apparently thinks it can.

If it is not difficult to accept that an omniscient and omnipotent programmer can reject omniderigence, why should it be hard to imagine that an all-powerful God might choose to do the same? Even human lovers know that the lover cannot control the beloved, so it should not be difficult to believe that a loving God would permit His creatures to choose freely how they will live.

Let’s follow that train of thought for a moment. The programmer is not rejecting “omniderigence” out of concern for the free will of the AIs. The AIs don’t really even exist. It’s just a computer simulation that applies a complicated set of programmed instructions to an organized set of data stores. The programmer lets the computer figure out what the “characters” are going to do because it takes too long to do all the mental calculations to figure out which random numbers are going to be generated, which decision paths are going to be followed, and which code sections are going to execute. The programmer’s brain, in other words, is inferior to the silicon running the game world in terms of doing the “knowing.”
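
To make the point concrete, here is a toy sketch of the kind of per-turn decision routine such a game AI might run. This is my own illustration, not Vox’s or Big Chilly’s actual code, and the names and thresholds are invented; the point is only that the person who wrote every branch still can’t say what will happen on a given turn without doing the computation:

    import random

    # A deliberately tiny caricature of an enemy AI's per-turn decision.
    # The programmer wrote every branch, but which branch fires on any
    # given turn depends on the random draw and the current game state.
    def enemy_turn(enemy_health, player_visible):
        roll = random.random()
        if enemy_health < 25:
            return "retreat"
        if player_visible and roll > 0.9:
            return "flank"   # the rare "circle around behind" surprise
        if player_visible:
            return "attack"
        return "patrol"

    # Run a few turns; predicting which one comes up "flank" means doing
    # exactly the arithmetic the computer is about to do, only slower.
    for turn in range(10):
        print(turn, enemy_turn(enemy_health=80, player_visible=True))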

One interesting consequence of this is that it becomes possible for the “omnipotent” programmer to lose his own game. He may have built “back doors” into the system, and he may be able to turn “cheats” on and force a fake “win” by simply violating his own rules. But if he sticks to his own script, and plays in real time, according to the rules, there’s a chance he can lose (if the game is non-trivial and well-designed, anyway).

Amazingly, Vox takes it even further than just an analogy. He describes how he and his partner successfully applied evolutionary theory to automatically generate, select, and improve their artificial-intelligence algorithms, and then muses:

This AI development process is remarkably similar to the Biblical description of the harvest of souls, of the separating of the wheat from the chaff. This metaphor is central to the New Testament; Jesus Christ mentions it on several occasions and in several different forms, such as the distinction between sheep and goats. While the “God as game designer” hypothesis might reasonably be described as literally making God in one’s own image, especially when it comes from a game designer, it does offer the potential of explaining the importance of obedience to God’s will as well as the seemingly arbitrary nature of what is in line with that will and what is not. If we are AIs in God’s laboratory, then we cannot expect to have any more understanding of His ultimate purpose for us than those AIs in Big Chilly’s war lab did.
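
For readers who haven’t run into the technique, a genetic algorithm of the sort Vox says they used works roughly as follows. The sketch below is a generic illustration of my own, not their code; the fitness function here is a trivial placeholder, where a game studio would instead score each candidate AI by how well it performs in simulated battles:

    import random

    # Generic genetic-algorithm skeleton: evolve lists of numeric "genes"
    # toward whatever the fitness function rewards.
    def fitness(genome):
        return sum(genome)          # placeholder scoring

    def mutate(genome, rate=0.1):
        return [g + random.gauss(0, 1) if random.random() < rate else g
                for g in genome]

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    population = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(30)]

    for generation in range(50):
        # Separate the wheat from the chaff: keep the top half...
        population.sort(key=fitness, reverse=True)
        survivors = population[:15]
        # ...and breed the next generation from the survivors.
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(15)]
        population = survivors + children

    print("best fitness after 50 generations:",
          fitness(max(population, key=fitness)))

The selection step, keeping the “wheat” and discarding the “chaff” each generation, is the part of the process Vox is mapping onto the harvest-of-souls metaphor.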

In other words, if it seems like God is playing games with us, that’s only to be expected, because it’s just possible that real life is a video game for some celestial game designer with a Cosmos 2000 mega-core googol-flop processor, quantum RAM (probably more memory than you will ever need), and optional multiverse waveform generator. Vox even supports this conclusion by citing the ratio of computer-generated characters to live players in video games (roughly 10,000 to one), and concludes that, statistically speaking, any given “live” human is overwhelmingly likely to be a mere simulation. (I shudder to think how much beer I would need before that one started to make sense.)
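
For what it’s worth, the arithmetic behind that “overwhelming odds” claim is simple enough, if you grant Vox’s 10,000-to-1 ratio and assume a mind has no way of telling which kind it is: P(simulated) = 10,000 / (10,000 + 1) ≈ 99.99%. The beer, presumably, is needed for the premises rather than the division.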

This may be little more than over-caffeinated techno-speculation, but it is, I think, an exciting way to view the universe as well as providing a reasonable solution for those pesky problems of evil and ultimate purpose. If it also happens to be a near mathematical certainty, then so much the better. It is a fundamentally optimistic perspective, because if this is only the 3D war lab, imagine what the real game in all its multi-dimensional glory must be like! Even if we are immaterial simulations, we are immaterial simulations with a genuine purpose and a future more radical than we can possibly imagine in front of us. Accepting the idea that we are not only the gods of the machine, but also the machines of God, gives us the wherewithal to face the prospect of death with enthusiastic anticipation instead of courage, resignation, or even terror.

In a footnote, Vox allows that “both drugs and PS/2 were [probably] involved” in developing the “hypothesis” that we are all just a computer simulation in some larger universe of “posthuman” gamers. And yet, perhaps because it resonates so well with his own particular profession, he seems a bit reluctant to reject it. Or perhaps it’s because he too is vexed by the problem of evil, and by the fact that God, Whom he upholds as the source of morality, seems to fall so far short of His own standards. Perhaps he is keeping the “life is a game” hypothesis around just because he can’t think of any better answer.

There’s an interesting and subtle category error in this scenario. By proposing a metaworld in which the relative “goodness” or “badness” of our actions is decided arbitrarily, Vox is projecting our earthly perception of morality onto an otherworldly system, while still attempting to resolve the moral issue of the existence of evil within the earthly system. If God’s just a game designer, and our system is just God’s game, then He’s morally ok, because we’re His intellectual property, and He’s entitled to write whatever rules He likes for us. But that’s judging God by earthly standards of morality, which according to Vox’s scenario are not valid standards with respect to God, and which therefore cannot justify His conduct. The only moral judgment we can make, using the moral system that is valid within this context, is that God’s actions are negligent, evil, and immoral at least a great deal of the time. Even if it’s “God’s game, God’s rules,” the game score, as calculated by those rules, shows God with a non-zero Sin Total. What happens outside the game console is irrelevant.

Meanwhile, what Vox is doing is not so much solving the problem of evil as he is merely denying that evil is actually real. You don’t arrest someone for murder if all they’re doing is pushing a button on a joystick and turning a few pixels blood red. It’s not real evil, it’s just simulated. And, by the same token, the good deeds are fake as well. Not that it matters to any of us, since we’re not real either. It’s the ultimate in extreme denialism.

Vox has, in fact, wound up in one of a small number of cul-de-sacs that await those who try to reconcile the Christian idea of God with the problem of the reality of evil. Yes, you can deny the reality of evil by denying the reality of everything we experience. It’s an odd form of solipsism, in that not even the solipsist is actually real. Those computer characters are not real, even if they seem to surprise their programmer by behaving unexpectedly. In fact, they don’t even exist—the computer program creates an illusion of independent characters by mimicking the sights and sounds you would perceive if you observed a real (or presumably real) person in action.

There doesn’t seem to be much more that needs to be said. When your defense of God leads you into denying the existence of good, evil, and even your own self, chances are you divorced yourself from real-world truth a long time ago.

 
Posted in TIA, Unapologetics.

15 Responses to “TIA Tuesday: The game of life”

  1. cptchaos Says:

    Well,

    even if the game programmer created AIs that are intelligent and compassionate in the human sense, maybe by running the evolution algorithm for some billions of years, would it then be good to intervene? Since evolution requires randomness, the programmer is not omniscient anymore; he just knows the rule set that started everything. Now assume that the programmer is morally just and sees that he has created an (evolving) reality with compassionate intelligence in it. He knows that he has already lost track of the state of his program, and he can’t know that state, since it is created from a source of true randomness, so he cannot interrupt the program without taking the great risk of crashing it and thereby annihilating all life in it. And thus, since the programmer is morally just, he will simply ensure that the machine running his program does not crash. But he is no longer capable of interfering with the reality he started; he can’t raise the dead, forgive sin or create miracles. For the intelligent life forms in this reality, this means they are alone with their problems and their struggles.
    So God as a programmer is not a bad picture, but it renders any historical religion and its hopes for salvation foolish.

    best regards,
    Eike

  2. Kenneth Says:

    This simulated universe idea has been proposed and discussed elsewhere. Right now, our world is creating many, many simulations. If we are a simulation, then it’s a simulation creating simulations. One, how do we know that our particular creator is not another simulator? Two, how do we know that our particular “game” pays off with eternal life and carbuncle houses? And three, it only takes several layers of embedded, naturalistic universe games to make the memory needs truly astounding. Whereas Vox Day sees this theory as positive, it is much more likely, if true, to result in a blinking “GAME SUCCESSFULLY DELETED”.

    Does anyone really want to worship Vox Day’s buddy? Well, in this theory Pascal’s Wager might actually pay off. As Dorothy might say, ‘We’re not in Christianity any more’.

  3. Tacroy Says:

    Vox Day is not the first person to craft this hypothesis. Wikipedia has a nice summary of it, which mainly refers to Nick Bostrom’s paper, written in 2001. In fact, David Brin even wrote a short story based on this idea. It was published in 2003.

    So really, Vox needed neither a PS2 nor drugs to come up with this; he only needed to do a bit of reading. He also doesn’t seem to explain why, exactly, God is a more likely simulation runner than anyone else – I bet Loki was a good hacker.

  4. Rich Says:

    Then there’s his actual coding:

    http://www.antievolution.org/cgi-bin/ikonboard/ikonboard.cgi?act=ST;f=14;t=5752;st=30#entry120496

    Clearly they’re not the AI gurus they claim to be. And they didn’t use GAs, that much is clear – I called him on it on his blog.

  5. jim Says:

    Hey! Like, when my dad gets out of prison, I’m so TOTALLY gonna get him to buy me ‘Babies in Hell II-The Reckoning’ for my PSP!

  6. Chayanov Says:

    “If we are AIs in God’s laboratory, then we cannot expect to have any more understanding of His ultimate purpose for us than those AIs in Big Chilly’s war lab did.”

    And yet it’s the theists who complain that atheism is dehumanizing.

    PS: Somebody should tell Vox that neither The Matrix nor The Sims are documentaries.

  7. Dominic Saltarelli Says:

    Now you’re all seeing the problem with trying to address Vox’s arguments after assuming him to hold traditional (or even recognizable…) Christian beliefs.

    Now would be a good time to revisit his rebuttal of Dawkins’ Ultimate 747 in light of the above revelations about his theology.

  8. Liz Says:

    “quantum RAM (probably more memory than you will ever need)”

    ZING!

    I’ve got nothing really to add here, other than to commend you on this little gem. I’ve been really enjoying your TIA takedown thus far. Keep up the good work!

  9. Galloway Says:

    Kenneth: ” One, how do we know that our particular creator is not another simulator?”

    Great question! The answer of course, is we don’t. And what if the Simulator is not Vox’s idea of god, but just a higher life-form no different than us except that the Simulator has been evolving (simulating) a few billion years longer than us. Does that change anything in Vox’s mind?

  10. Deacon Duncan Says:

    Tacroy –

    That’s my fault: Vox cited Bostrom’s paper in TIA; I trimmed my post more than I should have and omitted the reference. Mea culpa.

  11. Rich Says:

    Here’s an exchange we had on his blog, here:

    http://voxday.blogspot.com/2008/06/fun-with-memes.html

    Rich 6/29/08 7:45 PM
    VD: 06.28.08 – 03:11 PM

    Rich clearly doesn’t know that Big Chilly and I were using genetic algorithms to develop better artificial intelligences more than a decade ago. He babbles about science having “predictive and descriptive ability” while defending evolution, a non-scientific model which has virtually no predictive ability. He also fails to note the significance of the adjective or understand the irony of his description when expressing his opinion about the limits of AI.

    Throughout, he demonstrates his inability to function within the parameters of the social niceties. Thus, we conclude that he is likely an atheist.

    Oh please, let me cry, Bullshìt!

    You’ve piqued my interest, Vox. What search space were you trying to traverse? What was the objective function of your optimization? What did you encode genomically, what were your sources of variation, and what were your selection criteria? Why didn’t you use A*, GBF, iterative deepening, simulated annealing, etc.?

    And as for “inability to function within the parameters of the social niceties,” one of us writes for an online tinfoil-hatted fundy mag, has a pretentious internet name and grew a Mohawk to hide his receding hairline. Clue: it’s not me.

    PS: did you use GAs in this, Vox?:

    http://www.somethingawful.com/d/game-reviews/war-heaven.php

  12. pboyfloyd Says:

    Indeed! I possess a capacity to crap gold-flaked, diamond-studded turds!

    Everyone does. (Hint: you are what you eat!)

    If ol’ Voxbaby can pull ‘reasons-for-God’ right out of his ass like this, doesn’t he think it’s possible that those ‘funny-old-folk’ who lived in those ‘funny-old-times’ could do the same thing?

    He’s not doing ‘the case for God’ any favors at all!

    I LIKE that in a guy!

  13. Freidenker Says:

    I have to say, as dim-witted-out-of-one’s-ass analogies go, Vox did well with that one. Apparently, it wasn’t original, though.

    Beautiful post and wonderful dissection as usual, Deacon.

  14. Dominic Saltarelli Says:

    see also:

    http://www.xkcd.com/505/

  15. Modusoperandi Says:

    “How could a lowly artificial intelligence surprise a lead programmer who was demonstrably omniscient and omnipotent in the AI’s world?”
    So, to save the 3-O’d God, he has turned Him into a little “tribal-style” god that lacks the characteristics that make the big-G God special. Kudos, Vox.

    “Accepting the idea that we are not only the gods of the machine, but also the machines of God, gives us the wherewithal to face the prospect of death with enthusiastic anticipation instead of courage, resignation, or even terror.”
    Apparently he never shuts off his computer. Or deletes programs. Or has a hard drive die. Or loses power before saving. Or…

    Chayanov “PS: Somebody should tell Vox that neither The Matrix nor The Sims are documentaries.”
    It’s potentially worse than that, since the video game implies that we’re sub-programs running in an application. Welcome to the [simulation of the] desert of the real program, algorithm.