Is God a brain in a vat?

From a comment I made last year at UD:

It’s impossible to verify the reliability of a cognitive system from the inside. Why? Because you have to use the cognitive system itself in order to verify its reliability.

If the system isn’t reliable, you might mistakenly conclude that it is!

This even applies to God himself. From the inside, God may think that he’s omniscient and omnipotent. He seems to know everything about reality, and he seems to be able to do anything that is logically possible. But how can he know these things with absolute certainty?

What if there is a higher-level God, or demon, who is deceiving him into thinking that he’s the master of the universe when he really isn’t? How, for that matter, can God be sure that he isn’t a brain in a vat?

He can’t. Defining him as omniscient doesn’t help. Like everyone else, he can only try to determine, from the inside, whether his cognitive apparatus is reliable. He can never be absolutely sure that he isn’t being fooled, or fooling himself.

39 thoughts on “Is God a brain in a vat?”

  1. If the definition of God includes that he is the Absolute, or even – I think – the ground of being, he can’t be a brain in a vat. Since gods are entirely definitional, you can get them out of anything.

    ETA: You are talking about the cognitive system of something that doesn’t have one. I’m sure VJ Torley could write 5,000 words about how God doesn’t think. I can’t see you gaining any ground with this argument.

  2. As someone at UD so eloquently said, God is defined as the first cause.

    Therefore He is.

    That was simple.

  3. It’s impossible to verify the reliability of a cognitive system from the inside.

    It is far from clear that the idea of verifying the reliability of a cognitive system is meaningful. One might perhaps talk of whether it is working well. But I don’t see where verifying the reliability could apply.

    You seem to be making the same kind of assumptions about a cognitive system that Plantinga makes in his EAAN.

    It is even less clear that you can apply this to God. Perhaps God does not have or need a cognitive system.

  4. davehooke,

    Sure, if you define God as not being a brain in a vat, then he isn’t a brain in a vat.

    But here’s what’s interesting: even then, I don’t think he can know that he isn’t a brain in a vat. It just so happens that he isn’t, but he can’t determine that for himself.

    Put yourself in his heavenly shoes. What could you do to prove to yourself that you weren’t a BIAV?

  5. Neil,

    You seem to be making the same kind of assumptions about a cognitive system that Plantinga makes in his EAAN.

    Such as?

    It is even less clear that you can apply this to God. Perhaps God does not have or need a cognitive system.

    Most theists believe that God knows things and reasons about them. For the purposes of my argument, his cognitive system is whatever it is that allows him to do that.

  6. keiths: Such as?

    You seem to be assuming that a cognitive system must adhere to some sort of standard that could be subject to verification.

    I’m not at all sure what “verified” could mean, other than checking conformance to a standard.

    At the very least, you need to more clearly define what you are arguing.

  7. keiths: Most theists believe that God knows things and reasons about them. For the purposes of my argument, his cognitive system is whatever it is that allows him to do that.

    That would seem to be a very anthropomorphic conception of God.

  8. keiths

    What if there is a higher-level God, or demon, who is deceiving him into thinking that he’s the master of the universe when he really isn’t?

    And what if the higher-level God or demon is being deceived by another, still higher-level God, and so on?

    He can’t. Defining him as omniscient doesn’t help. Like everyone else, he can only try to determine, from the inside, whether his cognitive apparatus is reliable. He can never be absolutely sure that he isn’t being fooled, or fooling himself.

    What makes you think your brain is not in a vat? What if your view of God is limited by your ability to understand?

  9. If atheists are (supposed to be) plagued by existential angst, think how much worse it is for God!

  10. I am a brain in a vat. The vat is made of bone, and it is attached to an intricate system of bones, organs, muscles, and skin.

  11. KN,

    I am a brain in a vat. The vat is made of bone, and it is attached to an intricate system of bones, organs, muscles, and skin.

    You think you are a brain in a vat made of bone. The Cartesian demon has fooled you into trusting your sensory input.

  12. Allan Miller,

    If atheists are (supposed to be) plagued by existential angst, think how much worse it is for God!

    Outwardly, I had it all. Hosannas from a heavenly chorus. Enough burnt offerings to blanket Beijing in smog for a year. Worship 24 hours a day, from all corners of the globe. People were even flying planes into buildings for me.

    It was every Supreme Being’s dream.

    Yet inwardly, I felt empty. I was plagued by doubts. What if it were all… an illusion?

  13. coldcoffee:

    And what if the higher-level God or demon is being deceived by another, still higher-level God, and so on?

    That doesn’t really work, if you think about it. If the “god” at level n is really a brain in a vat, then he is being fooled by the god at level n+1. Any gods below him don’t actually exist — they are part of the illusion being foisted upon him.

    What makes you think your brain is not in a vat?

    Like God, I can never be sure that it isn’t.

    What if your view of God is limited by your ability to understand?

    Then I could be wrong, just as you could be wrong whenever you say anything about God — for example, that he created the universe, that he is omniscient, that he loves us, etc.

  14. keiths:

    Most theists believe that God knows things and reasons about them. For the purposes of my argument, his cognitive system is whatever it is that allows him to do that.

    Neil:

    That would seem to be a very anthropomorphic conception of God.

    Humans have a penchant for creating gods in their own image. Besides, a God that doesn’t know anything is hardly in the running when it comes to omniscience. People like their omniGods.

  15. I don’t really understand what “from the inside” is doing here, and it worries me that our lazy, habitual use of this phrase is making Cartesian problems look unavoidable. Do cognitive systems have insides? Outsides? What does “from the inside” mean? Do we ever look at cognitive systems “from the outside”?

  16. Neil,

    You seem to be assuming that a cognitive system must adhere to some sort of standard that could be subject to verification.

    Sure, and that standard includes things like consistency. If I solve a problem in two different ways, then I should get the same answer. If not, then there is a flaw in my thinking somewhere.

    The problem is that no one, including God, can prove the reliability of his or her thinking from the inside.

    If I solve the problem twice and get the same answer, it doesn’t mean that I’m right. Maybe I made the same mistake in both attempts, or maybe I made two distinct errors that happened to lead to the same answer. The possibility of error can never be ruled out entirely.

    Absolute certainty is a myth.
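
    The consistency check described above — solve the same problem two independent ways and compare the answers — can be sketched in code. The example problem (summing 1..n) and the function names are purely illustrative, not from the discussion:

    ```python
    # Hypothetical sketch of an internal consistency check: compute the
    # same quantity by two independent methods and compare the results.

    def sum_by_loop(n):
        """Add the integers 1..n one at a time."""
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    def sum_by_formula(n):
        """Use the closed form n(n+1)/2."""
        return n * (n + 1) // 2

    n = 1000
    a, b = sum_by_loop(n), sum_by_formula(n)

    # Agreement raises confidence but proves nothing absolutely: both
    # methods could share a common flaw, or contain distinct errors that
    # happen to produce the same answer.
    assert a == b
    print(a)  # 500500
    ```

    The point of the comment survives the sketch: a passing check is evidence, not proof, since the checker and the checked share the same underlying machinery.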

  17. keiths: Sure, and that standard includes things like consistency. If I solve a problem in two different ways, then I should get the same answer.

    So if I solve the problem of naming a bright object in the sky one way, by giving that object the name “The evening star”, and I solve it a different way by giving it the name “The morning star”, that’s a cognitive failure? People got along quite well doing that, for a long time.

    If I solve the problem of what to have for breakfast by choosing pancakes on one day, and scrambled eggs on the next day, that’s a cognitive failure?

  18. Oliver Sacks’ writings suggest that if our brain changes, we cannot detect the change, at least in some instances. His books are full of examples. It is at least theoretically possible that our perceptions could be altered by remote control, without our being aware of the manipulation.

    In such a case, we could solve a problem that appears to have only one correct solution in multiple ways and not perceive the change. In Last Thursday Mode, the manipulation would change our perceived history, so even if we took notes (as in the movie Memento) our notes would not reflect the discrepancy.

    Why we should care, I don’t know.

    ETA:

    http://robinlea.com/pub/wife-hat.pdf

  19. Neil,

    I’m speaking of things like balancing one’s checkbook or computing a table of logarithms. If you do it twice and you don’t get the same results, you immediately start looking for your mistake.

  20. Marginally related – but for a maximal God, Omniscient, Omnipotent, Atemporal, etc., etc. –
    why *do* anything? You already know the outcome. Any action is incoherent.

  21. keiths: I’m speaking of things like balancing one’s checkbook or computing a table of logarithms.

    What does that have to do with the reliability of a cognitive system? This is learned reasoning, but not a necessary ability for a cognitive system.

    If you do it twice and you don’t get the same results, you immediately start looking for your mistake.

    Why wouldn’t looking for your own mistake count as internal verification?

  22. Some folks are conspicuously absent from this thread. Almost as if they are afraid to have fun with this kind of thinking.

  23. If our brains are envatted, why do some envatted brains think God exists and others think He doesn’t? Why do some think God’s brain is in a vat and others think God can’t be envatted because he created the universe?

  24. CC,
    Why do some envatted brains think God exists and others think He doesn’t? Why do some think God’s brain is in a vat and others think God can’t be envatted because he created the universe?

    A change of pace; a guy gets bored of making the same brain in a vat all day, every day.

  25. coldcoffee,

    If our brains are envatted, why do some envatted brains think God exists and others think He doesn’t? Why do some think God’s brain is in a vat and others think God can’t be envatted because he created the universe?

    I haven’t argued that our brains are envatted — just that they might be, and that we can never know for sure that they aren’t.

    God has the same problem.

  26. Neil,

    Why wouldn’t looking for your own mistake count as internal verification?

    It does count as internal verification. The problem is that your internal verification can’t establish the correctness of your cognitive system, because if your cognitive system is unreliable, then your internal verification (which depends on your cognitive system) may be faulty.

  27. KN,

    I don’t really understand what “from the inside” is doing here, and it worries me that our lazy, habitual use of this phrase is making Cartesian problems look unavoidable.

    I think that Cartesian problems (of the evil-demon sort) actually are unavoidable, which is why I argue that absolute certainty is a myth.

  28. keiths: The problem is that your internal verification can’t establish the correctness of your cognitive system, …

    Personally, I question whether “correctness of the cognitive system” has any actual meaning. Unlike you, I am not a representationalist.

  29. Neil,

    Personally, I question whether “correctness of the cognitive system” has any actual meaning.

    Do you question the fact that certain questions have right and wrong answers, and that certain people, due to their problem-solving skills, are more adept at finding the right answers than others are?

  30. keiths: Do you question the fact that certain questions have right and wrong answers, and that certain people, due to their problem-solving skills, are more adept at finding the right answers than others are?

    No, I don’t question that. I question applying “correctness” to the cognitive system, rather than to the person.

  31. Neil,

    Okay. Well, my statement can be rephrased accordingly without changing my intended meaning:

    The problem is that your internal verification can’t establish the correctness of your reasoning, because if your reasoning is unreliable, then your internal verification (which depends on your reasoning) may be faulty.

  32. keiths,

    And yet we do a lot of internal verification in our reasoning. I could not do mathematics without it.

    An autofocus camera is able to autofocus using entirely internal methods.

  33. Neil,

    And yet we do a lot of internal verification in our reasoning. I could not do mathematics without it.

    Agreed, and it’s a very good idea. It increases the likelihood that we’ll catch our errors.

    My point is that internal verification isn’t foolproof. Even God can’t assume that it is.
