Another position is to add an implicit “conditional on God being real” to statements.
OK, but that begs the question of which god (lower-case g) you’re conditioning on. This was actually the mode I was arguing in when I cited the story of Rabbi Eliezer and the carob tree.
Fictional characters are subject to fictional laws of physics in the fictional worlds the authors create.
That too, but they are also subject to constraints imposed by the theory of computation on their authors (at least so long as their authors are Turing machines). That actually rules out omnipotent gods even in fiction. Simply saying that something is omnipotent doesn’t make it omnipotent even in a fictional world.
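To make that concrete: here’s a minimal sketch of Turing’s diagonalization argument in Python (the function names are my own, invented for illustration). If an author could write a genuinely omniscient character, the author could in principle use that character’s answers to decide whether arbitrary programs halt, and no Turing machine can do that:

```python
# Sketch of Turing's diagonalization argument (1936).
# Suppose an author (a Turing machine) could write a truly omniscient
# oracle. Then the author could, in principle, compute this function:

def halts(program_source: str, argument: str) -> bool:
    """Hypothetical oracle: True iff program(argument) eventually halts."""
    raise NotImplementedError("no total computable function can do this")

# If halts() existed, we could build this adversary:

def diagonal(program_source: str) -> None:
    """Halts exactly when the given program does NOT halt on itself."""
    if halts(program_source, program_source):
        while True:   # predicted to halt -> loop forever
            pass
    # predicted to loop -> halt immediately

# Now feed diagonal its own source code:
#   diagonal(src) halts  <=>  halts(src, src) is False
#                        <=>  diagonal(src) does not halt.
# Contradiction. So halts() cannot exist, and an author who is a
# Turing machine cannot exhibit a character who computes it.
```

An author can of course assert that a character answered correctly, but asserting isn’t exhibiting: no Turing-machine author can actually produce the full table of correct answers the claim implies.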
But that’s a trivial observation
It might be a trivial observation, but it has profound consequences that are not immediately apparent. Specifically, there’s a positive feedback loop where certain beliefs produce effects which provide evidence that supports those beliefs. Such beliefs can become self-sustaining even when the beliefs themselves are objectively false. And because they are self-sustaining, they can be very hard to dislodge.
Ironically, an example of such a self-sustaining but objectively false belief is the belief that rationalism will win the battle of ideas, or even that it’s a better way to live your life, simply because, well, it’s rational. (I’m not saying you believe this, but many people do.)
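The loop is easy to make concrete. Here’s a toy simulation (entirely my own illustration; the parameters and names are invented): perceived evidence grows superlinearly with existing confidence, so a belief that starts above a threshold manufactures enough confirmation to lock itself in, while the same belief starting below the threshold decays toward the objective base rate.

```python
# Toy model of a self-sustaining belief. The numbers are arbitrary;
# the point is the bistability, not the specific values.

def simulate_belief(confidence: float, steps: int = 50) -> float:
    """Iterate the belief -> evidence -> belief feedback loop."""
    base_rate = 0.05  # objective evidence, independent of belief
    feedback = 2.0    # how strongly belief manufactures confirmation
    rate = 0.2        # how fast confidence tracks perceived evidence
    for _ in range(steps):
        # Perceived evidence grows superlinearly with confidence:
        # the stronger the belief, the more of life seems to confirm it.
        evidence = min(base_rate + feedback * confidence ** 2, 1.0)
        confidence += rate * (evidence - confidence)
    return confidence

print(round(simulate_belief(0.60), 3))  # strong prior locks in near 1.0
print(round(simulate_belief(0.30), 3))  # weaker prior decays toward ~0.06
```

Note that nothing in the loop depends on whether the belief is true; the locked-in state is reachable from a false belief exactly as easily as from a true one.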
but that begs the question of which god (lower-case g) you’re conditioning on.
Since the context of the discussion involved quotes from Torah/Bible, I thought it was apparent.
so long as their authors are Turing machines
Speaking of ontological categories… Humans are not Turing machines.
a positive feedback loop where certain beliefs produce effects which provide evidence that supports those beliefs
Sure, but I still don’t see it as particularly profound. It happens all the time and is the mechanism behind some well-known biases. I understand your point that a believer’s “personal experience” is suspect as evidence, and that point has some validity, but this is a complex discussion involving interpretations, cultural expectations, the philosophy of qualia, etc. etc. :-)
Since the context of the discussion involved quotes from Torah/Bible, I thought it was apparent.
It isn’t apparent. Genesis is part of three different religious traditions with radically different theologies. For example, there’s a rich tradition in Judaism of arguing with God, and even winning sometimes (e.g. Exodus 32:9-14), something which would be unthinkable in Christianity or Islam.
Humans are not Turing machines.
The software processes running on human brains can, as far as anyone can tell, be modeled by a Turing machine. So anything a TM can’t do, a human can’t do either, and neither can any fictional character a human can describe.
I still don’t see it as particularly profound
I guess we’ll just have to agree to disagree about that.
I disagree. People can (and do) have interesting and constrained discussions and even debates about fictional characters all the time.