So I was reading one of the morality posts on lesswrong [warning for anti-religion stuff, *sigh*], which at one point asks the question, ‘if you believe in a morality of the universe, what if it tells you to do something horrible’.
If you do happen to think that there is a source of morality beyond human beings… and I hear from quite a lot of people who are happy to rhapsodize on how Their-Favorite-Morality is built into the very fabric of the universe… then what if that morality tells you to kill people?
If you believe that there is any kind of stone tablet in the fabric of the universe, in the nature of reality, in the structure of logic—anywhere you care to put it—then what if you get a chance to read that stone tablet, and it turns out to say “Pain Is Good”? What then?
Maybe you should hope that morality isn’t written into the structure of the universe. What if the structure of the universe says to do something horrible?
[Note, I took out a link from the above quote, if you want to see it click through to the original.]
This was one of the things I had to think through a bit to figure out how it worked. But once I did, what I came to was that it’s either not a problem or it doesn’t make sense.
If I believe there’s a morality of the universe, and I also believe there are elements of that morality I am aware of, then if I run into something claiming to reveal/embody/etc. said morality, and it says something that contradicts something I already believe to be an element of morality – then either I end up thinking I’m wrong, or I end up thinking that the thing in question does not actually reveal/embody the morality.
If I believe that the morality of the universe includes ‘killing people is bad’, and this thing I run into says ‘whoever kills the most people is the best person so go kill lots of people’, then either I then believe I’ve been wrong about ‘killing people is bad’ (in which case I do not have a moral problem, because I don’t think the thing told me to do something horrible, since I’ve changed my mind about what’s horrible), or I believe that the thing is not actually telling me true things about the morality of the universe (in which case I also do not have a moral problem because the thing that told me to do horrible things is not telling me morality things). I can’t have both ‘it’s right’ and ‘killing people is bad’, because they’re incompatible. (I can also be uncertain, but again, I’m uncertain between these two options).
Like – say I instead get a chance to consult an omniscient truthful oracle, and it tells me I speak Finnish. Either I’m going to believe that I speak Finnish, and I was previously wrong about me not speaking Finnish, or I’m going to believe that this is not actually an omniscient truthful oracle.
(If I believe the option that involves changing my mind but I continue to experience the same evidence I used to, I’m probably going to want some explanation for how this works, which is to say, why whatever led me to believe that killing people is bad/that I don’t speak Finnish is actually incorrect. Say, ‘your species is not advanced enough to have a sense for true morality’ or ‘you have an abnormality in the area of your brain that perceives you speaking Finnish, such that you are not aware of doing it’. And again, which way I go depends on whether I accept this account over ‘the thing is not accurate’).
It’s like the ‘what if an unstoppable force meets an immovable object’ question. That doesn’t work, because you can’t have a universe that includes both of those things; they are mutually exclusive.
(Now if I run across something *very powerful* that tells me to do things I find immoral, that is, well, a problem. But it’s an entirely separate moral question (which is, rather, the one I thought of when reading about the Left Behind books!)).