Fundamental axiom of morality
[Jun. 8th, 2003|01:17 am]
As some of you know, I'm a part-time philosopher, though I try to do it only with other fully informed, consenting adults. One of the things that's been bugging me most of my adult life is trying to put together a proper set of underpinnings for morality - what is right, what is wrong, and why it matters. Ideally, I want something that is nearly a rational imperative... i.e., once a person reads and understands the ideas I put forth, the person will not find a rational way to say that it's not correct.
(A rational imperative, as I imagine it, would be something where a person must agree that it's right. "If I let go of this rock, and nothing else prevents it from falling, it will fall" would be a rational imperative. If you understand the words as I use them, you won't have a rational way to do anything other than agree with me. When I say "nearly a rational imperative", I'm settling for the person being unable to say that it's not right, which is a little bit weaker. A person might think it's incomplete, oversimplified, or whatever, for example.)
Recently, in a discussion about relativism versus absolutism, I made a statement of a moral axiom, without even realizing it.
In the discussion, a person was talking about things that were just plain wrong. I pointed out that there's another standard, past the action, once you consider the person performing the action. If an insane person believed that I was the anti-Christ, for example, and that killing me would save billions of lives, you can't consider that person evil for wanting to kill me. If you *did* consider that person to be evil, then you'd (essentially) be denying an already accepted condition: that the person is insane.
(There's another option, of course: your definition of evil could be such that intent doesn't matter at all. However, at that point, our positions are so far apart that we can't really discuss things at all.)
I then said, basically, that I can't accept a system of judgement wherein a person could be said to be wrong for performing an action that s/he thought, sincerely and to the best of his or her ability, was the right action to perform.
And tonight it hit me that this has to be the fundamental basis for any moral system that I can accept.
The first axiom would be: If you believe an action A, or a lack of action B, is the right thing to do, and have investigated both the situation, and your own motivations to the extent of your ability, it is right for you to perform action A, or avoid action B.
It should be noted that this is a *personal* definition, and it has to be.
Perhaps any time a person acted in this manner, that person should not be punished (though perhaps such a person would need to be restrained - like the insane person who thinks I'm the anti-Christ - to prevent their misconceptions from hurting others), but the fact of the matter is, we'll *never* know with certainty when someone has, or hasn't, acted in that manner.
We don't know if that person truly investigated their own motivations; we don't know if that person truly thought the situation through; we certainly don't know if either action was performed to the extent of the other person's ability.
But it does work as an individual guide... though it's so weak it's barely meaningful. For example, if you would think that it was right to perform action A, but don't because your religion's teachings forbid it, you've really decided, to the best of your ability, that your religion must be followed even when you don't think it should be.
However, it justifies a thought that I've had for a long time: that the duty of any person is to do one's best to do what's right.
As nearly anyone who follows my Usenet postings probably knows, whenever I mention the difference between absolutism and relativism, I feel compelled to point out the four categories. They are:
Strong Absolutism: "There are moral absolutes, and they are known"
Weak Absolutism: "There are absolutes, and they may, or may not, be known"
Weak Relativism: "Different societies will come to different conclusions about morality (i.e.: there's no universal conscience saying what's right and wrong; there are questions that clearly haven't been answered)"
Strong Relativism: "Different societies will come to different conclusions about what's right and wrong... and they're all correct in those conclusions. Morality is *determined* by society."
(To me, that last one says "morality doesn't really exist; it's just societal opinion". However, I haven't read any writings from strong relativists, so I don't know how deeply this goes.)