ext_176843 ([identity profile] spoonless.livejournal.com) wrote in [personal profile] spoonless 2010-10-14 01:43 am (UTC)

Re: the is-ought gap


Do you intend to deny that any purely normative question has any answer beyond a descriptive one about what people of a particular culture (or perhaps, particular people) find intuitive?

Almost, but I'd prefer to rephrase it in my own way.

I deny that normative questions are objective. I think normative questions only have answers with respect to either a single subject, or a group of subjects who share some goals, values, or intuitions about right and wrong. This group could, in principle, be as large as "the entire human species" for some questions, although I am unaware of any such question for which that's true. I'm open to the possibility that if somebody thought hard enough, they might be able to come up with one. But most everyday normative questions are either culturally relative or individually relative.

What about a normative question like "should the observation of ten remissions out of eleven in the treatment group, versus three remissions out of eight in the control group, make us more confident in the claim that the treatment is helpful?" That is a purely normative question.

I disagree that this is a purely normative question. I think you can factor it into two parts, one of which is normative and the other of which is positive. The positive question is whether the observation makes it more likely that the treatment is helpful. And the normative part is to what extent we should react by feeling confident.
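The positive part, at least, can actually be computed. Here is a minimal Monte Carlo sketch of one standard way to do it (my choice of method, not anything specified in the discussion): put a uniform prior on each group's remission rate and estimate the posterior probability that the treatment group's rate is higher, given 10/11 remissions versus 3/8.

```python
import random

def prob_treatment_better(t_success, t_total, c_success, c_total,
                          n_samples=100_000, seed=0):
    """Estimate P(treatment remission rate > control rate | data),
    assuming independent uniform Beta(1,1) priors on both rates."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_samples):
        # Posterior for each arm is Beta(successes + 1, failures + 1).
        p_t = rng.betavariate(t_success + 1, t_total - t_success + 1)
        p_c = rng.betavariate(c_success + 1, c_total - c_success + 1)
        if p_t > p_c:
            wins += 1
    return wins / n_samples

# 10 of 11 remissions in treatment vs 3 of 8 in control:
print(prob_treatment_better(10, 11, 3, 8))  # well above 0.9
```

Under these (assumed) priors, the posterior probability that the treatment helps comes out very high, which is the descriptive fact; whether and how much one *should* let that move one's confidence is the normative remainder.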

There is, of course, the question of whether we are capable of choosing how we react to new evidence. But I think either way you answer that, you can still say that whether it's right or wrong to react that way is subjective, and unrelated to whether the conclusions we would draw are really more likely to be true given the premise or the evidence.

Now--just looking at the normative part of the question, I would agree that it seems like a pretty universally shared intuition that becoming more confident about something when it is more likely to be true is almost always good. It indicates that your brain is "functioning well" and that you're doing a good job of mapping reality. I think the universality of that comes from the fact that it is almost always easier to achieve other goals when you have a more accurate representation of reality in your head. Not to mention, some people, like presumably you and I, place intrinsic value on learning and knowledge, and regard that as a worthy goal in and of itself, even if it doesn't help with attaining any other goals. Other people do not share that appreciation for knowledge and learning nearly as much, and for them the utility of accurately representing the world is far less, although it still helps with a lot of their more primary goals. For them, I expect there are more situations where it's not as important to react to new evidence in a purely Bayesian way. There may in fact be utility they get out of reacting in a different way... so in some cases, it could be a net negative for them if they buy the statistics.

Overall, taking the purely normative part of the question, I think it comes close to universal among agents immersed in an environment, since modeling their environment accurately is tied strongly to so many other things that the agent regards as "good"... like achieving goals and avoiding getting killed. But I think it fails to be quite universal, and there is still an element of subjectivity to it. In most other cases of ethics, I feel like there is a much greater subjectivity involved.
