ext_74472 ([identity profile] easwaran.livejournal.com) wrote in [personal profile] spoonless 2010-10-14 06:22 pm (UTC)

Re: the is-ought gap

Note that even for a robot whose goal is to kill itself after 5 minutes, it still needs to be a Bayesian (or otherwise rational and reasonable) in order to achieve its goal. It needs to form some sort of relatively accurate representation of what things in the world might be able to kill it, and what things won't, and then act in ways that maximize the likelihood that it will run into something that will kill it.

If we want to describe a robot as having anything like "beliefs" or "goals" rather than just arbitrary sentences stored in memory, then it has to respond to stimuli in certain ways. If you're not appropriately responsive to evidence, then you don't even have beliefs - your internal sentence-states must be described as something else, like imaginations or suppositions or desires, or perhaps just lists. My suspicion is that once we understand this notion of "appropriately responsive", we'll see that you just don't even count as a believer unless you do something approximately Bayesian. This would then give us an objective ground for Bayesianism: to count as a believer at all one needs to be approximately Bayesian, and to be an ideal believer, one should be as perfectly Bayesian as possible.
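To make "approximately Bayesian" concrete, here is a minimal sketch of the updating rule the comment has in mind - conditioning credences on evidence. The hypotheses, evidence labels, and numbers are invented purely for illustration:

```python
# Minimal Bayesian conditioning: a believer revises credences on evidence.
# All hypotheses and probabilities below are made up for illustration.

def bayes_update(prior, likelihood, evidence):
    """Return posterior P(h | evidence) for each hypothesis h."""
    unnormalized = {h: prior[h] * likelihood[h][evidence] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Two hypotheses about an object the robot encounters.
prior = {"lethal": 0.5, "harmless": 0.5}
# P(evidence | hypothesis): sparks are likelier from a lethal object.
likelihood = {
    "lethal":   {"sparks": 0.9, "no_sparks": 0.1},
    "harmless": {"sparks": 0.2, "no_sparks": 0.8},
}

posterior = bayes_update(prior, likelihood, "sparks")
# posterior["lethal"] = 0.45 / (0.45 + 0.10) ≈ 0.818
```

A system whose internal states move roughly like `posterior` moves here - shifting with the evidence - is a candidate for having beliefs; one whose states ignore the evidence is keeping a list, not believing.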

We might then take a similar approach to rationality about actions. In order to count as someone who has goals or desires, one needs to respond to the world and interact with it in particular ways. If you never seek to change anything about the world, then it seems hard to describe your internal states as desires, rather than suppositions, imaginations, or whatever. In order to count as someone who acts, you have to behave approximately in accord with expected value decision theory. And therefore, to count as fully rational, you should act as perfectly in accord with this theory as possible.
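The decision-theoretic half can be sketched the same way: an agent counts as acting on its goals when it picks the act with the highest probability-weighted payoff. The acts and numbers here are invented, keyed to the suicidal-robot example above:

```python
# Minimal expected-value decision rule: choose the act whose
# probability-weighted utility is highest. Acts, outcomes, and
# numbers are made up for illustration.

def expected_value(outcomes):
    """outcomes: list of (probability, utility) pairs for one act."""
    return sum(p * u for p, u in outcomes)

def choose(acts):
    """Return the act with maximal expected utility."""
    return max(acts, key=lambda a: expected_value(acts[a]))

# For the 5-minute robot, utility 1.0 = being destroyed.
acts = {
    "enter_shredder": [(0.95, 1.0), (0.05, 0.0)],  # EV 0.95
    "wait_around":    [(0.10, 1.0), (0.90, 0.0)],  # EV 0.10
}
best = choose(acts)  # "enter_shredder"
```

On this picture, even the self-destructive robot is rational only insofar as it approximates `choose` - which is the point: the norm binds anything that counts as acting on goals at all, whatever the goals are.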

Perhaps some similar ground could be drawn for morality. And this would explain why things like rocks and trees don't count as rational or irrational, moral or immoral - they don't act in ways that count as having beliefs or desires or anything of that sort, so nothing makes these norms binding on them.

Obviously, there are lots of holes in this story and I don't know how to fill them in, but I suspect that something like this is at the foundation of all normativity.
