The thing you gave me was not what I would refer to as a fundamental ought! What you stated is what I would call a desire, motivation, or intrinsic drive of the AI. I mean, it could technically fill the role of a fundamental ought, but it would be a trivially stupid one.
Your statement that "kill all humans" is a trivially stupid fundamental ought, while "create a thriving global community" is a great one, is very obviously motivated by an emotional response, not by reason or science.
What you mean to say is not that it's stupid, but that it is morally wrong, i.e. that you feel anger and outrage over it, rather than the warm fuzzy feelings you have about "create a thriving global community". I have similar feelings about both of those statements, but it seems I'm more aware that my emotions are what make me feel that way. You've somehow convinced yourself, through circular reasoning, that you don't need any foundation for a fundamental ought and that it's somehow self-derived.
no subject
Date: 2010-12-07 12:01 am (UTC)