So, I guess the point of EY’s metaethics can be summarized as ‘by “morality” I mean the token, not the type’.
(Which is not a problem IMO, as there are unambiguous words for the type, e.g. “values”—except insofar as people are likely to misunderstand him.)
Especially because the whole point is to optimize for something. You can’t optimize for a type that could have any value.
Isn’t it an optimization to code in the type, and let the AI work out the details necessary to implement the token? We don’t think theorem provers need to be preloaded with all known maths.
Is this some kind of NLP exercise?
FWIW, I’ve mostly concluded something along those lines.
You wrote
“But when you ask a question and someone provides an answer you don’t like, showing why that answer is wrong can sometimes be more effective than simply asserting that you don’t buy it”
...and I did.
Indeed. And?
If you don’t want someone to put up an argument, don’t ask for it.
I agree completely.
Had I known in advance the quality of argument you would put up, I would not have wanted you to put it up, and would not have asked for one, in full compliance with this maxim.
Lacking prescience, I didn’t know in advance, so I did want an argument, and I did ask for one, which fails to violate this maxim.
You wanted an argument? Sorry, this is “Insults”. Go down the hall and to the left. (Monty Python, to my best recollection)
You want 12A, just along the corridor.
I’m afraid I have developed a sudden cognitive deficit that prevents me from understanding anything you are saying. I have also forgotten all the claims I have made, and what this discussion is about.
In short, I’m tapping out.
?
There are immoral and amoral values, so no.