The problem I have with this use of the words “should” and “good” is that it treats them as semantic primitives, rather than functions of context. We use them in explicitly delimited contexts all the time:
“If you want to see why the server crashed, you should check the logs.”
“You should play Braid, if platformers are your thing.”
“You should invest in a quality fork, if you plan on eating many babies.”
“They should glue their pebble heaps together, if they want them to retain their primality.”
Since I’m having a hard time parting with the “should” of type “Goal context → Action on causal path to goal”, the only sense I can make out of your position is that “if your goal is [extensional reference to the stuff that compels humans]” is a desirable default context.
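That type signature can be made concrete. Here is a minimal sketch (my own illustration, not anything from the discussion) of “should” as a function that takes a goal context and returns an action on the causal path to that goal; the goal/action strings are hypothetical:

```python
# Illustrative sketch: "should" typed as Goal context -> Action on
# causal path to goal, rather than a context-free semantic primitive.

def should(goal_context, causal_map):
    """Return an action on the causal path to the goal, if one is known."""
    return causal_map.get(goal_context)

# Explicitly delimited contexts, as in the examples above:
advice = {
    "see why the server crashed": "check the logs",
    "platformers are your thing": "play Braid",
}

print(should("see why the server crashed", advice))  # check the logs

# With no goal context supplied, the function has nothing to return --
# which is exactly the gap a "default context" would have to fill:
print(should(None, advice))  # None
```

On this picture, the dispute over an unqualified “should” is a dispute about what to pass as the first argument when the speaker leaves it out.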
If you agree that “What should be done with the universe” is a different question than “What should be done with the universe if we want to maximize entropy as quickly as possible”, then either you’re agreeing that what we want causally affects should-ness, or you’re agreeing that the issue isn’t really “should”’s meaning, it’s what the goal context should be when not explicitly supplied. And you seem to be saying that it should be an extensional reference to commonplace human morality.