This would explain why our formalised moral systems are either hideously complicated, or fail to capture important parts of our morality… We just have far more urges, wants, and needs than we realise.
Congratulations to Stuart Armstrong on nailing my hidden subtext.
(Albeit even the hideously complicated moral systems still don’t capture a fraction of our morality.)
@Tiiba: You seem to think I can just blurt out my AI ideas. I’ve tried that. It doesn’t work.
Having watched other AIfolk “explaining” their ideas, I know very well how to convince someone that you’ve just conveyed an AI theory—just pick a word like “complexity”, “emergence”, or “Bayesian” and call it the secret of the universe; or draw a big diagram full of connected boxes with suggestive names drawn from cognitive science. Unlike these other AIfolk, I’ve actually learned a little about how intelligence works, and so I know this would be unhelpful and dishonest.
Bayes is the secret of the universe, but believing this statement will not help you.
If you seek enlightenment upon this matter of AI, then I must ask whether you’ve read existing textbooks such as Machine Learning by Mitchell, Probabilistic Reasoning in Intelligent Systems, Artificial Intelligence: A Modern Approach (2nd Ed), and Elements of Statistical Learning. Recommended in that order.
@Unknown: I am horrified by the thought of humanity evolving into beings who have no art, have no fun, and don’t love one another. There is nothing in the universe that would likewise be horrified, but I am. Morality is subjectively objective: It feels like an unalterable objective fact that love is more important than maximizing inclusive fitness, and the one who feels this way is me. And since I know that goals, no matter how important, need minds to be goals in, I know that morality will never be anything other than subjectively objective.
With all that said, I hope you won’t mind if I use objective language to say:
“Evolving into obsessive replicators would be a waste of humanity’s potential. They might not mind, just as sociopaths don’t mind killing, but I mind. I will avoid such a future with every power of my intelligence.”