That’s also a good point. I suppose I’m overextending my experience with weaker AI-ish stuff, where they tend to reproduce whatever is in their training set — regardless of whether or not it’s truly relevant.
I still think that LW would be a net disadvantage, though. If you really wanted to chuck something into an AGI and say “do this,” my current choice would be the Culture books. Maybe not optimal, but at least there’s a lot of them!