After reading this, I am convinced that any AI requiring galactic-scale production would have a “minimum interestingness” value somewhere significantly above boring.
I hope this is not just a rationalization on my part, but I think what Yudkowsky meant was not that it would be impossible to write an interesting story about it, but that it would not be interesting from within that universe: not only for the people who will die, but also for the paperclip maximizer itself, which is unlikely to actually experience emotions. And here I agree.
Ah yes, that’s plausibly true.