No.

1. Things humans value will continue to be valuable, because we are going to solve the alignment problem.
2. Even if AI gained all of the power, it would still find us interesting, for the same reason we find dinosaurs interesting.
3. Even if you are right and the max-entropy state of the universe is a literal paperclipper, it would be far more interesting than you are describing.

Responses:

1. I hope you are correct, but timelines exist where you are not.
2. Same as (1).
3. I also hope you are correct, but optimality is whatever is optimal.