There’s no need to “transform” the universe. The universe is the same if we modify the universe to satisfy our evolved goals, or we modify our goals to be satisfied by the universe. The latter is at least coherent, whereas the former is persisting in the desire to impose a set of values on the universe even after you’ve realized those desires are arbitrary and perhaps not even salvageably self-consistent without modification. What kind of intelligence would be interested in that?
To put it another way, as intelligence increases, we will increasingly modify our goals to what is possible. Given the deterministic nature of the universe, that’s a lot of modification.
A lot more is possible than what is currently present. You don’t need to modify unreachable programming; it just doesn’t run (until it does).
I heard lobotomy is an excellent way to do that.