If we knew how to build a machine that chooses its outputs so as to maximize some property of the surrounding universe, such a machine would be very dangerous, because maximizing almost any easily defined property leads to a worthless universe (without humans, or with humans living pointless lives, etc.). I believe the preceding statement is uncontroversial [...]
You also need a machine to be powerful for it to be dangerous. A weak maximiser is unlikely to be dangerous.