I see, that’s a great point; thanks for your response. It does seem realistic that it would become political, and it’s clear that a coordinated response is needed.
On that note, I think it’s a mistake to neglect that our epistemic infrastructure optimises for profit, which is an obvious misalignment right now. Facebook and Google are already optimising for profit at the expense of civil discourse; they are already misaligned and causing harm. Focusing only on the singularity allows tech companies to become even more harmful, with the vague promise that they’ll play nice once they are about to create superintelligence.
Both are clearly important, and the control problem specifically deserves a tonne of dedicated resources, but in addition it would be good to put some effort into getting approximate alignment now, or at least something better than profit maximisation. This obviously wouldn’t make progress on the control problem itself, but it might help move society into a state where such progress is more likely.