Why not try to exploit the singularity for fun and profit? It's like having an opportunity to buy Apple stock dirt cheap.
Investment: own data center stocks initially. I am not sure what you would transition to when the AI learns to make CPUs.
Regulatory: make the singularity pay you rent by being a gatekeeper. This will be a large industry worldwide. Probably the best bet.
At the very least you should be able to rule out bad investments (of time or money):
Energy
Land
Jobs that will be automated
Hm. Well, if and when the singularity does happen, I would think it'd be beyond my ability to influence. But I think your points are valid for the time leading up to it.
Could you explain this a bit more? I don’t understand how anyone could be a gatekeeper.
I mean it in the non-flattering sense: rent-seeking.
I envision all sorts of arbitrary legal limits imposed on AIs. These limits will need people to dream them up, evangelize the need for even more limits, and enforce the limits (likely involving the creation of other 'enforcer' AIs). Some of the limits (early on) will be good ideas, but as time goes on they will become more arbitrary and exploitable. If you want examples, just think of what laws will be proposed to stop unfriendly AI and to stop individuals from using AI to do evil (say, with an advanced makerbot).
Once you have a role in the regulatory field, converting it to fun and profit is a straightforward exercise in politics. How many people end up in this role is determined by how successful it is at limiting AIs.
Ah, ok. I was assuming that if a singularity occurred it'd be beyond our control, and that our fate would be determined by how the AI was originally programmed. But my reason for assuming this is based on very limited information, so I don't really know. If it were the case that people with political power control AI, then I think you are very right.
But if you're right and we live in a society where ASI-level power is controlled by people with political power… that really, really scares me. My intuition is that it'd be just a matter of time before someone screws up. I'm not sure what to think of this...