Get capital while you can: money is broadly useful and can be quickly converted into other resources in a critical moment. At the very least, money can be converted into time. Be frugal; you might need your resources soon.
Besides, the value of human capital might fall. If you have a lucrative position (e.g., in finance or tech), now is a good time to focus on making money. Investing in your human capital by going back to school is a bad idea.
I find this advice puzzling.
Firstly, I suspect that when things get sufficiently crazy, money will become increasingly worthless.
Secondly, the general paradigm of automation often involves replacing many low-skill workers with a few high-skill engineers. By the time an AI can do everything that a highly skilled human AI expert can do, most human jobs are automated, and recursive self-improvement is picking up.
Suppose that an AI system can do work similar to a human AI expert's, at a compute cost of $10,000 (i.e., less than an AI expert's salary, but not orders of magnitude less).
This is the least extreme tech you need to put an AI expert out of a good job, and even then, I suspect some people will want a human. At this stage, the improvement work on AI is being done by AI. If we haven't already developed some friendliness theory, we will have unfriendly AIs producing more powerful unfriendly AIs. Most of the alignment work that you can do on an AI that's as smart as you and twice as fast can be done on an AI far smarter than you, like post-Foom ones.
In this scenario, the AIs will probably be smart enough to consider the end they are working towards. If that is profit, we are pretty much doomed. If that is being nice, we won't really need money.
I think there's a wide range of scenarios where narrow AI makes certain companies more profitable, replaces a lot of jobs, and maybe changes society as much as the industrial revolution did, without tipping over into recursive self-improvement of that type. Or at least not right away.
Most people, including most LessWrong readers, are not top AI experts, nor will they be able to become one quickly.
I would be surprised if we got a future where everything except top AI research is done by AI. What future are you thinking of where a few hundred AI researchers can earn a living, and no one else can?