Before you can answer this question, I think you have to look at a more fundamental question, which is simply: why are so few people interested in supporting FAI research, or concerned about the possibility of UFAI?
It seems like there are a lot of factors involved here. In times of economic stress, short-term survival tends to dominate over long-term thinking. Among the people who are doing long-term thinking, many are more focused on other problems, such as resource depletion and global warming; even if you don't think those are as big a threat to our future as UFAI is, progress toward solving them would free up some of that "intellectual energy" for other problems. Pessimistic thinking about the future in general, and widespread pessimism about our odds of ever advancing to the point of having true GAI, are also likely factors.
In general, I think that the better the economy is, the more people see the world getting better, and the more other problems we are able to solve, the more likely people are to start thinking about long-term risks such as UFAI, nanotechnological weapons, etc. I don't think the difference between a better economy and a worse economy is likely to make much difference to the rate of technological change (the historical record suggests that, for example, the Great Depression didn't slow down technology much), but it is likely to have a major impact on the kind of short-term thinking that people fall into during a crisis, and on the kind of long-term thinking that people are more able to do when the short-term issues seem to be under control.