To be honest, I think crypto is either neutral or even slightly positive for the growth of AGI. The reason is that crypto mining is one of the major drivers of GPU and specialized ASIC design. While it does cause shortages, in the long run it actually helps fund the development of more chip fabs and even more powerful GPUs. It indirectly makes AI R&D cheaper because it lowers the long-term cost of powerful GPUs, and software-stack improvements made to support GPU compute for the crypto market carry over to the inference and network-training market.
Moreover, one company that got its start making crypto ASICs (Bitmain) has used the money and chips it developed to get into the AI market.
However, at least for the niche I work in, AI systems that “take off” seem very far away. I think years of work are needed just to develop the platforms that make development and deployment of AI systems that do relatively well-defined things (driving a car without crashing, maneuvering a robot for manufacturing or logistics) reliable and economical.
Once we have this kind of stuff working reliably, maybe we can start looking into the higher-level abstractions that artificial sentience would need. But I don’t think it will be simple, and I don’t think it will just accidentally wake up on a random developer’s computer and eat the internet. Probably.
Good point. A broad switch away from proof of work (as seems to be happening) may change that dynamic.
What incentive is there for a broad switch to proof of work?
Away from proof of work. :-)
Sorry that’s what I meant to ask
Preventing 51% attacks, and maybe other attacks too. And for the environmentally minded, there’s also the incentive to decrease power consumption. I heard a region of China has banned new mining facilities because of their energy consumption. If that continues, the total honest hash rate shrinks, so mustering a majority of it gets cheaper, and 51% attacks may become easier again unless more blockchains switch away from proof of work.
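For intuition on why the 50% line matters so much: here’s a rough Python sketch of the catch-up probability from the gambler’s-ruin analysis in the Bitcoin whitepaper (my own illustration; the function name and the 6-confirmation example are just for demonstration). Below half the hash power, the attacker’s chance of rewriting a confirmed block decays fast; at or above half, it goes to 1.

```python
import math

def attacker_success_probability(q: float, z: int) -> float:
    """Probability that an attacker controlling fraction q of total
    hash power ever overtakes the honest chain from z blocks behind
    (gambler's-ruin formula from the Bitcoin whitepaper)."""
    p = 1.0 - q              # honest hash power share
    if q >= p:               # at or above 50%, the attacker eventually wins
        return 1.0
    lam = z * (q / p)        # expected attacker progress while honest chain mines z blocks
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

# The cliff at 50%, with the usual 6-confirmation depth:
for q in (0.10, 0.30, 0.45, 0.51):
    print(f"q={q:.2f}: P(success) = {attacker_success_probability(q, 6):.4f}")
```

Run it and you can see the jump: at q=0.10 the six-deep attack is essentially hopeless, while at q=0.51 success is guaranteed given enough time.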
See, that’s why I asked what the incentive is to switch to proof of stake, not why it’s better. As with climate change, this is a coordination problem.