Aside from the theoretical similarities between the two fields, there are also interesting practical aspects.
Some positive effects:
Cryptocurrencies have made some AI alignment researchers much wealthier, allowing them to focus more on their research.
Some of the alignment orgs (e.g. MIRI) got large donations from the crypto folk.
Some negative effects:
Cryptocurrencies allow AIs to participate directly in the economy, without human intermediaries. These days, a Python script can buy and sell goods and services, and even hire freelancers. And some countries are moving toward a crypto-based economy (e.g. El Salvador). This could greatly increase the speed of AI takeoff.
Some cryptocurrencies are general-purpose computing systems that are practically uncensorable and indestructible (short of switching off the Internet). Thanks to this, even a sub-human AI could become impossible to switch off.
Both effects reduce the complexity of the first steps of AI takeoff. Instead of hacking robotic factories or whatever, the AI could just hire freelancers to run its errands. Instead of hacking some closely monitored VMs, it could run itself on Ethereum. And so on. Gaining the first money, human minions, and compute is now a mundane software problem that doesn't require a Bayesian superintelligence.
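To make the "hire freelancers to run its errands" step concrete, here is a minimal sketch of just the payment part, assuming the web3.py library. The RPC URL is a placeholder, the PRIVATE_KEY variable and the pay_freelancer helper are made up for illustration, and the actual task-checking logic is left out entirely.

```python
# Hypothetical sketch: a script pays a human counterparty in ETH once it decides
# a task is done. Assumes web3.py, a funded key in the PRIVATE_KEY env var, and
# a placeholder RPC URL; everything task-related is left abstract.
import os
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))  # placeholder node URL
acct = w3.eth.account.from_key(os.environ["PRIVATE_KEY"])    # the script's own wallet

def pay_freelancer(recipient: str, amount_eth: float) -> str:
    """Sign and broadcast a plain ETH transfer; return the transaction hash."""
    tx = {
        "to": Web3.to_checksum_address(recipient),
        "value": w3.to_wei(amount_eth, "ether"),
        "gas": 21_000,                                  # cost of a simple transfer
        "gasPrice": w3.eth.gas_price,
        "nonce": w3.eth.get_transaction_count(acct.address),
        "chainId": w3.eth.chain_id,
    }
    signed = acct.sign_transaction(tx)
    # Attribute is raw_transaction in web3.py v7+, rawTransaction in older releases.
    return w3.eth.send_raw_transaction(signed.raw_transaction).hex()

# if task_looks_done(freelancer_report):   # hypothetical check, not implemented here
#     pay_freelancer(freelancer_address, 0.05)
```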
This also makes a stealthy takeoff more realistic. On the Internet, nobody knows you’re a self-aware smart contract who is paying untraceable money to some shady people to do some shady stuff.
This comment lists more negative than positive effects. But I have no idea if crypto is a net positive or not. I haven’t thought deeply on the topic.
As another upside, I’d want to add all the research and experiments in decentralized coordination—funding of public goods, quadratic voting, etc. Insofar as AI x-risk is not just a problem of AI alignment but also of coordination, any novel attempts at improving coordination are potentially very valuable.
It doesn't seem to be a consequence of crypto specifically. Any API qualifies here.
That said, crypto could make it harder to block such trades at the financial-system level.
For a digital entity, it is tricky to handle fiat currency (say, USD) without relying on humans. For example, to open any kind of account (bank, PayPal, etc.), one needs to pass KYC checks, CAPTCHAs, and so on. The same goes for any API that allows transfers of fiat currency. The legacy financial system is explicitly designed to be shielded against bots (except for bots owned by registered humans).
But in the crypto space, you can create your own bank in a few lines of code, without any human assistance. There are no legal requirements for participation: you don't need a valid identification document, a postal address, or anything of the sort.
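As a concrete illustration of the "few lines of code" claim: an on-chain "account" is nothing more than a locally generated keypair. A minimal sketch, assuming the eth-account Python package; no registration, identity check, or third party is involved.

```python
# Minimal sketch, assuming the eth-account package: an Ethereum account is just
# a keypair generated locally, with no registration step of any kind.
from eth_account import Account

acct = Account.create()           # random private key, generated offline
print("address:", acct.address)   # can receive funds immediately
print("key:", acct.key.hex())     # whoever holds this key controls the funds
```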
Thanks to crypto, a smart enough Python script could earn money, trade goods and services, or even hire humans, without a single interaction with the legacy financial system.
Crypto is an AI-friendly tool to convert intelligence directly into financial power.
That said, I'm not sure whether this has any meaningful impact on x-risk. For a recursively self-improving AGI, hijacking the legacy financial system could be as trivial as hijacking the crypto space.