Access to AI: a human right?
Authors, lawyers, teachers, researchers, doctors, coders. These are just some of the people who’ve found that their work can be done dramatically better with the right application of GPT-3 and other large transformer models. Because of this, I posit that access to the best AI models will become as fundamental to the productivity of most jobs as access to a computer and the internet is today.
Unlike computers, large transformers cost millions of dollars to train and run, so there is a far greater risk of centralisation, control, and surveillance. OpenAI has announced that it will closely monitor access and revoke it for anyone engaging in harmful usage. Yet this kind of power to exclude somebody from a basic tool of productivity, or to censor what they use the tool for, is far too risky for any private organisation to hold. Even with good intentions, the humans wielding such power can be pressured, threatened, or deceived into harming legitimate users or denying them access.
If transformer-based AI is going to be a fundamental tool in most high-value jobs within the next couple of years, it will likely cause an unprecedented concentration of power in the hands of big tech companies, especially if they continue to enforce censorship rules. Imagine if Amazon, Google, and OpenAI could decide that you no longer get access to their transformer models because they don’t like what you tweeted. In many ways, this would be much worse than losing your job: without access to the AI, you might never be competitive in the job market again.
---xxx---
Imagine it is 2021 in a parallel universe where Apple is the only company manufacturing personal computers; let’s say it has exclusive control over the mines of a rare-earth mineral essential for chip manufacturing. Its latest (mandatory) software update ships with an AI tool that can detect who is using the computer by monitoring their mouse movements and typing style (let’s assume these are very hard to fake). Apple has suddenly gained the power to effectively exclude anyone from the entire tech ecosystem.
You’re a journalist. The morning after publishing a hard-hitting piece criticising one of the world’s governments, you try to log on to see people’s reactions. Instead, you’re greeted with a message: “Apple has suspended your computer access due to a violation of our terms”. Short of a cancer diagnosis, this might be about the worst news you could get in this parallel universe. Without a computer, all the highly productive skills you’ve acquired since childhood are suddenly worthless. Other people won’t even let you use their devices, for fear of being banned themselves. From a high-value journalist, you’re suddenly reduced to waiting tables until Apple decides to lift your ban.
This is just an illustration of the kind of power organisations would wield if they controlled our access to advanced computing. Scarily enough, the two passages above would be equally terrifying if we simply replaced the words “personal computer” with “personal AI”. And while, thankfully, anybody with the funds can buy a computer today, the same may not be true of advanced AI tools.
---xxx---
For all these reasons, I believe a decentralised AI, to which everyone has equal access at the same price, is the need of the hour. First, such an AI would drive the price of using the model down towards the cost of the underlying GPUs, preventing large organisations from charging a 2x markup. Second, such an AI could not be censored or shut down, and its users could not be cancelled for any reason, allowing for true equality of opportunity in a world with advanced AI. I’m already working on this and will share more details shortly. Anyone interested in helping me code this up, please DM me [discord: dmtea#7497].
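To make the pricing argument concrete, here is a rough back-of-the-envelope sketch in Python. Every number in it (GPU rental cost, throughput, markup) is a hypothetical placeholder rather than a measurement of any real model or provider; the point is only that in a competitive, decentralised market the price per token should approach the raw GPU cost, while a monopoly provider can sustain a markup on top of it.

# Purely illustrative: every figure below is a hypothetical placeholder,
# not a measurement of any real model or provider.
gpu_cost_per_hour = 2.0          # assumed rental cost of one GPU, in dollars
tokens_per_gpu_hour = 1_000_000  # assumed inference throughput per GPU-hour

# Raw compute cost of generating one million tokens.
compute_cost_per_million = gpu_cost_per_hour * 1_000_000 / tokens_per_gpu_hour

# A centralised provider facing no competition can add a markup on top.
markup = 2.0
centralised_price = compute_cost_per_million * markup

# In a decentralised network where anyone with a GPU can serve the model,
# competition should push the price back towards the raw compute cost.
decentralised_price = compute_cost_per_million

print(f"raw compute cost:  ${compute_cost_per_million:.2f} per 1M tokens")
print(f"centralised (2x):  ${centralised_price:.2f} per 1M tokens")
print(f"decentralised:     ${decentralised_price:.2f} per 1M tokens")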