Are we sure that OpenAI still believes in “open AI” for its larger, riskier projects? Their recent actions suggest they’re more cautious about sharing source code, and projects like GPT-3 have so far been “released” only via API access. See also this news article criticizing OpenAI for moving away from its original mission of openness (which it frames as a bad thing).
In fact, you could maybe argue that the availability of OpenAI’s APIs acts as a sort of pressure release valve: it lets some people use the APIs instead of investing in developing their own AI. This could be a good thing.