> the sort of lumbering big company where great research or tech is developed by one team and then never reaches anybody else
… except one of our primary threat models is accident risk, where the tech itself explodes and the blast wave takes out the light cone. To paraphrase: the sort of “great tech” we’re worrying about is precisely the tech that could autonomously circumvent this kind of bureaucracy-based causal isolation. So in this one case, it matters comparatively little how bad Microsoft is at deploying its products; what matters is how well it can assist in developing them.
I mean, I can buy that Microsoft is so dysfunctional that just being embedded into it would cripple OpenAI’s ability to even do research, but it sounds like Sam Altman is pretty good at what he does. If it’s possible to do productive work as part of MS at all, he’d probably manage to make his project do it.