It’s really heartening to see Apple taking the initiative and making themselves accountable to third-party audits of the hardware security properties of their inference machines. That’s always seemed like an obvious step: consumers care about this stuff, so win on trust, win consumers. They’re going to try to negotiate OpenAI into running their inference on Apple’s PCC. And if they manage that, they’re going to get consumers to recognize and celebrate it. That really could raise expectations; it could even garner political interest.
And so I kind of wonder if the reason Elon is being so dismissive of that is that he’s decided it’s a moral standard he can’t compete with, so he has to downplay it and claim, absurdly, that they’re not really doing it. X.AI and Tesla’s business model is all about training on user data without negotiating for it. A less cynical take would be that he just can’t believe the commitment will be kept, because access to massive training sets may be seen as critical to the near-term success of all of these products. Right now, none of them work all that well; if users data-unionize (or if the EU makes Apple-style hardware privacy mandatory) and make it much more expensive to do ever larger training runs, growth may reverse. If he’s right, it might not be possible for Apple to convince OpenAI to move their inference onto PCC.