debating buying NVDA in 2019

Alice: You saw GPT-2, right?

Bob: Of course.

Alice: It’s running on GPUs using CUDA. OpenAI will keep scaling that up, and other groups will want to do the same thing.

Bob: Right.

Alice: So, does this mean we should buy Nvidia stock?

Bob: I’m not sure. Nvidia makes the hardware used now, but why should we expect it to be the hardware used in the future? There’s clearly room for improvement: current GPUs aren’t optimized for lower-precision numbers or sparsity. Designs will change, which means a competitor might do better. At the scales we’re talking about, it makes some sense to design your own ASICs. Google already has TPUs, and Amazon & Facebook will probably do something similar.
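
(A minimal sketch of Bob's lower-precision point, assuming PyTorch and a CUDA-capable GPU, neither of which the dialogue specifies: the same matrix multiply timed in fp32 and fp16. On 2019-era GPUs with tensor cores, the fp16 version typically runs a few times faster; on older hardware the gap is much smaller, which is the kind of headroom Bob is gesturing at.)

```python
# Illustrative only: time a square matmul in fp32 vs fp16 on the GPU.
import time
import torch

def time_matmul(dtype, n=4096, iters=10):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        _ = a @ b          # the result is discarded; we only care about timing
    torch.cuda.synchronize()
    return (time.time() - start) / iters

print("fp32 seconds/iter:", time_matmul(torch.float32))
print("fp16 seconds/iter:", time_matmul(torch.float16))
```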

Alice: OK, but the researchers are all people who started out doing stuff on their personal GPUs using CUDA.

Bob: And you don’t think AMD or somebody will be able to make other GPUs compatible with CUDA, now that it’s a priority?

Alice: Eventually, maybe, but I think you’re massively underestimating the difficulty of that.

Bob: Again, Google is already using TPUs, so clearly they have software for that. Look, neural networks are mostly big matrix multiplications; chips for them should be easier to design than chips for graphics, and Nvidia already has strong competition for GPUs.
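
(To make the "mostly big matrix multiplications" claim concrete, here is a toy two-layer MLP forward pass in numpy, with arbitrary sizes chosen only for illustration: essentially all of the arithmetic is in the two matmuls, and the elementwise nonlinearity is negligible by comparison.)

```python
# Toy forward pass: matmul, ReLU, matmul. Sizes are arbitrary.
import numpy as np

batch, d_in, d_hidden, d_out = 64, 1024, 4096, 1024
x  = np.random.randn(batch, d_in)
w1 = np.random.randn(d_in, d_hidden)
w2 = np.random.randn(d_hidden, d_out)

h = np.maximum(x @ w1, 0)   # matmul + cheap elementwise ReLU
y = h @ w2                  # another matmul

# Rough FLOP count: the two matmuls dominate.
matmul_flops = 2 * batch * d_in * d_hidden + 2 * batch * d_hidden * d_out
relu_flops   = batch * d_hidden
print(matmul_flops / relu_flops)   # ratio on the order of thousands
```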

Alice: You want HBM for NN ASICs, and TSMC is the only company doing that well. Nvidia has reserved a lot of TSMC’s capacity.

Bob: Apple did too. More importantly, if TSMC capacity is the limiting factor, then the profits should go to TSMC. There are contracts for now, but GPT-2 is still a ways off from being useful, and it wouldn’t make sense to make really big purchases until the next generation of ASICs comes out. At that point the contracts could be renegotiated, and TSMC could raise prices up to the point where Samsung becomes almost as good an option.

Alice: Hmm, maybe. I still think you’re underestimating the software moat Nvidia has. Long-term, maybe there’s a real competitor, but every big company is going to want to train their own language model ASAP.

Bob: Why? I could understand fine-tuning, but why would they need to train their own from scratch? I imagine there will be a few big groups making their own models, and then everybody else could just license the best ones. Competition between those groups should be fairly even, meaning low net profits. There might even be competitive open-source models from somebody.

Alice: No, big companies will want to make their own. You’re not considering the incentives of people at those companies. If they think AI will be big, they’ll want job experience “making AI”. And CEOs will be afraid that markets will punish them for not having their own AI program, because investors will think experience with AI could be important in the future.

Bob: What?! OpenAI has only been around for a few years! Corporate “experience” with AI won’t matter; just hire decent people and read the latest papers.

Alice: Maybe so, but that’s not how a lot of investors think.

Bob: Is that the basis of our investment plan, then? CEOs do something dumb to please dumb investors?

Alice: You already said there’s room for improvement with NN ASICs, right?

Bob: Of course, you can [redacted]. But obviously a complete design is too large a project for just me.

Alice: Well then, it seems you think there’s room for Nvidia to keep improving and stay ahead of other designers. Nvidia was leading in hardware acceleration of ray tracing, and they’ll have a big budget, so it seems like they’ll lead in NN ASIC design too, at least for a while.

Bob: I’m not convinced that such competence carries over to other designs. For all you know Apple or Amazon will do better than them. Or maybe Huawei, or Will Semiconductor.

Alice: Even if that’s true, you’re looking too far ahead. Stock prices are based on profits in the last few quarters and the stories in the media. There’s a whole pipeline for this stuff, and it takes years. Also, Nvidia can afford to steal all the best GPU software people from AMD.

Bob: Again, there are already TPUs.

Alice: Fun fact: Nvidia is actually doing better than TPUs in NN performance per mm^2, despite their processors being less special-purpose.
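
(The metric Alice is citing is just throughput divided by die area. A sketch of that comparison is below; the numbers are placeholders for illustration only, not the real specs of any particular GPU or TPU.)

```python
# Performance per unit die area, with made-up illustrative numbers.
def perf_per_area(tflops, die_area_mm2):
    return tflops / die_area_mm2

gpu_tflops, gpu_area_mm2 = 120.0, 800.0   # hypothetical GPU
tpu_tflops, tpu_area_mm2 = 90.0, 700.0    # hypothetical TPU

print("GPU TFLOPS/mm^2:", perf_per_area(gpu_tflops, gpu_area_mm2))  # ~0.15
print("TPU TFLOPS/mm^2:", perf_per_area(tpu_tflops, tpu_area_mm2))  # ~0.13
```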

Bob: OK, but presumably Google could still sell those if there’s so much demand.

Alice: Maybe Google just won’t be able to make TPUs fast enough to sell them and do their own stuff. And maybe there just won’t be much other strong competition in the relevant timeframe.

Bob: Do all the best people want to work in Nvidia’s giant open offices, then? It’s not like they have a monopoly on talent; they certainly wouldn’t hire me or [redacted].

Alice: Sure, but neither would the ASIC companies getting VC funding. If any ASIC startup actually becomes a threat, Nvidia can buy it out too. The Chinese talent pool is also somewhat separate, but Chinese companies have their own geopolitical and management issues.
