This is right. For people who do not know: you cannot really use AMD GPUs for deep learning (at least not productively, though AMD is trying to get there), so AMD’s rise has little to do with AI.
You can sorta—as long as you don’t need specific CUDA features. I’ve generally had good experiences running PyTorch/TensorFlow code from GitHub on ROCm. Though that was on a Radeon VII, which is no longer sold and was basically a “datacenter GPU for home use”. My impression is that AMD has actually gotten worse recently at supporting deep learning on home GPUs, so the parent comment may well be true nowadays.
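The reason GitHub code often runs unmodified is that ROCm builds of PyTorch map the `torch.cuda` API onto HIP, so scripts written against an NVIDIA GPU don’t need porting. A minimal sketch of checking for this, assuming a ROCm (or CUDA) build of PyTorch is installed:

```python
import torch

# On ROCm builds of PyTorch, torch.version.hip is a version string
# (it is None on CUDA/CPU builds), and torch.cuda.is_available()
# reports True for a supported AMD GPU.
print("HIP runtime:", torch.version.hip)
print("GPU available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # The "cuda" device name is reused as-is; under ROCm this tensor
    # lands on the AMD GPU with no code changes.
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x
    print("Computed on:", y.device)
```

This aliasing is exactly why “specific CUDA features” are the sticking point: anything that goes through the generic `torch.cuda` surface tends to work, while custom CUDA kernels or NVIDIA-only libraries do not.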