I haven't seen any update about AMD being able to resolve its supply issues for MI300 cards, nor do I see wide adoption. A few months ago, a couple of "AI datacenter" startups shared the stage with AMD to announce new MI300-based offerings, but some of them have since quietly embraced NVDA and abandoned AMD due to poor supply.
There's no news highlighting how META uses these cards or what performance benefits they see in training or inference on MI300. Azure and OCI have a few regions with MI300 compute instances, but I haven't seen any customer success stories comparing them against Nvidia's older-generation T4, A100, and H100 cards.
The state of ROCm isn't bad. It's improving, and many people have written blogs and KB articles showing Llama 3 models running without issues on MI300 cards. I've read some articles about running Mistral as well. However, AMD still isn't able to demonstrate end-to-end solutions.
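For context, running one of those Llama 3 models on an MI300 box looks essentially the same as on Nvidia hardware, since PyTorch's ROCm build exposes HIP GPUs through the usual torch.cuda interface. Here's a minimal sketch, assuming a ROCm build of PyTorch, the Hugging Face transformers library, and access to the gated meta-llama/Meta-Llama-3-8B-Instruct weights:

```python
# Minimal sketch: Llama 3 inference on an MI300 via ROCm.
# Assumes a ROCm build of PyTorch and the transformers library are installed,
# and that you have been granted access to the meta-llama model weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# On ROCm builds, HIP devices are exposed through the torch.cuda interface,
# so the same "cuda" device string works on MI300 as on Nvidia GPUs.
device = "cuda" if torch.cuda.is_available() else "cpu"
if device == "cuda":
    print("Running on:", torch.cuda.get_device_name(0))

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16
).to(device)

prompt = "Summarize the state of ROCm in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The point is that single-model inference like this is the part that already works; it's the broader end-to-end stack where AMD still has to prove itself.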
Who feels optimistic that AMD can grab at least 5% of the AI market in the next 2 to 3 years at this snail's pace? What are you expecting from tomorrow's AMD AI event?
What, if any, are the growth prospects for AMD in the AI era with such limited supply of MI cards?