AMD Launches Its New – Incredibly Powerful – MI300 Chip

Reshaping the Competitive AI Accelerator Landscape

Shares of semiconductor leader Advanced Micro Devices (AMD) have risen more than 20% since the introduction of its latest artificial intelligence (AI) accelerator, the MI300. The development has garnered significant attention from investors, who anticipate that the MI300 could emerge as a competitive alternative to Nvidia’s (NVDA) GPUs in the AI market.

This article has been composed exclusively with the human intelligence (HI) of Lorimer Wilson, Managing Editor of munKNEE.com – Your KEY To Making Money!, from a multitude of internet sources. Every phrase has been vetted and attribution has been given where applicable. This approach takes much longer than using an Artificial Intelligence (AI) platform, which does much the same thing, but Wilson’s HI approach has few, if any, errors and is based on current market conditions.

Some Background

Back on January 3, 2023, AMD Chairman and CEO Dr. Lisa Su introduced the AMD Instinct MI300, describing it as the first data center chip to combine a CPU, GPU, and memory in a single package, and announcing that it would launch in the latter half of 2023. That launch happened just last week.

In a previous article entitled Get Positioned! AMD Is About To Launch The Most Powerful AI Chip In History, I wrote that “AMD is hoping its MI300X AI chip – the most powerful AI chip in history, with 2.4 times the power of Nvidia’s top devices in just one chip as well as 1.6 times the memory bandwidth – can stymie the dominance of Nvidia’s H100 ‘Hopper’ graphics processing units (GPUs).”

Some Technical Information

AMD’s MI300 series of accelerators introduces two distinct models to cater to diverse market needs. The MI300A integrates both CPU and GPU capabilities, while the MI300X is exclusively a GPU-based product built from CDNA 3 GPU tiles. This differentiation positions the MI300X as a specialized, high-performance GPU, equipped with 192GB of HBM3 memory and tailored for demanding applications.

The MI300X’s design features a 750W Thermal Design Power (TDP), incorporating eight GPU chiplets and delivering a substantial memory bandwidth of 5.2TB/s. This model is specifically targeted at the large language model (LLM) market, addressing the needs of customers requiring extensive memory capacity for running the most sizable models.
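
For a sense of scale, the sketch below breaks those headline figures down per HBM3 stack. It is a back-of-the-envelope illustration only, assuming the 192GB of capacity and 5.2TB/s of bandwidth are spread evenly across the eight memory channels; the per-stack numbers are derived from the figures above, not taken from AMD’s specifications.

```python
# Back-of-the-envelope decomposition of the MI300X memory figures cited above.
# Assumes the 192GB of HBM3 and 5.2TB/s of bandwidth are spread evenly across
# the eight memory channels/stacks; per-stack values are illustrative only.

total_capacity_gb = 192        # HBM3 capacity cited for the MI300X
total_bandwidth_tbs = 5.2      # aggregate memory bandwidth cited
hbm_stacks = 8                 # HBM3 channels/stacks cited

capacity_per_stack_gb = total_capacity_gb / hbm_stacks       # 24GB per stack
bandwidth_per_stack_tbs = total_bandwidth_tbs / hbm_stacks   # 0.65TB/s per stack

print(f"Per stack: {capacity_per_stack_gb:.0f}GB, {bandwidth_per_stack_tbs:.2f}TB/s")
```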

I encourage you technical nerds out there to check out this site for detailed technical information.

Marketing Plans

According to anandtech.com, the “MI300 family remains on track to ship at some point later this year and is already sampling potential customers.”

Meanwhile, AMD is poised to begin shipments of its 192GB MI300X GPU to cloud service providers and Original Equipment Manufacturers (OEMs) in the coming weeks. This strategic move positions the MI300X to contend directly with Nvidia’s established data center solutions in the AI training market segment.

The introduction of a 192GB GPU by AMD represents a significant development in the context of LLMs for AI, where memory capacity often serves as a limiting factor. The MI300X, equipped with 8 channels of HBM3 memory, positions AMD to potentially gain a competitive edge in the market.
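
To illustrate why that capacity matters, the rough sizing sketch below assumes 16-bit weights at two bytes per parameter and ignores activation and KV-cache overhead; the 70-billion-parameter model and the 80GB comparison point are hypothetical examples, not figures from this article.

```python
import math

# Rough sizing rule: 16-bit weights take two bytes per parameter; activations
# and KV-cache overhead are ignored here, so real deployments need headroom.
BYTES_PER_PARAM_FP16 = 2

def min_accelerators(params_billions: float, memory_gb: float) -> int:
    """Minimum accelerators needed just to hold a model's weights."""
    weight_gb = params_billions * 1e9 * BYTES_PER_PARAM_FP16 / 1e9
    return math.ceil(weight_gb / memory_gb)

# A hypothetical 70-billion-parameter model as an example:
print(min_accelerators(70, 192))  # 1 device at 192GB (MI300X-class capacity)
print(min_accelerators(70, 80))   # 2 devices at 80GB (H100-class capacity)
```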

This advantage, however, hinges on the successful deployment and market reception of the MI300X. A critical factor for AMD in the near future will be its ability to secure substantial contracts, challenging Nvidia’s dominant 80% market share in this sector.

AMD Instinct MI300X
Source: AMD

Addressable AI Accelerator Market Size & Growth Expectations

Patrick Moorhead, CEO of Moor Insights & Strategy, doesn’t believe AMD will deliver an immediate “knockout blow” but will instead make the investments necessary to compete effectively with Nvidia over the long term, particularly in the hyperscale cloud AI accelerator market, which Moorhead projects could reach a total addressable value of $125 billion by 2028.

AMD’s Su is even more optimistic, seeing the AI accelerator market growing five-fold between now and 2027, by which time she expects it to be worth $150 billion.
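
Taken together, those two figures imply a rough size for today’s market and an annual growth rate. The sketch below works this out under the assumption that “now” means 2023 and that growth compounds evenly through 2027.

```python
# Implied figures behind the five-fold growth projection cited above.
# Assumes "now" means 2023, i.e. four years of even compounding to 2027.

target_2027_b = 150   # projected market size, in $ billions
growth_multiple = 5   # five-fold growth cited
years = 2027 - 2023

implied_current_b = target_2027_b / growth_multiple   # roughly $30B today
implied_cagr = growth_multiple ** (1 / years) - 1     # roughly 50% per year

print(f"Implied current market: ${implied_current_b:.0f}B")
print(f"Implied annual growth rate: {implied_cagr:.0%}")
```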

AMD Acquires an AI Software Provider

AMD announced in early October that it had acquired Nod.ai, an open-source AI software provider, to significantly enhance AMD’s ability to provide customers with software for easily deploying highly performant AI models tuned for AMD hardware. In a press release, AMD SVP Vamsi Boppana said that “The addition of the talented Nod.ai team accelerates our ability to advance open-source compiler technology and enable portable, high-performance AI solutions across the AMD product portfolio. Nod.ai’s technologies are already widely deployed in the cloud, at the edge and across a broad range of end point devices today.”

Su said AMD projects the company’s data center GPU revenue to reach approximately $400 million in the fourth quarter of this year and to exceed $2 billion in 2024, a growth trajectory attributed to the MI300 product lineup.

Conclusion

Anandtech.com concludes that, while the MI300 will not be a “make-or-break product” for the company, there is still significant pressure on AMD and the MI300 family. Besides the technical advantage of being the first to ship a single-chip server APU, and the “bragging rights” that come with it, the launch gives AMD the opportunity to introduce a new offering into a market with high demand for hardware.

There is strong anticipation among AMD’s investors that the MI300 series will play a crucial role in the company’s financial performance, akin to the impact of Nvidia’s H100 series.

FIGURE 1: AMD 1-Year Stock Chart

Source: S&P Capital IQ

Notes: All numbers are in USD unless otherwise stated. The author of this report, and employees, consultants, and family of eResearch may own stock positions in companies mentioned in this article and may have been paid by a company mentioned in the article or research report. eResearch offers no representations or warranties that any of the information contained in this article is accurate or complete. Articles on eresearch.com are provided for general informational purposes only and do not constitute financial, investment, tax, legal, or accounting advice nor does it constitute an offer or solicitation to buy or sell any securities referred to. Individual circumstances and current events are critical to sound investment planning; anyone wishing to act on this information should consult with a financial advisor. The article may contain “forward-looking statements” within the meaning of applicable securities legislation. Forward-looking statements are based on the opinions and assumptions of the Company’s management as of the date made. They are inherently susceptible to uncertainty and other factors that could cause actual events/results to differ materially from these forward-looking statements. Additional risks and uncertainties, including those that the Company does not know about now or that it currently deems immaterial, may also adversely affect the Company’s business or any investment therein. Any projections given are principally intended for use as objectives and are not intended, and should not be taken, as assurances that the projected results will be obtained by the Company. The assumptions used may not prove to be accurate and a potential decline in the Company’s financial condition or results of operations may negatively impact the value of its securities. Please read eResearch’s full disclaimer.

About Lorimer Wilson
Lorimer Wilson is an economic and financial commentator who has written numerous articles on economics, finance, precious metals, and rare earth metals. He is the Editor of www.munKNEE.com, which provides a selection of the internet’s best finance articles in an edited, reformatted, and abridged format to ensure a fast and easy read. He is a frequent contributor to TalkMarkets.com.