High bandwidth memory (HBM) is a type of vertically stacked DRAM that offers much higher data-transfer rates and lower power consumption than conventional DRAM. It is widely used in applications that demand high performance and efficiency, such as artificial intelligence (AI), graphics processing units (GPUs), and high-performance computing (HPC).
According to TrendForce, a market research firm, the demand for HBM has been growing rapidly due to the increasing adoption of AI servers across various industries and domains. AI servers are equipped with GPUs that use HBM as their main memory to handle large-scale and complex data processing tasks. TrendForce estimates that AI server shipments will increase by 8% year-over-year in 2023 and by 15.4% in 2024, driven by the emergence of new applications such as self-driving cars, artificial intelligence of things (AIoT), edge computing, and chatbots.
Among HBM suppliers, SK Hynix holds a dominant lead. TrendForce reports that SK Hynix had a 50% market share in 2022, followed by Samsung with 40% and Micron with 10%. SK Hynix is expected to expand its share further to 53% in 2023, while Samsung and Micron are projected to slip to 38% and 9%, respectively.
One reason for SK Hynix's success is its early adoption and mass production of HBM3, the latest generation of HBM technology. HBM3 offers higher bandwidth, density, and reliability than the previous generation, HBM2e. According to TrendForce, SK Hynix is currently the only supplier mass-producing HBM3. This gives it a competitive edge over its rivals, as HBM3 is in high demand for advanced GPUs such as Nvidia's H100 and AMD's Instinct MI300.
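The bandwidth gap between the two generations comes down to per-pin signaling speed, since both use the same 1024-bit interface per stack. A rough sketch of the arithmetic, assuming the commonly cited peak pin rates (3.6 Gb/s for HBM2e, 6.4 Gb/s for HBM3):

```python
# Back-of-the-envelope per-stack bandwidth for HBM generations.
# Pin rates below are the commonly cited peak figures, not guaranteed
# shipping speeds; both generations use a 1024-bit interface per stack.

def stack_bandwidth_gbs(pin_rate_gbit: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth of one HBM stack in GB/s (pin rate x bus width / 8)."""
    return pin_rate_gbit * bus_width_bits / 8

hbm2e = stack_bandwidth_gbs(3.6)  # 460.8 GB/s per stack
hbm3 = stack_bandwidth_gbs(6.4)   # 819.2 GB/s per stack
print(f"HBM2e: {hbm2e:.1f} GB/s, HBM3: {hbm3:.1f} GB/s per stack")
```

With several stacks per GPU package, this per-stack step-up is what pushes total memory bandwidth into the multiple terabytes per second that accelerators like the H100 advertise.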
Another driver of SK Hynix's strong performance is its close collaboration with major AI server customers such as Google, AWS, Meta, and Microsoft. These four North American cloud service providers (CSPs) accounted for 66.2% of global AI server demand in 2022. SK Hynix has been supplying them with customized HBM solutions tailored to their specific needs and specifications. For example, SK Hynix has been working with Microsoft on its OpenAI partnership, which involves a large-scale language model named Prometheus and the chatbot service ChatGPT. These applications require large amounts of memory capacity and bandwidth, which SK Hynix's HBM can provide.
As the demand for AI servers and HBM continues to grow in the coming years, SK Hynix is well-positioned to benefit from this trend and maintain its leadership in the HBM market.