Samsung Electronics Co. Ltd. (SSNLF) might finally be finding its footing in the high-stakes world of AI memory chips. In a New Year address that probably had investors leaning forward in their seats, semiconductor chief Jun Young-hyun delivered news the company has been waiting to share: customers are genuinely impressed with Samsung's upcoming HBM4 chips.
This isn't just corporate cheerleading. After struggling to keep pace in the AI memory race, Samsung appears to be regaining momentum at exactly the right moment, as demand for artificial intelligence hardware continues its relentless climb.
The Customer Feedback Samsung Needed
According to a Reuters report, Jun told employees that customers have responded positively to Samsung's High Bandwidth Memory 4 products, highlighting renewed confidence in the company's AI-focused memory strategy. More specifically, Samsung says customers are encouraged by both the performance and competitiveness of its next-generation HBM lineup.
That's the kind of validation that matters in this business. Performance specs on a datasheet are one thing, but getting potential buyers excited about what you're building? That's how you start winning back market share.
Samsung is also in discussions with Nvidia Corp. (NVDA) about supplying HBM4 chips. Landing Nvidia as a customer would be a significant win as Samsung works to narrow the gap with market leader SK Hynix in the rapidly expanding AI memory segment.
Already Serving the AI Giants
It's not like Samsung is starting from zero here. The company already supplies high-bandwidth memory to major AI chip designers, including Alphabet Inc.'s (GOOGL) Google, which uses Samsung HBM in AI chips built with Broadcom Inc. (AVGO). Advanced Micro Devices Inc. (AMD) also buys Samsung HBM under a broader partnership arrangement.
The demand surge makes sense when you understand what these chips actually do. HBM demand is exploding because AI and generative AI workloads require faster, more power-efficient memory to process massive volumes of data. This memory is central to training large language models and powering the AI accelerators that companies like Nvidia, Google, and AMD are cranking out.
But here's the uncomfortable reality: despite Samsung's progress, the company still trails SK Hynix by a meaningful margin. SK Hynix held a 53% share of the global HBM market in the third quarter of 2025, according to Counterpoint Research. Samsung's share stood at 35%, while Micron Technology Inc. (MU) held 11%.
That's not a disaster by any means, but it's also not where Samsung wants to be. The positive customer feedback on HBM4 suggests the company has a real shot at closing that gap in the next generation.
Beyond Memory: Foundry Wins and Future Worries
Samsung's semiconductor story extends beyond memory chips. Jun cited new momentum in the company's foundry business, pointing to supply deals with major global customers. The headline grabber here is a $16.5 billion agreement with Tesla Inc. (TSLA), the kind of deal that validates Samsung's chip manufacturing capabilities on the global stage.
But it's not all optimism and celebration. Co-CEO TM Roh warned that 2026 could bring rising component costs, global tariff risks, and broader economic uncertainty. Samsung plans to diversify its supply chain to manage those pressures, which is basically the corporate equivalent of not putting all your eggs in one geopolitical basket.
SK Hynix CEO Kwak Noh-Jung echoed the cautious outlook, stressing the need for continued aggressive investment. Translation: the AI chip race isn't slowing down, and neither can the companies competing in it.
For Samsung, the customer enthusiasm around HBM4 represents more than just positive feedback. It's a signal that the company's substantial investments in catching up to SK Hynix might actually be working. Whether that translates into meaningful market share gains will depend on execution, but at least Samsung is heading into the next generation with customers who believe in what they're building.