In a nutshell: Samsung Electronics rode the AI hype train to a massive quarterly rebound, with operating profit skyrocketing nearly 933% year-over-year. And they're betting big that this AI boom is just getting started.
The tech giant reported blowout Q1 2024 results on Tuesday, with an operating profit of 6.61 trillion Korean won ($4.76 billion), up an eye-popping 932.8% from the same period last year. That crushed analyst expectations of 5.94 trillion won and landed bang on the company's pre-release guidance from earlier this month.
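For anyone who wants to sanity-check that headline growth figure, here's a minimal back-of-the-envelope sketch in Python. The implied year-ago base is derived purely from the numbers above; it isn't a figure from Samsung's release.

```python
# Sanity check of the headline figures (taken from the article).
q1_2024_operating_profit_trillion_krw = 6.61
yoy_growth_pct = 932.8

# Growth of 932.8% means Q1 2024 profit is (1 + 9.328) times the year-ago figure.
implied_q1_2023_base = q1_2024_operating_profit_trillion_krw / (1 + yoy_growth_pct / 100)
print(f"Implied Q1 2023 operating profit: ~{implied_q1_2023_base:.2f} trillion won")
# -> roughly 0.64 trillion won, consistent with last year's depressed memory market
```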
The rebound was fueled by a surge in memory chip prices, as AI companies scrambled for high-end semiconductors to power their increasingly complex large language models and other cutting-edge AI applications. After posting record losses in 2023 amid the post-pandemic slump, Samsung's chip division swung back to a 1.91 trillion won operating profit in Q1.
"As AI providers increase, the size of training data becomes proportionally bigger, leading to higher performance and data storage needs. So, we're seeing a lot of incoming requests from the customers," said Samsung during the earnings call.
Overall Q1 revenue jumped nearly 13% year-over-year to 71.92 trillion won ($52.3 billion), driven by solid Galaxy S24 flagship phone sales and those high-flying memory revenues.
But the path ahead isn't all roses. Samsung warned of escalating geopolitical risks, continued supply chain cost pressures, and slightly lower expected profits in Q2 before new product launches in the second half.
"In the second half of 2024, business conditions are expected to remain positive with demand - mainly around generative AI - holding strong, despite continued volatility relating to macroeconomic trends and geopolitical issues," the company said in a statement.
Still, the outlook for the memory biz remains good thanks to AI, with Samsung predicting a "positive" environment in the latter half of 2024 as generative AI adoption accelerates. It's also going full throttle on expanding high-end chip capacity, with plans to mass-produce advanced HBM3E and 128GB DDR5 chips built on its 1b nanometer process technology this quarter.
These upcoming HBM3E 12H chips will offer blistering 1,280 GB/s bandwidth and an unprecedented 36GB capacity - all while maintaining the same package height as current 8-layer HBM chips. The secret sauce is advanced thermal compression non-conductive film (TC NCF), which lets Samsung cram 12 stacked layers into the existing HBM footprint, giving AI giants immense memory bandwidth to fuel their data-hungry models.
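To put those specs in perspective, here's a minimal sketch assuming 24 Gb per stacked DRAM die and the standard 1024-bit HBM interface (neither figure is stated in the announcement), showing how the 36GB capacity and an implied per-pin data rate fall out of the quoted numbers.

```python
# Back-of-the-envelope look at the HBM3E 12H specs quoted above.
# Assumptions (not from Samsung's release): 24 Gb per stacked die,
# and the standard 1024-bit HBM interface per stack.
layers = 12
die_capacity_gbit = 24                              # assumed per-die density
stack_capacity_gbyte = layers * die_capacity_gbit / 8
print(f"Stack capacity: {stack_capacity_gbyte:.0f} GB")        # -> 36 GB

bandwidth_gbyte_s = 1280                            # quoted stack bandwidth
interface_width_bits = 1024                         # assumed interface width
per_pin_gbps = bandwidth_gbyte_s * 8 / interface_width_bits
print(f"Implied per-pin data rate: {per_pin_gbps:.1f} Gbps")   # -> 10.0 Gbps
```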
Beyond ramping bleeding-edge chip production, Samsung is charging ahead with next-gen tech like 3nm and 2nm process nodes to stay ahead of the curve. It kicked off production of 3nm chips in 2022 and aims to start cranking out 2nm chips by 2025 - which should help feed the AI industry's insatiable appetite for greater compute power.