Why it matters: AMD is in the unique position of being the primary competitor to both Nvidia on GPUs (for PCs and AI acceleration alike) and Intel on CPUs (for both servers and PCs), so its efforts are getting more attention than they ever have. Fittingly, the official opening keynote for this year's Computex was delivered by AMD CEO Dr. Lisa Su.
Forward-looking: Getting companies interested in deploying generative AI applications is no longer a challenge for tech suppliers. Figuring out how best to use the technology, however, is still difficult for many of these business customers. One new approach that Dell discussed at its Dell Tech World event in Las Vegas is the concept of hybrid AI, where some of the work happens in the cloud, but some is done on premises within an organization's own data centers.
"ACDC" (Apple Chips in Data Center) would be the clear winner in 2024's chip naming competition
Editor's take: Of course, Apple is designing its own AI chips for its data centers. This will give the company an end-to-end, cloud-to-edge AI system that its competitors will not be able to match anytime soon.
Why it matters: The rise of AI compute is poised to reshape the economics of the data center. That may be as simple as Nvidia displacing Intel, but it could also lead to much wider, systemic change.
The big picture: Starting tomorrow, Nvidia is hosting its GTC developer conference. Once a sideshow for semis, the event has transformed into the center of attention for much of the industry. With Nvidia's rise, many have been asking to what extent Nvidia's software provides a durable competitive moat for its hardware. As we have been getting a lot of questions about that, we want to lay out our thoughts here.
Some would argue that AI is a fad, the next bubble waiting to burst
Editor's take: Like almost everyone in tech today, we have spent the past year trying to wrap our heads around "AI" – what it is, how it works, and what it means for the industry. We are not sure that we have any good answers, but one thing has become clear: maybe AGI (artificial general intelligence) will emerge, or we'll see some other major AI breakthrough, but focusing too much on those possibilities risks overlooking the very real – but also very mundane – improvements that transformer networks are already delivering.