Why it matters: The rise of AI compute is poised to reshape the economics of the data center. That may be as simple as Nvidia displacing Intel, but it could also lead to much broader, systemic change.
The big picture: Starting tomorrow, Nvidia is hosting its GTC developer conference. Once a sideshow for semis, the event has become the center of attention for much of the industry. With Nvidia's rise, many have been asking to what extent Nvidia's software provides a durable competitive moat for its hardware. Since we have been getting a lot of questions on that point, we want to lay out our thoughts here.
Some would argue that AI is a fad, the next bubble waiting to burst.
Editor's take: Like almost everyone in tech today, we have spent the past year trying to wrap our heads around "AI": what it is, how it works, and what it means for the industry. We are not sure we have any good answers, but a few things have become clear. Maybe AGI (artificial general intelligence) will emerge, or we'll see some other major AI breakthrough, but focusing too much on those possibilities risks overlooking the very real, but also very mundane, improvements that transformer networks are already delivering.
Why it matters: The revenue share figures for data center processors paint a clear picture. Nvidia has grown strongly for years and now leads the market. The primary question is: what is the new normal for its market share?
Forward-looking: While no one doubts IBM's heritage of technological advancement over recent decades, some have started to wonder whether the company can sustain those kinds of efforts into the future. At a recent analyst day held at the historic Thomas J. Watson Research Center, IBM made a convincing argument that it is up to the task, particularly in AI and quantum computing.