Sales of Nvidia’s GPUs are soaring as AI and big data applications spread, signalling change for the semiconductor industry and potential trouble for Intel.

Silicon Valley chipmaker Nvidia is on a roll, with first-quarter revenue up 48% from a year earlier. In February 2017, it reported a 55% fourth-quarter increase and a 38% increase for fiscal 2017. Much of this success is driven by growing demand for its chips, called GPUs (graphics processing units). Originally designed for gaming, GPUs are finding a wider market, for example in data centres, where their greater parallel computing power runs artificial-intelligence programs.

The rise of GPUs signals an important shift in information technology, with major implications for the semiconductor industry and its leader, Intel. With its CPUs (central processing units), Intel has long dominated the PC and server markets, enjoying a market share of about 80% in PCs. CPUs have been able to handle most workloads in PCs and servers. However, they are ill-suited to machine learning and other AI applications, which require crunching huge amounts of data in parallel. Companies with big data centres are choosing more specialised processors from other firms and designing their own.

For Nvidia, a turning point came during the global financial crisis, when it discovered that hedge funds and research institutes were using its chips for new, more complex purposes. The company opened its markets further by developing a programming language, CUDA, that lets users program its processors for tasks beyond graphics. When cloud computing, big data and AI gained prominence, Nvidia’s chips were ready for them. Today every major online firm uses Nvidia’s GPUs to power its AI services, and in the past year the company’s revenue from selling chips to data-centre operators has tripled.
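The key idea behind programming GPUs for tasks beyond graphics is data parallelism: the same simple operation applied to many elements at once, one lightweight thread per element. CUDA kernels require Nvidia hardware, so as a rough, hardware-free sketch, the Python snippet below mimics that style on CPU threads, splitting an element-wise addition across workers in the strided fashion GPU kernels commonly use (the function and worker count are illustrative, not any real CUDA API).

```python
from concurrent.futures import ThreadPoolExecutor

def vector_add(a, b, workers=4):
    """Element-wise add, split across workers in a GPU-kernel-like strided pattern.

    Worker w handles indices w, w + workers, w + 2*workers, ... --
    a CPU-thread stand-in for the one-thread-per-element style that
    GPU kernels run across thousands of cores.
    """
    n = len(a)
    c = [0.0] * n

    def work(w):
        # Strided loop: analogous to a "grid-stride" pattern in GPU code.
        for i in range(w, n, workers):
            c[i] = a[i] + b[i]

    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Force all workers to complete before returning the result.
        list(pool.map(work, range(workers)))
    return c

print(vector_add([1.0] * 8, [2.0] * 8))  # [3.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0, 3.0]
```

On a real GPU the "workers" number in the thousands and run on dedicated hardware, which is why this style suits the bulk arithmetic at the heart of machine learning.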

GPUs are only one type of souped-up processor. ASICs (application-specific integrated circuits) are hard-wired for a single purpose, which makes them the fastest and most energy-efficient of all. Start-ups are developing ASICs with built-in AI algorithms, and Google has developed an ASIC for speech recognition. Another type, FPGAs (field-programmable gate arrays), can be reprogrammed after manufacture, offering more flexibility.

Intel runs the risk of being left behind in the shift to GPUs. While conventional CPUs are still widely used and Intel’s sales from those chips continue to grow, the rise of these “accelerators” could be bad news for the company, according to Alan Priestley of Gartner. In recent years Intel has focused on making its CPUs more powerful, rather than on making ASICs or FPGAs.

Intel has been catching up through acquisitions: in 2015 it bought Altera, which makes FPGAs, and the following year it acquired Nervana, which is developing specialised AI systems. Diane Bryant, EVP and General Manager of Intel’s Data Center Group, is optimistic. She points out that, historically, new computing workloads have often been handled on specialised processors at first, only to be “pulled into the CPU” later. Intel is preparing for such an integration by releasing new chips that combine its CPUs with Altera’s FPGAs.

The course of the semiconductor industry may depend on how AI develops, says Matthew Eastwood of market research firm IDC. As The Economist reports, “If it turns out not to be the revolution that many people expect, and ushers in change for just a few years, Intel’s chances are good…But if AI continues to ripple through business for a decade or more, other kinds of processor will have more of a chance to establish themselves. Given how widely AI techniques can be applied, the latter seems likely.”
