Do new CPUs threaten Nvidia's future?

18th Mar 2008 | 11:03

Chief scientist David Kirk talks ray tracing, Fusion and Intel

These are worrying times for Nvidia. The graphics industry is apparently poised to transition to ray-tracing technology. Fusion-style processors threaten the existence of traditional CPUs and 3D chips alike. And Intel is getting serious about PC graphics for the first time.

And yet Nvidia's chief scientist David Kirk reckons it's all going to plan.

There used to be a pleasing symmetry among the four key players in PC technology. AMD took on Intel for processing prowess, while ATI and Nvidia duked it out for graphics grunt. Everyone knew their place. Then AMD snaffled up ATI at the end of 2006 for a few billion greenbacks and the balance in the universe was upset.

Since then, shockwaves from the AMD/ATI deal have spread far and wide. It completely torched what had hitherto been an extremely productive alliance between AMD and Nvidia. According to one Nvidia insider, the daily contact he had with AMD dried up literally overnight following the ATI purchase.

Fusion reaction

More recently, the strategic implications of the AMD/ATI tie-up have become rather ominous. Intel and AMD both have plans for CPUs with integrated graphics, sometimes known as fusion processors. In the case of Intel, such a chip should be on sale before the end of the year in the form of Nehalem.

Granted, AMD has suffered all manner of woes following its acquisition of ATI, not least slow and buggy processors. But it's actually Nvidia that now looks most vulnerable.

In part, that's because Nvidia has no x86 kit currently on its books. What's more, even if it wanted to make PC-compatible CPUs, it lacks the necessary x86 license. And isn't the general trend supposed to be a gradual convergence of CPU and GPU technology towards a single chip containing a massively multi-core array of floating point fun? That would surely leave Nvidia out in the cold.

But even if Nvidia can carve out a long term strategy that doesn't include CPU production, its core graphics competency is under threat.

Momentum appears to be building for a new approach to graphics rendering known as ray tracing. Its proponents claim it produces more accurate and realistic graphics than the raster-based technology that currently dominates the graphics chip industry.

Ray tracing revolution

The key worry for Nvidia is that ray tracing might just present an industry-wide inflection point that allows Intel to enter the graphics market on an equal footing. That's certainly what Intel seems to be banking on with Larrabee, a multi-core chip that's thought to be highly optimised for ray tracing.

Suffice to say, therefore, that the easy domination Nvidia currently enjoys in the graphics market is hardly a given for the future. To find out exactly how the green-tinged graphics goliath plans to face up to these challenges, TechRadar crossed swords with none other than Dr David Kirk, Nvidia's chief scientist since 1997. If there's a man alive who has a better grasp of where Nvidia is heading, well, he wasn't available for interview!

Kirk is immediately dismissive about the danger posed by upcoming integrated CPU-GPU chips.

"Integrated graphics has traditionally been a low cost play," Kirk told us. Intel first began integrating graphics into its motherboard chipsets because it could be done almost for free. As Kirk says, the integrated option is essentially "the best graphics that no money can buy".

Not exactly a money-spinning market segment, therefore. Yes, AMD's Fusion CPU is likely to raise the bar for integrated graphics performance, as Kirk concedes. But it nevertheless won't come close to what one might describe as acceptable gaming graphics performance. And if there's one thing Kirk is confident about, it's that consumers continue to demand high performance graphics.

What price performance?

For proof, he points to the contrasting fortunes of CPU and GPU pricing in recent years. The current CPU price war proves consumers are not sold on the latest high performance multi-core chips. And yet buyers continue to pay a stiff premium for high end 3D chips.

There's also no doubting the enormous difference between a high end graphics card and an entry level item, in terms of the end user experience. It's much, much larger than the typical gap between budget and premium CPUs. If you want decent graphics performance, discrete will be the way to go for years to come.

But what about the threat from Larrabee, Intel's first real effort to crack the discrete graphics market and due out late next year? Is the assumption that ray-tracing will be the next big thing in 3D graphics accurate? Not exactly, according to Kirk, who says, "there's nothing new about ray-tracing".

Historically, ray-tracing hasn't been used for real-time rendering because it is extraordinarily computationally expensive. That remains the case today. For many operations rasterisation does a very good job and does it 100 times faster than ray tracing. However, there are areas where ray tracing can be used efficiently to increase realism.
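To give a flavour of why ray tracing is so expensive, here's a deliberately naive sketch (our illustration, not anyone's production code): every pixel fires a ray, and every ray must be tested against the scene's geometry, so the work grows with pixels times objects, before you even add the secondary rays that give ray tracing its realism.

```python
# Illustrative only: a naive ray tracer tests every ray against every
# object, so cost scales with (pixels x objects x bounces).
import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance to the nearest intersection, or None."""
    # Solve |origin + t*direction - centre|^2 = radius^2 for t,
    # assuming direction is unit length (so the quadratic's a = 1).
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height, spheres):
    """Shoot one primary ray per pixel; return (hits, intersection tests)."""
    hits, tests = 0, 0
    for y in range(height):
        for x in range(width):
            # Simple orthographic camera looking down -z.
            origin = (x / width - 0.5, y / height - 0.5, 1.0)
            direction = (0.0, 0.0, -1.0)
            for centre, radius in spheres:
                tests += 1
                if ray_sphere_hit(origin, direction, centre, radius) is not None:
                    hits += 1
                    break
    return hits, tests

spheres = [((0.0, 0.0, -1.0), 0.25)]
hits, tests = render(64, 64, spheres)
print(hits, tests)
```

Even this toy scene, one sphere at a thumbnail resolution with no shading, no bounces and no acceleration structure, requires thousands of intersection tests per frame; a real scene at full resolution multiplies that by orders of magnitude, which is exactly the cost Kirk is pointing at.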

Hybrid rendering

The future according to Kirk is therefore much more likely to involve a hybrid approach to rendering. "Ray-tracing will not replace rasterisation. But it will add to our bag of tricks." In any case, Kirk says, there's no reason to assume that Nvidia's GPUs won't be extremely good at ray-tracing. Either way, the implication is that Intel's Larrabee will have an extremely tough fight on its hands.

As for the suggestion that CPUs and GPUs are converging towards a single, floating-point solution, Kirk simply isn't having it. "Even the latest multi-core CPUs only offer a small fraction of the floating point power of Nvidia's fastest GPUs," he says. If anything, this performance advantage will mean so-called general purpose applications for Nvidia's GPUs (known as GPGPU for short) are likely to win an increasing share of the market for really intensive computational solutions.

"We've gained lots of traction in the scientific community. Molecular modelling, astrophysics, climate modelling - all of these are highly parallel tasks that demand much more performance than is currently available."

Not that Kirk thinks that GPUs will replace CPUs. He accepts the need for truly general purpose processors will remain for the foreseeable future. But so will the demand for the increasingly flexible and powerful co-processor that is the modern GPU - preferably Nvidia's.
