Dedicated GPUs are increasingly irrelevant

6th Aug 2010 | 09:08

Graphics chips have become bloated and ludicrous

The end is nigh for the modern graphics chip. It genuinely pains me to say that. After all, I'm an unapologetic chip aficionado, someone who loves the technology of integrated circuits for the sake of it.

But it's becoming increasingly apparent that GPUs are over-engineered, increasingly irrelevant and almost certainly not long for this world.

The background here involves a confluence of technological trends. The most ominous of these in terms of the GPU's longevity as a discrete component is the architectural convergence of CPUs and GPUs. However, one of the most debilitating symptoms of the graphics chip's terminal malaise is complexity – sheer, pointless complexity.

Take Nvidia's uber pixel pumper, the GeForce GTX 480. It weighs in at three billion transistors. That's getting on for triple the size of Intel's beefiest PC processor, the six-core Core i7-980X. If the GTX 480 were broadly useful, that monster transistor count would actually add to the allure. But the harsh truth is that, for almost everything, it isn't.

And that makes it dumb. You see, despite the hype regarding running non-graphics applications on GPUs, there's still very little outside of games that makes more than passing use of a desktop or laptop GPU. More to the point, the number of games demanding a really high-end GPU that are actually worth playing isn't merely a small number. It's zero.

Put it all together and you have a terminal mismatch between the cost and complexity of GPUs and their real-world utility. In truth, I've felt this way for some time. But it's the apparent emergence of a radical alternative to established 3D rendering technologies that really brings home how bloated and ludicrous graphics chips have become.

Revolution in rendering

This alleged revolution in rendering comes from a small Australian software startup known as Unlimited Detail. It's not actually brand spanking new, having been in development for a year or three. But thanks to the random nature of web-based content aggregators, Unlimited Detail was lifted from obscurity recently in a flurry of YouTube-powered publicity.

Anyway, as far as I could tell, the basics of this new rendering technology involve ditching polygons in favour of atomic points in 3D space. The claimed result is quite literally unlimited geometric detail. Oh, and the whole thing runs in software at smooth framerates on a conventional PC processor. The GPU doesn't get a look-in until it's time to spit out the final 2D images.

You hardly need me to point out it all seems too good to be true. So, there was nothing for it other than to go straight to the source and speak to the guys at Unlimited Detail.

The technical brains are provided by Bruce Dell, a former supermarket manager, while the business nous comes courtesy of Greg Douglas, a games insider formerly of developers Auran.

The idea of using atoms or points is not new, of course. The really clever bit in UD is the 3D search algorithm developed by Dell. The precise details are UD's big secret. But according to Dell, "The algorithm takes point cloud data and files it in a certain way so that it can be quickly sorted and accessed."

When the algorithm searches for points, it doesn't do so indiscriminately. Instead, it only pulls up a single point for each on-screen pixel being rendered. "We only grab the atoms we need for each pixel, we don't touch the others," explains Dell.

In other words, the workload depends on screen resolution, not the underlying geometric detail of the scene being rendered. Thus, an impression of unlimited geometry is created. The UD guys claim the algorithm is so efficient it runs in real-time in a single thread on just one core of a conventional PC processor. Apparently, it will even scale down to simple CPUs in mobile devices.
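To make that claim concrete: the principle Dell describes can be sketched in a few lines of code. UD's actual algorithm is a trade secret, so the following is purely a hypothetical illustration of the idea, not the real thing. Points are bucketed into a coarse voxel grid, and the renderer fetches at most one point per screen pixel, so the work tracks resolution rather than scene detail. All the names here (`PointCloud`, `first_hit`, `render`) are my own inventions.

```python
# Toy illustration of resolution-bound point rendering.
# One atom is fetched per pixel, so the cost scales with the
# number of pixels, not the number of points in the scene.

from collections import defaultdict

class PointCloud:
    """Points bucketed into a coarse voxel grid for fast lookup."""
    def __init__(self, points, cell=1.0):
        self.cell = cell
        self.grid = defaultdict(list)
        for p in points:
            key = tuple(int(c // cell) for c in p)
            self.grid[key].append(p)
        self.lookups = 0  # count of per-pixel fetches

    def first_hit(self, origin, direction, max_steps=64):
        """March a ray cell by cell; return the first point found."""
        x, y, z = origin
        dx, dy, dz = direction
        for _ in range(max_steps):
            key = (int(x // self.cell), int(y // self.cell),
                   int(z // self.cell))
            bucket = self.grid.get(key)
            if bucket:
                self.lookups += 1
                return bucket[0]  # one atom per pixel; ignore the rest
            x, y, z = x + dx, y + dy, z + dz
        self.lookups += 1
        return None

def render(cloud, width, height):
    """Orthographic render: one ray, and one fetched point, per pixel."""
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            hit = cloud.first_hit((i + 0.5, j + 0.5, 0.0), (0, 0, 1))
            row.append(1 if hit else 0)
        image.append(row)
    return image
```

Rendering a 4x4 image here performs exactly 16 lookups whether the cloud holds ten points or ten million, which is the crux of the "unlimited detail" claim.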

So far, the only hard evidence for these incredible claims takes the form of a few pre-recorded videos of dubious quality. However, having spoken to the UD pair, I'm happy to confirm they're not only incredibly passionate, but strike me as completely genuine. It's potentially extremely exciting stuff.

Still, even if UD works exactly as advertised, the established players in graphics are hardly going to embrace a technology that renders several decades and billions of dollars of investment obsolete overnight.

You have to assume Nvidia, and to a lesser extent AMD, will resist the idea strongly. But if Unlimited Detail's technology gains any traction at all, GPUs really will look sillier than ever.

-------------------------------------------------------------------------------------------------------

First published in PC Plus Issue 297
