Best high resolution and multi GPU graphics cards: 10 reviewed
25th May 2013 | 09:00
One GPU or two? That is the question
The road to 4K gaming
Want to justify that £829 graphics card you just bought? Need to figure out how to drive that massive-o-screen that almost melted your credit card? Well, the only realistic answer is either incredibly high-resolution gaming or going for a multi-GPU setup that might melt your PSU, as well as your personal wealth.
The thing is, despite the fact that graphical hardware has evolved at an astounding rate over the last few years, panel and game technology hasn't really kept up. Since the last generation consoles came around, most of us have been aiming for 1080p resolutions.
At first that was a laudable goal and it felt like an age before it became the standard res. But now it is, with the Steam hardware survey claiming around a third of users have a primary display running at 1,920 x 1,080. At that resolution, even sub-£100 graphics cards are more than capable of running most new titles on relatively high settings, at playable frame rates.
To really push your modern midrange/high-end graphics cards you need to up the resolution and, sadly, the 30-inch panels of the last five years - with their 2,560 x 1,600 resolutions - are still at the top of the tech tree, knocking around the £1,000 mark. There are 27-inch screens with the 16:9 res of 2,560 x 1,440, and these are probably your best bet for high-res gaming on a relatively sensible budget.
The next step up is to strap a bunch of screens together in some sort of widescreen surround setup. We're talking about resolutions of around 5,880 x 1,080 when you're linking up three 1080p screens in landscape mode with bezel correction. And with those extra pixels filling up your eyes you may actually need a second GPU just to cope with it all.
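For reference, the arithmetic behind that figure is straightforward. The sketch below assumes an illustrative 60 pixels of bezel correction per screen join - the real figure depends entirely on the physical bezels of your particular monitors:

```python
# Rough surround-resolution arithmetic for three 1080p panels in landscape.
# The bezel-correction value is an illustrative assumption, not a measured
# figure - it varies with the physical bezel width of your monitors.

def surround_width(screens=3, panel_width=1920, bezel_px_per_join=60):
    """Total horizontal resolution once bezel correction is applied."""
    joins = screens - 1
    return screens * panel_width + joins * bezel_px_per_join

width = surround_width()             # 3 x 1,920 plus two bezel joins = 5,880
height = 1080
megapixels = width * height / 1e6    # roughly 6.4MP, against ~4.1MP
                                     # for a 2,560 x 1,600 30-inch panel
print(width, round(megapixels, 1))
```

That extra 50 per cent or so of pixels over a 30-inch panel is why a second GPU starts to look necessary.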
When I started this whole long-winded testing process, I was of the opinion that the square peg of surround screen gaming was struggling to fit into the same round hole as 3D. I thought both were resource-intensive frame rate hogs - limiting the fidelity of your game, without adding much to your experience.
And to be honest, that opinion hasn't changed. You need some incredible graphical grunt to get the most out of a triple-screen array, and for the most part that means selling your soul to the dark god of multi-GPU gaming.
Beware, though - that way madness lies. The resolutions involved in multi-screen arrays in-game put huge demands on your available GPU power, but as we start to tread the seemingly long dirt track to 4K gaming, maybe this is something that we need to start thinking about seriously.
Strapping a few monitors to a couple of GPUs is the only realistic way for us to achieve resolutions above what our blessed 30-inch Dell panel can render, which is the only way to really put the latest and greatest graphics chips through their high-res paces.
As an exercise in seeing just how far we can push the modern GPUs of this generation - and seeing how close we are to machines capable of 4K gaming - the high resolutions involved in surround screens setups are useful. But as a gaming experience right now, they're simply an indulgence, with a huge amount of wasted screen real estate and a massive premium placed on frame rates.
The whole deal with surround screen gaming is to have a wrap-around view of your game world, with a pair of panels either side of a central screen essentially acting as your peripheral vision. If extra screens were given away - and the frame rate hit wasn't effectively halving performance - then it would be a neat extra to have, adding a little immersion to your experience.
But that isn't the way of surround gaming. At best you're looking at screens around the £100 mark, so you're adding another £200 to your gaming setup for a pair of panels that, by definition, you're not actually meant to be looking at. The distorted images stretched out over the peripheral screens aren't pretty, but are there to catch the corner of your eye while you're focusing most of your attention on what's happening right in front of you.
In a first-person shooter, that kind of widescreen vision can be useful to see who's creeping up on you, and in racing games it certainly adds to the sensation of speed, but it rarely offers a proper, tangible benefit. Having an extra screen is neat, but definitely not enough to justify such a chunky outlay.
The woes of setup
And then there's the setup anguish. Getting three screens running simultaneously can be something of a chore, although my initial experiences were relatively positive. I began my testing slog with the Nvidia cards, with each and every one of the five different setups immediately recognising the three 1080p DVI screens. This allowed me to either spread the connections across the cards in the multi-GPU sets, or run all three from a single card with the Titan and GTX 690.
Getting the desktop spread across the triple-screen array was simple too, dipping into the Nvidia Control Panel and adjusting bezel correction and the like in a trice.
The AMD cards, those champions of multi-monitor gaming through the Eyefinity initiative, were an object lesson in setup anxiety. For reasons best known to itself, AMD is obsessed with DisplayPort.
This obsession means that if you want to run three screens from its cards, then one of those monitors has to be running from one of the DisplayPort connections on the first card in the array. With our DVI screens that proved a bit of a stumbling block to say the least.
With the Nvidia cards, we could cope with a lack of DVI connectors on a single card by spreading the load across an SLI pair - something that's not possible in CrossFireX. So it's a case of adapt or fail, usually requiring one screen to be attached via a DVI cable, another via an HDMI adaptor, and the third and final DisplayPort-hungry screen via a DP adaptor.
This would be simple were it not for the fact that there are both active and passive adaptors available for DisplayPort, and it can be hard to know which are capable of actively altering the clocks in the adaptor so the card thinks it's powering a DP monitor. With a passive adaptor you'll only ever get two screens running at a time.
Thankfully, Sapphire was good enough to provide us with a functioning option after two failed attempts by our IT bods and a long trawl around the city trying to track one down locally.
So, Nvidia is the king of multiscreen setups, with its ease of use and the fact that it doesn't rely on DisplayPort to make up the full, surround screen array. But given the extra expense involved and the lack of tangible benefits, we're really straying into the kingdom of the blind here.
The fact remains, then, that if you want the best high-res experience, you need to buy the biggest monitor you can. You can pick up a decent 27-inch panel running at 2,560 x 1,440 for less than £500, but the 30-inch 16:10 panels are still prohibitively pricey.
A larger single monitor can also be run more readily from a single GPU, meaning you don't have to suffer from the woes of multi-GPU setups. Even a single HD 7870 XT, for example, will run pretty much everything at playable speeds at 2,560 x 1,440.
Still, our tests have shown that very high-res gaming is certainly not out of the reach of today's hardware. We may not have quite got to the levels of full 4K with our multi-screen array, but we're not far off and with some decent performance from current silicon.
That's lucky as AMD's current generation of GPU is going to be driving the hardware of the gaming future in the next-gen consoles - and for them, the 4K roadmap is vital.
High-end GPU reviews
Nvidia GTX Titan SLI
There is definitely something to be said about excess, and that something isn't necessarily good. Nvidia's GTX Titan though is all about luxurious extravagance, and never more so than when you get a few of them together for a graphical good time.
Last month I had three of them plugged into our over-specced test-rig, but the law of diminishing returns was in full effect with so many high-price GPUs in one place. Three Titans tied together don't give you anywhere near the performance boost that you get from dropping in a second card - and even then the second card isn't really justifying its price tag beyond the likes of Unigine's Heaven and Valley, which are the only things to really demonstrate what the extra silicon is getting you.
But that's only when we're talking about standard resolutions like 2,560 x 1,600, not the graphically intensive surround resolution of 5,885 x 1,080. This is where the extra horsepower really comes in - especially the extra graphics memory on offer.
With a pair of GTX Titan cards you're looking at a preposterously huge 12GB of graphics memory, all running at a shade over 6GHz - though since SLI mirrors data across the cards, the effective framebuffer is 6GB. That sort of framebuffer is what gives this setup the grunt to cope with the 6MP resolution we're forcing it to drive on our three screens.
However, we now end up in the reverse situation, where the Heaven and new Valley synthetic benchmarks just aren't coping with the extra screen real estate coupled with the vagaries of a multi-GPU setup. In both Unigine tests, the SLI pairing fared worse in triple-screen trim than when we had the single Titan doing the grunt work. This is doubtless down to driver/software issues rather than the silicon itself, but it does seem that AMD's GCN architecture is better suited to the multi-GPU/multi-screen code in the Unigine engine.
That all changed in-game, with all of our test titles displaying around a third better frame rate scores with the second card dropped into the multi-screen build. That's to be expected, as the titles we've picked are more likely to have been coded to take advantage of a surround screen gaming array than the synthetic benches.
Can't buy me love
What's really interesting - especially in the gaming benchmarks - is that while this double-bubble GTX Titan setup is easily the priciest, and on paper the most powerful setup in our test, it isn't the outright fastest.
When we're talking about the surround setup, it's the pair of Radeon HD 7970 GHz cards that keep the Titan setup honest, and in Crysis 3 and DiRT Showdown the AMD cards fare far better than the top Nvidia cards.
As both are Gaming Evolved titles it's almost too easy to call shenanigans, but Nvidia has had a fair amount of time to perfect its own driver set, and the fact that the half-price AMD pairing is also incredibly close in both the Nvidia-sponsored titles makes it a tough call for the Titan.
At over £1,600 there's simply no justification in gaming terms for this setup. It isn't the outright fastest, though at least Nvidia's multi-screen tech means it isn't a nightmare to set up.
Nvidia GTX Titan
While it's eminently possible to run a triple-screen array from a single graphics card, most of the time it's simply not worth the trouble. Such huge resolutions are generally too much for a single GPU to handle at frame rates we'd call playable.
That said, there are a few cards that are capable of this feat, and the GTX Titan is one of them. We've seen how it does when you strap a pair of those GK110 GPUs together in an SLI array, and while it's impressive in terms of both performance and financial extravagance, it's not the outright fastest.
It's a similar situation when you look at the GTX Titan on a card-by-card basis up against the single card/multi-GPU GTX 690. Thanks to the pair of GPUs at its heart, the GTX 690 consistently outperforms the lone GTX Titan in the triple-screen showdown. But because of the huge amount of graphics memory available to the GTX Titan - the same as in the GTX 690 - it's still able to give a very good account of itself at such silicon-shaking resolutions as 5,885 x 1,080.
That means a single GTX Titan isn't far short of a pair of GTX 680s either, but the impressive performance of the CrossFireX cards from the AMD camp still makes for tough reading as a Titan owner. The HD 7970 and HD 7950 both have the same 384-bit memory bus, and in a pair have the same 6GB GDDR5 to call upon when they're rendering to vast resolutions.
Easy does it
But Nvidia's latest darling, the Titan, has never really been about outright performance wins, despite its premium branding and 'ultra enthusiast' pricing. What the Titan is all about is the experience.
Strapping two AMD cards together means readying yourself for CrossFireX profile problems when a new game launches, having to make friends with DisplayPort, dealing with the volume of the cards when they're pushed to their limits, and preparing a beefy PSU to cope with their power demands. The GTX Titan has none of those issues, and its ability to offer playable frame rates in a single GPU card at these resolutions means you avoid the inevitable multi-GPU malaise completely.
The Titan is also incredibly efficient, with a TDP of just 250W, and runs whisper-quiet even when you're torturing that poor GPU with ultra-demanding games. With a Titan you can be reasonably confident that, as long as the game is coded for multi-screen play, your system will happily be running it on day one. There will be no waiting for multi-GPU profiles and you'll still be getting playable frame rates, if not necessarily the sort of performance you might eventually get from a full CrossFireX or SLI system.
You can even make a value call about a single GTX Titan - not something you could ever do with a pair of them. At £829 it's still hugely overpriced, but considering you're looking at £700-£750 for the top SLI and CrossFireX cards, the price gap is suddenly a lot smaller than you might think.
There's still a large premium placed on the ephemeral notion of 'the experience', but if you can't face the hassle of multi-GPU arrays and want to get into high-resolution gaming, then the GTX Titan is worthy of consideration.
Nvidia GTX 690
Nvidia's GTX 690 is every inch the spiritual predecessor of the GTX Titan. It was where Nvidia cut its injection-moulded magnesium alloy teeth and first put a marker down in the 'ultra-enthusiast' camp.
This single-card/multi-GPU offering is beautifully built, but you won't completely avoid the trials and tribulations of running a multi-GPU setup just because it's all been jammed onto a single slab of PCB.
Still, straight line performance is what the GTX 690 is all about. As we saw when we first met the GTX Titan, Nvidia is positioning its luxury single-GPU card as the GPU for the best all-round experience, whereas the GTX 690 is still the best bet for those who just want the fastest raw frame rates.
Well, in terms of Nvidia cards, that is. The impressive HD 7970 GHz edition, when dropped into a CrossFireX arrangement, is capable of rivalling even a pair of GTX Titan cards in SLI. The greater graphics memory available to the AMD cards and the wider 384-bit memory bus mean that when you're talking about the 5,885 x 1,080 resolutions we're seeing with our triple-screen setup, the Tahiti XT-powered cards are able to throw out a huge number of frames per second.
Full of hot air
Of course, the single GTX 690 has its own benefits. Being a dual-slot card, it doesn't need a giant chassis, an advanced, expensive motherboard or a power supply the size of Chernobyl.
That said, the design of the card - trivalent chromium plating aside - doesn't please me particularly. The fact that it operates with a central fan, distributing air across the GPU heatsinks either side of it, means there is a large amount of hot air exhausted back into the chassis. Getting decent airflow in a GTX 690 machine then is quite a challenge - far beyond my limited understanding of thermodynamics.
So the GTX 690 does its level best to alleviate some of the burdens of multi-GPU arrays while still offering the performance, but there's always going to be the problem of micro-stuttering. The 3DMark test is a great demonstrator of this, and highlights the problems of the GTX 690.
The frame rate graph Futuremark's latest benchmark spits out at the end is interesting for the fact that, if you compare the results from a GTX 690 and a GTX Titan, you can see just how much variance there is in multi-GPU performance compared to single-GPU. The graph on the Titan is relatively stable around the average frame rate, but the GTX 690 shows much greater variation and, even more interestingly, at a more rapid rate. Zoom into the graph and you can see the frame rate bouncing up and down as often as 20 times a second.
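That sort of oscillation is easy to spot if you log individual frame times rather than just averages. The traces below are invented numbers purely for illustration - not figures from our benchmark runs - but they show why two setups with near-identical average frame rates can feel completely different:

```python
# Sketch: comparing frame-time consistency for a single-GPU vs multi-GPU run.
# The frame times (in milliseconds) are made up for illustration only.
from statistics import mean, stdev

single_gpu = [16.5, 16.8, 16.2, 16.9, 16.4, 16.7]   # steady frame pacing
multi_gpu  = [10.1, 24.3, 10.5, 23.8, 10.2, 24.1]   # alternating micro-stutter

def summarise(frame_times_ms):
    avg = mean(frame_times_ms)
    return {
        "avg_fps": 1000 / avg,               # averages look much the same...
        "jitter_ms": stdev(frame_times_ms),  # ...but the spread tells the story
    }

print(summarise(single_gpu))
print(summarise(multi_gpu))
```

The multi-GPU trace posts a similar average but a far bigger spread - exactly the variance the 3DMark graph makes visible.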
While the GTX 690 has the performance edge over the Titan, if you're spending this much cash on a card I would personally say that the single GPU card is by far the better option. It's going to be more stable and give a far smoother all-round experience. If you're a complete frame rate obsessive though, and raw performance is most definitely your bag, then the HD 7970 GHz CrossFireX pairing is a pacier option.
Nvidia GTX 680 SLI
You've got a GTX 680 and you're all proud of yourself for having the top-end card of this Nvidian generation - then the green goblins go and spoil it by engineering the Tesla K20X down into a consumer graphics card.
That makes you a sad panda - you were never that fussed about the dual-GPU GTX 690 because you remember the 7950 GX2 and all those people who got burned with those multi-GPU cards, but the Titan makes your card second-tier.
You could still run a three-way screen setup from your GTX 680 without having to sign up to the DisplayPort club, but at that resolution your single GPU is really going to feel like that second-tier product it now is.
Sure, you can hit double-digit frame rates with the GTX 680 on its own, but in our tests the Nvidia-favouring Batman: Arkham City is the only benchmark where we were able to top 30fps. Both the Max Payne and Crysis threequels were posting less than 20fps with a single card, sometimes even dropping below 10fps in Crytek's system hog.
Two to tango
You GTX 680 owners wanting to make the move to surround screen gaming - or prepare for the high-resolution future - are going to have to pick up a second card to try to keep pace with the top-end of the Nvidia graphics lineup. With a pair of GTX 680s you're essentially running the same hardware as the GTX 690, which means you're getting the same sort of performance benefits we mentioned, but inevitably you're still going to suffer the same multi-GPU problems that can plague such setups.
What's interesting is that, despite having a lower GPU base clock, the GTX 690 is capable of hitting slightly quicker frame rates at the top surround screen resolution. The difference really is slight, and could possibly be laid at the feet of general testing variance, but having the multi-GPU connector on board could result in some improved memory performance.
At the 5,885 x 1,080 resolution, video memory performance matters, and the fact that the SLI GTX 680s consistently perform better than the GTX 690 at lower resolutions would appear to support that. As a move from single-card to multi-GPU setup, adding in a second GTX 680 makes some sense for high-res gaming shenanigans.
You've already spent a goodly amount of cash on one card, so if you want to start throwing around 6MP resolutions there's little point ditching it and opting for a Titan or GTX 690 when you can simply drop in a second GTX 680. But as a starting point, there's simply no justification for building a machine from scratch with this pairing.
On the Nvidia side you're looking at either the Titan or GTX 690 for the simplest experience or the fastest frame rate respectively, but realistically using the HD 7970 GHz CrossFireX pairing as a starting point makes more sense. They're cheaper and generally faster in-game, but they're also loud and power-hungry, and wrestling with the Catalyst Control Centre is a lesson in pain management.
If you're already sitting on a GTX 680, then an SLI addition will keep you in the high-res game, but it's really not the ideal starting point.
AMD HD 7970 GHz Ed CF
The impressive triple-screen performance of this pair of AMD's finest cards wasn't something I really noticed until I'd finished all the testing and was collating the results.
That wasn't due to a lack of interest in the findings, or a mad rush to get the whole testing process finished - it was all down to the fact that trying to get a pair of AMD cards outputting to three DVI screens was an incredibly painful experience. Any traces of positivity I held about this CrossFireX pairing were pushed deep below a broiling sea of bile and vitriol until I'd calmed myself down and was able to take in the raw performance numbers.
As I mentioned in my introduction, AMD's take on multi-screen/multi-GPU arrays is needlessly painful. Having tested all the Nvidia cards without problem, shifting over to the red camp was a frustrating experience.
There is no need to have all the displays attached to the primary card - I'd stupidly thought the days of master/slave cards were over - and there is zero need to demand some sort of DisplayPort connectivity to use more than two screens at once.
AMD, that champion of multi-monitor gaming, seems to be trying to make the process much more complicated than it should be. That's a huge shame for AMD, as the straight line performance of this CrossFireX pairing is almost without rival.
I say 'almost' because the only setup that can really stand up against AMD's top cards is a pair of GTX Titan cards, and at £1,600 that's a graphics array that's beyond the wallets of most gamers and beyond the pale for most sane individuals. It seems crazy to think that an AMD array that is so much cheaper can compete, but it's proving AMD's decision to load up its top cards with a huge framebuffer and memory bus completely right.
When you start throwing around these sorts of resolutions, the speed and capacity of the VRAM becomes ever more important.
Showing up Nvidia
The SLI GTX Titan array has the edge in both the Nvidia-sponsored titles, and the HD 7970 GHz setup is faster or equal in both the AMD-sponsored games. It would be easy to draw the lines alongside these marketing splash screens, but the simple fact is that the HD 7970 GHz is a top card capable of making a decent stab at triple-screen surround resolutions on its own, and with two of them you can really show up Nvidia's ultra-enthusiast cards.
Building from the ground up, the bizarre choices AMD has made surrounding its multiscreen/multi-GPU setup are less of an issue. Once all the components are in place and the Eyefinity desktop has been created, you shouldn't need to spend any more of your precious time fighting AMD's design or programming.
You will still need to battle through the driver issues and missing multi-GPU profiles for new games, but the only way around that is the pricier GTX Titan in single-GPU trim. You'll miss the raw performance of the HD 7970 GHz cards, but you'll have a stable base for gaming.
Forget the GTX 690 - the GTX Titan/CrossFireX HD 7970 GHz comparison is where the real performance/convenience trade-off needs to be measured.
Nvidia GTX 670 SLI
Here in the £500-600 price bracket, it's a straight-up fight between the GTX 670 in SLI and a CrossFireX pairing of AMD HD 7950s. It doesn't make particularly pretty reading for the Nvidia faithful.
For the most part the GTX 670 is only a very slightly cut down version of the GTX 680, keeping the memory configuration identical. That ought to mean we see similar performance to the other GK104-powered card when placed in an SLI setup, but it doesn't quite pan out like that.
At a regular 2,560 x 1,600 resolution things are indeed very similar, with the GTX 680 SLI array only posting single-digit improvements in average frame rate over a pair of GTX 670s. When you start throwing around 6MP resolutions though, things change drastically. No longer can the GTX 670's GPU keep up with the GTX 680s, and suddenly that missing SMX unit and significantly lower base clock start to look like structural weaknesses in the multi-GPU support.
And then you take a look over at the opposition, where the similarly priced (though actually a little bit cheaper) CrossFireX pairing of Radeon HD 7950s is totally outmuscling the GTX 670s in our triple-screen array.
Again, things look fairly level at our 30-inch panel resolution, but as soon as you start gaming at the surround screen res the AMD cards take the win, and take it in style. The Batman benchmark is the only place where the Nvidia card can call the shots, and only then by the smallest of margins. There's also the fact that even where Nvidia holds sway in terms of average frame rate, the AMD card has a much higher minimum frame rate.
The GTX 670 setup drops as low as 22fps where the Radeon cards never dropped below 31fps, and that's a rather sizeable difference. Elsewhere the HD 7950s are offering up to twice the performance numbers of the GTX 670 pairing. It's almost a reversal of the performance relationship with the GTX 680, where the VRAM was the same but the GPU was set up differently.
Here the AMD card is running its GPU at around the same sort of speed as the GeForce card, but it's the memory configuration that's making for all the performance increases on the Radeon side of things. The extra 1GB of GDDR5 and the wider 384-bit memory bus both help shunt things around quickly at the massive resolution we've been using on our triple-screen array.
After the hassle I endured every time I switched between testing setups - even just between different AMD arrays - I was ready to admit that no matter what benefits the Radeon team might offer, the negatives would outweigh them.
But now I'm sitting here, with the pain receding in my memory, I just can't get behind the Nvidia camp on this one. Sure the driver and multi-screen setup is far easier in green trim, but the sheer performance and comparative price makes the AMD CrossFireX pairings tough to beat - and the GTX 670 in SLI isn't powerful enough to do so.
Again, an otherwise excellent graphics pairing is laid to waste by the intense performance premium that is put in place by the rigours of surround screen gaming.
AMD HD 7950 CF
The traditional received wisdom is that you should spend as much money as you can on the fastest single-GPU card around. That tactic will generally garner better performance and fewer driver/software headaches than if you spent a similar amount of cash on a pair of cards. But then we start throwing around the sorts of crazy-high resolutions like the 6MP 5,885 x 1,080 res we've cooked up for this triple-screen test, and that received wisdom starts to look a little cloudy.
The problem is the HD 7950 in CrossFireX configuration is rather incredible at dealing with the vast resolution we've stuck in front of it. For the money, there simply isn't anything around that can offer the same sort of multi-screen performance as these Tahiti Pro cards.
Sure, at £520 it's a fair bit more expensive than the GTX 680 or HD 7970 GHz Edition, even in their seriously overclocked SKUs, but it's considerably quicker too. Both those top cards are capable of offering surround screen gaming on their own, though only at double-digit frame rates on the toughest settings, but the HD 7950s are able to offer totally playable frame rates when in CrossFireX trim.
Speed or simplicity
It's interesting that even the GTX Titan is playing second fiddle at these resolutions compared with the HD 7950 CrossFireX config, and that card is around £300 more expensive than this twin-card setup.
The same single-GPU vs multi-GPU arguments will inevitably rage here too, but when we were talking about the Titan up against a £750 GTX 690 or GTX 680 SLI setup the good experience vs raw performance argument has some sway. When we're putting that £829 card against a pair of cards for £520 - that in general perform as well or better - that argument starts to look rather flawed.
Again, AMD's decision to go VRAM-happy with the top two cards in its HD 7900 series seems to have paid off. Certainly its dominance in the multi-screen environment with the Tahiti GPU can mainly be attributed to the extra graphics memory and superior memory bus.
When you look at the HD 7870 XT in CrossFireX trim, that view is upheld by its comparative performance numbers. The Tahiti LE GPU is largely identical to the one in the HD 7950, but is running the same 256-bit bus and 2GB GDDR5 that the GTX 670 and GTX 680 are. The fact it's producing similar benchmark performance shows what benefits the extra 1GB GDDR5 and the 384-bit bus then give to the HD 7970 and HD 7950 at the highest resolutions.
This must be pleasing reading for anyone already sitting on a HD 7950 in their current rig. It's not going to take a Herculean effort, or a huge amount of cash, to get your rig capable of gaming beyond HD resolutions. You may not necessarily want to make the move to a triple-panel setup right now, with all the redundant screen real estate that entails, but you at least have a path towards the high resolutions of a 4K future.
The HD 7950 is a very impressive high-resolution card when it's got a twin for company, and for the money is a great option for those wanting to make the move beyond high-def right now without breaking the bank.
Nvidia GTX 660 Ti SLI
With these third-tier Nvidia cards, we're starting to get to the point where the lower-end graphics silicon really struggles to cope with the demands of resolutions like the 6MP we're demanding with our triple-screen array.
But with the GTX 660 Ti it's not the actual GPU that's the problem (no more than it was with the GTX 670, anyway), and that's because the GK104 GPU at the heart of the GTX 660 Ti is almost identical to the one that inhabits the second tier card, the GTX 670. So why does the GTX 660 Ti in SLI trim fall so far behind its almost identical twin?
Both chips have the same 1,344 CUDA cores, the same base clock and the same texture units. Okay, the GTX 660 Ti is missing eight render output units (ROPs) compared to its big brother, but those missing parts mean less in the overall scheme of things when the SLI array has a total of 48 ROPs in the pair.
What really does make a difference at this immense scale is that hoary ol' video memory chestnut - specifically the weaker memory bus. The GTX 660 Ti is still rocking the same 2GB GDDR5 as the GTX 670 and GTX 680, so it's mega-quick at over 6GHz. At resolutions this high, you need at least that capacity and speed, though you're going to need some pretty hefty bandwidth numbers as well to squeeze that much information down the GPU's pipes.
Off the pace
With a 192-bit bus, the GTX 660 Ti's memory bandwidth drops down to 144GB/s compared with the 192GB/s boasted by the GTX 670 and GTX 680. Compare that with the 240GB/s the HD 7950 can garner with its 384-bit bus and you can start to see exactly why this third-rate Nvidia pairing is sitting so far off the pace in the massive-o-resolution scale.
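Those bandwidth figures fall straight out of the standard formula - bus width in bytes multiplied by the effective memory data rate:

```python
# Memory bandwidth = (bus width in bits / 8) * effective memory data rate.
# The data rates here are the effective GDDR5 figures quoted in the text.

def bandwidth_gbs(bus_bits, effective_rate_gtps):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_rate_gtps

print(bandwidth_gbs(192, 6.0))  # GTX 660 Ti:      144GB/s
print(bandwidth_gbs(256, 6.0))  # GTX 670/GTX 680: 192GB/s
print(bandwidth_gbs(384, 5.0))  # HD 7950:         240GB/s
```

The HD 7950's slower 5GT/s memory still wins out because the 384-bit bus moves half as many bits again per transfer.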
In fact, the only place where the GTX 660 Ti is able to cope is in the Batman: Arkham City benchmark, where it manages a very impressive 51fps. In DiRT Showdown though, this Nvidia SLI setup can't even break through the 30fps barrier at 5,885 x 1,080 - and in both Crysis 3 and Max Payne 3, the pair of GTX 660 Ti cards drag themselves wheezing over the line at just 12fps.
More damning than that weak average frame rate though is the fact that in Crysis and Max Payne, the frame rate occasionally dropped as low as 4fps and 1fps respectively. That ain't playable.
You could argue though that these cards are still able to offer gaming-capable frame rates at this resolution if you just drop down some of the more intensive graphical options. After all, that is one of the beautiful things about PC gaming: you can tailor the experience to your own personal choice and your own hardware setup.
So yes, with a few sacrifices, you could nail over 30fps in the latest titles with a pair of GTX 660 Ti cards, but you wouldn't want to. For the £400+ you'd be spending on a pair of these cards you could pick up either an overclocked GTX 680 or HD 7970 GHz Edition to power your multi-monitor monster instead. Either of those cards will offer far better frame rates on their own than this SLI array, with the AMD card in particular offering actual gaming performance.
AMD HD 7870 XT CF
Thanks to the fact that this generation of graphics cards has been around for a good long while now (this is actually one of the longest-lived GPU generations I've experienced in my time with PC Format), AMD has been able to start messing around with its chips without having to worry too much about cannibalising its other cards' sales.
The HD 7870 XT is a perfect example of this, with AMD taking the decision to start dropping higher-end GPUs into the lower echelons of its graphics card stack. The Tahiti LE GPU at the heart of the HD 7870 XT is practically the same silicon as that of the HD 7950, but there are fewer actual Radeon cores, fewer texture units and a lower-spec memory bus running the smaller capacity framebuffer.
That puts this card at a disadvantage compared with its bigger brother and brings it technically into line with the likes of the GTX 670, though it's a country mile cheaper. The GTX 670 - and the GTX 680 too - run the same 256-bit bus, offering the same 192GB/s memory bandwidth with the same speed and capacity of GDDR5.
At lower resolutions the better GPU in the GTX 680 gives it some breathing room against the HD 7870 XT, but in the rarefied air of 5,885 x 1,080 the huge number of pixels that need to be drawn is a great leveller. The HD 7870 XT in CrossFireX trim offers the same sort of performance at this resolution as both the GTX 670 and GTX 680. In a few of the tests the GTX 680 SLI pairing did post better average frame rates, but the Tahiti LE-powered cards consistently posted better minimum frame rates than both Nvidia GPUs across the board.
That's important when you drill down past the average frame rate numbers, as it gives you a better idea of how smooth a gaming experience you're going to be able to get at this resolution.
That's all rather impressive for such incredibly well-priced cards. I hesitate to use the word 'cheap' when we're still talking nigh-on £200 for a GPU, but when it's the best-value sub-£300 card, it's tough to argue against. And when you can pick up a pair of them for as little as £344 - the same price as an HD 7970 GHz Edition, and cheaper than a GTX 680 - the idea of buying the best single GPU you can afford again goes out of the window.
With the HD 7870 XT in CrossFireX trim most of the arguments against multi-GPU and AMD's multi-screen policies melt away under the heat of the price/performance argument. The raw performance you can get at this price is quite staggering in this super high-resolution gaming world. The drop in memory spec from the HD 7950 does mean it's slower, but the reduction in price almost completely outweighs the dip in performance.
AMD's Tahiti GPU really is the top silicon for these high-resolution shenanigans, and even when you start stripping back the memory spec to Nvidia levels it can still compete magnificently. I already had a pretty high opinion of the HD 7870 XT, but this test has cemented that belief. A great card on its own or in pairs.
AMD HD 7850 1GB
The HD 7850 1GB card is easily one of my favourite budget graphics cards. If it had been put together in a single-slot configuration I would probably have needed a change of underpants, I'd be that excited.
For around £125 it offers performance to rival cards much more expensive, and is the perfect GPU for any budget build, so I obviously had to see what would happen when I threw a couple of them together and beasted them with our super high-res triple-screen array.
Predictably though, things don't turn out too well for this memory-poor graphics card. With only 1GB GDDR5 on offer in each card things were never going to go that swimmingly at such high resolutions and so it came to pass. Up to 2,560 x 1,600 things looked pretty good, delivering 68 and 70fps in Batman and DiRT respectively, but the Max Payne and Crysis 3 results started to show a different story.
In Crysis 3 the twin cards managed a pretty respectable 28fps, but the fact it would drop as low as 8fps was an early warning sign. Max Payne 3 at that resolution looked like a slideshow. A very slow, boring, neon slideshow.
Showing the strain
So when we upped the testing to our triple-screen 5,885 x 1,080 resolution, the cracks in the memory architecture really showed. The HD 7850 1GB has still got the same 256-bit memory bus as the HD 7870 XT and the GTX 670 and GTX 680, but bandwidth is most definitely not the issue here - the problem is obviously a paucity of actual memory capacity.
Still, this budget setup is managing very impressive frame rates in DiRT, but the Batman benchmark is the first to fall with performance dropping as low as 8fps despite a 40fps average. It's worse in Crysis 3 with an average of 18fps and a low of 7fps, but the fact that we couldn't even get the test to run in Max Payne 3 is ultimately damning.
A framebuffer of 2GB is pushing it trying to get decent performance out of a 2,560 x 1,600 screen, but any higher and it becomes an exercise in futility. I felt like a bit of a bully putting these plucky little GPUs through such a tough and ultimately fruitless test, but they still performed admirably despite their failures. If even such low-end cards can get close to the sorts of resolutions we're throwing around here, our high-res future isn't just going to be the domain of the super-rich with their quad-SLI second-gen GTX Titans.
The HD 7850 1GB, then, is a step too far down if you want to hit the high resolutions we've been throwing around in this test. The 2GB version ought to perform much better: each card in the CrossFireX pairing would then have the full 2GB framebuffer, the same as the other mid-range GPUs we've featured. Ultimately that should help push up the minimum frame rates we've seen in testing, though it won't necessarily do a lot for the overall average scores.
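To put rough numbers on that memory pressure, here's a back-of-the-envelope sketch of just the render-target footprint at these resolutions - assuming 32-bit colour plus 32-bit depth/stencil per sample, with 4x MSAA. It's deliberately a lower bound: real VRAM use is dominated by textures and geometry, which is why a 1GB card falls over long before these figures get anywhere near 1GB.

```python
# Rough render-target footprint: 32-bit colour + 32-bit depth/stencil
# per sample. A lower bound only - the textures and geometry this
# ignores are what actually fill a 1GB framebuffer at high res.
def render_targets_mb(width, height, msaa_samples=1):
    bytes_per_sample = 4 + 4  # colour + depth/stencil
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

print(round(render_targets_mb(1920, 1080, msaa_samples=4)))  # ~63MB
print(round(render_targets_mb(2560, 1600, msaa_samples=4)))  # ~125MB
print(round(render_targets_mb(5885, 1080, msaa_samples=4)))  # ~194MB
```

Even the naive maths shows the triple-screen array demanding three times the 1080p budget before a single texture has been loaded.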
Sadly then, if you're sitting on a budget rig like the Daw Computers machine we've looked at this month, you're probably not going to benefit from dropping in a second HD 7850 1GB if you want to up the resolution. That's a shame for my favourite budget card, but not really a surprise.
It's all about the numbers
All the tests here were performed on our über test rig comprising an Asus X79 Rampage IV Extreme with a Xeon E5-2687 eight-core CPU and 16GB Corsair 2,133MHz RAM. We used three 22-inch Iiyama screens to make up our triple-screen setup, and benchmarked all our games and synthetic tests at an eye-watering 5,885 x 1,080 resolution.
All the gaming benchmarks ran happily with our multi-GPU/multiscreen setups, though both the Unigine synthetic tests - Heaven and Valley - performed worse with twin GPUs in a surround setup.
And the winner is… AMD HD 7970 GHz Edition CF
The purpose of this test isn't to encourage you to go out and blow your life savings on a triumvirate of screens and a new set of graphics cards so you can indulge in a triple-screen setup. I think we've established that's a gaming array only those with an enormous disposable income could consider, given the limited benefits and the massive price premium it entails.
No, the point of this test is to see how the current crop of graphics cards is set for a future beyond our current resolution limits. We still test all our hardware at the max 30-inch res of 2,560 x 1,600, but with 3D pretty much dead and buried, the next push for TVs and console hardware is going to be towards 4K.
The 3,840 x 2,160 resolution that 4K represents is the goal for hardware manufacturers wanting to move beyond HD. When the original HD consoles launched, 1080p as a standard seemed a long way off, but it's become the norm. Now the next generation is on the way and is targeting 4K, so that should become the gaming standard a little way down the line.
We stuck three screens together in a 5,885 x 1,080 surround setup to give us a bit of a yardstick. The 4K res is around eight megapixels (MP), while our triple-screen array is just over 6MP, and it has to be said that while it's definitely punishing, our current multi-GPU setups can cope admirably.
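The megapixel maths there is straightforward enough to sanity-check - here's the same comparison in a couple of lines:

```python
# Total pixels pushed per frame at each resolution, in megapixels.
def megapixels(width, height):
    return width * height / 1_000_000

print(round(megapixels(2560, 1600), 2))  # 30-inch panel: 4.1
print(round(megapixels(5885, 1080), 2))  # triple-screen: 6.36
print(round(megapixels(3840, 2160), 2))  # 4K: 8.29
```

So the triple-screen array sits a little over three-quarters of the way to 4K's pixel load - close enough to make it a useful yardstick.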
The winner though has to be AMD's HD 7970 GHz Edition in CrossFireX trim. The performance is astonishing compared with what Nvidia's finest can do with a £1,600 SLI pairing, and shows how smart the Texans have been by dropping an unprecedented 3GB GDDR5 with a 384-bit bus onto their top two cards.
That said, the setup woes of AMD's multi-GPU/multi-screen tech almost tainted my view of the performance figures. You know there will be times when that game you're desperate to play on day one simply won't work with this setup.
That's where Nvidia's GTX Titan comes in. It's nowhere near the fastest here, but no other single-GPU card is capable of consistently throwing around polygons at this resolution at those sorts of speeds. Because it's a single card there's less worry about day one driver problems, and Nvidia hasn't put a third of its eggs in the DisplayPort basket either, so it's more versatile for a multi-screen setup too.
So there's the experience vs raw performance argument in a nutshell, and if the test stopped there that would be all I'd write, but the second-tier AMD cards are almost as capable as the top-tier at this resolution, and are far cheaper than the GTX Titan on its own.
Other than the Titan, there really isn't a place here for the Nvidia cards at such high resolutions. The smaller memory capacities and weaker memory buses hobble them once you move past 2,560 x 1,600, and when you factor in AMD's lower prices, its claims to GPU leadership this generation certainly bear fruit.