Part of my extra-curricular testing after Computex this year put a Sharp 4K30 monitor in my hands for three days, along with a variety of AMD and NVIDIA GPUs on an overclocked Haswell system.  With my test-bed SSD at hand and limited time, I was able to run my normal motherboard gaming benchmark suite at this crazy resolution (3840x2160) across several GPU combinations.  Many thanks to GIGABYTE for this brief but eye-opening opportunity.

The test setup is as follows:

Intel Core i7-4770K @ 4.2 GHz, High Performance Mode
Corsair Vengeance Pro 2x8GB DDR3-2800 11-14-14
GIGABYTE Z87X-OC Force (PLX 8747 enabled)
2x GIGABYTE 1200W PSU
Windows 7 64-bit SP1
Drivers: GeForce 320.18 WHQL / Catalyst 13.6 Beta

GPUs:

NVIDIA

GPU         Model            Cores / SPs   Core MHz   Memory     Memory MHz   Memory Bus
GTX Titan   GV-NTITAN-6GD-B  2688          837        6 GB       1500         384-bit
GTX 690     GV-N690D5-4GD-B  2 x 1536      915        2 x 2 GB   1500         2 x 256-bit
GTX 680     GV-N680D5-2GD-B  1536          1006       2 GB       1500         256-bit
GTX 660 Ti  GV-N66TOC-2GD    1344          1032       2 GB       1500         192-bit

AMD

GPU         Model            Cores / SPs   Core MHz   Memory     Memory MHz   Memory Bus
HD 7990     GV-R799D5-6GD-B  2 x 2048      950        2 x 3 GB   1500         2 x 384-bit
HD 7950     GV-R795WF3-3GD   1792          900        3 GB       1250         384-bit
HD 7790     GV-R779OC-2GD    896           1075       2 GB       1500         128-bit

For several of these GPUs we had multiple units of the same model on hand.  As a result, we tested from one GTX Titan up to four, 1x GTX 690, 1x and 2x GTX 680, 1x GTX 660 Ti, 1x HD 7990, 1x and 3x HD 7950, and 1x HD 7790.  Several more groups of GPUs were available, but alas we did not have time.  For the time being we are also not doing any extra analysis on the multi-AMD setups, which we know can have issues; as I have not personally got to grips with FCAT, I felt that running numbers would be more beneficial than learning a new testing procedure on limited time.

Games:

As I only had my motherboard gaming tests available and little time to download fresh ones (you would be surprised how slow internet in Taiwan can generally be, especially during working hours), we have our standard array of Metro 2033, Dirt 3 and Sleeping Dogs.  Each one was run at 3840x2160 and maximum settings, as per our standard Gaming CPU procedure (maximum settings as far as each benchmark GUI allows).
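
To put the pixel load in perspective, 3840x2160 is exactly four 1080p screens' worth of pixels.  A quick back-of-the-envelope calculation (a minimal Python sketch, nothing more) shows the jump:

```python
# Pixel counts: 4K UHD versus common gaming resolutions.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,   # 2,073,600 px
    "1440p (2560x1440)": 2560 * 1440,   # 3,686,400 px
    "4K UHD (3840x2160)": 3840 * 2160,  # 8,294,400 px
}

base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / base:.1f}x 1080p)")
```

Every frame at 4K asks the GPU to shade four times the pixels of a 1080p frame, before any anti-aliasing is applied on top.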

Metro 2033, Max Settings, 3840x2160:

[Chart: Metro 2033, 3840x2160, Max Settings]

Straight off the bat comes a bit of a shocker: to get 60 FPS, we need FOUR Titans.  Three HD 7950s managed 40 FPS, though plenty of microstutter was visible during the run.  On both of the lower-end cards, the HD 7790 and GTX 660 Ti, the full-quality textures did not seem to load properly.
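
To get a feel for why it takes four cards, a simple scaling model helps.  The single-card FPS and the 90% per-extra-card efficiency below are illustrative assumptions for the sketch, not our measured figures:

```python
# Rough multi-GPU FPS estimator: assumes each card beyond the first
# contributes a fixed fraction of a single card's throughput.
# Illustrative numbers only, not measured results.
def estimate_fps(single_gpu_fps: float, num_gpus: int, efficiency: float = 0.9) -> float:
    return single_gpu_fps * (1 + efficiency * (num_gpus - 1))

# Hypothetical card managing ~15 FPS alone at 3840x2160:
for n in range(1, 5):
    print(f"{n} GPU(s): ~{estimate_fps(15.0, n):.0f} FPS")
```

Under those assumptions a ~15 FPS single card only approaches 60 FPS around the fourth GPU, which is the shape of what we saw here.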

Dirt 3, Max Settings, 3840x2160:

[Chart: Dirt 3, 3840x2160, Max Settings]

Dirt 3 is a title that loves MHz and GPU power, and thanks to its engine it is quite happy to run at around 60 FPS on a single Titan.  Understandably this means that almost every other card needs at least two GPUs to hit that number, and more again if you have the opportunity to run 4K in 3D, as the sketch below shows.
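
On the 3D point, the arithmetic is blunt: stereoscopic rendering draws one frame per eye, so holding a 60 FPS experience doubles the throughput target.  A trivial sketch:

```python
# Stereo 3D renders one frame per eye, doubling the throughput target.
target_fps_per_eye = 60
eyes = 2
print(f"Render rate needed for 4K 3D at 60 FPS per eye: {target_fps_per_eye * eyes} FPS")
```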

Sleeping Dogs, Max Settings, 3840x2160:

[Chart: Sleeping Dogs, 3840x2160, Max Settings]

Similarly to Metro, Sleeping Dogs (with full SSAA) can bring graphics cards to their knees.  Interestingly, the scenes in the benchmark that ran well were counterbalanced by the indoor manor scene, which could run at slower than 2 FPS on the more mid-range cards.  To see a full 60 FPS average with maximum SSAA, we are looking at a quad-SLI setup of GTX Titans.
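
It is worth dwelling on how hard one crawling scene drags a run average down.  Averaging frames over total time (rather than naively averaging per-scene FPS) makes the effect clear; the scene numbers below are hypothetical, chosen to mimic a run where one indoor scene sits near 2 FPS:

```python
# How one very slow scene drags a benchmark average down.
# Hypothetical (avg fps, duration in seconds) pairs per scene.
scenes = [(40.0, 60.0), (35.0, 60.0), (2.0, 30.0)]

total_frames = sum(fps * secs for fps, secs in scenes)
total_time = sum(secs for _, secs in scenes)
print(f"Run average: {total_frames / total_time:.1f} FPS")  # ~30.4 FPS
```

Thirty seconds at 2 FPS contributes barely any frames but a fifth of the run time, so two otherwise comfortable scenes get pulled down to a ~30 FPS average.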

Conclusion:

First of all, the minute you experience 4K with appropriate content, it is worth a long double take.  With a native 4K screen and a decent frame rate, it looks stunning.  Although you have to sit further back to take it all in, it is fun to get up close and see just how good the image can be.  The only downside in my testing (apart from some of the low frame rates) came when the realisation kicked in that I was running at 30 Hz: the visual tearing in Dirt 3 during high-speed sections was hard to miss.
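
The tearing itself is simple arithmetic: at 30 Hz each scan-out takes roughly 33 ms, so an uncapped game rendering well above 30 FPS pushes several frames into a single refresh, and every frame boundary inside a scan-out is a visible tear line.  A minimal sketch:

```python
# Frames delivered per scan-out on a 30 Hz panel (uncapped rendering).
refresh_hz = 30.0
for game_fps in (30.0, 60.0, 90.0):
    per_refresh = game_fps / refresh_hz
    print(f"{game_fps:.0f} FPS on {refresh_hz:.0f} Hz panel: "
          f"~{per_refresh:.1f} frames per scan-out")
```

At Dirt 3's frame rates a single 30 Hz scan-out can contain slices of two or three different frames, which is consistent with the tearing we saw on the track.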

But the newer the game, and the more elaborate you wish to be with the advanced settings, the more horsepower 4K is going to require, and plenty of it.  Once 4K monitors hit a nice price point for 60 Hz panels (sub-$1500), the gamers who like to splash out on their graphics cards will start jumping on 4K screens.  I mention 60 Hz because the 30 Hz panel we tested on looked fairly poor in the high-FPS Dirt 3 scenarios, with clear tearing on the ground as the car raced through the scene.  Currently users in North America can get the Seiki 50” 4K30 monitor for around $1500, and Seiki recently announced a 39” 4K30 monitor for around $700.  ASUS are releasing their 31.5” 4K60 monitor later this year for around $3800, which might bring about the start of the resolution revolution, at least for the high-end prosumer space.

All I want to predict at this point is that driving screen resolutions up will force a sharp increase in graphics card performance, as well as in multi-card driver compatibility.  No matter the resolution, enthusiasts will want to run their games with all the eye candy, even if it takes three or four GTX Titans to get there.  For the rest of us, right now on our one or two mid-to-high-end GPUs, we might have to wait 2-3 years for the prices of the monitors to come down and the power of mid-range GPUs to go up.  These are exciting times, and we have not even touched on what might happen in multiplayer.  The next question is the console placement: gaming at 4K would be severely restrictive when using the equivalent of a single HD 7850 on a Jaguar core, even if it does have high memory bandwidth.  Roll on the PlayStation 5 and Xbox Two (Four?), when 4K TVs in the home might actually be a thing, perhaps by 2023.

[Image: 16:9 4K resolution comparison, from Wikipedia]

Comments:

  • haukionkannel - Wednesday, July 3, 2013 - link

    Ok. So practically, current generation GPU hardware is not quite ready for 4K... Nice! So there is really a good reason to push GPU technology forward. 4K will be near $1500 in 3 to 4 years at smaller screen sizes, so NVIDIA and AMD have that much time to make it happen. Until then, 3 to 4 card combinations are the solution for those who really can afford them. The need for GPU upgrades has been stagnant for such a long time that this is actually refreshing!
    I would take a 4K screen now if I could afford it. I would run all desktop applications in 4K mode and games in 1080p until I saw enough GPU power for it with a single or two-card combination.
  • damianrobertjones - Wednesday, July 3, 2013 - link

    "3840x2160" - I like the fact that it runs as it was intended, full res, nothing scaled etc. Shame that Apple and Google machines get so much praise while offering zero increased 'working' space.
  • Pastuch - Wednesday, July 3, 2013 - link

    You can't be serious. I can't even use my 2560x1440 monitor without large screen fonts, which look awful in Windows. I have 20/20 vision too, so it has nothing to do with eyesight. The fact that Android scales so beautifully is a huge advantage. The smallest 4K monitor with comfortable font sizes without scaling the UI is over 70 inches for the bulk of North Americans. Eyesight is NOT improving.
  • Pastuch - Wednesday, July 3, 2013 - link

    The smallest 4K monitor with comfortable font sizes IN WINDOWS without scaling the UI is over 70 inches for the bulk of North Americans.
  • pprime - Thursday, July 4, 2013 - link

    What I want to know is, since this is as near as makes no difference to 4x 27" 1080p monitors, do I get the same effect of viewing 4 times the content, or do I just get 4 times the detail?

    Let's say in your game on the 27" the aspect ratio lets you see a 5-foot radius (horizontally) in front of you (it's an example); with this, will I see a 10-foot radius, or will I see the same 5 feet, just bigger?

    If it's the latter, what's the point for gaming?
  • Mithan - Monday, July 8, 2013 - link

    Happy with my 2 Dell 2407/2412m monitors.
    1920x1200 is fine for me.
  • bds71 - Tuesday, July 9, 2013 - link

    did anyone notice the perfect scaling of Sleeping Dogs with the Titan and the 7950? not sure why the 680 performed so completely and utterly horrendously. Ian: any insight?

    Titan:
    13.63 -> 27.4 -> 41.07 -> 57.78
    1 -> 2.01 -> 3.01 -> 4.24

    7950:
    11.1 -> -> 34.58
    1 -> -> 3.12

    680:
    6.25 -> 8.55
    1 -> 1.37 (ouch?)

    what allows the Titan and the 7950 to scale so perfectly (other than the obvious: truly GPU-limited graphics vs resolution limiting)? and why does the 680 suck so badly with this title and scale so poorly (i'm thinking the driver is simply not optimized)? do you think this is indicative of 680 scaling in general? (this affects me personally because i'm looking to get a 2nd 690 specifically for 4k/UHD gaming!!)

    maybe 4k (UHD for those who care) can shed some light on architectural differences not only between AMD and nVidia, but between generations from the same maker as well. i look forward to any insight you can offer (now, and down the road)

    note: the new 55/65/73 in. Sony Wega line is advertised as both 4K and UHD - for those who are offended by this lack of distinction, blame the manufacturers :) for those who are interested, the costs are $5k, $7k, and a whopping $25k for the 73 (ouch!)
  • geok1ng - Tuesday, July 9, 2013 - link

    the numbers provide ZERO evidence of a VRAM limitation at 4K resolutions. in Metro the 6GB Titan is much slower than the 2x3GB 7990. in Dirt the 6GB Titan is slower than the 2x2GB 690. in Sleeping Dogs the 6GB Titan has less than half the performance of the 2x3GB 7990, or 1/3 of the performance of 3x 3GB 7950s. As tested before, it's only at 3x1440p/3x1600p resolutions that you start to see some VRAM limitation in a few games. It's amazing how confirmation bias can work on the human mind and distort reality. there is not a single number in the article that could remotely speak of a VRAM limitation, but we have dozens of comments saying as much. as xkcd once joked, "dear god, I would like to file a bug report"
  • mac2j - Tuesday, July 9, 2013 - link

    Someone needs to beat the HDMI 2.0 group with their own cables until they release... I'm pretty sure they promised Q2-Q3 2013. It's crazy we can't get 4K60s right now (or more than 8-bit color) because a bunch of EEs can't get it together and roll out a long-overdue cable. (And yeah, I know it's the content providers hassling them about security, but it's definitely time to tell them to STFU.)
  • Parablooper - Friday, July 12, 2013 - link

    Why no 7970/GE...? Crossfiring those tends to give better performance than a 7990, and it's also AMD's flagship card, so its absence is a little troubling.
