Part of my extra-curricular testing after Computex this year put a Sharp 4K30 monitor in my hands for three days, along with a variety of AMD and NVIDIA GPUs on an overclocked Haswell system.  With my test-bed SSD at hand and limited time, I was able to run my normal motherboard gaming benchmark suite at this crazy resolution (3840x2160) across several GPU combinations.  Many thanks to GIGABYTE for this brief but eye-opening opportunity.

The test setup is as follows:

Intel Core i7-4770K @ 4.2 GHz, High Performance Mode
Corsair Vengeance Pro 2x8GB DDR3-2800 11-14-14
GIGABYTE Z87X-OC Force (PLX 8747 enabled)
2x GIGABYTE 1200W PSU
Windows 7 64-bit SP1
Drivers: GeForce 320.18 WHQL / Catalyst 13.6 Beta

GPUs:

NVIDIA:

GPU         Model            Cores / SPs   Core MHz   Memory Size   Memory MHz   Memory Bus
GTX Titan   GV-NTITAN-6GD-B  2688          837        6 GB          1500         384-bit
GTX 690     GV-N690D5-4GD-B  2 x 1536      915        2 x 2 GB      1500         2 x 256-bit
GTX 680     GV-N680D5-2GD-B  1536          1006       2 GB          1500         256-bit
GTX 660 Ti  GV-N66TOC-2GD    1344          1032       2 GB          1500         192-bit

AMD:

GPU         Model            Cores / SPs   Core MHz   Memory Size   Memory MHz   Memory Bus
HD 7990     GV-R799D5-6GD-B  2 x 2048      950        2 x 3 GB      1500         2 x 384-bit
HD 7950     GV-R795WF3-3GD   1792          900        3 GB          1250         384-bit
HD 7790     GV-R779OC-2GD    896           1075       2 GB          1500         128-bit

For some of these GPUs we had several of the same model at hand, so we tested from one GTX Titan up to four, 1x GTX 690, 1x and 2x GTX 680, 1x GTX 660 Ti, 1x HD 7990, 1x and 3x HD 7950, and 1x HD 7790.  Several more groups of GPUs were available, but alas we did not have time.  For the time being we are also not doing any frame-time analysis on the multi-AMD setups, which we know can have issues; as I have not yet got to grips with FCAT personally, I felt it would be more beneficial to run numbers than to learn new testing procedures on site.

Games:

As I only had my motherboard gaming tests available and little time to download fresh ones (you would be surprised how slow Taiwan internet can be in general, especially during working hours), we have our standard array of Metro 2033, Dirt 3 and Sleeping Dogs.  Each one was run at 3840x2160 and maximum settings using our standard Gaming CPU procedures (the maximum settings the benchmark GUI allows).
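To put the jump in context, 3840x2160 is exactly four times the pixel count of 1920x1080, so fill-rate-bound workloads face roughly four times the per-frame shading work.  A quick sketch of the arithmetic (the set of resolutions compared is my own illustration, not part of the test suite):

```python
# Pixel counts per frame for common resolutions, relative to 1080p.
# Shading and fill-rate costs scale roughly with pixel count, so 4K
# is in the ballpark of 4x the per-frame pixel work of 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

This is only a first-order approximation: geometry, physics and CPU-side costs do not scale with resolution, which is part of why some engines cope better than others.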

Metro 2033, Max Settings, 3840x2160:


Straight off the bat is a bit of a shocker: to get 60 FPS we need FOUR Titans.  Three 7950s managed 40 FPS, though there was plenty of visible microstutter during the run.  On both of the lower-end cards, the 7790 and the 660 Ti, the full-quality textures did not seem to load properly.

Dirt 3, Max Settings, 3840x2160:


Dirt 3 is a title that loves MHz and GPU power, and thanks to its engine it is quite happy to run at around 60 FPS on a single Titan.  Understandably, this means that almost every other card needs at least two GPUs to hit that number, and more still if you have the opportunity to run 4K in 3D.

Sleeping Dogs, Max Settings, 3840x2160:


Similarly to Metro, Sleeping Dogs (with full SSAA) can bring graphics cards to their knees.  Interestingly, the scenes in the benchmark that ran well were counterbalanced by the indoor manor scene, which could drop below 2 FPS on the more mid-range cards.  To see a full 60 FPS average with maximum SSAA, we are looking at a quad-SLI setup of GTX Titans.

Conclusion:

First of all, the minute you experience 4K with appropriate content, it is worth a long double take.  With a native 4K screen and a decent frame rate, it looks stunning.  Although you have to sit further back to take it all in, it is fun to get up close and see just how good the image can be.  The only downside to my testing (apart from some of the low frame rates) came when the realisation that the panel runs at 30 Hz kicked in: the visual tearing in Dirt 3 during high-speed sections was hard to miss.

But the newer the game, and the more elaborate you wish to be with the advanced settings, the more horsepower 4K is going to require, and plenty of it.  Once 60 Hz 4K monitors hit a nice price point (sub $1500), the gamers who like to splash out on their graphics cards will start jumping on 4K screens.  I mention 60 Hz because the 30 Hz panel we tested on looked fairly poor in the high-FPS Dirt 3 scenarios, with clear tearing on the ground as the car raced through the scene.  Currently users in North America can get the Seiki 50” 4K30 monitor for around $1500, and Seiki recently announced a 39” 4K30 model for around $700.  ASUS are releasing their 31.5” 4K60 monitor later this year for around $3800, which might mark the start of the resolution revolution, at least for the high-end prosumer space.

All I want to predict at this point is that driving screen resolutions up will demand a sharp increase in graphics card performance, as well as in multi-card driver compatibility.  No matter the resolution, enthusiasts will want to run their games with all the eye candy, even if it takes three or four GTX Titans to get there.  For the rest of us, on our one or two mid-to-high-end GPUs, we might have to wait 2-3 years for monitor prices to come down and mid-range GPU power to go up.  These are exciting times, and we have not even touched on what might happen in multiplayer.  The next question is where the consoles fit in: gaming at 4K would be severely restrictive using the equivalent of a single HD 7850 on a Jaguar core, even with its high memory bandwidth.  Roll on PlayStation 5 and Xbox Two (Four?), when 4K TVs in the home might actually be a thing by 2023.
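For readers weighing screen size against sharpness, the useful number is pixel density.  As a rough sketch (the panel diagonals below are illustrative choices, not measurements of the review units):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 31.5" 4K panel versus a common 27" 1440p monitor and a 50" 4K TV.
print(f'31.5" 4K:  {ppi(3840, 2160, 31.5):.0f} PPI')
print(f'27" 1440p: {ppi(2560, 1440, 27.0):.0f} PPI')
print(f'50" 4K TV: {ppi(3840, 2160, 50.0):.0f} PPI')
```

Put another way, at a fixed pixel density a 4K panel covers twice the diagonal of a 1080p one, which is why the living-room argument is about screen size as much as resolution.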

16:9 4K Comparison image from Wikipedia


  • Kjella - Monday, July 1, 2013 - link

    That's what I found out too: I could use a 30"/4K monitor, but my 60"/1080p TV is fine for my couch distance. Now if they sold 100-120" 4K TVs or 4K projectors priced for mortals it would be different; otherwise I'd have to get a lot closer.
  • Dribble - Monday, July 1, 2013 - link

    In the end it'll be worth it, but res is only one part of the package for a good gaming monitor - you need low input lag, 120hz refresh, good colours, etc. Are there any 4K monitors with 60hz refresh even, let alone 120hz (most are 30hz right now)?

    So right now you have to spend a fortune on something that ticks the resolution box lots of times, but has a lot of x's elsewhere.

    You'd be best with a 3*1080p @ 120hz surround with lightboost for fast paced games.
  • JeBarr - Monday, July 1, 2013 - link

    +1

    I'd take a single 23 or 24 inch 1080p 120Hz+ TN panel over a 4k 30/60Hz any and every day.
  • This Guy - Monday, July 1, 2013 - link

    I can see pixels on a 27" 2560*1440 monitor at around a metre. Full stops are still made up of just six pixels at my preferred text size and hence look blocky.

    Those rules of thumb are silly. Why would we have movie theaters that clearly break those rules unless people enjoyed massive screens?

    Next, these tests are at max settings, including max AA. Dual 770s can average 44 fps in Sleeping Dogs at 7680*1440 (33% more pixels than 4K) with 2x AA and everything else maxed. The GPUs + CPU are rated for around 550 W. So yes, 1 kW for 4K is stupid.

    Lastly, if you want a 35W CPU and 75W GPU, try a laptop. Intel has been selling 35W quad cores that turbo to around 3GHz for the last three generations. ATI and Nvidia both have compelling products that will easily push 1080p at max settings if you go easy on the AA. Best part is you get an extra screen, a UPS and a small form factor.
  • xTRICKYxx - Monday, July 1, 2013 - link

    Don't worry, we keep getting better performance per watt every year. It's a no-brainer that we will eventually have notebooks playing 4K games natively while using less power than they do now.

    Current GPU technology does not work well with 4K. But it will.
  • douglord - Monday, July 1, 2013 - link

    Of course you think 1080p is fine sitting 3 m from a 40" screen. YOU CAN'T EVEN SEE YOUR TV!!! :P
    I've got less than 10' between me and a 65", and I'm installing a 120" drop-down screen in front of that. I have a 30" on my desktop and would go bigger if I could. 4K is not BS, but you need 4K content, discs that can store it uncompressed, players and screens. No upsampling etc...

    The delta for true 4K is almost as big as DVD to Blu-ray or cable to DVD.
  • Gigaplex - Tuesday, July 2, 2013 - link

    Why do you need to store it uncompressed? We have lossless compression for a reason.
  • tackle70 - Tuesday, July 2, 2013 - link

    Meh... 1080p is for people who sit far away from their screen and/or for people with lousy eyesight. I sit about 3 feet away from my 27" 1440p monitor and I can see the pixels quite easily. I'd love a higher resolution screen!

    For TVs, it's pointless because there's no 4k content for a TV and there won't be for a long time. But for a monitor attached to a high end PC, it's great!
  • CaedenV - Wednesday, July 3, 2013 - link

    well, if you insist on using a small 22-24" monitor, then I would have to agree with you that 4K would be overkill; But nobody is going to buy a 4K 22" monitor for their computer (though in time I expect 4K 10" tablets as an extension of 5" 1080p phones). We are going to be buying 4K monitors for our computers in the 35-45" range, and still sitting just as close to the monitor as we do currently. At those sizes and distances a 45" 4K monitor is going to have almost the exact same pixel density as your 22" 1080p screen. But the 45" screen will be huge and immersive, while your 22" screen is on the small side even by today's standards.

    I am currently staring at a 45" cardboard cutout which is sitting right behind my 28" monitor and it fits my field of vision quite nicely. It is big, and I am probably going to get a tan just from turning it on, but someday in the next few years that cardboard cutout will be a 4K monitor, and I am going to be a very happy nerd.

    For the living room 4K is going to be huge. As you mentioned, 3m distance equals a 45-50" 1080p TV. 4K has a similar rule, and you just double the size of the TV. At 3m you would technically want a 90-100" TV. The pixel density is the same, but the TV fills more of your vision out of sheer size. 90" is very large... but it is not so large that it is not going to fit in a house (though transporting it there may be a trick).

    But when you start talking about 8K, then the size doubles again. Meaning that the optimal size for an 8K set at 3m would be 180"... which is enormous! That is 9 feet away, but with a diagonal size of 15 feet! We are talking about a 7.5 foot tall screen that is 13 feet wide! That would not even fit in my front door, and the screen would be as tall as the walls in my home before you even add height for a stand and bezel.
    So when you start talking about 8K not being practical, then I will believe you, because it plainly isn't practical. I can even say with some certainty that I will probably never own a TV larger than 140" even if it were affordable, simply due to size constraints in my home. I may at some point own an 8K TV or monitor, but I am under no illusion that I am going to see any great improvement between 4K and 8K at screen sizes that fit my field of vision. But if it becomes standard and affordable you are not going to hear me belly-aching about how "4K is good enough, and 8K brings nothing to the table but pain and misery". Instead I will get my eyes augmented so that I can appreciate the glory of 16K screens...

    Lastly, for the game results, keep in mind that these games were played at max settings, including AA/AF turned way up, and Dirt was already playable on a single high-end GPU. At these resolutions AA and AF are essentially not needed (or maybe at a 2x setting?). This is not going to make these games all of a sudden playable on my GTX570... but a GTX1070 may be able to play more intense games at these resolutions with low AA at decent settings without requiring me to get a 2nd mortgage.
    Or put another way: 10 years ago we were playing on the PS2 which could not even play full res standard def games at 30fps. GTA Vice City, and Tony Hawk's Underground were cutting edge games of the time, and they look absolutely terrible by today's standards! Back then nobody was imagining us playing games at 1080p at 120 fps in 3D with passable physics and realistic textures while being on the verge of realistic lighting... But today you can do all of that, and while it requires decent hardware, it does not require a 4 Titan setup to achieve.

    Point is that we are still a year and a half away from general availability and a wide selection of 4K screens. And another 2 years after that before the price will hit a point where they start selling in real volume and a decent amount of 4K content becomes available. That puts us 3.5 years in the future which will be right on track for high end setups to be playing maxed out settings at nearly 60fps on these screens. Another 2 years after that (5 years from now) and mainstream cards will be able to manage these resolutions just fine. After that it is all gravy.
    Rome was not built in a day, and moving to a new resolution standard does not happen overnight. If you still like your 1080p screen 5 years from now then buy a next gen console when they come out and enjoy it! They will not be playing 4K games for another 9 years. But the PC will be playing 4K resolution in 3-5 years, and we will pay extra to do it, but we will enjoy it. If nothing else, hitting the 4K 'maximum' will finally put an end to the chasing of graphics at the cost of all else, and we will start to see a real focus on story telling, plot, and acting.
  • StealthGhost - Monday, July 1, 2013 - link

    They call it "4K" because that's how much you have to spend on GPUs to get playable FPS!

    And I thought I had it rough at 1440p
