Meet The New Future of Gaming: Different Than The Old One

Up until last month, NVIDIA had been pushing a different, more conventional future for gaming and video cards, perhaps best exemplified by their recent launch of 27-inch 4K G-Sync HDR monitors, courtesy of Asus and Acer. The specifications of those displays represented – and still represent – the aspirational capabilities of PC gaming graphics: 4K resolution, a 144 Hz refresh rate with G-Sync variable refresh, and high-quality HDR. The future was maxing out the graphics settings of a visually rich game, enabling HDR, and rendering at 4K with triple-digit average framerates on a large screen. That target was not achievable with current performance – certainly not by single-GPU cards. In the past, multi-GPU configurations were a stronger option provided that stuttering was kept in check, but recent years have seen AMD and NVIDIA take a step back from CrossFireX and SLI, respectively.

Particularly with HDR, NVIDIA was promising a qualitative rather than quantitative enhancement to the gaming experience. Faster framerates and higher resolutions were better-known quantities, easily demoed and with more intuitive benefits: higher resolution meant more possible detail, and higher, more even framerates meant smoother gameplay and video – though the perception of 30fps as 'cinematic' lingered for years, and 1080p still remains stubbornly popular. Variable refresh rate technology soon followed, resolving the dilemma between screen tearing and V-Sync input lag, though it too took time to catch on to where it is now: nigh mandatory for a higher-end gaming monitor.

For gaming displays, HDR was substantively different from adding graphical detail or smoothing gameplay and playback, because it added a new dimension to gaming: more possible colors, brighter whites, and darker blacks. And because HDR required support from the entire graphics chain, as well as a high-quality HDR monitor and HDR content to fully take advantage of it, it was harder to showcase. Added to the other aspects of high-end gaming graphics, and pending the further development of VR, this was the future on the horizon for GPUs.

But today NVIDIA is switching gears, going after the fundamental way computer graphics are modelled in games today. Among the more realistic rendering approaches, light can be simulated as rays emitted from their respective sources, but computing even a subset of those rays and their interactions (reflection, refraction, etc.) in a bounded space is so intensive that rendering in real time was impossible. To get the performance needed for real time rendering, rasterization instead boils 3D objects down to 2D representations to simplify the computations, substantially faking the behavior of light.
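To give a sense of why ray tracing is so intensive, the sketch below shows the core operation a ray tracer repeats for every ray: an intersection test against scene geometry. This is an illustrative toy in Python, not any real engine's or API's code; the function name and scene are our own invention.

```python
# Minimal ray-sphere intersection test: the kernel a ray tracer runs for
# every ray cast into the scene. At 4K, even a single ray per pixel with
# one bounce means tens of millions of such tests per frame, before any
# shading is done -- which is why full real time ray tracing was out of reach.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit along the ray, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t,
    # assuming direction is normalized (so the quadratic's 'a' term is 1).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray down +z hits a unit sphere centered 5 units away at distance 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

Rasterization avoids this per-ray cost entirely by projecting triangles onto the screen once, which is why it has dominated real time graphics.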

It’s on real time ray tracing that NVIDIA is staking its claim with GeForce RTX and Turing’s RT Cores. Covered more in-depth in our architecture article, NVIDIA’s real time ray tracing implementation takes all the shortcuts it can get, incorporating select ray traced effects with significant denoising while keeping rasterization for everything else. Unfortunately, this hybrid rendering isn’t orthogonal to the goals above. Now the ultimate experience would be hybrid rendering at 4K with HDR support at high, steady, and variable framerates – and GPUs already lacked the performance to reach that point under traditional rasterization.

There’s still a performance cost incurred by real time ray tracing effects, except that right now only NVIDIA and developers have a clear idea of what it is. What we can say is that utilizing real time ray tracing effects in games may require sacrificing some or all of high resolution, ultra high framerates, and HDR. HDR is limited by game support more than anything else, but the first two have arguably minimum standards for modern high-end PC gaming: anything under 1080p is completely unpalatable, and anything under 30fps – or more realistically, 45 to 60fps – hurts playability. Variable refresh rate can mitigate the latter, and framedrops are temporary, but low resolution is forever.
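The scale of that tradeoff is easy to put in numbers. The back-of-the-envelope sketch below compares raw pixel throughput at the 4K/144Hz target against a 1080p/60 floor; any fixed per-pixel ray tracing cost multiplies across both resolution and refresh rate.

```python
# Rough pixel-throughput arithmetic for the resolution/framerate tradeoff.
# Any per-pixel ray tracing cost scales with this product, so the gap
# between the aspirational target and the practical floor is large.
def pixels_per_second(width, height, fps):
    return width * height * fps

uhd_144 = pixels_per_second(3840, 2160, 144)  # the G-Sync HDR target
fhd_60  = pixels_per_second(1920, 1080, 60)   # the practical floor

print(uhd_144 / fhd_60)  # -> 9.6: the 4K/144 target shades 9.6x more pixels
```

In other words, a per-pixel effect that is affordable at 1080p60 needs nearly an order of magnitude more throughput to survive at 4K144.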

Ultimately, the real time ray tracing support needs to be implemented by developers via a supporting API like DXR – and many have been working hard on doing so – but currently there is no public timeline of application support for real time ray tracing, Tensor Core accelerated AI features, and Turing advanced shading. The list of games with support for Turing features - collectively called the RTX platform - will be available and updated on NVIDIA's site.


338 Comments


  • Santoval - Wednesday, September 19, 2018 - link

The problem is that it does not bring those things to the current table, but is going to bring them to a future table. Essentially they expect you to buy a graphics card whose advanced features no current game can use, merely on faith that it both will deliver them in the future *and* that they will be worth the very high premium.

If there is one ultimate unwritten rule when buying computers, computer parts or anything really, it must be this one: Never buy anything based on promises of *future* capabilities - always make your purchasing decisions based on what the products you buy can deliver *now*. All experienced computer and console consumers, in particular, must have that maxim engraved on their brains after having been burnt by so many broken promises.
    Reply
  • Writer's Block - Monday, October 1, 2018 - link

That is certainly true; 'we promise' - politicians and companies selling their shit use it a lot... And break it about as often.
Reply
  • Inteli - Wednesday, September 19, 2018 - link

    It's not that the price increase wasn't warranted, at least from the transistor count perspective, it's that there's not a lot to show for it.

Many more transistors...concentrated in Tensor Cores and RT Cores, which aren't being touched in current games. The increased price is for a load of baggage that will take at least a year to really get used (and before you say it, 3 games is not "really getting used"). We're used to new GPUs performing better in current games for the same price, not performing the same in current games for the same price (and I'm absolutely discounting everything before 2008 because that was 10 years ago and the expectations of what a new μArch should bring have changed).

    I get the whole "future of gaming" angle you're pushing, and it's a perfectly valid reason to buy these new GPUs, but don't act like an apples-to-apples comparison of performance *right now* is the "wrong way of looking at it". How the card performs right now is an important metric for a lot of people, and will influence their decision. Especially when we're talking a potential price difference of $100+ (with sales on 1080 Ti's, and FE 2080 prices). Obviously there isn't a valid comparison for the 2080 Ti, but anyone who can drop $1300 on a GPU probably doesn't care too much about the price tag.
    Reply
  • Flunk - Thursday, September 20, 2018 - link

Nvidia is charging what they are because they have no competition at the top end. That's it, nothing else. They're taking in the cash today in preparation for having to price more competitively later.
Reply
  • just4U - Thursday, September 20, 2018 - link

Flunk, we are talking Nvidia here.. typically speaking they don't lower prices to compete.. Sometimes they bump too high and too few bite.. but that's about it. The last time they lowered prices to compete was the 400 series, but they'd just come off getting zonked by AMD for basically two generations.. and when they went to the 500 series it was fairly competitive with AMD.. (initially they were better, but AMD continued to improve their 5000/6000 series.. til it was consistently beating Nvidia.. did they lower prices? NO.. not one bit..)

TNT cards were competitive and cheap.. but once Nvidia knocked off all other contenders (aside from AMD) and started in with their GeForce line, they have always carried premiums, competition or not.
    Reply
  • eddman - Thursday, September 20, 2018 - link

GTX 280, launched at $650 because they thought AMD couldn't do much. AMD came up with the 4870. What happened? Nvidia cut the card's price to $500 a mere month after launch. So yes, they do cut prices to compete.
Reply
  • Dragonstongue - Thursday, September 20, 2018 - link

13.6 and 18.6 bln transistors (estimated), die sizes of 454/754 mm² (2080/2080 Ti), 12nm
7.2 and 12 bln transistors (estimated), die sizes of 314/471 mm² (1070/1080 - 1080 Ti/Titan X), 16nm

yes it is "expensive" no doubt about that, but, it is Nv we are talking about; there is a reason they are as overvalued as they are. They produce as cheaply as possible and mark prices up as much as they can, even when their actual cards shipped are nowhere near the $$$ figure they report as it should.

    also, if anything else, they always have and always will BS the numbers to make themselves ALWAYS appear "supreme" no matter if it is actual power used, TDP, API features, or transistor count etc etc etc.

as far as the ray tracing crap...if they used an open source style so that everyone can use the exact same ray tracing engine, so they can be directly compared to see how good they are or not, then it might be "worthy". but, it is Nv; they are and continue to be the "it has to be our way or you don't play" type approach...I remember way back when with PhysX (which Nv bought out Ageia to get) when Radeons were able to use it (before Nv took the ability away) they ran circles around comparable Nv cards AND used less cpu and power to do it.

    Nv does not want to get "caught" in their BS, so they find nefarious ways around everything, and when you have a massive amount of $$$$$$$$$$$$$$$$ floating everything you do, it is not hard for them to "buy silence" Intel has done so time and time again, Nv does so time and time again........blekk

    DLSS or whatever the fk they want to call it, means jack shit when only specific cards will be able to use it instead of being a truly open source initiative where everyone/everything gets to show how good they are (or not) and also stand to gain benefit from others putting effort into making it as good as it possibly can be...there is a reason why Nv barely supports Vulkan, because they are not "in control" it is way too easy to "prove them wrong"..funny because Vulkan has ray tracing "built in"

IMO if they are as good as they claim they are, they would do everything in the light to show they are "the best", not find ways to "hide" what they are doing.....their days are numbered....hell their stock price just took a hit....good IMHO because they should not be over $200 anyways, $100, maybe, but they absolutely should not be valued above others whose financials and product shipments are magnitudes larger.
    Reply
  • Spunjji - Friday, September 21, 2018 - link

Remind me why consumers should give a rat's ass about die size, other than its visible effects on price and performance.

    If you want to sell me a substantially larger, more expensive chip that performs a little better for a lot more money, a better reason is needed than "maybe it will make some games that aren't out yet really cool in a way that we refuse to give you any performance indications about".

    Screw that.
    Reply
  • Writer's Block - Monday, October 1, 2018 - link

    They look poor value; good performance, sure. But a 1080ti offers the same for much less.
    They want me to buy promises! Seriously, promises are never worth the paper they are printed on - digital or the real stuff.
    Reply
  • Writer's Block - Monday, October 1, 2018 - link

    Oh and, yeh agree. Reply
