Meet The New Future of Gaming: Different Than The Old One

Up until last month, NVIDIA had been pushing a different, more conventional future for gaming and video cards, perhaps best exemplified by their recent launch of 27-inch 4K G-Sync HDR monitors, courtesy of Asus and Acer. Those specifications represented – and still represent – the aspirational capabilities of PC gaming graphics: 4K resolution, a 144 Hz refresh rate with G-Sync variable refresh, and high-quality HDR. The future was maxing out graphics settings in a visually demanding game, enabling HDR, and rendering at 4K with triple-digit average framerates on a large screen. That target was not achievable with current performance – certainly not by single-GPU cards. In the past, multi-GPU configurations were a stronger option provided that stuttering was kept in check, but recent years have seen AMD and NVIDIA take a step back from CrossFireX and SLI, respectively.

Particularly with HDR, NVIDIA was promising a qualitative rather than quantitative enhancement to the gaming experience. Faster framerates and higher resolutions were known quantities, easily demoed and with more intuitive benefits – higher resolution means more room for detail, while higher and more even framerates mean smoother gameplay and video – though in the past 30fps was perceived as cinematic, and even now 1080p remains stubbornly popular. Variable refresh rate technology soon followed, resolving the dilemma between screen tearing and V-Sync input lag, though it too took time to catch on to where it is now: all but mandatory for a higher-end gaming monitor.

For gaming displays, HDR was substantively different from adding graphical detail or smoothing gameplay and playback, because it added a new dimension to gaming: more possible colors, brighter whites, and darker blacks. And because HDR required support from the entire graphics chain, plus a high-quality HDR monitor and HDR content to take full advantage of, it was harder to showcase. Added to the other aspects of high-end gaming graphics, and pending the further development of VR, this was the future on the horizon for GPUs.

But today NVIDIA is switching gears, turning to the fundamental way computer graphics are modelled in games today. Among the more realistic rendering approaches, light can be simulated as rays emitted from their respective sources, but computing even a subset of those rays and their interactions (reflection, refraction, etc.) in a bounded space is so intensive that real time rendering has been impossible. To get the performance needed for real time rendering, rasterization instead boils 3D objects down to 2D representations to simplify the computations, significantly faking the behavior of light.
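To give a sense of the per-ray cost, here is a minimal, illustrative sketch – not any vendor's implementation – of the most basic ray tracing operation, a ray-sphere intersection test. At 1920×1080 and 60fps, even a single primary ray per pixel means roughly 124 million of these tests per second, before counting any bounces, shadows, or reflections:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (a quadratic in t); return the nearest positive root, or None.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# One ray from the camera toward a unit sphere 5 units down -z;
# t is the distance along the ray to the nearest hit.
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
```

Real scenes multiply this by millions of triangles per frame, which is why acceleration structures and dedicated hardware are needed at all.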

It’s on real time ray tracing that NVIDIA is staking its claim with GeForce RTX and Turing’s RT Cores. Covered more in-depth in our architecture article, NVIDIA’s real time ray tracing implementation takes all the shortcuts it can get, incorporating select ray traced effects with significant denoising while keeping rasterization for everything else. Unfortunately, this hybrid rendering isn’t orthogonal to the previous goals: it competes for the same performance budget. The ultimate experience would now be hybrid rendered 4K with HDR support at high, steady, variable framerates – and GPUs didn’t have enough performance to reach that point even under traditional rasterization.
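As a rough sketch of what hybrid rendering means in practice – every function name here is a hypothetical stand-in, not NVIDIA's or any API's actual call – most of the frame is rasterized as usual, with a small ray budget spent on select effects that are denoised before compositing:

```python
# Toy stand-ins for the real pipeline stages (all names hypothetical):
def rasterize(scene, camera):
    # The bulk of the frame: conventional rasterized shading.
    return {"color": "raster"}

def trace_rays(scene, camera, effect, samples_per_pixel):
    # Very few samples per pixel keeps the cost bounded, at the price of noise.
    return "noisy_" + effect

def denoise(buffer):
    # Heavy denoising recovers a usable image from the sparse samples.
    return buffer.replace("noisy_", "clean_")

def composite(frame, overlay, name):
    # Merge the ray traced effect back into the rasterized frame.
    return {**frame, name: overlay}

def render_frame(scene, camera):
    frame = rasterize(scene, camera)                    # everything else: raster
    refl = trace_rays(scene, camera, "reflections", 1)  # select effect only
    refl = denoise(refl)
    return composite(frame, refl, "reflections")
```

The point of the sketch is the division of labor: ray tracing handles only what rasterization fakes poorly, and denoising hides how few rays were actually cast.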

There’s still a performance cost incurred by real time ray tracing effects, except right now only NVIDIA and developers have a clear idea of what it is. What we can say is that using real time ray tracing effects in games may require sacrificing some or all of high resolution, very high framerates, and HDR. HDR is limited by game support more than anything else. But the first two arguably have minimum standards for modern high-end PC gaming – anything under 1080p is completely unpalatable, and anything under 30fps, or more realistically 45 to 60fps, hurts playability. Variable refresh rate can mitigate the latter, and framedrops are temporary, but low resolution is forever.

Ultimately, real time ray tracing support needs to be implemented by developers via a supporting API like DXR – and many have been working hard on doing so – but currently there is no public timeline of application support for real time ray tracing, Tensor Core accelerated AI features, or Turing advanced shading. The list of games supporting Turing features – collectively called the RTX platform – will be available and updated on NVIDIA's site.

338 Comments

  • Hixbot - Friday, September 21, 2018 - link

    I'm not sure how midrange 2070/2060 cards will sell if they're not a significant value in performance/price compared to 1070/1060 cards. If AMD offers no competition, Nvidia should still compete with itself.
  • Wwhat - Saturday, September 22, 2018 - link

    It's interesting that every comment I've seen says a similar thing, and that nobody thinks of uses outside of gaming.
    I would think that the Tensor and RT cores would be very interesting for real raytracers and for Adobe's graphics and video software, for instance.
    I wonder, though, whether open source software will be able to successfully use the new hardware, or whether Nvidia is too closed for it to get the advantages you might expect.
    And apart from raytracers and such, there is also the software that science students use.
    And with the current interest in AI among students and developers, it might also be an interesting offering.
    Although that again relies on Nvidia playing ball a bit.
  • michaelrw - Wednesday, September 19, 2018 - link

    "where paying 43% or 50% more gets you 27-28% more performance"
    1080 Ti can be bought in the $600 range, wheres the 2080 Ti is $1200 .. so I'd say thats more than 43-50% price increase..at a minimum we're talking a 71% increase, at worst 100% (Launch MSRP for 1080 Ti was $699)
    Reply
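The commenter's arithmetic checks out; as a quick sketch using the prices quoted above:

```python
def price_increase_pct(old, new):
    # Percentage increase going from the old price to the new price.
    return round((new / old - 1) * 100, 1)

street = price_increase_pct(600, 1200)  # 1080 Ti street price vs. 2080 Ti: 100.0%
msrp = price_increase_pct(699, 1200)    # 1080 Ti launch MSRP vs. 2080 Ti: 71.7%
```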
  • V900 - Wednesday, September 19, 2018 - link

    Which is the wrong way of looking at it.

    NVIDIA didn’t just increase the price for shits and giggles; the Turing GPUs are much more expensive to fab, since you’re talking about almost 20 BILLION transistors squeezed into a few hundred mm2.

    Regardless: comparing the 2080 with the 1080 and claiming there is a 70% price increase is bogus logic in the first place, since the 2080 brings a number of things to the table that the 1080 isn’t even capable of.

    Find me a 1080 Ti with DLSS that is also capable of ray tracing, and then we can compare prices and figure out if there’s a price increase or not.
  • imaheadcase - Wednesday, September 19, 2018 - link

    It brings them to the table... on paper, more like. You literally listed the two things that are not really shown AT ALL.
  • mscsniperx - Wednesday, September 19, 2018 - link

    No, actually YOUR logic is bogus. Find me a DLSS or ray tracing game to bench. You can't. There is a reason for that: ray tracing will require a massive FPS hit. Nvidia knows this and is delaying you from seeing it as damage control.
  • Yojimbo - Wednesday, September 19, 2018 - link

    There are no ray tracing games because the technology is new, not because NVIDIA is "delaying them". As for DLSS, I think those games will appear faster than the ray tracing ones.
  • Andrew LB - Thursday, September 20, 2018 - link

    Coming soon:

    Darksiders III from Gunfire Games / THQ Nordic
    Deliver Us The Moon: Fortuna from KeokeN Interactive
    Fear The Wolves from Vostok Games / Focus Home Interactive
    Hellblade: Senua's Sacrifice from Ninja Theory
    KINETIK from Hero Machine Studios
    Outpost Zero from Symmetric Games / tinyBuild Games
    Overkill's The Walking Dead from Overkill Software / Starbreeze Studios
    SCUM from Gamepires / Devolver Digital
    Stormdivers from Housemarque
    Ark: Survival Evolved from Studio Wildcard
    Atomic Heart from Mundfish
    Dauntless from Phoenix Labs
    Final Fantasy XV: Windows Edition from Square Enix
    Fractured Lands from Unbroken Studios
    Hitman 2 from IO Interactive / Warner Bros.
    Islands of Nyne from Define Human Studios
    Justice from NetEase
    JX3 from Kingsoft
    Mechwarrior 5: Mercenaries from Piranha Games
    PlayerUnknown’s Battlegrounds from PUBG Corp.
    Remnant: From The Ashes from Arc Games
    Serious Sam 4: Planet Badass from Croteam / Devolver Digital
    Shadow of the Tomb Raider from Square Enix / Eidos-Montréal / Crystal Dynamics / Nixxes
    The Forge Arena from Freezing Raccoon Studios
    We Happy Few from Compulsion Games / Gearbox

    Funny how the same people who praised AMD for being the first to bring full DX12 support – even though only 15 games used it in the first two years – are the same people sh*tting on nVidia for bringing a far more revolutionary technology that's going to be in far more games in a shorter time span.
  • jordanclock - Thursday, September 20, 2018 - link

    Considering AMD was the first to bring support for an API that all GPUs could support, DLSS is not a comparison. DLSS is an Nvidia-only feature, and Nvidia couldn't manage to have even ONE game with DLSS on launch day.
  • Manch - Thursday, September 20, 2018 - link

    AMD spawned Mantle, which then turned into Vulkan, and also pushed MS to develop DX12, as it was in both their interests. Those APIs can be used by all.

    DLSS, while potentially very cool, is, as Jordan said, proprietary. Like HairWorks and other such features, it will get light support, but when it comes to feature sets, devs will spend most of their effort building to common ground. With consoles being AMD GPU based, guess where that will be.

    It will be interesting to see how AMD ultimately responds, as with G-Sync/FreeSync, CUDA/OpenCL, etc.

    As Nvidia has stated, these features are designed to work with how current game engines already function so the devs don't have to reinvent the wheel. Ultimately this means the integration won't be very deep, at least not for a while.

    For consumers the end goal is always better graphics at the same price point when new releases happen.

    Not that these are bad cards, just expensive, and two very key features are unavailable, which sucks. Hopefully the situation will change sooner rather than later.
