With the annual Game Developers Conference taking place next month in San Francisco, the session catalogs for the conference are finally being published, and it looks like we may be in for some interesting news on the API front. Word comes via the Tech Report and regular contributor SH SOTN that three different low-level API sessions have popped up in the session catalog thus far. These sessions cover both Direct3D and OpenGL, and feature the four major contributors to PC graphics APIs: Microsoft, AMD, NVIDIA, and Intel.

The session descriptions only offer a limited amount of information on their respective contents, so we don’t know whether anything here is a hard product announcement or whether it’s being presented for software research & development purposes. But at a minimum, the sessions should give us an idea of what both Microsoft and the OpenGL hardware vendors are looking into as far as API efficiency is concerned. The subject has become an item of significant interest over the past couple of years, first with AMD’s general clamoring for low-level APIs, and more recently with the launch of their Mantle API. And with the console space now generally aligned with the PC space (x86 CPUs + D3D11 GPUs), now is apparently as good a time as any to put together a low-level API that can reach into the PC space.

With GDC taking place next month we’ll know soon enough just what Microsoft and its hardware partners are planning. In the meantime, let’s take a quick look at the three sessions.

DirectX: Evolving Microsoft's Graphics Platform

Presented by: Microsoft; Anuj Gosalia, Development Manager, Windows Graphics

For nearly 20 years, DirectX has been the platform used by game developers to create the fastest, most visually impressive games on the planet.

However, you asked us to do more. You asked us to bring you even closer to the metal and to do so on an unparalleled assortment of hardware. You also asked us for better tools so that you can squeeze every last drop of performance out of your PC, tablet, phone and console.

Come learn our plans to deliver.

Direct3D Futures

Presented by: Microsoft; Max McMullen, Development Lead, Windows Graphics

Come learn how future changes to Direct3D will enable next generation games to run faster than ever before!

In this session we will discuss future improvements in Direct3D that will allow developers an unprecedented level of hardware control and reduced CPU rendering overhead across a broad ecosystem of hardware.

If you use cutting-edge 3D graphics in your games, middleware, or engines and want to efficiently build rich and immersive visuals, you don't want to miss this talk.

Approaching Zero Driver Overhead in OpenGL

Presented by: NVIDIA; Cass Everitt, OpenGL Engineer, NVIDIA; Tim Foley, Advanced Rendering Technology Team Lead, Intel; John McDonald, Senior Software Engineer, NVIDIA; Graham Sellers, Senior Manager and Software Architect, AMD

Driver overhead has been a frustrating reality for game developers for the entire life of the PC game industry. On desktop systems, driver overhead can decrease frame rate, while on mobile devices driver overhead is more insidious, robbing both battery life and frame rate. In this unprecedented sponsored session, Graham Sellers (AMD), Tim Foley (Intel), Cass Everitt (NVIDIA) and John McDonald (NVIDIA) will present high-level concepts available in today's OpenGL implementations that radically reduce driver overhead, by up to 10x or more. The techniques presented will apply to all major vendors and are suitable for use across multiple platforms. Additionally, they will demonstrate practical demos of the techniques in action in an extensible, open source comparison framework.
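The session description doesn't spell out the specific techniques, but the common thread in "zero driver overhead" approaches is paying the driver's fixed per-call cost once per batch rather than once per draw (as with OpenGL's multi-draw entry points). As a rough sketch of that idea only — the function names and overhead constant below are invented for this toy model, not any real graphics API:

```python
# Toy model of driver overhead: every driver entry point pays a fixed
# bookkeeping cost before any useful work happens. All names and numbers
# here are invented for illustration; this is not a real graphics API.

OVERHEAD_OPS = 50  # stand-in for per-call state validation in the driver

def submit_draw(command, log):
    """One driver transition per draw: pay the fixed cost every time."""
    for _ in range(OVERHEAD_OPS):
        pass  # simulated validation work, repeated for every single draw
    log.append(command)

def submit_draws_batched(commands, log):
    """One driver transition for a whole batch (conceptually like OpenGL's
    multi-draw entry points): the fixed cost is paid once, then the draws
    are consumed in a tight loop."""
    for _ in range(OVERHEAD_OPS):
        pass  # simulated validation work, paid once for the whole batch
    log.extend(commands)

def render_naive(n):
    log = []
    for i in range(n):
        submit_draw(i, log)  # n draws -> n * OVERHEAD_OPS overhead ops
    return log

def render_batched(n):
    log = []
    submit_draws_batched(range(n), log)  # n draws -> OVERHEAD_OPS overhead ops
    return log
```

Both paths produce an identical command stream; the batched path simply amortizes the fixed cost across the whole batch, which is the same shape of win the presenters attribute to modern OpenGL usage patterns.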

Source: SH SOTN (via the Tech Report)


  • BrightCandle - Wednesday, February 26, 2014 - link

    You couldn't be more wrong. NVAPI doesn't do any game rendering interfacing to the card at all. You should think of it more as an API that contains all the things the Nvidia control panel contains, like the ability to override antialiasing or resolution settings. It does not include rendering commands.
  • chizow - Wednesday, February 26, 2014 - link

    Overall I agree that NVAPI isn't as obtrusive as Mantle, but it does actually include rendering commands. You can write any low-level shader code you like in ASM via NVAPI; it's just not exposed to the application, but you can get to low-level ASM code by hooking in through the driver. This is how Nvidia's 3D Vision works, and how brilliant tools like HeliX mod literally change shader code to correct Stereo 3D images.
  • ppi - Wednesday, February 26, 2014 - link

    Keep in mind 3dfx had no response to nVidia products after roughly the Voodoo3. Especially with the Voodoo5 they were late to market, and as I recall the benchmarks, a less expensive GeForce2 easily beat it and the GF3 was the final nail in the coffin.
  • Mathos - Thursday, February 27, 2014 - link

    That's because the Voodoo 3 and 5 were done in the time frame after 3DFX had bought out STB. In my time as a gamer and PC enthusiast, I've owned a Diamond card based on the S3 Virge, as well as one based on the Rendition Verite chipset. I owned a Diamond Monster 3D during the Voodoo 1 days. I owned an STB Black Magic 2 during the Voodoo 2 days. I owned a Guillomat (spelling?) Voodoo Banshee card. I've owned Nvidia Riva TNT and TNT2 cards. And I even had a Voodoo3 for some time, as well as access to a Voodoo5; they were actually faster, but they were limited to 16-bit color, which was their major flaw. A GeForce 3 Ti 450 I can't remember the brand of. I had a GeForce 4 Ti 4600 for a time. And then had a GeForce FX 5600 Ultra for some time. Changed to ATI/AMD after the HD3000 series, and more or less just upgrade every other generation depending on performance. Still may go Nvidia if the card they have performs best at the price point I can afford at the time.

    The use of Nvidia's NVAPI started in 2002, used for games that were specifically designed to run better on Nvidia cards of the time, aka GeForce 3 and 4. That's when the Nvidia "The Way It's Meant to be Played" logo popped up. Guess what... the Xbox, which used an Nvidia GeForce 3-based GPU, had launched just before that, in late 2001, November to be exact.... Coincidence anyone? Contrary to popular belief, it did stir up bad reactions from people back then, saying it was competing with OGL and DX. Even though in reality, it works the same way Mantle does, and it's proprietary and specific to Nvidia's hardware. This is why TWIMTBP titles have always performed better on Nvidia cards than competing ATI cards, even when ATI/AMD had a clear performance advantage in non-TWIMTBP games.

    Fast forward to today: AMD/ATI GPUs are currently in all the consoles. AMD is simply doing the same thing. But on the bright side, it looks like it was enough of a fire being lit under someone's rear to start improving the main graphics APIs. Mantle doesn't ignore DX; it uses code specific to AMD cards to improve or enhance performance while running DX. Which is the same thing NVAPI does.

    This is a direct quote from Nvidia's own website "NVAPI is NVIDIA's core software development kit that allows direct access to NVIDIA GPUs and drivers on all windows platforms. NVAPI provides support for categories of operations that range beyond the scope of those found in familiar graphics APIs such as DirectX and OpenGL."
  • chizow - Wednesday, February 26, 2014 - link

    NVAPI exposed some low-level functions for Nvidia GPUs when DX and OpenGL functionality was limited or not fully supported, but it ALWAYS sat below the DX or OpenGL APIs. It never tried to usurp them, instead leaving the existing API hierarchy and support chain intact. The same cannot be said for Mantle. Surely you see the difference, right?

    Just look at it from the other perspective: if Nvidia decided to develop and promote NVAPI as a low-level API to replace existing HLSL, would you be in favor of it? For -5 to 5% scaling in commonly used gaming scenarios? Why bother?
  • JlHADJOE - Thursday, February 27, 2014 - link

    I thought it was more because they got really complacent and fell so far behind in terms of features, and then eventually performance.

    Hell, 3Dfx didn't even have true 16-bit color when Nvidia already had 32-bit, but in a bout of supreme arrogance 3Dfx basically went "16-bit is good enough for everybody." Well, Q3A's alpha-blending certainly proved them wrong.
  • Creig - Wednesday, February 26, 2014 - link

    And you, Wreckage, may want to google "TROLL" and "OBSESSIVE BEHAVIOR".
  • nathanddrews - Wednesday, February 26, 2014 - link

    I've generally got nothing against DX or OGL, but we're overdue for a shake-up of graphics APIs. Mantle has made clear that both bleeding edge (minimum frame rates) and mainstream (minimum and maximum frame rates) can benefit from more efficient API/driver combinations.

    P.S. The lockdown of DX features to only the newest versions of Windows also needs to go away. Also, I wouldn't be mad if OGL gained wider acceptance.
  • Wreckage - Wednesday, February 26, 2014 - link

    So Mantle is dead before it ever truly lived... OpenGL has had low-level support for a while and DX looks to be catching up. Good news for everyone... except maybe AMD, who have spent a lot of money on Mantle.
  • The_Assimilator - Wednesday, February 26, 2014 - link

    And DICE and all the other chump game companies that have wasted time and money on adding Mantle support to their engines.
