uefi - Monday, January 7, 2019 - link
Bless nVidia for liberating gamers from screen tearing once again.
sgeocla - Monday, January 7, 2019 - link
Even Intel announced they will support Freesync, and they don't even have dGPUs.
Nvidia would have lost market share if they didn't do it now. They could have done it years ago, but they wanted every bit of cash from the gamers for G-Sync chips.
Now the 20 series is so expensive that almost nobody (less than 5% of the market) can realistically afford even an entry-level 20 series card AND a G-Sync monitor.
Gamers must feel so grateful that Nvidia has liberated them from the prison they themselves built using Nvidia's closed hardware.
close - Monday, January 7, 2019 - link
"they don't even have dGPUs"Yet. They are very much planning on launching some.
BurntMyBacon - Monday, January 7, 2019 - link
@uefi: "Bless nVidia for liberating gamers from screen tearing once again."I'm decidedly less thankful than I otherwise would be had they not themselves built the prison that supporting VESA Adaptive Sync is "liberating" us from.
That said, I'm not ungrateful. They could have blindly stuck to their proprietary solution indefinitely, which would have benefited nobody. So, better late than never. It certainly eliminates one of the more practical reasons some consumers avoid nVidia, but it shouldn't be considered a feather in their cap regardless of how their marketing tries to spin it.
Alexvrb - Monday, January 7, 2019 - link
Nvidia is dragged kicking and screaming into supporting an industry standard, huzzah! I can't wait to see the former G-Sync module purists cheering them on, even though they were the ones scoffing at the idea that VESA Adaptive Sync could DARE compete with the Gesus-Sync.
Also, how much will Nvidia extort from manufacturers to get certified (and thus gain "enabled by default on Nvidia cards" status)? Gotta make up the money lost on G-Sync modules somehow.
ET - Monday, January 7, 2019 - link
Perhaps NVIDIA's certification will improve the quality of adaptive sync monitors. Good for everyone.
Santoval - Monday, January 7, 2019 - link
You mean "good for Nvidia", rather, since the price of the monitors they certify will certainly rise. They are definitely going to charge the manufacturers for that, so they in turn will pass the cost on to the consumer. I am quite certain that Nvidia can make plenty more money from the monitor certification and the increase in graphics card sales (due to them supporting adaptive sync) than they did from the G-Sync modules they have been selling.
Dribble - Monday, January 7, 2019 - link
The fact is most freesync monitors have rubbish implementations of freesync; that's why 12 out of 400 passed. It's basic stuff like 30Hz-to-max-refresh support, overdrive working with freesync on, low framerate compensation. All this cert tells you is which monitors work properly.
AMD never cared about the quality of experience; they will label pretty much anything freesync, and anything with a vague mention of HDR in the title freesync 2. It doesn't need to work. Surely it's a good thing that Nvidia actually have sensible quality standards? The alternative means you need to try and work it out yourself before buying the monitor - pretty difficult.
psychobriggsy - Monday, January 7, 2019 - link
Other sites say that Nvidia has so far tested 12 monitors thoroughly, but they are going to test 400 eventually. Not that 388 have failed.
Also, this article suggests that Adaptive Sync from VESA came before AMD's FreeSync, which is rather revisionist.
erple2 - Monday, January 7, 2019 - link
Well, yes and no. I think VESA Adaptive Sync was codified in mid 2014, but it was based on already-existing tech in the eDP spec from around 2007. I think Adaptive Sync was a branding that VESA stuffed into their DisplayPort 1.2a spec somewhat after the fact. AMD's first demo of using that capability was, I think, in 2014, but they didn't officially brand it as Freesync until release in March of 2015.
So while AMD demoed the tech in early 2014 (a few months before the branding of VESA Adaptive Sync was official), they also didn't brand it Freesync until they released it in March of 2015.
FreckledTrout - Monday, January 7, 2019 - link
Recall AMD demoed FreeSync at CES 2014 then proposed it to VESA. Then VESA made adaptive sync part of the DisplayPort 1.2a spec. AMD didn't have FreeSync products until VESA adopted adaptive sync. It's a bit of chicken and egg so I could see going either way with who came first here.
Stanri010 - Thursday, January 10, 2019 - link
They said they have tested 250 of 400 with 150 more to go. 12 monitors have passed.
BurntMyBacon - Monday, January 7, 2019 - link
Freesync was AMD's graphics card side support for any monitor that could claim VESA Adaptive Sync support. So in practice there are a lot of implementations that do not meet (fail miserably, in fact) the intended goal. AMD wanted to avoid a proprietary solution, as the open standard argument was their most effective marketing tool to combat nVidia's proprietary solution, but low quality implementations were tarnishing the branding. Enter Freesync 2.
You seem to have a misunderstanding about what is required to claim Freesync 2 support. Freesync 2 came about precisely because AMD does in fact care about the quality of the experience. While HDR400 is a requirement, and not especially strong for HDR, it does establish a standards-based minimum, and there are "HDR" monitors out there that have Freesync support but weren't qualified for Freesync 2. Also, claiming Freesync 2 is just an HDR checkbox ignores two other major requirements that largely close the gap between Freesync 2 and G-SYNC: low input latency and low framerate compensation.
That all said, it does appear that nVidia is trying to differentiate by pushing the quality standards even higher here. While there may be little effective difference between G-SYNC Compatible and Freesync 2, the G-SYNC label and its associated tests may potentially provide a practical benefit. Also, the G-SYNC Ultimate label should undoubtedly bring a discernible improvement.
Dribble - Monday, January 7, 2019 - link
Look at most of the freesync 2 monitor reviews - the range was either 48-144Hz or 72-144Hz. That's hardly worth it for a tech that is there for when the frame rate is low (remembering every single gsync monitor supports 30Hz up to its max refresh). If you look at freesync as a freebie that's fine, but if you actually want it to work properly you can't use the freesync 2 stamp as an indicator. What a freesync 2 stamp should have been is a guarantee of a great freesync experience.
It's these simple quality things that get people to buy Nvidia - if you don't know about PCs or can't be bothered spending 20 hours researching everything, then just buy Nvidia, because if they put their stamp on it you can expect it to work well. AMD needs to change the perception of their company, and for something like freesync this should be easy - how hard can it be for AMD to test monitors for compliance with a sensible spec?
piroroadkill - Monday, January 7, 2019 - link
Freesync 2 requires low frame rate compensation - that is to say, a range where the greatest value is at least double that of the lower bound. That means that when a Freesync 2 screen (or any that meets that requirement) falls below the minimum of the range, the frames are DOUBLED UP, so you still STAY IN FREESYNC.
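A minimal sketch of the frame-multiplication idea described above, assuming a simple model where the driver repeats each frame until the effective refresh rate lands back inside the monitor's VRR window; the function name and the 48-144Hz example range are illustrative assumptions, not any vendor's actual implementation:

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (multiplier, effective_hz) so the panel stays inside its VRR range.

    LFC is only possible when vrr_max >= 2 * vrr_min; otherwise no whole-number
    multiplier can lift a just-below-minimum frame rate back into the range.
    """
    if vrr_max < 2 * vrr_min:
        raise ValueError("range too narrow for low framerate compensation")
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1  # show the same frame one more time per interval
    return multiplier, fps * multiplier

# Example: a 48-144Hz panel receiving 35 fps -> each frame is shown twice,
# so the panel refreshes at 70Hz and stays inside its VRR window.
print(lfc_refresh(35, 48, 144))  # (2, 70)
```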
Alexvrb - Monday, January 7, 2019 - link
SCIENCE!
levizx - Tuesday, January 8, 2019 - link
You don't even know what you are talking about.
Chrispy_ - Monday, January 7, 2019 - link
There are hundreds of Freesync models that have been *thoroughly* tested by hardcore, monitor-only review sites like Prad.de and TFTCentral.co.uk. They cover response times, LFC, flicker, overdrive at different refresh rates and more.
Clearly, there are some bad Freesync displays out there, but the overwhelming majority of Freesync monitors work fine with AMD cards. The problem is that Nvidia have no experience writing drivers for VESA VRR standards, since they've spent six years touting their $200 G-Sync FPGA solution. The limited number of certified monitors just means that Nvidia's rubbish early-stage driver doesn't support everything under the VESA standard yet. Likely they've only listed Freesync monitors that happen to meet the old G-Sync spec - and knowing Nvidia, there's probably some vendor kickbacks/bribery going on with their certification program too!
Alexvrb - Monday, January 7, 2019 - link
Oh noes we need to do research on computer hardware if only we had websites and forums and shiz for that. Sooo different from buying monitors before adaptive sync existed, when you could just pick any monitor at random and they were all amazing.
Clearly the solution is massively overpriced proprietary sync modules, wait that isn't working, clearly the REAL solution is to charge companies for certification.
If you need that much hand-holding maybe look into the TUF program.
Manch - Monday, January 7, 2019 - link
Will probably just improve the price for vendors.
DesktopMan - Monday, January 7, 2019 - link
Now we just need confirmation of HDMI VRR and QFT support, which they can support at HDMI 2.0 speeds if 20xx can't do 2.1.
nathanddrews - Monday, January 7, 2019 - link
Error 404, features not found: please upgrade to 30X0 GPU.
BurntMyBacon - Monday, January 7, 2019 - link
It is reportedly supposed to work with Pascal and newer graphics processors.
eddman - Monday, January 7, 2019 - link
Which reports?
edzieba - Monday, January 7, 2019 - link
The remaining question is how (or if, or at least 'how well') 'A-sync' HDR displays will work. Nvidia's solution was to move processing to the G-sync module (simultaneous control of backlight modulation and colourspace changes for refresh interval variation) and remain device-agnostic on the PC/game end of things; while AMD's was to move all that to the GPU end with a proprietary API that games would need to implement to work.
BurntMyBacon - Monday, January 7, 2019 - link
I imagine it will all eventually converge to Microsoft's HDR solution as that will at least be standard across vendors.
DanNeely - Monday, January 7, 2019 - link
I'm wondering if the low number of certified monitors means that the passing criteria are roughly equivalent to AMD's Freesync 2, which eliminated most of the worse-than-gsync limitations and the barely-adaptive refresh ranges that could get a Freesync 1 sticker without being wide enough to actually be useful in many games.
Araemo - Monday, January 7, 2019 - link
'Default on' is one of the criteria, so I don't recall if my monitor (ASUS MG279Q) would pass, even if it met ALL the other requirements. That being said, I don't see any answer to this question:
"If Nvidia hasn't tested your 'freesync'/VESA adaptive sync monitor, can you still enable G-Sync?"
eddman - Monday, January 7, 2019 - link
You could've read the source:
"For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on - it may work, it may work partly, or it may not work at all."
BurntMyBacon - Monday, January 7, 2019 - link
The MG279Q will probably fail due to not supporting variable sync up to its maximum refresh rate. I believe the sync range is 35Hz - 90Hz (which is good enough for LFC), while the monitor can go up to 144Hz.
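For illustration, a small sketch of the two range checks being discussed in this thread - wide enough for LFC (max at least double the min) and covering the panel's native maximum refresh. The criteria here are assumptions drawn from these comments, not Nvidia's published test procedure:

```python
def check_vrr_range(vrr_min, vrr_max, panel_max):
    """Evaluate a monitor's VRR window against the (assumed) criteria above."""
    return {
        "supports LFC": vrr_max >= 2 * vrr_min,        # frame doubling can cover low fps
        "covers native refresh": vrr_max >= panel_max, # VRR works up to max refresh rate
    }

# ASUS MG279Q figures quoted above: 35-90Hz FreeSync window on a 144Hz panel.
print(check_vrr_range(35, 90, 144))  # {'supports LFC': True, 'covers native refresh': False}
```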
BurntMyBacon - Monday, January 7, 2019 - link
It sounds like that mostly describes the G-SYNC compatible label. It would be reasonable that at this level nVidia only charges for the use of G-SYNC branding on promotional materials. The G-SYNC label appears to have a picture quality test that may or may not be useful in practice. I expect it is at this level that nVidia will want to charge extra for the testing. The G-SYNC Ultimate level has extra requirements that will drive quality in a discernible way.
kamild1996 - Monday, January 7, 2019 - link
...only for the latest two generations, naturally. No love for GTX 9xx cards?
Dodozoid - Monday, January 7, 2019 - link
Wow, that means nVidia is back on my shopping list! If AMD doesn't deliver a compelling product with Navi, I can resort to GeForce to go with my MG278Q (it even has a G-Sync Compatible badge now).
Well done nVidia, it only took you like a bajillion years to man up.
Dr. Swag - Monday, January 7, 2019 - link
I was so happy, since I have an nvidia gpu and freesync monitor, until I read it's only 10 series and 20 series.
Why you do this to me nvidia
PeachNCream - Monday, January 7, 2019 - link
It's official, G-Sync is the GPU Betamax of 2018.
trparky - Monday, January 7, 2019 - link
Will someone please check Hell, I think it just froze over.
vicbee - Monday, January 7, 2019 - link
Nothing wrong with charging a premium for value-added technology. Years in, Nvidia now sees the value in tiering their G-Sync product. VESA compatibility will get you the basic G-Sync for free, while their Ultimate product will come at a premium price. What's so complicated about that?
Murloc - Monday, January 7, 2019 - link
Nothing wrong with it; they milked that cow as long as it was sustainable. Their choice, but it didn't make them look good, and brand perception can be important in the long term.
valinor89 - Tuesday, January 8, 2019 - link
While not up to Apple's level, Nvidia's reality distortion field is real, and most people will not take that delay into account and will now rejoice that they have given us plebs cheap G-Sync monitors...
HeavyHemi - Monday, January 7, 2019 - link
Stop calling VESA Adaptive Sync, "FreeSync". Thank you.
hubick - Monday, January 7, 2019 - link
My Samsung QLED supports FreeSync with my Xbox One X and Hades Canyon NUC (AMD GPU). It was great that NV pioneered this stuff, but now that there are open standards, I will NOT be buying any video card in the future without support for the HDMI Freesync standard on ANY compatible device. YOU HEAR ME NVIDIA.
Shahnewaz - Tuesday, January 8, 2019 - link
So, is G-Sync dead or will there still be G-Sync monitors coming? What's stopping monitor manufacturers from releasing simply "Adaptive Sync compatible" monitors now without going through anyone's certification?
valinor89 - Tuesday, January 8, 2019 - link
Not dead; they launched G-Sync ULTIMATE! And nothing is stopping them, just as they have been launching monitors without the Freesync label.
celpas - Tuesday, January 8, 2019 - link
I think the gsync-only monitors with the modules will die out. The gsync HDR models will remain.
HollyDOL - Saturday, January 12, 2019 - link
Link to currently validated screens (-compatible are at the bottom):
https://www.nvidia.com/en-us/geforce/products/g-sy...