No surprise, it's obvious AMD is running into all kinds of problems with the limitations of their FreeSync spec.
" but given the fact that it’s more important to get this right than to get it out quickly, this is likely for the best."
What is interesting is that AMD did not apply the same level of restraint on FreeSync as a whole and pump the brakes until they got it right. This is the second glaring deficiency relative to G-Sync, yet that didn't stop AnandTech from misleadingly declaring FreeSync an equivalent alternative.
Love how anything AMD, you're the first to comment, and anything that doesn't allow you to be anti-AMD I never seem to see you comment on. Is the only thing you do sitting here trolling anti-AMD stuff on this site?
Either way this tech is still a niche market with the premium it calls for. Not to mention the small range of choices for monitors that support G-Sync or FreeSync. Sad to see that Crossfire got delayed for this; I am sure the people who do go for this are enthusiast level and probably already run Crossfire.
Have you ever considered that's because you only bother to read AMD headlines to defend their honor in their time of need? lol. I guess you missed the half dozen or so comments I made in other articles this week, everything from Apple to Microsoft's interesting announcements this week, but don't let that stop you from commenting ignorantly on the matter. I guess it never occurred to you that people will comment on articles that interest them, go figure!
But yes this is a niche market, just as I stated, which makes you wonder whether or not AMD has the requisite resources or knowledge to be dabbling in this market when they clearly have bigger issues on their plate, most notably, launching competitive CPUs and GPUs in the marketplace.
Yes, PCWorld, TrustedReviews, and Toms Hardware are all just lying. I'll capitulate that Intel puts out better CPU's, but where have you been the last 5 years claiming their gpu's are anything less than nVidia, not even getting into if you get into the performance/$$?
Yes, sorry about the grammar. My weak intellect is too brainwashed by AMD to proof read. Guess I frequent too many sites that allow editing for that, my apologies.
AMD has never had the fastest GPU to end a generation since G80 launched, and while they are generally competitive in overall performance, they are still far behind when it comes to technology, drivers, features, and support. This entire FreeSync debacle is just another example of this.
Nvidia took the time and resources to develop G-Sync years ago; it took a lot of time, research, and money, and when they brought it to market, it worked, just as advertised, to the world's amazement. AMD claims there was no reason for Nvidia to have gone the hardware route, that they were thinking of maybe working on something similar using open/existing DP specs; then they half-assedly threw a solution together and 18 months later we get something that is clearly inferior in FreeSync. See the difference?
Who wants to spend their time and money on products that don't just work? Wouldn't you pay a premium for a product that works as advertised, today, without having to wait? Given that the life cycles of these products turn over rather frequently (18-24 months), every month you have to wait for a feature or product to mature is time and money wasted.
Let's see what the people who actually reviewed the technology said about FreeSync, rather than chizow's Nvidia-distorted view:
Anandtech - It took a while to get here, but if the proof is in the eating of the pudding, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates
TheTechReport - the BenQ XL2730Z is good enough that I think it's time for the rest of the industry to step up and support the VESA standard for variable refresh rates.
Guru3D - FreeSync works as well as Gsync does and vice versa
Gamespot - Freesync is the more attractive proposition, even if Nvidia has a wider slice of the GPU market. More monitor features, and a lower cost of entry make it even better.
Keep trying, chizow.
LMAO yeah, let's see what the actual reviews said, the ones that dug a bit deeper than the superficial, staged demo at AMD HQ that Creig presents:
TFTCentral: http://www.tftcentral.co.uk/reviews/benq_xl2730z.h... "As a result, response times are fairly slow at ~8.5ms G2G and there is a more noticeable blur to the moving image. See the more detailed response time tests in the previous sections for more information, but needless to say this is not the optimum AMA (response time) setting on this screen. For some reason, the combination of FreeSync support and this display disables the AMA function.
This only happens when you are using a FreeSync enabled graphics card, FreeSync capable drivers and the DisplayPort interface...
Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on. It will also require a firmware update for the screen itself to correct the problem."
Forbes: http://www.forbes.com/sites/jasonevangelho/2015/03... "Note to readers: I have seen this firsthand on the Acer FreeSync monitor I’m reviewing, and PC Perspective noticed it with 2 additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem they recorded the aforementioned monitors running AMD’s Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (this is a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it’s jarring by comparison."
PCPer: http://www.pcper.com/reviews/Graphics-Cards/Dissec... "It's also important to note that the experience below the VRR window on a FreeSync panel today is actually worse in practice than in theory. Because the refresh rates stays at 40 Hz when your frame rates are low, you get a combination of stutter and frame tearing (if V-Sync is off) that is worse than if the refresh rate was higher, at say 60 Hz or even 144 Hz. No doubt complications would arise from an instantaneous refresh rate shift of ~40 Hz to 144 Hz but some middle ground likely exists that FreeSync could implement to improve low-FPS experiences."
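To put rough numbers on that window, here's a back-of-the-envelope sketch; the 40-144 Hz limits match the panel discussed above, and the rest is just illustrative arithmetic, not anything from AMD's or VESA's spec:

# Back-of-the-envelope sketch of a 40-144 Hz variable refresh window (illustrative only).
VRR_MIN_HZ = 40.0     # below this the panel falls back to fixed refresh (V-Sync on or off)
VRR_MAX_HZ = 144.0    # above this frames get capped or are allowed to tear

def classify(fps):
    frame_ms = 1000.0 / fps
    if fps < VRR_MIN_HZ:
        return f"{fps} fps ({frame_ms:.1f} ms/frame): below the window -> stutter/tearing or V-Sync judder"
    if fps > VRR_MAX_HZ:
        return f"{fps} fps ({frame_ms:.1f} ms/frame): above the window -> capped or tearing"
    return f"{fps} fps ({frame_ms:.1f} ms/frame): inside the window -> refresh follows the frame rate"

for fps in (30, 40, 60, 144, 200):
    print(classify(fps))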
In Summary: 3 things that make FreeSync worse than G-Sync, today
1) Ghosting due to FreeSync commands disabling OverDrive anti-blur technologies in ALL FreeSync monitors.
2) Worse than V-Sync behavior outside of relatively limited VRR windows (input lag, stuttering, tearing all present, as high as 40Hz).
3) No CF support to help mitigate these issues, leaving you with effectively 1 GPU choice for this technology (290/X), and good luck getting one of those to run acceptably well at 2560x1440, much less 4K, in current games without turning settings WAY down.
The alternative: Paying a premium for better Nvidia products that do everything they said they would relative to G-Sync and VRR, for the last 20 months since they launched it.
And I guess the follow-up to that, Creig: in your opinion, knowing there are reviews that have brought these issues to light, do you honestly think AMD's FreeSync is equivalent to Nvidia's G-Sync? Would you go and buy one of these monitors today, if VRR was important to you? Would you recommend that someone else who was interested in variable refresh buy one of these monitors today?
See the difference between me and you is, I wouldn't recommend something that sucked to someone else if they were spending their hard-earned money. Clearly, you have no issue in doing so.
Wait, you are stuck with the illusion that G-Sync is working properly as of today?
It is not. Nvidia is still encountering lots of issues. Granted, they are less universal and more game-specific (or monitor-specific!), but to say G-Sync is smooth sailing and Nvidia delivered a great, flawless tech is very far from reality.
ULMB for example gets disabled on many a monitor, quite similar to the lack of anti-ghosting on FreeSync...
ULMB is disabled on *all* monitors when G-Sync is enabled. However, that's also the case for FreeSync. I'm pretty sure that strobing the backlight is not possible when dealing with variable refresh rates... at least for now.
I think comparing ULMB to pixel overdrive (anti-ghosting) is apples to oranges, anyway.
Everyone knows he's a rabid Nvidia fan. Wait, I take that back. That statement might be unfair to average, everyday rabid Nvidia fans. There is no changing Chizow's mind or talking any kind of sense into him. He will NEVER agree with anything that isn't pro-Nvidia.
My personal favorite is how he insisted that the R9 285 would throttle like crazy, and that said throttling was completely unacceptable. When it didn't, he refused to admit he was wrong and changed arguments. Then it was proven that some Maxwell cards throttle like crazy even in regular games, and suddenly it's not important if a card throttles.
Anyway he'd be perfectly happy to pay a hefty premium for a proprietary solution that only works on Nvidia cards. Many normal people would prefer a solution which is based on VESA standards and can be implemented by any GPU vendor. That way you can buy one monitor and you aren't locked into a single vendor to use one of its biggest features.
LMAO, from Alexvrb, the captain of the AMD turd polishing patrol. Did you personally work on Turdga btw? I have to imagine so, as that is the only possible explanation for why someone would so vigorously defend such an awful part. There are now rumors that Turdga wasn't even the fully enabled ASIC, so there's a good chance it would've actually used significantly more power without much of a performance increase. But only a massive AMD fanboy like you would consider this a positive for AMD and attempt to draw parallels to Nvidia's Maxwell arch as you did. You didn't have much to say when Nvidia completely killed it with the 970/980, did you?
But yes, just as I stated, the 285's TDP was bogus; in order to avoid boost issues, they needed to slap a 3rd party cooler on it, at which point its TDP was closer to the 280 with barely the performance to match. Truly, only a turd that a die-hard AMD fanboy like you would appreciate (or bother to defend as vigorously as you did).
Unlike you, I own products from multiple vendors. My tablet even has a Tegra 3 (which granted is due for replacement soon, probably with an Intel powered model - such an AMD fanboy!). The only reason I appear to be an "AMD fanboy" to you is that you're buried so deep in Nvidia Cult Worship that any time ANYONE disagrees with you it's blasphemy and they must be vigorously attacked with multiple strawmen arguments and other such nonsense designed to move the goalposts so you can "win".
Actually I do own products from multiple vendors as well, as recently as an R9 290X that was given to me by a vendor to validate, and of course, it reinforced the fact AMD Radeon products simply aren't good enough for my uses at either home or work.
I also looked seriously into getting a Kaveri-based HTPC system, but after all the total costs, need for a chassis and PSU, and total power usage, it simply did not make sense for my needs for such a simple task as a DVR/mediaplex.
And I've said many times, to AMD fanboys such as yourself, if roles were reversed and there was an AMD logo on any of the Nvidia products I've purchased in the last 9 years since G80, I would've bought AMD without hesitation. Can you say the same? Of course not, because you buy inferior AMD products as any loyal fanboy would.
But you seem confused, you appear to be an AMD fanboy because you defend their most egregiously bad products, plain and simple. I mean anyone who bothers to defend Turdga as vigorously as you did is going to set off fanboy alarms anywhere, even AMDZone HQ.
And of course I'd pay a premium for something that actually worked and did all it says it would; who would pay a hefty premium for a monitor (AMD FreeSync panels still carry a premium) that didn't do all it said it would and actually performed WORSE than a standard panel in many use-cases?
Because we can all thank Nvidia for Mantle/Vulkan/DX12. Oh, that's right... they were too busy coming up with 28 GB/s VRAM partition schemes.
You're right, we have the consoles to thank for low level APIs, and Microsoft in particular for DX12.
And 28GB/s VRAM partition schemes on a part that completely decimated AMD's product stack was certainly welcome, even for AMD fans who could find their favorite cards for a fraction of the price, thanks to Nvidia.
If you're not even going to bother with the pretense of objectivity and rationality then don't reply to my comments.
What are you going on about? You're trying to say we have AMD to thank for Mantle/Vulkan/DX12 and I'm saying that's a farce. I guess John Carmack was espousing the benefits of target hardware and low level APIs 5-6 years ago and referring to AMD and Mantle? Of course not. Do you HONESTLY believe Microsoft was not going to roll out a DX12 at some point, and that it would not have many of the same features as the low level DirectX based API they had been using for over a decade? You do realize Microsoft was demoing the "Xbox One" on Nvidia hardware before the console actually launched, right? I was just pre-emptively giving your assertion the objectivity and rationality it deserved: utter nonsense.
And you seem hung up on the 28GB/s VRAM partition, please tell me how this impacts you and your usage cases with regard to the GTX 970, and how this in any way diminishes the immense value and performance it offered at that $330 price point, thanks.
And I see you aren't spewing your usual nonsense cheerleading AMD's woeful products, Creig. Did you finally get the pink slip?
What happened to price:performance trumping all with the 290/X leading AMD into an absolute nosedive the last 2 quarters since Maxwell launched and decimated AMD's product stack, just as I said it would?
You kept insisting a miniscule price difference was going to be the key differentiator in AMD's favor, but none of that happened. Once again, maybe you are doing it wrong, Creig?
Ah, poor Chizow. Once again you display your utter lack of comprehension. The main reason I recommend cards based on favorable price:performance ratios isn't because it's good for AMD. I recommend them because it's good for the end consumer. Namely, us. It just so happens that it's AMD who typically offers the best performance for the money.
See, that's the difference between you and me. You treat AMD and Nvidia like rival football teams. You cheer Nvidia and boo AMD every chance you get because you feel like you'll be on the "winning" side if Nvidia comes out on top. Here in the real world, the only "winner" is the person who gets the best deal for their money, regardless of who makes the product.
Nvidia must just love fanatics like you who literally foam at the mouth anytime somebody has the unmitigated gall to point out how you can usually get a better deal by purchasing an AMD card. So just keep ranting away, chizow. The rest of us will just continue to sit back and chuckle at your antics.
LMAO poor Creig, you once again backpedal to cover the stupid things you've said in the past.
Again, what forced the price changes that you and other poor AMD have-nots enjoyed recently? Do we need to go back and check to see if you were recommending to everyone to go out and buy Nvidia cards when Nvidia completely decimated AMD's entire product stack after the Maxwell launch? Of course not. You waited the month+ before going on about how AMD offered such a great deal, refusing to even acknowledge the fact that it was Nvidia that drove this "positive change for the consumer".
Then you went on to stupidly claim only price:performance mattered, when in reality, that couldn't be further from the truth! While it is an important metric, it is obvious that if price:performance is CLOSE then the consumer will buy the BETTER product, because the premium is justified. What makes a product better is NOT strictly tied to price and performance, because a number of other factors like support, feature-level, drivers, efficiency, aesthetics, acoustics, developer relationships etc. all factor into this decision. And G-Sync vs. FreeSync is just one more example of this.
Again, I think your mistake in comparing me with you is that I actually USE these products I am talking about to maximize enjoyment of my favorite pastime, PC gaming, so spending top dollar or paying slightly more for a better product is no issue or obstacle for me. Waiting for some miracle price drop on last year's performance does not outweigh the performance, efficiency, features, etc. that I would lose by having to wait.
But obviously, none of this is important to you because you keep going on about how this deal is better than that for people. Is it a good deal to buy a product that doesn't even do what it claims to do, and to be in a perpetual holding pattern waiting for proper support of said product? LMAO, again it just points to certain people having low standards and expecting less of what they buy, simple as that. For anyone else, a small premium is no object when it comes to buying the better product, and once again, for 2 quarters running, the market agrees with me, not you, as AMD has been slaughtered in the marketplace.
So like I said Creig, maybe you are doing it wrong? What GPU do you even own now? 5850? 6950? Your opinion doesn't even matter, your buying habits and usage patterns aren't even relevant to the discussion.
When have I ever indicated I was blindly cheerleading for either side? I actually own a pair of 970s driving a 1440p G-Sync display, and before that I owned a 7970... basically, whatever offers good value for money.
My comment was about you and your frothy mouthed rants you go on in every AMD related post I've ever read on this site. You are either a paid shill or just demented with how shrill you are in your diatribes.
Who said anything about blindly cheerleading? I'm the FIRST person to admit I'm a fan of great tech that makes life more full and enjoyable, and overwhelmingly, Nvidia provides that when it comes to my favorite pastime, PC gaming, but only for good reason.
You say you are running a pair of 970s driving a G-Sync panel, so you should know first-hand there are reasons to prefer Nvidia over AMD, especially in light of the limitations documented here. Would you be happy if only one of those 970s worked with your G-Sync panel, and the panel actually performed worse below, say, 40Hz? Would you say someone pointing out those flaws and deficiencies in AMD relative to G-Sync was automatically a paid shill? Don't be so obtuse.
See, I actually started using the internet and reading AnandTech at a time when the primary purpose of the internet was to research and share information about interesting and emerging tech, and as such, I find it annoying to have to wade through mounds of bullshit and misinformation. Overwhelmingly throughout the years, I've found AMD/ATI and their fanboys relish this kind of FUD when they simply don't have competitive parts to bring to market. This is just one more example of this.
Simply put, I like to hold people accountable for what they say and do, and this is just another example. I fully expect others to do the same of what I say and do, and I'm happy to stand by what I've said on this topic and others. Feel free to go back and see what was said about FreeSync from Day 1 by me and others, then come back and see who is blindly cheerleading for one side or another.
Sounds like a nice setup, Palerma. 970s are a great value. Nvidia does very well above $300. That would be overkill for me, I won't be replacing my "old" 1080p LG display for a while longer.
I've used (and continue to use) parts from all the big vendors. In the past I've also used parts from many less-known manufacturers, even Cyrix put out a decent chip here and there (mostly as drop-in upgrades for platforms after Intel abandoned them). Yet that doesn't matter to Chizow. If you dare stand against his Nvidia Gospel, you're a heretic and clearly an <insert competitor here> fanboy.
When Nvidia supports DP adaptive sync in the future, he'll support it. Until then it's garbage. End of story.
Yep figures, typical low standards and usage patterns that don't even matter, like most AMD fanboys who most vigorously defend them. You can't feel impacted by their lies and dishonesty because you don't even use the products being discussed, you just blindly defend and cheerlead for them whenever their reputation comes under fire by meanies like me. :(
And no, when Nvidia supports DP adaptive sync, I won't support it, because I'll continue using the superior solution G-Sync as I have been for the last 9 months while AMD produced lies, misinformation and excuses, repeated by fanboys like you. But I honestly do hope Nvidia supports DP Adaptive Sync as an alternative and choice, it will be one less reason to use AMD and will actually put AMD's word to the test again, on how open and Free FreeSync is.
And what, pray tell, is the first glaring deficiency? If you mention enforced V-SYNC again like you did in the last thread, you were already proven wrong on that point and refused to acknowledge it.
What are you talking about? FreeSync is clearly inferior to G-Sync, it doesn't cost $200 per monitor to implement and is a freely available standard. Those benchmarks that claim the performance is nearly identical to G-Sync are fabricated lies!
You need to mention how AMD fanboys know nothing and refuse to acknowledge when they're proven wrong. Then it will be perfect. Love how chizow's argument here was, "it is also bad".
Yes, FreeSync costs less, but deservedly so, because it's bad. Again, if a solution makes your gaming experience WORSE (input lag, tearing, stuttering) outside of its relatively high supported refresh range (40Hz as a worst case) and doesn't even allow you to throw more money at the problem (no CF support), that is a pretty awful solution.
What's even worse is sites like AnandTech recommending people turn down their settings to stay above this minimum refresh rate, when one of the main benefits of VRR (at least with G-Sync) was the freedom to crank your settings up without having to deal with the big drop in framerates of V-Sync On (double buffering) or the input lag (triple buffering).
But it's OK, I know the casual observer is not going to understand the differences here, and how they relate to your gaming experience.
Huh? It does force V-Sync again when you fall out of the supported VRR range; go read the actual reviews again and see how you were proven wrong as usual. But beyond that, FreeSync performs WORSE than V-Sync outside of the VRR range, because not only do you get all the tearing and stutter again, you also get bad ghosting because FreeSync disables anti-ghosting and overdrive mechanisms. This is a fact acknowledged by both AMD and monitor vendors.
When going outside the range (pumping more frames than the monitor can handle) G-SYNC goes into forced V-Sync, increasing input latency. Freesync gives the option of V-Sync on or V-Sync off. You don't get both tearing and stutter, you get either tearing or increased input latency (you get to choose what is more important to you), whereas with G-SYNC you don't get the choice. Many competitive gamers would prefer the tearing to keep input latency low.
There's nothing in the GPU disabling anti-ghosting and overdrive mechanisms with Freesync. That's an implementation detail in current monitors (assuming it's true, I haven't heard anyone else claim this but I'll give you the benefit of the doubt).
When the frame rate drops below the monitor refresh rate, G-SYNC does some frame duplication to trick the monitor into thinking it's operating at a higher rate. Freesync doesn't currently do this, but this is entirely up to the GPU driver. This can easily change in the future. If you're regularly encountering this situation though, then your GPU isn't powerful enough to run the game at the current settings. You're going to have a sub-par experience with either G-SYNC or Freesync.
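If it helps, the frame-duplication trick is just arithmetic; here's a rough sketch of the idea (my own illustration, not anyone's actual driver or module code):

# Rough sketch of low-framerate frame duplication: repeat the previous frame enough
# times that every individual refresh stays inside the panel's VRR window.
# Window limits below are illustrative.

def refreshes_for_frame(frame_time_ms, min_hz=40.0, max_hz=144.0):
    slowest_ms = 1000.0 / min_hz          # 25.0 ms at a 40 Hz floor
    fastest_ms = 1000.0 / max_hz          # ~6.9 ms at a 144 Hz ceiling
    repeats = 1
    while frame_time_ms / repeats > slowest_ms:
        repeats += 1                      # scan the same frame out one more time
    per_refresh_ms = max(frame_time_ms / repeats, fastest_ms)
    return repeats, per_refresh_ms

# A 25 fps frame (40 ms) becomes 2 refreshes of 20 ms each (~50 Hz), back inside 40-144 Hz.
print(refreshes_for_frame(40.0))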
That's not entirely true. VSYNC at 144Hz and GSYNC at 144hz are completely different. In playing FPS games (I average a 7.3 KDR) I haven't noticed any input lag by hitting the 144Hz cap. That's because VSYNC and GSYNC are entirely different beasts.
It's not that hitting above 144fps all of a sudden introduces input lag. It should be no different than if you were to play with VSYNC off and enforced a 144fps cap through RivaTuner/MSI Afterburner.
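Conceptually, an external cap like that is nothing more than a frame-time limiter; a bare-bones sketch of the idea (not RivaTuner's actual implementation):

import time

# Bare-bones frame limiter sketch: never present frames faster than the cap.
TARGET_FPS = 144
FRAME_BUDGET_S = 1.0 / TARGET_FPS         # ~6.94 ms per frame

def render_frame():
    pass                                  # stand-in for the game's rendering work

last = time.perf_counter()
for _ in range(10):                       # a few demo iterations
    render_frame()
    remaining = FRAME_BUDGET_S - (time.perf_counter() - last)
    if remaining > 0:
        time.sleep(remaining)             # idle out the rest of the frame budget
    last = time.perf_counter()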
Again, you're wrong on this. V-Sync is NEVER enabled with G-Sync, in fact, you MUST force it off in-game for G-Sync to work properly. At capped FPS, you do get FPS capped to the max refresh rate of the monitor, but it is still not V-sync, because it is not nearly as input laggy, and while it does have more input lag than Vsync off or G-Sync below the capped FPS, it is still fluid and dynamic. This was by design as Nvidia's goal was to never have tearing, ever. However, Tom Petersen has stated they liked the idea of disabling VSync at the top end of the refresh range and will look into making that happen in the future.
FreeSync on the other hand completely forces V-Sync off when you fall out of the VRR range, but in that transition, there is a noticeable "jarring" jerk as you move from VRR on to VRR off. Also, if you look at usage scenarios, the flaws with FreeSync's VRR range are MUCH more likely to be exhibited, since most of these panels are high-res 2560x1440 and 4K panels. What makes it worse is you can't even rely on CF to help you stay above these relatively high VRR thresholds. 40 FPS on a 4K monitor, for example, is still damn hard to accomplish with any single GPU, even the Titan X.
Also, I am right about FreeSync disabling OD, which is why you end up getting ghosting. Some responsible review sites covered it more than others (PCPer, TFTCentral, Forbes), but TFT confirmed it happens with ALL FreeSync panels. Later, PCPer correctly surmised Nvidia's G-Sync module took over these OD timings and controls as it essentially replaces the existing scaler module.
"For some reason, the combination of FreeSync support and this display disables the AMA function.....Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on."
AMD claims they can fix this via driver, but does that really seem likely? I mean they use a simple protocol, V-Blank, to replace the functions a scaler performed at a fixed refresh rate, and expect to just fix this with drivers? I guess it is possible, but it is looking less and less likely, every passing day, and certainly not as "simple" as some made it out to be (over 1 month since simple claims were made when FreeSync launched).
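For anyone unclear on what the "V-Blank trick" actually amounts to: adaptive sync holds the panel in its vertical blanking period until the next frame is ready. A simplified sketch with made-up timing numbers (real panels also deal with pixel clocks, porches, and minimum blanking requirements):

# Simplified sketch: stretching the vertical blanking interval changes the effective
# refresh rate without touching the active scanout. Timing numbers are made up.
LINE_TIME_US = 7.0        # time to scan out one line (illustrative)
ACTIVE_LINES = 1440       # visible lines on a 1440p panel
BASE_BLANKING_LINES = 45  # nominal blanking (illustrative)

def refresh_period_ms(extra_blank_lines):
    total_lines = ACTIVE_LINES + BASE_BLANKING_LINES + extra_blank_lines
    return total_lines * LINE_TIME_US / 1000.0

for extra in (0, 500, 1500):
    period = refresh_period_ms(extra)
    print(f"extra blanking lines = {extra:4d}: {period:5.1f} ms/refresh (~{1000.0 / period:.0f} Hz)")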
I don't understand why you would want to have "vsync off" in any situation when using VRR. It defeats the point of VRR entirely when you have tearing and visual inconsistency. If you're a competitive gamer and that matters to you, turn off VRR.
Well, I guess the belief is that at the high end of a 120-144Hz VRR range, V-Sync off negates any input lag (which I've found negligible anyway, and certainly less than V-Sync On), and when your refresh rate is that high, the associated tearing is much less noticeable, as the on-screen distance/artifacts between split frames would be much smaller. I guess you can say it is the same reason some might prefer ULMB at high FPS over G-Sync. They prefer the lack of blur over the minor tearing at high FPS.
I think it would be a good option to allow the user to decide, and it sounds like Tom Petersen is going to make it happen if it is possible within the limitations of G-Sync.
Isn't it obvious? We keep hearing how "simple" it is to fix these problems, that AMD is just going to release a driver fix to address these issues.
Where are these fixes? Oh right, see announcement: DELAYED.
Obviously, it is not as simple as AMD claimed; maybe Nvidia was right when they said getting VRR right was hard, which is why they ultimately went with their own custom FPGA to get the job done rather than relying on an existing hodgepodge of standards that leaves you with a half-baked solution that introduces as many problems as it fixes?
Certainly appears like AMD jumped the gun with FreeSync just so they could fill in a checkbox and say "me too" to NVIDIA's revolutionary G-Sync. The difference is that NVIDIA fully fleshed out G-Sync before release to work with SLI, whereas AMD did a lackluster, half-assed job (as usual).
Pretty much, which is why I said on FreeSync review day, if anyone impacted by these use-cases was interested in FreeSync or had concerns about the ghosting, wait for them to fix it before diving in because as we have seen time and again, AMD has a horrible track record of overpromising and underperforming.
Pretty clear AMD's driver development resources are not sufficient to concurrently implement FreeSync, create Windows 10 drivers, and maintain any semblance of a regular update cycle. The last WHQL driver release was December 19, 2014, and it's almost May.
Ya, I feel most of their resources are on the hardware side right now. Carrizo, Zen, Kaveri refreshes (I forget the architecture name at the moment), the R9 300s... they do have a lot going on.
And they've laid off a lot of both, yet they are still making promises on features they clearly aren't delivering on. Maybe all those millions paid to EA and the man-hours wasted on Mantle might have been better spent getting the FreeSync launch right from the start? Just a thought.
Yes, they're understaffed. That part is clear. My point was that their hardware engineers being focused on hardware refreshes is unrelated to the software engineer under-staffing problems.
Sure it is related, when they both draw from the same pool of resources. The money wasted on Mantle and that deal with EA for Mantle support could, and may, have been better spent hiring engineers to work on refreshing their product stack. The lack of funds and resources is clearly related.
After having tried gsync, and loving it, I realized that both gsync and freesync are baby steps towards what needs to come next: ULMB. Current iterations of ULMB in gsync monitors like the Acer Predator XB270HU are limited to 100Hz, do not work at the same time as gsync, and cut down on screen brightness.
In the testing I did, I came to the realization that ULMB is absolutely godly, and 144Hz ULMB with gsync/freesync would be an even better step than going to something like 240Hz.
So put the gsync/freesync crap aside. ULMB has the potential to be significantly more important than this technology.
Unfortunately, the two are mutually exclusive until the brightness variance problem can be solved. When you don't know how long until the next display update will be, you don't know how bright to pulse the backlight for the current update. Underpredict, and the display will dim. Overpredict and it will be overbright.
No, the dimming byproduct is strictly due to the fact that the light is pulsing for a shorter amount of time, so less light is hitting your retina. Even if they fix ULMB to work with G-Sync you will get reduced brightness; this is a well-known trade-off of ULMB that will never get fixed. They already strobe the LED to higher brightness, but if you are only getting light for a fraction of the time due to the pulse, the macro effect is overall dimmer light. It is also not unlike the effect of using 3D active shutter glasses; the dimness is due to the shutters cutting at least 50% of the light hitting your retina.
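The arithmetic behind that dimming is trivial; a quick sketch with made-up numbers:

# Perceived brightness under a strobed (ULMB-style) backlight is roughly peak
# luminance times duty cycle. The numbers here are made up for illustration.
peak_nits = 400.0       # panel output while the strobe pulse is on
pulse_ms = 2.0          # how long the backlight is lit each refresh
refresh_hz = 120.0

duty_cycle = pulse_ms * refresh_hz / 1000.0      # fraction of time the light is on
average_nits = peak_nits * duty_cycle

print(f"duty cycle {duty_cycle:.0%} -> average brightness ~{average_nits:.0f} nits")
# -> duty cycle 24% -> average brightness ~96 nits, hence the visibly dimmer image.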
I do think ULMB + G-Sync is the next evolution of G-Sync, possibly G-Sync 2.0. Tom Petersen did leave this as a possibility in his interview with PCPer with a "no comment but stay tuned" response, which I have found generally means they are already working on it and there is a high likelihood it makes it to market if they find a way to implement it.
Personally I like ULMB at high FPS, but at lower FPS, I find G-Sync is definitely the more noteworthy tech, but a solution that incorporates both without compromise would be great. Petersen explained how PWM was used to control the strobe backlighting with ULMB and said that was hard to get right with a variable refresh rate, but since that interview, PCPer was able to conclude (and Nvidia more or less confirmed) Nvidia's G-Sync module takes over the OverDrive process from the monitor. If they could implement similar controls over whatever IC is responsible for the PWM rate, that would open the door for G-Sync + ULMB together.
I wish AMD's driver team would stop promising release dates it can't realistically meet; it does more harm to their brand than the benefit of the sales gained by promising the feature. AMD did the same thing with their Crossfire frame pacing driver, which also missed deadline after deadline. If you are not sure when you will have a driver ready, then you shouldn't give a specific date for its release. Also, AMD, despite what you may argue otherwise, releasing a buggy beta driver on the deadline does not qualify as a full release.
I agree, and I fully expected more AMD fans and users to have this same sentiment, because honestly, you are the ones who are impacted the most by this kind of behavior, and this is the main reason it would be difficult for me to buy and support AMD products. I just don't agree with their development and release philosophy as it just doesn't meet my needs. Plus, I know there's an alternative out there that does it better.
I also wish Nvidia wouldn't do things like release $300+ cards with half the VRAM speed of a 2007 midrange card, and not bother to tell anyone about it.
Cool, yeah, I wish AMD didn't sell everyone extra graphics cards as CrossFire and claim it actually improved their gaming experience when it made it worse.
Exactly, it would have been much better if AMD had just taken their time developing this feature and released it when it was ready, but they spent much of the last 18 months disparaging G-Sync, buying time, trying to slow adoption with all of their FUD. And now they're stuck with an inferior solution that they won't be able to backtrack from (DP Adaptive-Sync) because of all the disparaging FUD they spread about why Nvidia needed to use their own, expensive, "license fee" ASIC.
Perhaps, but had this been NVIDIA spreading disparaging FUD and being late with a driver, I'm not sure you'd be so quick to judge. Bordering on Schadenfreude, if I'm being honest here. If you're not interested in FreeSync in any way, shape or form, I don't see why you feel the need to slate it endlessly in the comments. Did you know that you've posted 22 responses to this subject out of a grand total of 56?
I imagine AMD won't take forever over the driver, but it won't be a week or two in all likelihood. Regardless of all the FUD, it's still better to perfect a driver than release it in an unfinished state, and whilst I'd have preferred to see the first FreeSync monitors come out when the software was 100%, the fact they're already out generally points to the spec itself being fine rather than any specific implementation issues.
And that's the point lol, the reason I buy Nvidia is because they DON'T need to do that, they just unapologetically develop new and interesting technology that is useful to anyone interested in gaming.
AMD, on the other hand, spends the majority of the time talking shit about the competition in an attempt to slow adoption or give themselves time to play catch-up. Then they release some half-assed, half-baked solution and say they need more time to fix the problems. FreeSync is just ONE MORE example.
If it comes down to a simple choice to spend a little bit more for a better supported product with fewer question marks and overall more features, it is going to be an easy choice and overwhelmingly, that choice is going to be Nvidia.
Also, did you know, out of the grand total of responses from AMD fanboys and apologists, only 1 person has actually responded in disappointment to another instance of AMD overpromising and underdelivering? I mean, going full Schadenfreude here, to me it's obvious the only reason AMD gets away with this kind of poor support and behavior is because their users and supporters, like you, allow them to and don't demand better.
Instead, look at all the fanboy angst and anger directed at me for simply pointing out the obvious! Honestly, AMD can only burn their most loyal fans so many times before they've had enough, but it becomes more and more obvious by the day that their most loyal defenders don't actually use their products, because there is no way in hell that if roles were reversed, I would be satisfied with what AMD has put out there with FreeSync.
The ONLY way these problems are going to get fixed is if AMD's actual users demand better, and in that respect, I can guarantee you that anyone who buys a FreeSync panel today and reads my comments thinking I'm just some Nvidia fanboy will appreciate them a lot more when they see the same glaring problems I've already pointed to from more responsible review sites.
I would have thought AMD fans would have gotten peeved after broken Enduro on $1800 laptops and then janky frame pacing on a pricey set of cards in Crossfire. I guess the amount of punishment their fans will take is endless.
Not to mention all the promises regarding Mantle and TrueAudio that turned out to be bogus. But yeah, Enduro is such a joke, just another example of AMD reactively attempting to be feature-competitive with Nvidia, launching a half-baked, broken solution and then abandoning it later.
You know what's sad about this whole FreeSync / G-Sync argument is how FreeSync has become the de facto VESA standard. Technically, if Intel, who also makes graphics cards, wanted to implement one standard or the other, then FreeSync would be the easier one to implement because it is the royalty-free standard. In the long run I imagine FreeSync will win out for this fact alone, because it is the cheaper implementation and anyone can do it because the specifications are free. With Nvidia you MUST buy their ASIC to produce a monitor; there's no attempting to do it in house. Now if Nvidia freely offered up the design of that chip to competitors or offered their specifications royalty-free then I wouldn't have any issues with them, but history has shown that Nvidia will purposely block out competitors when given the chance.
What's sad is that AMD in their BS run-up to an actual FreeSync launch has brainwashed their fans into believing this, among a bunch of other bits of FUD out there. I'll be honest when I say there's a good chance FreeSync panels are one and done if they can't fix the stated problems with it. Poor sales, unfixable problems, and AMD throwing their vendors under the bus are not a good combination when it comes to product lifecycles.
Why should Nvidia spend millions of dollars and engineering resources developing a new technology only to immediately give it away to AMD and Intel? I am all for open standards, but without that incentive, the investment wouldn't have been made in the first place, and there would likely be no FreeSync or VESA standard on the market today.
It's clear that the current VESA adaptive refresh standard isn't quite up to the standard of G-Sync. Nvidia has clearly experimented with it, as indicated by the laptop G-Sync drivers that just require an eDP display. It is also clear that the FPGA and 512 MB of DRAM within the module aren't doing nothing.
Well you know, with AMD fans they all insist we NEED AMD because competition is good, but when Intel and Nvidia compete and drive innovative new tech and features, it's bad and they are "competing too hard" and should just give it away and share it with AMD.
Their hypocrisy knows no bounds, I truly look forward to the day where we don't have to deal with these tech bottomfeeders sandbagging the rest of the industry.
What makes you so sure Nvidia is willing to sell their G-Sync scheme to their competitors huh? In theory all Nvidia needs to do is to sell their FPGA module to display makers. But in order to use G-Sync you also need to BUY A LICENSE from Nvidia as well. AMD on the other hand has provided the specifications which means as long as you know what you're doing you are free to support it as you see fit. If anything this is Nvidia's play it my way or you're not allowed. I mean just look at PhysX. If you have an Nvidia card and AMD (formerly ATI) card in your system, Nvidia purposely disables PhysX. What if in the future Intel's graphics division grows and they decide, "Hey let's disable certain CPU functions if we detect anything but our own graphics card."
The whole adaptive sync scheme sadly is only being used by the most high-end gamers on the market. Speak to your average joe on the street and no one will really know what it even is. I suspect Intel will adopt FreeSync because, well, it's free. Monitor makers will likely pick up FreeSync faster than G-Sync because it doesn't cost them nearly as much to adopt the standard, and in time Nvidia will change their minds and add it to their systems as well, because they don't need to pay AMD anything to do it. The only folks who might be burned are the early adopters of G-Sync.
Who said I was sure Nvidia would sell or license their solution to the competition? LOL. I made no such claims, nor do I care. Nvidia (and AMD for that matter) have made no secret that their G-Sync solution is proprietary, and Nvidia has also made it clear in this case and others, that they have no interest in doing the work for everyone. What's wrong with proprietary if it gets the job done and drives innovation in the marketplace? Would you have preferred Nvidia never invent the implementation of VRR and wait for some standards board to maybe invent it some X years later?
It's a mentality I strongly agree with, Nvidia unapologetically spends their resources to create new and innovative technologies that improve the gaming ecosystem for users that BUY IN to their technology. They are in no way obligated to improve the situation for everyone, least of all their competitors, but there have been times in the past they have introduced new technologies that improve gaming for everyone, like GameWorks.
You keep parroting the nonsense that G-Sync requires a licensing fee paid to Nvidia, but there's simply no proof of that. What you do have is an expensive, proprietary G-Sync FPGA module, local DRAM, and all necessary TCONs to replace a monitor's scaler logic. AMD repeatedly made claims that this module wasn't necessary, but as we are seeing now, that is CLEARLY not the case and the G-Sync module and local DRAM buffer were doing quite a bit of work to address the deficiencies we see in AMD's FreeSync solution.
Most notably:
1) The ability to repeat frames on the low end, using the local DRAM as a lookaside buffer.
2) The ability to take over the overdrive function at a variable rate, to give the panel similar anti-ghosting characteristics to other panels on the market (a rough sketch of this idea follows below).
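To be clear about what that second point means in practice, here's the rough idea (purely illustrative on my part; nobody outside Nvidia knows the module's actual tuning):

# Purely illustrative: pick a pixel-overdrive strength based on how long the upcoming
# refresh interval is expected to be, so slow refreshes don't overshoot (inverse ghosting)
# and fast refreshes still get enough drive to avoid smearing. Not vendor code.
OVERDRIVE_TABLE = [        # (refresh interval upper bound in ms, drive strength 0..1)
    (7.5,  1.00),          # ~144 Hz: aggressive overdrive
    (12.0, 0.80),          # ~83-133 Hz
    (18.0, 0.60),          # ~56-83 Hz
    (25.0, 0.45),          # down to the ~40 Hz floor: gentler drive
]

def overdrive_strength(refresh_interval_ms):
    for limit_ms, strength in OVERDRIVE_TABLE:
        if refresh_interval_ms <= limit_ms:
            return strength
    return 0.0             # outside the VRR window: fall back to no overdrive

print(overdrive_strength(10.0))   # a ~100 Hz refresh -> 0.8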
But yes, I see you are also buying into the popular FUD that while this is a niche market (I agree), Intel will be AMD's savior in adopting FreeSync because, well, it's free (this part is nonsense). Sorry to burst your bubble, but aside from the fact Intel has shown no interest whatsoever in FreeSync, FreeSync is actually trademarked by AMD and it is unlikely that they would license it to one of their competitors. Plus, there's the question of whether or not Intel iGPUs are even capable of implementing VRR, given the fact that even many of AMD's own recent, relevant GPUs cannot. Do you honestly think an Intel IGP has more advanced display controllers than, say, the 7970 Tahiti? Because that part, amongst all other GCN 1.0 cards, actually can't support FreeSync. Just sayin'.
FreeSync with Crossfire right NOW is almost irrelevant.
With DX12, AMD single dGPUs will deliver something close to 18 MILLION draw calls. More than enough for 4K gaming.
The AMD Radeon 3xx series will likely be over 20 million draw calls; more than adequate for 4K games. Running dualie dGPU Radeon cards will likely produce an API overhead of 30-40 million draw calls. Hardly necessary.
How many gamers NOW are using two Radeon dGPU cards and are going to run out and buy a new flat screen display? Not too many.
And yet the 290X doesn't outperform equivalent Nvidia cards in Mantle games on all but low end CPUs, despite Mantle being capable of making "more draw calls."
Providing a firm release date and then reneging on that is bad enough but not even providing another target date is basically unacceptable.
Nvidia and AMD both need to have more respect for their customers and investors.
1) No more rebadging. If you want to make a minor update, then add "Rev. X" to the card's name, where X is the series number. That way people will know what the GPU chip actually is but also know how it fits into the current lineup (the series). Simple rebadging is pretty close to fraud.
2) No more selling different GPUs under the same name. I particularly recall lower-end Nvidia cards that had very different specs and yet had the same name. There are quite a few instances.
3) Absolutely no scamming people by baiting them with better specs and selling them something with lower specs (GTX 970). And, if you really want people to believe your "We didn't know because we're incompetent" claim, then back it up with a real remedy.
4) If you can't deliver the product within close proximity of the time you announced it would be available then don't announce the date in the first place. Tell people the timing is TBA. If you give an estimate, make sure people know it's highly changeable.
Wow, I don't know who has worse of a life, chizow for writing all this, or me for reading it all. Probably me for reading it all. I think I'm going to go pull my r9 295x2 out and hug it in the corner sucking my thumb and cry.
I don't think the issue really is whether FreeSync is better than G-Sync or vice versa. The issue is that Nvidia should support FreeSync as well as G-Sync. If G-Sync is much better, then they should have nothing to fear, as those willing to pay more will pay more. Simple and easy.
I haven't had hands on experience with both so I have no idea which is better even with the reviews. Also, I am betting that it will be different depending on the implementation by the monitor manufacturers. So none of that matters to me. I will have to decide which I want. The issue is right now I can't. I feel like I have to pick a monitor then match my computer to it...and that is crazy.
Open it all up as much as they can. Nvidia should support FreeSync. Then they will have the best platform. Right now the market will be muddled and make it harder for people to get into PC gaming and stay there. This tech could have really boosted the market by making PC gaming smooth and tear-free... instead we get two companies going for money, Nvidia more so with proprietary tech. Bad all around.
chizow - Thursday, April 30, 2015 - link
No surprise, it's obvious AMD is running into all kinds of problems with the limitations of their FreeSync spec." but given the fact that it’s more important to get this right than to get it out quickly, this is likely for the best."
What is interesting is that AMD did not apply the same level of restraint on FreeSync as a whole and pump the brakes until they got it right. This is the second glaring deficiency relative to G-Sync yet that didn't stop AnandTech from misleadingly declaring FreeSync an equivalent alternative.
Crunchy005 - Thursday, April 30, 2015 - link
Love how anything AMD your the first to comment, and anything that doesn't allow you to be anti AMD I never seem to see you comment on. Is the only thing you do is sit here and troll anti AMD stuff on this site?Either way this tech is still a nitche market with the premium it calls for. Not to mention the small range of choices for monitors that support G-Sync or FreeSync. Sad to see that crossfire got delayed for this I am sure the people who do go for this are enthusiast level and probably already run crossfire.
chizow - Thursday, April 30, 2015 - link
Have you ever considered that's because you only bother to read AMD headlines to defend their honor in their time of need? lol. I guess you missed the half dozen or so comments I made in other articles this week, everything from Apple to Microsoft's interesting announcements this week, but don't let that stop you from commenting ignorantly on the matter. I guess it never occurred to you that people will comment on articles that interest them, go figure!But yes this is a niche market, just as I stated, which makes you wonder whether or not AMD has the requisite resources or knowledge to be dabbling in this market when they clearly have bigger issues on their plate, most notably, launching competitive CPU and GPUs in the marketplace.
mrmessma - Friday, May 1, 2015 - link
Yes, PCWorld, TrustedReviews, and Toms Hardware are all just lying. I'll capitulate that Intel puts out better CPU's, but where have you been the last 5 years claiming their gpu's are anything less than nVidia, not even getting into if you get into the performance/$$?mrmessma - Friday, May 1, 2015 - link
Yes, sorry about the grammar. My weak intellect is too brainwashed by AMD to proof read. Guess I frequent too many sites that allow editing for that, my apologies.chizow - Friday, May 1, 2015 - link
AMD has never had the fastest GPU to end a generation since G80 launched, and while they are generally competitive in overall performance, they are still far behind when it comes to technology, drivers, features, and support. This entire FreeSync debacle is just another example of this.Nvidia took the time and resources to develop G-Sync years ago, it took a lot of time, research, and money and when they brought it to market, it worked, just as advertised to the world's amazement. AMD claims there was no reason for Nvidia to have gone the hardware route, that they were thinking of maybe working on something similar using open/existing DP specs, then half-assedly throw a solution together and 18 months later, we get something that is clearly inferior in FreeSync. See the difference?
Who wants to spend their time and money on products that don't just work? You wouldn't pay a premium for a product that works as advertised, today, without having to wait? Given the life cycles of these products turns over rather frequently (18-24 months), every month you have to wait for a feature or product to mature is time and money wasted.
Creig - Friday, May 1, 2015 - link
Let's see what the the people who actually reviewed the technology said about FreeSync, rather than Chizow's nvidia-distorted view:Anandtech - It took a while to get here, but if the proof is in the eating of the pudding, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates
TheTechReport - the BenQ XL2730Z is good enough that I think it's time for the rest of the industry to step up and support the VESA standard for variable refresh rates.
Guru3D - FreeSync works as well as Gsync does and vice versa
Gamespot - Freesync is the more attractive proposition, even if Nvidia has a wider slice of the GPU market. More monitor features, and a lower cost of entry make it even better.
Keep trying, chizow.
chizow - Friday, May 1, 2015 - link
LMAO yeah let's see what the actual reviews that dug a bit deeper than the superficial, staged demo at AMD HQ that Creig presents:TFTCentral: http://www.tftcentral.co.uk/reviews/benq_xl2730z.h...
"As a result, response times are fairly slow at ~8.5ms G2G and there is a more noticeable blur to the moving image. See the more detailed response time tests in the previous sections for more information, but needless to say this is not the optimum AMA (response time) setting on this screen. For some reason, the combination of FreeSync support and this display disables the AMA function.
This only happens when you are using a FreeSync enabled graphics card, FreeSync capable drivers and the DisplayPort interface...
Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on. It will also require a firmware update for the screen itself to correct the problem."
Forbes: http://www.forbes.com/sites/jasonevangelho/2015/03...
"Note to readers: I have seen this firsthand on the Acer FreeSync monitor I’m reviewing, and PC Perspective noticed it with 2 additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem they recorded the aforementioned monitors running AMD’s Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (this is a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it’s jarring by comparison."
PCPer: http://www.pcper.com/reviews/Graphics-Cards/Dissec...
"It's also important to note that the experience below the VRR window on a FreeSync panel today is actually worse in practice than in theory. Because the refresh rates stays at 40 Hz when your frame rates are low, you get a combination of stutter and frame tearing (if V-Sync is off) that is worse than if the refresh rate was higher, at say 60 Hz or even 144 Hz. No doubt complications would arise from an instantaneous refresh rate shift of ~40 Hz to 144 Hz but some middle ground likely exists that FreeSync could implement to improve low-FPS experiences."
In Summary: 3 things that make FreeSync worse than G-Sync, today
1) Ghosting due to FreeSync commands disabling OverDrive anti-blur technologies in ALL FreeSync monitors.
2) Worst than V-sync behavior outside of relatively limited VRR windows (input lag, stuttering, tearing all present, as high as 40Hz).
3) No CF support to help mitigate these issues, leaving you with effectively 1 GPU choice for this technology (290/X), and good luck getting one of those to run acceptably well at 2560x1440 much less 4K in current games without turning settings WAY down.
The alternative: Paying a premium for better Nvidia products that do everything they said they would relative to G-Sync and VRR, for the last 20 months since they launched it
chizow - Friday, May 1, 2015 - link
And I guess the follow-up to that question Creig, in your opinion, knowing there are reviews that have brought these issues to light, do you honestly think AMD's FreeSync is equivalent to Nvidia's G-Sync? Would you go and buy one of these monitors today, if VRR was important to you? Would you recommend someone else who was interested in Variable Refresh, buy one of these monitors today?See the difference between me and you is, I wouldn't recommend something that sucked to someone else if they were spending their hard-earned money. Clearly, you have no issue in doing so.
Vayra - Tuesday, May 5, 2015 - link
Wait, you are stuck with the illusion that G-Sync is working properly as of today?It is not. Nvidia is still countering lots of issues. Granted, they are less universal and more game-specific (or monitor-specific!) but to say G-Sync is smooth sailing and Nvidia delivered a great flawless tech, is very far from reality.
Vayra - Tuesday, May 5, 2015 - link
ULMB for example gets disabled on many a monitor, quite similar to the lack of anti-ghosting on FreeSync...doggghouse - Wednesday, May 6, 2015 - link
ULMB is disabled on *all* monitors when G-Sync is enabled. However, that's also the case for FreeSync. I'm pretty sure that strobing the backlight is not possible when dealing with variable refresh rates... at least for now.I think comparing ULMB to pixel overdrive (anti-ghosting) is apples to oranges, anyway.
Alexvrb - Friday, May 1, 2015 - link
Everyone knows he's a rabid Nvidia fan. Wait, I take that back. That statement might be unfair to average, everyday rabid Nvidia fans. There is no talking changing Chizow's mind or talking any kind of sense into him. He will NEVER agree with anything that isn't pro-Nvidia.My personal favorite is how he insisted that the R9 285 would throttle like crazy, and that said throttling was completely unacceptable. When it didn't he refused to admit he was wrong and changed arguments. Then it's proven that some Maxwell cards throttle like crazy even on regular games and suddenly it's not important if a card throttles.
Anyway he'd be perfectly happy to pay a hefty premium for a proprietary solution that only works on Nvidia cards. Many normal people would prefer a solution which is based on VESA standards and can be implemented by any GPU vendor. That way you can buy one monitor and you aren't locked into a single vendor to use one of its biggest features.
chizow - Saturday, May 2, 2015 - link
LMAO, from Alexvrb, the captain of the AMD turd polishing patrol. Did you personally work on Turdga, btw? I have to imagine so, as that is the only possible explanation for why someone would so vigorously defend such an awful part. There are now rumors that Turdga wasn't even the fully enabled ASIC, so there's a good chance it would've actually used significantly more power without much of a performance increase. But only a massive AMD fanboy like you would consider this a positive for AMD and attempt to draw parallels to Nvidia's Maxwell arch as you did. You didn't have much to say when Nvidia completely killed it with the 970/980, did you?
But yes, just as I stated, the 285's TDP was bogus; in order to avoid boost issues, they needed to slap a 3rd party cooler on it, at which point its TDP was closer to the 280's with barely the performance to match. Truly a turd only a die-hard AMD fanboy like you would appreciate (or bother to defend as vigorously as you did).
Alexvrb - Sunday, May 3, 2015 - link
Unlike you, I own products from multiple vendors. My tablet even has a Tegra 3 (which granted is due for replacement soon, probably with an Intel powered model - such an AMD fanboy!). The only reason I appear to be an "AMD fanboy" to you is that you're buried so deep in Nvidia Cult Worship that any time ANYONE disagrees with you it's blasphemy and they must be vigorously attacked with multiple strawman arguments and other such nonsense designed to move the goalposts so you can "win".
chizow - Sunday, May 3, 2015 - link
Actually I do own products from multiple vendors as well, as recently as an R9 290X that was given to me by a vendor to validate, and of course, it reinforced the fact that AMD Radeon products simply aren't good enough for my uses at either home or work.
I also looked seriously into getting a Kaveri-based HTPC system, but after all the total costs, the need for a chassis and PSU, and total power usage, it simply did not make sense for my needs for such a simple task as a DVR/mediaplex.
And I've said many times, to AMD fanboys such as yourself, if roles were reversed and there was an AMD logo on any of the Nvidia products I've purchased in the last 9 years since G80, I would've bought AMD without hesitation. Can you say the same? Of course not, because you buy inferior AMD products as any loyal fanboy would.
But you seem confused: you appear to be an AMD fanboy because you defend their most egregiously bad products, plain and simple. I mean, anyone who bothers to defend Turdga as vigorously as you did is going to set off fanboy alarms anywhere, even AMDZone HQ.
chizow - Saturday, May 2, 2015 - link
And of course I'd pay a premium for something that actually worked and did all it said it would. Who would pay a hefty premium for a monitor (AMD FreeSync panels still carry a premium) that didn't do all it said it would and actually performed WORSE than a standard panel in many use-cases?
Oxford Guy - Sunday, May 3, 2015 - link
Because we can all thank Nvidia for Mantle/Vulkan/DX12.
Oh, that's right... they were too busy coming up with 28 GB/s VRAM partition schemes.
chizow - Sunday, May 3, 2015 - link
You're right, we have the consoles to thank for low level APIs, and Microsoft in particular for DX12.
And 28GB/s VRAM partition schemes on a part that completely decimated AMD's product stack were certainly welcome, even for AMD fans who could find their favorite cards for a fraction of the price, thanks to Nvidia.
Oxford Guy - Sunday, May 3, 2015 - link
If you're not even going to bother with the pretense of objectivity and rationality then don't reply to my comments.
chizow - Sunday, May 3, 2015 - link
What are you going on about? You're trying to say we have AMD to thank for Mantle/Vulkan/DX12, and I'm saying that's a farce. I guess John Carmack was espousing the benefits of target hardware and low level APIs 5-6 years ago and referring to AMD and Mantle? Of course not. Do you HONESTLY believe Microsoft was not going to roll out a DX12 at some point, and that it would not have many of the same features as the low level DirectX based API they had been using for over a decade? You do realize Microsoft was demoing the "Xbox One" on Nvidia hardware before the console actually launched, right? I was just pre-emptively giving your assertion the objectivity and rationality it deserved: utter nonsense.
And you seem hung up on the 28GB/s VRAM partition, so please tell me how this impacts you and your usage cases with regard to the GTX 970, and how this in any way diminishes the immense value and performance it offered at that $330 price point, thanks.
Creig - Friday, May 1, 2015 - link
It's nice to see you're still out there making friends, chizow. Tell wreckage and rollo I said "Hi" the next time you bump into them.
chizow - Friday, May 1, 2015 - link
And I see you aren't spewing your usual nonsense cheerleading AMD's woeful products this time, Creig. Did you finally get the pink slip?
What happened to price:performance trumping all, with the 290/X leading AMD into an absolute nosedive the last 2 quarters since Maxwell launched and decimated AMD's product stack, just as I said it would?
You kept insisting a minuscule price difference was going to be the key differentiator in AMD's favor, but none of that happened. Once again, maybe you are doing it wrong, Creig?
Creig - Friday, May 1, 2015 - link
Ah, poor Chizow. Once again you display your utter lack of comprehension. The main reason I recommend cards based on favorable price:performance ratios isn't because it's good for AMD. I recommend them because it's good for the end consumer. Namely, us. It just so happens that it's AMD who typically offers the best performance for the money.
See, that's the difference between you and me. You treat AMD and Nvidia like rival football teams. You cheer Nvidia and boo AMD every chance you get because you feel like you'll be on the "winning" side if Nvidia comes out on top. Here in the real world, the only "winner" is the person who gets the best deal for their money, regardless of who makes the product.
Nvidia must just love fanatics like you who literally foam at the mouth anytime somebody has the unmitigated gall to point out how you can usually get a better deal by purchasing an AMD card. So just keep ranting away, chizow. The rest of us will just continue to sit back and chuckle at your antics.
chizow - Friday, May 1, 2015 - link
LMAO, poor Creig, you once again backpedal to cover the stupid things you've said in the past.
Again, what forced the price changes that you and other poor AMD have-nots enjoyed recently? Do we need to go back and check whether you were recommending that everyone go out and buy Nvidia cards when Nvidia completely decimated AMD's entire product stack after the Maxwell launch? Of course not. You waited the month+ before going on about what a great deal AMD offered, refusing to even acknowledge the fact that it was Nvidia that drove this "positive change for the consumer".
Then you went on to stupidly claim only price:performance mattered, when in reality, that couldn't be further from the truth! While it is an important metric, it is obvious that if price:performance is CLOSE then the consumer will buy the BETTER product, because the premium is justified. What makes a product better is NOT strictly tied to price and performance, because a number of other factors like support, feature-level, drivers, efficiency, aesthetics, acoustics, developer relationships etc. all factor into this decision. And G-Sync vs. FreeSync is just one more example of this.
Again, I think your mistake in comparing me with you is that I actually USE the products I am talking about to maximize enjoyment of my favorite pastime, PC gaming, so spending top dollar or paying slightly more for a better product is no issue or obstacle for me. Waiting for some miracle price drop on last year's performance does not outweigh the loss in performance, efficiency, features, etc. that I would incur by having to wait.
But obviously, none of this is important to you, because you keep going on about how this deal is better than that deal for people. Is it a good deal to buy a product that doesn't even do what it claims to do, and to be in a perpetual holding pattern waiting for proper support of said product? LMAO, again it just points to certain people having low standards and expecting less of what they buy, simple as that. For anyone else, a small premium is no object when it comes to buying the better product, and once again, for 2 quarters running, the market agrees with me, not you, as AMD has been slaughtered in the marketplace.
So like I said, Creig, maybe you are doing it wrong? What GPU do you even own now? A 5850? A 6950? Your opinion doesn't even matter; your buying habits and usage patterns aren't even relevant to the discussion.
Notmyusualid - Saturday, May 2, 2015 - link
+1.
at80eighty - Thursday, April 30, 2015 - link
for his sake i hope he is getting paid. because otherwise.. LOL
palerma - Friday, May 1, 2015 - link
He's likely a paid shill.
Alexvrb - Friday, May 1, 2015 - link
No way. You can't buy this level of mouth-foaming fanboyism with money alone. He's more like the leader of a cult.
chizow - Saturday, May 2, 2015 - link
Lol 3 AMD fanboys that would rather attack the messenger than demand better from the company they blindly worship. Go figure. :)
palerma - Sunday, May 3, 2015 - link
When have I ever indicated I was blindly cheerleading for either side? I actually own a pair of 970s driving a 1440p gsync display, and before that I owned a 7970... basically, whatever offers good value for money.
My comment was about you and the frothy-mouthed rants you go on in every AMD related post I've ever read on this site. You are either a paid shill or just demented, given how shrill you are in your diatribes.
chizow - Sunday, May 3, 2015 - link
Who said anything about blindly cheerleading? I'm the FIRST person to admit I'm a fan of great tech that makes life more full and enjoyable, and overwhelmingly, Nvidia provides that when it comes to my favorite pastime, PC gaming, and for good reason.
You say you are running a pair of 970s driving a G-Sync panel, so you should know first-hand there are reasons to prefer Nvidia over AMD, especially in light of the limitations documented here. Would you be happy if only one of those 970s worked with your G-Sync panel, and the panel actually performed worse below, say, 40Hz? Would you say someone pointing out those flaws and deficiencies in FreeSync relative to G-Sync was automatically a paid shill? Don't be so obtuse.
See, I actually started using the internet and reading AnandTech at a time when the primary purpose of the internet was to research and learn about interesting and emerging tech, and as such, I find it annoying to have to wade through mounds of bullshit and misinformation. Overwhelmingly throughout the years, I've found AMD/ATI and their fanboys relish this kind of FUD when they simply don't have competitive parts to bring to market. This is just one more example of it.
Simply put, I like to hold people accountable for what they say and do, and this is just another example. I fully expect others to do the same of what I say and do, and I'm happy to stand by what I've said on this topic and others. Feel free to go back and see what was said about FreeSync from Day 1 by me and others, then come back and see who is blindly cheerleading for one side or another.
Alexvrb - Sunday, May 3, 2015 - link
Sounds like a nice setup, Palerma. 970s are a great value; Nvidia does very well above $300. That would be overkill for me, though, as I won't be replacing my "old" 1080p LG display for a while longer.
I've used (and continue to use) parts from all the big vendors. In the past I've also used parts from many less-known manufacturers; even Cyrix put out a decent chip here and there (mostly as drop-in upgrades for platforms after Intel abandoned them). Yet that doesn't matter to Chizow. If you dare stand against his Nvidia Gospel, you're a heretic and clearly an <insert competitor here> fanboy.
When Nvidia supports DP adaptive sync in the future, he'll support it. Until then it's garbage. End of story.
chizow - Sunday, May 3, 2015 - link
Yep, figures: typical low standards and usage patterns that don't even matter, like most of the AMD fanboys who most vigorously defend them. You can't feel impacted by their lies and dishonesty because you don't even use the products being discussed; you just blindly defend and cheerlead for them whenever their reputation comes under fire from meanies like me. :(
And no, when Nvidia supports DP adaptive sync, I won't support it, because I'll continue using the superior solution, G-Sync, as I have been for the last 9 months while AMD produced lies, misinformation and excuses, repeated by fanboys like you. But I honestly do hope Nvidia supports DP Adaptive Sync as an alternative and a choice; it will be one less reason to use AMD and will actually put AMD's word to the test again, on how open and Free FreeSync is.
Gigaplex - Thursday, April 30, 2015 - link
And what, pray tell, is the first glaring deficiency? If you mention enforced V-SYNC again like you did in the last thread, you were already proven wrong on that point and refused to acknowledge it.
I know, I know, don't feed the trolls...
Flunk - Thursday, April 30, 2015 - link
What are you talking about? FreeSync is clearly inferior to G-Sync, it doesn't cost $200 per monitor to implement and is a freely available standard. Those benchmarks that claim the performance is nearly identical to G-Sync are fabricated lies!
Everything AMD does is terrible!
chizow - Thursday, April 30, 2015 - link
Yes, and it's also bad, so as usual with AMD, you get what you pay for.
Crunchy005 - Friday, May 1, 2015 - link
You need to mention how AMD fanboys know nothing and refuse to acknowledge when they're proven wrong. Then it will be perfect. Love how chizow's argument here was, "it is also bad".
chizow - Friday, May 1, 2015 - link
Yes, FreeSync costs less, but deservedly so, because it's bad. Again, if a solution makes your gaming experience WORSE (input lag, tearing, stuttering) outside of its relatively high supported refresh range (40Hz as a worst-case floor) and doesn't even allow you to throw more money at the problem (no CF support), that is a pretty awful solution.
What's even worse is sites like AnandTech recommending people turn down their settings to stay above this minimum refresh rate, when one of the main benefits of VRR (at least with G-Sync) was the freedom to crank your settings up without having to deal with the big drop in framerates of V-Sync On (double buffering) or the input lag (triple buffering).
But it's OK, I know the casual observer is not going to understand the differences here, and how they relate to your gaming experience.
chizow - Thursday, April 30, 2015 - link
Huh? It does force V-Sync again when you fall out of the supported VRR range; go read actual reviews again and see how you were proven wrong, as usual. But beyond that, FreeSync performs WORSE than V-Sync outside of the VRR range because not only do you get all the tearing and stutter again, you also get bad ghosting because FreeSync disables anti-ghosting and overdrive mechanisms. This is a fact acknowledged by both AMD and monitor vendors.
I know, I know, don't feed the idiots.
chizow - Thursday, April 30, 2015 - link
*force V-Sync off
Gigaplex - Friday, May 1, 2015 - link
When going outside the range (pumping more frames than the monitor can handle) G-SYNC goes into forced V-Sync, increasing input latency. Freesync gives the option of V-Sync on or V-Sync off. You don't get both tearing and stutter; you get either tearing or increased input latency (you get to choose what is more important to you), whereas with G-SYNC you don't get the choice. Many competitive gamers would prefer the tearing to keep input latency low.
There's nothing in the GPU disabling anti-ghosting and overdrive mechanisms with Freesync. That's an implementation detail in current monitors (assuming it's true; I haven't heard anyone else claim this, but I'll give you the benefit of the doubt).
When the frame rate drops below the monitor's minimum refresh rate, G-SYNC does some frame duplication to trick the monitor into thinking it's operating at a higher rate. Freesync doesn't currently do this, but this is entirely up to the GPU driver. This can easily change in the future. If you're regularly encountering this situation, though, then your GPU isn't powerful enough to run the game at the current settings. You're going to have a sub-par experience with either G-SYNC or Freesync.
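For what it's worth, the duplication idea itself is easy to sketch. A minimal illustration, assuming a hypothetical 40-144 Hz panel (the numbers and the function name are made up for the example, not taken from any actual driver):

```python
def redraws_per_frame(frame_interval_ms, vrr_min_hz=40.0, vrr_max_hz=144.0):
    """Pick how many times to resend the current frame so the effective
    refresh rate stays inside the panel's VRR window (illustrative only)."""
    source_hz = 1000.0 / frame_interval_ms
    if source_hz >= vrr_min_hz:
        return 1  # already inside the window, no duplication needed
    n = 2
    # Repeat the frame enough times to lift the effective rate above the floor...
    while source_hz * n < vrr_min_hz:
        n += 1
    # ...but never past the panel's ceiling.
    if source_hz * n > vrr_max_hz:
        n -= 1
    return n

# Example: a 25 FPS game (40 ms frames) on a 40-144 Hz panel
print(redraws_per_frame(40.0))  # -> 2, i.e. each frame shown twice at an effective 50 Hz
```

Whether the display's timing controller tolerates being driven this way is exactly the sort of detail the thread above says G-Sync's module handles in hardware; a driver-only approach has to get that timing right on its own.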
Socius - Friday, May 1, 2015 - link
That's not entirely true. VSYNC at 144Hz and GSYNC at 144Hz are completely different. In playing FPS games (I average a 7.3 KDR) I haven't noticed any input lag from hitting the 144Hz cap. That's because VSYNC and GSYNC are entirely different beasts.
It's not that hitting above 144fps all of a sudden introduces input lag. It should be no different than if you were to play with VSYNC off and enforced a 144fps cap through RivaTuner/MSI Afterburner.
chizow - Friday, May 1, 2015 - link
Exactly. Anyone who has actually used G-Sync understands this, so thank you for chiming in.
chizow - Friday, May 1, 2015 - link
Again, you're wrong on this. V-Sync is NEVER enabled with G-Sync; in fact, you MUST force it off in-game for G-Sync to work properly. At capped FPS, you do get FPS capped to the max refresh rate of the monitor, but it is still not V-Sync, because it is not nearly as input laggy, and while it does have more input lag than V-Sync off or G-Sync below the cap, it is still fluid and dynamic. This was by design, as Nvidia's goal was to never have tearing, ever. However, Tom Petersen has stated they liked the idea of disabling V-Sync at the top end of the refresh range and will look into making that happen in the future.
FreeSync, on the other hand, completely forces V-Sync off when you fall out of VRR range, but in that transition there is a noticeable "jarring" jerk as you move from VRR on to VRR off. Also, if you look at usage scenarios, the flaws with FreeSync's VRR range are MUCH more likely to be exhibited, since most of these panels are high-res 2560x1440 and 4K panels. What makes it worse is you can't even rely on CF to help you stay above these relatively high VRR thresholds. 40 FPS on a 4K monitor, for example, is still damn hard to accomplish with any single GPU, even the Titan X.
Also, I am right about FreeSync disabling OD, which is why you end up getting ghosting. Some responsible review sites covered it more than others (PCPer, TFTCentral, Forbes), but TFT confirmed it happens with ALL FreeSync panels. Later, PCPer correctly surmised Nvidia's G-Sync module took over these OD timings and controls as it essentially replaces the existing scaler module.
"For some reason, the combination of FreeSync support and this display disables the AMA function.....Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on."
AMD claims they can fix this via driver, but does that really seem likely? I mean, they use a simple protocol, V-Blank, to replace the functions a scaler performed at a fixed refresh rate, and expect to just fix this with drivers? I guess it is possible, but it is looking less and less likely with every passing day, and it is certainly not as "simple" as some made it out to be (it has been over a month since those claims were made when FreeSync launched).
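To make the overdrive piece concrete: response-time compensation works by over-driving the pixel toward its target so the slow liquid crystal settles within one frame, and the amount of overshoot is tuned against a known frame time. A rough sketch of why that tuning interacts with VRR (the gain, levels and scaling here are invented for illustration, not how any real scaler or the G-Sync module computes it):

```python
def overdrive_level(current, target, frame_time_ms):
    """Illustrative response-time compensation: request a value past `target`
    so the pixel lands on it within one frame. The right amount of overshoot
    depends on how long the frame will actually be displayed, which is exactly
    what a fixed-refresh tuning cannot know under VRR."""
    TUNED_FRAME_MS = 1000.0 / 144.0  # frame time a fixed-refresh tuning assumes
    BASE_GAIN = 0.35                 # made-up overdrive strength

    scale = TUNED_FRAME_MS / frame_time_ms  # less time -> push harder, and vice versa
    drive = target + BASE_GAIN * scale * (target - current)
    return max(0, min(255, round(drive)))

# The same gray-to-gray transition at two different refresh intervals:
print(overdrive_level(64, 160, 1000.0 / 144))  # ~194: the overshoot a 144 Hz tuning expects
print(overdrive_level(64, 160, 25.0))          # ~169: at 40 Hz far less overdrive is appropriate
```

The gap between those two numbers is the point: a table baked for one refresh rate either over-drives (inverse ghosting) at low rates or, if the panel simply switches overdrive off as the launch FreeSync monitors apparently do, leaves visible ghosting.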
chizow - Friday, May 1, 2015 - link
Oops, here's the link to the TFT review; just scroll down to the big yellow "Incomplete" portion of the review:
http://www.tftcentral.co.uk/reviews/benq_xl2730z.h...
gnuliver - Saturday, May 2, 2015 - link
I don't understand why you would want to have "vsync off" in any situation when using VRR. It defeats the point of VRR entirely when you have tearing and visual inconsistency. If you're a competitive gamer and that matters to you, turn off VRR.
chizow - Sunday, May 3, 2015 - link
Well, I guess the belief is that at the high end of a 120-144Hz VRR range, V-Sync off negates any input lag (which I've found negligible anyway, and certainly less than V-Sync On), and when your refresh rate is that high, the associated tearing is much less noticeable, as the on-screen distance between split frames is smaller. I guess you can say it is the same reason some might prefer ULMB at high FPS over G-Sync: they prefer the lack of blur over the minor tearing at high FPS.
I think it would be a good option to allow the user to decide, and it sounds like Tom Petersen is going to make it happen if it is possible within the limitations of G-Sync.
Cellar Door - Thursday, April 30, 2015 - link
"No surprise, it's obvious AMD is running into all kinds of problems with the limitations of their FreeSync spec."And you are an expert on FreeSync and know what the issues are - dude you sounds like an idiot.
chizow - Thursday, April 30, 2015 - link
Isn't it obvious? We keep hearing how "simple" it is to fix these problems, that AMD is just going to release a driver fix to address these issues.
Where are these fixes? Oh right, see announcement: DELAYED.
Obviously, it is not as simple as AMD claimed. Maybe Nvidia was right when they said getting VRR right was hard, which is why they ultimately went with their own custom FPGA to get the job done rather than relying on existing hodgepodge standards that leave you with a half-baked solution that introduces as many problems as it fixes.
5150Joker - Friday, May 1, 2015 - link
Certainly appears like AMD jumped the gun with FreeSync just so they could fill in a checkbox and say "me too" to NVIDIA's revolutionary G-Sync. The difference is that NVIDIA fully fleshed out G-Sync before release to work with SLI, whereas AMD did a lackluster, half-assed job (as usual).
chizow - Friday, May 1, 2015 - link
Pretty much, which is why I said on FreeSync review day: if anyone impacted by these use-cases was interested in FreeSync or had concerns about the ghosting, wait for them to fix it before diving in, because as we have seen time and again, AMD has a horrible track record of overpromising and underperforming.
Jtaylor1986 - Thursday, April 30, 2015 - link
Pretty clear AMD's driver development resources are not sufficient to concurrently implement FreeSync, create Windows 10 drivers and maintain any semblance of a regular update cycle. Last WHQL driver release was December 19, 2015 and it's almost May.
Crunchy005 - Thursday, April 30, 2015 - link
Ya, I feel most of their resources are on the hardware side right now. Carrizo, Zen, Kaveri refreshes (I forget the architecture name at the moment), R9 300's; they do have a lot going on.
Gigaplex - Thursday, April 30, 2015 - link
They have different engineers for the hardware and software side.chizow - Thursday, April 30, 2015 - link
And they've laid off a lot of both, yet they are still making promises on features they clearly aren't delivering. Maybe all those millions paid to EA and the man-hours wasted on Mantle might have been better spent getting the FreeSync launch right from the start? Just a thought.
Gigaplex - Friday, May 1, 2015 - link
Yes, they're understaffed. That part is clear. My point was that their hardware engineers being focused on hardware refreshes is unrelated to the software engineer under-staffing problems.chizow - Friday, May 1, 2015 - link
Sure it is related when they both draw from the same pool of resources. The money wasted on Mantle and the deal with EA for Mantle support could, and may, have been better spent hiring engineers to work on refreshing their product stack. The lack of funds and resources is clearly related.
nandnandnand - Thursday, April 30, 2015 - link
December 19, 2014, right?
Mayyybe the drivers are already perfect.
OrphanageExplosion - Friday, May 1, 2015 - link
Try playing GTA5 on an AMD card without the recently released beta drivers and you'll see that this is not the case.
Socius - Friday, May 1, 2015 - link
After having tried gsync, and loving it, I realized that both gsync and freesync are baby steps towards what needs to come next: ULMB. Current iterations of ULMB in gsync monitors like the Acer Predator XB270HU are limited to 100Hz, do not work at the same time as gsync, and cut down on screen brightness.
In the testing I did, I came to the realization that ULMB is absolutely godly, and 144Hz ULMB with gsync/freesync would be an even better step than going to something like 240Hz.
So put the gsync/freesync crap aside. ULMB has the potential to be significantly more important than this technology.
edzieba - Friday, May 1, 2015 - link
Unfortunately, the two are mutually exclusive until the brightness variance problem can be solved. When you don't know how long it will be until the next display update, you don't know how bright to pulse the backlight for the current update. Underpredict, and the display will dim. Overpredict, and it will be overbright.
chizow - Friday, May 1, 2015 - link
No, the dimming byproduct is strictly due to the fact that the light is pulsing for a shorter amount of time, so less light is hitting your retina. Even if they fix ULMB to work with G-Sync you will get reduced brightness; this is a well-known trade-off of ULMB that will never get fixed. They already strobe the LED at higher brightness, but if you are only getting light for a fraction of the time due to the pulse, the macro effect is overall dimmer light. It is not unlike the effect of 3D active shutter glasses, where the dimness is due to the shutters cutting at least 50% of the light hitting your retina.
chizow - Friday, May 1, 2015 - link
I do think ULMB + G-Sync is the next evolution of G-Sync, possibly G-Sync 2.0. Tom Petersen did leave this as a possibility in his interview with PCPer with a "no comment but stay tuned" response, which I have found generally means they are already working on it and there is a high likelihood it makes it to market if they find a way to implement it.
Personally I like ULMB at high FPS, but at lower FPS I find G-Sync is definitely the more noteworthy tech; a solution that incorporates both without compromise would be great. Petersen explained how PWM was used to control the strobe backlighting with ULMB and said that was hard to get right with a variable refresh rate, but since that interview, PCPer was able to conclude (and Nvidia more or less confirmed) that Nvidia's G-Sync module takes over the OverDrive process from the monitor. If they could implement similar controls over whatever IC is responsible for the PWM rate, that would open the door for G-Sync + ULMB together.
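To put rough numbers on the brightness trade-off and the PWM difficulty: average brightness under a strobed backlight scales with the duty cycle, i.e. the fraction of each refresh interval the backlight is actually lit. A back-of-the-envelope sketch (the nit and pulse-width figures are invented examples, not measurements of any panel):

```python
def average_nits(strobe_nits, pulse_ms, refresh_hz):
    """Average luminance of a strobed backlight: peak output scaled by the
    duty cycle (fraction of each refresh interval the backlight is on)."""
    frame_ms = 1000.0 / refresh_hz
    duty_cycle = min(1.0, pulse_ms / frame_ms)
    return strobe_nits * duty_cycle

# Hypothetical panel: 400 nits with the backlight always on, 700 nits during a strobe pulse.
print(average_nits(400, 1000.0 / 120, 120))  # always-on baseline: 400 nits
print(average_nits(700, 1.0, 120))           # 1 ms pulse at 120 Hz: ~84 nits
print(average_nits(700, 1.0, 85))            # same pulse at 85 Hz: ~60 nits
```

The first two lines show why ULMB dims the image even with the LED driven harder during the pulse; the last line shows edzieba's point too: if the refresh interval varies frame to frame, a fixed pulse width means the brightness varies with it, unless whatever controls the strobe adapts the pulse on the fly.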
The Von Matrices - Friday, May 1, 2015 - link
I wish AMD's driver team would stop promising release dates it can't realistically meet; it does more harm to their brand than the benefit of the sales gained by promising the feature. AMD did the same thing with their Crossfire frame pacing driver, which also missed deadline after deadline. If you are not sure when you will have a driver ready, then you shouldn't give a specific date for its release. And AMD, despite what you may argue otherwise, releasing a buggy beta driver on the deadline does not qualify as a full release.
chizow - Friday, May 1, 2015 - link
I agree, and I fully expected more AMD fans and users to have this same sentiment, because honestly, you are the ones who are impacted the most by this kind of behavior, and this is the main reason it would be difficult for me to buy and support AMD products. I just don't agree with their development and release philosophy, as it just doesn't meet my needs. Plus, I know there's an alternative out there that does it better.
Oxford Guy - Sunday, May 3, 2015 - link
I also wish Nvidia wouldn't do things like release $300+ cards with half the VRAM speed of a 2007 midrange card, and not bother to tell anyone about it.
chizow - Sunday, May 3, 2015 - link
Cool, yeah, I wish AMD didn't sell everyone extra graphics cards as CrossFire and claim it actually improved their gaming experience when it actually made it worse.
Gunbuster - Friday, May 1, 2015 - link
Coming Soon™, it will be great, just keep waiting and chugging down the FUD they spread about the competitors...
chizow - Friday, May 1, 2015 - link
Exactly, it would have been much better if AMD just took their time developing this feature and released it when it was ready, but they spent much of the last 18 months disparaging G-Sync, buying time, trying to slow adoption with all of their FUD. And now they're stuck with an inferior solution that they won't be able to backtrack from (DP AdaptiveSync) because of all the disparaging FUD they spread about why Nvidia needed to use their own, expensive, "license fee" ASIC.
silverblue - Saturday, May 2, 2015 - link
Perhaps, but had this been NVIDIA spreading disparaging FUD and being late with a driver, I'm not sure you'd be so quick to judge. Bordering on Schadenfreude, if I'm being honest here. If you're not interested in FreeSync in any way, shape or form, I don't see why you feel the need to slate it endlessly in the comments. Did you know that you've posted 22 responses to this subject out of a grand total of 56?
I imagine AMD won't take forever over the driver, but it won't be a week or two in all likelihood. Regardless of all the FUD, it's still better to perfect a driver than release it in an unfinished state, and whilst I'd have preferred to see the first FreeSync monitors coming out when the software was 100%, the fact they're already out generally points to the spec itself being fine rather than any specific implementation issues.
chizow - Saturday, May 2, 2015 - link
And that's the point, lol: the reason I buy Nvidia is because they DON'T need to do that; they just unapologetically develop new and interesting technology that is useful to anyone interested in gaming.
AMD, unlike Nvidia, spends the majority of their time talking shit about the competition in an attempt to slow adoption or give themselves time to play catch-up. Then they release some half-assed, half-baked solution and say they need more time to fix the problems. FreeSync is just ONE MORE example.
If it comes down to a simple choice to spend a little bit more for a better supported product with fewer question marks and overall more features, it is going to be an easy choice and overwhelmingly, that choice is going to be Nvidia.
chizow - Saturday, May 2, 2015 - link
Also, did you know, out of the grand total of responses from AMD fanboys and apologists, only 1 person has actually responded in disappointment to another instance of AMD overpromising and underdelivering? I mean, going full Schadenfreude here: to me, it's obvious the only reason AMD gets away with this kind of poor support and behavior is because their users and supporters, like you, allow them to and don't demand better.
Instead, look at all the fanboy angst and anger directed at me for simply pointing out the obvious! Honestly, AMD can only burn their most loyal fans so many times before they've had enough, but it becomes more and more obvious by the day that their most loyal defenders don't actually use their products, because there is no way in hell that, if roles were reversed, I would be satisfied with what AMD has put out there with FreeSync.
The ONLY way these problems are going to get fixed is if AMD's actual users demand better, and in that respect, I can guarantee you anyone who buys a FreeSync panel today and reads my comments thinking I'm just some Nvidia fanboy will appreciate them a lot more when they see the same glaring problems I've already pointed to from more responsible review sites.
Gunbuster - Saturday, May 2, 2015 - link
I would have thought AMD fans would have gotten peeved after broken Enduro on $1800 laptops and then janky frame pacing on a pricey set of cards in CrossFire. I guess the amount of punishment their fans will take is endless.
chizow - Sunday, May 3, 2015 - link
Not to mention all the promises regarding Mantle and TrueAudio that turned out to be bogus. But yeah, Enduro is such a joke, just another example of AMD reactively attempting to be feature-competitive with Nvidia, launching a half-baked, broken solution and then abandoning it later.
WaltC - Friday, May 1, 2015 - link
nVidia gpus & drivers are specifically designed for the aged-12 & under set, imo. They're crazy about nVidia. (Sorry, couldn't resist...;))foxalopex - Friday, May 1, 2015 - link
You know what's sad about this whole FreeSync / G-Sync argument? How FreeSync has become the de facto VESA standard. Technically, if Intel, who also makes graphics hardware, wanted to implement one standard or the other, then FreeSync would be the easier one to implement because it is the royalty-free standard. In the long run I imagine FreeSync will win out for this fact alone, because it is the cheaper implementation and anyone can do it since the specifications are free. With Nvidia you MUST buy their ASIC to produce a monitor; there's no attempting to do it in house. Now, if Nvidia freely offered up the design of that chip to competitors or offered their specifications royalty free, then I wouldn't have any issues with them, but history has shown that Nvidia will purposely block out competitors when given the chance.
chizow - Saturday, May 2, 2015 - link
What's sad is that AMD, in their BS run-up to an actual FreeSync launch, has brainwashed their fans into believing this, among a bunch of other bits of FUD out there. I'll be honest when I say there's a good chance FreeSync panels are one and done if they can't fix the stated problems. Poor sales, unfixable problems, and AMD throwing their vendors under the bus are not a good combination when it comes to product lifecycles.
gnuliver - Saturday, May 2, 2015 - link
Why should Nvidia spend millions of dollars and engineering resources developing a new technology only to immediately give it away to AMD and Intel? I am all for open standards, but without that incentive, the investment wouldn't have been made in the first place, and there would likely be no FreeSync or VESA standard on the market today.
It's clear that the current VESA adaptive refresh standard isn't quite up to the standard of G-Sync. Nvidia has clearly experimented with it, as indicated by the laptop G-Sync drivers that just require an eDP display. It is also clear that the FPGA and 512 MB of DRAM within the module aren't doing nothing.
chizow - Sunday, May 3, 2015 - link
Well, you know, with AMD fans they all insist we NEED AMD because competition is good, but when Intel and Nvidia compete and drive innovative new tech and features, it's bad, they are "competing too hard", and they should just give it away and share it with AMD.
Their hypocrisy knows no bounds. I truly look forward to the day where we don't have to deal with these tech bottomfeeders sandbagging the rest of the industry.
foxalopex - Monday, May 4, 2015 - link
What makes you so sure Nvidia is willing to sell their G-Sync scheme to their competitors, huh? In theory all Nvidia needs to do is sell their FPGA module to display makers, but in order to use G-Sync you also need to BUY A LICENSE from Nvidia. AMD, on the other hand, has provided the specifications, which means as long as you know what you're doing you are free to support it as you see fit. If anything this is Nvidia's "play it my way or you're not allowed." I mean, just look at PhysX. If you have an Nvidia card and an AMD (formerly ATI) card in your system, Nvidia purposely disables PhysX. What if in the future Intel's graphics division grows and they decide, "Hey, let's disable certain CPU functions if we detect anything but our own graphics card."
The whole adaptive sync scheme sadly is only being used by the most high-end gamers on the market. Speak to your average joe on the street and no one will really know what it even is. I suspect Intel will adopt FreeSync because, well, it's free. Monitor makers will likely pick up FreeSync faster than G-Sync because it doesn't cost them nearly as much to adopt the standard, and in time Nvidia will change their minds and add it to their system as well, because they don't need to pay AMD anything to do it. The only folks who might be burned are the early adopters of G-Sync.
chizow - Monday, May 4, 2015 - link
Who said I was sure Nvidia would sell or license their solution to the competition? LOL. I made no such claims, nor do I care. Nvidia (and AMD, for that matter) have made no secret that the G-Sync solution is proprietary, and Nvidia has also made it clear, in this case and others, that they have no interest in doing the work for everyone. What's wrong with proprietary if it gets the job done and drives innovation in the marketplace? Would you have preferred Nvidia never invent their implementation of VRR and wait for some standards board to maybe invent it some X years later?
It's a mentality I strongly agree with: Nvidia unapologetically spends their resources to create new and innovative technologies that improve the gaming ecosystem for users that BUY IN to their technology. They are in no way obligated to improve the situation for everyone, least of all their competitors, but there have been times in the past they have introduced new technologies that improve gaming for everyone, like GameWorks.
You keep parroting the nonsense that G-Sync requires a licensing fee paid to Nvidia, but there's simply no proof of that. What you do have is an expensive, proprietary G-Sync FPGA module, local DRAM, and all necessary TCONs to replace a monitor's scaler logic. AMD repeatedly made claims that this module wasn't necessary, but as we are seeing now, that is CLEARLY not the case and the G-Sync module and local DRAM buffer were doing quite a bit of work to address the deficiencies we see in AMD's FreeSync solution.
Most notably:
1) the ability to repeat frames on the low-end using the local DRAM as a lookaside buffer.
2) the ability to take over the overdrive function at a variable rate to give the panel similar anti-ghosting characteristics as other panels on the market.
But yes, I see you are also buying into the popular FUD that while this is a niche market (I agree), Intel will be AMD's savior in adopting FreeSync because, well, it's free (this part is nonsense). Sorry to burst your bubble, but aside from the fact Intel has shown no interest whatsoever in FreeSync, FreeSync is actually trademarked by AMD and it is unlikely that they would license it to one of their competitors. Plus, there's the question of whether or not Intel iGPUs are even capable of implementing VRR, given the fact that even many of AMD's own recent, relevant GPUs cannot. Do you honestly think an Intel IGP has more advanced display controllers than, say, the 7970 Tahiti? Because that part, along with all other GCN 1.0 cards, actually can't support FreeSync. Just sayin.
akamateau - Friday, May 1, 2015 - link
FreeSync with Crossfire right NOW is almost irrelevant.
With DX12, single AMD dGPUs will deliver something close to 18 MILLION draw calls. More than enough for 4K gaming.
The AMD Radeon 3xx series will likely be over 20 million draw calls; more than adequate for 4K games. Running dual dGPU Radeon cards will likely produce an API overhead of 30-40 million draw calls. Hardly necessary.
How many gamers NOW are using two Radeon dGPU cards and are going to run out and buy a new flat screen display? Not too many.
chizow - Saturday, May 2, 2015 - link
lol and these replies are a good example of AMD's BS leading to typical misinformation that dies hard.
gnuliver - Saturday, May 2, 2015 - link
And yet the 290X doesn't outperform equivalent Nvidia cards in Mantle games on all but low end CPUs, despite Mantle being capable of making "more draw calls."
akamateau - Friday, May 1, 2015 - link
WHaaaaa... somebody call the whammmmbulance for this whining nvidia fanboy, whaaaaaa whaaaaaa.
AMD is late with a dual dGPU driver. BIG eFFFFing deal. It's FREE SYNC.
What that means is this: IT'S FREE.
And way better than nVidia.
gnuliver - Saturday, May 2, 2015 - link
I don't think Nvidia fanboys are whining. After all, Nvidia has had no-compromises VRR for pretty much as long as VRR technology has existed.
Oxford Guy - Sunday, May 3, 2015 - link
Providing a firm release date and then reneging on that is bad enough, but not even providing another target date is basically unacceptable.
Nvidia and AMD both need to have more respect for their customers and investors.
1) No more rebadging. If you want to make a minor update, then add "Rev. X" to the card's name, where X is the series number. That way people will know what the GPU chip actually is but also know how it fits into the current lineup (the series). Simple rebadging is pretty close to fraud.
2) No more selling different GPUs under the same name. I particularly recall lower-end Nvidia cards that had very different specs and yet had the same name. There are quite a few instances.
3) Absolutely no scamming people by baiting them with better specs and selling them something with lower specs (GTX 970). And, if you really want people to believe your "We didn't know because we're incompetent" claim, then back it up with a real remedy.
4) If you can't deliver the product within close proximity of the time you announced it would be available then don't announce the date in the first place. Tell people the timing is TBA. If you give an estimate, make sure people know it's highly changeable.
dotpex - Monday, May 4, 2015 - link
chizow is an nvidia bot, his real name is Jason Chow
here is a picture of him: https://pbs.twimg.com/profile_images/2917202787/85...
colhoop - Friday, May 15, 2015 - link
Wow, I don't know who has it worse: chizow for writing all this, or me for reading it all. Probably me for reading it all. I think I'm going to go pull my R9 295X2 out, hug it in the corner sucking my thumb, and cry.
JTWrenn - Monday, June 15, 2015 - link
I don't think the issue really is whether freesync is better than gsync or vice versa. The issue is that nvidia should support freesync as well as gsync. If gsync is much better, then they should have nothing to fear, as those willing to pay more will pay more. Simple and easy.
I haven't had hands-on experience with both, so I have no idea which is better even with the reviews. Also, I am betting that it will be different depending on the implementation by the monitor manufacturers. So none of that matters to me. I will have to decide which I want. The issue is, right now I can't. I feel like I have to pick a monitor and then match my computer to it... and that is crazy.
Open it all up as much as they can. Nvidia should support freesync. Then they will have the best platform. Right now the market will be muddled and make it harder for people to get into pc gaming and stay there. This tech could have really boosted the market by making pc gaming smooth and tear free... instead we get two companies going for money, nvidia more so with proprietary tech. Bad all around.