FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Besides the cost factor, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module was that adaptive sync didn't exist when they started working on G-SYNC, so they had to create their own protocol. Basically, the G-SYNC module controls all the regular core features of the display like the OSD, but it's not as full featured as a "normal" scaler.

In contrast, Adaptive Sync (the optional part of the DisplayPort 1.2a standard that AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard can be accomplished via firmware updates. That means even if a display vendor doesn't have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn't work over DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and there's mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't offer multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly reproducing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We'll look at the "Performance Penalty" aspect as well on the next page.

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate is outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior if your frame rates are too high/low. With VSYNC off, you could still get image tearing but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
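The two out-of-range policies described above can be sketched as a small decision function. This is purely illustrative pseudocode of the behavior, not vendor driver code; the 40–144Hz range, the function name, and the fallback labels are all assumptions for the example:

```python
def present(frame_interval_ms, vrr_min_hz=40.0, vrr_max_hz=144.0,
            fallback="vsync_on"):
    """Model how a variable-refresh display might handle one frame.

    Returns (mode, effective_refresh_hz). The range and names here are
    hypothetical; real FreeSync/G-SYNC ranges vary per display.
    """
    fps = 1000.0 / frame_interval_ms
    if vrr_min_hz <= fps <= vrr_max_hz:
        # Inside the dynamic range: refresh as soon as the frame is ready.
        return ("adaptive", fps)
    if fallback == "vsync_on":
        # G-SYNC-style fallback: clamp to the supported range, so there is
        # never tearing, but you get VSYNC stutter below the minimum and a
        # capped frame rate above the maximum.
        return ("vsync_on", min(max(fps, vrr_min_hz), vrr_max_hz))
    # FreeSync's optional VSYNC-off fallback: present immediately,
    # accepting possible tearing in exchange for lower input latency.
    return ("vsync_off", fps)
```

For instance, a 5ms frame time (200FPS) would be capped to 144Hz under the VSYNC-on policy but presented immediately, with possible tearing, under VSYNC off.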

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops; it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook, and there’s also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • lordken - Thursday, March 19, 2015 - link

    Are you sure? They would only have to use a different name (if they want). Just as both AMD and NVIDIA call and use "DisplayPort" as DisplayPort: they didn't have to come up with their own implementations of it, because DP is standardized by VESA, so they just used that.
    Or am I missing the point of what you wanted to say?

    The question is whether it becomes a core/regular part of, let's say, DP 1.4 onwards; right now it is only optional (i.e. 1.2a) and not even in DP 1.3, if I understand that correctly.
  • iniudan - Thursday, March 19, 2015 - link

    Well, the implementation needs to be in their driver; they're not gonna give that to Nvidia. =p
  • chizow - Thursday, March 19, 2015 - link

    So it is also closed/proprietary on an open spec? Gotcha, so I guess Nvidia should just keep supporting their own proprietary solution. Makes sense to me.
  • ddarko - Thursday, March 19, 2015 - link

    You know repeating a falsehood 100 times doesn't make it true, right?
  • chizow - Tuesday, March 24, 2015 - link

    You mean like repeating FreeSync can be made backward compatible with existing monitors with just a firmware flash, essentially for Free? I can't remember how many times that nonsense was tossed about in the 15 months it took before FreeSync monitors finally materialized.

    Btw, it is looking more and more like FreeSync is a proprietary implementation based on an open-spec just as I stated. FreeSync has recently been trademarked by AMD so there's not even a guarantee AMD would allow Nvidia to enable their own version of Adaptive-Sync on FreeSync (TM) branded monitors.
  • ddarko - Thursday, March 19, 2015 - link

    From the PC Perspective article you've been parroting around like gospel all day today:

    "That leads us to AMD’s claims that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalars that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above."

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    That's the difference between an open and closed standard, as you well know but are trying to obscure with FUD.
  • chizow - Friday, March 20, 2015 - link

    @ddarko, it says a lot that you quote the article but omit the actually relevant portion:

    "Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."

    And more on that G-Sync module AMD claims isn't necessary (but we in turn have found out a lot of what AMD said about G-Sync turned out to be BS even in relation to their own FreeSync solution):

    "But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

    In summary, AMD's own proprietary spec just isn't as good as Nvidia's.
  • Crunchy005 - Friday, March 20, 2015 - link

    AMD's spec is not proprietary, so stop lying. Also, I love how you just quoted in context what you quoted out of context in an earlier comment. The only argument you have against FreeSync is ghosting, and as many people have pointed out, it is not an inherent issue with FreeSync but with the monitors themselves. The example given there shows three different displays that are all affected differently. The LG and BenQ both show ghosting differently yet use the same FreeSync standard, so something else is different here, not FreeSync. On top of that, the LG is $100 less than the Asus and the BenQ $150 less, for the same features and more inputs. I don't see how a better, more well rounded monitor that offers variable refresh rates with more features and is cheaper is a bad thing. From the consumer side of things that is great! A few ghosting issues that I'm sure are hardly noticeable to the average user are not a major issue. The videos shown there are taken at a high frame rate and slowed down, then put into a compressed format and thrown on YouTube in what is a very jerky, hard to see video – a great example for your only argument. If the tech industry could actually move away from proprietary/patented technology, and maybe try to offer better products instead of "good enough" products that force customers into choosing and being locked into one thing, we could be a lot farther along.
  • chizow - Friday, March 20, 2015 - link

    Huh? How do you know Nvidia can use FreeSync? I am pretty sure AMD has said Nvidia can't use FreeSync; if they decide to use something with DP 1.2a Adaptive Sync, they have to call it something else and create their own implementation, so clearly it is not an Open Standard as some claim.

    And how is it not an issue inherent with FreeSync? Simple test that any site like AT that actually wants answers can do:

    1) Run these monitors with Vsync On.
    2) Run these monitors with Vsync Off.
    3) Run these monitors with FreeSync On.

    Post results. If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync. Especially when these same panel makers (the 27" 1440p BenQ apparently uses the same AU Optronics panel as the ROG Swift) have panels on the market, both non-FreeSync and G-Sync, that have no such ghosting.

    And again you mention the BenQ vs. the Asus, well guess what? Same panel, VERY different results. Maybe it's that G-Sync module doing its magic, and maybe it actually justifies its price. Maybe that G-Sync module isn't bogus as AMD claimed; maybe it is actually the Titan X of monitor scalers and worth every penny it costs over AMD FreeSync, if it is successful at preventing the kind of ghosting we see on AMD panels while allowing VRR to go as low as 1FPS.

    Just a thought!
  • Crunchy005 - Monday, March 23, 2015 - link

    Same panel, different scalers. AMD just uses the standard built into DisplayPort; the scaler handles the rest, so it isn't necessarily FreeSync but the variable refresh rate technology in the scaler that would be causing the ghosting. So again, not AMD but the manufacturer.

    "If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync"
    Haven't seen this and you haven't shown us either.

    "Maybe its that G-Sync module doing its magic"
    This is the scaler, and the scaler – which is not made by AMD but supports the VRR standard that AMD uses – is what is controlling that panel, not FreeSync itself. Hence it's an issue outside of AMD's control. Stop lying and saying it is an issue with AMD. Nvidia fanboys lying, gotta keep them on the straight and narrow.
