FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.
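To make the basic mechanism concrete, here is a minimal sketch (in Python, using made-up numbers rather than either vendor's actual implementation) of when frames from a steady 40 FPS game would appear on a fixed 60 Hz display with VSYNC versus an adaptive refresh display with a hypothetical 30–144 Hz range:

```python
import math

# Hypothetical numbers: a 60 Hz fixed-refresh panel vs. a variable-refresh
# panel with a 30-144 Hz dynamic range, fed frames that each take 25 ms
# (a steady 40 FPS) to render.
FIXED_INTERVAL_MS = 1000 / 60        # ~16.7 ms between fixed refreshes
VRR_MIN_INTERVAL_MS = 1000 / 144     # panel can't refresh faster than this
VRR_MAX_INTERVAL_MS = 1000 / 30      # panel must refresh by this point

def fixed_refresh_time(frame_ready_ms):
    """With VSYNC on a fixed-refresh display, a finished frame waits for
    the next refresh boundary before it is shown."""
    return math.ceil(frame_ready_ms / FIXED_INTERVAL_MS) * FIXED_INTERVAL_MS

def variable_refresh_time(frame_ready_ms, last_refresh_ms):
    """With adaptive refresh, the panel scans out as soon as the frame is
    ready, provided the time since the last refresh stays inside the
    panel's supported window."""
    elapsed = frame_ready_ms - last_refresh_ms
    elapsed = min(max(elapsed, VRR_MIN_INTERVAL_MS), VRR_MAX_INTERVAL_MS)
    return last_refresh_ms + elapsed

last_vrr = 0.0
for i in range(1, 5):
    ready = i * 25.0                 # frame finished rendering at this time
    fixed = fixed_refresh_time(ready)
    last_vrr = variable_refresh_time(ready, last_vrr)
    print(f"frame {i}: ready {ready:6.1f} ms | "
          f"fixed 60 Hz shows it at {fixed:6.1f} ms | "
          f"VRR shows it at {last_vrr:6.1f} ms")
```

In the fixed-refresh column some frames slip to the next 16.7 ms boundary (the judder both technologies are designed to eliminate), while the variable-refresh column simply tracks the frame times.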

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the cost factor, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that Adaptive Sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module controls all the regular core features of the display like the OSD, but it’s not as full featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio input over DisplayPort, and there’s mention of color processing as well, though that last point is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't offer multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.
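For reference, a color LUT is just a precomputed per-channel mapping applied to every pixel before scan-out. The sketch below is a generic illustration, not the G-SYNC module's firmware; the 2.2 gamma target is simply an example curve:

```python
# Generic illustration of a per-channel color LUT (Look Up Table): every
# possible 8-bit input value maps to a precomputed output value, so color
# correction costs one table lookup per channel at scan-out time.
# The 2.2 gamma target here is only an example, not NVIDIA's actual tuning.
GAMMA = 2.2
lut = [round(255 * (i / 255) ** (1 / GAMMA)) for i in range(256)]

def apply_lut(pixel):
    """Map an (R, G, B) tuple of 8-bit values through the table."""
    r, g, b = pixel
    return (lut[r], lut[g], lut[b])

print(apply_lut((32, 128, 255)))   # dark/mid values are lifted; 255 stays 255
```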

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate falls outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) simply isn't possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior if your frame rates are too high/low. With VSYNC off, you could still get image tearing but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
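Here is a minimal sketch of that policy difference, assuming a hypothetical 40–144 Hz panel; the function and policy names are illustrative rather than anything exposed by AMD's or NVIDIA's drivers:

```python
# Illustrative sketch of how a driver could treat frames whose render times
# fall outside the panel's dynamic refresh window. All names and numbers are
# hypothetical; neither vendor's driver code is public.
PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144
MIN_INTERVAL_MS = 1000 / PANEL_MAX_HZ   # ~6.9 ms: panel can't refresh faster
MAX_INTERVAL_MS = 1000 / PANEL_MIN_HZ   # 25 ms: panel must refresh by then

def present(frame_interval_ms, out_of_range_policy):
    """Decide what to do with a frame given the time since the last one.

    out_of_range_policy:
      "vsync_on"  - wait for the next refresh boundary (G-SYNC-like:
                    never tears, but can stutter below the range and caps
                    the frame rate above it)
      "vsync_off" - flip immediately (the extra FreeSync option: lower
                    input latency above the range, tearing is possible)
    """
    if MIN_INTERVAL_MS <= frame_interval_ms <= MAX_INTERVAL_MS:
        return "refresh now: inside the dynamic range, no tearing, no added lag"
    if out_of_range_policy == "vsync_on":
        return "hold the frame for the next refresh boundary"
    return "flip mid-scan: minimal latency, tearing possible"

print(present(5.0, "vsync_off"))    # >200 FPS, above the 144 Hz upper bound
print(present(30.0, "vsync_on"))    # ~33 FPS, below the 40 Hz lower bound
```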

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook, and there's the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • chizow - Thursday, March 19, 2015 - link

    See link: http://www.pcper.com/image/view/54234?return=node%...

    Also: still unaddressed concerns with how and why FreeSync is still tied to Vsync and how this impacts latency.
  • happycamperjack - Thursday, March 19, 2015 - link

    The ghosting problem actually has nothing to do with the G-Sync and FreeSync technologies like the article said, but has more to do with the components in the monitor. So if Asus made a FreeSync version of the same ROG Swift monitor, there would've been no ghosting, just like the G-SYNC version. So your example is invalid.
  • chizow - Friday, March 20, 2015 - link

    @happycamperjack. Again, incorrect. Why is it that panels from the SAME manufacturers, that possibly use the same panels even, using the same prevailing panel technologies of this time, exhibit widely different characteristics under variable refresh? Maybe that magic G-Sync module that AMD claims is pointless is actually doing something....like controlling the drive electronics that control pixel response variably in response to changing framerates. Maybe AMD needs another 18 months to refine those scalers with the various scaler mfgs?

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    "Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

    BenQ for example makes a fine G-Sync monitor, and multiple high refresh 3D Vision monitors well known for their lack of ghosting. Are you going to tell me that suddenly they are using inferior panel tech that can't handle ghosting? This is 2015 and TN panels we are talking about here right? This kind of ghosting has not been seen since circa 2007 when PMVA was all the rage.
  • AnnihilatorX - Thursday, March 19, 2015 - link

    chizow stop your biased preconceptions and actually read the article
  • AnnihilatorX - Thursday, March 19, 2015 - link

    I will summarize it for you in case your prejudice clouds your comprehension

    1) At no point does the article find any performance advantage for FreeSync or G-Sync (AMD claims a 0.5-1% advantage, but that's too small to detect, so we disregard that)
    2) FreeSync has better monitor choices, including IPS and ones with better specs in general
    3) FreeSync monitors are about USD 200 cheaper, almost half the cost of a decent graphics card
    4) FreeSync monitors have on-screen displays (OSDs) that work; G-Sync monitors don't, due to the implementation
    5) FreeSync has better potential for future support, especially in laptops, because of zero royalty fees and only a minor update to hardware
    6) FreeSync gives users the option to choose whether they want Vsync enabled or not; G-Sync locks Vsync on. This means the user can have better latency if they can stand tearing. The important thing is the option – having the option is always advantageous
    7) AMD claims FreeSync works from 9Hz-240Hz, whereas G-Sync only works from 30Hz to 144Hz.
  • chizow - Thursday, March 19, 2015 - link

    @AnnihilatorX

    1) You assume the tests conducted here are actually relevant.
    2) No, they don't. Nvidia has an IPS in the works that may very well be the best of all, but in the meantime, it is obvious that for whatever reason the FreeSync panels are subpar compared to the G-Sync offerings. Courtesy of PCPER: http://www.pcper.com/image/view/54234?return=node%...
    3) Sure they are cheaper, but they also aren't as good, and certainly not "Free", as there is a clear premium compared to non-FreeSync panels, and certainly no firmware flash is going to change that. Also, that $200 is going to have to be spent on a new GCN 1.1+ AMD graphics card anyway, as anyone who doesn't already own a newer AMD card will have to factor that into their decision. Meanwhile, G-Sync supports everything from Nvidia from Kepler on. Nice and tidy (and dominant in terms of installed user base).
    4) OSDs, scalers, and such add input lag; while having multiple inputs is nice, OSDs are a feature gaming purists can live without (see: all the gaming direct input modes on newer LCDs that bypass the scalers).
    5) Not if they're tied to AMD hardware. They can enjoy a minor share of the dGPU graphics market as their TAM.
    6) Uh, this is nonsense. FreeSync is still tied to Vsync in ways THIS review certainly doesn't cover in depth, but that's certainly not going to be a positive, since Vsync inherently adds latency. Meanwhile, Vsync is never enabled with G-Sync, and while there is more latency at the capped FPS, it is a driver-side cap and not Vsync.
    7) Well, AMD can claim all they like that it goes as low as 9Hz, but as we have seen the implementation is FAR worse, falling apart below 40FPS, where blurring and tearing set in, the image basically falls apart, and the hundreds of dollars you invested basically become a huge waste. Meanwhile, G-Sync shows none of these issues, and I play some MMOs that regularly dip into the 20s in crowded cities, with no sign of any of this.

    So yes, as I've shown, there are still many issues with FreeSync that need to be addressed and that show it is clearly not as good as G-Sync. But like I said, this is a good introduction to the tech that Nvidia invented some 18 months ago; maybe with another 18 months AMD will make more refinements and close the gap?
  • lordken - Thursday, March 19, 2015 - link

    5) What? Where did you get the idea that Adaptive Sync is tied to AMD hardware? That's pretty bullshit; if it were, it wouldn't be standardized by VESA, right?
    The fact that today only AMD hardware supports it (because they implemented it first) doesn't validate your claim that it is tied to AMD. Intel/nvidia/... can implement it in their products if they want.
    It's like saying that if, for example, LG released the first monitor to support DP 1.3, that would imply DP 1.3 is LG-tied, lol.
    On the other hand, G-Sync is Nvidia-tied. But you know this, right?
  • chizow - Thursday, March 19, 2015 - link

    @lordken, who else supports FreeSync? No one but AMD. Those monitor makers can ONLY expect to get business from a minor share of the graphics market given that is going to be the primary factor in paying the premium for one over a non-FreeSync monitor. This is a fact.
  • anubis44 - Tuesday, March 24, 2015 - link

    VESA supports FreeSync, which means Intel will probably support it, too. Intel graphics drive far more computers than AMD or nVidia, which means that if Intel does support it, nVidia is euchred, and even if Intel doesn't support it, many more gamers will choose free over paying an extra $150-$200 for a gaming setup. Between the 390-series coming out shortly and the almost guaranteed certainty that some hacked nVidia drivers will show up on the web to support FreeSync, G-Sync is a doomed technology. Period.
  • chizow - Tuesday, March 24, 2015 - link

    Intel has no reason to support FreeSync, and they have shown no interest either. Hell they showed more interest in Mantle but as we all know, AMD denied them (so much for being the open hands across the globe company).

    But yes I'm hoping Nvidia does support Adaptive Sync as their low-end solution and keeps G-Sync as their premium solution. As we have seen, FreeSync just isn't good enough but at the very least it means people will have even less reason to buy AMD if Nvidia supports both lower-end Adaptive Sync and premium G-Sync monitors.
