FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the cost factor, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that Adaptive Sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module controls all the regular core features of the display like the OSD, but it’s not as full-featured as a “normal” scaler.

In contrast, Adaptive Sync is part of the DisplayPort 1.2a standard (it’s the mechanism AMD uses to enable FreeSync), and it will likely find its way into many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync-branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync itself requires DisplayPort, as Adaptive Sync doesn’t work over DVI, HDMI, or VGA (D-Sub). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and there’s mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't offer multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect on the next page.
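For readers unfamiliar with the term, a LUT is simply a table that remaps each incoming color value to an output value. Here's a minimal sketch of a 1D per-channel LUT, assuming a basic gamma curve; the gamma value and table size are illustrative, not the parameters either vendor actually uses:

```python
# Hypothetical sketch of a 1D per-channel color LUT (Look Up Table), the kind of
# color processing a display scaler or the G-SYNC module can apply.
# The gamma value and table size are illustrative assumptions.

def build_gamma_lut(gamma=2.2, size=256):
    """Build a per-channel LUT mapping 8-bit inputs to gamma-adjusted 8-bit outputs."""
    return [round(255 * (i / (size - 1)) ** (1.0 / gamma)) for i in range(size)]

def apply_lut(pixel, lut):
    """Remap each 8-bit channel of an (R, G, B) pixel through the table."""
    return tuple(lut[channel] for channel in pixel)

lut = build_gamma_lut()
print(apply_lut((64, 128, 192), lut))  # mid-tones are lifted by the 1/gamma curve
```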

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate falls outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior when frame rates are too high or too low. With VSYNC off you could still get image tearing, but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having the choice is never a bad thing.
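To make the distinction concrete, here's a minimal sketch of the two out-of-range policies described above. The refresh window limits, policy names, and messages are illustrative assumptions, not the actual driver logic of either AMD or NVIDIA:

```python
# Illustrative sketch of out-of-range handling; panel limits are assumed values.

PANEL_MIN_HZ = 40    # lower bound of the variable refresh window (illustrative)
PANEL_MAX_HZ = 144   # upper bound of the variable refresh window (illustrative)

def present(frame_rate_fps, out_of_range_policy="vsync_on"):
    """Describe how a frame is presented for a given game frame rate."""
    if PANEL_MIN_HZ <= frame_rate_fps <= PANEL_MAX_HZ:
        return "adaptive refresh: the display refreshes as each frame completes"
    if out_of_range_policy == "vsync_on":
        # G-SYNC-style handling: never tear; cap at the maximum refresh rate
        # and accept VSYNC-like stutter below the minimum.
        return "fixed refresh: frame rate capped above the window, stutter below it"
    # FreeSync with VSYNC off: present immediately for lower input latency,
    # at the cost of visible tearing outside the window.
    return "immediate presentation: lower latency, possible tearing"

print(present(90))                                    # inside the window
print(present(20, out_of_range_policy="vsync_on"))    # below the window, VSYNC on
print(present(200, out_of_range_policy="vsync_off"))  # above the window, VSYNC off
```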

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • silverblue - Saturday, March 21, 2015 - link

    I can certainly let you off most of those, but third party activities shouldn't count, so you can subtract 6 and 12. Additionally, 13 can be picked apart as the 295X2 showed that AMD can present a high quality cooler, and because I believe lumping the aesthetic qualities of a cooler in with heat and noise is a partial falsehood (admit it - you WILL have been thinking of metal versus plastic shrouds). I also don't agree with you on 11; at least, not if you move back past the 2XX generation as AMD had more aggressive bundles back then. 8 is subjective but NVIDIA usually gets the nod here.

    Also, some of your earlier items are proprietary tech, which I could always tease you about, as it's not as if they couldn't license any of this out. ;)

    I'll hand it to you and credit you with your dozen.
  • chizow - Saturday, March 21, 2015 - link

    And I thank you for not doing the typical dismissive approach of "Oh I don't care about those features" that some on these forums might respond with.

    I would still disagree on 6 and 12 though, ultimately they are still a part of Nvidia's ecosystem and end-user experience, and in many cases, Nvidia affords them the tools and support to enable and offer these value-add features. 3rd party tools for example, they specifically take advantage of Nvidia's NVAPI to access hardware features via driver and Nvidia's very transparent XML settings to manipulate AA/SLI profile data. Similarly, every feature EVGA offers to end users has to be worth their effort and backed by Nvidia to make business sense for them.

    And 13, I would absolutely disagree on that one. I mean we see the culmination of Nvidia's cooling technology, the Titan NVTTM cooler, which is awesome. Having to resort to a triple slot water cooled solution for a high-end graphics card is terrible precedent imo and a huge barrier to entry for many, as you need additional case mounting and clearance which could be a problem if you already have a CPU CLC as many do. But that's just my opinion.

    AMD did make a good effort with their Gaming Evolved bundles and certainly offered better than Nvidia for a brief period, but it's pretty clear their marketing dollars dried up around the same time they cut that BF4 Mantle deal, and their current financial situation hasn't allowed them to offer anything compelling since. But I stand by that bulletpoint: Nvidia typically offers the more relevant and attractive game bundle at any given time.

    One last point in favor of Nvidia, is Optimus. I don't use it at home as I have no interest in "gaming" laptops, but it is a huge benefit there. We do have them on powerful laptops at work however, and the ability to "elevate" an application to the Nvidia dGPU on command is a huge benefit there as well.
  • anubis44 - Tuesday, March 24, 2015 - link

    @chizow:
    But hey kids, remember, after reading this 16 point PowerPoint presentation where he points out the superiority of nVidia using detailed arguments like "G-Sync" and "GRID" as strengths, chizow DOES NOT WORK FOR nVidia! He is not sitting in the marketing department in Santa Clara, California, with a group of other marketing mandarins running around, grabbing factoids for him to type in as responses to chat forums. No way!

    Repeat after me, 'chizow does NOT work for nVidia.' He's just an ordinary, everyday psychopath who spends 18 hours a day at the keyboard responding to every single criticism of nVidia, no matter how trivial. But he does NOT work for nVidia! Perish the thought! He just does it out of his undying love for the green goblin.
  • chizow - Tuesday, March 24, 2015 - link

    But hey remember AMD fantards, there's no reason that the overwhelming majority of the market prefers Nvidia, those 16 things I listed don't actually mean anything if you prefer subpar product and don't demand better, and you continually choose to ignore the obvious one product supports more features and the other doesn't. But hey, just keep accepting subpar products and listen to AMD fanboys like anubis44, don't give in to the reality the rest of us all accept as fact.
  • sr1030nx - Saturday, March 21, 2015 - link

    Only if they were NVIDIA branded speaker cables 😉
  • Darkito - Friday, March 20, 2015 - link

    False
  • Darkito - Friday, March 20, 2015 - link

    False, it's indistinguishable "Within the supported refresh rate range" as per this review. What happens outside the VRR window however, and especially under it, is incredibly different. With G-sync, if you get 20 fps it'll actually duplicate frames and tune the monitor to 40Hz, which means smooth gaming at sub-30Hz refresh rates (well, as smooth as 20fps can be). With FreeSync, it'll just fall back to v-sync on or off, with all the stuttering or tearing that involves. That means that if your game ever falls below the VRR window on FreeSync, image quality falls apart dramatically. And according to PCPer, this isn't just something AMD can fix with a driver update because it requires the frame buffer and logic on the G-Sync module!

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    Take note that the LG panel tested actually has a VRR window lower bound of 48Hz, so image quality starts falling apart if you dip below 48fps, which is clearly unacceptable.
  • AdamW0611 - Sunday, March 22, 2015 - link

    Yah, just like R.I.P Direct X, Mantle will rule the day, now AMD is telling developers to ignore Mantle. Gsync is great, and those of us who prefer drivers being updated the same day games are released will stick with Nvidia, while months later AMD users will be crying that games still don't work right for them.
  • anubis44 - Tuesday, March 24, 2015 - link

    Company of Heroes 2 worked like shit for nVidia users for months after release, while my Radeon 7950 was pulling more FPS than a Titan card. To this day, Radeons pull better, and smoother FPS than equivalently priced nVidia cards in this, my favourite game. The GTX970 is still behind the R9 290 today. Is that the 'same day' nVidia driver support you're referring to?
  • chizow - Tuesday, March 24, 2015 - link

    More BS from one of the biggest AMD fanboys on the planet, an AMD *CPU* fanboy nonetheless. CoH2 ran faster on Nvidia hardware from Day1, and also runs much faster on Intel CPUs, so yeah, as usual, you're running the slower hardware in your favorite game simply bc you're a huge AMD fanboy.

    http://www.techspot.com/review/689-company-of-hero...
    http://www.anandtech.com/show/8526/nvidia-geforce-...
