FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.
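
To make that shared behavior concrete, here's a minimal sketch (in Python, purely illustrative and not either vendor's actual implementation) of how a variable refresh display times its scanout: the interval between refreshes is simply clamped to the panel's supported window, assumed here to be a hypothetical 40-144Hz range.

```python
# Illustrative model of variable refresh timing (not NVIDIA's or AMD's code).
# The display refreshes as soon as a new frame is ready, but the time between
# refreshes is clamped to the panel's supported window (hypothetical 40-144Hz).

MIN_HZ, MAX_HZ = 40.0, 144.0
MIN_INTERVAL = 1.0 / MAX_HZ   # can't refresh faster than the panel's maximum
MAX_INTERVAL = 1.0 / MIN_HZ   # can't hold the current frame longer than this

def next_scanout(last_scanout, frame_ready):
    """Return the time at which the panel actually displays the new frame."""
    earliest = last_scanout + MIN_INTERVAL
    latest = last_scanout + MAX_INTERVAL
    if frame_ready <= earliest:
        return earliest       # frame arrived early: wait for the panel (rate capped)
    if frame_ready <= latest:
        return frame_ready    # inside the dynamic range: display immediately
    return latest             # frame is late: the panel shows the old frame again

# Example: last refresh at t=0s, new frame done at t=0.012s -> shown at 0.012s,
# an effective ~83Hz for that frame instead of waiting for a fixed 60Hz tick.
print(next_scanout(0.0, 0.012))
```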

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the cost factor, this means any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that adaptive sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module handles the regular core features of the display like the OSD, but it’s not as full featured as a “normal” scaler.

In contrast, Adaptive Sync (the DisplayPort 1.2a feature AMD uses to enable FreeSync) will likely find its way into many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard can be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio input over DisplayPort, and there’s mention of color processing as well, though this is somewhat misleading. NVIDIA’s G-SYNC module supports color LUTs (Look Up Tables), but it doesn’t support multiple color options like the “Warm, Cool, Movie, User, etc.” modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we’ve looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect on the next page.

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate falls outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA’s goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) simply isn’t possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync lets you choose either VSYNC off or VSYNC on behavior when frame rates fall outside the dynamic range. With VSYNC off you could still get image tearing, but at higher frame rates there would be a reduction in input latency. Again, this isn’t necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having the choice is never a bad thing.
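
As a rough way to visualize the difference, the sketch below (again illustrative only, assuming a hypothetical 144Hz panel maximum) boils the out-of-range behavior down to a single branch: with a VSYNC-on policy the frame is held until the next allowed scanout, while with FreeSync's optional VSYNC-off policy it goes out immediately at the cost of possible tearing.

```python
# Out-of-range presentation policies, simplified (not vendor code).
# Inside the dynamic range both technologies behave identically; the branch
# below only matters when frames finish faster than the panel's maximum rate.

MAX_HZ = 144.0                      # hypothetical panel maximum
MIN_FRAME_TIME = 1.0 / MAX_HZ

def present(frame_done, last_scanout, vsync_above_range=True):
    """Return (scanout_time, may_tear) for a finished frame."""
    next_allowed = last_scanout + MIN_FRAME_TIME
    if frame_done >= next_allowed:
        return frame_done, False     # within the dynamic range: no tearing either way
    if vsync_above_range:
        return next_allowed, False   # VSYNC-on style: hold the frame, cap the rate
    return frame_done, True          # VSYNC-off style: lower latency, may tear

# A frame finished 3ms after the last scanout (~333 FPS on a 144Hz panel):
print(present(0.003, 0.0, vsync_above_range=True))   # (0.00694..., False)
print(present(0.003, 0.0, vsync_above_range=False))  # (0.003, True)
```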

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook, and there’s also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync in Intel’s processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus, so it’s not a problem for that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet, but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • willis936 - Thursday, March 19, 2015 - link

    I would like an actual look at added input latency from these adaptive sync implementations. Nobody has even mentioned it, but there's a very real possibility that either the graphics TX or the monitor's scaler has to do enough thinking to cause a significant delay from when pixels come in to when they're displayed on the screen. Why isn't the first issue to be scrutinized the very thing that these technologies seek to solve?
  • mutantmagnet - Thursday, March 19, 2015 - link

    Acer already posted the MSRP

    http://us.acer.com/ac/en/US/content/model/UM.HB0AA...

    $800
  • mutantmagnet - Thursday, March 19, 2015 - link

    I forgot to mention it's already on sale in Europe.
  • JarredWalton - Thursday, March 19, 2015 - link

    Google was failing me last night, though granted I haven't slept much in the past two days.
  • ezridah - Thursday, March 19, 2015 - link

    It's odd that on their product page they don't mention G-Sync or the refresh rate anywhere... It's like they don't want to sell it or something.
  • eanazag - Thursday, March 19, 2015 - link

    My monitors last longer than 5 years. Basically I keep them till they die. I have a 19" 1280x1024 on the shared home computer I'm considering replacing. I'd be leaning towards neither or Freesync monitors.

    I currently am sporting AMD GPUs, but I am one of those who go back and forth between vendors, and I don't think it is as small a minority as was assumed. I bought two R9 290's last February. If I was buying right now, I'd be getting a GTX 970. I do like the GeForce Experience software. I'm still considering a GTX 750 Ti.

    I'm not totally sold on what AMD has in the market at the moment. I have a lot of heat concerns running them in CrossFire, and the wattage is higher than I like. The original 290 blowers sucked. I'd like quality blower cards again, like Nvidia's.
  • Dorek - Thursday, March 19, 2015 - link

    Wait, you didn't just say that you use two R9 290s on a 1280x1024 monitor, right?
  • medi03 - Thursday, March 19, 2015 - link

    I don't get how the 970 is better than the 290X. It is slower and more expensive:
    http://www.anandtech.com/show/8568/the-geforce-gtx...

    And total system consumption is lower by about 20-25% (305W on the 970 vs. 365W on the 290X). No big deal
  • JarredWalton - Thursday, March 19, 2015 - link

    It's not "better" but it is roughly equivalent. I've got benchmarks from over 20 games. The average for the 290X at 2560x1440 "Ultra" across those games is 57.4 FPS, while the average for the 970 is 56.8 FPS. Your link to Crysis: Warhead is one title where AMD wins, but I could counter with GRID 2/Autosport and Lords of the Fallen where NVIDIA wins. And of the two GPUs, the 970 will overclock more than the 290X if you want to do that.
  • TallestJon96 - Thursday, March 19, 2015 - link

    I'm an NVIDIA user, but I'm happy to see the proprietary G-SYNC get beat down. I've got a 1080p 144Hz non-G-SYNC panel, so I won't be upgrading for 3-5 years, and hopefully 4K and FreeSync will both be standard by then.
