FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Besides cost factors, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course the reason NVIDIA went with a proprietary module was because adaptive sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. Basically, the G-SYNC module controls all the regular core features of the display like the OSD, but it’s not as full featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio input over DisplayPort, and there’s mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but they don't support multiple color options like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.
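A color LUT of the kind mentioned above is conceptually simple: each 8-bit input value for a channel is mapped through a table to a corrected output value. The sketch below is purely illustrative (a plain 2.2 gamma curve, not a calibrated sRGB table like the ones a G-SYNC module or scaler would actually carry):

```python
# Toy illustration of a 1D color LUT (Look Up Table).
# The 2.2 gamma curve here is an assumption for the example;
# real displays use factory- or user-calibrated tables.

GAMMA = 2.2

# 256-entry table mapping each 8-bit input level to a corrected level.
lut = [round(255 * (v / 255) ** (1 / GAMMA)) for v in range(256)]

def correct_pixel(r, g, b):
    """Apply the same 1D LUT to each channel of an 8-bit RGB pixel."""
    return (lut[r], lut[g], lut[b])

print(correct_pixel(0, 128, 255))
```

The per-channel table is all a 1D LUT can express; the "Warm, Cool, Movie" style presets AMD alludes to are essentially different tables (or matrix adjustments on top), which is what NVIDIA's module omits in favor of a single sRGB target.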

One other feature that differentiates FreeSync from G-SYNC is how each handles frame rates outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled whenever frame rates are above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is simply capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior if your frame rates are too high/low. With VSYNC off, you could still get image tearing but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
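The two policies described above can be summarized as a rough mental model. This is an illustrative sketch, not actual driver code; the range limits and the `policy` parameter are assumptions for the example:

```python
# Rough mental model of how a variable-refresh display reacts when the
# GPU's frame rate falls outside the panel's dynamic range.
# Illustrative only -- not how either driver is actually implemented.

def effective_behavior(fps, vrr_min=40, vrr_max=144, policy="vsync_on"):
    """Return how frames are presented at a given frame rate.

    Inside the dynamic range, both G-SYNC and FreeSync refresh the
    display as each new frame completes. Outside it, G-SYNC always
    acts as though VSYNC is on, while FreeSync lets the user choose
    VSYNC-on or VSYNC-off behavior.
    """
    if vrr_min <= fps <= vrr_max:
        return "adaptive"            # refresh on frame completion
    if policy == "vsync_on":
        # Above the range the frame rate is capped at the maximum
        # refresh; below it, frames wait for a refresh (stutter).
        return "capped" if fps > vrr_max else "vsync_stutter"
    # policy == "vsync_off": no cap and lower input latency,
    # but tearing can reappear.
    return "tearing_possible"

print(effective_behavior(60))
print(effective_behavior(200))                      # G-SYNC-style cap
print(effective_behavior(200, policy="vsync_off"))  # FreeSync option
```

The point of the sketch is the extra branch: FreeSync exposes the `vsync_on`/`vsync_off` choice to the user, whereas G-SYNC hard-codes the first policy.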

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there’s also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • Hrel - Thursday, March 19, 2015 - link

    I'm so sick of this proprietary shit, when companies want to do something new why not work with the other companies in the industry to come up with the best solution for the problem and make it universal? Customers NEVER jump onto a technology that only works from one company, we aren't going to commit to a monopoly. I get that they want a competitive edge in the market, but it literally never works that way. What happens is they spend all this money on R&D, marketing, prototyping, waste everyone's time with marketing and reviews only to have a handful of people pick it up (probably too few to even break even) and then it stops growing right there until a universal standard comes out that the industry adopts as a whole.

    Just fucking stop wasting everyone's time and money and choose cooperation from the start!
  • HunterKlynn - Thursday, March 19, 2015 - link

    For the record, AMD's solution here *is* an attempt at an open standard. GSync is the proprietary one.
  • dragonsqrrl - Thursday, March 19, 2015 - link

    ...
  • Black Obsidian - Thursday, March 19, 2015 - link

    It's hard to be more open than being an official (albeit optional) part of the DisplayPort spec itself.
  • DominionSeraph - Thursday, March 19, 2015 - link

    I'm sure AMD will be happy to give you the masks to the 290X and 390X, if you only ask. "I'm sick of there only being two players in the market. Why don't you let me in? I'm sure I could sell your products for less than your prices!"
  • lordken - Thursday, March 19, 2015 - link

    next time when you try to make someone look stupid, try not to look like a fool yourself. Another dogma believer who thinks the world will end without patents. Same as copyright believers who think music & entertainment will cease to exist without copyright...
    If you think about it, you can see that patents can actually slow down technology advancement quite a bit. You can even see that today with Intel CPUs: since AMD cannot catch up and CPUs are patent locked, we are left to be milked by Intel with minimal performance gains between generations. If AMD either had better CPUs or could simply copy the good parts from Intel's design, imho we would be much further along in performance today. Also look around you and see that 3D printing boom? Guess what: a few years back and last year the patents expired, and that's what allowed it. Yes, 3D printing was invented 30 years ago, yet it only gets to your desk today. So much for patent believers.

    btw even if AMD gave you their blueprints, what would you do? Start selling the R390X tomorrow? Manufactured out of thin air? By the time you could actually sell an R390X we would be at the 590 generation. Only nvidia/intel could possibly benefit from it any sooner (which isn't necessarily a bad thing)
  • lordken - Thursday, March 19, 2015 - link

    @Hrel: a) because most corporations are run by greedy bastards imho b) today the managers of said corporations can't employ common sense and are disconnected from reality, making stupid/bad decisions. I see it in the big corporation I work for...
    so using your brain and a "pro-consumer" way of thinking is forbidden.
  • Flunk - Thursday, March 19, 2015 - link

    Please just support the Adaptive Sync standard Nvidia, your G-Sync implementation doesn't have any benefits over it. You don't need to call it FreeSync, but we all need to settle on one standard, because if I have to be locked into one brand of GPU based on my monitor, it's not going to be the brand that isn't using industry standards.
  • Murloc - Thursday, March 19, 2015 - link

    a monitor lasts much longer than a GPU and costs more too for most users out there, so yeah, standards win.

    They can call it Adaptive Sync, as that's what it is: a DisplayPort standard.
  • praeses - Thursday, March 19, 2015 - link

    It would be interesting to see input lag comparisons and videos of panning scenes in games that would typically cause tearing captured at higher speed played in slow motion.
