FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.
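
The core idea both technologies share is simple enough to express in a few lines. Below is a minimal sketch in Python of "refresh as soon as the frame is ready, within the supported range"; the panel limits and names are illustrative assumptions on our part, not vendor code.

```python
# Minimal sketch of the shared idea behind G-SYNC and FreeSync: within the
# panel's supported window the display refreshes as soon as a new frame is
# ready, rather than on a fixed clock. The limits below are hypothetical.

PANEL_MIN_HZ = 40    # assumed lower bound of the dynamic refresh range
PANEL_MAX_HZ = 144   # assumed upper bound of the dynamic refresh range

MIN_INTERVAL = 1.0 / PANEL_MAX_HZ   # fastest refresh the panel allows (~6.9 ms)
MAX_INTERVAL = 1.0 / PANEL_MIN_HZ   # slowest refresh the panel allows (25 ms)

def refresh_interval(frame_time: float) -> float:
    """Time the panel waits before scanning out the next frame.

    Inside the supported window the refresh simply tracks the frame time,
    so a 50 FPS game is shown at 50Hz with no tearing and no added latency.
    """
    return min(max(frame_time, MIN_INTERVAL), MAX_INTERVAL)

print(refresh_interval(0.020))   # 50 FPS frame -> 0.02 s, shown as soon as it's ready
print(refresh_interval(0.004))   # 250 FPS frame -> clamped to ~0.0069 s (the 144Hz limit)
```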

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Besides adding cost, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module is that adaptive sync didn't exist when they started working on G-SYNC, so they had to create their own protocol. The G-SYNC module controls all the regular core features of the display like the OSD, but it's not as full featured as a "normal" scaler.

In contrast, Adaptive Sync (which is what AMD uses to enable FreeSync) is part of the DisplayPort 1.2a standard and will likely become a feature of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn't have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn't work with DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio input over DisplayPort, and there's mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't support multiple color presets like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We'll look at the "Performance Penalty" aspect on the next page as well.

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate falls outside of the dynamic refresh range. With G-SYNC enabled, the system behaves as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30 FPS you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) isn't possible – the frame rate is simply capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior when the frame rate is too high or too low. With VSYNC off you could still get image tearing, but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I'd assume NVIDIA could probably rework the drivers to change the behavior if needed – but having the choice is never a bad thing.
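
To make the contrast concrete, here's a small sketch of the decision described in the last two paragraphs. It models the observed behavior only; the enum and function names are hypothetical, and neither vendor's actual driver logic is public.

```python
from enum import Enum

# Rough sketch of the out-of-range behavior described above. Names and
# structure are assumptions for illustration only.

class OutOfRangePolicy(Enum):
    VSYNC_ON = "vsync on"    # wait for the next refresh: no tearing, possible stutter/latency
    VSYNC_OFF = "vsync off"  # present immediately: possible tearing, lower input latency

def presentation_mode(fps, panel_min_hz, panel_max_hz,
                      freesync_choice=OutOfRangePolicy.VSYNC_OFF):
    """Summarize how each technology presents frames at a given frame rate."""
    if panel_min_hz <= fps <= panel_max_hz:
        # Inside the window both simply refresh when the frame is ready.
        return {"G-SYNC": "adaptive", "FreeSync": "adaptive"}
    # Outside the window G-SYNC always behaves as though VSYNC is on,
    # while FreeSync follows the user's VSYNC on/off selection.
    return {"G-SYNC": OutOfRangePolicy.VSYNC_ON.value,
            "FreeSync": freesync_choice.value}

# 200 FPS on a 40-144Hz panel: G-SYNC caps the frame rate (VSYNC-on behavior),
# while FreeSync with VSYNC off tears but reduces input latency.
print(presentation_mode(200, 40, 144))
```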

There's another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it's unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it's clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync in Intel's processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it's worth pointing out that the ASUS G751 doesn't support Optimus, so it's not a problem for that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There's no clear direction yet, but there's definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

Comments

  • Welsh Jester - Friday, March 20, 2015 - link

    To add to my last post, I think 1440p FreeSync screens will be good for people with only one higher end GPU, since they won't have to deal with the micro stuttering that multiple cards bring: a smooth experience at 40+ FPS. Good news.
  • FlushedBubblyJock - Saturday, March 21, 2015 - link

    I was just about ready to praise AMD, but then I see "must have and use DisplayPort"...
    Well, at least it appears AMD got it to work on their crap, and without the massive Hz restrictions seen in earlier reviews.
    So, heck, AMD might have actually done something right for once?
    I am cautious - I do expect some huge frikkin problem we can't see right now --- then we will be told to ignore it, then we will be told it's nothing, then there's a workaround fix, then after a couple years it will be fully admitted and AMD will reluctantly "fix it" in "the next release of hardware".
    Yes, that's what I expect, so I'm holding back the praise for AMD because I've been burned to a crisp before.
  • cykodrone - Saturday, March 21, 2015 - link

    I actually went to the trouble to make an account to say sometimes I come here just to read the comments; some of the convos have me rolling on the floor busting my guts laughing. Seriously, this is free entertainment at its best! Aside from that, the cost of this Nvidia e-penis would feed 10 starving children for a month. I mean seriously, at what point is it overkill? By that I mean, is there any game out there that would absolutely not run well enough on a slightly lesser card at half the price? When I read this card alone requires 250W, my eyes popped out of my head. Holy electric bill, Batman, but I guess if somebody has a 1G to throw away on an e-penis, they don't have electric bill worries. One more question: what kind of CPU/motherboard would you need to back this sucker up? I think this card would be wasted without at least the latest i7 Extreme(ly overpriced); can you imagine some tool dropping this in an i3? What I'm saying is, this sucker would need an expensive 'bed' too, otherwise you'd just be wasting your time and money.
  • cykodrone - Saturday, March 21, 2015 - link

    This got posted to the wrong story; it was meant for the NVIDIA GeForce GTX Titan X review. My humble apologies.
  • mapesdhs - Monday, March 23, 2015 - link

    No less amusing though. ;D

    Btw, I've tested 1/2/3x GTX 980 on a P55 board, and it works a lot better than one might think. I also tested 1/2x 980 with an i5 760; again it works quite well. Plus, the heavier the game, the less they tend to rely on main CPU power, especially as the resolution/detail rises.

    Go to 3dmark.com, Advanced Search, Fire Strike, enter i7 870 or i5 760 & search, and my whackadoodle results come straight back. :D I've tested with a 5GHz 2700K and 4.8GHz 3930K as well; the latter are quicker of course, but not that much quicker, less than most would probably assume.

    Btw, the Titan X is more suited to solo pros doing tasks that are mainly FP32, like After Effects. However, there's always a market for the very best, and I know normal high street stores make their biggest profit margins on premium items (and the customers who buy them), so it's an important segment; it drives everything else in a way.

    Ian.
  • mapesdhs - Monday, March 23, 2015 - link

    (Damn, still no edit, I meant to say the 3-way testing was with an i7 870 on a P55)
  • Vinny DePaul - Sunday, March 22, 2015 - link

    I am a big fan of open standards. The more the merrier. I stay with nVidia because they support their products better. I was a big fan of AMD GPUs, but the drivers were so buggy. nVidia updates their drivers so quickly and the support is just a lot better. I like G-Sync; it's worth the extra money. I hope my monitor can support FreeSync with a firmware upgrade. (Not that I have an AMD GPU.)
  • Teknobug - Sunday, March 22, 2015 - link

    Now if only I could find a 24" monitor with these features; anything bigger than 24" is too large for me.
  • gauravnba - Monday, March 23, 2015 - link

    A lot of G-Sync versus AMD bashing here. Personally it all comes down to whether or not I am being confined to an ecosystem when going with either technology. If nVidia starts to become like Apple in that respect, I'm not very comfortable with it.
    However, I wonder whether adapting to FreeSync takes a lot of additional or modified hardware on the GPU end. That might be one reason nVidia didn't have to change much of their GPU architecture during the G-Sync launch and confined the changes to the scaler.
    AMD worked with VESA to get this working on GCN 1.1, but not on GCN 1.0. This may be another area where the technologies are radically different: one is heavily reliant on the scaler, while the other may be able to divide the work to a certain extent? Then again, I'm quite ignorant of how scalers and GPUs work in this case.
  • PixelSupreme - Monday, March 23, 2015 - link

    To be honest I don't give half a fudge about FreeSync or G-Sync. What gets my attention is ULMB/strobed backlight. An IPS display (or, well, OLED, but...), WQHD, and strobing that works on a range of refresh rates, including some that are multiples of 24. That would be MY holy grail. The announced Acer XB270HU comes close, but ULMB apparently only works at 85Hz and 100Hz.
