Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: deciding what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, with games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing, where the top portion of the screen shows the previous frame and the bottom portion shows the next frame (or frames, in some cases).
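
To make the trade-off concrete, here's a minimal sketch, not any vendor's actual implementation, that simulates how a fixed 60Hz refresh with VSync enabled handles a series of hypothetical frame render times, compared to an idealized adaptive refresh that can update the panel the moment a frame is ready (assuming the frame rate stays inside the panel's supported range). The frame times are made up purely for illustration.

```cpp
// Toy model: fixed 60Hz refresh (VSync on) vs. an idealized adaptive refresh.
// The frame render times below are hypothetical values for illustration only.
#include <cstdio>
#include <vector>

int main() {
    const double refresh_interval = 1000.0 / 60.0;  // ~16.7 ms per refresh at 60Hz

    // Hypothetical GPU frame render times in milliseconds; some miss the budget.
    std::vector<double> frame_times = {14.0, 15.5, 20.0, 16.0, 25.0, 15.0};

    std::printf("Fixed 60Hz refresh, VSync on:\n");
    for (double ft : frame_times) {
        // A frame that misses the current refresh waits for the next one, so the
        // previous frame is shown again: the stutter case described above.
        int refreshes = static_cast<int>(ft / refresh_interval) + 1;
        double shown_after = refreshes * refresh_interval;
        std::printf("  rendered in %5.1f ms -> shown after %5.1f ms%s\n",
                    ft, shown_after, refreshes > 1 ? "  (missed refresh: stutter)" : "");
    }

    std::printf("Adaptive refresh (FreeSync/G-SYNC style):\n");
    for (double ft : frame_times) {
        // Within the supported range the display refreshes when the frame is
        // ready: no forced wait and no tearing.
        std::printf("  rendered in %5.1f ms -> shown after %5.1f ms\n", ft, ft);
    }
    return 0;
}
```

In the fixed-refresh case the 20ms and 25ms frames get held back to the second refresh (33.3ms), which is exactly the stutter described above, while the adaptive case shows each frame as soon as it completes.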

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time to go from the initial announcement to actual shipping hardware, but AMD has worked with VESA to get the underlying technology, Adaptive-Sync, adopted as an open standard that's now part of DisplayPort 1.2a, and they aren't collecting any royalties on it. That's the "Free" part of FreeSync, and while it doesn't necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though mostly these costs come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a "normal" 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • eanazag - Thursday, March 19, 2015 - link

    The AMD and Nvidia haters all come out of the woodwork for these types of articles.

    Intel needs to chime in. I suspect they will go the FreeSync route since it is part of the spec and there are no costs.

    I understand Nvidia has some investment here. I fully expect them to support adaptive sync - at least in 5 years. They really need to do something about PhysX. As a customer I see it as irrelevant. I know it isn't their style to open up their tech.
  • eddman - Thursday, March 19, 2015 - link

    Not to go off-topic too much, but PhysX as a CPU physics engine, like Havok, etc., is quite popular. There are hundreds of titles out there using it and more are coming.

    As for GPU PhysX, which is what you had in mind: yes, it'd never become widely adopted unless Nvidia opens it up, and that probably won't happen unless someone else comes up with another, open, GPU-accelerated physics engine.
  • mczak - Thursday, March 19, 2015 - link

    Minor nitpick: Intel's solution won't be called FreeSync - that name is reserved for AMD-certified solutions. Pretty sure it's going to be technically the same, though, just using the Adaptive-Sync feature of DP 1.2a.
    (My guess would be that at some point in the future Nvidia is going to follow suit, first with notebooks, because G-SYNC is more or less impossible there. Even then it will initially be restricted to notebooks that drive the display from the Nvidia GPU, which aren't many; everything else is going to require Intel to support it first. I'm quite confident they are going to do this with desktop GPUs too, though I suspect they'd continue to call it G-SYNC. Let's face it, requiring a specific Nvidia G-SYNC module in the monitor just isn't going to fly with anything but the high-end gaming market, whereas Adaptive-Sync should trickle down to a lot more markets, so imho there's no way Nvidia's position on this doesn't have to change.)
  • anubis44 - Tuesday, March 24, 2015 - link

    @eanazag: nVidia will be supporting FreeSync about 20 minutes after the first hacked nVidia driver to support FreeSync makes it onto the web, whether they like it or not.
  • chizow - Tuesday, March 24, 2015 - link

    Cool, I welcome it, one less reason to buy anything AMD related.
  • chizow - Thursday, March 19, 2015 - link

    There's no need to be disappointed honestly, Jarred just copy/pasted half of AMD's slide deck and then posted a Newegg Review. Nothing wrong with that, Newegg Reviews have their place in the world; it's just unfortunate that people will take his conclusions and actually believe FreeSync and G-Sync are equivalents, when there are already clear indications this is not the case.

    - 40 to 48Hz minimums are simply unacceptable thresholds before things start falling apart, especially given many of these panels are higher than 1080p. A 40 FPS minimum at 4K, for example, is DAMN hard to accomplish; in fact the recently launched Titan X can't even do it in most games. CrossFireX isn't going to be an option either until AMD fixes FreeSync + CF, if ever.

    - The tearing/ghosting/blurring issues at low frame rates are significant. AMD mentioned issues with pixel decay causing problems at low refresh rates, but honestly, this alone shows us G-Sync is worth the premium because it is simply better. http://www.pcper.com/files/imagecache/article_max_...
    Jarred has mused multiple times that these displays may use the same panel as the one in the Swift, so why are the FreeSync panels failing so badly at low refresh rates? Maybe that G-Sync module is actually doing something, like actively syncing with the monitor to force overdrive without breaking the kind of guesswork frame syncing FreeSync is using?

    - Input lag? We can show AMD's slide and take their word for it without even bothering to test? High speed camera, USB input doubler attached to a mouse, scroll and see which one responds faster. FreeSync certainly seems to work within its supported frequency bands in preventing tearing, but that was only half of the problem related to Vsync on/off. The other trade-off with Vsync ON was how much input lag it introduced.

    -A better explanation of Vsync On/Off and tearing? Is this something the driver handles automatically? Is Vsync being turned on and off by the driver dynamically, similar to Nvidia's Adaptive Vsync? When it is on, does it introduce input lag?

    In any case, AnandTech's Newegg Review of FreeSync is certainly a nice preview and proof of concept of FreeSync, but I wouldn't take it as more than that. I'd wait for actual reviews to cover the aspects of display technology that actually matter, like input lag, blurring, and image retention, which can only really be captured and quantified with equipment like high speed cameras and a sound testing methodology.
  • at80eighty - Thursday, March 19, 2015 - link

    Waaa
  • chizow - Thursday, March 19, 2015 - link

    Another disappointed AMD user, I see. I agree, FreeSync certainly isn't as good as one might have hoped.
  • at80eighty - Friday, March 20, 2015 - link

    I've had more Nvidia cards than AMD, so keep trying.
  • chizow - Friday, March 20, 2015 - link

    Doubt it, but keep trying.
