Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, in games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top portion of the screen shows one frame while the bottom portion shows the next frame (or frames, in some cases).
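
To make the timing tradeoff concrete, here's a minimal sketch – purely illustrative, not from AMD or NVIDIA documentation; the 60Hz panel and the frame render times are assumed values – that simulates how a fixed-refresh display with v-sync paces a sequence of frames compared to an adaptive-refresh display:

```python
import math

# Hypothetical render times (ms) for ten consecutive frames; values are
# chosen for illustration, not measured from any real game.
FRAME_TIMES_MS = [14, 18, 15, 25, 16, 17, 30, 15, 16, 19]
REFRESH_MS = 1000 / 60  # fixed 60Hz panel: one refresh every ~16.7ms

def vsync_gaps(frame_times):
    """Fixed refresh + v-sync: a frame appears at the first refresh tick
    after it finishes rendering, so a slow frame means the old image is
    repeated for an extra interval (a visible stutter)."""
    gaps, t, last_tick = [], 0.0, 0
    for ft in frame_times:
        t += ft  # this frame is ready at time t
        tick = max(last_tick + 1, math.ceil(t / REFRESH_MS))
        if last_tick:
            gaps.append(round((tick - last_tick) * REFRESH_MS, 1))
        last_tick = tick
    return gaps

def adaptive_gaps(frame_times):
    """Adaptive refresh: the panel updates the moment a frame is ready,
    so on-screen gaps simply equal the render times (the panel's min/max
    refresh limits are ignored here for brevity)."""
    return [float(ft) for ft in frame_times[1:]]

print("v-sync   gaps (ms):", vsync_gaps(FRAME_TIMES_MS))
print("adaptive gaps (ms):", adaptive_gaps(FRAME_TIMES_MS))
```

Running it, the v-sync gaps quantize to multiples of ~16.7ms (16.7, 16.7, 33.3, ...): any frame that misses a refresh is held on screen for a full extra interval, which is the stutter described above, while the adaptive gaps simply track the render times. Turning v-sync off instead lets the update land mid-scan, which is where tearing comes from.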

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time to go from the initial announcement to actual shipping hardware, but AMD has worked with VESA to standardize the underlying technology as part of DisplayPort 1.2a, and AMD collects no royalties on it. That’s the “Free” part of FreeSync, and while it doesn’t necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly stem from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there’s no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let’s look at what’s officially announced right now before we continue.

Comments

  • 5150Joker - Friday, March 20, 2015 - link

    NewEgg review LOL! In defense of Jarred, he's probably working within the confines of the equipment made available to him by this site's parent company. TFTCentral and PRAD have really expensive equipment they use to quantify the metrics in their reviews.
  • chizow - Thursday, March 19, 2015 - link

    G-Sync isn't going anywhere, but it's nice to see AMD provide their fans with an inferior option as usual. Works out well, given their customers are generally less discerning anyways. Overall it's a great day for AMD fans that can finally enjoy the tech they've been downplaying for some 18 months since G-Sync was announced.
  • Black Obsidian - Thursday, March 19, 2015 - link

    AMD offers an option that's indistinguishable in actual use from nVidia's, and significantly cheaper to boot. Sure, it's not enough for the "discerning" set who are willing to pay big premiums for minuscule gains just so they can brag that they have the best, but who other than nVidia stockholders cares who gets to fleece that crowd?

    Frankly, I wish that AMD could pull the same stunt in the CPU market. Intel could use a price/performance kick in the pants.
  • chizow - Thursday, March 19, 2015 - link

    Well unfortunately for less discerning customers, the type that would just take such a superficial review as gospel and declare equivalency, the input lag issues come down to minor differences that aren't easily quantified or identified in a short test, but over thousands of frames the differences are much more apparent.

    If you're referring to differences in FPS charts, you've already failed in seeing the value Nvidia provides to end-users in their products, as graphics cards have become much more than just spitting out frames on a bar graph. FreeSync and G-Sync are just another example of this, going beyond the "minuscule gains" vs. price tag that less discerning individuals might prioritize.
  • Ranger101 - Friday, March 20, 2015 - link

    My heartfelt thanks to chizow. Your fanboy gibberings are a constant source of amusement :)
  • chizow - Friday, March 20, 2015 - link

    Np, without the nonsense posted by AMD fanboys there wouldn't be a need to post at all!
  • Black Obsidian - Friday, March 20, 2015 - link

    You're so discerning that I'm sure you could wax poetical on how your $3K monocrystalline speaker cables properly align the electrons to improve the depth of your music in ways that aren't easily quantifiable.
  • chizow - Friday, March 20, 2015 - link

    No, but I can certainly tell you why G-Sync and a dozen or so other value-add features Nvidia provides with their graphics cards make them a better solution for me and the vast majority of the market.
  • silverblue - Friday, March 20, 2015 - link

    A dozen? Please.

    No really, I mean PLEASE tell us this "dozen or so other features".
  • chizow - Friday, March 20, 2015 - link

    Np, always nice mental exercise reminding myself why I prefer Nvidia over AMD:

    1. G-Sync
    2. 3D Vision (and soon VRDirect)
    3. PhysX
    4. GeForce Experience
    5. Shadowplay
    6. Better 3rd party tool support (NVInspector, Afterburner, Precision) which gives control over SLI/AA settings in game profiles and overclocking
    7. GameWorks
    8. Better driver support and features (driver-level FXAA and HBAO+), game profiles as mentioned above, and better Day 1 support than AMD's CF profiles.
    9. Better AA support, both in-game and forced via driver (MFAA, TXAA, and now DSR)
    10. Better SLI compatibility and control (even if XDMA and CF have come a long way in terms of frame pacing and scaling).
    11. Better game bundles
    12. Better vendor partners and warranty (especially EVGA).
    13. Better reference/stock cooler, acoustics, heat etc.

    Don't particularly use these but they are interesting to me at either work or in the future:
    14. CUDA, we only use CUDA at work, period.
    15. GameStream. This has potential but not enough for me to buy a $200-300 Android device for PC gaming, yet.
    16. GRID. Another way to play your PC games on connected mobile devices.

    Damn, was that 16? No sweat.
