Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome – e.g. what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers – but it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, in games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top part of the screen shows the previous frame and the bottom part shows the next frame (or frames, in some cases).
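The timing difference is easy to see in a toy model. The sketch below (render times are invented for illustration, not measurements) shows how vsync on a fixed 60Hz panel holds a late frame for a full extra refresh interval, while an adaptive refresh panel simply updates the moment each frame is ready:

```python
# Toy model of frame presentation: fixed-refresh vsync vs. adaptive refresh.
# Render times (ms) are invented for illustration, not measurements.

VSYNC_INTERVAL_MS = 1000 / 60  # a fixed 60 Hz panel refreshes every ~16.7 ms

def vsync_display_times(render_times_ms, interval=VSYNC_INTERVAL_MS):
    """With vsync, a finished frame waits for the next refresh tick, so a
    frame that narrowly misses a tick is held a full extra interval."""
    shown, t = [], 0.0
    for r in render_times_ms:
        t += r
        ticks = -(-t // interval)  # ceiling division: first tick at or after t
        shown.append(ticks * interval)
    return shown

def adaptive_display_times(render_times_ms):
    """With adaptive refresh, the panel refreshes the moment a frame is ready."""
    shown, t = [], 0.0
    for r in render_times_ms:
        t += r
        shown.append(t)
    return shown

renders = [14.0, 18.5, 15.0, 20.0, 16.0]
vt = vsync_display_times(renders)
at = adaptive_display_times(renders)
print([round(b - a, 1) for a, b in zip(vt, vt[1:])])  # one gap doubles to 33.3 ms: stutter
print([round(b - a, 1) for a, b in zip(at, at[1:])])  # gaps track render times exactly
```

The third frame in the vsync run misses its refresh tick by less than a millisecond, yet it is displayed a full interval late – that doubled frame time is the stutter adaptive refresh eliminates.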

Neither stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, which rules out a large chunk of the market. Not surprisingly, G-SYNC took some time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.
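A quick back-of-the-envelope tally of the premiums cited above makes the gap concrete (the non-G-SYNC baselines are this article's rough comparison points; the 4K baseline is taken at the midpoint of the quoted $400-$500 range):

```python
# Street prices (USD) cited in the article; baselines are rough comparison points.
prices = [
    # (display, G-SYNC price, non-G-SYNC baseline)
    ("1080p144", 450, 250),
    ("ASUS ROG Swift 1440p144", 759, 400),
    ("Acer XB280HK 4Kp60", 750, 450),  # baseline: midpoint of the $400-$500 range
]
for name, gsync, baseline in prices:
    print(f"{name}: G-SYNC premium roughly ${gsync - baseline}")
```

In other words, G-SYNC support has been adding on the order of $200-$350 to the price of an otherwise comparable display.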

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD worked with VESA to standardize the underlying technology as DisplayPort Adaptive-Sync, now part of DisplayPort 1.2a, and collects no royalties on it. That’s the “Free” part of FreeSync, and while it doesn’t guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there’s no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let’s look at what’s officially announced right now before we continue.

FreeSync Displays and Pricing
350 Comments

  • lordken - Thursday, March 19, 2015 - link

    mmh your point is? ofc if you have AMD you can only get freesync because if nothing else nvidia kept gsync for themself. What did you try to say? Nvidia is fragmenting monitor market.
  • dragonsqrrl - Thursday, March 19, 2015 - link

    My point is that Nvidia currently has more options for variable refresh rate tech, on top of a much larger install base, than AMD. It often helps to read a response in the context of the comment it's responding to. If you can't see how that's a relevant response to FriendlyUser's comment, then I can't help you.
  • chizow - Tuesday, March 24, 2015 - link

    Exactly, yet AMD fans and surprisingly, even the author Jarred (who should know better), would have you believe G-Sync somehow faces the uphill battle?
  • JeffFlanagan - Thursday, March 19, 2015 - link

    Having Nvidia refuse to embrace a standard does not make overpriced Gsync devices "better." It's just Nvidia failing their users yet again.

    They screwed me on stereoscopic 3D by dropping support for the $1K eMagin HMD when changing business partners, making it clear that they do not care to support their customers if not supporting them will drive sales of new displays. I won't get fooled again.
  • chizow - Thursday, March 19, 2015 - link

    Nvidia failing their users, that's interesting. So they failed their users by inventing a tech the world had never seen before and bringing it to market some 18 months before the competition. Having owned and used an ROG Swift for the past 7 months which completely changed my gaming experience, I'd disagree with you.

    Nvidia once again did exactly what I expect them to do: introduce great new technology to improve their ecosystem for their users.
  • AnnihilatorX - Thursday, March 19, 2015 - link

    For those 18 months, yes, Nvidia was good. But now it fails its customers: by refusing to support the VESA standard, they are effectively limiting their customers' choice of monitors and forcing them to pay a premium if they want smooth gameplay.
  • chizow - Thursday, March 19, 2015 - link

    No, they're reinforcing their position and rewarding their customers by sticking to and developing a technology they're invested in. Their customers will pay the premium as long as their solution is better, and the only way to continue to ensure it remains better is to continue investing and developing it. Nvidia has already said they aren't done improving G-Sync, given how awesome it has been in its first incarnation, I can't wait to see what they have in store.

    Meanwhile, FreeSync is a good introduction into VRR for AMD, let's hope they continue to invest the same way Nvidia has to make sure they produce the best product they can for their users.
  • maximumGPU - Friday, March 20, 2015 - link

    you can't possibly believe that, it's ridiculous!
    rewarding their customers by making them pay a premium for an end result that's clearly not noticeably different from the free alternative??
    It's really business 101: they had the market cornered and they could charge whatever they want, fair play to them and well done.
    But now when an equally good free open standard alternative comes into play, not adopting it IS a complete disregard to their customers. I own nvidia gpus (sli) now, and i DON'T want to pay for their solution after seeing what freesync can do. Not providing me with that option simply makes me a disgruntled customer that'll take my business elsewhere.
    The problem is people like you who can't see that continue to blindly buy into it, making them reluctant to change their stance as long as the money rolls in. They'd drop gsync in an instant if no one buys their overpriced tech, and we'd all be better for it.
  • chizow - Friday, March 20, 2015 - link

    And you can't possibly believe that can you? Read some actual reviews that clearly show right now FreeSync is the inferior solution.
  • silverblue - Friday, March 20, 2015 - link

    Besides the PCPerspective review (which, like this one, is a work-in-progress anyway), please provide links to these reviews (plural, as stated).
