Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried it before. Certainly there are hurdles to overcome: what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, with games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top part of the screen shows one frame while the bottom part shows the next frame (or frames in some cases).
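To make that trade-off concrete, here's a minimal sketch in Python of frames being presented against a fixed 60Hz refresh. The render times are hypothetical values chosen for illustration; the point is that with vsync enabled, any frame that misses a refresh deadline forces the previous frame to repeat, which is the stutter described above.

```python
import math

REFRESH_MS = 1000.0 / 60.0             # fixed refresh: screen redraws every ~16.7 ms
render_ms = [12, 15, 22, 30, 14, 18]   # hypothetical per-frame render times

# With vsync ON, a finished frame waits for the next refresh tick. A frame
# that takes longer than one interval leaves a tick with nothing new to
# show, so the previous frame repeats -- the stall/stutter described above.
done = 0.0
last_tick = 0
for i, r in enumerate(render_ms):
    done += r
    tick = math.ceil(done / REFRESH_MS)     # first refresh at/after completion
    for missed in range(last_tick + 1, tick):
        print(f"  tick {missed}: no new frame ready -> frame {i - 1} repeats (stutter)")
    print(f"frame {i}: ready at {done:5.1f} ms, displayed on tick {tick}")
    last_tick = tick

# With vsync OFF, the buffer flips the moment a frame is ready, even in the
# middle of a scanout -- that refresh then shows the old frame above the
# tear line and the new frame below it (tearing).
```

With adaptive refresh, the display instead waits for the frame before redrawing (within the panel's supported range), so neither the repeated frame nor the tear occurs.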

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, which rules out a large chunk of the market. Not surprisingly, G-SYNC took a while to reach the market as a mature solution, with the first supporting displays requiring modification by the end user.

Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with VESA to get the underlying technology, Adaptive Sync, adopted as an open, royalty-free part of the DisplayPort 1.2a standard. That's the "Free" part of FreeSync, and while it doesn't guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come in the form of higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive Sync) support into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a "normal" 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • YukaKun - Thursday, March 19, 2015 - link

    Until we have video showing the two of them running side by side, we can't pick a winner. There may be plenty of metrics for measuring "tearing", but this isn't about "hard metrics"; it's about how the bloody frame sequences look on your screen. Smooth or not.

    Cheers!
  • eddman - Thursday, March 19, 2015 - link

    The difference cannot be shown on video. How can a medium like video, which has a limited, constant frame rate, be used to demonstrate a dynamic, variable refresh rate technology?

    This is one of those scenarios where you can experience it only on a real monitor.
  • Murloc - Thursday, March 19, 2015 - link

    putting it on video makes the comparison kinda useless.
  • invinciblegod - Thursday, March 19, 2015 - link

    I am one of those who switch brands every time I upgrade my GPU (which is every few years). Sometimes AMD is on top, while other times Nvidia is better. Now I must be locked into one forever or buy 6 monitors (3 for Eyefinity and 3 for Nvidia Surround)!
  • jackstar7 - Thursday, March 19, 2015 - link

    If they can put out a confirmed 1440p 21:9 with FreeSync, they will get my money. The rumors around the Acer Predator are still just rumors. Please... someone... give me the goods!
  • Black Obsidian - Thursday, March 19, 2015 - link

    It's pretty likely that LG will do just that. They already make two 1440p 21:9 monitors, and since it sounds like FreeSync will be part of new scalers going forward, you can probably count on the next LG 1440p 21:9 picking up that ability.
  • xthetenth - Thursday, March 19, 2015 - link

    I'm right there with you. I'm already preparing to get the updated LG 1440 21:9 and a 390X, because if the card is anything like the rumors it's going to be fantastic, and after getting a 21:9 for work I can't make myself use any other aspect ratio.
  • Black Obsidian - Thursday, March 19, 2015 - link

    Same deal here. If nVidia supported FreeSync and priced the Titan X (or impending 980 Ti) in a more sane manner I'd consider going that way because I have no great love for either company.

    But so long as they expect to limit my monitor choices to their price-inflated special options and pretend that $1K is a reasonable price for a flagship video card, they've lost my business to someone with neither of those hangups.
  • kickpuncher - Thursday, March 19, 2015 - link

    I have no experience with 144Hz screens. I've been waiting for FreeSync, but you're saying the difference is negligible with a static 144Hz monitor? Is that at any FPS, or does the FPS also have to be very high? (In regard to the 4th paragraph on the last page.) Thanks
  • JarredWalton - Thursday, March 19, 2015 - link

    I'd have to do more testing, but 144Hz redraws the display every 6.9ms compared to 60Hz redrawing every 16.7ms. With pixel response times often being around 5ms in the real world (not the marketing claims of 1ms), the "blur" between frames will hide some of the tearing. And then there's the fact that things won't change as much between frames that are 7ms apart compared to frames that are 17ms apart.

    Basically at 144Hz tearing can still be present, but it ends up being far less visible to the naked eye. Or at least that's my subjective experience using my 41-year-old eyes. :-)
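To put rough numbers on the interval math in that last comment, here's a small sketch; the on-screen object speed is an assumed value for illustration, not something from the article:

```python
# How far a moving object travels between refreshes at 60 Hz vs. 144 Hz.
# The 2000 px/s speed is an assumed example of fast on-screen motion.
SPEED_PX_PER_S = 2000

for hz in (60, 144):
    interval_ms = 1000.0 / hz                      # ~16.7 ms vs. ~6.9 ms
    step_px = SPEED_PX_PER_S * interval_ms / 1000.0
    print(f"{hz:>3} Hz: redraw every {interval_ms:4.1f} ms, "
          f"object moves ~{step_px:4.0f} px between refreshes")
```

At these assumed numbers, a tear line offset by roughly 14 pixels (144Hz) is simply harder to spot than one offset by roughly 33 pixels (60Hz), which matches the subjective impression above.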
