Closing Thoughts

It took a while to get here, but if the proof is in the eating of the pudding, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Within the supported refresh rate range, I found nothing to complain about. Perhaps more importantly, while you’re not getting a “free” monitor upgrade, the current prices of the FreeSync displays are very close to what you’d pay for an equivalent display that doesn’t have adaptive sync. That’s great news, and with the major scaler manufacturers on board with adaptive sync the price disparity should only shrink over time.

The short summary is that FreeSync works just as you’d expect, and at least in our limited testing so far there have been no problems. That isn't to say FreeSync will work with every possible AMD setup right now, however. As noted last month, the initial FreeSync driver that AMD provided (Catalyst 15.3 Beta 1) only enables FreeSync on single GPU configurations; another driver due next month should add support for CrossFire setups.

Besides the driver and a FreeSync display, you also need a GPU built on AMD’s GCN 1.1 or later architecture. The list at present consists of the R7 260/260X, R9 285, and R9 290/290X/295X2 discrete GPUs, as well as the Kaveri APUs – A6-7400K, A8-7600/7650K, and A10-7700K/7800/7850K. First generation GCN 1.0 cards (HD 7950/7970 or R9 280/280X and similar) are not supported.

All is not sunshine and roses, however. Part of the problem with reviewing something like FreeSync is that we're inherently tied to the hardware we receive, in this case the LG 34UM67 display. With an R9 290X running at the display's native resolution, the vast majority of games will run at 48 FPS or above even at maximum detail settings, though of course there are exceptions. This means they look and feel smooth. But what happens with more demanding games or with lower performance GPUs? If you're running without VSYNC you'd get tearing below 48 FPS, while with VSYNC you'd get stuttering.

Neither is ideal, but how much this impacts your experience will depend on the game and individual. G-SYNC handles dropping below the minimum FPS more gracefully than FreeSync, though if you're routinely falling below the minimum FreeSync refresh rate we'd argue that you should lower the settings. Mostly what you get with FreeSync/G-SYNC is the ability to have smooth gaming at 40-60 FPS and not just 60+ FPS.
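
To make the stutter case concrete, here's a minimal sketch (our own illustration, not anything from AMD or the display vendors) of how double-buffered VSYNC quantizes frame rates on a fixed-refresh display: a frame that misses a refresh waits for the next one, so effective frame rates snap to integer divisors of the refresh rate. Inside a variable refresh window, the display would instead simply refresh at the render rate.

    #include <cmath>
    #include <cstdio>

    // Effective frame rate with double-buffered VSYNC on a fixed-refresh
    // display: a frame that misses a refresh boundary waits for the next
    // one, so the frame interval rounds up to whole refresh periods.
    double vsyncFps(double renderFps, double refreshHz) {
        double refreshesPerFrame = std::ceil(refreshHz / renderFps);
        return refreshHz / refreshesPerFrame;
    }

    int main() {
        std::printf("%.1f\n", vsyncFps(45.0, 60.0)); // 30.0 on a 60Hz panel
        std::printf("%.1f\n", vsyncFps(45.0, 75.0)); // 37.5 on a 75Hz panel
        return 0;
    }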

Other sites are reporting ghosting on FreeSync displays, but that's not inherent to the technology. Rather, it's a display-specific problem (just as the amount of ghosting on normal LCDs is display-specific). Using higher quality panels and hardware designed to reduce/eliminate ghosting is the solution. The FreeSync displays so far appear not to have the same level of anti-ghosting as the currently available G-SYNC panels, which is unfortunate if true. (Note that we've only looked at the LG 34UM67, so we can't report on all the FreeSync displays.) Again, ghosting shouldn't be a FreeSync issue so much as a panel/scaler/firmware problem, so we'll hold off on further commentary until we get to the monitor reviews.

One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40~75 FPS range, as you go beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. Considering that pixel response times for LCDs are not instantaneous, and factoring in the way our eyes and brains process the world, for all the hype I still think high refresh rates with VSYNC disabled get you 98% of the way to the goal of smooth gaming with no noticeable visual artifacts (at least for those of us without superhuman eyesight).

Overall, I’m impressed with what AMD has delivered so far with FreeSync. AMD gamers in particular will want to keep an eye on the new and upcoming FreeSync displays. They may not be the “must have” upgrade right now, but if you’re in the market and the price premium is less than $50, why not get FreeSync? On the other hand, for NVIDIA users things just got more complicated. Assuming you haven’t already jumped on the G-SYNC train, there’s now this question of whether or not NVIDIA will support non-G-SYNC displays that implement DisplayPort’s Adaptive Sync technology. I have little doubt that NVIDIA can support FreeSync panels, but whether they will support them is far less certain. Given the current price premium on G-SYNC displays, it’s probably a good time to sit back and wait a few months to see how things develop.

There is one G-SYNC display that I’m still waiting to see, however: Acer’s 27” 1440p144 IPS (AHVA) XB270HU. It was teased at CES and it could very well be the holy grail of displays. It’s scheduled to launch next month, and official pricing is $799 (with some pre-orders now online at higher prices). We might see a FreeSync variant of the XB270HU as well in the coming months, if not from Acer then likely from some other manufacturer. For those who work with images and movies as well as play games, IPS/AHVA displays with G-SYNC or FreeSync support are definitely needed.

Wrapping up, if you haven’t upgraded your display in a while, now is a good time to take stock of the various options. IPS and other wide viewing angle displays have come down quite a bit in pricing, and there are overclockable 27” and 30” IPS displays that don’t cost much at all. Unfortunately, if you want a guaranteed high refresh rate, there’s a good chance you’re going to have to settle for TN. The new UltraWide LG displays with 75Hz IPS panels at least deliver a moderate improvement though, and they now come with FreeSync as an added bonus.

Considering a good display can last 5+ years, making a larger investment isn’t a bad idea, but by the same token rushing into a new display isn’t advisable either as you don't want to end up stuck with a "lemon" or a dead technology. Take some time, read the reviews, and then find the display that you will be happy to use for the next half decade. At least by then we should have a better idea of which display technologies will stick around.

Comments

  • barleyguy - Thursday, March 19, 2015 - link

    I was already shopping for a 21:9 monitor for my home office. I'm now planning to order a 29UM67 as soon as I see one in stock. The GPU in that machine is an R7 260X, which is on the compatible list. :-)
  • boozed - Thursday, March 19, 2015 - link

    "the proof is in the eating of the pudding"

    Thank you for getting this expression right!

    Oh, and Freesync looks cool too.
  • D. Lister - Thursday, March 19, 2015 - link

    I have had my reservations about claims made by AMD these days, and my opinion of 'FreeSync' was no different. If it actually works at least as well as G-Sync (as claimed by this rather brief review) with various hardware/software setups, then it is indeed a praiseworthy development. I personally would certainly be glad that the rivalry of two tech giants resulted (even if only inadvertently) in something that benefits the consumer.
  • cmdrdredd - Thursday, March 19, 2015 - link

    I love the arguments about "freesync is an open standard" when it doesn't matter. 80% of the market is Nvidia and won't be using it. Intel is a non-issue because not many people are playing games that benefit from adaptive v-sync. Think about it, either way you're stuck. If you buy a GSync monitor now you likely will upgrade your GPU before the monitor goes out. So your options are only Nvidia. If you buy a freesync monitor your options are only AMD. So everyone arguing against gsync because you're stuck with Nvidia, have fun being stuck with AMD the other way around.

    Best to not even worry about either of these unless you absolutely do not see yourself changing GPU manufacturers for the life of the display.
  • barleyguy - Friday, March 20, 2015 - link

    NVidia is 71% of the AIB market, as of the latest released numbers from Hexus. That doesn't include AMD's APUs, which also support Freesync and are often used by "midrange" gamers.

    The relevance of being an open standard though, is that monitor manufacturers can add it with almost zero extra cost. If it's built into nearly every monitor in a couple of years, then NVidia might have a reason to start supporting it.
  • tsk2k - Thursday, March 19, 2015 - link

    @Jarred Walton
    You disappoint me.
    What you said about G-sync below the minimum refresh rate is not correct, and there seem to be issues with ghosting on freesync. I encourage everyone to go to PCper(dot)com and read a much more in-depth article on the subject.
    Get rekt anandtech.
  • JarredWalton - Friday, March 20, 2015 - link

    If you're running a game and falling below the minimum refresh rate, you're using settings that are too demanding for your GPU. I've spent quite a few hours playing games on the LG 34UM67 today just to see if I could see/feel issues below 48 FPS. I can't say that I did, though I also wasn't running settings that dropped below 30 FPS. Maybe I'm just getting too old, but if the only way to quantify the difference is with expensive equipment, perhaps we're focusing too much on the theoretical rather than the practical.

    Now, there will undoubtedly be some who say they really see/feel the difference, and maybe they do. There will be plenty of others where it doesn't matter one way or the other. But if you've got an R9 290X and you're looking at the LG 34UM67, I see no reason not to go that route. Of course you need to be okay with a lower resolution and a more limited VRR range, and you need to be willing to go with a slower response time IPS (AHVA) panel rather than dealing with TN problems. Many people are.

    What's crazy to me is all the armchair experts reading our review and the PCPer review and somehow coming out with one or the other of us being "wrong". I had limited time with the FreeSync display, but even so there was nothing I encountered that caused me any serious concern. Are there cases where FreeSync doesn't work right? Yes. The same applies to G-SYNC. (For instance, at 31 FPS on a G-SYNC display, you won't get frame doubling but you will see some flicker in my experience. So that 30-40 FPS range is a problem for G-SYNC as well as FreeSync.)
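
    Incidentally, the frame doubling mentioned above is easy to sketch: below the panel's minimum refresh rate, each frame gets scanned out multiple times so the effective refresh stays inside the VRR window. The snippet below is purely illustrative -- my own back-of-the-envelope version, not NVIDIA's actual G-SYNC module logic:

        // Pick how many times to scan out each frame so the effective
        // refresh rate stays inside the display's VRR window.
        // Illustrative only -- not actual G-SYNC module logic.
        int scanoutsPerFrame(double frameFps, double minHz, double maxHz) {
            int m = 1;
            while (frameFps * m < minHz && frameFps * (m + 1) <= maxHz)
                ++m;
            return m;  // m == 1 below minHz means the window can't be honored
        }
        // scanoutsPerFrame(25, 30, 144) == 2 -> 25 FPS scanned out at 50Hz
        // scanoutsPerFrame(31, 30, 144) == 1 -> in range, no doubling needed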

    I guess it's all a matter of perspective. Is FreeSync identical to G-SYNC? No, and we shouldn't expect it to be. The question is how much the differences matter. Remember the anisotropic filtering wars of last decade where AMD and NVIDIA were making different optimizations? Some were perhaps provably better, but in practice most gamers didn't really care. It was all just used as flame bait and marketing fluff.

    I would agree that right now you can make the case that G-SYNC is provably better than FreeSync in some situations, but then both are provably better than static refresh rates. It's the edge cases where NVIDIA wins (specifically, when frame rates fall below the minimum VRR rate), but when that happens you're already "doing it wrong". Seriously, if I play a game and it starts to stutter, I drop the quality settings a notch. I would wager most gamers do the same. When we're running benchmarks and comparing performance, it's all well and good to say GPU 1 is better than GPU 2, but in practice people use settings that provide a good experience.

    Example:
    Assassin's Creed: Unity runs somewhat poorly on AMD GPUs. Running at Ultra settings or even Very High in my experience is asking for problems, no matter if you have a FreeSync display or not. Stick with High and you'll be a lot happier, and in the middle of a gaming session I doubt anyone will really care about the slight drop in visual fidelity. With an R9 290X running at 2560x1080 High, ACU typically runs at 50-75 FPS on the LG 34UM67; with a GTX 970, it would run faster and be "better". But unless you have both GPUs and for some reason you like swapping between them, it's all academic: you'll find settings that work and play the game, or you'll switch to a different game.

    Bottom Line: AMD users can either go with FreeSync or not; they have no other choice. NVIDIA users likewise can go with G-SYNC or not. Both provide a smoother gaming experience than 60Hz displays, absolutely... but with a 120/144Hz panel only the high speed cameras and eagle eyed youth will really notice the difference. :-)
  • chizow - Friday, March 20, 2015 - link

    Haha love it, still feisty I see even in your "old age" there Jarred. I think all the armchair experts want is for you and AT to use your forum on the internet to actually do the kind of testing and comparisons that matter for the products being discussed, not just provide another Engadget-like experience of superficial touch-feely review, dismissing anything actually relevant to this tech and market as not being discernable to someone "your age".
  • JarredWalton - Friday, March 20, 2015 - link

    It's easy to point out flaws in testing; it's a lot harder to get the hardware necessary to properly test things like input latency. AnandTech doesn't have a central location, so I basically test with what I have. Things I don't have include gadgets to measure refresh rate in a reliable fashion, high speed cameras, etc. Another thing that was lacking: time. I received the display on March 17, in the afternoon; sometimes you just do what you can in the time you're given.

    You however are making blanket statements that are pro-NVIDIA/anti-AMD, just as you always do. The only person that takes your comments seriously is you, and perhaps other NVIDIA zealots. Mind you, I prefer my NVIDIA GPUs to my AMD GPUs for a variety of reasons, but I appreciate competition and in this case no one is going to convince me that the closed ecosystem of G-SYNC is the best way to do things long-term. Short-term it was the way to be first, but now there's an open DisplayPort standard (albeit an optional one) and NVIDIA really should do everyone a favor and show that they can support both.

    If NVIDIA feels G-SYNC is ultimately the best way to do things, fine -- support both and let the hardware enthusiasts decide which they actually want to use. With only seven G-SYNC displays there's not a lot of choice right now, and if most future DP1.2a and above displays use scalers that support Adaptive Sync it would be stupid not to at least have an alternate mode.

    But if the only real problem with FreeSync is that you get judder/tearing when you fall below the minimum refresh rate, that's not a show stopper. As I said above, if that happens to me I'm already changing my settings. (I do the same with G-SYNC incidentally: my goal is 45+ FPS, as below 40 doesn't really feel smooth to me. YMMV.)
  • Soulwager - Saturday, March 21, 2015 - link

    You can test absolute input latency to sub millisecond precision with ~50 bucks worth of hobby electronics, free software, and some time to play with it. For example, an arduino micro, a photoresistor, a second resistor to make a divider, a breadboard, and a usb cable. Set the arduino up to emulate a mouse, and record the difference in timing between a mouse input and the corresponding change in light intensity. Let it log a couple minutes of press/release cycles, subtract 1ms of variance for USB polling, and there you go, full chain latency. If you have access to a CRT, you can get a precise baseline as well.
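
    A hypothetical version of that rig might look like the sketch below; the pin, the threshold, and the assumption that the click produces a brightness change under the sensor (e.g. a muzzle flash) all need calibrating, so treat it as the described approach rather than a tested build:

        #include <Mouse.h>  // Arduino Micro (ATmega32U4) USB mouse emulation

        const int SENSOR = A0;      // photoresistor voltage divider on A0
        const int THRESHOLD = 500;  // assumed value; calibrate per panel

        void setup() {
          Serial.begin(115200);
          Mouse.begin();
        }

        void loop() {
          unsigned long t0 = micros();
          Mouse.press();                           // inject the click
          while (analogRead(SENSOR) < THRESHOLD) {
            // spin until the screen brightens under the sensor
          }
          Serial.println(micros() - t0);           // full-chain latency (us)
          Mouse.release();
          delay(500);                              // let the scene settle
        }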

    As for sub-VRR behavior, if you leave v-sync on, does the framerate drop directly to 20fps, or is AMD using triple buffering?
