AMD FreeSync Update

by Jarred Walton on 3/5/2015 12:30 PM EST

  • zmeul - Thursday, March 5, 2015 - link

    I was sure I read somewhere, quite a while back, that AMD had issues enabling FreeSync on CrossFire configurations, but I couldn't remember the source.

    Thanks for the info - looks like they're on track.
  • Gunbuster - Thursday, March 5, 2015 - link

    Dear AMD keep this up and you will steal the "Coming Soon™" crown from Microsoft's Windows Phone division.
  • sonicmerlin - Thursday, March 5, 2015 - link

    Two of the monitors you mentioned on overclockers.co.uk are 1440p. Reasonable prices I think. Definitely going to wait for a 4K freesync monitor though.
  • JarredWalton - Thursday, March 5, 2015 - link

    I only want 4K if it can do at least 100Hz. :-)
  • DarkStryke - Thursday, March 5, 2015 - link

    Have fun driving 4k @ 100FPS on a single card (to use freesync)
  • sonicmerlin - Thursday, March 5, 2015 - link

    You don't need to run everything at ultra settings. There really isn't that much difference between medium and ultra.
  • Nintendo Maniac 64 - Thursday, March 5, 2015 - link

    The problem is bandwidth - with 8bpp and full 4:4:4 chroma, you would need DisplayPort 1.3 to do 4k @ 120hz over a single cable.
  • Nintendo Maniac 64 - Thursday, March 5, 2015 - link

    Derp, that's supposed to say "8bpc" (bits per channel).

    Also to clarify, DisplayPort 1.2 and 1.2a flat-out don't have the bandwidth to do 8bc + 4:4:4 chroma at 4k 100hz (rough numbers sketched below).
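    For what it's worth, the arithmetic can be sanity-checked in a few lines of Python. This is a minimal sketch only: it counts active pixels (blanking overhead ignored) and assumes the commonly cited effective payload rates of four-lane DP 1.2 (HBR2) and DP 1.3 (HBR3) after 8b/10b coding:

        # Rough check: does 3840x2160 at 8bpc 4:4:4 fit on DP 1.2 or DP 1.3?
        DP_EFFECTIVE_GBPS = {"DP 1.2 (HBR2 x4)": 17.28, "DP 1.3 (HBR3 x4)": 25.92}

        def needed_gbps(width, height, hz, bpp=24):  # 24 bpp = 8bpc at 4:4:4
            return width * height * hz * bpp / 1e9

        for hz in (100, 120):
            need = needed_gbps(3840, 2160, hz)
            for link, cap in DP_EFFECTIVE_GBPS.items():
                print(f"4K @ {hz}Hz needs ~{need:.1f} Gbps; fits {link}: {need <= cap}")

    That comes to ~19.9 Gbps at 100Hz and ~23.9 Gbps at 120Hz - both over DP 1.2's ~17.3 Gbps but under DP 1.3's ~25.9 Gbps, so the claim holds (real timings with blanking need a bit more still).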
  • Nintendo Maniac 64 - Thursday, March 5, 2015 - link

    Of for the love of... where's an edit button when you need one.

    *8bpc + 4:4:4
  • Nintendo Maniac 64 - Thursday, March 5, 2015 - link

    ...words, I have a loss for them.

    *Oh for
  • Harry Lloyd - Friday, March 6, 2015 - link

    If you are saying that there is a bigger difference between 60 and 100 Hz than between medium and ultra details, then you are an alien from another dimension in the future.
  • SlyNine - Wednesday, March 11, 2015 - link

    The difference between 60Hz and 100Hz is huge to everyone I've asked (blind tests with my monitors), i.e. normal people. Depending on the game, the difference between medium and ultra varies.
  • Midwayman - Tuesday, March 10, 2015 - link

    I'm hoping gaming-oriented 4K will have a clean 1080p mode with simple pixel doubling.
  • praeses - Thursday, March 5, 2015 - link

    I would be pretty happy with somewhere between 32-42" 3840x1600 @ 96Hz 8bpc 4:4:4 w/FreeSync; that should be doable with DisplayPort 1.2 (quick check below). Maybe if more people share my interest and voice up, we might get such a monitor (hint hint).
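    As a rough check of that claim (active pixels only, blanking ignored): 3840 × 1600 × 96Hz × 24bpp ≈ 14.2 Gbps, comfortably under the ~17.28 Gbps effective payload of four-lane DP 1.2 after 8b/10b coding - so it does look doable, with margin to spare for blanking overhead.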
  • D. Lister - Thursday, March 5, 2015 - link

    I look forward to a quality/usability comparison with G-Sync, although I'm predicting that such an article would end with something like "[FreeSync] needs more work, but it's a step in the right direction." Of course, knowing AMD's track record with implementing promised features, that "more work" will never actually happen.
  • medi03 - Thursday, March 5, 2015 - link

    Are you at least paid for this crap?
  • D. Lister - Thursday, March 5, 2015 - link

    lol, nah, it is completely out of charity. How's that "Mantle" thing going, BTW? :P
  • TheJian - Thursday, March 5, 2015 - link

    I think he means Mantle was supposed to ship for everyone, but now it's dead. It was supposed to be open; now it's not (smart people always knew it never would be). Etc.
  • tuxRoller - Thursday, March 5, 2015 - link

    Dead yet still living on in vulkan...
  • D. Lister - Thursday, March 5, 2015 - link

    ... and by the will of our lord AMDhova, the holy FreeSync shall live on as well, maybe not as a variable refresh rate solution, but certainly in the smiles of young children everywhere. :D
  • Alexvrb - Thursday, March 5, 2015 - link

    The Cult of NVIDIA is so different? I think not.

    Anyway, most people saw Mantle as a stop-gap and a means to force MS and Khronos to get off their butts. It worked, too. But Adaptive Sync is an implementation of a VESA standard... it's already open, and others will implement it if it does well (Intel, for example).
  • D. Lister - Friday, March 6, 2015 - link

    Most people saw Mantle as a technological edge for GCN over the competition, and a significant reason for buying (and recommending) a new AMD GPU or APU over the alternatives. Many of those same people are only NOW saying "Mantle was not really a game-changing feature to keep the aging GCN architecture competitive, but just a stop-gap and a means to force MS and Khronos to release DX12, teehehe."

    As for the "cult of Nvidia", we are doing quite well thank you. We recently renovated our temple and bought flowing green robes for our priests. Come visit us sometime. Let us show you the glory of power savings and lower TDP, and the splendor of a unified, 1st-party software suite that is the GeForce Experience. Let us lead you on a path to true graphic contentment that can only be achieved with stable drivers. All hail our lord, that Korean dude whose name I always forget.
  • Murloc - Friday, March 6, 2015 - link

    I had to uninstall it because it wanted me to update drivers to a version that isn't supported by my main GPU, but is supported by the newer, very low-end GPU I'm using only for sending audio over HDMI to my AVR.
    When it did update, it caused a mess.
  • silverblue - Friday, March 6, 2015 - link

    I'll try not to throw a Fermi-shaped stone into your glass house of power savings. Legitimate question - does NVIDIA still sell Fermi derivatives within the current product stack?
  • D. Lister - Friday, March 6, 2015 - link

    The 4xx series you mean? Bah, and once people used to ride jalopies and steam locomotives, lol.
  • silverblue - Friday, March 6, 2015 - link

    The 5xx series as well, plus various lower end options since (there were three flavours of the GT 730, one being based on GF108, all of which came out last June).
  • D. Lister - Friday, March 6, 2015 - link

    The 5xx series was actually a fair improvement over the TDP/wattage disaster that was the 4xx series. Especially in terms of performance per watt, the disparity with the AMD equivalent was comparatively much smaller.

    Still, that was 4 gens ago (or 3, not counting the 8xxm parts). At this point though, I wouldn't recommend buying a Fermi GPU.

    As for the x20/x30 parts - that's the bargain basement of performance, where power and heat aren't really a significant issue anyway.
  • silverblue - Friday, March 6, 2015 - link

    I'd be annoyed if I had a hungry budget card. :)
  • tuxfool - Friday, March 6, 2015 - link

    I'm not likely to look to the likes of you to see what "most" people think. Almost as much BSing and FUD as that Jian fellow.
  • D. Lister - Friday, March 6, 2015 - link

    @tuxfool

    That's good - form your own opinions. Don't let anyone tell you how to think: not the likes of me, not a website journalist, not a corporate marketing rep, a politician, or a clergyman. Understand this fact of modern life: 99.99% of the stuff that we see, hear, or read is complete, and often commercially fabricated, bullshit. AAMOF, every time we read or hear anything from anyone, our first thought should be, "this is bullshit, unless unequivocally proven otherwise." Skepticism is a survival skill.
  • Jon Irenicus - Friday, March 6, 2015 - link

    I thought Mantle was a competitive advantage for AMD, but clearly the market did not care, as seen in the market share numbers that were released. It was a bloodbath for AMD, because product is king: NVIDIA launched the 980/970 as the shiny new cards on the block, and all AMD had were older cards.

    Long term, though, I DO think having Mantle-like functionality built in helps AMD more than NVIDIA on the performance side, in two respects.

    First, lower-speed CPUs gain a larger boost, which helps AMD sell more CPUs. When review sites tested NVIDIA vs. AMD with Mantle, they almost always paired the test system with a god damn i7 quad or SIX core CPU; some might even have used the 8-core Haswell-E. Mantle shined best when CPU-constrained, and by sticking to TOP-OF-THE-LINE CPUs the reviews often never highlighted the gains there.

    Second, AMD gains on the DX11 multithreaded driver front. NVIDIA had a clear edge there - their DX11 performance was much better than AMD's unless the game/engine makers actually put in the work to optimize their own game. Johan and his team took that time with Frostbite 3 (DX11 performance in games like BF4 and Dragon Age: Inquisition is competitive), but companies like Ubisoft and games like Assassin's Creed clearly don't seem to care or have the resources. With Mantle, DX12, and now the new OpenGL, that optimization work is offloaded onto the actual game maker to a greater degree, so the penalty for having weaker multithreaded DX11 drivers will be lower.
  • Gunbuster - Friday, March 6, 2015 - link

    How long has the Cult of Nvidia been running G-sync monitors while AMD shovels out Coming Soon™ promises? Results win over press releases and FUD from the marketing department.
  • tuxRoller - Friday, March 6, 2015 - link

    :)
    I don't credit AMD with variable refresh, as that already existed in eDP.
  • D. Lister - Friday, March 6, 2015 - link

    @tuxRoller

    Don't say it out loud man :O. Someone's gonna accuse you of being an Nvidia shill for taking ANY credit away from AMD. Just kidding ;).
  • TheJian - Friday, March 6, 2015 - link

    You can say that, but people who bought AMD cards hoping Mantle was going to gain traction are now out of luck, correct? Everyone in Khronos contributes to Vulkan. AMD will get no advantage from special Mantle crap in it, as Vulkan is NOT Mantle rev2. But they will have to have R&D $$ to get good drivers/support for it out the door. We know NV has the cash for that, right?

    RE: your FUD comment to D. Lister about me: this is not FUD, it's a fact. There won't be 100 Mantle games coming for the people suckered by it. That is over. Considering the guy who runs NVIDIA's Mobile Ecosystem division is ALSO at the top of Khronos, we'll see who ends up getting the better end of this death-of-Mantle deal. FUD? LOL. I'm glad NV's response to AMD's Mantle was "we won't be doing an API". Of course not - they would do it through Khronos, or spend that driver money on DirectX driver improvements (which they did) ;)
  • silverblue - Friday, March 6, 2015 - link

    I'm not sure people thought there would be 100 or so Mantle games. Well, I certainly didn't, and the list of prospective Mantle games never really got too long. It took internal resources away within AMD at a time when they could've been spending more on D3D performance, but it did serve to highlight that DirectX 11 needed to die a very quick death. I'm not sure if Mantle was developed before DirectX 12 or vice versa - certainly, the idea that Intel wanted to license it raises eyebrows about how far DirectX 12 had come along. But pretty soon, assuming there's a far bigger push for DirectX 12 than there was for 11, we're going to see a lot of titles that are less of a strain on the CPU, and consequently a renewed focus on buying better GPUs, which should benefit both AMD and NVIDIA, regardless of which CPU-overhead-reducing API was thought up first.
  • tuxfool - Saturday, March 7, 2015 - link

    Most people that liked Mantle were excited for it as a technology, not because it gave AMD a competitive advantage. Any reasonable person would see that a manufacturer-specific API would have difficulty gaining traction. Given the similarity between Mantle and Vulkan (which your rants seem to ignore), I'd say Mantle has gone to good use.
  • Galatian - Thursday, March 5, 2015 - link

    Well my BenQ is coming tomorrow (Germany) through Amazon. Didn't realize the driver wasn't ready! Oh well I can wait one more month.
  • Oxford Guy - Thursday, March 5, 2015 - link

    What we need is a way to mod our existing monitor. I have an $800 monitor and I'm NOT going to replace it just to get FreeSync or G-Sync, no matter how much the GPU makers and the monitor makers want me to.
  • Oxford Guy - Thursday, March 5, 2015 - link

    I'd be willing to pay a reasonable fee to have a monitor modded, so someone needs to step up and make this happen. We have enough landfill waste without people replacing perfectly good monitors in order to get a feature like this, which should have been part of the specs from the start of discrete 3D cards.
  • CPUGPUGURU - Thursday, March 5, 2015 - link

    Oxford Guy, I'm kinda in the same boat as you,

    Nvidia has a solution for you. Combine this mod with your $800 monitor and a high-end Maxwell card and you'll be happy for years to come. Two GTX 970/980s in SLI with DX12, which stacks memory, is what I would do, but I'm waiting for Max Maxwell to be unleashed before I take the plunge.

    G-SYNC Do-It-Yourself Kit

    http://international.download.nvidia.com/geforce-c...
  • Oxford Guy - Thursday, March 5, 2015 - link

    "THIS MODIFICATION KIT IS ONLY COMPATIBLE WITH AN ASUS VG248QE MONITOR."
  • CPUGPUGURU - Thursday, March 5, 2015 - link

    Didn't know that - thanks for the info, Ox. My gaming friend did a G-SYNC DIY mod and was happy with the result, but you're right, it's only available for the VG248QE, so it's a no-go for me too. Oh well, I needed a bigger monitor anyway.
  • Manch - Friday, March 6, 2015 - link

    I read a while back that most current monitors could support FreeSync, as it's already in the VESA standard but just hasn't been implemented. Will any display manufacturers update via firmware? Hack, maybe? And if you're running consistently over 60FPS, does either of these techs do anything for me? I can see the benefit in a laptop, where hitting 60FPS is spotty at best. Y'all two seem to be read up on it, so I figured I'd ask y'all instead of the NVIDIA/AMD cult members duking it out.
  • mr_tawan - Friday, March 6, 2015 - link

    I have never seen a monitor firmware update before in my life... and I never expect to.

    But it would be nice if there were some, though.
  • Oxford Guy - Monday, March 9, 2015 - link

    Dell updated the firmware for the U2410 several times to correct pink tint issues and a broken user calibration mode.
  • Yojimbo - Saturday, March 7, 2015 - link

    From what I remember reading, I don't think that's correct. Some sort of hardware modification needs to be made to the scaler chip for FreeSync to work in a standalone monitor. Maybe what you said is true for laptops.
  • HollyDOL - Friday, March 6, 2015 - link

    I wonder when/if we'll see video codecs with adaptive frame rates... I can imagine static scenes where it wouldn't be possible to notice a 5FPS rate, while on the other hand you could get very dynamic scenes where a high FPS makes a much bigger impact...
  • Urizane - Friday, March 6, 2015 - link

    3GPP2 does this. Technically, MP4 (QuickTime-derived) can do this too, but it's very uncommon.
  • Urizane - Friday, March 6, 2015 - link

    I should also mention that variable frame rates are a feature of the container, not the video codec (see the sketch below). A video codec doesn't give a damn what the frame rate is unless buffer size and bitrate constraints are set. Even then, only the encoder cares, while the decoder just plays frames when the container specifies it should. Also, 3GPP2 is basically an alternate implementation of the base format of MP4, which is why MP4 files can have VFR content. It's just not widely supported unless the extension happens to be 3G2 and the player understands the implications of that (VLC does, for example).
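    To make the container point concrete, here is a minimal Python sketch of the 'stts' (decoding time-to-sample) table from the ISO base media file format that MP4 and 3GPP2 share - the place where per-frame durations live. VFR is simply a table with more than one distinct delta; the function and values are illustrative, not taken from any real muxer:

        import struct

        # Build the payload of an ISO BMFF 'stts' box: a run-length table of
        # (sample_count, sample_delta) pairs, with deltas in timescale ticks.
        def build_stts_payload(entries):
            payload = struct.pack(">II", 0, len(entries))  # version+flags, entry_count
            for sample_count, delta in entries:
                payload += struct.pack(">II", sample_count, delta)
            return payload

        # 90 kHz timescale: 100 frames at 30 fps, then 40 frames at 5 fps.
        stts = build_stts_payload([(100, 90000 // 30), (40, 90000 // 5)])

    The decoder just honors whatever deltas the table gives it, which is why the codec itself never needs to know the frame rate.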
  • HollyDOL - Saturday, March 7, 2015 - link

    Thanks for the info, I'll look into that a bit more. ATM I don't understand how VFR could work with just container support - at the least, the encoder needs to inspect frames and extend/drop those whose difference is below some delta (something like the sketch below). Kind of an extension of the I/P/B frame philosophy.
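    For what it's worth, one way such an encoder-side pass might look - purely hypothetical, with the threshold and timescale made up for illustration - is to compare each frame against the last kept one and, when the difference is below a delta, extend the previous sample's duration instead of emitting a new frame:

        import numpy as np

        # Hypothetical VFR pre-pass: merge near-identical consecutive frames
        # into one sample with a longer duration. 'frames' are equal-shaped
        # arrays; 'tick' is one source-frame duration in timescale ticks.
        def collapse_static_frames(frames, tick=3000, delta=2.0):
            samples = []  # list of (frame, duration) pairs
            for frame in frames:
                if samples and np.abs(frame.astype(np.int32)
                                      - samples[-1][0].astype(np.int32)).mean() < delta:
                    samples[-1] = (samples[-1][0], samples[-1][1] + tick)
                else:
                    samples.append((frame, tick))
            return samples

    The surviving (frame, duration) pairs then map straight onto a timing table like the 'stts' sketch above.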
  • mikato - Friday, March 6, 2015 - link

    WTF is EMEA and when did it become a thing? Is it next to Oceania? Do we live in CONUS or maybe NASA? I'll be vacationing in ANZAC this summer (their winter). Don't get me started on POTUS, SCOTUS, and .... COTUS? (almost COITUS there... Sheldon is that you?) What's the point of an acronym (duh shortening) if you purposely add in "of the United States" to make it longer. Because you sound cool and politico saying it? Give me a break. Politico, I hate that! But I still made it an adjective. And what happened to the A for America? POTUSA? Sounds like a Sanjay Gupta CNN special.

    "as having to go with a TN panel to get higher refresh rates tends to be a case of one step forward, one step back"
    F*** TN.

    Friday morning stream of consciousness hate! lol. Go FreeSync, there. Now give me some Carrizo.
  • geok1ng - Friday, March 6, 2015 - link

    I do not get the part about IPS FreeSync without any mention of VA FreeSync. Last-gen VA panels reached faster GtG pixel transition times than competing IPS panels, and that without the use of overdrive.
    http://www.tftcentral.co.uk/reviews/philips_bdm406...
  • Oxford Guy - Monday, March 9, 2015 - link

    Unfortunately you're also looking at moderate input lag.
  • Will Robinson - Wednesday, March 11, 2015 - link

    Of course the sweet spot is with IPS and FreeSync.
    All-out gamers can grab the TN panels now for wicked-fast response numbers, but many users like me will wait for the rich color palette and plushness that a high-end IPS display brings.
    Good to see FreeSync is finally available; hope they do well and sell plenty.
