Meet The Radeon VII

First up is the design and build, and for the AMD Radeon VII the biggest change is immediately apparent: an open-air cooler. Keeping the sleek brushed-metal look of the previous RX Vega 64 Limited Edition and Liquid variants, AMD has forgone the blower for a triple axial fan setup, the standard custom AIB configuration for high-end cards.

While NVIDIA's GeForce RTX series went this way with open-air dual-fan coolers, AMD is no stranger to changing things up themselves. Aside from the RX Vega 64 Liquid, the R9 Fury X's AIO CLC was also quite impressive for a reference design. But as we mentioned with the Founders Edition cards, moving away from blowers for open-air means adopting a cooling configuration that can no longer guarantee complete self-cooling. That is, cooling effectiveness won't be independent of chassis airflow, or lack thereof. This is usually an issue for large OEMs that configure machines assuming blower-style cards, but this is less the case for the highest-end cards, which for pre-builts tend to come from boutique system integrators.

The move to open-air does benefit cooling at higher TDPs, and at 300W TBP the Radeon VII is indeed a higher-power card. While only 5W more than the RX Vega 64, there's presumably more localized heat with two more HBM2 stacks, plus the fact that roughly the same amount of power is being consumed on a smaller die area. And at 300W TBP, this would mean that essentially all the power savings from the smaller process were re-invested into performance. If higher clockspeeds are where the Radeon VII is bringing the majority of its speedup over RX Vega 64, then there would be little alternative to abandoning the blower.
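To illustrate the power-density point, here is a rough back-of-the-envelope comparison. The die sizes used (~495 mm² for Vega 10 in the RX Vega 64, ~331 mm² for Vega 20 in the Radeon VII) are commonly cited figures, and treating total board power as if it were all dissipated in the die is a simplification for illustration only:

```python
# Rough power-density comparison: RX Vega 64 (Vega 10) vs Radeon VII (Vega 20).
# TBP is total board power, so this overstates absolute die power density,
# but the relative comparison still shows why cooling gets harder.

def power_density(tbp_watts: float, die_mm2: float) -> float:
    """Board power per square millimetre of GPU die (W/mm^2)."""
    return tbp_watts / die_mm2

vega64 = power_density(295, 495)      # RX Vega 64: 295W TBP, ~495 mm^2 die
radeon_vii = power_density(300, 331)  # Radeon VII: 300W TBP, ~331 mm^2 die

print(f"RX Vega 64: {vega64:.2f} W/mm^2")
print(f"Radeon VII: {radeon_vii:.2f} W/mm^2")
```

Despite the TBP rising by only 5W, the heat per unit of die area climbs by roughly 50%, which is part of why a larger open-air cooler makes sense here.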

Returning to the Radeon VII build, then, the card naturally has dual 8-pin PCIe connectors, but lacks the dual-BIOS switch of the RX Vega cards that toggled a lower-power profile. And with the customary LEDs, the 'Radeon' on the side lights up, as does the 'R' cube in the corner.

In terms of display outputs, there are no surprises here with 3x DisplayPort and 1x HDMI.

A few teardowns of the card elsewhere revealed a vapor chamber configuration with a thermal pad for the TIM, rather than the usual paste. While thermal pads underperform paste in heat transfer, we know that the RX Vega cards ended up having molded and unmolded package variants, requiring specific instructions to manufacturers on the matter. So this might be a way to head off potential ASIC height-difference issues.


289 Comments


  • KateH - Friday, February 8, 2019 - link

    thirded on still enjoying SupCom! i have however long ago given up on attempting to find the ultimate system to run it. i7 920 @ 4.2Ghz, nope. FX-8150 @ 4.5Ghz, nope. The engine still demands more CPU for late-game AI swarms! (and i like playing on 81x81 maps which makes it much worse)
  • Korguz - Friday, February 8, 2019 - link

    Holliday75 and KateH

    ive run supcom on a i7 930 OC'd to 4.2 on a 7970, slow as molasses late in the game VS the AI, and on my current i7 5930k and strix 1060 and.. same thing.. very slow late in the game.... the later patches supposedly helped the game use more than 1 or 2 cores, i think Gas Powered games called it " multi core aware "

    makes me wonder how it would run on something newer like a threadripper, top end Ryzen or top end i7 and an i9 with a 1080 + vid card though, compared to my current comp....
  • eva02langley - Friday, February 8, 2019 - link

    Metal Gear Solid V, Street Fighter 5, Soulcalibur 6, Tekken 7, Senua Sacrifice...

    Basically, nothing from EA or Ubisoft or Activision or Epic.
  • ballsystemlord - Thursday, February 7, 2019 - link

    Oh oh! Would you be willing to post some FLOSS benchmarks? Xonotic, 0AD, Openclonk and Supertuxkart?
  • Manch - Friday, February 8, 2019 - link

    I would like to see a mixture of games that are dedicated to a singular API, and ones that support all three or at least two of them. I think that would make for a good spread.
  • Manch - Thursday, February 7, 2019 - link

    Not sure that I expected more. The clock for clock against the V64 is telling. @$400 for the V64 vs $700 for the VII, ummm....if you need a compute card as well sure, otherwise, Nvidia got the juice you want at better temps for the same price. Not a bad card, but it's not a great card either. I think a full 64 CUs may have improved things a bit more and even put it over the top.

    Could you do a clock for clock compare against the 56 since they have the same CU count?? I'd be curious to see this and extrapolate what a VII with 64CU's would perform like just for shits and giggles.
  • mapesdhs - Friday, February 8, 2019 - link

    Are you really suggesting that, given two products which are basically the same, you automatically default to NVIDIA because of temperatures?? This really is the NPC mindset at work. At least AMD isn't ripping you off with the price, Radeon VII is expensive to make, whereas NVIDIA's margin is enormous. Remember the 2080 is what should actually have been the 2070, the entire stack is a level higher than it should be, confirmed by die code numbers and the ludicrous fact that the 2070 doesn't support SLI.

    Otoh, Radeon VII is generally too expensive anyway; I get why AMD have done it, but really it's not the right way to tackle this market. They need to hit the midrange first and spread outwards. Stay out of it for a while, come back with a real hammer blow like they did with CPUs.
  • Manch - Friday, February 8, 2019 - link

    Well, they're not basically the same. Who's the NPC LOL? I have a V64 in my gaming rig. It's loud but I do like it for the price. The 2080 is a bit faster than the VII for the same price. It does run cooler and quieter. For some that is more important. If games is all you care about, get it. If you need compute, live with the noise and get the VII.

    I don't care how expensive it is to make. If AMD could put out a card at this level of performance they would and they would sell it at this price.
    Barely anyone uses SLI/Crossfire. It's not worth it. I previously had 2 290X 8GB in Crossfire. I needed a better card for VR, V64 was the answer. It's louder but it was far cheaper than competitors. The game bundle helped. Before that, I had bought a 1070 for the wife's computer. It was a good deal at the time. Some of yall get too attached to your brands get all frenzied at any criticism. I buy what suits my needs at the best price/perf.
  • AdhesiveTeflon - Friday, February 8, 2019 - link

    Not our fault AMD decided to make a video card with more expensive components and not beat the competition.
