Meet The Radeon VII

First things first is the design and build, and for the AMD Radeon VII the biggest change is immediately apparent: an open-air cooler. Keeping the sleek brushed-metal look of the previous RX Vega 64 Limited Edition and Liquid variants, AMD has forgone the blower for a triple axial fan setup, the standard configuration for custom high-end AIB cards.

While NVIDIA's GeForce RTX series went this way with open-air dual-fan coolers, AMD is no stranger to changing things up either. Aside from the RX Vega 64 Liquid, the R9 Fury X's AIO closed-loop cooler was also quite impressive for a reference design. But as we mentioned with the Founders Edition cards, moving away from blowers to open-air means adopting a cooling configuration that can no longer guarantee complete self-cooling. That is, cooling effectiveness is no longer independent of chassis airflow, or lack thereof. This is usually an issue for large OEMs that configure machines assuming blower-style cards, but it is less of one for the highest-end cards, which in pre-builts tend to come from boutique system integrators.

The move to open-air does benefit higher-TDP designs, and at a 300W typical board power (TBP) the Radeon VII is indeed a higher-power card. While only 5W more than the RX Vega 64's 295W, there is presumably more localized heat with two additional HBM2 stacks, plus the fact that roughly the same amount of power is being dissipated over a smaller die area. And at 300W TBP, this would mean that all power savings from the smaller process were re-invested into performance. If higher clockspeeds are where the Radeon VII gets the majority of its speedup over the RX Vega 64, then there was little alternative to abandoning the blower.
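The die-area point is easy to sanity-check with back-of-the-envelope numbers. A minimal sketch, assuming the commonly cited die sizes of roughly 495 mm² for Vega 10 (RX Vega 64) and 331 mm² for Vega 20 (Radeon VII); note that TBP covers the whole board rather than just the GPU die, so W/mm² here is only a loose proxy for on-die heat density:

```python
# Rough power-density comparison between Vega 10 and Vega 20.
# Die sizes are assumed from public figures; TBP is board power,
# not die power, so treat the result as a loose proxy.
cards = {
    "RX Vega 64 (Vega 10)": {"tbp_w": 295, "die_mm2": 495},
    "Radeon VII (Vega 20)": {"tbp_w": 300, "die_mm2": 331},
}

for name, c in cards.items():
    density = c["tbp_w"] / c["die_mm2"]
    print(f"{name}: {density:.2f} W/mm^2")
```

Under these assumptions the Radeon VII lands around 0.91 W/mm² versus roughly 0.60 W/mm² for the RX Vega 64, about 50% higher heat density, which is consistent with needing a beefier, open-air cooling solution.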

Returning to the Radeon VII build, then, the card naturally has dual 8-pin PCIe connectors, but lacks the BIOS switch of the RX Vega cards that toggled a lower-power BIOS. And with the customary LEDs, the 'Radeon' on the side lights up, as does the 'R' cube in the corner.

In terms of display outputs, there are no surprises here with 3x DisplayPort and 1x HDMI.

A few teardowns of the card elsewhere revealed a vapor chamber configuration with a thermal pad for the TIM, rather than the usual paste. While a pad is lower-performing in terms of heat transfer, it is more tolerant of height variation; we know that the RX Vega cards ended up shipping with molded and unmolded package variants, requiring specific instructions to manufacturers on the matter. So this might be a way to head off potential ASIC height difference issues.


289 Comments


  • peevee - Tuesday, February 12, 2019 - link

    "that the card operates at a less-than-native FP64 rate"

The chip is capable of 2x higher FP64 performance. Marketoids must die.
  • FreckledTrout - Thursday, February 7, 2019 - link

Performance-wise it did better than I expected. This card is pretty loud and runs a bit hot for my tastes. Nice review. Where are the 8K and 16K tests :)-
  • IGTrading - Thursday, February 7, 2019 - link

    When drivers mature, AMD Radeon VII will beat the GF 2080.

Just like the Radeon Fury X beats the GF 980 and the Radeon Vega 64 beats the GF 1080.

When drivers mature and nVIDIA's blatant sabotage against its older cards (and AMD's cards) gets mitigated, the long-time owner of the card will enjoy better performance.

    Unfortunately, on the power side, nVIDIA still has the edge, but I'm confident that those 16 GB of VRAM will really show their worth in the following year.
  • cfenton - Thursday, February 7, 2019 - link

    I'd rather have a card that performs better today than one that might perform better in two or three years. By that point, I'll already be looking at new cards.

    This card is very impressive for anyone who needs FP64 compute and lots of VRAM, but it's a tough sell if you primarily want it for games.
  • Benjiwenji - Thursday, February 7, 2019 - link

AMD cards have traditionally aged much better than Nvidia's. GamersNexus just re-benchmarked the 290X from 2013 on modern games and found it comparable to the 980, 1060, and 580.

The GTX 980 came in late 2014 with a $550 USD tag, and now struggles at 1440p.

Not to mention that you can get a lot out of AMD cards if you're willing to tinker. My Vega 56, which I got from Microcenter in Nov 2017 for $330 (a total steal), now performs at GTX 1080 level after a BIOS flash + OC.
  • eddman - Friday, February 8, 2019 - link

What are you talking about? The GTX 980 still performs as it should at 1440p.

    https://www.anandtech.com/bench/product/2142?vs=22...
  • Icehawk - Friday, February 8, 2019 - link

    My 970 does just fine too, I can play 1440p maxed or near maxed in everything - 4k in older/simpler games too (ie, Overwatch). I was planning on a new card this gen for 4k but pricing is just too high for the gains, going to hold off one more round...
  • Gastec - Tuesday, February 12, 2019 - link

    That's because, as the legend has it, Nvidia is or was in the past gimping their older generation cards via drivers.
  • kostaaspyrkas - Sunday, February 10, 2019 - link

At the same frame rates, Nvidia gameplay gives me a sense of choppiness... AMD Radeon feels more fluid...
  • yasamoka - Thursday, February 7, 2019 - link

    This wishful in-denial conjecture needs to stop.

1) The AMD Radeon VII is based on the Vega architecture, which has been on the market since June 2017. That's about 20 months. The drivers have had more than enough time to mature. It's obvious that in certain cases there are clear bottlenecks (e.g. GTA V), but this seems to be the fundamental nature of AMD's drivers when it comes to DX11 performance in games that issue a lot of draw calls. Holding out for improvements here isn't going to please you much.

    2) The Radeon Fury X was meant to go against the GTX 980Ti, not the GTX 980. The Fury, being slightly under the Fury X, would easily cover the GTX 980 performance bracket. The Fury X still doesn't beat the GTX 980Ti, particularly due to its limited VRAM where it even falls back in performance compared to the RX480 8GB and its siblings (RX580, RX590).

3) There is no evidence of Nvidia sabotaging any of its older cards when it comes to performance, and frankly your dig against GameWorks "sabotaging" AMD cards' performance is laughable when the same features, when enabled, also kill performance on Nvidia's own cards. PhysX has been open-source for 3 years and has now moved on to its 4th iteration, being used almost universally in game engines. How's that for vendor lockdown?

4) 16GB of VRAM will not even begin to show their worth in the next year. Wishful thinking, or more like licking up all the bad decisions AMD tends to make when it comes to product differentiation between their compute and gaming cards. It's baffling at this point that they still haven't learned to diverge their product lines and establish separate architectures in order to optimize power draw and bill of materials on the gaming card by cutting architectural features that are unneeded for gaming. 16GB is unneeded, 1TB/s of bandwidth is unneeded, HBM is expensive and unneeded. The RTX 2080 is averaging higher scores with half the bandwidth, half the VRAM capacity, and GDDR6.

The money is in the gaming market and the professional market. The prosumer market is a sliver in comparison. Look at what Nvidia does: each generation it releases a mere handful of mascots, all similar to one another (the Titan series), to take care of that sliver. You'd think they'd have a bigger portfolio if it were such a lucrative market? Meanwhile, on the gaming end, entire lineups. On the professional end, entire lineups (Quadro, Tesla).

    Get real.
