Following up on last week’s Radeon pricing observations, it looks like there has been one final shift in Radeon R9 290 series pricing. While the R9 290 has held steady at around $299 with the occasional small rebate, the R9 290X has continued to fall, dropping below the roughly $400 it was going for last week. The R9 290X has now leveled out at around $370, with a handful of cards going for even a bit less than that. At $370, the R9 290X is $30 cheaper than it was the week before, which puts it just $40 over the $330 MSRP of the GeForce GTX 970.

Though I had been expecting prices to fall further, I am a bit surprised to see R9 290X prices drop below $400 so soon. With GTX 900 series availability still being outstripped by demand, Radeon prices needed to come down from their initial MSRPs in reaction to the NVIDIA launch, though not necessarily this quickly. Regardless, this does mean that the R9 290X is in a better position than it was last week; AMD can’t completely close the technology gap with NVIDIA, but on a price/performance basis anything that brings the R9 290X closer to the similarly performing GTX 970 will help AMD’s partners move cards. In the meantime it’s worth noting that AMD appears to be sticking to their guns on influencing product value through game bundles rather than engaging in a pure price war, as the $370 R9 290X goes hand-in-hand with the continued inclusion of AMD’s Never Settle Forever bundle.

Speaking of game bundles, NVIDIA sends word this afternoon that Borderlands: The Pre-Sequel is now shipping to customers who received vouchers as part of NVIDIA’s recent game bundle. This bundle was never extended to the GTX 900 series – NVIDIA is clearly having no trouble selling those cards right now – but the offer is still active on the higher-end GTX 700 series cards as part of the company’s efforts to sell off the remaining GTX 770/780 inventory.

Fall 2014 GPU Pricing Comparison

AMD                               Price    NVIDIA
Radeon R9 295X2                   $1000
                                  $550     GeForce GTX 980
Radeon R9 290X                    $370
                                  $330     GeForce GTX 970
Radeon R9 290                     $300
Radeon R9 280X / Radeon R9 285    $250
Radeon R9 280                     $200     GeForce GTX 760
Comments

  • przemo_li - Wednesday, October 15, 2014 - link

    And that Mantle thing in a few games adds an extra bit.
  • silverblue - Wednesday, October 15, 2014 - link

    Get those Maxwell GPUs working on compute and their power usage quickly jumps up. Get those Hawaii GPUs on a better implementation of PowerTune and their power usage will drop a little.

    Really, all the cries of "AMD iz doomed!!!!1" are premature. I'd be surprised if they didn't have a high-end answer coming within six months.
  • TiGr1982 - Wednesday, October 15, 2014 - link

    I'm just curious: do you have any source on a next-gen Radeon, e.g. a 390X or whatever, which you say is just around the corner? Or is it just rumours currently?
  • The Von Matrices - Wednesday, October 15, 2014 - link

    If they have any official information, it's highly likely that they're under an NDA anyway and can't share it.
  • TiGr1982 - Wednesday, October 15, 2014 - link

    Yes, indeed, I guess everybody in the know is under an NDA - that's the usual practice.
    However, right now I think it may be beneficial for AMD itself to tease something somehow, to keep some people from switching to nV by buying GM204 - of course, only if AMD has something really new to tease.
  • IntelligentAj - Wednesday, October 15, 2014 - link

    Can someone explain to me why you would place power efficiency high on the list of reasons to pick one over the other? Is electricity that expensive in some places?
  • TiGr1982 - Wednesday, October 15, 2014 - link

    It's mainly not the electricity usage - it's noise, heat, and in some cases the need to upgrade the power supply (read: buy a new, better PSU and do something with the old one, e.g. sell it for pennies).
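
For a rough sense of the cost side of this question, here is a back-of-the-envelope sketch; the 100W delta between cards, the daily usage, and the electricity rate are all assumptions for illustration:

```python
# Rough annual electricity cost of a 100W difference between two GPUs.
# All inputs are assumptions: adjust for your own usage and local rates.
delta_watts = 100        # extra draw of the hungrier card under load
hours_per_day = 4        # assumed gaming hours per day
rate_per_kwh = 0.12      # assumed USD per kWh; varies widely by region

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# -> 146 kWh/year -> $17.52/year under these assumptions
```

At rates like these the dollar cost is modest, which is consistent with the replies here pointing at noise and heat instead.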
  • chizow - Wednesday, October 15, 2014 - link

    It's more about heat output and ambient room temps. Generally, one card at up to 250W is not going to make a huge difference for most high-end users - that's tolerable, I'd say, especially if the difference between options is <50W. But once you start tacking on multiple GPUs, with multiple displays and a high-end rig backing it all, the difference in ambient room temps between your PC gameroom/office and the rest of your house can be massive.

    I generally tell people to do the "lightbulb test" to illustrate the difference. Using a 100W oldschool incandescent bulb will noticeably impact room temps within just a few minutes. In that respect, there is no difference between that bulb and the 100W gap separating a card that consumes 150W from one that consumes 250W.

    Most people are more tolerant in the winter time, when they can just use the PC as a source of heat, or open a window, but in warmer summer months, there is no escaping it unless you crank up the AC.
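
To put the "lightbulb test" in numbers: essentially all of a card's electrical draw ends up as heat in the room, so wattage converts directly to heating. This is a minimal sketch; the wattages are illustrative, while the 3.412 BTU/hr-per-watt conversion factor is standard:

```python
# Convert sustained power draw into heat output in BTU/hr.
# A GPU's electrical draw is ultimately dissipated as heat in the room,
# so a 100W delta between cards behaves like a 100W space heater.
BTU_PER_HR_PER_WATT = 3.412

def heat_btu_per_hr(watts: float) -> float:
    """Heat dumped into the room, in BTU/hr, for a given power draw."""
    return watts * BTU_PER_HR_PER_WATT

# bulb / single high-end GPU / a 2x250W multi-GPU setup
for watts in (100, 250, 500):
    print(f"{watts:>3}W -> {heat_btu_per_hr(watts):.0f} BTU/hr")
# 100W -> 341 BTU/hr; 250W -> 853 BTU/hr; 500W -> 1706 BTU/hr
# For scale, a small window air conditioner is rated around 5000 BTU/hr.
```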
  • silverblue - Wednesday, October 15, 2014 - link

    I can't say I've ever felt a temperature difference from having a 100W incandescent bulb on; I suppose the size of the room and your proximity to the bulb would be important factors. However, I have been in a small bathroom with a 250W bulb before, and it was quite a thing to behold, both in terms of light and heat output.

    If AMD were to eke out decent performance gains from their drivers over the next few months, the apparent power consumption gap may not matter so much. Here's hoping they have a team dedicated to looking at their D3D performance.
  • chizow - Wednesday, October 15, 2014 - link

    Haha, yeah, 250W might even leave you with a bit of a tan! But yes, proximity/size of room definitely matters before you feel it. I use a floor-standing lamp next to my PC desk, and as soon as I turn it on I feel the difference in the room within a matter of minutes (my home office is maybe 10'x14'). Over time, however, the ambient temps in the room will rise further.

    But there are other examples people can relate to from their own upgrading/buying experiences. For example, going from an OC'd X58 rig to my current Z87 rig dropped light PC usage power draw by ~75W, from 200W to ~125W - also noticeable. And a few years ago, going from 2xGTX 480 to 2xGTX 670 made a massive difference in ambient room temps, to the point that I won't ever go with 2x250W GPUs again. I've since swapped those 2x670s for a single 980 and again saw a big drop, but I will probably go back to 2x980 in the next few months.
