Meet The EVGA GeForce GTX 1660 Ti XC Black GAMING

As a pure virtual launch, the release of the GeForce GTX 1660 Ti does not bring any Founders Edition model, so everything is in the hands of NVIDIA’s add-in board partners. Today we’re looking at EVGA’s GeForce GTX 1660 Ti XC Black GAMING, a 2.75-slot, single-fan card with reference clocks and a slightly increased TDP of 130W (up from the 120W reference spec).

GeForce GTX 1660 Ti Card Comparison
              GTX 1660 Ti (Reference)    EVGA GTX 1660 Ti XC Black GAMING
Base Clock    1500MHz                    1500MHz
Boost Clock   1770MHz                    1770MHz
Memory Clock  12Gbps GDDR6               12Gbps GDDR6
VRAM          6GB                        6GB
TDP           120W                       130W
Length        N/A                        7.48"
Width         N/A                        2.75-Slot
Cooler Type   N/A                        Open Air
Price         $279                       $279

With the GTX 1660 Ti intended to replace the GTX 1060 6GB, EVGA’s cooler and card design is likewise new and improved compared to their Pascal cards. The design was first introduced with the RTX 20-series, when EVGA rolled out their iCX2 cooling design and new “XC” card branding to complement the existing SC and Gaming series. As we’ve seen before, the iCX platform comprises a medley of features, and some of the core technology is utilized even when the full iCX suite isn’t. For one, EVGA reworked their cooler design around hydraulic dynamic bearing (HDB) fans, which offer lower noise and longer lifespans than sleeve and ball bearing types, and these are present on the GTX 1660 Ti XC Black.

In general, the card shares the design of the RTX 2060 XC, complete with those new raised EVGA ‘E’s on the fans, intended to improve slipstream. The single-fan RTX 2060 XC was paired with a thinner dual-fan XC Ultra variant, and in the same vein the GTX 1660 Ti XC Black is a one-fan design that essentially occupies three slots due to its thick heatsink and correspondingly taller fan hub. Being so short, though, makes it a natural fit for mini-ITX builds.

As one of the cards lower down the RTX 20 and now GTX 16 series stack, the GTX 1660 Ti XC Black also lacks LEDs and zero-dB fan capability, where the fans turn off completely at low idle temperatures. The former is an eternal matter of taste, as opposed to the practicality of the latter, but both tend to be perks of premium models and/or higher-end GPUs. Putting price aside for the moment, the reference-clocked GTX 1660 Ti and RTX 2060 XC Black editions are the more mainstream variants anyhow.

Otherwise, the GTX 1660 Ti XC Black unsurprisingly lacks a USB-C/VirtualLink output, instead offering the mainstream-friendly 1x DisplayPort/1x HDMI/1x DVI setup. Although the TU116 GPU still supports VirtualLink, the decision to implement it is up to partners, and the feature is less applicable for cards further down the stack, where cards are more sensitive to cost and less likely to be used for VR. Additionally, the 30W USB-C controller power budget could be a significant amount relative to the overall TDP (a full 30W would be a quarter of the 120W reference spec).

And on the topic of power, the GTX 1660 Ti XC Black’s power limit is capped at the default 130W, even though the card’s single 8-pin PCIe power connector could theoretically supply 150W on its own, to say nothing of the 75W available from the PCIe slot.
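
For readers who want to check the enforced limit on their own card, NVIDIA exposes the power-limit range through NVML. Below is a minimal sketch using the pynvml Python bindings; it is not part of our test methodology, and it assumes the NVIDIA driver and pynvml are installed and that the card of interest is device 0.

```python
# Minimal sketch: query a card's power-limit range via NVML (pynvml bindings).
# Assumes the NVIDIA driver is installed and the card of interest is device 0.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older pynvml versions return bytes
    name = name.decode()

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
enforced_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

print(name)
print(f"Default power limit:  {default_mw / 1000:.0f} W")
print(f"Enforced power limit: {enforced_mw / 1000:.0f} W")
print(f"Allowed range:        {min_mw / 1000:.0f} W - {max_mw / 1000:.0f} W")
# On a card whose limit is capped at its default, as described above,
# the top of the allowed range will match the default value.

pynvml.nvmlShutdown()
```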

The other GPU-tweaking knobs are all there for your overclocking needs, and for EVGA this goes hand-in-hand with Precision, their overclocking utility. For NVIDIA’s Turing cards, EVGA released Precision X1, which allows modifying the voltage-frequency curve and running automated overclock scanning as part of Turing’s GPU Boost 4. Of course, NVIDIA’s restriction on actual overvolting is still in place, and for Turing the voltage is capped at 1.068v.
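
As a rough illustration of what GPU Boost reacts to, the clock, temperature, and power telemetry can also be polled through NVML. The sketch below (again using the pynvml bindings, with device 0 assumed) simply samples those values once a second while you run a game or benchmark; it only observes boost behavior, since actual overclocking goes through vendor tools such as Precision X1.

```python
# Minimal sketch: poll the telemetry GPU Boost responds to (clock, temperature,
# power draw) once per second. Observation only - no clocks are changed.
# Assumes the NVIDIA driver is installed and the card of interest is device 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    for _ in range(10):  # sample for roughly ten seconds
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # reported in mW
        print(f"core {clock_mhz:4d} MHz | {temp_c:2d} C | {power_w:5.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```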

Comments (157)

  • Yojimbo - Saturday, February 23, 2019 - link

    My guess is that in the next (7 nm) generation, NVIDIA will create the RTX 3050 to have a very similar number of "RTX-ops" (and, more importantly, actual RTX performance) as the RTX 2060, thereby setting the capabilities of the RTX 2060 as the minimum targetable hardware for developers to apply RTX enhancements for years to come.
  • Yojimbo - Saturday, February 23, 2019 - link

    I wish there were an edit button. I just want to say that this makes sense, even if it eats into their margins somewhat in the short term. Right now people are upset over the price of the new cards, but that will pass assuming RTX actually proves to be successful in the future. However, if RTX does become successful but the people who paid to be early adopters of lower-end RTX hardware end up getting squeezed out of the ray-tracing picture, that is something people will be upset about, and which NVIDIA wouldn't overcome so easily. To protect their brand image, NVIDIA needs a plan to make present RTX purchases useful in the future, given that they aren't all that useful in the present. They can't betray the faith of their customers. So with that in mind, disabling perfectly capable RTX hardware on lower-end cards makes sense.
  • u.of.ipod - Friday, February 22, 2019 - link

    As an SFFPC (mITX) user, I'm enjoying the thicker, but shorter, card as it makes for easier packaging.
    Additionally, I'm enjoying the performance of a 1070 at reduced power consumption (20-30W less), and therefore less noise and heat!
  • eastcoast_pete - Friday, February 22, 2019 - link

    Thanks! Also a bit disappointed by NVIDIA's continued refusal to "allow" a full 8 GB VRAM in these middle-class cards. As to the card makers omitting the USB3 C port required for VR, I hope that some others will offer it. Yes, it will add $20-30 to the price, but I don't believe I am the only one who's like the option to try some VR gaming out on a more affordable card before deciding to start saving money for a full premium card. However, how is VR on Nvidia with 6 GB VRAM? Is it doable/bearable/okay/great?
  • eastcoast_pete - Friday, February 22, 2019 - link

    "who'd like the option". Google keyboard, your autocorrect needs work and maybe some grammar lessons.
  • Yojimbo - Friday, February 22, 2019 - link

    Wow, is a USB3C port really that expensive?
  • GreenReaper - Friday, February 22, 2019 - link

    It might start to get closer once you throw in the circuitry needed for delivering 27W of power at different levels, and any bridge chips required.
  • OolonCaluphid - Friday, February 22, 2019 - link

    >However, how is VR on Nvidia with 6 GB VRAM? Is it doable/bearable/okay/great?

    It's 'fine' - the GTX 1050 Ti is VR capable with only 4GB VRAM, although it's not really advisable (see Craft Computing's 1050 Ti VR assessment on YouTube - it's perfectly usable and a fun experience). The RTX 2060 is a very capable VR GPU, with 6GB VRAM. It's not really VRAM that is critical to VR GPU performance anyway - more the raw compute performance needed to render the same scene from two viewpoints simultaneously. So I'd assess that the 1660 Ti is a perfectly viable entry-level VR GPU. Just don't expect miracles.
  • eastcoast_pete - Saturday, February 23, 2019 - link

    Thanks for the info! About the miracles: Learned a long time ago not to expect those from either Nvidia or AMD - fewer disappointments this way.
  • cfenton - Friday, February 22, 2019 - link

    You don't need a USB C port for VR, at least not with the two major headsets on the market today.
