60 Comments

  • extide - Wednesday, June 20, 2012 - link

    For posting folding benchmarks! A lot of people really appreciate that!
  • Zink - Wednesday, June 20, 2012 - link

    +1
    No one else uses your benchmarking tool, and it doesn't always correlate to performance with current F@H projects, but that is the only reason I care about GPUs.
  • Marlin1975 - Wednesday, June 20, 2012 - link

    Good design if it had GDDR5. If they can do 2GB of GDDR5 then it'd be a great mid-priced card.
  • Homeles - Wednesday, June 20, 2012 - link

    It would still be terrible until the price dropped.
  • Samus - Thursday, June 21, 2012 - link

    There's no reason this wouldn't be similar in speed to a GTX460 if it had GDDR5. The only difference would be the 128-bit vs 192-bit memory bus; everything else would be an advantage: same number of cores, substantially higher clock speed, lower power consumption increasing overclocking headroom, etc.
  • MrSpadge - Thursday, June 21, 2012 - link

    You forget: substantially lower shader clock speed, coarser shader grouping -> more difficult to use them all at once, and software scheduling -> needs a better compiler, can't do runtime optimizations.
  • t_case - Wednesday, June 20, 2012 - link

    So who has the Sony VPL-vw1000ES? Now that's a nice projector... only roughly the price of a new car heh.
  • stephenasmith - Wednesday, June 20, 2012 - link

    I love me some painfully slow gaming!
  • nitrousoxide - Wednesday, June 20, 2012 - link

    Just curious if the most powerful IGP can keep up with entry-level Kepler.
  • Roland00Address - Wednesday, June 20, 2012 - link

    But this should give you an idea of what performance you would be getting with Llano. (Numbers taken from the Llano review that appeared 12 months ago, so drivers will be old.)

    Crysis Warhead 1680x1050 performance quality
    A8-3850 (HD 6550D) with 1600MHz memory
    58.8 fps
    A8-3850 (HD 6550D) with 1866MHz memory
    62.5 fps
    GT640
    99.8 fps

    This makes the 640 about 69.7% faster than a non-overclocked Llano (most people are going to have 1600MHz memory).
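    A quick check of that percentage (a Python sketch, using the fps values quoted above):

```python
# Speedup of the GT 640 over the stock-memory Llano, from the fps quoted above.
llano_stock_fps = 58.8   # Llano with 1600MHz memory
gt640_fps = 99.8
speedup_pct = (gt640_fps / llano_stock_fps - 1) * 100
print(f"GT 640 is {speedup_pct:.1f}% faster")  # ~69.7%
```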
  • Joe H - Wednesday, June 20, 2012 - link

    This is the type of review that other hardware sites can't even imagine, let alone write. Thanks for putting this and the other HTPC articles together. It's great to see a hardware review site taking HTPC enthusiasts and their needs seriously. Excellent review.
  • n0b0dykn0ws - Wednesday, June 20, 2012 - link

    Is there a chance of a follow up once a few driver updates have been released?

    I would love to see if the card gets even better after a few releases.

    I have a Radeon 6570 right now, and I've found it to be palatable for HTPC purposes.

    n0b0dykn0ws
  • Taft12 - Wednesday, June 20, 2012 - link

    They haven't done it before, I don't know why they'd start now.
  • Ryan Smith - Thursday, June 21, 2012 - link

    What specifically are you looking for? Gaming performance or HTPC functionality? Gaming performance isn't likely to improve; even with the newer architecture, it's not Kepler that's the limiting factor but memory bandwidth. HTPC functionality, on the other hand, can easily be improved with drivers.
  • n0b0dykn0ws - Thursday, June 21, 2012 - link

    HTPC only. For gaming I would get a 670.

    Sometimes drivers break HTPC performance/quality though. At least in the AMD world.

    n0b0dykn0ws
  • Kevin G - Wednesday, June 20, 2012 - link

    If they're going to release a DDR3 version, why not just offer a version with no onboard memory and two DIMM slots so that users can add their own? You can get a DDR3-2133 kit, which would boost bandwidth-limited scenarios by roughly 15%. While I don't see the need, such a card could be upgraded all the way to 16GB of memory.
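    Rough bandwidth math for that idea (the stock DDR3 speed below is an assumption for illustration; the exact uplift depends on what the card actually ships with):

```python
# Peak DDR3 bandwidth on the GT 640's 128-bit bus: rate (MT/s) x bus width (bytes).
def bandwidth_gb_s(mt_per_s, bus_bits=128):
    return mt_per_s * 1e6 * (bus_bits // 8) / 1e9

stock = bandwidth_gb_s(1800)      # ~28.8 GB/s, assuming DDR3-1800 stock
upgraded = bandwidth_gb_s(2133)   # ~34.1 GB/s with a DDR3-2133 kit
print(f"{stock:.1f} GB/s -> {upgraded:.1f} GB/s")
```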
  • MrSpadge - Thursday, June 21, 2012 - link

    Sockets
    - are unconventional (I don't think nVidia likes this word)
    - introduce a little cost (GPU manufacturer doesn't like it)
    - make the board larger (GPU manufacturer doesn't like it)
    - make the bus timing worse, so it's harder to clock them as high as directly soldered chips (wouldn't matter with DDR3, though)
    - introduce another point of failure (GPU manufacturer doesn't like higher RMA rates)
    - add cost to the overall product, as the end user wouldn't get as sweet a deal on RAM as the GPU manufacturer (this would eat into the GPU manufacturer's profit margin)
  • Stuka87 - Wednesday, June 20, 2012 - link

    Sounds like unless temps are really important to you, the 7750-800 is by far the better choice. It outperforms the GT640 (and by a wide margin in some cases) in what looks like every single test.

    And they are priced the same, which makes the GT640 kind of worthless for its intended price point.
  • cjs150 - Wednesday, June 20, 2012 - link

    Great review.

    It is too noisy, and the HDMI socket is an epic design fail. For a card aimed at HTPCs, what was Zotac thinking? This is so badly wrong.

    Now onto frame rates. Either Nvidia, AMD and Intel really are total and utter idiots, or they have decided that we the customers are total and utter idiots. There is simply no excuse for all IGPs and video cards not being able to lock on to the correct frame rate with absolute precision. It is not as though the frame rate specs for film have changed recently. I cannot decide whether it is sloppiness, arrogance, or that they simply do not give a rat's a##e about the customer experience.
  • Stuka87 - Wednesday, June 20, 2012 - link

    God forbid there be a technical reason for it....
  • cjs150 - Thursday, June 21, 2012 - link

    "God forbid there be a technical reason for it.... "

    Intel and Nvidia have had several generations of chips to fix any technical issue and didn't (HD4000 is good enough, though). AMD have been pretty close to the correct frame rate for a while.

    But it is not enough to have the capability to run at the correct frame rate if you make it too difficult to change to the correct setting. That is not a hardware issue, just bad software design.
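    To put numbers on why precision matters: even a tiny rate mismatch forces a dropped or repeated frame at regular intervals. A quick sketch:

```python
from fractions import Fraction

# If the GPU outputs a flat 24.000Hz for 23.976fps film content, the player
# must drop or repeat a frame once every 1/|refresh - content| seconds.
content = Fraction(24000, 1001)   # 23.976... fps (film on Blu-ray)
refresh = Fraction(24, 1)         # output locked at exactly 24.000Hz
seconds_between_hitches = 1 / abs(refresh - content)
print(float(seconds_between_hitches))  # a visible hitch roughly every 41.7 seconds
```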
  • UltraTech79 - Wednesday, June 20, 2012 - link

    Anyone else really disappointed in 4K still being standardized around 24 fps? I thought 60 would be the minimum standard by now, with 120 in higher-end displays. 24 is crap. Anyone that has seen a movie recorded at 48+ FPS knows what I'm talking about.

    This is like putting shitty unleaded gas into a super high-tech racecar.
  • cjs150 - Thursday, June 21, 2012 - link

    You do know that Blu-ray is displayed at 23.976 FPS? That looks very good to me.

    Please do not confuse screen refresh rates with frame rates. Screen refresh on most large TVs runs at between 60 and 120Hz; anything below 60 tends to look crap. (If you want real crap, try running American TV on a European PAL system - I mean crap in a technical sense, not creatively!)

    I must admit that having an fps of 23.976 rather than some round number such as 24 (or higher) is rather daft, and some new films are coming out with much higher frame rates. I have a horrible recollection that the reason for such an odd number is very historic - something to do with the length of 35mm film that would be needed per second. The problem is I cannot remember whether that was simply because 35mm film was expensive and 24 fps was the minimum to provide smooth movement, or whether it goes right back to the days when film had a tendency to catch light, and 24 fps was the maximum speed you could run film through a projector without friction setting it alight. No doubt there is an expert on this site who could explain precisely why we ended up with such a silly number as the standard.
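    For what it's worth, the widely documented answer is television rather than film economics. A quick derivation:

```python
from fractions import Fraction

# The odd 23.976 isn't about 35mm stock: it comes from NTSC color television.
# The color subcarrier forced the nominal 30Hz rate down by a factor of
# 1000/1001, and 24fps film slowed by the same factor for telecine gives
# the familiar 23.976.
ntsc_color = Fraction(30) * Fraction(1000, 1001)     # 29.970... fps
film_on_ntsc = Fraction(24) * Fraction(1000, 1001)   # 23.976... fps
print(float(ntsc_color), float(film_on_ntsc))
```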
  • UltraTech79 - Friday, June 22, 2012 - link

    You are confusing things here. I clearly said 120 (fps) would need higher-end displays (120Hz). I was rounding 23.976 FPS up to 24, give me a break.

    "It looks very good to me" is wholly irrelevant. Do you realize how many people said "it looks very good to me" about SD when resisting the HD movement? Or how many will say it again about 1080p, thinking 4K is too much? It's a ridiculous mindset.

    My point was that we are upping the resolution but leaving another very important aspect in the dust. Even audio is moving faster than framerates in movies, and now that most places are switching to digital, the cost to go to the next step has dropped dramatically.
  • nathanddrews - Friday, June 22, 2012 - link

    It was NVIDIA's choice to only implement 4K @ 24Hz (23.xxx) due to limitations of HDMI. If NVIDIA had optimized around DisplayPort, you could then have 4K @ 60Hz.

    For computer use, anything under 60Hz is unacceptable. For movies, 24Hz has been the standard for nearly a century - all film is 24fps and most movies are still shot on film. In the next decade, there will be more and more films that use 48, 60, even 120fps. Cameron was cock-blocked by the studio when he wanted to film Avatar at 60fps, but he may get his wish for the sequels. Jackson is currently filming The Hobbit at 48fps. Eventually all will be right with the world.
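    The HDMI limit is easy to see from the pixel clock (the blanking overhead below is a rough assumption, but the conclusion holds):

```python
# Rough pixel-clock math behind the HDMI limit. HDMI 1.4's TMDS clock tops
# out at 340MHz, which is why 4K is stuck at 24-30Hz; DisplayPort 1.2 has
# the headroom for 4K60. The 35% blanking overhead is an assumed figure.
def pixel_clock_mhz(width, height, hz, blanking_factor=1.35):
    return width * height * hz * blanking_factor / 1e6

print(pixel_clock_mhz(3840, 2160, 24))  # ~269 MHz: fits under HDMI 1.4's 340MHz
print(pixel_clock_mhz(3840, 2160, 60))  # ~672 MHz: needs DisplayPort
```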
  • karasaj - Wednesday, June 20, 2012 - link

    If we wanted to use this to compare a 640M or 640M LE to the GT640, is this doable? If it's built on the same chip (both have 384 CUDA cores), can we just reduce the numbers by a rough percentage of the core clock speed to get rough numbers for what the respective cards would put out? I.e. the 640M LE has a clock of 500MHz, and the 640M is ~625MHz. Could we expect ~55% of this for the 640M LE and 67% for the 640M? Assuming DDR3 on both so as not to have that kind of difference.
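    The naive clock-ratio estimate being proposed looks like this (treat it as an upper bound only: with DDR3 these cards are memory-bandwidth limited, so real performance won't track core clock linearly):

```python
# Hypothetical clock-ratio scaling; the mobile clocks are the ones quoted
# in the comment, the 900MHz desktop clock is assumed.
desktop_clock_mhz = 900   # desktop GT 640 core clock
for name, clock_mhz in [("640M LE", 500), ("640M", 625)]:
    print(f"{name}: ~{clock_mhz / desktop_clock_mhz:.0%} of the desktop GT 640")
```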
  • Ryan Smith - Wednesday, June 20, 2012 - link

    It would be fairly easy to test a desktop card at a mobile card's clocks (assuming memory type and functional unit count was equal) but you can't extrapolate performance like that because there's more to performance than clockspeeds. In practice performance shouldn't drop by that much since we're already memory bandwidth bottlenecked with DDR3.
  • jstabb - Wednesday, June 20, 2012 - link

    Can you verify if creating a custom resolution breaks 3D (frame packed) blu-ray playback?

    With my GT430, once a custom resolution has been created for 23/24Hz, that custom resolution overrides the 3D frame-packed resolution created when 3D Vision is enabled. The driver appeared to have simple fall-through logic: if a custom resolution is defined for the selected resolution/refresh rate it is always used; failing that, it will use a 3D resolution if one is defined; failing that, it will use the default 2D resolution.

    This issue made the custom resolution feature useless to me with the GT430 and pushed me to an AMD solution for their better OOTB refresh rate matching. I'd like to consider this card if the issue has been resolved.

    Thanks for the great review!
  • MrSpadge - Wednesday, June 20, 2012 - link

    It consumes just about as much power as the HD7750-800, yet performs miserably in comparison. This is an amazing win for AMD, especially comparing the GTX680 and HD7970!
  • UltraTech79 - Wednesday, June 20, 2012 - link

    This performs about as well as an 8800GTS at twice the price, or half the performance of a GTX460 at the same price.

    These should have been priced at $59.99.
  • MrSpadge - Thursday, June 21, 2012 - link

    And they support newer features and cost significantly less to run. Still, the price is ridiculous, especially for DDR3.
  • Taft12 - Wednesday, June 20, 2012 - link

    "Zotac has worked themselves into an interesting position as the only partner currently offering a single-slot card"

    I think EVGA's launched even before Zotac's. No blocked mini-HDMI port, either!

    http://www.newegg.ca/Product/Product.aspx?Item=N82...
  • nathanddrews - Wednesday, June 20, 2012 - link

    First off, I want to thank you for testing the 4K capabilities of this card. While disappointing that there is no DP output for 4K@60Hz, I suppose it's only a matter of time.

    Second, and more importantly, I wanted to make you aware of this in case you haven't seen it. Timescapes was shot in 4K, edited in 4K, mastered in 4K, and you can buy it in any format including Blu-ray (1080p), 2560x1440p, and even its raw 140GB 4K Cineform version. Seeing as how one of you awesome people now has the Sony 1000ES (jealous!), you definitely shouldn't waste time showing 4K YouTube clips!

    http://timescapes.org/products/default.aspx
    http://www.youtube.com/watch?v=e-GYrbecb88
  • Hrel - Wednesday, June 20, 2012 - link

    I saw a GTX560 on Newegg for $145 after MIR today. Whenever people ask me about gaming cards and say they don't want to spend much more than 100 bucks, I say cut back on coffee for a week or skip that night at the bars and just spend the extra 30 bucks or so. It makes absolutely NO sense to handicap yourself over 30 bucks. GTX560 FTW!!!
  • just4U - Thursday, June 21, 2012 - link

    That's pretty much where the 7750 comes in. Performance overall is fairly similar. Plus you can do away with the confusion, since the 560 comes in 4-5 different flavors, yes?
  • maroon1 - Saturday, June 23, 2012 - link

    What ?! GTX 560 is much faster than even HD7770.

    http://www.anandtech.com/show/5541/amd-radeon-hd-7...
  • Lolimaster - Wednesday, June 20, 2012 - link

    So slow that a 6670 feels like high end and a 7750 a total monster. $109 for this joke of a GPU (GT640)? You must be trolling.
  • Lolimaster - Wednesday, June 20, 2012 - link

    The A8-5800K is only a tiny bit slower than a 6670, so it should still be faster (for free, with the APU) than this $109 Nvidia discrete GPU.
  • bhima - Wednesday, June 20, 2012 - link

    This card is horrible. I was initially unimpressed with AMD's 7750 and 7770 performance, but now those cards just look beastly compared to this. This should be a $50-60 card at most for that kind of performance. Hell, I think my 540M performs almost as well as this card.
  • staryoshi - Wednesday, June 20, 2012 - link

    GDDR5 would have really lifted the performance of this card. I'm sure they went with DDR3 as a cost-saving measure and to not cannibalize the sales of other cards, but at this price point it's not a compelling item at all for most.

    They really need to get the 28nm process under control and wrangle in pricing on this pup.
  • HighTech4US - Wednesday, June 20, 2012 - link

    At least when the GT240 was released it came in both DDR3 and GDDR5 versions.
  • UNhooked - Wednesday, June 20, 2012 - link

    I wish there was some sort of Video encoding benchmark. I have been told AMD/ATI cards aren't very good when it comes to video encoding.
  • mosu - Thursday, June 21, 2012 - link

    Who told you that kind of crap? Please check the internet.
  • Rumpelstiltstein - Thursday, June 21, 2012 - link

    Did this low-end offering really manage to pull off these kinds of numbers? I'm impressed. Not something I would buy personally, but I would have no problem recommending this to someone else.
  • Samus - Thursday, June 21, 2012 - link

    DDR3....ruined a perfectly good chip.
  • Deanjo - Thursday, June 21, 2012 - link


    "Really the only thing we don’t have a good handle on for HTPC usage right now is video encoding through NVENC. We’ve already seen NVENC in action with beta-quality programs on the GTX 680, but at this point we’re waiting on retail programs to ship with support for both NVENC and VCE so that we can better evaluate how well these programs integrate into the HTPC experience along with evaluating the two encoders side-by-side. But for that it looks like we won’t have our answer until next month."


    Noooooo! Come on, post some benchmarks as it is right now. Some of us do not want to wait for AMD to get their VCE in order. People have been waiting for VCE for months and there is no valid reason to hold off NVENC waiting for their competitor to catch up. When and if VCE support comes out then run a comparison then.
  • ganeshts - Thursday, June 21, 2012 - link

    NVIDIA indicated that official NVENC support in CyberLink / ArcSoft transcoding applications would come in July only. Till then, it is beta, and has scope for bugs.
  • Deanjo - Thursday, June 21, 2012 - link

    So? That didn't prevent them from benching Trinity and its encoding capabilities despite it all being beta there.

    http://www.anandtech.com/show/5835/testing-opencl-...
  • drizzo4shizzo - Thursday, June 21, 2012 - link

    So... do these new cards still support HDTV 1080i analog signals for those of us who refuse to give up our 150lb 34" HDTV CRTs?

    i.e. ship with a breakout dongle cable that plugs into the DVI-I port? If they don't ship with one, can anyone tell me if they are compatible with a 3rd-party solution? For it to work the card has to convert to the YUV colorspace. My old 7600GT *did* support this feature, but none of the new cards mention it...

    Upgrading my TV also means buying a new receiver for HDMI switching to the projector, fishing cable in walls, and all manner of other unacceptable tradeoffs. Plus monay.

    Thanks!
  • philipma1957 - Thursday, June 21, 2012 - link

    I have a sapphire hd7750 ultimate passive cooled card.

    This card seems to be worse in every case except it is 1 slot not 2.

    The passive HD7750 is $125; this is $110.

    I am not sure that I would want this until they make a passive version.
  • saturn85 - Thursday, June 21, 2012 - link

    great folding@home benchmark.
  • kallogan - Thursday, June 21, 2012 - link

    WORST GPU EVER
  • dertechie - Friday, June 22, 2012 - link

    Here's hoping DDR4 is cheap and cheerful enough for low-end cards when it comes out, because this is ridiculous. We have here a card with 50% more shader horsepower than an 8800 Ultra, and 70% less memory bandwidth. Way to ruin a perfectly good GPU by not shipping it with real memory.

    My old 7900 GS had more memory bandwidth than this... in 2006.
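    The numbers behind that comparison (the clocks here are from memory; treat them as assumptions):

```python
# Peak memory bandwidth: transfer rate (MT/s) x bus width (bytes).
def bandwidth_gb_s(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * (bus_bits // 8) / 1e9

gt640_ddr3 = bandwidth_gb_s(1800, 128)   # ~28.8 GB/s in 2012
gf7900gs = bandwidth_gb_s(1320, 256)     # ~42.2 GB/s in 2006
print(f"GT 640 DDR3: {gt640_ddr3:.1f} GB/s, 7900 GS: {gf7900gs:.1f} GB/s")
```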
  • skgiven - Saturday, June 23, 2012 - link

    At GPUGRID, the CUDA 4.2 crunching performance of the GT 640 matches that of a GTX460.
    65W TDP vs 150W TDP.
    The low running cost and the lack of any high-end PSU or 6-pin power cable requirement make it a good entry card for crunchers.
    The 950MHz GDDR5 version (75W TDP) and the 797MHz DDR3 version (50W TDP) should also perform well.
  • anac6767 - Thursday, June 28, 2012 - link

    A video card with a fan on it has no place in a modern HTPC... we're well past that. You might as well order a full-tower ('80s off-white) ATX case and corded peripherals to go along with your noisy card.
  • infoilrator - Wednesday, July 4, 2012 - link

    Not appealing at this price.

    In the FWIW department, mITX motherboards that only take single-slot cards are maybe due for an upgrade.
    Maybe a motherboard could mount connectors sideways to allow fitting a two-slot card.
    Maybe mITX cases could come with provision for two-slot cards.

    If the numbers are right, AMD Llano/Trinity and Intel IVB HD4000 make more sense than adding a $100 discrete card with limited capabilities. At least at the moment.
    I am seeing AMD Llano 3850/A75 combinations for $150 in mATX. Better, even though I find FM1 limiting.

    Contemplating such an FM1 or FM2 build in a couple of months. Unless I go after more GPU power.

    Still new at these decisions.
  • Felip3 - Saturday, July 7, 2012 - link

    Look what I found ...
    http://www.gainward.com/main/vgapro.php?id=886&...
  • xeizo - Friday, July 27, 2012 - link

    That's old Fermi, not new Kepler; rather uninteresting even though it does have GDDR5. A passive GT640 with GDDR5 would be interesting, but seems nonexistent. Too bad!
  • Montmac - Friday, March 1, 2013 - link

    Don't expect Zotac to admit this when you call them to try and get a replacement card. One of the high ups told me they had never heard of this problem.

    However another in tech support told me he had and will be sending me a call tag to get the card I just bought replaced.

    It has taken almost 4 weeks to get this accomplished. I'm not very impressed with Zotac at all.

    When a company manufactures something wrong, it shouldn't be a problem getting an exchange, but that's not the case with them.
