Meet The Zotac GeForce GT 640 DDR3

As we mentioned in our introduction, NVIDIA is not sampling a reference GT 640, leaving that task to their partners. As always NVIDIA’s partners have spread out with a variety of designs, ranging from low-profile cards to relatively large double-slot cards, with a few even trying their hand at factory overclocking. Today we’ll be taking a look at Zotac’s GeForce GT 640, a design that’s right in the middle of those extremes and fairly close to NVIDIA’s internal reference design.

Taking things from the top, Zotac is at the moment in a unique position with their design: they currently offer the only single-slot GT 640 available at retail, with every other card being a double-slot design to accommodate a larger cooler. Consequently, for users with ITX cases where space is at a premium, Zotac’s GT 640 is generally going to be the only option.

Because the Zotac GT 640 is a single-slot card, Zotac’s cooler of choice is a wide but shallow aluminum heatsink that spans roughly half of the card. At the center of the heatsink is a small 2-pin 55mm fan, which provides the necessary airflow to keep the card cool. The card’s DDR3 RAM sits underneath the heatsink, but no contact is made with it, nor is contact necessary given DDR3’s very low operating power and temperatures.

Removing the heatsink we see the bare PCB, with the GK107 GPU and DDR3 mounted on it. Physically GK107 is an unremarkable GPU, and the entire package appears to be roughly the same size as GF108’s GPU package. On this Zotac card it’s paired with eight 2Gb Micron DDR3-1866 modules operating in 16-bit mode. Since those modules are rated above the card’s shipping memory clock there should be a bit of headroom for casual overclocking, though GPU-mounted DDR3 is rarely capable of going much past its rated speeds.
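To put those figures together: eight 16-bit modules make for a 128-bit memory bus, and a quick back-of-the-envelope calculation shows the bandwidth on tap. The 1782MT/s data rate below is an assumption based on the GT 640’s stock memory clock, not a figure measured from this specific card.

```python
# Back-of-the-envelope memory bandwidth for the Zotac GT 640 DDR3.
# Assumption: the card runs its memory at the GT 640's stock DDR3
# data rate of 1782MT/s (the Micron modules are rated for 1866MT/s).

modules = 8
bits_per_module = 16                     # each module runs in 16-bit mode
bus_width = modules * bits_per_module    # -> 128-bit bus

data_rate = 1782e6                       # transfers/second (assumed stock clock)
bandwidth_gbs = data_rate * bus_width / 8 / 1e9

print(f"{bus_width}-bit bus, {bandwidth_gbs:.1f} GB/s")
# prints: 128-bit bus, 28.5 GB/s

# At the modules' rated 1866MT/s the same bus would give ~29.9 GB/s,
# which is the modest overclocking headroom mentioned above.
```

Either way the gap to GDDR5-equipped cards is large, which is why memory bandwidth looms over everything else in this review.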

The rest of the PCB is solid yet simple; users worried about coil whine will be glad to see that Zotac is using solid chokes here as opposed to ring chokes, and we didn’t notice any coil whine during operation. The card measures 5.75” long – the minimum length necessary for a PCIe x16 card – so it should fit in virtually any full-profile case.

Meanwhile looking at the ports on the card’s bracket, we find NVIDIA’s favored configuration of 1 DL-DVI-I port, 1 DL-DVI-D port, and a mini-HDMI port. As one of the major improvements in the Kepler family, NVIDIA now has more than 2 display controllers on their GPUs, so the GT 640 can actually drive all 3 ports simultaneously. You likely wouldn’t want to game across three displays on a GT 640, but it’s certainly powerful enough for desktop work, and this is one of the few situations where that extra 1GB of VRAM might come in handy.


Top: Zotac GeForce GT 640. Bottom: NVIDIA Reference GeForce GT 640

Unfortunately the display ports on Zotac’s GT 640 also expose its one flaw, and it’s a big one. On NVIDIA’s reference design the mini-HDMI port is centered at the middle of the card, similar to the DVI ports. However, for reasons unknown to us, Zotac has moved the mini-HDMI port on their GT 640 down by about 2mm. This doesn’t sound like much, but putting the mini-HDMI port so close to the edge of the card introduces a crippling flaw: it doesn’t leave any room for a cable to attach to it.


Zotac GeForce GT 640 Installed. Note the lack of clearance around the mini-HDMI port

Specifically, because the port is so low it sits right on the edge of the usable area of the bracket, as everything below the port is covered by the I/O shielding of the computer case. Consequently, if you attempt to plug in a mini-HDMI cable or adapter, the boot of the cable will run into the case’s I/O shielding before the cable is fully inserted, preventing the cable from getting a good connection or locking into place. The HDMI specification is actually rather strict about the size of the boot on mini-HDMI cables/adapters, and after having tested a few different adapters, everything we’ve encountered is within spec, so this is poor planning on Zotac’s part. NVIDIA’s reference design and cards similar to it do not have this problem; when the port is properly centered it leaves plenty of space for the boot, which is why this is the first time we’ve run into this issue.

We’ve already brought this up with Zotac, and they’ve told us that they intend to fix it once they’ve exhausted their current supply of brackets and mini-HDMI connectors, but for the time being all of their GT 640 cards will have this flaw. In the meantime the problem is not insurmountable – with enough effort it should be possible to force a mini-HDMI cable/adapter in there – but Zotac really shot themselves in the foot here by making the mini-HDMI port so inaccessible. On that note, if you do intend to take advantage of this port you’ll need to bring your own gear (Zotac doesn’t provide a mini-HDMI adapter), and you’ll want to use either a cable or a more specialized mini-HDMI-to-HDMI adapter. The stubby adapter Monoprice and most other retailers carry won’t work because the port is so close to the top of the bracket, which has been a recurring quirk with NVIDIA cards ever since NVIDIA started using the mini-HDMI port.

Moving on, rounding out the package is the typical bare-minimum collection of extras found on budget cards. Along with a driver CD and a quick start guide, Zotac includes a DVI-to-VGA adapter, but not a mini-HDMI adapter. Zotac typically bundles Ubisoft games with their cards, but on a budget card like this that isn’t really possible, so the GT 640 comes with the next best thing: a 3-day trial of TrackMania 2.

Finally, as we stated earlier, the Zotac GT 640 is currently priced at $109 at Newegg and most other retailers, which puts it on par with most other GT 640 cards. As for the warranty, Zotac offers a base 2-year warranty, which is extended to a rather generous limited lifetime warranty upon registration of the card.

60 Comments

  • cjs150 - Thursday, June 21, 2012 - link

    "God forbid there be a technical reason for it.... "

    Intel and NVIDIA have had several generations of chips to fix any technical issue and didn't (HD4000 is good enough though). AMD has been pretty close to the correct frame rate for a while.

    But it is not enough to have the capability to run at the correct frame rate if you make it too difficult to change the frame rate to the correct setting. That is not a hardware issue, just bad software design.
    Reply
  • UltraTech79 - Wednesday, June 20, 2012 - link

    Anyone else really disappointed in 4K still being standardized around 24 fps? I thought 60 would be the minimum standard by now, with 120 in higher-end displays. 24 is crap. Anyone that has seen a movie recorded at 48+ FPS knows what I'm talking about.

    This is like putting shitty unleaded gas into a super high-tech racecar.
    Reply
  • cjs150 - Thursday, June 21, 2012 - link

    You do know that Blu-ray is displayed at 23.976 FPS? That looks very good to me.

    Please do not confuse screen refresh rates with frame rates. Screen refresh on most large TVs runs at between 60 and 120 Hz; anything below 60 tends to look crap. (If you want real crap, try running American TV on a European PAL system - I mean crap in a technical sense, not creatively!)

    I must admit that having an fps of 23.976 rather than some round number such as 24 (or higher) is rather daft, and some new films are coming out with much higher frame rates. I have a horrible recollection that the reason for such an odd fps is very historic - something to do with the length of 35mm film needed per second. The problem is I cannot remember whether that was simply because 35mm film was expensive and this was the minimum to provide smooth movement, or whether it goes right back to the days when film had a tendency to catch light, and it was the maximum speed you could put a film through a projector without friction causing the film to ignite. No doubt there is an expert on this site who could explain precisely why we ended up with such a silly number as the standard.
    Reply
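For what it's worth, the widely told explanation is that 23.976 is an NTSC color-television artifact rather than a 35mm economy measure: color NTSC runs at 30000/1001 frames per second, so 24fps film matched to NTSC timing is slowed by the same 1000/1001 factor. The arithmetic checks out exactly:

```python
from fractions import Fraction

# 24fps film slowed by NTSC's 1000/1001 color-timing factor.
film = Fraction(24)
ntsc_factor = Fraction(1000, 1001)
video_rate = film * ntsc_factor     # exactly 24000/1001

print(video_rate, "=", float(video_rate))   # 24000/1001 ~ 23.976
```

This is why "24p" Blu-ray content is actually 23.976fps, which is the rate the review's refresh-rate testing is concerned with.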
  • UltraTech79 - Friday, June 22, 2012 - link

    You are confusing things here. I clearly said 120 (fps) would need higher-end displays (120Hz). I was rounding 23.976 FPS up to 24, give me a break.

    That it looks good /to you/ is wholly irrelevant. Do you realize how many people said "it looks very good to me" referring to SD when resisting the HD movement? Or how many will say it again referring to 1080p, thinking 4K is too much? It's a ridiculous mindset.

    My point was that we are upping the resolution but leaving another very important aspect in the dust that we need to improve. Even audio is moving faster than framerates in movies, and now that most places are switching to digital, the cost to go to the next step has dropped dramatically.
    Reply
  • nathanddrews - Friday, June 22, 2012 - link

    It was NVIDIA's choice to only implement 4K @ 24Hz (23.xxx) due to limitations of HDMI. If NVIDIA had optimized around DisplayPort, you could then have 4K @ 60Hz.

    For computer use, anything under 60Hz is unacceptable. For movies, 24Hz has been the standard for a century - all film is 24fps and most movies are still shot on film. In the next decade, there will be more and more films that will use 48, 60, even 120fps. Cameron was cock-blocked by the studio when he wanted to film Avatar at 60fps, but he may get his wish for the sequels. Jackson is currently filming The Hobbit at 48fps. Eventually all will be right with the world.
    Reply
  • karasaj - Wednesday, June 20, 2012 - link

    If we wanted to use this to compare a 640M or 640M LE to the GT 640, is this doable? If it's built on the same chip (both have 384 CUDA cores), can we just reduce the numbers by a rough % of the core clock speed to get rough numbers for what the respective cards would put out? I.e. the 640M LE has a clock of 500MHz, the 640M is ~625MHz. Could we expect ~55% of this for the 640M LE and ~67% for the 640M? Assuming DDR3 on both so as not to have that kind of difference. Reply
  • Ryan Smith - Wednesday, June 20, 2012 - link

    It would be fairly easy to test a desktop card at a mobile card's clocks (assuming memory type and functional unit count were equal), but you can't extrapolate performance like that because there's more to performance than clockspeeds. In practice performance shouldn't drop by that much, since we're already memory bandwidth bottlenecked with DDR3. Reply
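The clock-ratio arithmetic proposed above can be sketched as follows. The mobile clocks are the ones quoted in the comment; the 900MHz desktop GT 640 core clock and the benchmark figure are illustrative assumptions, and as the reply notes, this is a floor-of-sorts estimate because DDR3 bandwidth, not core clock, is the main bottleneck.

```python
# Naive clock-ratio scaling from a desktop GT 640 result to mobile parts.
# All three GPUs have 384 CUDA cores; only the core clock differs here.
gt640_clock = 900.0        # MHz, assumed desktop GT 640 core clock
gt640_fps = 30.0           # hypothetical desktop benchmark result

for name, clock in [("640M LE", 500.0), ("640M", 625.0)]:
    scale = clock / gt640_clock
    print(f"{name}: {scale:.0%} of GT 640 clock -> ~{gt640_fps * scale:.1f} fps")

# Since the cards are memory-bandwidth bound with DDR3, real results
# would likely land above these naive estimates.
```

Running this gives roughly 56% and 69% scaling factors, close to the ~55%/~67% the comment estimated.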
  • jstabb - Wednesday, June 20, 2012 - link

    Can you verify if creating a custom resolution breaks 3D (frame packed) blu-ray playback?

    With my GT430, once a custom resolution has been created for 23/24Hz, that custom resolution overrides the 3D frame-packed resolution created when 3D Vision is enabled. The driver appeared to have simple fall-through logic: if a custom resolution is defined for the selected resolution/refresh rate it is always used; failing that, it will use a 3D resolution if one is defined; failing that, it will use the default 2D resolution.

    This issue made the custom resolution feature useless to me with the GT430 and pushed me to an AMD solution for their better OOTB refresh rate matching. I'd like to consider this card if the issue has been resolved.

    Thanks for the great review!
    Reply
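The fall-through behavior described in the comment above (custom resolution first, then 3D frame-packed, then default 2D) can be expressed in a few lines. To be clear, this is a sketch of the commenter's inference about the GT 430 driver, not NVIDIA's documented mode-selection logic; all names here are hypothetical.

```python
def pick_mode(custom_modes, stereo_modes, default_mode, res, hz, want_3d):
    """Mode selection as the commenter infers it: a user-defined custom
    resolution always wins, even when a 3D frame-packed mode exists for
    the same resolution/refresh rate combination."""
    key = (res, hz)
    if key in custom_modes:
        return custom_modes[key]        # custom override: silently breaks 3D
    if want_3d and key in stereo_modes:
        return stereo_modes[key]        # 3D frame-packed mode
    return default_mode                 # plain 2D fallback

# A custom 1080p24 mode shadows the frame-packed mode even with 3D enabled:
mode = pick_mode({("1080p", 24): "custom-24"},
                 {("1080p", 24): "frame-packed-24"},
                 "2d-default", "1080p", 24, want_3d=True)
print(mode)   # -> custom-24
```

Under this logic the only fix is for the driver to check for an active 3D session before applying the custom override, which is presumably what the commenter is hoping the GT 640's drivers do.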
  • MrSpadge - Wednesday, June 20, 2012 - link

    It consumes about as much power as the HD 7750-800, yet performs miserably in comparison. This is an amazing win for AMD, especially comparing the GTX 680 and HD 7970! Reply
  • UltraTech79 - Wednesday, June 20, 2012 - link

    This performs about as well as an 8800 GTS for twice the price. Or half the performance of a GTX 460 for the same price.

    These should have been priced at $59.99.
    Reply
