21 Comments

  • CeriseCogburn - Monday, June 04, 2012 - link

    Forget it, just buy the massively available GTX 460 at the same price and get at least 2X the performance. And that includes forgetting about the 7750 mentioned as well, a massive rip-off.
  • zcat - Monday, June 04, 2012 - link

    Wasteful and proud of it, eh.
  • JNo - Tuesday, June 05, 2012 - link

    True, but changing cards frequently to save small amounts of energy has to be traded off against the fact that new cards cost energy to make and use up the earth's materials - they are not very recyclable, and recycling in itself costs energy.

    So there has to be a balance: upgrading too often wastes energy when the existing card probably suffices, and not upgrading often enough means you will eventually burn energy on old, inefficient electronics. Lower wattage and being green shouldn't be an excuse to justify buying shiny new things all the time.
  • Taft12 - Sunday, June 10, 2012 - link

    Most tech fiends don't just throw video cards in the garbage - they sell them on eBay or Craigslist to fund their newest purchase. Sounds pretty green to me!
  • CeriseCogburn - Tuesday, June 12, 2012 - link

    Tossed out or not, it takes resources, energy, and rare materials to make video cards, every single one of them.

    If you want to save the earth, buy the best, instead of crap that needs an upgrade sooner rather than later.

    Otherwise you're following the Obama car plan - destroy running cars with silica sand poured into the engine, to get a new car and some government welfare tax dollars, with 2 tons of rare earth metals to make the battery (mined from China, the world market controller and video card manufacturer), in order to "save the earth"... which actually becomes MORE WASTEFUL all the way around.

    Likewise, for instance, use your old system instead of buying some crap APU for your HTPC just because you want to be cool and are set on fanboying out.
  • rickcain2320 - Monday, June 25, 2012 - link

    Actually, batteries have a much longer lifespan than car engines; electric motors will last the life of 2-3 automobiles or more, and they don't require regular fluid changes, which are incredibly polluting to the environment.
    Batteries are also quite recyclable, and worth recycling. There's a huge industry behind recycling regular lead-acid batteries as well.

    Going back to video cards, it is nice to see a card draw much less power. Having dealt with a GeForce 8800 GTS before, I can honestly say a performance video card is merely a space heater that happens to play games.
  • tviceman - Monday, June 04, 2012 - link

    Ryan, you summed it up perfectly in pointing out there is absolutely no good reason to use DDR3 RAM. Terrible, terrible pairing. This card will be a complete dud until GDDR5 is used in its place.
  • Onus - Wednesday, June 06, 2012 - link

    I agree 100% with this. If this was done as a necessary cost-saving measure, this card is doomed: as-is, the performance is too low (choose the HD6670 for less), and with GDDR5 the price will have to be higher; judging from the results at Tom's Hardware, it still won't catch up to the HD7750.
  • CeriseCogburn - Tuesday, June 12, 2012 - link

    But the GTX 460 destroys all of them for $99 - the same price.
  • zcat - Monday, June 04, 2012 - link

    15 W idle? I was hoping for better from Kepler. AMD's is ~10 W, IIRC.

    In any case, I'm really looking forward to the bench & power consumption comparison reviews for this card and the GDDR5 version, as my mini-ITX workstation (yes, I said workstation) "needs" an upgrade from the Intel HD4000 for some better-than-meh summer gaming fps.
  • medi01 - Tuesday, June 05, 2012 - link

    I think AMD's new cards are down to less than 1 W if the monitor is switched off.
  • Onus - Wednesday, June 06, 2012 - link

    Check out the single-slot HD7750 by XFX. I put one in my PC-Q08 and it runs very nicely.
  • RaistlinZ - Monday, June 04, 2012 - link

    GDDR3 is the Devil's DRAM.
  • Casper42 - Monday, June 04, 2012 - link

    I guess it's a good thing, then, that it's not mentioned anywhere in the article above.

    DDR3 != GDDR3
  • BrunoLogan - Tuesday, June 05, 2012 - link

    Nothing to see here... Bring us the 660 Ti instead.
  • Assimilator87 - Tuesday, June 05, 2012 - link

    EVGA's website says these cards support three-monitor Surround? How is that possible without DisplayPort?
  • Taft12 - Sunday, June 10, 2012 - link

    The EVGA versions DO have DisplayPort.
  • skroh - Tuesday, June 05, 2012 - link

    You're suffering from Eyefinity brainwashing. Surround on nV cards does not require DisplayPort. In this case your resolution options will be limited by the fact that the third port is HDMI, but who would go higher than 3x1080p with an entry-level card anyway?
  • UltraTech79 - Wednesday, June 06, 2012 - link

    This shitty card is barely better than my 8800 GTS 512 and costs twice as much. How many YEARS do you think I can run my 8800 to make up for that $50?

    Too many.
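[Editor's note: the payback question above can be sketched with a quick back-of-envelope calculation. All figures below are illustrative assumptions, not measurements from the review: an assumed 60 W load-power saving over the older card, 4 hours of gaming per day, and $0.12 per kWh for electricity.]

```python
# Hypothetical payback-period estimate for upgrading a video card.
# Every input below is an assumption chosen for illustration only.
card_price = 50.0        # USD price difference being debated in the comment
power_saved_w = 60.0     # assumed load-power savings vs. the older card (watts)
hours_per_day = 4.0      # assumed gaming hours per day
price_per_kwh = 0.12     # assumed electricity price (USD per kWh)

# Energy saved per year in kWh, then its dollar value, then break-even time.
kwh_saved_per_year = power_saved_w / 1000.0 * hours_per_day * 365
dollars_saved_per_year = kwh_saved_per_year * price_per_kwh
years_to_break_even = card_price / dollars_saved_per_year

print(f"{years_to_break_even:.1f} years to recoup ${card_price:.0f}")
```

Under these assumptions the break-even point lands at several years, which is the commenter's point: at light usage and modest power deltas, efficiency alone rarely justifies the purchase price.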
  • philipma1957 - Sunday, July 08, 2012 - link

    Fanless with GDDR5 RAM would make it worthwhile.

    The HD7750 is the best fanless card for HTPC.

    With a fan in it, this card is next to worthless for HTPC.

    The card pulls 65 W over PCIe only, which means it could be fanless, just like the Sapphire HD7750 Ultimate.

    Or buy a GTX 670 and turn the fan down for HTPC and up for games.
  • mpauls - Tuesday, August 07, 2012 - link

    I bought a Dell Vostro 460 in Jan. 2012. It came with 16 GB of RAM, a 1 TB hard drive, an Nvidia 310 video card with 512 MB, and Win 7 SP1 64-bit as the OS. I got a good deal on the GT 640 at $87.00. After uninstalling the old Nvidia drivers, I installed the latest Nvidia driver, 301.42-desktop-win7-winvista-64bit-english-whql.exe, rebooted, went into the desktop, then shut down the computer.

    After seating the GT 640 card, which takes up 2 slots on the back of the computer, in a PCIe 3.0 slot and screwing the DVI cable into the back of the computer, I powered on and saw only a black screen on the monitor, and the monitor's power button stayed yellow. There was no POST.

    Has anyone experienced this with a video card? Gigabyte says either the 350 W power supply needs upgrading, the BIOS needs updating, or the card is defective. Nvidia says to upgrade the BIOS, or if that does not work, the card is defective - the power supply is sufficient, since the card uses 65 W.

    If the card is OK, will a BIOS upgrade work?
