
75 Comments


  • Grooveriding - Friday, July 20, 2012 - link

    Very disappointing card for that price. $660 for this?

    The MSI GTX 680 Lightning can also be overclocked and overvolted, showed higher overclocks, has a better cooler and costs less than this card.

    More disappointment from EVGA, along with their scaled-back warranty and the need to pay them additional money for extended warranty coverage/step-up program.

    /thumbs down
    Reply
  • RussianSensation - Friday, July 20, 2012 - link

    $740 for the EVBot controller + a card with a blower fan and a dinky heatsink (compared to the MSI Lightning 7970/680 and Sapphire TOXIC 7970) is overpriced imo. It's not like the extra 100MHz or 4GB of VRAM over 2GB 1230-1290MHz GTX680s will make this card more future-proof for next-generation games.

    To make it even more laughable, this card isn't even close to the fastest factory-overclocked GTX680 either:
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    If I wanted the best single-GPU with bragging rights, I'd rather get the GTX690 at that point. The 690 would be much faster and quieter and actually be good to go for next gen games. This seems like a marketing exercise.
    Reply
  • Ryan Smith - Friday, July 20, 2012 - link

    The thing is that I really don't consider a blower to be a negative thing here. This card barely passed 50dB under load, and at the same time it's fully exhausting. Open air coolers have their place, but not having to worry about additional case cooling is quite convenient. Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    Ya, I agree that this blower is one of the better ones. But even in your review, the blower was struggling to keep this card under 70°C with overvolting. Under regular overclocking it worked perfectly fine, but the card only reached 1211MHz (1301MHz with GPU Boost). Those clocks are nothing special, and plenty of $500-560 GTX680s such as the Gigabyte Windforce 3X, Asus DirectCU II, Zotac AMP!, and Galaxy KFA2 can reach them. So the question is, why is this card $660? 4GB of VRAM is a waste at 1080p/1200p, and at 2560x1600 with AA a 1.2GHz HD7970 is faster.

    But if you are going to overvolt, the cooler suddenly becomes a limitation, especially after buying EVBot (for $740 total). Suddenly you aren't too far off from a really special card - the GTX690 - that's actually going to be fast enough to play today's and next generation's games. Going from 1301MHz to 1377MHz with a volt mod is not going to make the GTX680 any better for newer games, since that's not good enough, especially when you consider that, because it gets too hot, the delta is less than 76MHz in actual gaming.
    Reply
  • Sabresiberian - Sunday, July 22, 2012 - link

    The overvolted card reached 83°C in the review - but the max recommended temp for a GTX 680 is 98°C. And it did that while keeping the noise down to 56.3dB.

    I'll tell you what I think - the problem people have with EVGA's choice of cooling here is simply that it's not sexy. The fact that it works well is secondary.

    The overclocked Classified is faster than the 7970GE in 4/5 games @ 2560x1600 in this review. It's faster in 5/9 of them.
    Reply
  • Spunjji - Monday, July 23, 2012 - link

    I don't think you can call 56.3dB "keeping the noise down", though. That's approaching conversational levels of noise - more importantly the fans on blower heat-sinks sound subjectively worse; less like the broadband noise of a desk fan and more like the drone of a hair-dryer.

    So, it's not all about "style". There are other legitimate concerns at play here.
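    For reference, those dB comparisons can be put in rough numbers. A quick sketch (the "+10 dB sounds about twice as loud" rule is a common psychoacoustic approximation, not an exact law):

```python
# Compare the two noise figures quoted in this thread:
# ~50 dB under load (stock) vs. 56.3 dB overvolted.

def pressure_ratio(delta_db):
    """Sound pressure ratio for a given dB difference."""
    return 10 ** (delta_db / 20)

def loudness_ratio(delta_db):
    """Approximate perceived loudness ratio (+10 dB ~ 2x as loud)."""
    return 2 ** (delta_db / 10)

delta = 56.3 - 50.0
print(f"+{delta:.1f} dB -> {pressure_ratio(delta):.2f}x sound pressure")
print(f"+{delta:.1f} dB -> {loudness_ratio(delta):.2f}x perceived loudness")
```

    So the jump from ~50 dB at load to 56.3 dB overvolted is roughly double the sound pressure and, perceptually, around half again as loud - not a trivial difference.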
    Reply
  • CeriseCogburn - Sunday, July 29, 2012 - link

    " Supplying this power is a pair of 8pin PCIe power sockets, which means on paper the GTX 680 Classified can safely draw up to 375W. In practice it’s not clear whether GK104 can actually take that, at least with air cooling, so pushing this card much beyond 300W is mostly in the realm of hardcore water and liquid nitrogen overclockers. "

    blah blah blah blah: is overpriced imo. It's not like the extra 100MHz or 4GB of VRAM

    blah blah blah blah: To make it even more laughable
    blah blah blah blah: Those clocks are nothing special
    blah blah blah blah: This seems like a marketing exercise

    You seem like a moron.
    Reply
  • RussianSensation - Sunday, September 02, 2012 - link

    We were discussing this card's value vs. the GTX680 Lightning. It's worse than the Lightning in overclocking, price, and noise levels. Compared to the 7970, it's ridiculously overpriced and will get beaten by a 1250MHz 7970. So there are at least 2 better options on the market: the GTX680 Lightning and the Sapphire Vapor-X 7970 GE. Plus, at these prices, you can now get HD7950 Crossfire or catch a sale on 2 GTX670s for $340 each. Sorry, but $660 for the Classified is a rip-off.

    As of September 2, this overpriced pile of garbage is still $660 on Newegg, and then you need to add $60+ for EVBot = $720:
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    That's a joke and a half. Almost $300 more than the Vapor-X, $180 more than the GTX680 Lightning, and still more expensive than the fastest single GPU this round: Sapphire TOXIC 7970.

    Plus, this Classified pile of overpriced marketing will get owned by $640 MSI TwinFrozr III 7950s in CF or $730 GTX670 SLI.

    Seriously, only nonsensical fanboyish noise comes out of your mouth. Even among NV's choices, this card is terrible.
    Reply
  • Sabresiberian - Sunday, July 22, 2012 - link

    Nice link to a deactivated product with which 4/9 of the people who bought it had trouble. I wonder why it was deactivated? Speaks to the quality of your entire post. "Dinky heatsink". Did you read any of the article at all?

    Yeah, I'm going to be spending all my video card money on Galaxy products, for sure. I especially like that their card had one DisplayPort and 3 HDMI ports. I'm sure I'm going to run 3 TVs off it.
    Reply
  • ubernator44 - Monday, September 17, 2012 - link

    Alright, so you must realize that:
    1. You will never see the potential of this card with only air or normal liquid cooling.

    2. Because of #1, anyone who isn't into phase change/LN2/sub-zero cooling thinks this is a waste.

    You must realize that this thing has 18 power phases O.o like holy crap. The only real perk this card has is its amazing ability to reach extreme clocks at low voltages, given that you use sub-zero cooling. Anything else is just not adequate. So yes, for 90% of people this thing is a 660 dollar brick; for us extreme overclockers, it's a godsend! Just look at kingpin and what he got off this beast. For you gamers out there who only use a max of normal liquid cooling, a 680 FTW+ is probably all you will want to pay for. Anything else (like this card) is useless to you. So yes, amazing card, small target market, but they made it anyway, which is rare for a company to do!
    Reply
  • ubernator44 - Monday, September 17, 2012 - link

    Edit: sorry, 17 phases :P 14+3 phases :P Reply
  • san1s - Friday, July 20, 2012 - link

    This isn't really intended for ordinary gamers, but rather for overclockers using exotic cooling. In that case, the overclocking features this card provides make it a far more valuable card to them in comparison to reference cards. Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    Ya, but in that case the MSI Lightning 680 imo is the better card. It has more premium components and is also ready for LN2. Reply
  • CeriseCogburn - Sunday, July 29, 2012 - link

    And has a 375 W ceiling..... right.... Reply
  • Sabresiberian - Sunday, July 22, 2012 - link

    As far as the cooling solution goes, that's just your opinion (actually, from what I've read it's wrong, because the Lightning gets warmer), and a lot of people aren't going to like MSI's because they want the warm air moved out of the case.

    The big kicker for me though is the 4GB of memory; if you plan on running 3x 2560x1440, 2GB just isn't enough. I'm an MSI fan, but I can't use their product to fill my needs. If I want 4GB and "unlocked voltage" my only choice is the EVGA Classified.
    Reply
  • Amoro - Friday, July 20, 2012 - link

    I wonder why the SC version is the only one with overclocked memory. Does that mean that overclocking the memory isn't valuable? Typo maybe? Reply
  • ltcommanderdata - Friday, July 20, 2012 - link

    Probably because stable memory overclocking is difficult to achieve when you are trying to drive double the VRAM. Seeing as 4GB of VRAM seems to be overkill, keeping 2GB of VRAM and increasing memory clocks would probably have been more worthwhile, although it doesn't quite have the same marketing ring to it as "4GB". Reply
  • CeriseCogburn - Sunday, July 29, 2012 - link

    I wondered where all the blabbering amd fanboys skittered off to in their constant 3GB ram drone psychosis....

    Let me just share a quote:

    "The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GBs of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixel resolution) there was just no measurable difference."

    LOL

    So now the blabbering jerks will yapper about cost, complain about the 7970 6GB being "superior" and have the most enormous and gigantic brain fart concerning their endlessly godless and irritatingly stupid 3GB ram superiority dance vs 2GB 680 670.

    It's a freaking TOTAL BLACKOUT at alcoholic blood toxic death level.

    Just wait, because no amount of evidence will do it for the amd fanboy, and their masters at amd have known this for years, and have been playing them like a retarded out of tune fiddle gets played. A week or a day on they will be back at it, on some other article , any webspot they land... and the brain fart will be what they are not even aware of.
    It's clear how Hitler came to power.
    Reply
  • mpschan - Friday, July 20, 2012 - link

    Where are their mid-range offerings? Where are their $200-300 cards on this latest architecture?

    I'm starting to think that by the time we see a 660 AMD will be releasing their 8000 series.
    Reply
  • superccs - Friday, July 20, 2012 - link

    Yeah, I totally clicked on this article thinking it was a 660 review. WTF? Nvidia, you no like midrange anymore?

    Bork!
    Reply
  • SteveLord - Friday, July 20, 2012 - link

    I too have been waiting on a mid range offering. This is crap nVidia....... Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    What's wrong with the HD7850 for $200-210 or the MSI TwinFrozr HD7950 for $310, both with 30-40% overclocking headroom? HD7950 @ 1.1GHz > GTX680. No point in waiting for this mythical GTX660Ti. Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    You have the mythical $200 or $210 7850; that bottom price is running anyone $220 for the crappiest version around.

    Why do you people always talk lies ?
    Reply
  • HisDivineOrder - Friday, July 20, 2012 - link

    nVidia will launch the 660 part on the day the entire Radeon 8xxx series shows up. On the very day.

    I know I've waited forever and a day for it, too. I've given up hope. I think it's a myth at this point. A story grandpappies tell their youngin's. A tall tale.

    The Geforce 660 is a legend wrapped in a mystery drizzled with lies and peppered with vague promise.
    Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    Rumor: August 16th for 660 series. Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    What does midrange mean to you? The 460, 560, 560 Ti, 570, and 580 have "midrange" covered... unless of course you mean non-mid midrange, or middle lower midrange, or a range of ranges unranged, of which there are none....

    WHAT are you people expecting ? What cards exactly is this mythical purportedly missing midrange supposed to fall in between for you ?

    I'm serious, it's been many, many months, but by logic alone, there isn't a card spot you so desire, and by absolute omission for just as long....

    What the freak do you people expect? The only thing I can possibly imagine is a "midrange card" that falls above the 580, above the 7870, above the 6970, below a stock 670... and costs perhaps "$150-$200" for your "midrange budget" - right?

    I don't get it. Won't one of you midrange wannabes explain it - sometime before it appears, or like is the fantasy supposed to be an absolute mystery forever ?
    Reply
  • Galidou - Sunday, August 05, 2012 - link

    What do we expect?? Mid-range prices with last-gen top-of-the-line performance but new-gen power consumption and temperatures.... Seems pretty clear to me. OK, you need an example... a GTX 560 Ti with GTX 470-480 performance but less power and lower temperatures, so I don't have to change my power supply or my case if I go SLI; in the end, I save some money, game just as well, and overclock better.

    Everything that came out from Nvidia in THIS very generation is overkill for gaming at 1080p, and that's the most used resolution in the whole freaking world, end of discussion... who do you believe you are, criticizing everyone's desires/needs? GOD?
    Reply
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Ok, well thanks for trying.
    However, nothing is overkill for 1080p; they ALL have to be turned down except the TOP duals.
    So a 680 and 7970 are underkill for 1920x1200 and 1920x1080.
    Reply
  • Belard - Saturday, July 21, 2012 - link

    Remember when the 4850 first came out as a $250 card... yet eventually ended up as a $100 card. Even today's modern $100 video cards are not much faster than the 4850... and that card is over 4 years old!

    If we go by the usual scale of GPU performance increases at targeted price points...

    For today, we should be able to get the performance of a 5870 card at a $100 price.

    What do we have? The 6870 is slightly slower than the 5870 (great model naming there AMD - idiots), it costs about $155. (okay, the 5870 was a $500 card).

    The smaller and cheaper-to-make 7850 is slightly faster than the 5870, but it costs about $225?! The 6870 is a better deal since it's $75 cheaper yet only about 7% slower.

    So realistically, the $130 7770 is over-priced, as it's 2/3 the performance for a $20-30 savings over the older 6870.

    Of course, the 5770/ 6770 and 7770 are all pretty much the same card... not impressive.
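    Put the numbers above side by side and the value picture is clear. A quick perf-per-dollar sketch using only the prices and rough performance relationships stated in this comment (the relative-performance figures are illustrative assumptions, not benchmark data):

```python
# Rough price/performance comparison using the figures from the
# comment above. Relative performance is normalized to the HD 5870
# (= 1.00) and is an illustrative assumption, not benchmark data.

cards = {
    # name: (street price in USD, relative performance vs. HD 5870)
    "HD 5870": (None, 1.00),  # originally a $500 card; no current price
    "HD 6870": (155, 0.93),   # "slightly slower" per the comment
    "HD 7850": (225, 1.05),   # "slightly faster" per the comment
}

for name, (price, perf) in cards.items():
    if price is None:
        continue
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")
```

    By that rough measure the 6870 delivers noticeably more performance per dollar than the 7850, which is the point being made here.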
    Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    MSRP for the HD4850 was $199, HD4870 was $299. HD4850 was never $250.

    I agree with you that most of the performance increase in GPUs has happened in the $250+ level. Although HD7850 OCed = GTX580. The 7850 can be found for $200-230 no problem and GTX580 cost $500 just 1.5 years ago. So it is progress, just not as fast as in the past.

    It's too expensive to make fast GPUs at the ~$100 level. If you can only afford $100-130 GPUs, I think you are better off just getting a PS4 or the next Xbox. The allure of the PC is the games you can't play on consoles, the controls, mods, better graphics, and much cheaper game prices. $300 for a GPU isn't expensive when you consider the prices of games on the PC.

    But ya, I agree with you that the HD7750/7770 are a joke. The latter is just 25-30% faster than a 2.5-year-old HD5770. NV has nothing below $400 (GTX670) worth buying. I guess that's what happens when wafer prices rise and the market for <$100 GPUs disappears.
    Reply
  • Belard - Saturday, July 21, 2012 - link

    It's been a few years... so I was a bit off on the price ;P

    I've owned 3DFx, GF2/3/4/5/7 series and ATIs 9800Pro/4670.

    I paid $190 for the GF7600GT with the extra-large cooler to reduce noise (exhaust heat out the back)... and I laughed when the reviewers complained about the dual-slot design being a "problem"... WTF?! Blowing heat out is better than blowing heat off the GPU and having it stay inside the case.

    After that, I spent $85 on the ATI 4670 with the HIS blower... With the way PC gaming is, I don't see the value of spending a dime over $200. And considering it's been 3 years since the ATI 5000 series, the 7850 should be a $150 card at the most.

    Yes, I'm planning on the PS4 to replace my PC gaming and to rid me of Windows. NO PC games = Why use Windows?
    Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    Console gaming has its appeals. Sitting on a nice couch in front of a 50-60 inch LED/Plasma after a long day's work is often more comfortable than gaming on a chair at a desk. However, that PS4 won't be $150, probably more like $400-500. Reply
  • Visual - Monday, July 23, 2012 - link

    Eh, what does your screen have to do with the rest of the hardware?
    I've been playing my PC games on a couch 2m away from a 47" TV for the last 5 years, a lot of them with a wireless XBOX360 controller as well, at least when I feel the extra precision of a mouse is not needed, and always at a resolution and details settings much better than the console alternative. I only play exclusives on the actual XBOX360. There is no way in hell I will ever consider console gaming a serious option.
    Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    The same type of brainfart had the guy spewing that nVidia has nothing below the $400 GTX670 worth buying.

    Thank you for adding a dose of reality.
    Reply
  • CeriseCogburn - Sunday, July 29, 2012 - link

    The 4850 has been below $100 for a long, long time. Brand new it has been $60 for a year.

    Now it's $40 with a special aftermarket HS
    http://www.ebay.com/itm/ASUS-ATI-Radeon-HD-4850-EA...

    Whatever, you're all screwy on numbers, as it makes it easy to moan and whine.
    Reply
  • will54 - Saturday, July 21, 2012 - link

    I read somewhere that the GTX 660 will be coming out in August and then they are going to focus on the 700 series. Not positive, but I think I read it on Tom's Hardware. Reply
  • shin0bi272 - Sunday, July 22, 2012 - link

    WTF Anand? I post a link as a reply and it's instantly marked as spam? That's bullshit. Reply
  • poohbear - Friday, July 20, 2012 - link

    why do you benchmark shogun 2 @ 1600p using Ultra Quality and then in 1200p you benchmark it @ very high quality? why did you drop the detail level exactly? makes no sense. Reply
  • Ryan Smith - Friday, July 20, 2012 - link

    Because it was utterly unplayable at 5760x1200 at Ultra, even with 2 video cards. I'm all for bogging down a video card, but there has to be a limit. Reply
  • poohbear - Friday, July 20, 2012 - link

    No, I mean you dropped the quality when you went down to 1920x1200. Why did you do that? Not many people really pay attention to 5760x1200; most of us are on 1080p (according to the Steam hardware survey). Reply
  • plonk420 - Saturday, July 21, 2012 - link

    i'm kinda more interested in 8xMSAA or 4xSSAA... Reply
  • Ryan Smith - Saturday, July 21, 2012 - link

    Ahh, okay, I see what you mean.

    So the short answer is that the memory requirements on Ultra are so high that we wouldn't be able to test most of our previous-generation 1GB cards at 1920 if we used it. I did want to have Ultra in there somewhere so that was the compromise I had to make to balance that with the need for a useful test at 1920.

    Though I will agree that it's unorthodox.
    Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    At the same time that would be pretty useful to see if GTX570/580 run out of VRAM in Shogun with Ultra settings at 1080P. What if GTX660Ti only has 1.5GB of VRAM? We'd want to know if it's already starting to become a bare minimum in games :) Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    The 570 and 580 don't run out, but the 5750, 5870, and 6950 1gb and 6970 1gb do. A lot of amd fans have those 1gb cards because as usual, the amd fan is all about scrimping pennies and claiming they have the best anyway. Sad, isn't it.

    Sadder is the 1920x1200 rez they use here, which allows crap amd cards to lose by less, when most people have 1920x1080, where nVidia stomps on amd even harder - because, as usual, amd fanboys are hacking away over pennies and buy the much cheaper and far more common 1920x1080 monitors instead of 1920x1200, saving $50 minimum and more like $100+.

    So, amd loses, all around, again, as usual.
    Reply
  • Sabresiberian - Sunday, July 22, 2012 - link

    There is no "1200p"

    Catch-phrases like "720p" and "1080p" refer to television formats; they aren't just the vertical pixel number. 1920x1200 is not a television standard, and the "p" is superfluous.

    ;)
    Reply
  • LtGoonRush - Friday, July 20, 2012 - link

    While EVGA's cooler is an improvement over stock, I wonder how a capable card like this would perform if paired with a high-performance cooler like the Arctic Accelero Xtreme III. Kepler-based cards drop their boost clocks above 70°C to compensate for increased leakage, so it would be interesting to see how fast this card could get while staying below that mark. Even at maximum RPMs those fans would probably be quieter than this one. Reply
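    The boost behavior LtGoonRush describes - Kepler stepping its clock down once the GPU passes roughly 70°C - can be sketched as follows. This is a simplification: the 13MHz bin size and one-bin-per-10°C stepping are assumptions for illustration, and real GPU Boost adjusts by power target as well as temperature.

```python
# Simplified sketch of Kepler-style thermal clock stepping:
# above a temperature knee, the boost clock drops in fixed bins.
# Bin size (13 MHz) and stepping interval are illustrative assumptions.

BIN_MHZ = 13

def boost_clock(max_boost_mhz, temp_c, knee_c=70, step_c=10):
    """Boost clock after thermal stepping: one bin is dropped for
    every `step_c` degrees (or part thereof) above `knee_c`."""
    if temp_c <= knee_c:
        return max_boost_mhz
    bins_dropped = (temp_c - knee_c) // step_c + 1
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boost_clock(1301, 65))  # below the knee: full 1301 MHz boost
print(boost_clock(1301, 83))  # above the knee: two bins down, 1275 MHz
```

    This is why a stronger cooler can raise sustained clocks even without touching voltage: staying under the knee keeps the card in its top boost bin.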
  • pandemonium - Saturday, July 21, 2012 - link

    I can't understand where the market for this card is. Wait, nevermind. I forgot how many nVidia fanbois there are out there... Reply
  • RussianSensation - Saturday, July 21, 2012 - link

    So true. $740 GTX680 with a volt-mod kit vs. $450 HD7970 that overclocks on stock voltage to 1.175V and gives the same performance. NV marketing machine FTW! Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    Amd cards never give the same performance, as they lack so many features.
    You can perhaps, if you're lucky, get an fps-only equivalent in a few old games, or a hacked equivalent with crappy IQ that I'm sure you cannot see anyway, and in that case your power/performance is a big fat loser too - we cannot suddenly forget that for just this latest round, when it was the most important point ever made for several years just prior, now can we...
    pffffft !~
    Reply
  • ypsylon - Saturday, July 21, 2012 - link

    Not with this card. When you buy a reference card for liquid cooling, you can't go wrong with EVGA. Best cards around. When you buy an EVGA Hydrocopper - you can't go wrong. But EVGA Classified cards are usually only highly overpriced reference designs. Yes, there are tweaks here and there, but for max performance [on an air cooler] out of the GTX family, most people [including my humble person] go to the MSI TwinFrozr3 Lightning/EX or the Asus 3-slot bricks (the name escapes me).

    Lately EVGA has been sliding with their top offerings. The SR-X motherboard is a cruel joke when compared to ASUS's dual-CPU creation, and now this. Another misfire.

    But I think EVGA doesn't care too much. They have devoted customers who buy everything EVGA without thinking...
    Reply
  • Belard - Saturday, July 21, 2012 - link

    This card is so old-school looking... like an Atari 2600... or 70s camera. Reply
  • ekon - Saturday, July 21, 2012 - link

    Few people are aware that EVGA was in the compact camera business back in the 70s:

    http://tinypic.com/view.php?pic=65bac5&s=6
    Reply
  • Belard - Saturday, July 21, 2012 - link

    Wow, Amazing!

    It's so cool how a 1970s camera's lens looks just like a blower! What were the chances!

    :)
    Reply
  • Belard - Sunday, July 22, 2012 - link

    Kinda funny. I showed my 7-yr-old the big picture of this EVGA GTX 680 Classified card and he said "it looked really old"... wow.

    For the retro look, it does look nice. There will come a time when the computer toys we have today will look like OLD OLD junk.

    If mankind makes it another 100 years, our PCs, tablets and GPUs would be like telegraph equipment.
    Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    That's an amazing comment considering the years-long AMD standard block look on 99%-plus of AMD cards we've been treated to.

    I remember being sick to my stomach seeing the same old red red red red red PCB on them all. Finally one amd fan promoter claimed he had a blue-PCB amd card and linked a pic, but it had the same old sad red square cover with the black lines.

    I do realize when the amd double D breast design recently hit many fanboys went into some sort of sexually perverse mental mode, but that shouldn't wipe out the endless years of amd standard fare we were all tortured with.

    In the case of this card, there's a lot of white on the outside I haven't seen anywhere else; the white "top" with printing will be staring at you out of the case, something so many cards have been oblivious to for far too long... then we also have the black carbon look - another unusual feature, although with the fanboyism over anything and everything black, that is understandable, as I'm sure their PR boys figured that part a clear win, sadly enough.
    Reply
  • Haravikk - Saturday, July 21, 2012 - link

    With 4GB of RAM it seems like it's almost intended to be the ultimate Second Life card; powerful enough to handle that app's mediocre but insanely demanding graphics, with the RAM to hold all the hundreds of overly high-resolution textures plastered onto every visible surface.

    But for $660 I'm not sure it's worth the novelty =)
    Reply
  • dave1_nyc - Saturday, July 21, 2012 - link

    OK, I know that this is trivial, but the previous Classified at least looked kinda cool and unique. This one seems visually unappealing. Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    But usually, within a few minutes of an insane "unboxing" session much like a religious pilgrimage (with possibly some absolutely boring youtube minutes somehow considered a "treat" by the disturbed, of which there are many), you shove it in the case and put on the side cover... never to really see it ever again in its full glory, until its death.

    What you will see is the big fat WHITE label and red Classified printing jamming at your face if you have a side window..... clearly the most important aspect - even though 98% don't have a window to look through... but if you do - you're set.

    Don't mind me - I'm still amazed how "the feel" of some look makes it or breaks it for 99% of the retarded humans that surround me - especially when "the looking" is done like .000001% of the time as in the case of these video cards.

    It must have to do with their estrogen levels I tell myself, or maybe they don't have a girlfriend and that's why...
    Reply
  • MrSpadge - Saturday, July 21, 2012 - link

    > Software overvoltage control is forbidden.

    I can understand this for the reference design. But for custom designs? WTF?!
    Reply
  • shin0bi272 - Sunday, July 22, 2012 - link

    The instant I saw the original 680, I said that the 256-bit memory bus was going to limit it severely. Even before I saw any other stats for the thing, I knew I'd never buy one. Nvidia was cheap when they released the 680: they saw what the 7970 was putting out and said, we'll call our 660 midrange and our 680 high end, and we can make more money (also love the fact that you guys test the handful of games where amd's 7 series beats the nvidia 6 series... not cherry-picking your benchmarks at all, nooo).

    This card does push the 680 to its limit, which is cool and all, but it just proves that a) a 256-bit memory bus is still a midrange-card designator, no matter how much they claim GDDR5 is fast enough to not need more than that... it does need more. And b) Nvidia could have pushed the 680's base clock up much higher and, while it would still be bottlenecked badly, it would have been more attractive.

    Bring on the 700 series, I'm done with the 6's
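    For context, the bandwidth a given bus width actually delivers is easy to compute. A quick sketch (6008MT/s is the reference GTX 680 effective memory data rate; the 7970's 384-bit/5500MT/s bus is shown for comparison):

```python
# Peak memory bandwidth from bus width and effective data rate.
# 6008 MT/s is the reference GTX 680 memory speed; 5500 MT/s is
# the reference HD 7970 speed, shown for comparison.

def bandwidth_gbps(bus_bits, effective_mtps):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mtps * 1e6 / 1e9

print(f"GTX 680 (256-bit): {bandwidth_gbps(256, 6008):.1f} GB/s")
print(f"HD 7970 (384-bit): {bandwidth_gbps(384, 5500):.1f} GB/s")
```

    That works out to roughly 192 GB/s vs. 264 GB/s, which is the gap being argued about here.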
    Reply
  • HisDivineOrder - Tuesday, July 24, 2012 - link

    To be fair, AMD started the gouging with the 7970 series, its moderate boost over the 580 series, and its modest mark-up over that line.

    When nVidia saw what AMD had launched, they must have laughed and rubbed their hands together with glee. Because their mainstream part was beating it and it cost them a LOT less to make. So they COULD have passed those savings onto the customer and launched at nominal pricing, pressuring AMD with extremely low prices that AMD could probably not afford to match...

    ...or they could join with the gouging. They joined with the gouging. They knocked the price down by $50 and AMD's pricing (besides the 78xx series) has been in a freefall ever since.
    Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    You people are using way too much tin foil; it's already impinged blood flow to the brain from its weight crimping that toothpick neck... at least the amd housefire heat rays won't further cook the egg under said foil hat.

    Since nVidia has just barely recently gotten a few 680's and 670's in stock on the shelves, how, pray tell, would they produce a gigantic 7-billion-transistor chip that it appears no forge, even the one in Asgard, could possibly have produced in time for any launch, up to and perhaps past even today?

    See that's what I find so interesting. Forget reality, the Charlie D semi accurate brain fart smell is a fine delicacy for so many, that they will never stop inhaling.

    Wow.

    I'll ask again - at what price exactly was the 680 "midrange" chip supposed to land? Recall the GTX580 was still $499+ when amd launched - let's just say, since nVidia was holding back according to the 20 lbs of tinfoil you guys have lofted, they could have released the GTX680 midrange when their GTX580 was still $499+ - right when AMD launched... so what price exactly was the GTX680 supposed to be, and where would that put the rest of the lineups on down the price alley?

    Has one of you wanderers EVER contemplated that? Where are you going to put the card lineups with the GTX680 at the $250-$299 midrange in January? Heck... even right now, you absolute geniuses?
    Reply
  • natsume - Sunday, July 22, 2012 - link

    For that price, I'd rather have the Sapphire HD 7970 Toxic 6GB @ 1200MHz Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    Currently unavailable it appears.

    And amd fanboys have told us the 7970 overclocks so well (to 1300MHz, they claim), so who cares.

    The Toxic starts at 1100MHz, and no amd fanboy would admit the run-of-the-mill 7970 can't do that out of the box, since that's all we've heard now since January.

    It's nice seeing 6GB on a card, though, that cannot use even 3GB and maintain a playable frame rate at any resolution or settings, including attempts with 100+ Skyrim mods at once.
    Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    Sad how it loses so often to a reference GTX680 in 1920 and at triple monitor resolutions.

    http://www.overclockersclub.com/reviews/sapphire__...
    Reply
  • Sabresiberian - Sunday, July 22, 2012 - link

    One good reason not to have it is the fact that software overclocking can sometimes be rather wonky. I can see Nvidia erring on the cautious side to protect their customers from untidy programs.

    EVGA is a company I want to love, but they are, in my opinion, one that "almost" goes the extra mile. This card is a good example, I think. Their customers expressed a desire for unlocked voltage and 4GB cards (or "more than 2GB"), and they made it for us.

    But they leave the little things out. Where do you go to find out what those little letters mean on the EVBot display? I'll tell you where I went - to this article. I looked in the EVBot manual, looked up the manual online to see if it was updated - it wasn't; I scoured the website and forums, and nowhere could I find a breakdown from EVGA of what the list of voltage settings meant!

    I'm not regretting my purchase of this card; it is a very nice piece of hardware. It just doesn't have the 100% commitment behind it that a piece of hardware like this should.

    But then, EVGA, in my opinion, does at least as well as anybody. MSI is an excellent company, but they released their Lightning, which was supposed to be over-voltable, without a way to do it. Asus makes some of the best stuff in the business - if their manufacturing doesn't bungle the job and leave film that needs to be removed between heatsinks and whatever they should be attached to.

    Cards like this are necessarily problematic. To make them worth their money in a strict results sense, EVGA would have to guarantee they overclock to something like 1400MHz. If they bin to that strict of a standard, why don't they just factory overclock to 1400 to begin with?

    And, what's going to be the cost of a chip guaranteed to overclock that high? I don't know; I don't know what EVGA's current standards are for a "binning for the Classified" pass, but my guess is it would drive the price up, so that cost value target will be missed again.

    No, you can judge these cards strictly by value for yourself, that's quite a reasonable thing to do, but to be fair you must understand that some people are interested in getting value from something other than better frame rates in the games they are playing. For this card, that means the hours spent overclocking - not just the results, the results are almost beside the point, but the time spent itself. In the OC world that often means people will be disappointed in the final results, and it's too bad companies can't guarantee better success - but if they could, really what would be the point for the hard-core overclocker? They would be running a fixed race, and for people like that it would make the race not worth running.

    These cards aren't meant for the general-population overclocker who wants a guaranteed extra "bang for the buck" out of his purchase. Great OCing CPUs like Nehalem and Sandy Bridge bring a lot of people into the overclocking world who expect to get great results easily and who don't understand the game it is for those actually responsible for discovering those great overclocking items, and that kind of person talks down a card like this.

    Bottom line - if you want a GTX 680 with a guaranteed value equivalent to a stock card, then don't buy this card! It's no more meant for you than a Mack truck is meant to be a family car. However, if you are a serious overclocker that likes to tinker and wants the best starting point, this may be exactly what you want.

    ;)
    Reply
  • Oxford Guy - Sunday, July 22, 2012 - link

    Nvidia wasn't happy with the partners' designs, eh? Oh please. We all remember the GTX 480. That was Nvidia's doing, including the reference card and cooler. Their partners, the ones who didn't use the awful reference design, did Nvidia a favor by putting three fans on it and such.

    Then there's the lack of mention of Big Kepler on the first page of this review, even though it's very important for framing since this card is being presented as "monstrous". It's not so impressive when compared to Big Kepler.

    And there's the lack of mention that the regular 680's cooler doesn't use a vapor chamber like the previous generation card (580). That's not the 680 being a "jack of all trades and a master of none". That's Nvidia making an inferior cooler in comparison with the previous generation.
    Reply
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    I, for one, find the third-to-last paragraph of the first review page a sad joke.

    Let's take this sentence for instance, and keep in mind that the nVidia reference cooler does everything better than the AMD reference:
    " Even just replacing the cooler while maintaining the reference board – what we call a semi-custom card – can have a big impact on noise, temperatures, and can improve overclocking. "

    One wonders why AMD's epic failure in comparison never gets that kind of treatment.

    If nVidia doesn't find that sentence I mentioned a ridiculous insult, I'd be surprised, because just before that, they got treated to this one: " NVIDIA’s reference design is a jack of all trades but master of none "

    I guess I wouldn't mind one bit if the statements were accompanied by flat-out remarks that, despite the attitude presented, AMD's mock-up is an embarrassingly hot and loud disaster in every metric of comparison...

    I do wonder where all these people store all their mind bending twisted hate for nVidia, I really do.

    The 480 cooler was awesome because one could simply remove the GPU sink and still have a metal-covered front on the PCB, so a better GPU heatsink would solve OC limits - and the card was already 10-15% faster than the 5870 at stock and gained more from OC than the 5870 did.

    Speaking of that, we're supposed to still love the 5870. This site claimed the 5850, which lost to the 480 and 470, was the best card to buy, and to this day our AMD fans proclaim the 5870 a king. Compare that to their new best-bang 6870 and 6850, which were derided for lack of performance when they came out, and now 6870 CF is some wunderkind for the fan boys.

    I'm pretty sick of it. nVidia spanked the 5000 series with their 400 series, then slammed the GTX 460 down their throats to boot - the card AMD fans never mention now, pretending it never existed and still doesn't exist...
    It's amazing to me. All the blabbing, stupid praise for AMD cards, while nVidia cards either go unmentioned or just get cut down and attacked - since AMD always loses, that must be why.
    Reply
  • Oxford Guy - Tuesday, July 24, 2012 - link

    Nvidia cheaped out and didn't use a vapor chamber for the 680 as it did with the 580. AMD is irrelevant to that fact.

    According to techpowerup's calculations, the GF100 has far worse performance per watt than anything AMD released at 40nm. The 480 was very hot and very loud, regardless of whether AMD even existed in the market.

    AMD may have a history of using loud, inefficient cooling, but my beef with the article is that Nvidia developed a more efficient cooler (the 580's vapor chamber) and then didn't bother to use it for the 680, probably to save a little money.
    Reply
  • CeriseCogburn - Wednesday, July 25, 2012 - link

    The 680 is cooler, quieter, and higher performing all at the same time than anything we've seen in a long time, hence "your beef" is a big pile of STUPID dung, and you should know it, but of course, idiocy never ends here with people like you.

    Let me put it another way for the faux educated OWS corporate "profit watcher" jack***: " It DOESN'T NEED A VAPOR CHAMBER YOU M*R*N ! "

    Hopefully that penetrates the inbred "Oxford" stupidity.

    Thanks so much for being such a doof. I really appreciate it. I love encountering complete stupidity and utter idiocy all the time.
    Reply
  • Oxford Guy - Wednesday, July 25, 2012 - link

    "The 680 is cooler and quieter and higher performing all at the same time than anything we've seen in a long time..."

    Gee whiz. I wonder why 28nm is more efficient than 40nm.
    Reply
  • jmsabatini - Tuesday, July 24, 2012 - link

    You really needed to test Skyrim with a TON of ultra-high-res texture mods.

    Without that, your results were right in line with my expectations, i.e. not worth the extra $$$ over the vanilla card.
    Reply
  • CeriseCogburn - Wednesday, July 25, 2012 - link

    Something the regular-joe moronic masses cannot seem to comprehend for the life of them is that when the cores peter out before the RAM can be effectively used while maintaining a playable framerate, no amount of memory, no matter how much, "can help".

    Let me put it another way:

    The card makers need a more powerful core to use more than 2GB of memory (actually less than 2, but I won't go into that).

    The results have been all over the web for months. No one should still be so utterly blind to the disclosed facts.

    Reply
  • hammbone852 - Saturday, September 15, 2012 - link

    This is disappointing, but I'm guessing the drivers aren't pushing the GPU to its full potential. This card should be in the top 3 in all tests. Reply
  • Pey - Friday, September 21, 2012 - link

    As an SLI user, I will buy two of these, and I'll tell you why.

    I tried two MSI 680 Lightnings and the temps go up to 75-76° on each card, while two 680 reference cards reach 72°. You guys forget that SLI users have to get cards with a blower - unless you have watercooling.
    I know the price is not ideal, but for people like me who are looking for a couple of overclocked 680s and a proper watercooling-free solution, this is the card to go for.
    You could go with the MSI Lightning, but when you put in another card, temps will go up 15-20° on each card, and I don't like playing at 76-78°, plus the noise.
    Reply
