405 Comments

  • Wreckage - Thursday, March 22, 2012 - link

    Impressive. This card beats AMD on EVERY level! Price, performance, features, power..... every level. AMD paid the price for gouging its customers; they are going to lose a ton of market share. I doubt they have anything to match this for at least a year. Reply
  • Creig - Thursday, March 22, 2012 - link

    The review has been up for less than a minute so you couldn't possibly have read it already. How pathetic is it that you were sitting there hitting F5 repeatedly just so you could get in another "First post! Nvidia is uber!" comment.

    Get a life.
    Reply
  • Grooveriding - Thursday, March 22, 2012 - link

    Haha Creig,

    Good observation, he must have been sitting there spamming to get in that first comment, before he read a word of the review.

    Sour grapes at being banned much, Wreckage ?
    Reply
  • nathanddrews - Thursday, March 22, 2012 - link

    ... but he's correct. The 680 does dominate in nearly every situation and category.

    "the GTX 680 is faster, cooler, and quieter than the Radeon HD 7970. NVIDIA has landed the technical trifecta, and to top it off they’ve priced it comfortably below the competition."

    Obviously Wreckage's analysis of AMD's "price gouging" and prophecies of doom are far-fetched...
    Reply
  • N4g4rok - Thursday, March 22, 2012 - link

    Well, yeah, the card does well in most of those tests, but i think it might be a little too far to say that it dominates the 7970 on every level. Reply
  • cactusdog - Thursday, March 22, 2012 - link

    Just finished looking around various sites and the 680 isn't as good as was suggested.

    Don't forget, you're basically comparing an overclocked Nvidia card to a stock AMD card, and even the base clock is much higher on the Nvidia card.

    At the same clocks the results will look much better for AMD. Also, 3 monitor gaming could favour AMD with 3GB of vram.

    Seems like Nvidia really wanted to target the 7970 and the price/performance tag this time, by building a souped-up, overclocked GK104, but it's not a 7970 killer. AMD will just need to sell them for $449.

    AMD can stay with the 7970 as planned until Q4 and the 8 series.
    Reply
  • gamerk2 - Thursday, March 22, 2012 - link

    You know clocks aren't the only thing that determines speed? Couldn't one just as easily argue that AMD cards were better because they basically clocked their RAM so high?

    Fact is, at stock, the card beats the 7970 at a lower price. Period.
    Reply
  • Meaker10 - Thursday, March 22, 2012 - link

    You're ignoring the host of factory-overclocked models out there that will be quieter than the 680 and perform on a similar level; the price just needs to be dropped. Reply
  • Kakkoii - Thursday, March 22, 2012 - link

    Herp derp, the same can be said about Nvidia cards as well. The 680 has tonnes of OC'ing headroom. The GPU Boost it has is a measly overclock. Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    I guess the lower-clocked cores of the 470, 480, 570, 580, and many other Nvidia cards were greatly cheated in all benchmarks because the AMD cores were often well over 100 MHz higher at stock....
    So we have had at least 3 years of lying benchmarks in AMD's favor.
    I'd like to personally thank gamerk2 for this very astute observation that sheds the light of Nvidia absolutely winning in all the above-mentioned Nvidia cards for the past couple of years.
    Thank you... (sarc/reality check is free)
    Reply
  • toastyghost - Sunday, April 29, 2012 - link

    Oh look, a fanboy fight in the comments on a hardware site. How very original. Reply
  • jewie27 - Sunday, July 08, 2012 - link

    tonnes? WTF? Reply
  • santiagodraco - Thursday, March 22, 2012 - link

    If you think overclocking RAM (which you imply but which isn't necessarily even true) makes as big a difference as overclocking the GPU, then you are fooling yourself.

    The GPU does the work, not the ram.

    As for price/performance yes the 680 appears to be better now (they are ALWAYS leapfrogging each other) but wait until ATI releases their new variation, cuts prices to match and beats Nvidia by 20% or more... it will happen. Does every time :)
    Reply
  • vol7ron - Thursday, March 22, 2012 - link

    They're both important.

    What does a fast number cruncher mean, if it's busy waiting on the numbers?

    Both the GPU and its RAM are important, and they can both be bottlenecks.
    Reply
  • Iketh - Thursday, March 22, 2012 - link

    "The GPU does the work, not the ram."

    LOL, you couldn't say anything more stupid!
    Reply
  • grave00 - Friday, March 23, 2012 - link

    Sometimes I really wish the editors could come in here and mark posts with strong agreement or disagreement. I'd like to know what they think of claims like "the GPU does all the work" vs. "the RAM doesn't do much." I have an uninformed opinion. The interested but uninformed need some kind of truth detector. Maybe just for a few pages' worth. I start to lose my grip on what is real in the forum after a while, fun though it may be. Reply
  • blanarahul - Tuesday, March 27, 2012 - link

    Question 1

    To understand the statement that "GPUs do all the work and memory doesn't", consider this:

    1. You overclocked your Graphics Card, but only the core and not the memory.

    You ran a benchmark and let's assume you got a score of 100.

    2. Now, you overclocked your memory and ran the same benchmark again.

    You got the score of 101.

    This is what actually happens in MOST cases. It doesn't always happen.

    Question 2

    Why doesn't it always happen?

    Answer: If you use extreme methods and take your core clock too high, the memory will become a bottleneck.

    Consider that you try to overclock using liquid nitrogen.

    1. After overclocking only the core clock to the maximum.

    Benchmark score:- 150

    2. You overclock your memory too.

    Benchmark score:- 200

    In this case the memory was holding back the GPU core from operating at its full potential.

    But this does not happen if you don't use extreme methods.

    I hope this helps.
    Reply
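The core-vs-memory bottleneck described in the comment above can be sketched as a toy min() model. All numbers and scaling factors here are hypothetical, chosen purely to illustrate the idea, not measured from any real card:

```python
# Toy "bottleneck" model: a benchmark score is capped by whichever of
# core throughput or memory bandwidth saturates first. The scaling
# factors below are made up purely for illustration.
def benchmark_score(core_mhz, mem_mhz):
    core_limit = core_mhz // 10       # score if memory were unlimited
    mem_limit = mem_mhz * 8 // 100    # score if the core were unlimited
    return min(core_limit, mem_limit)

# Mild core overclock: memory is nowhere near its limit, so the score
# tracks the core clock and a memory OC adds almost nothing.
print(benchmark_score(1000, 1500))  # 100 (stock)
print(benchmark_score(1050, 1500))  # 105 (core OC alone helps)

# Extreme core overclock (the liquid-nitrogen case): the memory limit
# is now the binding one, so only a memory OC raises the score.
print(benchmark_score(2000, 1500))  # 120 (extra core clock is wasted)
print(benchmark_score(2000, 2000))  # 160 (memory OC unlocks the core)
```

Under this assumed min() relationship, a memory overclock at stock core clocks moves the score barely at all (the "100 vs. 101" case), while at extreme core clocks the memory becomes the limiter and a memory overclock is what pays off.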
  • CeriseCogburn - Friday, April 06, 2012 - link

    Actually, the 79xx series is the first time in a very long time AMD has had a lead, let alone a lead of 20%, let alone "leapfrogging".
    AMD has been behind since the 8800 GTX, and I don't know how long before that.
    Let's face it, the 79xx for 2.5 months was the first time AMD played Frogger in a long time and made it across the street without getting flattened before stepping off the curb.
    You're welcome for the correct and truth-filled history.
    Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Sorry, but the 7970 is still much faster in Crysis min FPS, which I would argue is more important than average. It's faster in Metro as well.

    All things considered, the 7970 stands up against the 680GTX well.

    Let's also consider H.264 acceleration; as far as I can tell the 680GTX has none.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    It loses in everything to the 680, including 3-monitor performance.
    That's not standing up well; it's larger, hotter, and slower at everything, with far fewer features, and it's $60 more.
    FXAA
    dynamic Vsync
    turbo
    More features I'm sure you fans of the loser underdog don't care about as of 9 am this morning.
    It's EPIC FAIL and it's not standing; it's decked to the ground and can't get up.
    Reply
  • Targon - Thursday, March 22, 2012 - link

    Many people have been blasting AMD for price vs. performance in the GPU arena in the current round of fighting. The thing is, until now, AMD had no competition, so it was expected that the price would remain high until NVIDIA released their new generation. So, expect lower prices from AMD in the next week.

    You also fail to realize that with a 3 month lead, AMD is that much closer to the refresh parts being released that will beat NVIDIA for price vs. performance. Power draw may still be higher from the refresh parts, but that will be addressed for the next generation.

    Now, you and others have been claiming that NVIDIA is somehow blowing AMD out of the water in terms of performance, and that is NOT the case. Yes, the 680 is faster, but isn't so much faster that AMD couldn't EASILY counter with a refresh part that catches up or beats the 680 NEXT WEEK. The 7000 series has a LOT of overclocking room there.

    So, keep things in perspective. A 3 percent performance difference isn't enough to say that one is so much better than the other. It also remains to be seen how quickly the new NVIDIA parts will be made available.
    Reply
  • SlyNine - Thursday, March 22, 2012 - link

    I still blast them, I'm not happy with the price/performance increase of this generation at all. Reply
  • Unspoken Thought - Thursday, March 22, 2012 - link

    Finally! Logic! But it still falls on deaf ears. We finally see both sides getting their act together to get minimum feature sets in, and we can't see past our own biased squabbles.

    How about we continue to push these manufacturers on what we want and need most: more features, better algorithms, and, last and most important, revolutionizing and finding new ways to render, aside from vector-based rendering.

    Let's start incorporating high-level mathematics for fluid dynamics and such. They have already absorbed PhysX and moved to DirectCompute. Let's see more realism in games!

    Where is the Technological Singularity when you need one?
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Well, the perspective I have is that AMD had a really lousy (without drivers) and short 2.5 months when the GTX 580 wasn't single-core king, with the GTX 590 dual-core king; the latter still is, and the former has been replaced by the GTX 680.
    So right now Nvidia is the absolute king, and before now, save that very small time period, Nvidia was core king with the 580 for what, 1.5 years?
    That perspective is plain fact.
    FACTS - just stating those facts makes things very clear.
    We have already heard the Nvidia monster die is afoot - that came out with "all the other lies" that turned out to be true...
    I don't fail to realize anything - I just have a clear mind about what has happened.
    I certainly hope AMD has a new, better core slapped down very soon; a month would be great.
    Until AMD is really winning, it's LOSING man, it's LOSING!
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Since AMD had no competition for 2.5 months and that excuses its $569.99 price, then certainly the $500 price on the GTX 580, which had no competition for a full year and a half, was not an Nvidia fault, right? Because you're a fair person, and "finally logic!" is what another poster supported you with...
    So thanks for saying the GTX 580 was never priced too high, because it had no competition for 1.5 years.

    Reply
  • Unspoken Thought - Saturday, March 24, 2012 - link

    Honestly, the only thing I was supporting was the fact that he is showing that perspective changes everything, a fact exacerbated when bickering over marginal differences that are driven by the economy when dealing with price vs. performance.

    Both of you have valid arguments, but it sounds like you just want to feel better about supporting nVidia.

    You should be able to see how AMD achieved its goals, with nVidia following along playing leapfrog. Looking at benchmarks, no, it doesn't beat it in everything, and both are very closely matched in power consumption, heat, and noise. Features are where nVidia shines and gets my praise, but I would not fault you for having either card.
    Reply
  • CeriseCogburn - Friday, April 06, 2012 - link

    OK Targon, now we know TiN put the 1.3V core on the 680 and it OC'ed on air to 1,420 MHz core, surpassing every 7970 1.3V core overclock out there.
    Furthermore, Zotac has put out the 2,000 MHz 680 edition...
    So it appears the truth comes down to this: the GTX 680 has more left in the core than the 7970.
    Nice try but no cigar!
    Nice spin but AMD does not win!
    Nice prediction, but it was wrong.
    Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Go back and look at the benchmarks idiot. 7970 wins in some situations. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    In Crysis max, the 7970 gets 36 FPS while the 680 only gets 30 FPS.

    Yes, somehow the 7970 is losing. LOOK AT THE NUMBERS, HELLO!!???

    In Metro 2033 the 7970 gets 38 and the 680 gets 37. But yet in your eyes, another loss for the 7970...

    The 7970 kills it in certain GPU compute, and has hardware H.264 encoding.

    In a couple of games, where you already get massive FPS with both, the 680 boasts much, much higher FPS. But then in games where you need the FPS, the 7970 wins. Hmmm.

    But no no, you're right, the 680 is total elite top shit.
    Reply
  • eddieroolz - Friday, March 23, 2012 - link

    You pretty much admitted that 7970 loses in a lot of other cases by stating that:

    "7970 kills it in certain GPU compute..."

    Adding the modifier "certain" to describe a win is like admitting defeat in every other compute situation.

    Even for the games, you can only mention 2 out of what, 6 games where 7970 wins by a <10% margin. Meanwhile, GTX 680 proceeds to maul the 7970 by >15% in at least 2 of the games.

    Yes, 7970 is full of win, indeed! /s
    Reply
  • SlyNine - Friday, March 23, 2012 - link

    lol, you're way off the mark.

    My point wasn't that the 680GTX isn't faster; it's that the 7970 does stand up well against the 680GTX in performance.

    As far as compute goes, I'm not sure I understand your premise. Frankly, I think it's an invalid inference. I said it kills it. If that somehow implies it loses in the other compute tests, I'm not sure how you got there. Again, an invalid inference from the data.
    Reply
  • Galidou - Friday, March 23, 2012 - link

    You didn't get the point of what he meant. Yes, AMD is losing, but mostly in games that already run 60+ FPS. The games AMD wins are where it's still not maxed out yet (below 60 FPS).

    Which maybe means if some big demanding games come out, the winning/losing scheme might go back and forth. But right now, not many games out there will push those GPUs unless you've got very high resolutions, and right now I think 90% of gamers have 1080p and lower, which still runs super smooth with 95% of graphical options enabled on a $150 GPU...

    Still gotta say that this GTX 680 is really good for a flagship, and the first one that's not uber huge and noisy and hot...
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Shogun 2: Total War is THE HARDEST GAME in this bench set, not Metro 2033 and not Crysis Warhead.
    Sorry feller but ignoring that gets you guys the big fib you want.
    Sorry.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    SHOGUN 2: the 680 wins at top rez.
    from article " Total War: Shogun 2 is the latest installment of the long-running Total War series of turn based strategy games, and alongside Civilization V is notable for just how many units it can put on a screen at once. As it also turns out, it’s the single most punishing game in our benchmark suite"

    OH WELL, guess it's the Metro 2033 and Crysis game engines, because in the hardest game the Nvidia 680 wins.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    No, you're WRONG. 1. The 680 wins 1 bench in Metro 2033, and ties within bench error on the other two resolutions.
    The hardest game, as stated by the reviewer (since you never read), is Shogun 2: Total War, and Nvidia makes a clean sweep at all resolutions there.
    In fact, the Nvidia card wins everything but Crysis here, ties in Metro, and smokes everything else.
    If Metro isn't a tie, take a look at the tie Ryan has for Civ5 and get back to me...!
    (hint: Nvidia wins by far more in Civ5)
    So--- let's see, one game with weird, old benching and an AMD-favored benchmark (dumping the waterfall bench that Nvidia always won) (Crysis),
    one "tie" in Metro 2033, then Nvidia sweeps the rest of them, many by gigantic frame-rate victories.
    Other places show Nvidia winning Metro 2033 by a lot. (pureoverclock for one)
    ....
    No, I'm not the one fudging, spinning, and worse. You guys are. You lost, lost bad; man up.
    Reply
  • b3nzint - Monday, March 26, 2012 - link

    The GTX 680 got higher clocks and way higher memory bandwidth than the 7970; that's why it got lower power load and price. But I think we can only compare two things if they have the "same" engine, like drag-race cars. Both of them made a big leap from the previous tech, and that's a win for us.
    btw, who came out first? AMD. I say AMD wins, period. So next time maybe they must release next-gen GPUs at the same time.
    Reply
  • b3nzint - Monday, March 26, 2012 - link

    Sorry, what I meant was the GTX 680 has lower memory bandwidth, so it gains lower power. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Or a memory overclock unleashes it, and it screams even further away at the top of the charts... Reply
  • dlitem - Thursday, March 22, 2012 - link

    Actual street prices can be different:

    At least here on the eastern shore of the Atlantic, German retailers are selling 7970s starting at €460-470 including taxes with cards in stock, and GTX 680s are starting at €499 with taxes...
    Reply
  • TheRealArdrid - Thursday, March 22, 2012 - link

    Sigh, are people really relying on that weak argument again? It's the same thing people said when Intel started trouncing AMD: it's not fair because Intel has Turbo Boost. Reply
  • jospoortvliet - Thursday, March 22, 2012 - link

    Judging from other sites, the AMD card does overclock better than the NVIDIA card, and the difference in power usage in everyday scenarios is that NVIDIA uses a few more watts at idle and a few less under load.

    I'd agree with my dutch hardware.info site which concludes that the two cards are incredibly close and that price should determine what you'd buy.

    A quick look shows that at least in NL, the AMD is about 50 bucks cheaper so unless NVIDIA lowers their price, the 7970 continues to be the better buy.

    Obviously, AMD has higher costs with the bigger die so NVIDIA should have higher margins. If only they weren't so late to market...

    Let's see what the 7990 and NVIDIA's answer to that will do; and what the 8000 and 700 series will do and when they will be released. NVIDIA will have to make sure they don't lag behind AMD anymore, this is hurting them...
    Reply
  • theartdude - Thursday, March 22, 2012 - link

    Late to market? With Battlefield DLC, Diablo III, MechWarrior Online (and many more titles approaching), this is the PERFECT TIME for an upgrade. BTW, my computer is begging for an upgrade right now, just in time for summer-time LAN parties. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    The GTX 680 overclocks to 1,280 MHz out of the box for an average easy attempt...
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    See the feedback bro.
    The 7970 makes it to 1,200 MHz if it's very lucky.
    Sorry, another lie is that the 7970 OCs better.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    So you're telling me the LIGHTNING AMD card is cheaper? LOL
    Further, if you don't get that exact model you won't get the overclocks, and they got a pathetic 100 MHz on the Nvidia, which noobs surpass regularly; then they used 3DMark 11, which has AMD's tessellation driver cheating active.... (apparently they are clueless there as well).
    Furthermore, they declared the Nvidia card 10% faster overall, well worth the 50 bucks difference over your generic AMD card, not the overclocked Lightning, further overclocked with the special VRMs onboard and much more expensive... and then not game-tested, but benched in AMD cheater-ware 3DMark 11 with the tess cheat.
    Reply
  • Reaper_17 - Thursday, March 22, 2012 - link

    I agree. Reply
  • blanarahul - Tuesday, March 27, 2012 - link

    Mr. AMD Fanboy, then you should compare how AMD was doing it since the HD 5000 series.

    6970= 880 MHz
    GTX 580=772 MHz
    Is it a fair comparison?

    GTX 480=702 MHz
    HD 5870=850 Mhz
    Is it a fair comparison?

    According to your argument the NVIDIA cards were at a disadvantage since the AMD cards were always clocked higher. But still the NVIDIA cards were better.

    And now that NVIDIA has taken the lead in clock speeds you are crying like a baby that NVIDIA built a souped up overclocked GK104.

    First check the facts. Plus the HD 8000 series aren't gonna come so early.
    Reply
  • CeriseCogburn - Friday, April 06, 2012 - link

    LOL
    +1
    Tell 'em bro !
    (fanboys and fairness don't mix)
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    Yah, I agree here. Clearly, once again, your favorite game and the screen size (resolution) you run at are going to be important factors in making a wise choice.

    ;)
    Reply
  • Concillian - Thursday, March 22, 2012 - link

    "... but he's correct. The 680 does dominate in nearly every situation and category."

    Except in some of the most consistently and historically demanding games (Crysis Warhead and Metro 2033), it doesn't fare so well compared to the AMD designs. What does this mean if the PC gaming market ever breaks out of its console-port funk?

    I suppose it's unlikely, but it indicates it handles easy loads well (loads that can often be handled by a lesser card), but when it comes to the most demanding resolutions and games, it loses a lot of steam compared to the AMD offering, to the point where it goes from a >15% lead in games that don't need it (Portal 2, for example) to a 10-20% loss in Crysis Warhead at 2560x.

    That it struggles in what are traditionally the most demanding games is worrisome, but, I suppose as long as developers continue pumping out the relatively easy to render console ports, it shouldn't pose any major issues.
    Reply
  • Eugene86 - Thursday, March 22, 2012 - link

    Yes, because people are really buying both the 7970 and GTX680 to play Crysis Warhead at 2560x.... :eyeroll:

    Nobody cares about old, unoptimized games like that. How about you take a look at the benchmarks that actually, realistically, matter. Look at the benches for Battlefield 3, which is a game that people are actually playing right now. The GTX680 kills the 7970 with about 35% higher frame rates, according to the benchmarks posted in this review.

    THAT is what actually matters and that is why the GTX680 is a better card than the 7970.
    Reply
  • SlyNine - Thursday, March 22, 2012 - link

    While I love BF3, it's not the only game that matters, but it is the only example of the 680 beating the 7970 at frame rates that matter. However, the 7970 catches up as the res goes up. If we add 2 monitors, does the 680 still win?

    BTW Crysis and Metro 2033 FPS matters to me. Do you think the GPU world revolves around you and what you want? You are not the center of the Videocard world.
    Reply
  • Eugene86 - Thursday, March 22, 2012 - link

    No, the GPU world doesn't revolve around me, but as I already said, nobody but you and a handful of other people actually cares about the Crysis and Metro benchmarks, because almost nobody plays those games anymore.
    In my example, Battlefield 3 is a current game that is actually played by people, so those benchmarks are useful.
    The only reason Crysis and Metro are used is that they are benchmark games that stress the video cards to their limits. This is nice for bragging rights but completely useless in the real world.
    Nvidia and AMD both release video cards that are aimed to please their main market, which is gamers who play on a single monitor at 1080p.
    Reply
  • SlyNine - Friday, March 23, 2012 - link

    So what are you basing this on? Can you give me any sources?

    I'm not buying a $600 video card for just one game.

    Plus like I said, as the settings go up, they seem to converge. I can't help but wonder if the 7970 would overtake the 680 at some point before we hit 30fps.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Then look at Shogun 2: Total War in this very article, man.
    Wow, so many of you are so controlled and so mindless on things...
    " Total War: Shogun 2 is the latest installment of the long-running Total War series of turn based strategy games, and alongside Civilization V is notable for just how many units it can put on a screen at once. As it also turns out, it’s the single most punishing game in our benchmark suite "
    680 takes the top in that game man.
    Reply
  • Galidou - Sunday, March 25, 2012 - link

    ''Nvidia and AMD both release video cards that are aimed to please their main market, which is gamers who play on a single monitor at 1080p''

    Well then, it means that gamers can be more than pleased with a Radeon 6870 at $140 that runs everything with 95% of graphical options enabled, or a GTX 560 Ti (not the 448-core version) for around $200, which performs a little better than the 6870 and still does the trick in every game at 1080p.

    Prices taken from the bay as no regular 560ti was available on newegg for price comparison.

    Oh... and as for the 5% of graphical options you can't turn on, you'll only notice them when you go on a Sunday walk in your games, but doing so will have you dead in a second if you play online against other players...
    Reply
  • b3nzint - Monday, March 26, 2012 - link

    I play Metro and Crysis a lot; amazing graphics! But that's not "real world" to me. Maybe if I play BF3 I'm in the real world? Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Shogun 2: Total War, the most demanding game in the benches - did you read?
    The Nvidia GTX 680 sweeps the entire resolution set, beating the slower 7970, which cannot handle modern demanding games as well.
    Reply
  • akse - Thursday, March 22, 2012 - link

    Seems impossible to do such a feat!!! Considering they launched it months later than the competitor! Reply
  • arjuna1 - Thursday, March 22, 2012 - link

    Correct??

    You call a +10 FPS difference at best, in certain situations, domination??

    The only good thing this will bring is prices down, the rest is truly unremarkable, for both companies.

    See you in the 8xxx/7xxx series.
    Reply
  • Wreckage - Thursday, March 22, 2012 - link

    Maybe I was a bit hasty. I did forget to mention that it hard launched with working drivers and working H.264 encoding, and it's also quiet under load. Impressive++ Reply
  • Owls - Thursday, March 22, 2012 - link

    Not hasty but trolly. Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    He states the facts that shame AMD's recent launch fiasco, and that's trolling...
    Way to go. Tell the truth, you're a troll. Only if we tell lies for amd are we being a good poster boy.
    Reply
  • Lazlo Panaflex - Thursday, March 22, 2012 - link

    The Bitch is back! Go troll some other site, chump. Reply
  • flashbacck - Thursday, March 22, 2012 - link

    Lol. What a loser! Reply
  • medi01 - Thursday, March 22, 2012 - link

    But he read the title.
    Besides, Anand never misses a chance to piss on AMD's cookies.
    Reply
  • medi01 - Thursday, March 22, 2012 - link

    "but as always, at the end of the day it’s NVIDIA who once again takes the performance crown"

    Oh boy...
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    The 680 stomped the 7970 in Civ5, and the reviewer, obviously a you-know-what fan, said the 7970 tied. He's trying as hard as he possibly can for his brothers like you. Reply
  • mschira - Friday, March 23, 2012 - link

    Oh come on. You're just annoyed he beat you.
    And his comment almost makes sense.
    M.
    Reply
  • N4g4rok - Thursday, March 22, 2012 - link

    You should probably go look at the charts, at least. Reply
  • Belard - Thursday, March 22, 2012 - link

    Er... no. In some games, AMD's $350 card was faster... in many games they are comparable with each other. But nothing deadly to AMD.

    In general, the 680 is the faster card.... hopefully we'll see a price war.
    Reply
  • Makaveli - Thursday, March 22, 2012 - link

    What are you ranting about, Wreckage? NV would have done the same thing if they had released their card first!

    Gouging its customers? Don't make me laugh. If the price is too high for you, don't buy it.
    AMD didn't put a gun to anyone's head.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Nvidia released its last 4 flagship cards at $499, yes that's correct, and this one is that.
    Not sure what imaginary world you live in, but it's one that does not include the common facts at hand.
    In other words, AMD already knew far beforehand the $499 flagship Nvidia card price was coming, and so did everyone else who paid attention.
    Reply
  • consolePoS - Thursday, March 22, 2012 - link

    I said haha busted you dickhead goddamn Reply
  • consolePoS - Thursday, March 22, 2012 - link

    woops thought my well thought out comment had been removed, oh well whilst I'm already commenting heres another: "Wreckage is a complete benson and an all around arse-bandit" Reply
  • slayernine - Thursday, March 22, 2012 - link

    Actually it looks to me like the 680 is hugely disappointing, losing to the 7970 and even the 7950 in some tests.

    Try reading the article....
    Reply
  • Hauk - Thursday, March 22, 2012 - link

    Wreckage first again! LMAO.. Reply
  • pandemonium - Friday, March 23, 2012 - link

    I'm confused...are we reading the same article? The 7970 and 680 swap top positions for the most powerful single GPU in several different ways.

    Where is this, "This cards beats AMD on EVERY level! Price, performance, features, power..... every level"?
    Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

    Since game engines favor one style of core over another, we always have a reasonable average across the games chosen for testing as an indicator of "performance".
    Every common and popular website's testing has that "performance" chalked up as the GTX 680 winning.
    you can't argue price
    you can't argue features
    you can't argue power
    --
    20% slower overall
    $80 more expensive
    no on-the-fly OC, no dynamic vsync, no PhysX - no unique destruction, no TXAA
    loses on watts per frame
    --
    Yes we are looking at the same review but your bias has your brain dreaming up other things ?
    Reply
  • IceDread - Friday, March 23, 2012 - link

    No it does not.

    SLI scaling is really bad as is surround.

    http://www.sweclockers.com/recension/15196-geforce...

    Check the graphs.
    Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

    I guess SWEclockers are amateurs or have a chip on their shoulder..

    http://www.guru3d.com/article/geforce-gtx-680-sli-...

    http://www.guru3d.com/article/radeon-hd-7970-cross...

    Nvidia 101 FPS
    AMD 89 FPS
    Reply
  • george1976 - Saturday, March 24, 2012 - link

    Excuse me sir but I think you've been reading the wrong article. Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    Just a heads up guys, we're a bit behind schedule and are still adding images and tables, so hold on. Reply
  • casteve - Thursday, March 22, 2012 - link

    whew - thought my coffee hadn't kicked in :) Reply
  • Granseth - Thursday, March 22, 2012 - link

    Hi, I liked the review, but it's missing a few things, though I expect them to be covered at a later time in a new article - like the improved multi-screen support, SLI, overclocking, and things like that.

    But I would like to know more about this turbo as well. What I am curious about is whether it will boost minimum framerate as well as average framerate, or if the GPU is so taxed when it hits minimum framerate that it won't have anything extra to offer up to its turbo.
    Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    Minimum framerates. -16% power target on the left, stock on the right.

    Crysis Min: 21.4...21.9

    Dirt3 Min: 73.4....77.1

    So to answer your question, it depends on the game.
    Reply
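    The deltas in Ryan's numbers above work out like this (a quick sketch; the framerate figures are from his reply, everything else is just arithmetic):

    ```python
    # Quick arithmetic on the minimum-framerate numbers Ryan posted above
    # (-16% power target vs. stock); nothing here is new data.
    mins = {
        "Crysis": (21.4, 21.9),
        "Dirt 3": (73.4, 77.1),
    }

    for game, (capped, stock) in mins.items():
        gain = (stock - capped) / capped * 100
        print(f"{game}: +{gain:.1f}% minimum framerate at stock")
    ```

    So Dirt 3 gains about twice as much (~5.0%) from the extra power headroom as Crysis (~2.3%), which matches the "it depends on the game" answer.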
  • Jamahl - Thursday, March 22, 2012 - link

    Just a comment on the power draw - I wonder if you could test the 680 and 7970 in a different game, say for example Batman or BF3. The reason is that the 7970 wins in Metro while losing in most of the others, and I wonder if there is something going on regarding power draw. Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    See the GTX 680 win in Metro 2033 all the way up at 1920 and 2560 resolutions:
    http://hothardware.com/Reviews/NVIDIA-GeForce-GTX-...

    What's different is that AAA is used, and the Sandy Bridge-E runs stock at 3,300MHz and is not overclocked.
    What appears to be a big problem for AMD cards is that they have been offloading work to the CPU much more than the Nvidia cards, and even more so in CF vs. SLI, so when you don't have a monster CPU with a monster overclock to boot, the AMD cards lose even worse.
    Reply
  • SlyNine - Friday, March 23, 2012 - link

    Anandtech uses AAA for Metro.

    You need to look again; the difference is no DOF, and hothardware is running at lower settings.

    you, fail.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Oh I didn't fail, I showed the 680 winning in the game that is claimed it loses in.
    That's a WIN for me, period.
    Reply
  • SlyNine - Friday, April 27, 2012 - link

    Ok, so your $500 video card can win at lower settings than the $459 video card. Reply
  • blppt - Thursday, March 22, 2012 - link

    Wondering if you guys could also add a benchmark for one of the current crop of 1GHz-core 7970s that are available now (if you've tested any). Otherwise, great review. Reply
  • tipoo - Thursday, March 22, 2012 - link

    With everything being said by Nvidia, I thought this would be a GeForce 8-series class jump, while it's really nothing close to that and trades blows with AMD's 3-month-old card. GCN definitely had headroom, so I can see lower-priced, higher-clocked AMD cards coming out soon to combat this. Still, I'm glad this will bring things down to sane prices. Reply
  • MarkusN - Thursday, March 22, 2012 - link

    Well, to be honest, this wasn't supposed to be Nvidia's successor to the GTX 580 anyway. This graphics card replaces the GTX 560 Ti, not the GTX 580. GK110 will replace the GTX 580. Even if you can argue that the GTX 680 is now their high-end card, it's just a replacement for the GTX 560 Ti, so I can only dream about the performance of the GTX 780 or whatever they're going to call it. ;) Reply
  • tipoo - Thursday, March 22, 2012 - link

    I didn't know that, thanks. Ugh, even more confusing naming schemes. Reply
  • Articuno - Thursday, March 22, 2012 - link

    If this is supposed to replace the 560 Ti then why does it cost $500 and why was it released before the low-end parts instead of before the high-end parts? Reply
  • MarkusN - Thursday, March 22, 2012 - link

    It costs that much because Nvidia realized that it outperforms/trades blows with the HD 7970 and saw an opportunity to make some extra cash, which basically sucks for us consumers. There are those that say that the GTX 680 is cheaper and better than the HD 7970 and think it costs just the right amount, but as usual it's us, the customers, that are getting the shaft again. This card should've been around $300-350 in my opinion, no matter if it beats the HD 7970. Reply
  • coldpower27 - Thursday, March 22, 2012 - link

    Nah, they aren't obligated to give more than what the market will bear; no sense in starting a price war when they can have much fatter margins. It beats the 7970 already - that's just enough.

    Now the ball is in AMD's court. Let's see if they can drop prices to compete: $450 would be a nice start, but $400 is necessary to actually cause significant competition.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    This whole thing is so nutso, but everyone is saying it.
    Let's take a thoughtful, sane view...
    The GTX 580 flagship was just $500, and a week or two ago it was $469 or so.
    In what world, in what release, in the past ten years even, has either card company released their new product at $170 or $200 off their standard flagship price when it was standing near $500 right before the release?
    The answer is it has never, ever happened, not even close, not once.
    With the GTX 580 at $450, there's no way a card 40% faster is going to be dropped in at $300, no matter what rumor Charlie Demerjian at SemiAccurate has made up from thin air as an attack on Nvidia - a very smart one for not-too-bright people, it seems.
    Please, feel free to tell me what flagship has ever dropped in, cutting nearly $200 off the current flagship price?
    Any of you?!?
    Reply
  • Lepton87 - Thursday, March 22, 2012 - link

    Because nVidia decided to screw its customers and nickel-and-dime them. That's why. All because the 7970 underperformed and nV could get away with it. Reply
  • JarredWalton - Thursday, March 22, 2012 - link

    Or: Because NVIDIA and AMD and Intel are all businesses, and when you launch a hot new product and lots of people are excited to get one, you sell at a price premium for as long as you can. Then supply equals demand and then exceeds demand and that's when you start dropping prices. 7970 didn't underperform; people just expected/wanted more. Realistically, we're getting to the point where doubling performance with a process shrink isn't going to happen, and even 50% improvements are rare. 7970 and 680 are a reflection of that fact. Reply
  • coldpower27 - Thursday, March 22, 2012 - link

    Well, it's possible, but for financial reasons they won't do so.

    If they had created a 28nm product with similar thermals as the GTX 580 as well as similar die size you would indeed see a massive increase in performance..

    However this generation nVidia wanted to improve on all aspects to some degree so as such not as much can go into performance.

    We have a massive improvement in die area, a mild improvement in performance, a decent improvement in energy consumption, and a considerable improvement in energy efficiency. A very well balanced product.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    The GTX 580 is $470, so who believes Nvidia was dropping a killer card in at $299 like Charlie D, the red-fan lie disseminator, said in his rumor-starting post?
    His lie has worked magic on all minds.
    Reply
  • silverblue - Friday, March 23, 2012 - link

    The 680 shouldn't be $300 any more than the 580 should be $470. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Spinning so hard you're agreeing while drilling yourself into a dark hole. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Agreed. Back when the 9700 Pro came out we saw the first signs of this. The cards began needing external power connectors, and the HSFs started growing to get those 4x increases.

    It was only a matter of time until they hit a wall with that method, and here we are.
    Reply
  • johnpombrio - Thursday, March 22, 2012 - link

    Rumor is that BIG Kepler will be named GTX685 and be out in August. Reply
  • Philbar71 - Thursday, March 22, 2012 - link

    "it takes a 16% lead over the GTX 7970 here"

    Whats a GTX 7970????
    Reply
  • prophet001 - Thursday, March 22, 2012 - link

    haha
    i saw that too... must have been a late night last night. we can let it slide :)
    Reply
  • N4g4rok - Thursday, March 22, 2012 - link

    It's pretty impressive. I'd like to see what it will cost from one of the retail sites. I'm not necessarily regretting the 7950 I got, but that nice little FPS bump you get from the 680 is nothing to turn your nose up at. Reply
  • Jorgisven - Thursday, March 22, 2012 - link

    "Overall GTX 580 is targeted at a power envelope somewhere between GTX 560 Ti and GTX 580, though it’s closer to the former than the latter." Is this a typo (580 instead of the intended 680)? Or am I just not understanding this correctly? Reply
  • mm2587 - Thursday, March 22, 2012 - link

    Well, I guess we can expect AMD to slash prices $50 across the top of their line to fall back into competition. Competition is always good for us consumers.

    While there's no arguing the GTX 680 looks like a great card, I am a bit disappointed this generation didn't push the boundaries further on both the red and the green side. Hopefully GK100/GK110 is still brewing and we will still see a massive performance increase at the top end of the market this generation.

    Unfortunately, I predict the GK100 is either scrapped or will be launched as a GTX 780 a year from now.
    Reply
  • MarkusN - Thursday, March 22, 2012 - link

    As far as I know, GK100 got scrapped due to issues but the GK110 is still cooking. ;) Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Competition isn't "the same price with lesser features and lesser performance". I suppose with hardcore fanboys it is, but we're talking about reality, and reality dictates the AMD card needs to be at least $50 less than the 680..
    Reply
  • Lepton87 - Thursday, March 22, 2012 - link

    http://translate.google.ca/translate?hl=en&sl=...

    So they are basically even at 2560 4xMSAA, yet Anand doesn't hesitate to call the GTX 680 the indisputable king of the hill. It's strange, because the 7950 is at least the same amount faster than the 580, yet he only implicitly said that it is faster than the 580.

    http://www.hardwarecanucks.com/forum...review-27.h...

    http://www.guru3d.com/article/geforce-gtx-680-revi...

    http://translate.google.ca/translate...itt_einleit...

    Judging from those results an OC 7970 is at least as fast, if not faster, than an OC 680. I wonder why they didn't directly compare OC numbers - probably this wasn't in the Nvidia reviewer's guide. Also, it's not any better in tandem:

    http://www.sweclockers.com/recension...li/18#pageh...
    Reply
  • Sabresiberian - Friday, March 23, 2012 - link

    Someone needs to learn how to read charts.

    What, did you post a bunch of links and think no one was going to check them out?

    I'll quote one of your sources, HardwareCanucks:

    "After years of releasing inefficient, large and expensive GPUs, NVIDIA's GK104 core - and by association the GTX 680 - is not only smaller and less power hungry than Tahiti but it also outperforms the best AMD can offer by a substantial amount. "

    Personally, I don't much care who comes out on top today, what I want is the battle for leadership to continue, for AMD and Nvidia to truly compete with each other, and not fall into some game of appearances.

    "Big Kepler" should really establish how much better it is than Tahiti, though I wouldn't be surprised if some AMD fanboys will still turn the charts upside down and backwards to try to make Tahiti the leader - just as they are doing now. Of course, the 7990 dual GPU board will be out by then, and they will claim AMD has the best architecture based on it performing better than Big Kepler (assuming it does).

    I don't know what AMD has planned, but I hope they come out with something right after Kepler (September? Too long, Nvidia, too long!) that is new, that will re-establish them. Of course, then there's Maxwell for next year . . .

    ;)
    Reply
  • jigglywiggly - Thursday, March 22, 2012 - link

    It's slower in Crysis. It's slower. Reply
  • Sabresiberian - Friday, March 23, 2012 - link

    Yeah? What if it's faster in Crysis 2?

    http://www.tomshardware.com/reviews/geforce-gtx-68...

    (While I get why using a venerable bench that Crysis provides gives us a performance base we're more familiar with, the marks for Crysis 2 show why an older model may not be such a good idea. Clearly, the GTX 680 beats out the Radeon 7970 in most DX9 benchmarks, and ALL DX11 Crysis 2 benches.)

    So, your troll doesn't just fail, it epic fails.
    Reply
  • SlyNine - Saturday, March 24, 2012 - link

    My problem is the tests are run without AA. I'd rather seen some results with AA as I suspect that would cause the 680GTX to trade blows with the 7970.

    On the other hand 1920x1200 with 2x AA is AA enough for me.
    Reply
  • HighTech4US - Thursday, March 22, 2012 - link

    In the review all the slides are missing.

    For example: instead of a slide what is seen is this: [efficiency slide] or this [scheduler]
    Reply
  • HighTech4US - Thursday, March 22, 2012 - link

    OK, looks like it just got fixed Reply
  • ET - Thursday, March 22, 2012 - link

    Impressive combination of performance and power draw. AMD will have to adjust pricing.

    This looks promising for the lower end cards (which are of more interest to me). AMD's 77x0 cards have been somewhat disappointing, and I'll be looking forward to see what NVIDIA can offer in that price bracket and also the 78x0 competition.
    Reply
  • rahvin - Friday, March 23, 2012 - link

    With 28nm supply limited (in part because of the TSMC shutdown of the line) we won't see price reductions; the parts are going to be too scarce for that to happen, unfortunately - unless AMD stockpiled tons of chips before the TSMC shutdown. What we might see is AMD releasing drivers or new cards that stop underclocking their chips to keep the TDP so low. From what I've read in the reviews, AMD has underclocked their cards significantly and could issue drivers tomorrow that boost performance 30%, at the sacrifice of increased power consumption.

    The 680 appears to be a very nice card, but they tossed compute performance out the window to accomplish it; the 580 smokes the 680 in most of the compute benchmarks. I find that disappointing personally and won't be upgrading, as from my perspective it's not much of an upgrade over a 580. Shoot, show me a game that strains the 580 - with every game produced being a console port designed for DX9, I'm not sure why anyone bothers upgrading.
    Reply
  • Janooo - Thursday, March 22, 2012 - link

    Ryan, why did you not include an OC 79XX as you did with the OC GTX 460 when the 68XX were launched? Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    Because you guys have made it abundantly clear that you don't want us doing that.

    http://www.anandtech.com/show/3988/the-use-of-evga...
    Reply
  • Janooo - Thursday, March 22, 2012 - link

    "We were honestly afraid that if we didn't include at least a representative of the factory overclocked GTX 460s that we would get accused of being too favorable to AMD. As always, this is your site - you ultimately end up deciding how we do things around here. So I'm asking all of you to chime in with your thoughts - how would you like to handle these types of situations in the future?"

    Anand is asking what to do. The article from the link is not proof of that. What are you talking about?
    Reply
  • chizow - Thursday, March 22, 2012 - link

    I think he's referring to the 620 comments worth of nerdrage more than the article. Reply
  • prophet001 - Thursday, March 22, 2012 - link

    bad nerdrage is bad :( Reply
  • Janooo - Thursday, March 22, 2012 - link

    I see.
    Still, the 680 overclocks/boosts on the fly and the 7970 has a set clock.
    It's hard to compare them.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    If you keep the 680 cool it goes faster - so a good way would be to crank the 680's fan to 100% and watch it further trounce the 7970, right ? Reply
  • Janooo - Thursday, March 22, 2012 - link

    That's the thing. I am not sure the 680 can clock higher than the 7970. If we do the same for both cards, the 7970 might end up the faster card. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Wait, the boost speed is 1110 vs 1005, right? So 10% faster in shader performance, which will equal about 5% in benchmarking performance in the best case.

    Nothing to see here; move along.
    Reply
  • Janooo - Thursday, March 22, 2012 - link

    Well, 7970@1.1GHz beats plain 680. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Who cares? I want to know what the card comes shipped as. That's what matters; anything extra you get out of that is exactly that: extra. What comes out of the box is what they are promising. Reply
  • BoFox - Friday, March 23, 2012 - link

    Wait, you mean that the HD 7970 needs to be overclocked by more than 20% in order to beat a plain 680?

    How about overclocking that 680 by 15%, like most review sites show is possible?

    Then the 7970 would need to be overclocked by an impossible 35% in order to beat a 680 overclocked by 15%.

    That was a nice try, Janooo!
    Reply
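    BoFox's compounding argument above can be sanity-checked in a couple of lines (the 20% and 15% figures are the poster's claims, not measured data):

    ```python
    # Sanity check of the compounding overclock argument above.
    # The 20% and 15% inputs are the poster's claims, not measured data.
    tie_at_stock = 0.20  # claimed OC needed for an HD 7970 to match a stock GTX 680
    rival_oc = 0.15      # claimed typical GTX 680 overclock headroom

    # To tie the overclocked 680, the 7970's required overclock compounds:
    needed = (1 + tie_at_stock) * (1 + rival_oc) - 1
    print(f"~{needed * 100:.0f}% overclock needed")  # → ~38%
    ```

    That lands at roughly 38%, the same ballpark as the "impossible 35%" cited above; either way the required overclock compounds, it doesn't just add.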
  • Janooo - Friday, March 23, 2012 - link

    It seems you missed the point.
    Whatever speed the 680 has, the 7970 can match it. These cards are equal in this regard.
    When they have the same clock speed, it looks like the 7970 is faster.
    Look for AMD to release a faster card soon.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    We will have to subtract some MHz from the 7970 for having a larger core with more die space, to make it fair - so transistor for transistor, Kepler wins big. Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    Plus we're going to have to subtract more from The Heatie because it cheats on RAM size too.
    Thanks, Janooo, you have great ideas.
    Reply
  • BoFox - Monday, March 26, 2012 - link

    Ok, if I go by your analogy, overclocking the GTX 580 to the same speed as the HD 6970 (880 MHz) makes both cards "equal in this regard."

    When they have the same clock speed, it looks like the GTX 580 is faster.

    Look for Nvidia to release a faster card soon utilizing that 8-pin PCI-E connector on the PCB (which it did not need in order to beat the HD 7970, overclocked or not).
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    The 680 has made 1,900MHz and makes well over 1,280 out of box reference... Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Why? That's how it is set up stock. That is how EVERY SINGLE CARD will come. Reply
  • Jamahl - Thursday, March 22, 2012 - link

    And you were all too willing to do so without evening-up the initial crime. Don't insult our intelligence Ryan. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    You need to open your mind a little bit. It's easy to see what would happen if Anand did something that actually limited out-of-the-box performance.

    Why are you even suggesting they do such a thing? This is how the card ships, and that's how you will be getting it.

    Maybe they should lower the memory clock on AMD cards to make it fair. Or wait, there are different numbers of shaders. Maybe Anand should somehow limit that.

    It just doesn't make any sense.
    Reply
  • MattM_Super - Friday, March 23, 2012 - link

    Yeah, you can't please all the people even some of the time when it comes to GPU reviews. This seems like a thorough enough review of the card as it comes out of the box. Overclocking is also important, but considering the hassle, the increase in temps and noise, and the possible voiding of the warranty, it seems unreasonable to demand that OC scores be treated as more important than stock scores. Reply
  • Scott314159 - Thursday, March 22, 2012 - link

    Any chance of running FAH on the 680... it will only take a few minutes and would give us folders a view into its relative performance compared to the outgoing 580 (and the Radeons).

    I'm looking to buy a new card in the short term and FAH performance is a factor.

    Thanks in advance!
    Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    Tried it. It wouldn't run. Reply
  • cudanator - Thursday, March 22, 2012 - link

    C'mon guys, why isn't there a single CUDA test? And don't say "because AMD doesn't support it" :P For me the most interesting thing would be the CUDA speed compared to other nVidia models. Reply
  • Wreckage - Thursday, March 22, 2012 - link

    Not to mention PhysX. Sadly there are a lot of features AMD does not support and so they don't get benchmarked often enough. h.264 encoding is another one. Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    On other sites they turn on full PhysX in Batman on the 680 and keep it off on the 7970, and the 680 still wins.
    LOL
    If you watched the release video, they show PhysX now has dynamic, on-the-fly, unique in-game destruction that is not repeatable - things break apart with unique shatters and cracks. I say "it's about time!" to that.
    My 7970 needs to go fast; in fact it's almost gone as we speak. No way am I taking the short bus.
    Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Umm, GCN (the HD 7xxx series) has a fixed-function H.264 encoder. AFAIK the GTX 680 doesn't even have a fixed-function H.264 encoder. So I'm pretty sure it would mop the floor with CUDA H.264 encoding. Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    We have the data, although it's not exactly a great test (a lot of CUDA applications have no idea what to do with Kepler right now). It will be up later today. Reply
  • chizow - Thursday, March 22, 2012 - link

    Nvidia let AMD off the hook by productizing a mid-range GK104 ASIC as their flagship SKU and pricing it at $500.

    It's a great part, no doubt, and beats AMD in every metric, but as the article stated, it's still going to be the smallest increase in price:performance that we've seen since the 9800GTX.

    Overall 28nm has been a huge disappointment so far in terms of performance increase over previous generations at the same price points, I guess we will have to wait for 28nm BigK to get that true high-end increase you'd expect from a smaller process node and new GPU arch.
    Reply
  • B-Unit1701 - Thursday, March 22, 2012 - link

    'Off the hook'? LMAO, they released what they had. They are already months late; the only other option would have been to just not release a card this generation. Would THAT have made you happier? Reply
  • chizow - Thursday, March 22, 2012 - link

    No, what would have made me happier from both Nvidia and AMD would be to follow their own historical price:performance metrics.

    Instead, we have AMD first trying to pass an overall unimpressive Tahiti part as a high-end flagship at $550+ followed by Nvidia following suit by pricing their mid-range ASIC as a $500 part.

    28nm has been a big disappointment so far, as we have the smallest increase in price:performance in any generation or process shrink since the 9800GTX.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    With AMD's GF foundry failures, TSMC is packed to the gills. We're not going to get the prices you want for the performance for another 6 months or so, when production is freed up with TSMC's ongoing $2B expansion.
    You ought to include real inflation as well, which is, as we all know, far higher than the official government figures that are suppressed so their automatic payout increases are lessened.
    Be disappointed - that's a valid point - but there are extenuating factors.
    Reply
  • xrror - Thursday, March 22, 2012 - link

    Exactly. I completely understand why Nvidia is charging $500 for their mid-range part, but it still sucks.

    AMD also... I get why the 6000 series was gimped (it was originally supposed to be 32nm, and that fell through), but the 7000 series... maybe that can be explained by moving to a new arch with GCN.

    Regardless... disappointing. Well, actually it's disappointing that you must pay $500+ to get a card that /might/ give you a fresh gaming experience over the $350 card you bought last generation.

    Unless AMD can pull an 8000-gen card out of their arse with drivers that work (I'm not optimistic), then you can bet that if/when "full Kepler" comes out it will be $800+.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    Charlie D with his $299 leak, the only source, has made everyone think the top single-GPU card in the world was going to be released $150 cheaper than the current top single-GPU card in the world.
    He must still be laughing hard at SemiAccurate.
    Reply
  • chizow - Friday, March 23, 2012 - link

    It wasn't Charlie's leak; it was the combined evidence of the ~300mm^2 size, transistor count, mid-range specs, ASIC designation, and leaked GTX 670 Ti pics, all leading people to the conclusion that this part was originally meant to be priced in that $250-$350 range.

    Obviously GK104 performed better than expected, coupled with Tahiti being weaker than expected, resulting in what we see today: an exceptionally performing mid-range ASIC being SKU'd as an overpriced flagship part at premium pricing.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Sorry, I don't buy any of it. It's a "new architecture"; if we take Charlie's leak, everything fits but the price, and every price has been $499 going on 4X in a row at least.
    Reply
  • chizow - Friday, March 23, 2012 - link

    I agree, but honestly I don't even think AMD can compete at this point. Nvidia has beaten AMD at its own game soundly (small die, power-efficient strategy), and done it with their 2nd-best ASIC.

    Now they're free to make the slightly faster, power-hungry GPGPU monster GPU with as much TDP as they like, without having to hear about it being worse than AMD's tree-hugging specs.
    Reply
  • Sabresiberian - Friday, March 23, 2012 - link

    Nvidia releasing their new architecture a few months after AMD released theirs does not make them late. Nvidia's schedule hasn't been the same as AMD's for several years now.

    And, what's AMD's answer to Big Kepler going to be? They lost today to Nvidia's mid-line chip, they will lose big time when Big Kepler comes out. By the time they catch up, Maxwell will be breathing down their necks.

    ;)
    Reply
  • Exodite - Thursday, March 22, 2012 - link

    I'm a bit confused regarding the multi-display output options, which is a shame as the ability to drive more than two displays at once is something that'll once again make Nvidia interesting for me personally.

    I regularly run two separate DVI displays for my desktop/work and a HDMI-attached TV for mainly media and some couch surfing.

    Will this setup work without issues with the GTX 680?

    What kind of power level will it put the card in?

    On my Radeon 6950 it bumps idle clocks significantly as soon as I enable multiple displays and I don't consider it an issue, I'm just curious about what the implications are for the GTX 680.

    Thanks for an awesome review, as usual!
    Reply
  • blanarahul - Tuesday, March 27, 2012 - link

    Check techpowerup's review! Reply
  • SlitheryDee - Thursday, March 22, 2012 - link

    This is great news. AMD's going to have to fight Nvidia in the pricing arena. There are going to be some great cards going for really cheap by this holiday season. Yay competition! Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Woo hooooooooooo ! :-) Reply
  • kallogan - Thursday, March 22, 2012 - link

    Good gpu but it's not a killer beast like it was announced... Reply
  • silverblue - Thursday, March 22, 2012 - link

    I never expected it to be "as fast as three 580s in SLI" but it's still a very impressive piece of kit.

    I'm wondering how much performance is yet to come for both Kepler and Tahiti with driver updates, especially the latter.
    Reply
  • ccjuju - Thursday, March 22, 2012 - link

    I can't wait for this card to come out so I can buy a discounted 580 instead. Reply
  • TerdFerguson - Thursday, March 22, 2012 - link

    "Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it’s the first AAA DX10+ game. It’s been 5 years since the launch of the first DX10 GPUs, and 3 whole process node shrinks later we’re finally to the point where games are using DX10’s functionality as a baseline rather than an addition."

    What, Just Cause 2 didn't count?
    Reply
  • bhima - Thursday, March 22, 2012 - link

    Bah, I wanted to see Anand review TXAA and NVIDIA's adaptive V-sync features. Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    As you can probably tell, we're a bit behind schedule. We will be discussing both TXAA and adaptive v-sync, though for TXAA we won't technically be reviewing it since it's not available yet. Reply
  • chizow - Thursday, March 22, 2012 - link

    Did not see OC results listed anywhere, will they be added to this article or an addendum/supplement later? Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    It will be added to this article. We may also do a separate pipeline article, but everything will be added here for future reference. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Adaptive v-sync - is that like triple buffering? Never heard of it, but it sounds interesting. Reply
  • iwod - Thursday, March 22, 2012 - link

    In those high-res tests it seems it is seriously limited by bandwidth. If you could get 7GHz GDDR5 or Memory Cube, I guess it would have performed MUCH better.

    I think we finally got back to basics - what the GPU is all about: graphics. We have been spending too much time on GPGPU, which only benefits a VERY small percentage of the consumer market.
    Reply
  • PeteRoy - Thursday, March 22, 2012 - link

    Unfortunately there is no game that can benefit from the power this card has to give.

    We are in 2012 and PC gaming graphics are stuck in the year 2007, with Crysis as the last game to push the limits of video cards.

    Since 2006, when the Playstation 3 and Xbox 360 took over the gaming industry, all games have been made for these consoles, and therefore all games have DirectX 9-era graphics technology.

    Battlefield 3, at the end of 2011, still doesn't look as good as Crysis from 2007.
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    YOU have no use for this card because you are clearly playing Tetris at 1024x768. Other people have much higher resolutions and play much more demanding games.

    Some of them even - GASP! - run more than one monitor. Imagine that.

    ;)
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    I agree. The 7970 isn't doing it for me @ 19x12 w SB@4800x4. MOAR. Reply
  • SlyNine - Thursday, March 22, 2012 - link

    Crysis 2 with full DX11 plus the texture pack totally tears my 5870 a new one. Unless these cards are now pumping out 3x the FPS, the performance is needed. Reply
  • Ahnilated - Thursday, March 22, 2012 - link

    I waited all this time hoping the GTX 680s would be an awesome upgrade from my GTX 480s that I run in SLI. Well, I am very disappointed now. I guess I'll continue waiting. Reply
  • Pessimism - Thursday, March 22, 2012 - link

    Do not forget that NVIDIA knowingly and willingly peddled defective chips to multiple large vendors. The real question is whether they have mastered the art of producing chips that will remain bonded to the product they are powering, and whether they will turn to slag once reaching normal operating temperatures. Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    Do you work for AMD's marketing department, or are you just a fanboy with tunnel vision? Reply
  • silverblue - Thursday, March 22, 2012 - link

    Could be beenthere under a different name... ;) Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    YouTube has settled that lie - all the "bumpgate" models have defectively designed heatsinks. End users are inserting a penny (an old one, for the copper content) above the GPU to bridge the large gap, while removing the laughable quarter-inch-thick sponge pad.
    It was all another lie that misplaced blame. Much like the ATI chip that failed in the Xbox 360 - never blamed on ATI, strangely... (same thing, bad HS design).
    Reply
  • Arbie - Thursday, March 22, 2012 - link


IMHO the only game worth basing a purchase decision on is Crysis / Warhead. There, even the 7950 beats the GTX 680, especially in the crucial area of minimum frame rate. The AMD cards also draw significantly less power long-term (which is most important) and at load. They are noisier under load, but not enough to matter while I'm playing.

    So for me it's still AMD.
    Reply
  • kallogan - Thursday, March 22, 2012 - link

Don't know if you can say that. Crysis is old now - no DirectX 11. But it's true the GTX 680 does not particularly shine in heavy games like Metro 2033 or Crysis Warhead, compared to other games that may be more Nvidia-optimised, like BF3. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

Except in the most punishing benchmark, Shogun 2: Total War, the GTX 680 by Nvidia spanks the 7970 and wins at all 3 resolutions!
    *
    *
Can we get a big fanboy applause for the 7970 not doing well at all in very punishing high-quality games compared to the GTX 680?
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    The key phrase you use here is "where it matters to me". I wouldn't argue with that at all - your decision is clearly the right one for your gaming tastes.

That being said, you then change your wording a bit, and it seems to me to imply (despite softening it with "IMHO") that everyone should choose by your standards; that is also clearly wrong. The games I play are World of Warcraft and Skyrim. WoW test results can best be compared to BF3, of the benches used in this article. I've never played Crysis past a demo - so choosing based on that benchmark would be shooting myself in the proverbial foot.

    Clearly, the GTX 680 is the better choice for me.

    I've always said, choose your hardware by application, not by overall results (unless, of course, overall results matches your application cross-section :) ), and the benches in this article are more data to back up that recommendation.

    ;)
    Reply
  • 3DoubleD - Thursday, March 22, 2012 - link

    Please don't buy a GTX 680 for WoW...

    It's even overkill for Skyrim, since you don't really need much more than 30 fps. You'd be fine using more economical variants.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Wrong, but enjoy your XFX amd D double D.
The cards, all of them, are not good enough yet.
Always turning down settings and begging for vsync.
They all fail our current games and single-monitor resolutions.
    Reply
  • Iketh - Thursday, March 22, 2012 - link

For PvP you most certainly do need more than 30 FPS - try 60 at the least, and 75 as ideal with a 120Hz monitor. The more FPS I can get, the better I perform. Your statement is true for raiding only Reply
  • Arbie - Friday, March 23, 2012 - link

    "I've always said, choose your hardware by application, not by overall results"

Actually, that is what I said. But I wasn't as pompous about it, which may have confused you.

    ;)
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

Well, it's a good thing fair and impartial Ryan put the two games the 680 doesn't trounce the 7970 in up first in the bench lineup, so amd would look very good to the chart click-through crowd.
Yeah, I like an alphabet that goes C for Crysis then M for Metro, so in fact A for AMD comes in first!
    Reply
  • Sivar - Thursday, March 22, 2012 - link

Many Anandtech articles not written by Anand have a certain "written by an intelligent, geeky, slightly insecure teenager" feel to them. They're still much better than other tech websites (and I've been around them all for some time), but Anand is a cut above.

    This article, and a few others you've written, show that you are really getting the hang of being a truly professional writer.
    - Great technical detail without paraphrasing marketing material.
    - Not even the slightest hint of "fanboyism" for one company over another.
    - Doesn't drag on and on or repeat the same thing several times in slightly different ways.
    - Anand, who usually takes the cool articles for himself, had the trust in you to let you do this one solo.

    I would request, however, that you hyperlink some of the acronyms used. Even after being a reader since the Geocities days, it's sometimes difficult to remember every term and three letter combination on an article with so much depth and breadth.
    Also, for the sake of mobile users and image quality, there really needs to be some internal discussion on when to use which image format. PNG-8 for large areas of flat or gradient color, charts, screen captures, and slides -- but only when the source is not originally a JPG (because JPG subtly corrupts the image so as to ruin PNG's chance of compression) and JPG for pretty much all photographs. I wrote a program to analyze images and suggest a format -- Look for "ImageGuide" on Google Code.
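The rule of thumb above can be sketched in a few lines of Python (a toy illustration of the idea, not the actual ImageGuide code): PNG's lossless DEFLATE stage thrives on flat, repetitive pixel data, so a high zlib compression ratio on the raw pixels is a decent hint that PNG will win, while noisy photo-like data points to JPG.

```python
import random
import zlib

def looks_chartlike(pixel_bytes: bytes, threshold: float = 0.5) -> bool:
    """Crude proxy for the PNG-vs-JPG decision: if the raw pixel data
    deflates well (chart/screenshot-like), PNG is the better bet; if it
    barely compresses (photo-like noise), JPG will almost always win."""
    compressed = zlib.compress(pixel_bytes, level=9)
    ratio = len(compressed) / len(pixel_bytes)
    return ratio < threshold

# Solid-color "chart" data: deflates to almost nothing.
flat = bytes([40, 120, 200]) * 100_000

# Photo-like noise: incompressible, ratio stays near 1.0.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(300_000))

print(looks_chartlike(flat))   # True  -> suggest PNG
print(looks_chartlike(noisy))  # False -> suggest JPG
```

Real tools would of course also account for color counts, alpha, and JPG re-encoding artifacts, but the compression-ratio test captures the core intuition.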

In any case, it says something that I can think of only the most minor of suggestions, as opposed to when I read a certain other website, named after its founder, with a much shorter name.
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    I agree, another thorough review by one of the better people doing it on the internet. Thanks Ryan!

    As far as the dig on Tomshardware, I don't quite agree there. I notice Chris Angelini wrote the GTX 680 article for that website, and I'm very much looking forward to reading another thorough review.

    ;)
    Reply
  • Sivar - Thursday, March 22, 2012 - link

    Tom's may have improved greatly since I last gave it another chance, but since not long after they were bought out, I've found the reporting to be flagrantly sensationalist and light on fact. The entity that bought them out, and the journalists he hired, are well known for just that. Many times I read the author's conclusion and wondered if he was looking at the same bar charts that I was.

    To be blunt, at times when people quoted their site, I felt as if I'd shifted into an alternate dimension where otherwise knowledgeable people were comically oblivious to the most egregiously flawed journalism. It was as if a group of Nobel prize winners were unthinkingly quoting Bill O'Reilly or Michael Moore on a political matter as if it was assumed they were a paragon of truth and even-headedness.
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    Very well said. (I especially like the comment using both a staunch conservative and flaming liberal as examples of poor source material.)

    I do tend to look at specific writers, and probably give Toms too much credit based on that more narrow view. I freely admit to having a somewhat fanboy feel for the site, too, since it was one of the first and set a mark, at one time, unreached by any other site I knew about.

    I have been a bit confused by some statements made by some writers on that site, conclusions that didn't seem to be supported by the data they published. Perhaps it's time to step up and comment when that happens, instead of just interpreting my confusion as a lack of careful reading on my part (which happens to the best of us).

    ;)
    Reply
  • Nfarce - Sunday, March 25, 2012 - link

    "It was as if a group of Nobel prize winners were unthinkingly quoting Bill O'Reilly or Michael Moore on a political matter"

Well, Obama, Al Gore, and Arafat were each given a Nobel Prize, so I'd hardly consider that entity a good reference point for an analogy about validity. In any event, I welcome opinions from all sides. The mainstream "news" media long ago abandoned objective reporting. One is most informed by reading different takes on the same "facts" and formulating one's own opinion. Of course, you also have to research outside the mainstream for information that the mainstream media will hide from time to time: like how bad off the US economy really is.
    Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

Thanks for the kind words, though I'm not sure whether "slightly insecure teenager" is a compliment on my youthful vigor or a knock against my immaturity. ;-)

    Anyhow, we usually use PNGs where it makes sense. All of my photo processing is done with Photoshop, so I know ahead of time whether JPG or PNG will spit out a smaller image, and any blurring that may result. Generally speaking we should be using the right format in the right place, but if you have any specific examples where it's not, drop me a line (it will be hard to keep track of this thread) and I'll take a look.
    Reply
  • IlllI - Thursday, March 22, 2012 - link

OK, there seems to be some confusion here. Many times in the review you directly compare it to GF114 (which I think was never present in the 580 series), yet at the same time you say the 680 is a direct replacement for the 580.
I don't think it is. What it DOES seem like, however, is that this 680 was indeed supposed to be the mainstream part, but since the ATI competition was so weak, Nvidia just jacked up the card number (and price).
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

So Nvidia should have dropped the 680, their GTX 580 ($450+) killer, in at $299...
Charlie D.'s $299 rumor owns internet groupthink brains.
    Reply
  • Hrel - Thursday, March 22, 2012 - link

It is a little weird that Nvidia gave up ground on the compute side, but their architecture is still vastly superior at tessellation, which is the main point of DirectX 11 and the biggest breakthrough in graphics in the past 12 years, maybe longer. AMD has improved that part of their GPUs quite a bit since the HD 4000/HD 5000 series, but they clearly still have a long way to go.

    Nvidia wins on every single front; this is why all my graphics cards are Nvidia now. I'm just glad AMD is so competitive, hopefully prices will start falling again, before December.
    Reply
  • SydneyBlue120d - Thursday, March 22, 2012 - link

    Hope to see some testing about video decoding and encoding, especially 4K compared with AMD cards :) Thanks :) Reply
  • shaggart5446 - Thursday, March 22, 2012 - link

Seems like Tom is getting paid, just like fuddo. Where on earth could you say the GTX 680 is better than the 7970? Nvidia picked which games to benchmark. Why did AMD win in some test results? Shouldn't Nvidia win all the benchmarks if you're going to say the GTX 680 is faster? Reply
  • silverblue - Thursday, March 22, 2012 - link

    No. Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

Obviously amd has been paying off the Crysis and Metro game companies :) Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

But not the Crysis 2 game company, which we can't benchmark anymore because amd can't play that modern title :) Reply
  • edwpang - Thursday, March 22, 2012 - link

    It'll be interesting to know the real engine clock these cards are running when testing, since GTX 680 has the "Boost Clock" feature. Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    Unfortunately Precision X doesn't have any logging, so I don't have any precise numbers. However I did take notes from what I saw in Precision X during our testing, which I'm happy to post here (just keep in mind that they're my internal notes).

    ----

    Crysis: Generally sustained at 1097; fluctuates some at 2560 between 1071 and 1097 due to power load.

    Metro: Frequently bounces between a range of 1045 to 1097.

    Dirt 3: Bounces between 1084 and 1097.

    Shogun 2: Fluctuates between 1071 and 1019; very high load.

Batman: Some fluctuating between 1097 and 1071.

    CivV: Mostly 1097; a few drops to 1084.

    Portal 2: Between 1058 and 1006.

    BF3: 1097 sustained

    SC2: 1097 sustained

    Skyrim: 1097 sustained

    SLG: 1097 sustained
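For anyone who wants harder numbers at home: more recent NVIDIA drivers ship an nvidia-smi that can dump clocks to CSV (e.g. `nvidia-smi --query-gpu=timestamp,clocks.gr --format=csv,noheader -l 1 > clocks.csv`), and a few lines of Python can then summarize such a log. A rough sketch of the summarizing side (the log format here is my assumption from nvidia-smi's CSV output; this is not something we used for this article):

```python
import csv
import io

def parse_clock_log(csv_text: str) -> dict:
    """Summarize a two-column CSV clock log (timestamp, graphics clock).

    The second column is expected to look like " 1097 MHz". Returns the
    minimum, maximum, and most frequently seen (sustained) clock in MHz.
    """
    clocks = []
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) < 2:
            continue  # skip blank/malformed lines
        clocks.append(int(row[1].strip().split()[0]))
    return {
        "min": min(clocks),
        "max": max(clocks),
        "mode": max(set(clocks), key=clocks.count),  # most common reading
    }

sample = """2012/03/22 12:00:01, 1097 MHz
2012/03/22 12:00:02, 1084 MHz
2012/03/22 12:00:03, 1097 MHz
"""
print(parse_clock_log(sample))  # {'min': 1084, 'max': 1097, 'mode': 1097}
```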
    Reply
  • Everyone - Thursday, March 22, 2012 - link

    "the GTX 680 is faster, cooler, and quieter than the Radeon HD 7970"

Sounds just like the launch of the 5870. But here's the problem. I bought a 5870 over two years ago, in a decent sale which had it priced at $325. Only now, two years later, are we getting cards that beat its performance by a wide margin and make me feel like it's actually time to upgrade. But look at the pricing! The Nvidia 680 is $500. There still isn't a card out there at the price level I paid two years ago (a price level I feel very comfortable with, in contrast to the over-$500 card market, which I view as 'high end, out of my wallet range') that gives me a decent jump in performance over my 5870.

    I read every new videocard review anandtech posts, but I can't shake the feeling that something here is a little weird. In such a rapidly evolving market, why is it that two years later there hasn't been a realistic improvement in the level of graphics cards that I (and many others I'd imagine) am interested in?
    Reply
  • prophet001 - Thursday, March 22, 2012 - link

    talk to obama Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    About what?

    Are you trying to blame the President for high video card prices?

    Please.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    It's called inflation, try to get the idea, not some fanboy hurt. Reply
  • silverblue - Thursday, March 22, 2012 - link

    What on earth is your problem? Where is there any sign of fanboyism in the three posts above you? He's got a valid point - cards are getting faster but they're getting more expensive at the same time.

    Successful troll is successful, it seems.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

He said he got it in a sale. A few months ago it was $475 used on eBay (bitcoin).
Many cards have come out at $500 and above.
The 580 was well over $500 for a long time, as was the 5870.
I guess you're mad because he's wrong.
    Reply
  • silverblue - Friday, March 23, 2012 - link

    Reading comprehension isn't your friend. He's quite clearly talking about a 5870 that he bought for $325 when the prices were reduced two years back. He's also saying that there isn't a card in the region of $325 (give or take a few dollars) that is worth upgrading to right now, so in essence, he's asking what point there is to buying a new card.

    You're reading far too much into this. He still owns his 5870... which, by the way, never cost $500 at launch. The Eyefinity 6 Edition, perhaps, but not the vanilla 5870 - try $429 as an upper limit (and we're talking US Dollars; not my currency of choice but it's the norm for this site).
    Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

Yes, of course. So we have the 5870 vs. the 7970 in Crysis Warhead (the page it happens to be on):
40 vs 69
24 vs 42
16 vs 26

I see 3 resolutions where not-playable or barely-playable becomes playable.

So then it comes down to more of the exaggeration I objected to at the start.
I wouldn't object to saying the 5870 can turn settings down and satisfy - but the increase is clearly game-making.
Is it the GTX 580, which was so far ahead yet denounced for core size and heat, that is causing the disconnect? I'd say so.
    Reply
  • B-Unit1701 - Thursday, March 22, 2012 - link

Hence why I'm still rocking a 4870. I picked one up almost exactly 3 years ago for $250, and haven't been convinced to upgrade until this generation. I will likely jump on a 7850 or a 7870, depending on where prices land after Nvidia's full launch. Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

The power envelope. The standard 300 watts for PCIe (maybe they will raise it) is already exceeded on single cards.
Where can you go but to 28nm, and how is that not going to also bust power specs?
Monitor resolutions have increased in 2 years.
The OS has sucked up some.
Game patches enhance.
--
What's wrong is that there is a limit that cannot be passed right now; a smaller nm node must be achieved. Smaller RAM with 4x the GB...
Two years may seem long, but is it really?
Foundries require constant multi-billion-dollar upgrades to move nodes.
Amazing is that the whole thing hasn't collapsed already.
    Reply
  • fhaddad78 - Thursday, March 22, 2012 - link

I'm confused by the dialog taking place in this conversation. Unless I am not understanding the results correctly, it seems to me the GTX 680 is a great card; there are some games where it performs really well, and there are other games where it's being outperformed by even older-generation products. To me, this card (in its current state) is more of a side-grade than an upgrade.

    Am I missing something?
    Reply
  • prophet001 - Thursday, March 22, 2012 - link

    only hysteria Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

Yes, I don't see any former generations beating it. I think he has the hysteria down pat. (The full GTX core, rumored and confirmed, is what's driving these people crazy, as well as the "it will be a $299 680" talk from before AMD flopped out their $570 beaten card.)
So to an extent this will be the ongoing complaint - far be it from them not to demand the new card undercut nearly every high-end card left on the shelves anywhere.
If that happened they could moan that the 500+ mm squared, twice-as-fast monster chip is a fail at $700 or $600 or $500, and unfair.
GTX 570s were $350 days ago; now they are $250.
The GTX 580 is $359 at the egg, not $500.
The GTX 460 is $110.
---
There's a whole lot of focus on one top card from each company and a whole lot of ignoring the rest of reality, which at least on Nvidia's behalf appears to me to be a gigantic price slash no one has reported yet.
--
I don't understand why so silent, other than massive bias, but there it is.
    Reply
  • marc1000 - Thursday, March 22, 2012 - link

Did I miss it, or was it not mentioned? What version of DX is this card running, please? I ask because AMD is at 11.1 and there is some hype about Windows 8 being 11.1 too. It's a minor upgrade from 11, but we must consider this in buying decisions too. Reply
  • marc1000 - Thursday, March 22, 2012 - link

Yep, no word on the DX version on more than one site. Is this a DX10 card???? Why doesn't Nvidia let you talk about it?

One thing AMD has always done is keep current with DX versions, and this has cost them some performance in almost every generation of cards. I remember when the HD 5000 family launched, it was slower than the HD 4000 in a "per core" fashion.

It seems like Nvidia is dropping compatibility in order to win at performance. This is not too bad, but it is a little disappointing.
    Reply
  • Klimax - Thursday, March 22, 2012 - link

According to the Czech site PCTuning (page http://pctuning.tyden.cz/hardware/graficke-karty/2... ) it is DirectX 11.1. Reply
  • marc1000 - Thursday, March 22, 2012 - link

Is it? Well, let's hope it really is. But why are they not marketing this? Reply
  • Ryan Smith - Thursday, March 22, 2012 - link

    I can confirm it is DX11.1 (the page regarding that isn't up yet). NVIDIA isn't talking about DX11.1 much which is why you don't see it mentioned a lot. Given that the only major consumer feature will be S3D support (which benefits AMD more than NV), as you can imagine they're not exactly rushing to tell everyone. Reply
  • BurnItDwn - Thursday, March 22, 2012 - link

For 1080p and below, this card is really quite impressive, but it leaves a lot to be desired in the 2560x1600 category.

I'd be curious to see triple-head performance figures... (Nvidia can do something similar to Eyefinity, right?)
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

    Mmm for my purposes, the GTX 680 outperforms the Radeon 7970 enough to be a better buy (certainly if it actually costs less).

    Look at BF3 results and Skyrim results @2560x1600 (I actually run at 2560x1440, or on my other computer 1920x1200 at 85Hz) to see what is most important to me. Certainly, other games don't show the kind of performance increase you would want, but as in most hardware reviews, making the kind of statement you did without qualification means making an inaccurate one.

    ;)
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

Per the HardOCP review, in the triple surround/Eyefinity comparison the 680 wins.
It's a bad day for AMD and reviewers; they lost that too.
Now the reviewers will have to crank up 4 monitors, ruin a game, and surf, to see if they can squeeze out an amd win..
    Reply
  • Spunjji - Sunday, March 25, 2012 - link

    Can you stop posting, please? Your particular brand of repugnant obnoxiousness is making me nauseous. Reply
  • mckirkus - Thursday, March 22, 2012 - link

We need the ability to vote bad comments down. You have one guy refreshing the page to get the first comment in, and the next four pages are replies to it complaining.

It's broken, and this comment system is ten-year-old technology. What gives?
    Reply
  • kallogan - Thursday, March 22, 2012 - link

If it's 0.5" shorter than the GTX 580, this GPU should just fit in a Sugo SG05 Reply
  • thunderising - Thursday, March 22, 2012 - link

Buddies, this card is $499.

AMD is not going to lose much market share. Remember, the $400-and-up segment accounts for the smallest % of the GPU market.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

Well, god bless, I'm such a fanboy that I'm constantly checking whose market share is increasing or decreasing, and then I check stock prices...
(Are you all bidding for CEO jobs?)
Since this is such a twist and spin, let me share...
Last I heard, once clarity was given, Nvidia's discrete desktop market share (which is what we're talking about here) was @ 68%, so amd can't lose much more.

You can get a distorted, pretty figure for amd if you glom "all chips made" together - that may even include the console wins for amd...
    Reply
  • Sabresiberian - Thursday, March 22, 2012 - link

Okay, right now this IS Nvidia's flagship, because it is the best performer the company has released, but let's not forget it is not intended to be their top offering based on Kepler; it's their middle-of-the-road chip.

    (Hopefully we really will see a price war so that "Big Kepler" isn't even more expensive, heh.)

    ;)
    Reply
  • kallogan - Thursday, March 22, 2012 - link

A 500-dollar middle-of-the-road chip, huhu.

Seriously, are there really people who intend to buy at this crazy price?
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

Oh yes, goodbye to my 7970.
There's just a bit too much complaining going on.
I know it helps a lot and you all feel like you've done something when all the lower tiers' prices keep coming down, but an HD is $100 to $200, you people pay well over $100 each for a case and a power supply, and SSDs are $200 each for a small, quality drive.
Why is there so much complaining?
The 4870 was $500 bucks out of the gate.
We have $739 duals standard now for long periods (6990).

The HD 5870 was $425 used two months ago, when bitcoin was higher in the market.

Seriously. AMD blew it; they didn't give us the 7970 at $249 so Nvidia couldn't drop in at $299 - but do you really believe any of that was ever in the works? It's very hard to swallow and say it is true.
    Reply
  • just4U - Friday, March 23, 2012 - link

    Most expensive card I ever bought was a Geforce 3. Thing cost me $700. Reply
  • von Krupp - Thursday, March 22, 2012 - link

    Smells like victory...for the consumer!

I look at these numbers and it almost makes me regret my two HD 7970s... but then I realise that the only game I play out of these benchmarks where Nvidia wins is Battlefield 3, and with 2 cards, who cares? The rendered frames per second are laughing all the way to the buffer bank no matter whose product you choose!

    I'm just glad that ATI--I mean AMD--has finally gotten its act together to the point where they can compete head-to-head with Nvidia in performance instead of having to undercut on price. It brings back memories of the days where it was the Radeon 9800 Pro vs. FX 5900 Ultra.

I can see Nvidia taking this as a win and holding back on GK110, refining it more, and then releasing it as the GTX 780 after AMD launches the HD 8970. It wouldn't be too dissimilar from the GTX 480/580, except instead of just enabling locked cores they would release the original high-end chip. Making a dual-chip GK104 for the GTX 690 to counter the inevitable HD 7990 seems more logical to me from a business standpoint, because they can hold GK110 and spend less on R&D for the next year.

    At any rate, this is awesome. If only the CPU race was this close...
    Reply
  • marc1000 - Thursday, March 22, 2012 - link

Your point seems logical: 2x 680 to counter the 7990, and Big Kepler really late because there is no need for it now. I bet a beer that this is what they will do. Reply
  • kallogan - Thursday, March 22, 2012 - link

    There will be lots of HD 7970 for sale on the second hand market in the next few days huhu Reply
  • thebeastie - Thursday, March 22, 2012 - link

    I been playing a lot of my PC games in 3d lately and have developed a real taste for it, any chance we could see some 3d performance tests? Reply
  • Dustin Sklavos - Thursday, March 22, 2012 - link

    I'm actually trying to put something for 3D Vision 2 together, so stay tuned. ;) Reply
  • Death666Angel - Thursday, March 22, 2012 - link

Considering the trash talk and hype nVidia put out after Tahiti launched, this card leaves me feeling thoroughly "meh". It wins some and loses some against the 7970, and it is always funny to see any wins get smaller and smaller the higher the resolution goes.
Don't get me wrong, the GTX 680 is the better-performing card most of the time.
But I don't regret the decision to buy a 7970; with water cooling I have it overclocked 37% on the core and 24% on the memory, which translates into 25-30% gains most of the time.
Power consumption is a bit better for the 680 (worst-case-wise), which is refreshing coming from nVidia. But the other reviews I've seen put the 7970 and the 680 nearer to each other than Anandtech does, and have the 7970 beating the 680 at idle.

Most importantly, we have two new generations that both perform well and cost about the same (at least when I look at actual prices, not MSRP). Go with whichever company you like best or whichever small feature set you like best. :-)
    Reply
  • kreacher - Thursday, March 22, 2012 - link

    Very nice review and extremely comprehensive about the new architecture / features. Great to see Nvidia back with a bang.
For the benchmarks: I know Skyrim sold lots more, but wouldn't The Witcher 2 be a better game for GPU benchmarks?
    Reply
  • Aenslead - Thursday, March 22, 2012 - link

It's a good review, and although PERFORMANCE-WISE it's not an AMD killer, you guys clearly skipped over important gaming technologies like FXAA, TXAA and Adaptive VSync, as well as 3+1 monitors.

    Incomplete review, as far as product goes.
    Reply
  • silverblue - Thursday, March 22, 2012 - link

    We'll probably get a follow-up guide on it. Reply
  • claysm - Thursday, March 22, 2012 - link

    This is impressive stuff, and I should expect AMD to drop the prices on every one of their 7xxx series cards by a large margin. The 7970, in particular, I think, should go as low as $400, with the 7950 dropping to $330-50, and the 7870 down as well. Reply
  • marc1000 - Thursday, March 22, 2012 - link

    no it wont. 3gb cards have that extra 1gb to justify their prices. Reply
  • ol1bit - Thursday, March 22, 2012 - link

Love the details and design elements you went into. For a few years it looked like Nvidia was building superchargers: just throw more power/heat/fuel at it.

The 680 is something special: more speed, less juice, cleaner design. Love it!

I can't help but think their foray into smartphones/tablets has helped them reconsider what is important: a clean, power-conservative design.

I've always been a fan of Nvidia cards, not so much for the card but for the software - no issues with games. I've only had 2 AMD cards; the last was the 5870. Great card, but lots of games just didn't work on it. I sold that card on eBay and bought dual 460s for less cost.

Anyway, I'm glad they seem to be heading in a good direction now. Go Nvidia!

    And AnandTech, awesome review as always! I trust your reviews more than any other sites. Thanks!
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

On the last page of gaming benchmarks, here's the fantasy line we're given:
". It only seems appropriate that we're ending on what's largely a tie."
It's just amazing: the 680 takes the top spot at the highest resolution and the top spot at the lowest resolution, and is within 0.6 fps at the 80-frame mark at the middle resolution (likely a test error that favored the wrong card), and we get this amd fanboy analysis about it being largely a tie.
Worse than that, this very reviewer is the one who, in the blog, for a year thought Nvidia was cheating in Civ5, until he found out Nvidia followed DX11 driver rules while amd didn't, hence Nvidia's huge lead.
Now, after amd was cued in and started following DX11 rule implementations in their drivers, we are treated to this lie, after the "boy things have really changed for sure" - instead of "AMD failed to properly implement drivers for Civ5 for months on end, and now that they have, they still lose to the superior Nvidia 680".
---
The truth would be nice; instead we get "favorite game that was banned at anandtech offices" "tied".
---
I am really sick of watching Nvidia wins belittled and called ties or less, while amd's lesser accomplishments are cheered.
It's NOT "largely a tie"; amd lost at the highest and lowest resolutions miserably, and did not win at the central resolution. THAT CAN NEVER BE A TIE, LARGELY OR OTHERWISE.
    Reply
  • N4g4rok - Thursday, March 22, 2012 - link

    I think scrutinizing one small portion of the review isn't the right way to look at it. There's no way the review is an "amd fanboy analysis."

    You might want to dial back the anger a little bit. The 680 is a great piece of hardware. Whether or not it was a win or lose depends on your interpretation of the benchmarks, so it's a little hard just to say AMD loses, nVidia wins.

    no one is here to belittle one company's victory over the other. And if that happens, just shake it off. opinions don't affect the performance of a GPU, regardless of how many times it's posted as a response to benchmarks.

    All of those results favor the 680, and the author's analysis followed suit. Just relax. there's no need to be fighting over which toy is better than the other. Take your pick and go with it.
    Reply
  • silverblue - Friday, March 23, 2012 - link

    He's mad for buying a 7970 when Kepler was around the corner. ;) If he's got that much money to burn, he can only blame himself for jumping too quickly. Reply
  • N4g4rok - Friday, March 23, 2012 - link

    Either way, i don't think you lose out on much.

    But, i'm using a 7950 with Vsync enabled on a 60hz monitor, so i'm not sure i have any room to talk about that performance lead the 680 has.
    Reply
  • rickon66 - Thursday, March 22, 2012 - link

    It looks like time to let the trusty old 5870 go to pasture, it has been a tad weak since I moved to 2560x1600 last year anyhow. Reply
  • poohbear - Thursday, March 22, 2012 - link

Wow, since I'm a huge BF3 and Shogun 2 nut, this video card series from Nvidia seals the deal for me!!! It kicks ass in BF3!!! I really hope these kinds of results trickle down to their mid-range line of cards! Reply
  • mwilli01 - Thursday, March 22, 2012 - link

Way too many fanbois here. You'd think some of you actually work for AMD and are trying to defend your jobs.

I'll go with the cheaper option, because they are so close in everything (but price, lmao).
    Reply
  • kyuu - Thursday, March 22, 2012 - link

    If AMD doesn't adjust their prices, then they're foolish, and the 680 is clearly the superior buy. However, if AMD moves the 7970 to $400-$450 (and adjust the prices of all other cards accordingly), as they should, then I'd take the 7970 over this card any day. Reply
  • Nfarce - Thursday, March 22, 2012 - link

    Likewise - way too many red sticker fangirls who can't stand the fact Nvidia is once again pwning you people...just like Intel does to AMD. Sucks to be you all. Reply
  • PubFiction - Thursday, March 22, 2012 - link

    Why are there so many standards for USB? Clearly micro USB should do them all, right? Why do they even make computers with USB-A ports then?

    Well, bigger ports are easier for people to work with; they also tend to be stronger and less prone to breaking. Ever see people fumble with their phone trying to get that micro USB plug in? Then watch them try in the dark. Why do printers all come with a beefier USB port?

    So it seems reasonable that when putting ports in, you are going to use the largest one your device can handle, to make it easy for customers to see it and work with it, and to keep the connection in place.
    Reply
  • jmpietersen - Thursday, March 22, 2012 - link

    This is very impressive, Nvidia has taken the crown again, nicely done. But remember what the claims were and what was said?
    Essentially, what Nvidia has is 3x a GTX 580's "core" count, and yet they only managed a mere 15-25% improvement. Not exactly first class if you ask me??

    Given the number of CUDA cores it has, this card should easily have beaten the GTX 590, and should have given the ASUS MARS a run for its money too. Which in turn suggests that the Kepler cores are not as good as the Fermi cores.

    I hope for Nvidia's sake it's a mere driver issue and there will be further, GREATER gains; if not, then essentially they are going backwards. AMD has at least continuously improved, and let's face it, AMD's 7 series really shines compared to the 6 series??

    Oh, and before you all jump the gun: I have an Nvidia GTX 560 Ti card, love it, but I am not so sure about the new cards??
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    FXAA is what was used on the single 680 bench that matched 3x 580s using 4x MSAA (though we never heard framerates).
    No one claimed the 680 was 300% faster than the 580, sir. No one.
    Well, AMD fans claim it should be, as you just did.
    AMD claims a very high shader count by counting a 5-wide superscalar cluster (or 4-wide now) as 5 shaders, not 1. Nvidia for years counted their shaders at 1/5th that number; hence the 5870 has "1600 shaders" and cannot beat a 320-shader 570.
    ---
    It appears Nvidia has no longer sat idly by, and has decided to start counting 5x per cluster as well.
    Hence, 1536 shaders is now that, instead of 1/5th that number, which would be 307 shaders.
    Reply
  • jmpietersen - Thursday, March 22, 2012 - link

    Just quoting the original text?

    Game Developers Conference 2012 attendees were first in the world to bear witness to the power of our next-generation ‘Kepler’ graphics card at an invite-only Unreal Engine 3 Samaritan demonstration. When previously shown, the same demo required the use of three GeForce GTX 580 graphics cards.

    http://www.nvidia.com/content/newsletters/web/gf-n...

    Yes, I know what you're saying, but according to this statement on the Nvidia site, the performance should have somewhat beaten the GTX 590. Just saying, that's what they had us believe?
    So I am a little taken aback, but you just answered my original statement: they have gone backwards?
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    I think you should believe yourself and stay wondering, and claim "they've" gone backwards. That's brilliant indeed. (that was sarcasm, guy)
    Now you've moved on to beating the 590 - whatever, it does in some tests....
    --
    Clearly you want to believe they've gone backwards and lied to you, so feel free to do so forever.
    Reply
  • silverblue - Friday, March 23, 2012 - link

    By providing no information whatsoever on the demo's framerate, it is entirely possible that somebody would indeed believe the 680 is three times faster than a 580. Of course, we know it's not (and jmpietersen certainly knows) - it's just far better and more efficient anti-aliasing that's being employed. Win-win for anybody going with the 680 and wanting exceptional image quality.

    The lack of information about the framerates produced by the 580s as compared to the sole 680 is the key to what jmpietersen is arguing about.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    Yes, but I think what's left out is context, since they were bragging about the new TXAA or FXAA - so we get a clipped statement after the fact...
    Then we're told we were lied to, when really it is the statement clippers who are lying, with a bunch of the rest of us tricked and angry.
    Reply
  • KamikaZeeFu - Thursday, March 22, 2012 - link

    Care to read the article? Nvidia changed the way they operate their CUDA cores. Previously, they ran at twice the frequency of the rest of the core (core clock vs. shader clock).

    Now the shader clock is gone; the shaders operate at the same frequency as the rest of the core. So roughly, 1 Fermi shader core is worth 2 Kepler shader cores.
    Suddenly, your 3x GTX 580 core count actually means 1.5x the GTX 580 core count.
    Reply
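The shader-clock change described in the comment above can be put into rough numbers. A minimal back-of-envelope sketch, using widely published board specs as assumptions (512 cores at a 1544 MHz shader clock for the GTX 580, 1536 cores at a 1006 MHz base clock for the GTX 680, and 2 FLOPs per core per clock for FMA):

```python
# Back-of-envelope peak shader throughput: cores x clock x FLOPs/clock.
# Illustrates why tripling the core count while halving the per-core
# clock rate yields roughly double the raw throughput, not triple.
def peak_gflops(cores, shader_mhz, flops_per_clock=2):
    """Peak single-precision GFLOPS."""
    return cores * shader_mhz * flops_per_clock / 1000.0

# GTX 580 (Fermi): shaders ran at 2x the 772 MHz core clock.
gtx580 = peak_gflops(512, 1544)
# GTX 680 (Kepler): the separate shader clock is gone.
gtx680 = peak_gflops(1536, 1006)

print(f"GTX 580: {gtx580:.0f} GFLOPS")      # ~1581
print(f"GTX 680: {gtx680:.0f} GFLOPS")      # ~3090
print(f"Ratio:   {gtx680 / gtx580:.2f}x")   # ~1.95x
```

So by raw FLOPS the 680 is roughly 2x a 580, which matches the "1 Fermi core is worth about 2 Kepler cores" framing, not the 3x that the headline core counts suggest.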
  • jmpietersen - Thursday, March 22, 2012 - link

    Thanks, you also answered my original statement: they have gone backwards?

    But again, read what "THEY" said, not me. I am just disappointed, that's all; I really was hoping it would be a little more than what it is.

    But i will upgrade to the GTX680, depending on what AMD does in terms of pricing. then i will weigh up my options.
    Reply
  • blanarahul - Tuesday, March 27, 2012 - link

    you didn't read the article, I guess. They had to double the CUDA core count since they did away with the shader clock. Reply
  • Ananke - Thursday, March 22, 2012 - link

    This card is impressive, probably worth $249 at launch. Same with the Radeon 7850/7870. I will definitely consider it when it goes under that magical price level.

    In my opinion, NVidia and AMD totally missed the market this year. The general consumer will just save up and buy iPads, and when MS comes out with Windows tablets it will get even worse.

    Impressive hardware, but it is priced out of almost all of its targeted market. These will sell in very limited numbers. I guess they don't care about the GPU business anymore, but such thinking leaves them vulnerable to ARM hybrids with GPU integration from other parties.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    The GTX 680 is already sold out.
    The HD 7970 has been sold out for months.
    They can't make them fast enough.
    Have fun with your iPad's tiny gaming.
    Reply
  • Ananke - Thursday, March 22, 2012 - link

    They spent like 100 mln on R&D and sold out the first several hundred cards at $500. Do you think that is a huge success? At this rate they will need decades just to break even... and what I said is that their targeted customers, both individuals and corporate, will have disposable income for the next several months only; then other electronics are coming out which will attract their money.

    Same as SONY - "great" success with the PS Vita, then next quarter they announce a 2.5 bln loss on the books... yay, really great.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Well, whatever you claim you said, I don't care. What is clear, at least to me, is that the new architecture comes out in a lot more than one top video card: it has already spread to shipping laptops, and it will be spun out in millions upon millions of Walmart machines and brand names like Dell for years to come, not to mention the many discrete card lines. To cap it off, there are the high-value, high-cost Fermi/Kepler compute cards at multiple thousands of dollars each, contributing (we have been told, at least for Nvidia, not AMD) something like 38% of Nvidia's overall year-upon-year profits - which are, so far, profits every single year, unlike AMD.
    --
    So I can agree with you for AMD, but not for Nvidia because of the above facts.
    If AMD somehow pulls off a victory this time and turns their research into profit, that's great, it will be a first for them.
    Reply
  • Ananke - Thursday, March 22, 2012 - link

    High Performance Computing is interested in ARM cores... a switch in NVidia's fortunes may happen very, very quickly, and NVidia would be blind not to see this coming. Inadequate price positioning doesn't help.

    In the consumer market, disposable income goes towards tablets, and this trend is increasing. These are the same people who buy gaming GPUs.

    So, BOTH NVidia and AMD are going to be squeezed on both sides. They have around 6 months to make good volume sales, and they are trying very hard to blow this chance with their pricing.

    Here you go, kinda internal marketing info for you, free.

    What I said is that the GPU is great, but not good enough for the money they ask.
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    Ok, thanks for your perspectives. I read Nvidia announced their biggest quarter ever coming - maybe it's Tegra 3, or some governments' supercomputers, or something else. Reply
  • Ananke - Friday, March 23, 2012 - link

    Tegra 3 and mobile platforms. Nobody gives a c*** about GPUs, hence the high prices. NVidia doesn't expect these to sell in large volume, but they don't care either. Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

    The prices are no different than they have been for nearly a decade; everyone here obviously cares, some quite a lot, such as yourself; and for that same decade, top-tier cards from either warring company were never expected to sell in great quantity. Hence my pointing out the great quantity of lower-tier parts, derived from the same research, that both companies do sell, as well as the massively profitable high end for Nvidia and not AMD.
    -
    So there are indeed "more than plenty of people who care a great deal", and the top-end research benefits everything down the line, including humanity reaching for the stars.
    You couldn't be more incorrect about everything, really, especially attacking the GPU in the singular when it's better than, and priced lower than, AMD's.
    Reply
  • marc1000 - Thursday, March 22, 2012 - link

    BTW, this is a nice achievement and makes me more interested in Nvidia than ever. But $400+ cards are not where the money is. That is the $100-200 range. And for some time AMD will have new cards in that price range, while Nvidia only has older ones.

    If Nvidia does launch the mid and low-end cards really fast, then AMD is in great trouble. Otherwise AMD will make money for some time - and lets hope this helps to save them.

    Oh, and by year's end I will buy my new card. It will be the best performing card at $250 with a single PCIe power connector. It does not matter if it is AMD or Nvidia.
    Reply
  • dubyadubya - Thursday, March 22, 2012 - link

    I'd say overall both camps did a good job this time around. Competition at this level is a win win for the consumer. Let the price wars begin! Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I'm just realizing how badly AMD is screwing its customers. I found out Nvidia's last 4 flagships were each released at $499, and AMD PR and marketing have known this, and certainly have the spying arms to know it this time as well.
    So what does AMD do?
    Well, closer to $600 suits them... why, let's just waltz out there at $589, and heck, by the time anyone figures it out it will be too late :(
    It's just sinking in... it's like, wait a minute...
    Nvidia was not bound to come out higher, as everyone told me to get me to buy the AMD card!
    They were bound to $499, and have been bound to it.
    --
    So, on the top tier, there is no price war. Yes, let it begin - it has not begun!
    Reply
  • falx - Thursday, March 22, 2012 - link

    On the "Meet the GeForce GTX 680" page, I think it should have said "This is as opposed to the vapor chamber on the GTX 580", not the 680. I had to reread it multiple times, so now I'm not even sure. Can anyone verify my sanity? Reply
  • Ramon Zarat - Thursday, March 22, 2012 - link

    The title "NVIDIA GeForce GTX 680 Review: Retaking The Performance Crown" is misleading. Said like that, it means the 680 is the fastest card on the planet. It's not.

    The 590 and 6990 are trading blows at the top. Actually, because of that, there's currently no graphics card king; it's in fact a stalemate. A stalemate that is about to end with the imminent introduction of the AMD 7990.

    To be honest with their readers, the title should have been "...Retaking The Single-GPU Performance Crown". NOTE TO THE AUTHOR: for the sake of logic and consistency, you'd better say the new 7990 is "Retaking The Performance Crown" without mentioning it's a dual-GPU card...

    That "small" journalistic inaccuracy aside, nice review, but limited. We have no idea how the new generation scales with SLI, where AMD improved tremendously with the 7xxx series. We have no idea how it performs with 2, 3 or 4 monitors compared to the AMD 7970 with its faster and bigger RAM buffer. We have very few hints on how the 680 would perform with widespread GPGPU apps such as video transcoding, folding or even PhysX!

    For what we know this far, the 680 is clearly the best-balanced price/performance/power GPU Nvidia has ever produced, but it comes up a little short in some departments.
    Reply
  • arjuna1 - Thursday, March 22, 2012 - link

    Don't be surprised, this is Anandtech. Reply
  • N4g4rok - Thursday, March 22, 2012 - link

    Overall, i'm not real sure i understand the cutthroat nature of comments to these kind of things, let alone slightly overzealous writing.

    Just seems like a silly thing to get worked up about.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    Some people are into it as a hobby, and it obviously is very important: look at the endless people it employs, and you're at a website that thrives on people caring deeply about all of this.
    The gentleman had a good point: it's the single-GPU card king only.
    Reply
  • CeriseCogburn - Thursday, March 22, 2012 - link

    The GTX590 is clearly faster than the 6990, but we know who wants to claim it's a tie.
    The GTX580 is also clearly faster and by more, than the 6970. (CF vs SLI lowering the gap but never closing it in the dual core cards)
    It's nice to see how big the lies have become in the fanboy space.
    -
    I do agree with you that the 680 is only the single-GPU card king for now, but the GTX 590 is clearly the single-card king, despite the endless lying to the contrary.
    If you read the endless fudging in the initial articles, the writers even say the "590 beats the 6990 but not by a wide enough margin to convincingly dethrone it".

    It wins but it doesn't win; it's faster overall but it's not the king... fanboyism
    -
    That's the kind of fan boy crud contradiction we get when science and truth is tossed aside for some sort of who knows what "reason" which is anything but reasoned.
    Reply
  • piroroadkill - Friday, March 23, 2012 - link

    Huh, sorry, but I'm not seeing it: http://www.anandtech.com/show/4239/nvidias-geforce...

    6990 vs 590, I'd say looking at the graphs, 6990 wins most of the time, but when 590 does win, it's usually convincingly.

    Given that's pretty much a wash, I'd take the 6990 any day because of the increased VRAM. Believe me, you definitely need it when cranking up all the quality settings, ESPECIALLY if you're going to run multiple monitors.
    Reply
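The VRAM argument above can be given a rough scale. A back-of-envelope sketch with illustrative assumptions (4 bytes of color plus 4 bytes of depth/stencil per sample; the resolutions are just the common single-screen and triple-screen surround figures, not numbers from the review):

```python
# Estimate raw render-target memory at a given resolution and MSAA level.
# This counts only color + depth storage; real games add textures,
# shadow maps, and intermediate buffers on top, which is where
# multi-gigabyte framebuffers actually get consumed.
def render_target_mib(width, height, msaa_samples=1, bytes_per_sample=8):
    return width * height * msaa_samples * bytes_per_sample / 2**20

print(f"1920x1080, 4x MSAA: {render_target_mib(1920, 1080, 4):.0f} MiB")  # ~63
print(f"5760x1080, 4x MSAA: {render_target_mib(5760, 1080, 4):.0f} MiB")  # ~190
```

Tripling the horizontal resolution triples the render-target footprint, which is why the 1.5 GB vs. 3 GB question keeps coming up for multi-monitor setups.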
  • CeriseCogburn - Friday, March 23, 2012 - link

    We've heard that last line for a long time, speculatively, even in reviews, but now the 680 has put that lie to rest with triple-monitor wins.
    The other problem is that once you've cranked settings enough to be theoretically RAM-limited, you're below playable frame rates.
    So no, what you claim is just more FUD, because of those facts.
    Reply
  • mindbomb - Thursday, March 22, 2012 - link

    is it still vp5?

    i want some hardware vp8 decoding.
    :3
    Reply
  • RikkiTikkiTavi - Thursday, March 22, 2012 - link

    This brings back competition to the graphics card market. After years of Nvidia selling their scorching-hot cards at hardly profitable prices, they can finally turn around and hunt AMD for a change. I can't wait to see their answer to this.

    (to prevent any cries from Nvidia-Fanboys about me being unfair to them: I own a 560Ti, after my 8800GT, 7600, 6800, 4200TI, and Riva TNT, so there)
    Reply
  • will54 - Thursday, March 22, 2012 - link

    I noticed in the review they said this was based on the GF114 rather than the GF110, but then they mention that this is the flagship card for Nvidia. Does this mean that this will be the top Nvidia card until the GTX 780, or are they going to bring out a more powerful card in the next couple of months based off the GF110's successor, such as a GTX 685? Reply
  • von Krupp - Friday, March 23, 2012 - link

    That depends entirely on how AMD responds. If AMD were to respond with a single GPU solution that convincingly trumps the GTX 680 (this is extremely improbable), then yes, you could expect GK110.

    However, I expect Nvidia to hold on to Gk110 and instead answer the dual-GPU HD 7990 with a dual-GK104 GTX 690.
    Reply
  • Sq7 - Thursday, March 22, 2012 - link

    ...my 6950 still plays everything smooth as ice at ultra settings :o Eye candy? Check. Tessellation? Check. No worries? Check. To be honest, I am not that interested in the current generation of gfx cards. When UE4 comes out, I think that will be an optimal time to upgrade.

    But mostly in the end $500 is just too much for a graphics card. And I don't care if the Vatican made it. When I need to upgrade there will always be a sweet little card with my name on it at $300 - $400 be it blue or green. And this launch has just not left me drooling enough to even consider going out of my price range. If Diablo 3 really blows on my current card... Maybe. But somehow I doubt it.
    Reply
  • ShieTar - Friday, March 23, 2012 - link

    That just means you need a bigger monitor. Or newer games ;-)

    Seriously though, good for you.

    I have two crossfired, overclocked 6950s feeding my 30'', and still find myself playing MMOs like SWTOR or Rift with shadows and AA switched off, so that I have a chance to stay at >40 FPS even in scenes with large groups of characters and effects on screen at once. The same is true for most offline RPGs, like DA2 and The Witcher 2.

    I don't think I have played any games that hit 60 FPS @ 2560x1600 @ "Ultra Settings" except for games that are 5-10 years old.

    Of course, I won't be paying the $500 any more than you will (or 500€ in my case), because stepping up just one generation of GPUs never makes much sense. Even if it is a solid step up, as with this generation, you still pay the full price for only a 20% to 25% performance increase. That's why I usually skip at least one generation, like going from 2x260 to 2x6950 last summer. That's when you really get your money's worth.
    Reply
  • von Krupp - Friday, March 23, 2012 - link

    Precisely.

    I jumped up from a single GeForce 7800 GT (paired with an Athlon 64 3200+) to dual HD 7970s (paired with an i7-3820). At present, there's nothing I can't crank all the way up at 2560x1440, though I don't foresee being able to continue that within two years. I got 7 years of use out of the previous rig (2005-2012) using a 17" 1280x1024 monitor and I expect to get at least four out of this at 1920x1080 on my U2711.

    Long story short, consoles make it easy to not have to worry about frequent graphics upgrades so that when you finally do upgrade, you can get your money's worth.
    Reply
  • cmdrdredd - Thursday, March 22, 2012 - link

    Why is Anandtech using Crysis Warhead still and not Crysis 2 with the High Resolution textures and DX11 modification? Reply
  • Malih - Thursday, March 22, 2012 - link

    Pricing is better, but the 7970 is not as much worse than the 680 as some have claimed (well, leaks).

    With similar pricing, AMD is not that far off, although it remains to be seen whether AMD will lower the price.

    For me, I'm a mainstream guy, so I'll see how the mainstream parts perform, and whether AMD will lower the price on their current mainstream (78x0). I was thinking about getting a 7870, but AMD's pricing is too high for me; it gets them money in some markets, but not from my pocket.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    AMD is $120 too high. That's not chump change. That's a gap that would have been called game-changing 1000% of the time on AnandTech at any other moment! Reply
  • nyran125 - Friday, March 23, 2012 - link

    In some games it wins, in others it doesn't. But a pretty damn awesome card regardless. Reply
  • asrey1975 - Friday, March 23, 2012 - link

    You're better off with an AMD card.

    Personally, I'm still thinking about buying 2x 6870s to replace my 5870, which runs BF3 no problem on my 27" 1920x1200 Dell monitor.

    It will cost me $165 each, so at $330 all up it's still cheaper than any $500 card (insert brand/model) and will totally kick ass over the 680 or 7970!
    Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    That's a good combo, it's high on the charts. Reply
  • SpeedyGonzales - Friday, March 23, 2012 - link

    AMD 7970 and Nvidia 680GTX are both good cards.

    Nevertheless, AMD and Nvidia are both trying to push price increases on the upper midrange class, either because of the limited supply due to the new process or just because they can.

    Both cards are, from a die-size perspective as well as a performance-vs.-last-generation perspective, clearly $350-399 cards.

    I recommend waiting if you are not desperate. Demonstrate that the market does not accept price increases, especially given that the progress of PC graphics is massively held back by the stupid consoles.
    Reply
  • kmmatney - Friday, March 23, 2012 - link

    To those wondering how people have enough money to buy $500 tablets... we don't buy $500 video cards. I guess we spend money on what we want - I had no trouble plunking down $500 for an iPad 2, but to me only a fool would pay $500 for a video card... Reply
  • evilspoons - Friday, March 23, 2012 - link

    At last, the replacement for my GTX 285 has arrived.

    It's over twice as fast. It uses less power so I can safely continue to use the same power supply. It's quieter at idle and barely louder at full tilt. It's even smaller!
    Reply
  • AnotherGuy - Friday, March 23, 2012 - link

    "the GTX 680 is faster, cooler, and quieter than the Radeon HD 7970."

    From the benches I saw in your review, I only see it being a little faster in more games and slower in a couple of others... Also, at load nVidia gets hotter than the 7970... Check the Metro bench temps!
    nVidia is finally using as much power as AMD, compared to what it used to use, and you've got to call it quieter and cooler?

    Are you just a fanboy, or did you get paid to write that down?
    Reply
  • silverblue - Friday, March 23, 2012 - link

    Shh. It does use a little less power, and AMD have historically had quicker fans, which may explain why 7970 is cooler (that, and the cooler design). Reply
  • AnotherGuy - Friday, March 23, 2012 - link

    Do you see the temps in the Metro benchmark?
    http://www.anandtech.com/show/5699/nvidia-geforce-...
    AMD 7970: 74 Celsius, nVidia GTX 680: 78 Celsius

    How is nVidia cooler?
    Reply
  • BoFox - Monday, March 26, 2012 - link

    At idle - that's how - and also when both cards are overclocked: the 7970 obviously reaches its thermal limits much sooner than the GTX 680, which instead runs into the limits of its voltage feed due to its lack of power phases and of an 8-pin PCIe connector for clean power.

    Also, overclocking the HD 7970 to, say, 1150 MHz usually requires voltage tweaking. By then, it's already consuming more power than even a GTX 580!

    OTOH, the GTX 680 still remains rather quiet and cool-running when overclocked to the max (with dynamic voltage scaling).
    Reply
  • BoFox - Monday, March 26, 2012 - link

    Why? Isn't that a good thing? Even if just a little bit, it's still true. Reply
  • travbrad - Tuesday, March 27, 2012 - link

    "From the benches I saw in ur review I only see it being a little faster on more games and slower on a couple others... "

    Look at them again. The only game where the 7970 has a performance advantage greater than 2% is Crysis Warhead, which is an old game that almost no one plays anymore.

    1920x1200 percentages:

    Crysis:Warhead--7970 is 12% faster
    Metro2033--7970 is 2% faster
    Dirt3--680 is 17% faster
    Shogun2---680 is 15% faster
    Batman--680 is 17% faster
    Portal2--680 is 22% faster
    BF3--680 is 31% faster
    SC2--680 is 27% faster
    Skyrim--680 is 16% faster
    Civ5--7970 is 1% faster

    The 680 is clearly the faster card overall, and at $50 cheaper definitely the better deal.
    Reply
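For anyone double-checking percentage lists like the one above, the "X% faster" figures are just framerate ratios. A one-liner makes the arithmetic explicit (the sample fps values here are made up for illustration, not taken from the review):

```python
def pct_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical averages: card A at 78 fps vs. card B at 60 fps.
print(f"{pct_faster(78.0, 60.0):.0f}% faster")  # 30% faster
```

Note the asymmetry: if A is 30% faster than B, then B is only about 23% slower than A, which is why the same benchmarks can be quoted with different-looking numbers depending on which card is the baseline.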
  • XiZeL - Friday, March 23, 2012 - link

    Here is a question

    how many screens can I connect at once to this?
    Reply
  • silverblue - Friday, March 23, 2012 - link

    Good question. I've seen mentions of four at once, three in 3D, but I'm not sure. Reply
  • noeldillabough - Friday, March 23, 2012 - link

    I'd also really like to know the answer to this. Also, where to buy? Everywhere seems to be showing "out of stock". Reply
  • Ryan Smith - Friday, March 23, 2012 - link

    It supports 4 screens, though Surround mode (SLS) is limited to 3 screens and then the 4th is a separate surface. Reply
  • Slayer68 - Saturday, March 24, 2012 - link

    As far as spanning goes, it's 3 of the SAME screens. You might be able to use all 4 ports in non-spanned mode for the desktop, but I'm not sure why you'd do that for gaming since the game would only show up on 1 monitor. Reply
  • Pessimism - Friday, March 23, 2012 - link

    Nvidia knowingly sold defective products to multiple large vendors over a lengthy period of time. Never honor them with your money. The CEO has never publicly acknowledged or apologized for this mistake. Reply
  • CeriseCogburn - Friday, March 23, 2012 - link

    AMD/ATI is in the gaming console space as well.
    I guess you get to enjoy VIA Chrome with your held-back punishment dollars.
    Don't buy HP either, as their defective heatsinks were the cause.
    Your console vendor choices are also cut down: no Xboxes.
    Reply
  • BoFox - Monday, March 26, 2012 - link

    LOL!!! That was funny!

    Yeah, and don't buy AMD because they lied about the Bulldozer transistor number! Dave Baumann recently posted over at B3D with the exact words that it's "just a marketing number".
    Reply
  • silverblue - Monday, March 26, 2012 - link

    I was under the impression that PR just got the number wrong... Reply
  • Soldier1969 - Friday, March 23, 2012 - link

    Clearly beats AMD's best. I'll take 2, please, to replace my 2 3GB GTX 580s. Pure ownage at my 2560x1600 res. Nice read, thanks. Reply
  • Slayer68 - Saturday, March 24, 2012 - link

    Being able to run 3 screens off one card is new for Nvidia, yet it was barely even mentioned in your review. It would be nice to see Nvidia Surround / Eyefinity compared on these new cards. I'm especially interested in scaling at 5760x1080 between a 680 and a 7970..... Reply
  • ati666 - Saturday, March 24, 2012 - link

    Does the GTX 680 still have the same anisotropic filtering pattern as the GTX 470/480/570/580 (octagonal pattern), or is it like AMD's HD 7970 with fully angle-independent anisotropic filtering (circular pattern)? Reply
  • Ryan Smith - Saturday, March 24, 2012 - link

    It's not something we were planning on publishing, but it is something we checked. It's still the same octagon pattern as Fermi. It would be nice if NVIDIA did have angle-independent AF, but to be honest the difference between that and what NVIDIA does has been so minor that it's not something we've ever been able to create a noticeable issue with in the real world.

    Now Intel's AF on the other hand...
    Reply
  • ati666 - Saturday, March 24, 2012 - link

    thanks for the reply, now I can finally make a decision between the HD 7970 and the GTX 680.. Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

    Yes, I thank him too for finally coming clean and noting that the angle-independent AMD algorithm he's been a fanboy over for a long time has absolutely no real-world gaming advantage whatsoever.
    It's a big fat zero of nothing but FUD for fanboys.
    It would be nice if notional advantages actually showed up in games; and when they don't, or when for the life of the reviewer they cannot be detected in games, that should be clearly stated and the insane "advantage" called what it really is: a useless talking point of deception that fools purchasers instead of enlightening them.
    The biased emphasis with zero advantage is as unscientific as it gets. Worse yet, within the same area, the "perfectly round algorithm" yielded in-game transition lines on the AMD cards, denied by the reviewer for what, a year? Then a racing game finally convinced him, and in this 7000 series release we find another issue the "perfectly round algorithm" apparently came with: poor transition resolution - rather crudely large instead of fine like Nvidia's - which caused excessive AMD shimmering in game, and we are treated to that information only now, after the 7000 series "solved" the issue and brought it near or up to the long-time GTX standard.
    So this whole "perfectly round algorithm" has been nothing but fanboy lies for AMD all along, while ignoring at least 2 large IQ issues when it was "put to use" in game (transition shading and shimmering).
    I'm certain an explanation could be given that there are other factors with differing descriptive explanations - like the fineness of textural changes toward the center of the image not directly affecting roundness one way or another - used as an excuse; perhaps the self-deceptive justification that allowed such misbehavior to go on for so long.
    Reply
  • _vor_ - Saturday, March 24, 2012 - link

    Will you seriously STFU already? It's hard to read this discussion with your blatant and belligerent jackassery all over it.

    You love NVIDIA. Great. Now STFU and stop posting.
    Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

    Great attack, did I get anything wrong at all ? I guess not. Reply
  • silverblue - Monday, March 26, 2012 - link

    Could you provide a link to an article based on this subject, please? Not an attack; just curious. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    http://www.anandtech.com/show/5261/amd-radeon-hd-7...

    http://forums.anandtech.com/showpost.php?p=3152067...

    " So what then is going on that made Civ V so much faster for NVIDIA? Admittedly I had to press NVIDIA for this - performance practically doubled on high-end GPUs, which is unheard of. Until they told me what exactly they did, I wasn't convinced it was real or if they had come up with a really sweet cheat. It definitely wasn't a cheat.

    If you recall from our articles, I keep pointing to how we seem to be CPU limited at the time. "

    (YES, SO THAT'S WHAT WE GOT, THEY'RE CHEATING IT'S FAKE WE'RE CPU LIMITED- ALL WRONG ALL LIES)

    Since AMD’s latest changes are focused on reducing shimmering in motion we’ve put together a short video of the 3D Center Filter Tester running the tunnel test with the 7970, the 6970, and GTX 580. The tunnel test makes the differences between the 7970 and 6970 readily apparent, and at this point both the 7970 and GTX 580 have similarly low levels of shimmering.

    with both implementing DX9 SSAA with the previous generation of GPUs, and AMD catching up to NVIDIA by implementing Enhanced Quality AA (their version of NVIDIA’s CSAA) with Cayman. Between Fermi and Cayman the only stark differences are that AMD offers their global faux-AA MLAA filter, while NVIDIA has support for true transparency and super sample anti-aliasing on DX10+ games.

    (AMD FINALLY CATCHES UP IN EQAA PART, NVIDIA TRUE TRANSPARENCY AND SUPER SAMPLE HIGH Q STUFF, AMD CHEAT AND BLUR AND BLUR TEXT)

    Thus I had expected AMD to close the gap from their end with Southern Islands by implementing DX10+ versions of Adaptive AA and SSAA, but this has not come to pass.

    ( AS I INTERPRETED AMD IS WAY BEHIND STILL A GAP TO CLOSE ! )

    AMD has not implemented any new AA modes compared to Cayman, and as a result AAA and SSAA continue to only be available in DX9 titles.

    Finally, while AMD may be taking a break when it comes to anti-aliasing they’re still hard at work on tessellation

    ( BECAUSE THEY'RE BEHIND IN TESSELLATION TOO.)

    Don't forget amd has a tessellation cheat in their 7000 series driver, so 3dmark 11 is cheated on as is unigine heaven, while Nvidia does no such thing.

    ---
    I do have more, like the race car game admission, but I think that's enough help with your homework.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    So here's more mr curious ..
    " “There’s nowhere left to go for quality beyond angle-independent filtering at the moment.”

    With the launch of the 5800 series last year, I had high praise for AMD’s anisotropic filtering. AMD brought truly angle-independent filtering to gaming (and are still the only game in town), putting an end to angle-dependent deficiencies and especially AMD’s poor AF on the 4800 series. At both the 5800 series launch and the GTX 480 launch, I’ve said that I’ve been unable to find a meaningful difference or deficiency in AMD’s filtering quality, and NVIDIA was only deficient by being not quite angle-independent. I have held – and continued to hold until last week – the opinion that there’s no practical difference between the two.

    It turns out I was wrong. Whoops.

    The same week as when I went down to Los Angeles for AMD’s 6800 series press event, a reader sent me a link to a couple of forum topics discussing AF quality. While I still think most of the differences are superficial, there was one shot comparing AMD and NVIDIA that caught my attention: Trackmania."

    " The shot clearly shows a transition between mipmaps on the road, something filtering is supposed to resolve. In this case it’s not a superficial difference; it’s very noticeable and very annoying.

    AMD appears to agree with everyone else. As it turns out their texture mapping units on the 5000 series really do have an issue with texture filtering, specifically when it comes to “noisy” textures with complex regular patterns. AMD’s texture filtering algorithm was stumbling here and not properly blending the transitions between the mipmaps of these textures, resulting in the kind of visible transitions that we saw in the above Trackmania screenshot. "

    http://www.anandtech.com/show/3987/amds-radeon-687...

    WE GET THIS AFTER 6000 SERIES AMD IS RELEASED, AND DENIAL UNTIL, NOW WE GET THE SAME THING ONCE 7000 SERIES IS RELEASED, AND COMPLETE DENIAL BEFORE THAT...

    HERE'S THE 600 SERIES COVERUP THAT COVERS UP 5000 SERIES AFTER ADMITTING THE PROBLEM A WHOLE GENERATION LATE
    " So for the 6800 series, AMD has refined their texture filtering algorithm to better handle this case. Highly regular textures are now filtered properly so that there’s no longer a visible transition between them. As was the case when AMD added angle-independent filtering we can’t test the performance impact of this since we don’t have the ability to enable/disable this new filtering algorithm, but it should be free or close to it. In any case it doesn’t compromise AMD’s existing filtering features, and goes hand-in-hand with their existing angle-independent filtering."

    NOW DON'T FORGET RYAN HAS JUST ADMITTED AMD ANGLE INDEPENDENT ALGORITHM IS WORTH NOTHING IN REAL GAME- ABSOLUTELY NOTHING.
    Reply
  • _vor_ - Tuesday, March 27, 2012 - link

    All I read is blah blah blah NVIDIA blah blah nerdrage blah blah. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I'll translate for the special people that need more help.
    AMD's IQ has been bad since the 5000 series, with the 6000 series also screwy.
    You will have shimmering in game textures and lines at shading transitions on screen, since their algorithm has been messed up for years; even though it is angle-independent and a perfect circle, IT SUCKS in real life - aka gaming.
    Nvidia doesn't have this problem, and hasn't had it since before the 5000 series amd cards.
    AMD's 7000 series tries once again to fix the ongoing issues, but fails in at least 2 known places, having only Dx9 support, but may have the shimmering and shading finally tackled and up to Nvidia quality, at least in one synthetic check.
    Reply
  • _vor_ - Tuesday, March 27, 2012 - link

    How much is NVIDIA paying you to babysit this discussion and zealously post?

    "It's better to keep quiet and people think you are a fool, than to open your mouth and prove them right."
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Words right from anandtechs articles, and second attack.
    A normal person would be thankful for the information.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Did you notice the Nvidia card won Civ5 by more than the amd did in Metro2033, but Civ5 is declared a tie, and well we know what everyone is claiming for Metro2033.
    I noticed that and thought it was quite interesting how that was accomplished.
    Reply
  • BoFox - Monday, March 26, 2012 - link

    AMD's angle-independent AF is still flawed in that it's not fully trilinear when it comes to high-frequency textures (noisy moire). You'd be seeing lines of transition where everything suddenly becomes a bit blurry in the distance with these kinds of grainy textures.

    It's rather subjective, though.

    Nvidia does offer up to 32x CSAA with TRAA (transparent, or alpha textures) in DX10/11 games for superb IQ without having to use brute-force SSAA. AMD does not currently support "forced" AAA (Adaptive AA) on alpha textures in DX10/11 games, and the SSAA support in DX10/11 games was finally announced in beta driver support form with HD 7970 cards.

    Transparency AA has been around since 2005, and Nvidia actually maintained the quality IQ options for DX10/11 games compared to DX9 games all along.
    Reply
  • ati666 - Monday, March 26, 2012 - link

    did AMD fix this problem in their HD7970 or not? Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    We will find out what's wrong with it a year from now when the next big 8000 series is launched. Until then, denials and claims that it's as good as Nvidia are standard operating procedure, and useless theoretical notions that affect gameplay exactly zero will be spun in a good light for amd, so all the amd fans can claim the buzzwords are a win.
    That has worked for the last 3 releases - 4000, 5000, and 6000 - and we just heard about the 7000 series fixes for the 5000 and 6000 crud that was covered up until the 7970 release article.
    So amd users will suffer bad IQ in several ways while buzzing up words spun from this website as notional greatness and perfectness of amd until, like, the next release... then your question will be answered - just try not to notice anything until then, ok?
    Reply
  • blanarahul - Saturday, March 24, 2012 - link

    I was confused as to whether GPU Boost was necessary or not. Thanks for making the difference clear. Reply
  • ammyt - Saturday, March 24, 2012 - link

    Dafuq y'all saying?
    The benchmarks are tight in front of your faces! The 680 is tied with the 7950, which surpasses it by a little, and the 7970 is the leader. The 7950 is cheaper by a little margin, but the 7970 is roughly $80 more expensive. What are y'all fighting for?

    If I were to choose between the 680, 7950, 7970, I will choose the 7950: cheaper, and faster by a little margin than the 680. I don't care how or why (memory clock, architecture, bla bla bla) but the benchmarks are in front of you! Clearly, anandtech is biased towards Nvidia.

    (Perhaps they're getting paid from them more than AMD...)
    Reply
  • maximumGPU - Saturday, March 24, 2012 - link

    "The benchmarks are tight in front of your faces!"

    and judging by your conclusion it seems you didn't even read them..
    Reply
  • Skiddywinks - Saturday, March 24, 2012 - link

    "The benchmarks are tight in front of your faces! "

    No s***, Sherlock.

    "The 680 is tied with the 7950, which surpasses it by a little, and the 7970 is the leader. "

    Clearly the benchmarks in front of my face are different to the ones in front of your face.
    Reply
  • BoFox - Monday, March 26, 2012 - link

    I know, that's why I'm telling him that Anandtech Forum is a perfect place for him! Reply
  • BoFox - Monday, March 26, 2012 - link

    Then you'll love Anandtech Forums!! It's the perfect place for you! They'll love you over there! Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    It's absolutely amazing isn't it. Reply
  • SR81 - Saturday, March 24, 2012 - link

    HardOCP has surround benches on both the 680 and 7970; surprisingly the lower bandwidth/VRAM card wins even with 4XMSAA and FXAA enabled at 5760x1200 (e.g. Skyrim: 680 = 58.6, 7970 = 45.4)

    When Anand updates this review with surround benches it will leave no doubt which card is the absolute king. I think the article's title is rightfully deserved once testing is done :)
    Reply
  • CeriseCogburn - Saturday, March 24, 2012 - link

    Yep, saw it like 2 days ago; the ram arguments have been foolish. Once you crank the eye candy high enough (on weak-ram cards), both competitors' frame rates are too low to matter. Reply
  • dtolios - Saturday, March 24, 2012 - link

    I know the AMD vs. Nvidia war is a hot topic in Anandtech - just like any other tech forum/review site etc - but one of the really hard applications for modern GPUs is production rendering acceleration.

    There are multiple instances where you can see reviewers trying to compare different GPUs, different architectures, SLI (or just multiple-GPU) combinations etc. while using GPU-accelerated renderers - a professional application that relies more and more on "game" oriented boards instead of Quadro / FireGL versions (unless vram limitations get in the way).

    Testing applications like Octane Render, Vray 2 GPU, iRay etc. would be a nice addition to your tests - not only because such results are hard to find and easily more intensive than "just gaming", but also because few sites have access to such an extensive line of hardware to pull off a realistic comparison, including multiple GPUs, different generations, scalability with multiple cards etc. The only "comparison tables" you can easily find are from people sharing their personal observations on a blog or forum - under conditions that aren't very repeatable.

    For some apps, OpenCL could be nice to keep the AMD vs. nVidia "hype" going, but sadly for some of us, most of these renderers are either exclusively CUDA-based or run better on it, so it would be nice to do core comparisons even within the nVidia line: you see, in rendering applications, getting better scalability with multiple cards, or shaving 30min off your 2-hour rendering workflow, is way more important than gaining a 5% FPS advantage over the other card.

    You do include 3DS or similar productivity comparisons in your CPU reviews, so it only makes sense to include them for your GPUs too.
    Reply
  • poordirtfarmer2 - Monday, March 26, 2012 - link

    I agree wholeheartedly! I’d love to pick the best “gaming” card for also doing pro work. Although just an amateur, I actually find myself spending more time editing and rendering videos than I do playing games. Reply
  • AnnonymousCoward - Saturday, March 24, 2012 - link

    When 2560x1600 4xAA results in way under 60fps, IMHO it's not a very useful benchmark. Any user would go to 2xAA or no AA to get 60fps. So who really cares how these cards compare at a setting that's never used? Reply
  • CeriseCogburn - Sunday, March 25, 2012 - link

    They get to show amd "catching up" so they like it. They get to try to puke on Kepler's 2GB of ram and make amd's 3GB shine, so they "can't resist" - and when frame rates fall below playable, "all of a sudden" they "don't care", even when the puking attempt fails. They haven't been able to resist since the 1.5GB 580 vs the 2GB 6950/6970; it was great to blame low ram for any changes.
    Then they checked the 6950 1GB vs 2GB and the 2GB was slower... but so what.
    Now 2GB Kepler has put the ram lie to rest even in triple-monitor gaming... but any lesser win, loss, or slimming margin can still be blamed on it. It gets people "buying the amd card", and they get really frustrated here when they can't figure out why Nvidia is winning when they don't believe it should be. It's always expressed in the article how shocked they are. So ram is a convenient scapegoat. It's always used as a "future proofing" notion as well, though no evidence has ever surfaced for that.
    Reply
  • _vor_ - Sunday, March 25, 2012 - link

    What's with all the nerdrage? Do you work for NVIDIA? Reply
  • formulav8 - Sunday, March 25, 2012 - link

    Get over yourself already. Nvidia doesn't even like you. I can't believe how people feel about a stinking stupid corporation. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    It's not about a corporation, it's about facts, guy. Facts mean my friends and my readers get the best they can get for the buck they are paying.
    Just because amd is behind and therefore lies are told does not mean the truth should not shine through!
    The truth shall shine through!
    Reply
  • AnnonymousCoward - Sunday, March 25, 2012 - link

    Personally, I don't care if the card has 64kB of RAM. Or 8 million stream processors. Performance, cost, power, and noise are what matter.

    And back to my point: performance in the 20-50fps range at 2560x1600 4xAA is meaningless and not a criteria for judgment.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I never disagreed with that point; I merely explained why things are done in such and such a way while other things are ignored.
    It's not difficult at all.
    Reply
  • Zephyr66z0r - Sunday, March 25, 2012 - link

    Well, I understand 'some' of the tech behind the GTX 680, and one thing stands out: the 256-bit bus width. When you see that from Nvidia, it's along the lines of a GTX 560... so does that mean there are 384-bit (mid-high) or 512-bit (high-enthusiast, 256-bit + 256-bit, 2 GPUs) cards coming out?

    I can't wait. Has anyone done SLI with it yet?
    Reply
  • dmnwlv - Sunday, March 25, 2012 - link

    First off, I think nVidia has done a good job with the new GTX680.

    However I do not need a game that is already running at 100+ frames to be even faster.
    It needs to be fast where it counts - games that are still running at 60 fps and below.

    For this, of the 3 relevant games, nVidia is faster in just one of them. Experience (if you also remember) has shown that the results can be very different once frames for some settings/games dip below 60fps.

    Hence I cannot agree with all the big fuss about the GTX680 being so much faster.
    You guys are led more by the heart (much like the ati fanboys you used to call out) than the brain.

    And all the other compute tests are non-relevant to me (and the majority of you, to be honest).
    Reply
  • gramboh - Monday, March 26, 2012 - link

    What about a little game (that several million people play) called Battlefield 3? NV has a massive lead with the GTX 680 over the 7970/7950. AT only benches single player, but the game is even more punishing in 64 player multiplayer. Having a smooth framerate at max detail with 4X AA/16X AF is a big competitive advantage and makes the game significantly more enjoyable.

    Kind of disappointed the card isn't faster in Witcher 2, which I think has the best graphics of a single player game.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Have all of you people repeating that FUD forgotten Shogun 2: Total War?
    It's the hardest game in the bench set according to anandtech...
    How is it that THE HARDEST GAME, which Nvidia swept top to bottom at every resolution, is suddenly and completely forgotten, while we hear these other FUD declarations?
    How does that work - just repeat what some other mistaken fudder spewed?
    Reply
  • SlyNine - Sunday, March 25, 2012 - link

    Well, the drivers themselves can take more CPU power to run. But with a quad-core CPU the thought is laughable. Back in the single CPU/core days it was actually an issue. And before DX9 (or 10), drivers could only access a single core, I believe. Reply
  • SlyNine - Sunday, March 25, 2012 - link

    Then look for an overclocked review. Anandtech is always going to do an out-of-the-box test for the first review.

    This is what they (AMD/Nvidia) are promising you, nothing more.
    Reply
  • papapapapapapapababy - Monday, March 26, 2012 - link

    USELESS !

    YESSS OMFG i cant wait to play the latest crappy kinect port with this!.... at 600.000.000 FPS and in 3-D! GTFO GUYS! REALLY....

    just put this ridiculously large, ugly, noisy, silly, and overpriced, toxic waste where it belongs: faaar away from me, ( sensible user) inside one bulky OnLive cloud server. (and pushing avatar 2 graphics, no HDps2 ports)
    Reply
  • henrikfm - Monday, March 26, 2012 - link

    Most monitors have a 60Hz refresh rate; you can't benefit from higher frame rates because only 60 frames are drawn per second.

    By looking at the benchmarks and considering a resolution of 1920, the latest cards fail to deliver at least 60fps in 3 games: Crysis, Metro and BF3. In the first two the HD7970 beats the GTX680; it only loses in BF3, where nVidia has a clear advantage (in my opinion AMD has to work on drivers for BF3).

    So, the GTX680 is faster where the speed really doesn't matter, because you're already around 100fps. The guys who are running multiple monitors and higher resolutions will also have the money to buy multiple-GPU setups, and that is another story.

    Still the GTX680 is a better card, but for $500 I would expect a card to deliver at least 60fps at 1920 in a videogame released in 2008 like Crysis. Neither nVidia nor AMD can do that with a single GPU; it's disappointing.
    Reply
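    The refresh-rate reasoning above comes down to one line of arithmetic: with v-sync enabled, the display shows at most one new frame per refresh. A minimal sketch (the numbers are illustrative, not benchmark results):

    ```python
    def displayed_fps(rendered_fps: float, refresh_hz: float = 60.0) -> float:
        """With v-sync on, the monitor shows at most one new frame per refresh."""
        return min(rendered_fps, refresh_hz)

    # A card rendering 100 fps and one rendering 120 fps look identical on a 60Hz panel...
    print(displayed_fps(100))  # 60.0
    print(displayed_fps(120))  # 60.0
    # ...while below the refresh rate, every extra frame counts.
    print(displayed_fps(45))   # 45
    ```

    Without v-sync the extra frames are still rendered (with tearing), so the cap only describes what the panel can actually show.
    
    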
  • gramboh - Monday, March 26, 2012 - link

    I'll agree about Metro because there is a sequel (Last Light) coming out in Q1-2013 which will presumably be similar in the graphics department.

    Crysis is irrelevant other than for benchmarking, who still plays it? Single player campaign is entertaining for the eye candy once through (in 2008).

    BF3 is the game that matters because of the MP component, people will be playing it for years to come. AMD really really has to improve performance on the 7950/7970 in BF3, I won't even consider buying it (vs. the 680) unless they can make up some significant ground.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I just have to do it, sorry.
    You forgot Shogun 2: Total War, the hardest game in this bench set, which Nvidia wins at all 3 resolutions.
    You also forgot average frames are not low frames, so you need far above a 60 fps average before you stop dipping below 60 fps.
    Furthermore, all the eye candy is not cranked here; when it is, the average and the dips go even lower.
    You got nothing right.
    Reply
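    The point above about averages hiding dips is easy to show numerically. A minimal sketch, using made-up frame times rather than anything measured in the review:

    ```python
    # Hypothetical frame times in milliseconds for a short capture (illustrative only).
    frame_times_ms = [14, 14, 14, 14, 14, 14, 14, 14, 14, 30]

    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000 / max(frame_times_ms)  # the single slowest frame sets the worst dip

    print(f"average: {avg_fps:.1f} fps")  # ~64.1 fps -- comfortably "above 60"
    print(f"minimum: {min_fps:.1f} fps")  # ~33.3 fps -- yet the dips fall far below 60
    ```

    This is why an average a little over 60 fps does not guarantee a dip-free 60 on a 60Hz panel.
    
    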
  • b3nzint - Monday, March 26, 2012 - link

    back in the 7970 review, it had cool tech stuff like PRT, MST hub, DDMA and bla bla bla. why doesn't the gtx680 have sh** like that? pardon my english. it's like this thing is built for 1 purpose only, and that's a success. thanks Reply
  • mpx - Monday, March 26, 2012 - link

    This new Nvidia card supposedly has an architecture that burdens the CPU with scheduling etc. It may mean that it requires a faster CPU than ATI cards to reach similar performance. And since fast CPUs are expensive, it may mean it's actually more expensive. Reply
  • BoFox - Monday, March 26, 2012 - link

    The key word in your first sentence is "supposedly".

    I see no evidence of this. It actually does far better in Starcraft 2, a game that already burdens the CPU. It also excels in Skyrim, while still doing just fine in Civilization V, which are also the most CPU-intensive games out there.
    Reply
  • BoFox - Monday, March 26, 2012 - link

    In SC2 which is a very CPU-dependent game, the card still does amazingly well against the rest of others. The same also goes for Skyrim, beating the red team by a whopping percentage. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    What you're saying is true for amd cards, and is severe in CF, but true across the board.
    Lower clocked cpu system, better results from Nvidia.
    Reply
  • bozolino - Monday, March 26, 2012 - link

    I don't know what's happening. I have a GTX 560 2Win and wanted to compare it to the 680 to see if it's worth the upgrade, but the charts can't be compared; for example:
    http://www.anandtech.com/show/5048/evgas-geforce-g...

    This review uses an i7 3xxx while the older one uses an i7 720; how can the older CPU perform better than the new one?? Something is very odd....

    I wish they had used the same computer as the VGA test bench, so we could compare all the results together....
    Reply
  • Sharpie - Monday, March 26, 2012 - link

    It's about design trade-offs: you can increase the memory speed on a narrow bus to get the same throughput as something with a lower clock speed and a large bus. It all depends on the architectural design, so arguing about clock numbers is pointless, generally speaking, unless you are comparing apples to apples. Comparing an NVidia chip to an ATI chip is apples to oranges. Yes, I'm an engineer. Reply
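    That trade-off reduces to simple arithmetic: peak memory bandwidth is bus width times effective data rate. A rough sketch using the commonly cited reference memory clocks for these two cards (treat the exact rates as approximate):

    ```python
    def peak_bandwidth_gbps(bus_width_bits: int, effective_rate_gtps: float) -> float:
        """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
        return bus_width_bits / 8 * effective_rate_gtps

    # GTX 680: narrow 256-bit bus, but GDDR5 pushed to ~6.0 GT/s effective.
    print(peak_bandwidth_gbps(256, 6.008))  # ~192.3 GB/s
    # HD 7970: wider 384-bit bus at a lower ~5.5 GT/s effective rate.
    print(peak_bandwidth_gbps(384, 5.5))    # 264.0 GB/s
    ```

    So a narrow bus can partly compensate with clock speed, but at these reference rates the 7970's wider bus still yields more raw bandwidth; how much that matters depends on the workload and the rest of the architecture.
    
    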
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Oh I get it, you're in some pretend world where the card companies are not "fanboy compared" in the article itself.
    Once the amd card loses, and someone tells the truth about that, it's trolling or fanboyism, right?
    If an amd fan lies, it's "a good opinion".
    If an amd fan liar gets corrected, that's "taking it too far" into "fanboy".
    Once again, telling the truth about the two competing cards is forbidden only if it's an Nvidia advantage - then WE MUST OMIT the AMD card from the analysis, right?
    I think you might as well NEVER read another card launch article on this site, if you intend to stick to your RUDE comment.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Heck, you didn't read this article, nor the many others that say and prove the exact same thing.
    I'm sure it will never make sense to you.
    Reply
  • SlyNine - Friday, April 27, 2012 - link

    Actually, I'm pretty sure it's you who doesn't understand the DX API. The CPUs are doing roughly the same thing for both video cards - except when it comes to processing the drivers. Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    The GTX690 currently holds the world record with a 1,900MHz overclock, and is easily gaining 15% with noob out-of-box overclocks, and reaching very high memory clocks as well as 1,300+ core - I've seen 1,420 on air - and overclocked models are on their way....
    So much for that FUD about the 7970... it's FUD, FUD, and pure FUD.
    Reply
  • blanarahul - Tuesday, March 27, 2012 - link

    GTX 690???? Reply
  • blanarahul - Tuesday, March 27, 2012 - link

    Smart move by NVIDIA. Make a single gpu powerful graphics card to battle the Dual Tahiti monster. Judging by how they are doing the GTX 680 it should have the following specs:-

    4.72 billion transistors
    2048 CUDA Cores
    384-bit memory bus width
    3 GB VRAM
    600-700 USD
    TDP of 225-250W

    If the clock speed is high enough, it will be quite a match for the 7990!

    The reason i think it will be a single GPU card is because GTX 680=2x GTX 560 Ti.
    So it should be GTX 690=2x GTX 580.
    Reply
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    The release date of the 7970 was Jan 11th; 2.5 months is overstating in AMD's favor...
    Nothing like not knowing what you're talking about, huh.
    7970's have come down in price a bit, but you will still find them for much more than $569 at an awful lot of USA online stores.
    Reply
  • saturn85 - Tuesday, March 27, 2012 - link

    can we have a folding@home benchmark? Reply
  • warmbit - Tuesday, March 27, 2012 - link

    I invite you to a summary of GTX680 card performance based on the results of eight websites (tom's too):

    http://translate.google.pl/translate?hl=pl&sl=...
    Reply
  • lhotdeals - Wednesday, March 28, 2012 - link

    I never post replies to reviews... but the conclusion of this review leaves me baffled. It is clear that the 680 has its advantages, but looking at the numbers, the 680 and 7970 trade punches pretty equally in terms of performance. The 680 does beat the 7970 in power usage and noise level, but the actual lead is minuscule...

    I do not see how the reviewer can reach the conclusion that the 680 dominates... It is nothing but misleading in an otherwise great review with tons of data points.
    Reply
  • sngbrdb - Wednesday, March 28, 2012 - link

    "...at the end of the day it’s NVIDIA who once again takes the performance crown for the highest performing single-GPU card."

    Are you kidding me?? Are we looking at the same charts? This card only "excels" on the lower end of resolution and quality settings. And in many of your charts, the cards that beat it in higher resolutions are conspicuously absent in the lower-resolution charts (the GTX 590 and Radeon 6990 are frequently missing).

    GPU reviews need to be written by someone who isn't biased toward a brand, and Mr. Smith, that is clearly not you.
    Reply
  • CeriseCogburn - Thursday, March 29, 2012 - link

    Sorry, you can't lie to yourself forever - how about a massive database with percentages and totals from many reviews...
    http://translate.google.pl/translate?hl=pl&sl=...

    At the highest resolutions amd still loses. Amd also loses at the lower and lowest. Enjoy.
    Reply
  • sngbrdb - Friday, March 30, 2012 - link

    My comments are about the conclusions Mr. Smith draws from the charts he displays, not about whether the card does better elsewhere. The charts that Mr. Smith shows and the conclusions he draws from them are not logical... there may be other data out there that supports his conclusions, but that would mean the data he's displaying here is flawed/inaccurate. Either way, you can't look at those charts and act like the 680 has slaughtered AMD; you end up sounding like a fanboy.

    And charts need to be consistent about the cards being compared... you can't put a card in the chart for one resolution, and leave it out in another.
    Reply
  • CeriseCogburn - Friday, March 30, 2012 - link

    Yes, and anandtech's chart percentages are at the link I provided, and they prove you incorrect, period.
    Reality doesn't make anyone a fanboy. Ignoring reality does.
    Reply
  • _vor_ - Saturday, March 31, 2012 - link

    So does replying to nearly all 40 pages of discussion flagrantly waving the NVIDIA banner. Maybe you need to look up the word hypocrisy. Reply
  • CeriseCogburn - Friday, April 06, 2012 - link

    Vor, you're angry because you got caught waving the amd flag *incorrectly*.
    There wouldn't be a need if false information weren't being spread about.
    It's no wonder it happens when the amd card loses.
    Addressing it is helpful and the right thing to do.
    Reply
  • sngbrdb - Monday, April 09, 2012 - link

    Um.... the charts at the link you provided point back to this article. Please do your research before being rude.

    From the article:
    ---------------------------------

    Sources of data:
    ...
    Anandtech - www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review
    Reply
  • sngbrdb - Friday, March 30, 2012 - link

    *from :P Reply
  • Mombasa69 - Wednesday, April 04, 2012 - link

    This is just a rebadged mid-range card; the 680 has less memory bandwidth than GPUs brought out 4 years ago, lol. What a rip-off. I can see the big fat directors at Nvidia laughing at all the mugs who have gone out and bought one, thinking this is the real big boy to replace the 580... muppets. lol. Reply
  • N4v1N - Wednesday, April 04, 2012 - link

    Nvidia is the bestest! No AMD is the betterest!
    lol...
    Reply
  • CeriseCogburn - Friday, April 06, 2012 - link

    Yes, Nvidia clocked the ram over 6GHz because their ram controller is so rockin'.
    In any case, the 7970 is now being overclocked; both are at 7GHz effective ram now.
    Unfortunately the 7970 still winds up behind most of the time, even in 2650x1200 triple-screen gaming.
    Reply
  • raghu78 - Saturday, April 07, 2012 - link

    In the reference Radeon HD 7970 and XFX Radeon HD 7970 reviews, the DirectX 11 compute shader fluid simulation performance is far higher than in this review.

    http://www.anandtech.com/show/5261/amd-radeon-hd-7...

    http://www.anandtech.com/show/5314/xfxs-radeon-hd-...

    http://images.anandtech.com/graphs/graph5314/43383...

    Reference HD 7970: 133, and XFX HD 7970: 145. In this review, reference HD 7970: 115.5.

    What has changed between these reviews? Has performance actually decreased with the latest drivers?
    Reply
  • oddnutz - Thursday, April 12, 2012 - link

    well i have been an ATI fanboi forever. So I am due a gfx upgrade, which would have already happened if ATI had priced their latest cards similarly to previous generations. I will watch ATI prices over the next few weeks but it looks like i might be turning green soon. Reply
  • blanarahul - Friday, April 13, 2012 - link

    Actually the GTX 680 REFERENCE BOARD was designed for 375 watts of power.
    It has a total of two 6-pin connectors and one 8-pin connector on the board! I realized this after seeing the back of the board.
    Reply
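    For what it's worth, a 375W figure is easy to reconstruct from the PCI Express power-delivery budgets (75W from the x16 slot, 75W per 6-pin, 150W per 8-pin), assuming the 2x 6-pin + 1x 8-pin connector count described above:

    ```python
    # PCI Express power delivery budgets, per the PCIe specification.
    SLOT_W = 75        # x16 slot
    SIX_PIN_W = 75     # each 6-pin PEG connector
    EIGHT_PIN_W = 150  # each 8-pin PEG connector

    board_budget = SLOT_W + 2 * SIX_PIN_W + EIGHT_PIN_W
    print(board_budget)  # 375 -- the in-spec maximum those connectors could deliver,
                         # not what the reference GTX 680 (195W TDP) actually draws
    ```

    A connector footprint rated well above the card's TDP usually just leaves headroom for overclocked partner boards.
    
    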
  • Commander Bubble - Thursday, April 19, 2012 - link

    I agree with some of the sensible posts littered in here that Witcher 2 should be included as a comparison point, and most notably the ubersampling setup.
    i run 2x 580GTX SLI @1920 and i can't manage a minimum 60fps with that turned on. That would be a good test for current cards as it absolutely hammers them.

    also, i don't know whether CeriseCogburn is right or wrong, and i don't care, but i'm just sick of seeing his name in the comment list. go outside and meet people, do something else. you are clearly spending way too much time on here...
    Reply
  • beiker44 - Tuesday, April 24, 2012 - link

    I can't wait to get one...or wait for the bad ace Dual 690!!! decisions decisions Reply
  • Oxford Guy - Thursday, July 05, 2012 - link

    "At the end of the day NVIDIA already had a strong architecture in Fermi, so with Kepler they’ve gone and done the most logical thing to improve their performance: they’ve simply doubled Fermi."

    Fermi Lite, you mean.

    "Now how does the GTX 680 fare in load noise? The answer depends on what you want to compare it to. Compared to the GTX 580, the GTX 680 is practically tied – no better and no worse – which reflects NVIDIA’s continued use of a conservative cooling strategy that favors noise over temperatures."

    No, the 680's cooling performance is inferior because it doesn't use a vapor chamber. Nvidia skimped on the cooling to save money, it seems.
    Reply
  • unphased - Saturday, July 28, 2012 - link

    I'm using Precision X to OC my Gigabyte GTX 670 and I've got the mem clock offset at +550MHz. In the chart log it continues to run at 3557MHz even while I am not playing any games.

    Is this normal? I even switched off Aero to check, and it hasn't changed.
    Reply
  • Gastec - Thursday, November 15, 2012 - link

    I know I'm nobody, not like you Americans who are always "somebody" or "something," but I can't just sit here and read and not react. It's enough that I have to put up with them on YouTube; why do you condone them here as well? I'm referring to the likes of the users Wreckage and CeriseCogburn, who are obviously paid individuals doing negative publicity, here in favor of nVidia. Is that something acceptable here? Am I too old or not "in tune" with the workings of the Internet, or what? Reply
  • Gastec - Thursday, November 15, 2012 - link

    Having white American genetic traits that allow you to be a convincing seller of how-to-become-rich-and-successful books and TV religion might be a prized quality in your American lands, but in my lands we know one thing, and one thing only: that 60 fps is what we want in our games, be they old or new. I don't care if the card can do 120 fps, 10 more than the 110 fps that the other brand can do. That's irrelevant. My monitor works at 60 Hz. If one card can do 55 fps MIN/100 fps MAX, I'll take it over the other one that can do 40 fps MIN/120 fps MAX any day. So why don't you think about that and convince me to buy your card. With pictures, of course. Reply
  • IuteTalpa - Thursday, March 21, 2013 - link

    Hello,

    My name is Iute Talpa and I am addicted to tank games. When you buy an nVidia GTX 680 card, you receive a promo code that can be used in World of Tanks. Could you please send me that code at dbtreborn@gmail.com?

    I would really appreciate your gesture.
    Regards,
    Treborn
    Reply
  • BrotherofCats - Monday, June 10, 2013 - link

    I have had nothing but trouble with this card from the start. It crashes a dozen times a day, mostly when I am playing a video, but sometimes when I am using my word processor or Excel. Mostly it freezes the screen for about a minute, then comes back with a pop-up box stating that my driver crashed and has recovered. About twice a day it does a complete crash, the two peripheral screens going white and the central screen gray, and I have to hard boot my computer to get it working again. I have asked for help on the Nvidia forums and Facebook page, and none of their solutions, a clean install of the driver or using an earlier driver, has worked. I saw on the Facebook page that other people are having the same problem. Will probably have to scrap this expensive turkey and get something cheaper that works. Not recommended at all. Reply
