Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance with a look at power, temperature, and noise. As we discussed in the introduction, while the official TDP of the GTX 660 Ti is 150W, 20W lower than the GTX 670’s, the difference in power targets between the two cards is only 7W. So let’s see which figure better reflects real-world power consumption, and how the GTX 660 Ti compares to AMD’s cards.

GeForce GTX 660 Ti Voltages
              Zotac GTX 660 Ti    EVGA GTX 660 Ti    Gigabyte GTX 660 Ti
Boost Load    1.175v              1.162v             1.175v

Stopping to take a quick look at voltages, there aren’t any big surprises here. NVIDIA would need to maintain the same voltages as the GTX 670 because of the identical clocks and SMX count, and that’s exactly what has happened. In fact all single-GPU GK104 cards are topping out at 1.175v, NVIDIA’s defined limit for these cards. Even custom cards like the Gigabyte still only get to push 1.175v.

Up next, before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because of GPU Boost the boost clock alone doesn’t give us the whole picture – particularly once factory overclocked cards enter the mix – so we’ve recorded the clockspeed of each video card while running our benchmarks at 2560x1600 and computed the average clockspeed over the duration of each run. Unfortunately we then deleted the results for the factory overclocked cards, so only the “reference”-clocked card (alongside the GTX 670) has per-game data below. Sorry about that, guys.

GeForce GTX 600 Series Average Clockspeeds
                  GTX 670    GTX 660 Ti    Zotac GTX 660 Ti    EVGA GTX 660 Ti    Gigabyte GTX 660 Ti
Max Boost Clock   1084MHz    1058MHz       1175MHz             1150MHz            1228MHz
Crysis            1057MHz    1058MHz       N/A                 N/A                N/A
Metro             1042MHz    1048MHz       N/A                 N/A                N/A
DiRT 3            1037MHz    1058MHz       N/A                 N/A                N/A
Shogun 2          1064MHz    1035MHz       N/A                 N/A                N/A
Batman            1042MHz    1051MHz       N/A                 N/A                N/A
Portal 2          988MHz     1041MHz       N/A                 N/A                N/A
Battlefield 3     1055MHz    1054MHz       N/A                 N/A                N/A
Skyrim            1084MHz    1045MHz       N/A                 N/A                N/A
Civilization V    1038MHz    1045MHz       N/A                 N/A                N/A

The average clockspeeds on our “reference” GTX 660 Ti don’t end up fluctuating all that much. With a max boost of 1058MHz the card actually gets to run at its top bin in a few of our tests, and it isn’t too far off in the rest. The lowest average is 1035MHz in Shogun 2, a deficit of just 23MHz. The GTX 670 on the other hand had a wider range; a boon in some games and a bane in others. If nothing else, it means that despite the identical base and boost clocks, our cards aren’t purely identical at all times, thanks to GPU Boost pulling back whenever we reach our power target.
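For those curious, the averaging itself is nothing exotic. Below is a minimal sketch of the idea in Python, assuming a hypothetical CSV clockspeed log with one sample per row – this illustrates the method rather than the exact tooling or log format we used.

```python
# Minimal sketch: average core clockspeed over a benchmark run.
# The CSV layout (one sample per row, "core_clock_mhz" column) is a hypothetical
# stand-in for whatever monitoring tool actually produced the log.
import csv

def average_core_clock(log_path: str) -> float:
    """Return the mean core clock (MHz) across all samples in the log."""
    with open(log_path, newline="") as f:
        samples = [float(row["core_clock_mhz"]) for row in csv.DictReader(f)]
    return sum(samples) / len(samples)

# e.g. average_core_clock("shogun2_2560x1600_gtx660ti.csv") -> 1035.0
```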

There are no great surprises with idle power consumption. Given the immense similarity between the GTX 670 and GTX 660 Ti, they end up drawing the same amount of power both during idle and long idle. This does leave AMD with an 8W-10W lead at the wall in this test though.

Moving on to our load power tests, we start with Metro 2033. As we mentioned previously the GTX 660 Ti and GTX 670 have very similar power targets, and this benchmark confirms that. Power consumption for the GTX 660 Ti is virtually identical to the Radeon HD 7870, an interesting matchup given that this is the first time NVIDIA has had to compete with Pitcairn. Pitcairn trades away compute performance for a leaner, more power-efficient design, which gives it a head start here, but it looks like even a salvaged GK104 can keep pace with it. NVIDIA drove efficiency hard this generation, and competing with a smaller chip like that is a testament to that effort.

As for the inevitable 7950 comparison, it’s no contest. The GTX 670 was already doing well here and the GTX 660 Ti doesn’t change that. Tahiti just can’t match GK104’s gaming efficiency, which is why AMD has had to push performance over power with the new 7950B.

Meanwhile it’s fascinating to see that the GTX 660 Ti has lower power consumption than the GTX 560 Ti, even though the latter has the advantage of lower CPU power consumption due to its much lower performance in Metro. Or better yet, just compare the GTX 660 Ti to the outgoing GTX 570.

For AMD/NVIDIA comparisons we have a bit less faith in our OCCT results than in our Metro results right now, as NVIDIA and AMD clamp power consumption differently. NVIDIA’s clamp through GPU Boost is far softer than AMD’s PowerTune. As a result the 7870 consumes 25W less than the GTX 660 Ti here, which even with AMD’s very conservative PowerTune rating seems like quite the gap. Metro is the more applicable comparison here, at least when you’re dealing with cards that deliver similar framerates.
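To illustrate why soft and hard clamps diverge most under a sustained worst-case load like OCCT, here is a purely conceptual sketch – it is not NVIDIA's GPU Boost or AMD's PowerTune control loop, and the 13MHz bin step is an assumed, illustrative value. The soft version backs off one clock bin per update and can sit above its target for several intervals; the hard version cuts the clock immediately to hold the target.

```python
# Conceptual illustration only: two ways a GPU governor might react when measured board
# power exceeds its power target. Neither function is the real GPU Boost or PowerTune
# control loop; the 13MHz bin step and the power-proportional-to-clock model are assumptions.

BIN_MHZ = 13.0  # assumed size of one boost bin

def soft_clamp(clock_mhz: float, power_w: float, target_w: float) -> float:
    """Step down a single bin per update while over target (power can overshoot for a while)."""
    return clock_mhz - BIN_MHZ if power_w > target_w else clock_mhz

def hard_clamp(clock_mhz: float, power_w: float, target_w: float) -> float:
    """Immediately scale the clock so estimated power sits at the target."""
    return clock_mhz if power_w <= target_w else clock_mhz * (target_w / power_w)
```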

In any case, compared to NVIDIA’s lineup this is another good showing for the GTX 660 Ti. Power consumption at the wall is 45W below the GTX 560 Ti, a large difference thanks to the latter’s lack of power throttling technology.

As for our factory overclocked cards, these results are consistent with our expectations. Between the Zotac and EVGA cards there’s a few watts of flutter at best, which makes sense given that they share the same 134W power target. Meanwhile the Gigabyte card, with its higher power target, draws another 20W at the wall, which indicates that our estimated power target of 141W for that card is a bit too low. However this also means that in those tests where the Gigabyte card was winning, it was also drawing around 20W more than its competition, which is a tradeoff in and of itself.
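As a rough sanity check on that estimate, a wall-power delta can be converted back to an approximate at-the-card delta by accounting for PSU efficiency; the 88% figure below is an assumption rather than a measured value.

```python
# Hedged back-of-the-envelope check: translate a measured wall-power delta into an
# approximate card-power delta. The 88% PSU efficiency is an assumed typical value.
PSU_EFFICIENCY = 0.88

wall_delta_w = 20                              # Gigabyte card vs. the 134W-target cards, at the wall
card_delta_w = wall_delta_w * PSU_EFFICIENCY   # ~17-18W at the card itself

print(f"~{card_delta_w:.0f}W extra at the card")  # consistent with a power target nearer ~150W than 141W
```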

Moving on to temperatures, at 31C idle the GTX 660 Ti is once more where we’d expect it to be given its similarities to the GTX 670. Open air coolers tend to do a bit better here than blowers though, so the fact that it’s only 1C cooler than the blower-equipped GTX 670 is likely a reflection on Zotac’s cooler.

Speaking of factory overclocked video cards, one card stands out above the rest: the Gigabyte GTX 660 Ti. That oversized cooler does its job and does it well, keeping the GPU down to barely above room temperature.

Considering that most of our high-end cards are blowers while our “reference” GTX 660 Ti is an open air cooler, the temperature benchmarks are the GTX 660 Ti's to win, and that's precisely what's going on. 67C is nice and cool too, though as with any open air cooler these results assume decent case ventilation; in a poorly ventilated case a blower would hold the advantage.

As usual we see a rise in temperatures when switching from Metro to OCCT, but at 73C the GTX 660 Ti is still the coolest reference (or semi-reference) card on our charts. To be honest we had expected it to beat the 7870 by a wider margin, but as far as blowers go the 7870's is quite good.

Moving on to our factory overclocked cards, we’re seeing the usual divisions between open air coolers and blowers. The blower-based EVGA card performs almost identically to the GTX 670, which makes sense given the similarities between the cards. Meanwhile the open air Zotac and Gigabyte cards are neck-and-neck here, indicating that both cards are shooting for roughly the same temperatures, keeping themselves below 70C. Though it’s somewhat weird to see the factory overclocked Zotac card end up being cooler than its reference-clocked self; this appears to be a product of where the fan curve is being hit.

Last but not least we have our look at noise, where we’ll hopefully be able to fully shake out our factory overclocked cards.

Right off the bat we see the blower-based EVGA card struggle, which was unexpected; it's using basically the same cooler as the GTX 670, so it should do better. Then again, the EVGA GTX 670 SC had the exact same problem.

As for Metro, the GTX 660 Ti once again looks good. 48.2dB isn't the best result for an open air cooler, but it's a hair quieter than the 7870 and notably quieter than the GTX 670. The only unfortunate part of these results is that it just can't beat the GTX 560 Ti; in fact nothing can. For its power consumption the GTX 560 Ti was an almost unreal card, and it's a shame the GTX 660 Ti can't be equally unreal.

Moving on to our factory overclocked cards however, the Gigabyte GTX 660 Ti OC gets very close thanks to its very large cooler. 43.7dB technically isn't silent, but it might as well be. To offer the performance of a GTX 660 Ti (and then some) in such a package is quite the accomplishment.

As for Zotac and EVGA, there’s nothing bad about either of them but there’s also nothing great. EVGA’s card is about average for a blower, while Zotac’s card seems to be suffering from its size. It’s a relatively tiny card with a relatively tiny cooler, and this has it working harder to hit its temperature targets.

Finally we have noise testing with OCCT. Our “reference” GTX 660 Ti actually fares a bit worse than the GTX 670, which is unfortunate. So much of this test comes down to the cooler though that it’s almost impossible to predict how other cards will perform. At least it’s no worse than the 7870.

Meanwhile the Gigabyte GTX 660 Ti OC continues to impress. 43.7dB not only means that it didn’t get any louder switching from Metro to OCCT, but it has now bested the GTX 560 Ti thanks to the 560’s lack of power throttling technology. Make no mistake, 43.7dB for this kind of performance is very, very impressive.

As for EVGA and Zotac, it’s also a rehash of Metro. EVGA’s blower is actually over 1dB quieter than Zotac’s cooler, which is an unfortunate outcome for an open air cooler.

Wrapping things up, even without a true reference sample from NVIDIA it's clear that the GTX 660 Ti has a lot of potential when it comes to power, temperature, and noise. It's roughly equivalent to the 7870 in power consumption and noise while also being notably faster, which puts NVIDIA in a better place on the power/performance curve. The same holds against the 7950, and even more so: continuing the precedent the GTX 670 set, the GTX 660 Ti is cooler, quieter, and less power hungry than AMD's entry-level Tahiti part.

But it must be pointed out that the lack of a reference design for the GTX 660 Ti means buyers are also facing a lot of variability. Power consumption should be consistent between cards – which is to say a hair less than the GTX 670 – but temperature and especially noise will vary on a card by card basis. Potential buyers would best be served by checking out reviews ahead of time to separate the duds from the gems.

Comments

  • JarredWalton - Tuesday, August 21, 2012 - link

    The defense is quite simple: you're extremely biased towards NVIDIA, and you're going around picking and choosing comparisons that support the way you think it's meant to be played. Mind you, I'm running a GTX 580 personally -- with a 30" 2560x1600 LCD no less -- but that's beside the point

    Your post is laughable because it really boils down to this: you're not as informed as you like to think you are. There are a variety of Korean 27" LCDs selling for ~$400 or less that use the same LG IPS panel. Go search any hardware enthusiast forum (you've linked and mentioned several just in this single post, nevermind the 20 or 30 others you've made on this article shouting down everyone that disagrees with you) and I can guarantee you'll find posts about Catleap, Yamakazi, Auria, and several other brand names. Microcenter (a US company that enthusiasts should be more than familiar with) also carries one of these LCDs for $400: http://www.microcenter.com/single_product_results....

    So basically, your main premise that no one uses such LCDs is at best the perspective of someone with serious blinders. But you don't stop there. You go on to complain how no one sells 2560x1600 LCDs these days. Apparently, you weren't around for the transition from 16:10 to 16:9 and somehow think everything has to be tested with 16:9 now? That would make you one of the even less informed people that thinks 16:9 LCDs are somehow preferable to 16:10 I suppose. Given Ryan has a 30" LCD, why should he test it at less desirable resolutions?

    But even then, you still have to keep going. Ryan specifically comments throughout the text on 1920 performance and rarely mentions 2560, but you appear to get stuck on the presence of 2560 graphs and seem to think that just because they're there, he's writing for 2% of Steam's readership -- which is already a biased and useless number as Steam hardware surveys are always out of date and frequently not resubmitted by people that have already seen the survey 100 times.

    Short summary: you have ranted for over 15000 words in the comments of this article, all with an extremely heavy NVIDIA bias, and yet you have the gall to accuse someone else of bias. You've shown nothing in your above commentary other than the fact that positioning of the various GPUs right now can vary greatly depending on the clock speeds of the GPUs. (Hint: a heavily overclocked GTX 660 Ti 3GB card -- which totally eliminates the asymmetrical aspect of the other 660 Ti cards -- beating a stock HD 7950 doesn't mean much, considering the pricing is actually slightly in favor of the 7950.)

    What it really comes down to is this: buy the card that will best run the games that you tend to play at the settings and resolutions you play at. Period. If you have a 1920x1200 or 1080p display, just about any current $300+ GPU will handle that resolution with maximum detail settings in most games. If you have a better 2560x1600/1440 display, you'll want to check performance a bit more and make sure you get the right card for the job -- I'd suggest looking at 4GB GPUs if you want something that will last a while, or just plan on upgrading again to the HD 8790/GTX 780 next year (and the HD 9790/GTX 880 after that, and....)

    It basically comes down to opinions and a-holes; everyone has one. You think the 2560x1600 crowd apparently doesn't matter, going so far as to say " I think most buy $600 video cards and $300 monitors, not the other way around." I would say anyone buying $600 in video cards to run a $300 monitor has their priorities severely screwed up.

    I bought an $1100 30" LCD five years back and I'm still using it. During that time I have upgraded my system, CPU, GPU, etc. numerous times, but I still use the same LCD. Buying a high quality LCD is one of the best hardware investments you can make, and you're an idiot to think otherwise. That you can now buy 27" QHD displays for $300-$400 will only serve to increase the number of users who own and use such displays. The only reason not to have a larger display is if you simply don't have the space for it.

    I know three people that have between them purchased five of these Korean LCDs, and they're all quite happy with the results. You might recognize the names: Brian Klug, Chris Heinonen, and Ian Cutress. The only issue I have with the cheap QHD panels is that most of them don't have DisplayPort, and that will become increasingly important going forward. So spend a bit more to find one with DP on it. But to dismiss 2560 displays just because you're too cheap to buy one is extremely biased towards your world-view, just like the rest of your posts have been.

    NVIDIA makes some very good GPUs, but so does AMD; which GPU is better/best really comes down to what you plan on doing with it. CUDA people need NVIDIA, obviously, but someone doing BTC hashing wouldn't be caught dead trying to do it on NVIDIA hardware. It's about using the right tool for the job, not about shouting the loudest every time someone offers a differing opinion.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    What a joke of a response that was. He proved the reviewer flat out lied repeatedly, and without a doubt, with a huge lying bias for amd.
    That you CAN'T address, and didn't.
    #2. Catleap is around, so show us the catleapers with 660TI or 7870...
    I've been to the forums and thejian is correct, they nearly all have 2x 680's or 2x 7970's or both. Overclock forum thread.
    #3. This is a gamers review, not bitcoin, and no cuda either, your ending paragraph is goofy as a response as well.

    I like and enjoy anand, but it would be a lot more enjoyable if people told the truth, especially in their own reviews. When the obvious bias exists, it should then at least be admitted to. It's pretty tough getting through the massively biased wording in general as well, and I don't care to go pointing it out again and again. If you can't notice it, something is wrong with you, too.

    You didn't prove thejian incorrect, not even close, but he certainly proved your fellow worker incorrect.
    When I complained about the 1920x1200 when it should be 1920x1080 just several card reviews ago, showing how nVidia won by a much larger amount in that much more common resolution, a load of the enthusiasts here claimed they own 1920x1200, there was only 1 that claimed a 2560x. LOL

    What is obvious is most of your readers and commenters don't even have a 1920x1200, and yes they whine about $5...

    So, nice try, it didn't work, and if the amd fans didn't keep lying and responding thejian wouldn't have to either... however overall it's great people DO what Russian and thejian and others do with long comments, it's way better than smart snarking and ignorant one liners and pure stupidity and grammar complaints to the reviewer.

    If people whine so much about warranty and they do, thejian has a very good point on the monitors, as well. Also they are 2560x1440, so this review doesn't address them, because they are too expensive for ANANDTECH ! In fact, we've seen how a $20 killawatt is too expensive for the anandtech site (above reviewer).

    Okay ? I'm not agreeing with you, because the facts don't fit - and the point on the cards pushing the pixels is ALSO correct in thejian's FAVOR, another portion you completely sidestepped.

    Anyway, I know it's hard, and anand is a great site and its reviewers I'm sure do their best and do a lot of good work, but facts are facts and fans are fans and fanboys should still use the facts to be a fan when making a recommendation.

    Have a good week.
  • TheJian - Friday, August 24, 2012 - link

    Thank you. You read it, and commented as such. I'm not sure he did :)

    Have a good one.
  • TheJian - Friday, August 24, 2012 - link

    You're defending buying no name monitors from what I found first looking at amazon/newegg (zero at newegg) having "just launched" their site. 3 from Korea, one New zealand. You did read the post correct? I don't give my credit card to foreign countries, or buy no name monitors (even locally) with no company websites for the resellers, gmail account for help, no phone to call, blank faq & help pages. Are you serious? If newegg and amazon don't carry it, I'd say it's not too popular no matter what it is in electronics/pc gear.

    YOU do not run your games at that res and hit 30fps very often. There are a LOT of games that will be well below (the witcher, Batman, just to name a few popular ones).

    You're still defending a position that is used by 2% of the population. That's laughable. Nice fanboy comment...The best you got? Tech report did an article on one monitor he Ebay'd (jeez), I read it and he almost had a heart attack waiting for it...Then no OSD at all...LOL. No other monitor adjustments other than bright/contrast and sound up/down. These are not the quality of a dell :) I'm sure you can find the article at techreport.com He had multiple scares ;) Roll the dice on a non brand if you'd like. I don't see a cheap anything that isn't Korean. HP starts $688, Dell 800.

    LOL too cheap to buy one?...You're wasting my time attacking me & not attacking my data at all. Keep attacking me, it only looks worse. You already stated my point in another comment. These cards are used at 1920x1200 or less and his review beats bandwidth like a dead horse. These are gaming cards, BTC hashing?...I digress...another decimal point of the population.

    You could claim bias all day, it won't help the numbers I pointed out, nor your case. You've wasted a ton of words yourself trying to convince me a few reviewers (related to this site) are in the 98%. You're the 2% or steampowered surveys are just lying eh?

    Never said AMD was crap. Stating the facts and them not going the way you want doesn't make me a fanboy..LOL. Nice try though. His own conclusions make no sense as pointed out. I see nothing above but opinion. Where I gave a book of data :) You really want to argue about their financials...Not bias there either...Just stating financial facts.

    Read it again, it beat the boost. Boost craps out at far less than 1200mhz. Read your own AT article. That post is half of his own words from two articles that make his conclusions incorrect. I can't believe you wasted all that air trying to convince me 2560x1600 and these monitors are the norm. I would expect reviewers to have them, but not to think we all do.

    Many of my posts said good things about AMD...I even said Intel owed them 20bil not 1.5. etc. I even mentioned why they're suffering now (Intel stealing sales, stopping people from buying AMD years ago when I owned a PC business, courts took far too long to decide their case). You really didn't bother to read, but rather went on a monitor rant riddled with personal attacks. Nice.
    BTC hashing...LOL. Nuff said. I discussed the games. You discuss Bitcoin hashing, and defend resolutions YOU already told the other poster isn't used by these...LOL. My world view is 98% of us shouldn't be misled by a 2% opinion. But you just keep thinking that way. ;)
  • TheJian - Sunday, August 19, 2012 - link

    "For every Portal 2 you have a Skyrim, it seems. At 1920 the GTX 660 Ti actually does well for itself here, besting the 7900 series, but we know from experience that this is a CPU limited resolution. Cranking things up to 2560 and Ultra quality sends the GTX 660 Ti through the floor in a very bad way, pushing it well below even the 7870, never mind the 7900 series."
    &
    "68fps is more than playable, but hardcore Skyrim players are going to want to stick to cards with more memory bandwidth."

    Based on my previous post regarding the hardware survey at steampowered, hardcore skyrim players don't exist I guess Ryan? Since nobody uses this res (uh oh, the 3 players using this res are about to complain...LOL), why act like it's important? Making the statement hardcore Skyrim players (in your opinion people with 2560x+ I guess?) should avoid this 660 TI, is at best bad journalism. At worst, I'm not sure, an AMD ad? Also, it's 75fps since the Zotac is FAR more accurate compared to what you BUY for $299/309 at newegg. For both prices you can get a card that is 100mhz faster than the one in REF GREEN in your graphs (your ref version). I'd argue 75fps (even 68.5) at a res nobody plays at is really good. Since when is 75fps unplayable? Never mind the fact I think this res is useless and you should be realizing most are using (e.g. most hardcore users) 1920x1200 or below and you're actually better off with Nvidia in this case...ROFL. For the res I think this card is designed for, it's the best out there and your review should clearly reflect that. The 7950 BOOST edition can't be had for less than $350 and barely be had at all. Never mind the watts/heat issues.

    It's arguable that "hardcore players" could get away with anything in the list but the 560ti as they all hit over 83fps in what you've already stated is a cpu bound res. What evidence do you have that show more than 2% of users in the world use a res over 1920x1200? I'd say steampowered stats are a pretty good representation of what gamers are using and 2560x+ is NOT what they're using unless you have evidence to prove otherwise? Use more games in your tests to show variations rather than resolutions none (meaning 98% apparently) are using. Again I'd say a separate article should be written for the highest resolutions and mutli-monitor gaming, but using it as a basis for recommendations in std consumer cards is ridiculous. I'd rather see 15 games tested (easier to make sure you're avoiding the ones everyone Optimizes for and are benchmarked everywhere), for a better look at overall play across what 98% of us are using.

    This brings your whole conclusion into question, which it seems is totally based on running at 2560x+. Raise your hand if you run at this res or above? I see the same 3 people...LOL.

    "Coupled with the tight pricing between all of these cards, this makes it very hard to make any kind of meaningful recommendation here for potential buyers."
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    Core Clock: 1019MHz Boost Clock: 1097MHz vs. your ref at 915/980. They are selling a CORE BASE that's above your REF BOOST...for $299. What's a virtual launch when 12 cards are available at newegg.com, and only 1 7950 Boost at $350?
    Borderlands 2 free with it also, and another at $309 with basically same specs:
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    1006base/1084 boost. Again real close to your Zotac Amp for $309. So looking at the AMP basically as a BASE at $299/309 (it's only 14mhz faster in both base/boost clocks, which is nothing - not sure why they even sell it) let's fix your conclusion based on 98% of users res:

    Zotac AMP (only 14mhz faster base/boost than $299/309 cards linked above) vs. 7950 (again more expensive by $20) @ 1920x1200
    Civ5 <5% slower
    Skyrim >7% faster
    Battlefield3 >25% faster (above 40% or so in FXAA High)
    Portal 2 >54% faster (same in 2560x...even though it's useless IMHO)
    Batman Arkham >6% faster
    Shogun 2 >25% faster
    Dirt3 >6% faster
    Metro 2033 =WASH (ztac 51.5 vs. 7950 51...margin of error..LOL)
    Crysis Warhead >19% loss.
    Power@load 315w zotac amp vs. 353 7950 (vs 373w for 7950B)! Not only is the 660TI usually faster by a whopping amount, it's also going to cost you less at the register, and far less at the electric bill (all year for 2-4 years you probably have it - assuming you spend $300-350 for a gaming card to GAME on it). Note the AMP is about as bad as the 660 TI watts/heat/noise can get.

    For $299 or $309 I'll RUN home with the 660 TI over 7950 @ $319. The games where it loses, you won't notice the difference at those frame rates. At todays BOOST prices ($350) there really isn't a comparison to be made. I believe it will be a while before the 7950B is $320, let along $299 of the 660 TI. This card DOMINATES at 1920x1200 & below which according to steampowered hardware survey is what 98% of us use for a resolution. So 98% of you have a no brainer recommendation for this card...There...FIXED.

    I own a Radeon 5850 (and waited 7 months for it like the other 470 amazon buyers on back order as they tried to get us to drop our orders by making a new xfx model#) ...My bad, in my other post I put 8850...ROFL...You can google the amazon complainers if wished or you doubt that I own one... :) Just did-google this, you'll land in the complaints:
    "jian amazon backorder 5850" (without the quotes)
    Top of the listed links will get you to the backorder complaints for the card...LOL. I got a card eventually, so don't go giving me that AMD hate crap. Just the facts :) But you can guess what I'll buy this black Friday :) Because the 660TI is awesome, just like my 5850 for $260 was. Unless you're planning on running above 1920x1200 any time soon, you're retarded buying anything but 660TI at the $300 price range (including the 670+, save your money). Heat, Noise, watts...NO BRAINER. 660TI even at Zotac AMP heat/noise/watts. Perf at the resolutions 98% of us use...NO BRAINER. IF you're dickering over $20 (as ryan is in his recommendation of them all being close together) then you don't have the cash for 3 monitors and triple wide gaming either. IF you DO have 3 monitors (likely a Quadcore also), surely you can afford TWO of these and rock the house no matter what you play. Again, though, that's a 2% user base I'm talking about here. You should rewrite your conclusion Ryan. It's baseless currently. Running your LCD in a res not native? Really? I'm kind of offended Ryan, I think you just called me NOT a hardcore gamer...LOL. :)

    One more note: Mutli monitor resolutions at steampowered @ 2560x1600 and below are less than 7% (add up the list!). So again, Ryan you're not making sense. Most people running this res and above have more than one monitor and probably have more than one card to do it. Note it's a 2% user group any way you cut it, and even less when you consider these mostly have more than one monitor and card. I doubt the people you wrote the conclusion for (it seems) are worried about $20.
  • skgiven - Sunday, August 19, 2012 - link

    Although 192 bits is obviously a step down from 256, most games won't be overly impacted even on PCIE2 setups. For those that are, if you go to a PCIE3 setup the 192bit limitation largely disappears; basically PCIE3 bandwidth is twice as fast as PCIE2. So for example, if you have a PCIE3 capable 1155 motherboard and pull an i7-2600 and replace it with an i7-3770 (similar CPU performances) the bandwidth effectively doubles and would be equivalent to 384 at PCIE2. Obviously that would be a fairly pointless upgrade in terms of CPU performance but Intel's 'sly' control is paying off; you have to upgrade your CPU to benefit your GPU. An i7-2600 or similar is still a much sought after CPU, so they are readily salable, making the 'upgrade' reasonably inexpensive. However the LGA1155's are very limited boards, and adding a second card would drop you from x16 to x8 rates for both cards, albeit at PCIE3 performance. So if bandwidth is likely to be a problem for any game now or in the future and you're on an LGA1155 just get a bigger card (670/680) rather than going SLI. Adding a second card on a PCIE2 rig could be a really stupid move.
  • Ryan Smith - Sunday, August 19, 2012 - link

    Skgiven, I'm afraid you've been misinformed.

    "the bandwidth effectively doubles and would be equivalent to 384 at PCIE2"

    The speed of the PCIe bus is in no way connected to the width (or overall speed) of the local memory bus on a video card. The local memory bus is an entirely different and otherwise isolated subsystem.

    While PCIe 3.0 may improve a video card's performance - though for single card scenarios we have not found any games that meaningfully benefit from it - any improvement would be due to improved transfer performance between the video card and CPU. If rendering performance was being constrained by memory bandwidth, then it would continue to be so.
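    To put some rough numbers on just how separate the two buses are, here's a quick back-of-the-envelope comparison (approximate spec-sheet math, not measured figures):

    ```python
    # Approximate, spec-sheet math comparing the two buses (both figures per direction).
    # Local memory bandwidth = bus width (bits) x effective GDDR5 data rate (GT/s) / 8.
    gtx660ti_vram_gb_s = 192 * 6.008 / 8    # ~144 GB/s on the card's own 192-bit bus
    gtx670_vram_gb_s   = 256 * 6.008 / 8    # ~192 GB/s on the GTX 670's 256-bit bus

    # PCIe x16 payload bandwidth = lanes x per-lane rate after encoding overhead.
    pcie2_x16_gb_s = 16 * 0.500             # ~8 GB/s   (5 GT/s, 8b/10b encoding)
    pcie3_x16_gb_s = 16 * 0.985             # ~15.8 GB/s (8 GT/s, 128b/130b encoding)

    print(f"{gtx660ti_vram_gb_s:.0f} GB/s VRAM vs {pcie3_x16_gb_s:.1f} GB/s PCIe 3.0 x16")
    # Doubling the PCIe link still leaves it roughly an order of magnitude short of the
    # local memory bus, which is why PCIe 3.0 cannot stand in for a wider memory bus.
    ```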
  • TheJian - Sunday, August 19, 2012 - link

    Just realized this uses CryEngine 2.0 from 2008. So the only real loser for Nvidia here is from 2008. What happens when you run today's CryEngine 3.0 from Crysis 2? As in a game released March 2011 with DirectX 11, and has even had the HIGH RES patch released which adds: DirectX 11 hardware tessellation & Ultra Upgrade adds soft shadows with variable penumbra, improved water rendering, particle motion blur and shadowing, Parallax Occlusion Mapping, full-resolution High Dynamic Range motion blur, & hardware-based occlusion culling.

    "The test run apply is stringent, harsh and really only suited for high-end DX 11 class graphics cards of 2011 and 2012. "
    from Guru3d Radeon 7950 BOOST article vs a ref clocked 660TI (far slower than AMP):
    http://www.guru3d.com/article/radeon-hd-7950-with-...
    Umm...even at 2560x1600 the 7950 wins by scoring 35fps vs REF CLOCKED 660 TI 34fps.
    Meaning the 660 TI's in this review would CRUSH it. Update your game instead of calling crysis a thorn in NVidia's side and showing a 20% loss for warhead from 2008. You're question should have been "Can it run Crysis 2 DX11 with updated High res patch with all the goodies turned on?". Answer...Yes, as fast as a 7950 BOOST at 2560x1600 and faster below this res (albeit by the same 1fps margin...LOL...Crysis 2 is a WASH for 7950Boost vs. 660TI REF, but a LOSER for the two cards I linked to before at newegg for $50 less than 7950BOOST). Also note it clearly beats $319 7950 regular at any res already in crysis2. As I read further at guru3d, it's clear you need more games. Lost planet2 at 2560x1600 7950boost loses by 20% (40 to 48 vs. ref clocked TI, not vs AMP or two cards I linked). Aliens vs. Predator shows 7950boost beating 660TI by more than 20% (dx11), again showing why more games should be tested, mind you this is at 1920x1200!...LOL. I'm not afraid to show where NV 660 gets capped...ROFL. 54fps 7950vs 40fps 660TI. No need to show 2560 settings if you pick better dx11 group of games (is anyone playing warhead 4yrs later with crysis 2/hi res out?). It goes on and on, even at tomshardware. You need more games. Note CRYSIS 3 will be using the CryEngine 3 with basically all the goodies released in the Crysis 2 update/hires download patches. So highly relevant.

    Worse, after I keep looking at both, and the specs on 7950's at newegg, you can get a 7950 that seems to put AMD's own new Boost speeds to shame at 900mhz:
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    XFX with 900 CORE for $329 (rebated), and another for the same after rebate also:
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    Perhaps you should test what is sold and easily had rather than AMD's version? Though I'm not sure they boost any higher than normal in either case. Only the NV cards showed the core & boost speeds. Obviously power draw and heat would be worse than your review though. I'd still rather see these benchmarked than AMD's ref 7950 design. It's clear they clocked it too low when you can pick up a 900mhz version for $330 after rebate (though only two, rest are $350). Then again, maybe they didn't want to show worse in heat/noise/watts dept.

    This still doesn't change my "review" (LOL) of your conclusion though. 2560x1600+ is NOT what people are running, and the 660 TI is still awesome at 1920x1200 and below for $300 and can be had at that price far above your ref card reviewed here (as I just proved, the same can be said for clocks at 900 core on AMD, but $330 is still 10% higher than $300 and all the heat/noise/watts issues still apply, only worse).

    You started your review of the benchmarks with this (though after crysis warhead instead of crysis 2):
    "For a $300 performance card the most important resolution is typically going to be 1920x1080/1200"
    You should have based your conclusion on that statement alone. It's TRUE. I already proved it from Steampowered hardware survey (30% use these two resolutions today! over ALL cards). Throw out crysis warhead for crysis 2 w/updates and your conclusion should have been very easy to make.

    For that extra $30-50 a 7950 boost costs you can make sure you get an Ivy Bridge K chip (both are only $20 more than regulars i5 or i7) and have ultimate overclocking on your cpu when desired. You can overclock either card already for free (amd or nvidia). By the end of the article I think you forgot what the heck you were reviewing. Two cards battling it out at 1920x1200. Your analysis after each benchmark seems to indicate you thinking these are 2560x1600 cards and that's the most important thing to remember here. Nope. It's NOT, by your own words earlier it's really 1920x1200/1080. Along with conclusions being off, and games being too few (and old totally & out of date in Crysis Warhead's case), you should have put in a 900 amd clocked 7950 (you could have easily run the one you had at that speed to show what you can buy for $330). Who would piss away 10%+ clockspeed on AMD's ref version when you can get multiple models after rebate at $10 more? $319 is the lowest 7950 and only at 800mhz, and a 900mhz version can be had for $10 more after rebate in 2 models. While AMD may have wanted it benchmarked this way for heat/noise I think most would buy the 900mhz version. Maybe you should just rethink the whole idea of benching their ref versions altogether when they don't represent a real purchase at ref prices.

    In the end though, this is just a misleading review currently, no matter how I cut it. Further, the 7950 (boost or not) just isn't worth $330 or $350. It's hot and noisy, and uses more watts for a heck of a lot of LOSING in the benchmarks by LARGE margins. You guys are getting like Tomshardware, who blew their review by reducing overclocked cards to ref speeds (they had the $300 MSI 660 TI I linked to at newegg @ core 1019/boost 1097 in their hands and didn't use it...ROFL). Why the heck even mention you have them if you're going to reduce every one of them before benching them (amd or nvidia - they reduced all...LOL)? Where do you go for a GOOD review these days? Consider that if you bounce over to Tomshardware review people. Small print shows they reduce everything to ref in the test setup page...ROFLMAO...jeez. Would you review a Ferrari at 55-65mph because that's the speed limit? Heck no. I wouldn't tape myself driving on the street at 200mph, but I'd sure test it there without taping myself if I was benchmarking cars...LOL. I'd rather see reviews based on cards you can purchase at the best speeds at the best pricing in both brands (amd/NV) when they release a new product. Include their ref speeds, but also include what we'll buy. In AMD's case here, you have to work to buy a 7950@800mhz (the only card at newegg for $319!). You'd have to ignore the same pricing at 850-900mhz next to the rest at $329+. Who does that? Heck most are at 350, with the two I mention at 900mhz being rebated to $330...LOL. What would you buy, assuming you wanted a 7950 disregarding heat/noise/watts used and the performance at it's intended 1920x1200 res (meaning at this point, you'd have to be an AMD fanboy - which I kind of admit I am...LOL)? You'd buy the 900mhz for $330 after rebate. If you want to ignore rebated products, the same would be true of my conclusions. $309 can get you a clock of almost Zotac 660 TI AMP speed as shown already. 1019core/1097boost. For anyone who likes paperwork you'd get the same card for $10 off. Ignoring the rebate it's still a no brainer. I used to have to read 3 reviews to get a good picure of a product, but these days I have to read a good 10 reviews to get a TRUE picture with all the review shenanigans going on.
  • Will Robinson - Sunday, August 19, 2012 - link

    My goodness the NVDA trolls are here in force on this launch.
    That immediately points to Green Team fear that this card just isn't going to cut it against the HD7950.
    Cerise Cogburn (aka Silicon Doc) has been banned from just about every tech site on the Net.
    Back for some more trolling before the ban Doc?
  • claysm - Monday, August 20, 2012 - link

    I absolutely agree. This Cerise Cogburn loser and his friend TheIdiot (oops, TheJian, stupid autocorrect) have been trolling harder than I've seen in a long time. Go home guys, you've contributed nothing.
