Power, Temperature, & Noise

Last but certainly not least, we have our obligatory look at power, temperature, and noise. Next to price and performance, these are among the most important aspects of a GPU, in large part because of the noise component; all things considered, a loud card is undesirable unless there’s a sufficiently good reason to put up with the noise.

It’s for that reason that GPU manufacturers also seek to keep power usage down, and under normal circumstances there’s a pretty clear relationship between power consumption, the heat generated, and the amount of noise the fans will generate to remove that heat. At the same time, however, this is an area NVIDIA is focusing on for Titan: a premium product means they can use premium materials, going above and beyond what more traditional plastic cards can do for noise dampening.

GeForce GTX Titan Voltages
Titan Max Boost: 1.1625v
Titan Base: 1.012v
Titan Idle: 0.875v

Stopping quickly to take a look at voltages, Titan’s peak stock voltage is 1.1625v, which corresponds to its highest speed bin of 992MHz. As the clockspeeds step down these voltages drop as well, to a load low of 0.95v at 744MHz. This ends up being a bit less than the GTX 680 and most other desktop Kepler cards, which go just a bit higher to 1.175v. Since NVIDIA is classifying 1.175v as an “overvoltage” on Titan, it looks like GK110 isn’t going to be quite as tolerant of higher voltages as GK104 was.
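
To put the bin and voltage relationship in more concrete terms, each 13MHz speed bin carries its own voltage, stepping down from 1.1625v at the 992MHz bin toward 0.95v at 744MHz. The sketch below simply interpolates between those two measured endpoints; the intermediate values are illustrative placeholders rather than NVIDIA’s actual VID table.

    # Hypothetical sketch of Titan's speed-bin-to-voltage relationship.
    # Only the 992MHz/1.1625v and 744MHz/0.95v endpoints come from our
    # measurements; everything in between is linearly interpolated.
    BIN_STEP_MHZ = 13
    MAX_BOOST_MHZ, MAX_BOOST_V = 992, 1.1625
    MIN_LOAD_MHZ, MIN_LOAD_V = 744, 0.95

    def voltage_for_clock(clock_mhz):
        """Linearly interpolate a load voltage for a given core clock."""
        clock_mhz = max(MIN_LOAD_MHZ, min(MAX_BOOST_MHZ, clock_mhz))
        frac = (clock_mhz - MIN_LOAD_MHZ) / (MAX_BOOST_MHZ - MIN_LOAD_MHZ)
        return MIN_LOAD_V + frac * (MAX_BOOST_V - MIN_LOAD_V)

    # Walk down from the top bin in 13MHz steps and print each bin's voltage
    clock = MAX_BOOST_MHZ
    while clock >= MIN_LOAD_MHZ:
        print(f"{clock}MHz -> {voltage_for_clock(clock):.4f}v")
        clock -= BIN_STEP_MHZ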

GeForce GTX Titan Average Clockspeeds
Max Boost Clock 992MHz
DiRT:S 992MHz
Shogun 2 966MHz
Hitman 992MHz
Sleeping Dogs 966MHz
Crysis 992MHz
Far Cry 3 979MHz
Battlefield 3 992MHz
Civilization V 979MHz

One thing we quickly notice about Titan is that, thanks to GPU Boost 2.0 and the shift from what was primarily a power based boost system to a temperature based boost system, Titan hits its maximum speed bin far more often and sustains it more often too, especially since there’s no longer a concept of a power target with Titan; any power limits are based entirely on TDP. Half of our games have an average clockspeed of 992MHz, or in other words they never triggered a power or thermal condition that would require Titan to scale back its clockspeed. For the rest of our tests the worst average clockspeed was all of 2 bins (26MHz) lower at 966MHz, the result of hitting a mix of both thermal and power limits.

On a side note, it’s worth pointing out that these are well in excess of NVIDIA’s official boost clock for Titan. With Titan’s boost bins being based almost entirely on temperature, the average boost speed for Titan is going to be more dependent on environmental (intake) temperatures than GTX 680’s was, so our numbers are almost certainly a bit higher than what one would see in a hotter environment.
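
For those curious how the averages in the table above come together, they’re simply a matter of sampling the core clock throughout each benchmark run and averaging the samples. Below is a minimal sketch of that bookkeeping, assuming a hypothetical CSV log from whatever monitoring tool is used; the file name and column name are placeholders, not a specific tool’s output format.

    import csv

    def average_clock(log_path, column="core_clock_mhz"):
        """Average the sampled core clock over a benchmark run."""
        samples = []
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):
                samples.append(float(row[column]))
        return sum(samples) / len(samples) if samples else 0.0

    # e.g. average_clock("bf3_run.csv") -> 992.0 for a run that never left the top bin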

Starting as always with a look at power, there’s nothing particularly out of the ordinary here. AMD and NVIDIA have become very good at managing idle power through power gating and other techniques, and as a result idle power has come down by leaps and bounds over the years. At this point we still typically see some correlation between die size and idle power, but that’s a few watts at best. So at 111W at the wall, Titan is up there with the best cards.

Moving on to our first load power measurement, as we’ve dropped Metro 2033 from our benchmark suite we’ve replaced it with Battlefield 3 as our game of choice for measuring peak gaming power consumption. BF3 is a demanding game to run, but overall it presents a rather typical power profile, which of all the games in our benchmark suite makes it one of the best representatives.

In any case, as we can see, Titan’s power consumption comes in below all of our multi-GPU configurations, but higher than any other single-GPU card. Titan’s 250W TDP is 55W higher than GTX 680’s 195W TDP, and with a 73W difference at the wall this isn’t too far off. A bit more surprising is that it’s drawing nearly 50W more than our 7970GE at the wall, given the fact that we know the 7970GE usually gets close to its 250W TDP. At the same time, since this is a live game benchmark, there are more factors than just the GPU in play. Generally speaking, the higher a card’s performance here, the harder the rest of the system has to work to keep said card fed, which further increases power consumption at the wall.
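
As a rough sanity check on those numbers, a DC-side difference gets inflated on its way to the wall by PSU efficiency losses, on top of whatever extra work the rest of the platform is doing. The figures below (88% PSU efficiency, ~10W of additional platform load) are assumptions for illustration rather than measured values, but they show how a 55W TDP gap can plausibly become a ~73W gap at the wall.

    def wall_power_delta(dc_delta_w, psu_efficiency=0.88, extra_platform_w=10):
        """Estimate the at-the-wall difference implied by a DC-side power difference."""
        return (dc_delta_w + extra_platform_w) / psu_efficiency

    print(f"~{wall_power_delta(55):.0f}W at the wall")  # ~74W, in line with the 73W we measured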

Moving to FurMark our results keep the same order, but the gap between the GTX 680 and Titan widens, while the gap between Titan and the 7970GE narrows. Titan and the 7970GE shouldn’t be too far apart from each other in most situations due to their similar TDPs (even if NVIDIA and AMD don’t calculate TDP in quite the same way), so in a pure GPU power consumption scenario this is what we would expect to see.

Titan for its part is the traditional big NVIDIA GPU, and while NVIDIA does what they can to keep it in check, at the end of the day it’s still going to be among the more power hungry cards in our collection. Power consumption itself isn’t generally a problem with these high end cards, so long as the system housing the card has the means to cool it without generating too much noise in doing so.

Moving on to temperatures, for a single card idle temperatures should be under 40C for anything with at least a decent cooler. Titan for its part is among the coolest at 30C; its large heatsink combined with its relatively low idle power consumption makes it easy to cool here.

Because Titan’s boost mechanisms are now temperature based, Titan’s temperatures are going to naturally gravitate towards its default temperature target of 80C as the card raises and lowers clockspeeds to maximize performance while keeping temperatures at or under that level. As a result just about any heavy load is going to see Titan within a couple of degrees of 80C, which makes for some very predictable results.
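
Conceptually, the temperature-target behavior amounts to a simple control loop: back off a bin whenever the GPU is over its 80C target (or its TDP), and step back up when there’s headroom. The sketch below is our own illustration of why load temperatures converge on 80C, not NVIDIA’s actual firmware logic; the current temperature and board power readings are simply passed in as inputs.

    TEMP_TARGET_C = 80
    TDP_W = 250
    BIN_MHZ = 13
    MAX_BOOST_MHZ = 992
    BASE_MHZ = 837  # GTX Titan's base clock

    def next_clock(clock_mhz, temp_c, power_w):
        """Pick the next speed bin from the current temperature and power readings."""
        if temp_c > TEMP_TARGET_C or power_w > TDP_W:
            return max(BASE_MHZ, clock_mhz - BIN_MHZ)   # over target: drop one bin
        return min(MAX_BOOST_MHZ, clock_mhz + BIN_MHZ)  # headroom: step up one bin

    # e.g. next_clock(992, 81, 230) -> 979, dropping a bin once the 80C target is exceeded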

Looking at our other cards, while the various NVIDIA cards are still closely grouped, the 7970GE ends up being quite a bit cooler due to its open air cooler. This is typical of what we see with good open air coolers, though with NVIDIA’s temperature based boost system I’m left wondering if perhaps those days are numbered. So long as 80C is a safe temperature, there’s little reason not to gravitate towards it with a system like NVIDIA’s, regardless of the cooler used.

Load GPU Temperature - FurMark

With FurMark we see everything pull closer together, as Titan holds fast at 80C while most of the other cards, especially the Radeons, rise in temperature. At this point Titan is clearly cooler than a GTX 680 SLI setup, 2C warmer than a single GTX 680, and still a good 10C warmer than our 7970GE.

Idle Noise Levels

Just as with the GTX 690, one of the things NVIDIA focused on with Titan was construction choices and materials that reduce the amount of noise generated. So long as you can keep noise down, then for the most part power consumption and temperatures don’t matter.

Simply looking at idle shows that NVIDIA is capable of delivering on their claims. At 37.8dB Titan is the quietest actively cooled high-end card we’ve measured yet, besting even the luxury GTX 690 and the also well-constructed GTX 680. Though really, with the loudest setup coming in at all of 40.5dB, none of these setups is anywhere near loud at idle.

It’s with load noise that we finally see the full payoff of Titan’s build quality. At 51dB it’s only marginally quieter than the GTX 680, but as we recall from our earlier power data, Titan is drawing over 70W more than the GTX 680 at the wall. In other words, despite the fact that Titan is drawing significantly more power than the GTX 680, it’s still as quiet as or quieter than that card. This, coupled with Titan’s already high performance, is Titan’s true strength in NVIDIA’s eyes; it’s not just fast, but despite its speed and despite its TDP it’s as quiet as any other blower based card out there, allowing NVIDIA to get away with things such as the Tiki and tri-SLI systems with reasonable noise levels.

Much like what we saw with temperatures under FurMark, noise under FurMark has our single-GPU cards bunching up. Titan goes up just enough to tie the GTX 680 in our pathological scenario, meanwhile our multi-GPU cards shoot up well past Titan, while the 7970GE jumps to just shy of Titan. This is a worst case scenario, but it’s a good example of how GPU Boost 2.0’s temperature functionality means that Titan quite literally keeps its cool, and thereby keeps its noise in check.

Of course we would be remiss not to point out that in all these scenarios the open air cooled 7970GE is still quieter, and in our gaming scenario by quite a bit. Not that Titan is loud, but it doesn’t compare to the 7970GE. Ultimately we get to the age-old debate between blowers and open air coolers; open air coolers are generally quieter, but blowers allow for more flexibility with products, and are more forgiving of cases with poor airflow.

Ultimately Titan uses a blower so that NVIDIA can do concept PCs like Tiki, something an open air cooler would never be suitable for. For DIY builders the benefits may not be as pronounced, but this is also why NVIDIA is focusing so heavily on boutique systems, where the space difference really matters. Realistically speaking, AMD’s best blower-capable card is the vanilla 7970, a less power hungry but also much less powerful card.

Comments
  • CeriseCogburn - Saturday, February 23, 2013 - link

    $800 or $900 dollars is close enough to a grand that it seems silly.

    Two 7970's at the $579 release and months long price is nearer $1200, and we have endless amd fanboy braggarts here claiming they did the right thing and went for it or certainly would since future proof and value is supreme.

    Now not a single one has said in this entire near 20 pages of comments they'd love to see the FUTURE PROOF ! of the 6 GIGS of ram onboard...
    Shortly ago it was all we ever heard, the absolute reason the 79xx series MUST be purchased over the 600 nVidia series...

    ROFL - the bare naked biased stupidity is almost too much to bear.

    Now the "futureproof" amd cards the crybaby liars screeched must be purchased for the 3G of ram future wins, ARE LOSERS TO THIS NEW TITAN PERIOD, AND FOREVERMORE.

    I guess the "word" "futureproof" was banned from the techtard dictionary just before this article posted.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Thank you nVidia, 3 monitors, and a 4th, ultra rezz 5,760 x 1,080, and playable maxxed !

    ROFL -

    Thank you all the little angry loser fanboys who never brought this up over 22 pages of ranting whines.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    " After a few hours of trial and error, we settled on a base of the boost curve of 9,80 MHz, resulting in a peak boost clock of a mighty 1,123MHz; a 12 per cent increase over the maximum boost clock of the card at stock. "
    http://www.bit-tech.net/hardware/2013/02/21/nvidia...
    That's 27mhz according to here...

    LOL

    Love this place.
  • TheJian - Sunday, February 24, 2013 - link

    Here's why they don't test more of the games I mentioned previously and others:
    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
    Crysis 2, with DX11 & HIRES pack added @1920x1200 it beats 3 radeons...Note you have to go to a game where NV doesn't care (warhead) to show it so badly. C2 shows much closer to 2 or 3 radeons than warhead which I don't think NV has spent a second on in probably 4 years.

    Page 11 has Diablo 3 scores.

    Page 4 for AC3
    Assassins Creed 3, beats 1,2 or 3 Radeon 7970's at all tested resolutions...ROFL
    http://techreport.com/review/24381/nvidia-geforce-...
    Showing the same 20fps diff at 2560x1600, and showing same CF crapout, losing to single 7970 even in both websites. Clearly AMD has per game problems. Which they allude to on page 16 of the review:
    "Just know that waiting for driver updates to fix problems has become a time-honored tradition for owners of CrossFire rigs."

    techpowerup.com titan review page 5
    Batman Arkham City, same story...You see this is why SLI/CF isn't all it's cracked up to be...Every game needs work, and if company X doesn't do the work, well, AC3/Bat AC etc is what happens...Crysis 2 seems almost the same also.

    techpowerup.com titan article page 8
    COD Black ops2, 2 titans handily beat 1/2/3 7970's.

    techpowerup page 13:
    F1 2012...ROFL, 1 titan to rule all cards...1/2/3 CF or SLI all beaten by ONE CARD. It seems they limit the games here for a reason at anandtech...Always pitching how fast two 7970's is in this article vs a titan, even though they half recommend ONE titan but 'not at these prices, dual cards will always win'.
    ...ummm, I beg to differ. It should win, if drivers are done correctly, but as shown not always.

    Note at anandtech, dirt showdown shows 3% for NV Titan vs. 7970ghz, but if you run the FAR better Dirt3:
    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
    It's a ~20% win for Titan vs. 7970ghz. Crappy showdown game picked for a reason?

    Wait we're not done...
    techpowerup titan review page 15
    Max Payne3, 1 titan closing on 2 or 3 radeon 7970ghz's no matter the res...Not always good to get more cards I guess?

    techpowerup.com page 18 for starcraft 2
    Oh, snap...This is why they don't bench Starcraft 2...ROFL...1, 2 or 3 7970, all beat by 1 titan.
    But then, even a GTX 680 beats 3 7970's in all resolutions here...Hmmm...But then this is why you dropped it right? You found out a 680 beat 7970ghz way back here, even the 670 beat 7970ghz:
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    Totally explains why you came up with an excuse shortly after claiming a patch broke the benchmark. Most people would have just run with the patch version from a week earlier for the 660ti article. But as bad as 7970ghz lost to 670@1920x1200 it was clear the 660TI would beat it also...LOL. Haven't seen that benchmark since, just a comment it would be back in the future when patch allowed...NOPE. It must really suck for an AMD lover to have to cut out so many games from the new test suite.

    techpowerup.com titan review page 7
    Crap, someone benched Borderlands 2...LOL...Almost the same story, a titan looks good vs. 3 7970's (only loses in 5760x1080 which the single card isn't really for anyway).
    Again, proving adding more cards in some cases even goes backwards...LOL. It shouldn't, but you have to have the money to fix your drivers. Tough to do cutting 30% of your workforce & losing 1.18B.

    techpowerup page 20 titan article has WOW mists of pandaria.
    Dang those techpowerup guys, They had the nerve to bench the most popular game in the world. WOW Mists of Pandaria...Oh snap, 1 titan beats 3 7970's again, at all res. OUCH, even a SINGLE 680 does it...See why they don't bench other games here, and seem to act as though we all play pointless crap like warhead and Dirt3 showdown? Because if you bench a ton of today's hits (anything in 2012) except for a special few, you'll get results like techpowerup.

    That's ELEVEN, count them, 11 situations that kind of show a LOT MORE of the picture than they do here correct? I just wish I knew if they used mins or max at techpowerup (too lazy to ask for now), but either way it shows the weakness of multi-card setups without proper driver dev. It also shows why you need a wide range of TODAY's games tested for an accurate picture. Anandtech has really begun to drop the ball over the years since ryan took over card reviews. These games just add to the missing latency discussion issues that affect all radeons and are still being fixed on a GAME BY GAME basis. The driver fix doesn't affect them all at once. The last driver fixed 3 games (helped anyway), and every other game seems to need it's own fix. BUMMER. Ryan totally ignores this discussion. Techreport has done quite a few articles on it, and cover it in detail again in the titan review. PCper does also.

    Same Techpowerup article (since this site is puking on my links calling it spam) pg 19
    Skyrim, with all 3 radeon's at the bottom again. 1, 2 & 3 7970's beaten by ONE TITAN! So I guess 11 situations Ryan ignores. Does this make anyone take another look at the conclusions here on anandtech?
    PCper titan article shows the same in skyrim.
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    I kind of see why you dropped skyrim, even in your own tests at 1920x1200 670 was beating 7970ghz also, so even after 13.11 you'll still likely have a loss to 680 as shown at the other two links here, this was 4xmsaa too, which you complained about being a weakness in the 660ti article if memory serves...This kind of score short circuits comments like that huh? I mean 580/670 & 680 all pegged at 97.5fps clearly cpu bound not a memory issue I think, since all radeons are below 86. Well, shucks, can't have this benchmark in our next suite...ROFL. Anyone seeing a pattern here?

    Want more bias? Read the 660TI review's comments section where Ryan and I had a discussion about his conclusions in his article...ROFL. The fun starts about page 17 if memory serves (well not if you list all comments, diff page then). I only had to use HIS OWN benchmarks for the most part to prove his conclusioins BS in that case. He stooped so low as to claim a 27in monitor (bought from ebay...ROFL, even amazon didn't have a seller then, which I linked to) was a reason why 660ti's bandwidth etc sucked. Enthusiasts buy these apparently (cheapest was $500 from korea, next stop was $700 or so). Of course this is why they leave out mins here, as they would hit TEENS or single digits in that article if he posted them. All of the games he tested in that article wouldn't hit 30fps at 2560x1600 on EITHER amd or nv on a 66t0i. So why claim a victor?

    What about Crysis 3? Titan at or near top:
    http://www.guru3d.com/articles_pages/crysis_3_grap...
    Note he's telling you 40min, and really need 60 for smooth gameplay throughout as he says he uses avg. Also note at 2560x1600 with everything on, 7970/ 680 won't be playable as he's only avg 30. But see the point, only WARHEAD sucks on NV. But as show before nobody plays it, as servers are empty. 7970 wins over 680 by 20% in ryans warhead tests. But as soon as you go Crysis 2 dx11/hires textures or Crysis 3 it's suddenly a tie or loss.
    Page 8 in the same article
    Note the comment about 2560x1600, dipping to 25 or so even on gtx 680, and only fastest cards on the planet handle it fine:
    "At 2560x1600 with Very High Quality settings only the most expensive cards on the globe can manage. Please do bear in mind that our tests are based on averages, so YES there will be times your FPS drops to 25 fps in big fire fights and explosions, even with say a GTX 680."
  • TheJian - Sunday, February 24, 2013 - link

    Sorry, this site pukes on a certain amount of links, so I had to change them all to just page refs for the most part: 2nd part here :)
    Ryans warhead comment from this article: "In the meantime, with GTX 680’s LANGUID performance, this has been a game the latest Radeon cards have regularly cleared. For whatever reason they’re a good match for Crysis, meaning even with all its brawn, Titan can only clear the 7970GE by 21%."
    No Ryan, just in this game...Not crysis 2 or 3...LOL. He gives yet another dig in the same page, because this 5yr old game is major important even though nobody plays it:
    "As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered."

    Jeez, if you'd just put down the 5yr old game and get with the times (crysis 2 or 3 will do Ryan or any of the games above, what 11 of them I gave?), you'll find the only LANGUID performer is AMD. So Titan is a gen behind if you believe him on all CRYSIS games? If NV is a gen behind, how come nobody else shows this in Cryis2 DX11/Hires pack, or Crysis 3? Oh, that's right, NV actually optimizes for games that are less than 5yrs old...ROFL. Honestly I don't think AMD has done anything on their driver for warhead for 5yrs either...They just happen to be faster in a 5yr old game. :) And NV doesn't care. Why would they with the non-existent player base shown above on servers? Is Cryengine 2 being used in a bunch of games I don't know about? Nope, just WARHEAD. I've never heard of the other 3 on the list, but crysis 1 is not quite the same engine and as shown above performs quite well on kepler(1fps difference on 680vs7970ghz @1920x1200) same for crysis 2 & 3. Only warhead sucks on kepler.
    Search wikipedia.org for CryEngine
    You can do this for any game people, to find out what is out now, and what is coming. Look up unreal 3 engine for instance and take a gander at the # of games running it vs. Warhead.
    search wikipedia.org for List of Unreal Engine games
    Complete list of u3 base games there.

    http://techreport.com/review/24381/nvidia-geforce-...
    Guild Wars 2, Titan beating single 7970 & 7970CF at 2560x1600 by a lot...Another ignored game with 3mil sold. Titan is beating CF 7970 by ~20%. OUCH.

    http://www.guru3d.com/articles_pages/crysis_3_grap...
    Reminder, crysis 3 2560x1600 680gtx (that languid card on warhead according to Ryan) TIES 7970ghz in guru3d's benchmarks. Mind you, neither can run there as it's 30fps for both. You'll dip to 10-20fps...ROFL. But point proven correct? RYAN is misrepresenting the facts. Unless you play 3 gen old warhead instead of crysis2 or crysis 3 (or even updated crysis 1 now on cryengine3 according to the site, probably why it does well on kepler too)? Who does that? You still play serious sam1 or far cry 1 too? Still playing doom1?

    Is that 14 games I'm up to now? That's a lot of crap you couldn't use in the new suite huh?

    http://www.anandtech.com/show/6159/the-geforce-gtx...
    The comments section for Ryan's 660ti article. Realizing what I said above, go back and read our conversation. Read as he attempts to defend the bias conclusions in that article, and read the data from his OWN article I used then to prove those cards were made for 1920x1200 and below, not 2560x1600 or 2560x1440 as Ryan defended. Look at the monitor he was pitching and me laying out how you had to EBAY it from KOREA to even make his statements make sense (I gave links, showed that ebay company in korea didn't even have an about page etc...ROFL). Not that you'd actually order a monitor direct from some DUDE in korea giving up your visa like that anyway, how risky is that for a $500 monitor? But it was humorous watching him and Jarred defend the opinions (Jarred basically called me a ahole and said I was uninformed...LOL). The links and the data said otherwise then, and above I just did it again. This hasn't changed much with dual cards or titan. You still need these to play above 1920x1200 at above 30fps and some games still bring the top cards to their knees at 2560x1600 etc. That's why they don't post minimums here. All of the arguments about bandwidth being an issue go out the window when you find out you'll be running 10-20fps to prove it's true. One of the pages in the 660TI article is titled ~"that darned memory bandwidth"...Really? I also pointed out the # of monitors selling @1920x1200 or less (68 if memory serves) and above it on newegg.com at the time. I pointed at that steampowered.com showed less than 2% market share above 1920x1200 (and almost all had dual cards according to their survey, NOT a 660TI or below). I doubt it's much higher now.

    Hopefully one day soon Anand will stop this junk. It's hard to believe this is the new game suite...I mean seriously? That's just sad. But then Anand himself ignored basically the entire freakin' earnings report for NVDA and didn't even respond to the only comment on his NON-informational post (mine...LOL).
    http://www.anandtech.com/show/6746/tegra-4-shipmen...
    I'm the only comment... $20 says Nobody from Anandtech addresses this post either... :) What can they say? The data doesn't lie. Don't believe me...I provided all the links to everything so you can judge them yourselves (and what they've said or done - or not done in all these cases). They didn't address last Q's financial/market share whipping NVDA gave AMD either. I love AMD myself. I currently run a 5850, and put off my 660ti purchase as I'm not really impressed with either side currently and can wait for now (had a black friday purchase planned but passed), but the BIAS here has to stop. Toms, Techreport, PCper etc is reporting heavily on latency problems on radeons (at least 1 other user already mentioned it in this comment section) and AMD is redoing their memory manager to fix it all! AMD released a driver just last month fixing 3 games for this (fixed borderlands2, guild wars2 and one other). PCper.com (Ryan Shrout) is still working on exactly how to accurately test it (others have already decided I guess but more will come out about this). He's calling it frame rating capture:
    http://www.pcper.com/reviews/Graphics-Cards/Frame-...
    Note his comment on situation:
    "This is the same graph with data gathered from our method that omits RUNT frames that only represent pixels under a certain threshold (to be discussed later). Removing the tiny slivers gives us a "perceived frame rate" that differs quite a bit - CrossFire doesn't look faster than a single card."
    AMD cheating here or what (they've both done tricks at some point in their history)? I look forward to seeing Ryan Shrout's data shortly. He used to run AMDMB.com so I'm pretty sure he's pro AMD :)
    http://www.tomshardware.com/reviews/geforce-gtx-ti...
    more latency stuff. Note AMD is working on a new memory manager for GCN supposedly to fix this. I wonder if this will lower their fps avg.

    I didn't form my opinion by making stuff up here. AMD has great stuff, but I provided a LOT of links above that say it's not quite like Anandtech would have you believe. I can find benchmarks where AMD wins, but that's not the point. Ryan always makes the claim AMD wins (check his 660TI article conclusions for example). At best you could call this even, at worst it looks pretty favorable to NV cards here IMHO. IF you toss out crap/old 2 games (warhead, dirt showdown) that nobody plays and add in the 14 above this is pretty grimm for AMD correct? Pretty grimm for Anandtech's opinion too IMHO. If you can argue with the data, feel free I'd like to see it. None of the benchmarks are what you'd buy either, they are all reference clocked cards which nobody in their right mind would purchase. Especially the 660TI's, who buys ref clocked 660TI's? Toms/anand/hardocp seem to love to use them even though it's not what we'd buy as the same price gets you another 100mhz easily OOBE.

    I'd apologize for the wall, but it's not an opinion, all of the benchmarks above are facts and NOT from me. You can call me crazy for saying this site has AMD bias, but that won't change the benchmarks, or the ones Anandtech decided to REMOVE from their test suite (skyrim, borderlands2, diablo3, starcraft2 - all have been in previous tests here, but removed at 660ti+ articles). Strange turn of events?
  • Ryan Smith - Monday, February 25, 2013 - link

    "I'm the only comment... $20 says Nobody from Anandtech addresses this post either... :) What can they say?"

    Indeed. What can we say?

    I want to respond to all user comments, but I am not going to walk into a hostile situation. You probably have some good points, but if we're going to be attacked right off the bat, how are we supposed to have a meaningful discussion?
  • TheJian - Monday, February 25, 2013 - link

    If that's what you call an attack, it has to be the most polite one I've seen. The worst I called you was BIASED.

    Please, feel free to defend the 14 missing games, and the choice of the warhead (which doesn't show the same as crysis 1, 2 or 3 as shown) and dirt showdown. Also why Starcraft2 was in but now out when a launch event for the next one is coming with the next few weeks. Not an important game? The rest above are all top sellers also. Please comment on skyrim, as with the hires pack that is OFFICIAL as I noted in response to CeriseCogburn (where right above his post you call it cpu limited, as his link and mine show it is NOT, and AMD was losing in his by 40fps! out of ~110 if that isn't GPU separation I don't know what is). Are you trying to say you have no idea what the HI-RES pack is for skyrim out for over a year now? Doesn't the term HI-RES instantly mean more GPU taxing than before?

    Nice bail...I attacked your data and your credibility here, not YOU personally (I don't know you, don't care either way what you're like outside your reviews). Still waiting for you to attack my data. Still waiting for an explanation of the game choices and why all the ones I listed are left out for 2 games that sold 100,000 units or less (total failures) and one of them (warhead) from 5 years ago that doesn't represent Crysis 1, 2 or 3 benchmarks shown from all the titan articles (where all the keplers did very well with a lot of victories at 1920x1200, and some above, not just titan).

    This isn't nor have any of my posts been hostile. Is it hostile because I point out you misrepresenting the facts? Is it hostile because I backed it with a lot of links showing it NOT like you're saying (which enforces the misrepresentation of the facts comments)? It would be (perhaps) hostile if I insinuated you were an Ahole and have an "uninformed opinion" like Jarred Walton said about me in the 660ti comments section (which I never did to either of you) even after I provided boat loads of PROOF and information like I did here. So basically it appears, if I provide ample proof in any way say you're not being factual I'm labelled hostile. I was even polite in my response to Jarred after that :)

    How does one critique your data without being hostile? :)

    Never mind I don't want an answer to your distraction comment. Defend your data, and rebut mine. I'm thinking there are curious people after all I provided. It won't be meaningful until you defend your data and rebut the data from all the sites I provided (heck any, they all show the same, 14 games where NV does pretty well and not so good for radeons or CF, in some cases even SLI). I've done all the work for you, all you have to do is explain the results of said homework, or just change your "suite of benchmarks" for gaming. Clearly you're leaving out a lot of the story which heavily slants to NV if added. The ones in the links are the most popular games out today and in the last 15 months. Why are they missing? All show clear separation in scores (in same family of gpu's or out). These are great gpu test games as shown. So please, defend your data and game choices, then do some rebuttal of the evidence. IF someone said this much stuff about my data, and I thought I had a leg to stand on, I'd certainly take some time to rebut the person's comments. Politely just as all my comments were. Including this one. I can't think of a defense here, but if you can and it makes sense I'll acknowledge it on the spot. :)
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    I appreciate that, and read all the words and viewed all the links and then some.

    I have noted extreme bias in many past articles in the wording that is just far too obvious and friends and I have just had a time rolling through it.
    I commented a few reviews back pointing a bit of it out yet there's plenty of comments that reek as well.
    I am disappointed yet this site is larger than just video cards so I roll with it so to speak.

    Now that you've utterly cracked open the factual can exposing the immense amd favored bias, and the benchmark suite is scheduled to change -lol- that's how the timing of things work and coincide so often it seems.

    Anyway, you not only probably have some points, you absolutely do have a lot of unassailable points but then people do have certain "job pressures" so I don't expect any changes at all but am very appreciative and do believe you have done everyone a world of good with your posts.
    The 4 benchmarks dropped was just such a huge nail, lol.

    It's all good, some people like to live in a fantasy type blissful fog and feel good and just the same when reality shines the light it's all good too and even better.

    I absolutely appreciate it, know that.
    You did not waste your time nor anyone else's.
  • thralloforcus - Monday, February 25, 2013 - link

    Please test folding@home and bitcoin mining performance! Those would be my main justifications for getting a new video card to replace the 570 Classified's I have in SLI.
  • Ryan Smith - Monday, February 25, 2013 - link

    As noted elsewhere, OpenCL is currently non-functional on Titan. Until it's fixed we can't run FAHBench. As for BitCoin Jarred has already gone into some good reasons why it's not a very useful GPU benchmark, and why GPUs are becoming less than useful for it.
