
111 Comments


  • johnsonx - Tuesday, January 25, 2011 - link

    first page Reply
  • MonkeyPaw - Tuesday, January 25, 2011 - link

    Maybe the site should have a "report an error" link at the end of the page for readers to quietly submit problems. I'm all for accuracy, but these "correction posts" don't really make for conversation. The first 3-4 posts are all correction notices! Hardly the dialog I was looking for. ;) Reply
  • vol7ron - Wednesday, January 26, 2011 - link

    +1 Reply
  • softdrinkviking - Wednesday, January 26, 2011 - link

    +2

    PS

    There should be a way to report 1. errors in technical info (which is important) and 2. grammatical errors (which is not so important) separately (or exclusively).
    I can just picture the editors trying to sort through hordes of emails detailing grammatical minutiae.
    Reply
  • buzznut - Sunday, January 30, 2011 - link

    All great suggestions. Now I'm certain I'm not the only one who finds the anal retentive "spelling police" weenies annoying! Reply
  • TSnor - Sunday, May 08, 2011 - link

    +1 anal retentive "spelling police" weenies annoying! Reply
  • Kaboose - Tuesday, January 25, 2011 - link

    Shouldn't the chart on the first page read 6950 1GB as opposed to 6870 1GB? Currently it shows both the 6950 and 6870 with all the same specs (except the vRAM) Reply
  • sanityvoid - Tuesday, January 25, 2011 - link

    Great information for the up and coming 'upgraders', it's good to know that competition is still good between both AMD & Nvidia.

    On a side note, I'm loving the two color comparison vs blue field for the charts. It makes it really easy to see what is going on for the tests.

    Keep up the good work!
    Reply
  • KamikaZee - Tuesday, January 25, 2011 - link

    Last page

    "there aren’t any games that can exploit this advantage at the common resolutions of 1920x1200, 1920x180, or 1680x1050."

    1920x1080 missing a 0
    Reply
  • silverblue - Tuesday, January 25, 2011 - link

    I can only recommend that you get the card that performs best in the games you play, and whilst the 560 Ti will obviously have the edge over the 6950 1GB in tessellation, it's going to use more power overall. Either is great.

    The 6870 BE isn't bad either, really. Good time to be in the market... though I'm still waiting for the 6990 to rear its ugly head. ;)
    Reply
  • JPForums - Tuesday, January 25, 2011 - link

    It looks to me like the 560 Ti only has the edge over the 6950 1GB in tessellation at high factors. Even the 6870 bests the 560 Ti in the DirectX 11 Tessellation Sample test at the medium setting. See AnandTech's 560 Ti launch article.

    What I find even more interesting is that when you consider only the higher resolutions, the 6950 seems to be superior to the 560 Ti. I realize most people still use lower resolutions, but it doesn't make sense to judge between two cards at resolutions where both produce more than playable frame rates at the settings in question. This creates a misleading conclusion in situations where the winner reverses at higher resolutions. HAWX, for instance, shows that the 560 Ti has clearly superior frame rates at lower resolutions, where the 6950 scales much better and edges it out at 2560x1600. Neither dips below 80 fps, so you can't really say the gameplay differs; however, it appears the 6950 is the one that has the muscle when it counts. Battlefield: Bad Company 2 shows a similar reversal (reference AnandTech's 560 Ti launch article). Of course, there are situations where nVidia turns the tables at higher resolutions as well; they just aren't present in AnandTech's launch article (unless I missed it).
    Reply
  • softdrinkviking - Wednesday, January 26, 2011 - link

    I believe that Ryan replied in the comments for the 560 Ti card to a commenter who inquired about the repeatability of FPS results with the 6950 1GB while playing Crysis at high resolutions, and it may pertain to your argument.
    He said that the results are "highly variable."
    If you are going by the average frame rate, and only at high res., the 6950 looks better than the 560 Ti but...
    Perhaps the 560 Ti produces more consistent results than the 6950?
    Reply
  • JPForums - Thursday, January 27, 2011 - link

    Ryan does a really good job with articles, so I don't want to come off as bashing him. However, if that was a major concern, I really wish he would have mentioned it in the article. Taking it a step further, he could post charts with min, max, and average. Alternately, if he felt particularly generous, he could post a graph of the frame rates over the course of the benchmark for the cases where one company's cards are less consistent than the other's. Of course, that would be a lot of work to do for every benchmark and would incur unnecessary delays in getting the articles out. I would only include such charts/graphs to back myself up when I felt it changed the outcome. That said, even if these never show up, I'll still enjoy reading Ryan's articles.

    On a personal note, the idea that the GTX560 Ti may be more consistent than the HD6950 makes me feel better about my decisions to purchase a GTX460 and GTX470, given the similarities in architecture. That said, I haven't noticed abnormal inconsistencies in frame rate with the HD6870 I bought as a Home Theater/Gaming card for the living room. I hope any inconsistencies in the frame rate of the HD6950 are driver related and not architectural, or we may lose some of the wonderful competition that has characterized the graphics market as of late.
    Reply
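The min/max/average reporting suggested in the comment above is easy to sketch. This is only an illustration, assuming per-frame FPS samples are already available as a list; `summarize_fps` and the sample numbers are hypothetical, not any benchmark tool's actual output:

```python
# Summarize per-frame FPS samples into min / max / average plus a simple
# consistency measure (sample standard deviation), as the comment suggests.
from statistics import mean, stdev

def summarize_fps(samples):
    """Return (min, max, average, stdev) for a list of FPS samples."""
    return min(samples), max(samples), mean(samples), stdev(samples)

# Hypothetical numbers: two cards with the same average but very different
# consistency -- an average-only chart would hide the difference.
card_a = [80, 82, 78, 81, 79]   # steady
card_b = [60, 100, 70, 95, 75]  # same average, far more variable

for name, samples in (("Card A", card_a), ("Card B", card_b)):
    lo, hi, avg, sd = summarize_fps(samples)
    print(f"{name}: min={lo} max={hi} avg={avg:.1f} stdev={sd:.2f}")
```

Both cards average 80 fps here, but the standard deviation (or a full frame-rate-over-time graph) exposes which one is the consistent performer.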
  • britjh22 - Tuesday, January 25, 2011 - link

    I see the 6950 pricing as sort of strange. Currently you pay $10 after MIRs to move up to 2GB. A small price for a sometimes useful boost. But if you take unlocking into consideration, the ability to unlock with the 2GB version and not with the 1GB, I'd say that's quite a massive difference.

    Of course, I don't have a good feel for the success rate of the 6950 to 6970 unlock, or what % of cards it's possible with, but the pricing seems quite strange in that light.

    Oh, and let's see dropping prices on a 6870 please!
    Reply
  • MeanBruce - Tuesday, January 25, 2011 - link

    Dude, 6870s are as low as $199 right now over at Newegg, at least for the Sapphire, yup! Reply
  • buildingblock - Tuesday, January 25, 2011 - link

    My local hardware dealer has several GTX 560s in stock today, including 900MHz factory overclocked models. The Gigabyte Super OC 1GB is listed and promised soon.... But the 1GB AMD 6950 - no sign whatever. I see elsewhere references to the fact that this card is likely to be a short run special by AMD as a GTX 560 launch spoiler, and that certainly seems to be the case. I look forward to the Anandtech review of factory overclocked GTX 560s at some point. Reply
  • Ryan Smith - Tuesday, January 25, 2011 - link

    At this point the only place you're going to find them is at Newegg and other e-tailers. With the launch pulled in by this much this soon, they won't be on B&M store shelves yet. This isn't all that rare, in fact I would say it's much more rare to find newly launched cards available in B&M stores. Reply
  • TonyB - Tuesday, January 25, 2011 - link

    Competition is a wonderful thing ain't it? Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    Hello Ryan,
    after such a nice review of the GTX560 Ti, I am quite disappointed that you included the overclocked HD6870 in this test. First, after you reviewed the GTX460 and included an OCed model, you got bashed by AMD fans crying foul. So you asked readers whether it's OK to include an OCed model, and from the count of posts you drew the conclusion not to include OCed models. I was surprised then, because measuring such a thing by mere post count is quite inadequate, considering that unhappy people usually shout the loudest and the happy ones don't need to. So of course you'd have more posts against it, no surprise there. But then I kinda let it go. However, seeing now that you did include an OCed model again, but this time something that is not so common, unlike the OCed GTX460, I was very upset. Why didn't you review the OCed Gigabyte or Asus GTX560 cards? And considering reviews from other sites and the great results the OCed cards have, will you prepare a new review article dedicated to the OCed GTX560 to fix this bias?

    Here is where I remember how I thought saying things like "I am not going to visit this site anymore" was quite silly after the OCed GTX460 case. But seeing how you've turned 180 for reasons unknown to me, I must say the very same thing.

    Best regards,
    Prdola
    Reply
  • Aikouka - Tuesday, January 25, 2011 - link

    I really don't get the persistent whining from some of you over this topic. You're so "hurt" over Ryan spending *his* time on benchmarking a card that was *originally billed as the 560 Ti's competitor*... it's completely inane.

    If you don't think it should be considered, then simply ignore the card in the charts, and you'll get what you consider to be a pure "OC free" comparison.

    As for my stance on it, if something is purchased off the shelf **with the configuration that was tested**, then it's fine to put it on there in a normal (i.e. not overclocking specific) article and/or section.
    Reply
  • sinPiEqualsZero - Tuesday, January 25, 2011 - link

    The market speaks louder than needlessly outraged readers. Like it or not, overclocked cards will continue to be produced. To be responsible journalists, they have to include them in order to evaluate their value to the consumer.

    He also made clear that AMD was bumping up the launch at little notice. I think you are making much ado about nothing and will see plenty of factory-OC'd cards in the near future.
    Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    It's not about not liking benchmarks of overclocked cards. As I stated, I didn't agree with the whining about GTX460 OC as well. I think it's legitimate to include OCed models. But if you do it, then do it for both sides. Especially after such a drama and a strict decision by the writer not to do it. That is the point. Reply
  • britjh22 - Tuesday, January 25, 2011 - link

    In the original 68xx review, the site got flak for including a highly overclocked GTX 460, at NVIDIA's request.

    This time, they review the GTX 560 Ti against stock clocked rivals. In a separate article they present ATI's competitive reaction to the GTX 560 launch. I think Anandtech and Ryan handled this correctly. They analyze and present the GTX 560 as a reflection of what NVIDIA has done, and produce a separate article where they focus on the GPU ecosystem as a whole.

    In this way I think it looks a lot less like they kowtowed to a vendor's requests, and in fact show how targeted and thought out AMD/ATI's launch is. In a market this closely matched for performance and price, and with vendors offering customized versions of AMD/NVIDIA products, it's hugely complicated.

    Well done Anandtech for today's articles, they definitely made my lunch hour more enjoyable.
    Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    "They analyze and present the GTX 560 as a reflection of what NVIDIA has done, and produce a separate article where they focus on the GPU ecosystem as a whole."

    Well if they did that, why didn't they include the OCed GTX560 Ti as well? Consider the fact that there are likely going to be a lot of overclocked GTX560s, as with the GTX460. Isn't that part of the GPU ecosystem?
    Reply
  • britjh22 - Tuesday, January 25, 2011 - link

    The card just launched, it's very possible they don't have one, or didn't have the time to put that through the test suite with all the other things coming off NDA today. As a news source it's more sound for them to be able to have timely coverage, even if they have to revisit something they didn't have time for in the original article.

    It sounds like most tech blogs were up very late compiling, testing, and writing for these launch articles. Most people are content with waiting a week for the entire picture to become clear, and if not, well, that is the price of early adoption.

    80/20 rule.
    Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    You may be right that they didn't have any OCed GTX560s. However, given that many other review sites did receive them, I kinda doubt that a site with as big a name as AnandTech wouldn't receive any. Reply
  • Ryan Smith - Tuesday, January 25, 2011 - link

    We do in fact have one: an MSI Twin Frozr II (880MHz core). You'll see it later this week, as we didn't have time to pull it in to our review on top of everything else that was going on today. Reply
  • SlyNine - Wednesday, January 26, 2011 - link

    Just tell him to quit his whining... jk. But for the love of god, it's not a big deal. I'm just glad we get the objective tests that we do, as opposed to taking a shot in the dark when buying cards. Reply
  • softdrinkviking - Wednesday, January 26, 2011 - link

    agree 100%

    this was totally about an AMD market reaction, and both cards reviewed are variants of other cards previously released by AMD.
    absolutely no foul.
    Reply
  • 3DVagabond - Sunday, January 30, 2011 - link

    Completely different scenario. This is a review of 2 AMD cards. This is not the review of the GTX-560 with the inclusion of a highly overclocked card that was put in at AMD's request/insistence, as was the case with the GTX460 FTW. Add to that, there was also input from nVidia on which of their cards NOT to include for comparison in the 6870 review, and even which benchmarks they wanted AMD cards tested with (HAWX2). Again, not even close to the same scenario. Reply
  • GeorgeH - Tuesday, January 25, 2011 - link

    There is no bias at Anandtech, only well documented arguments and conclusions that you're free to disagree with. If you want to abandon one of the best tech review sites on the planet in favor of one that panders to your personal delusions about the fuzziness of a multinational corporation, knock yourself out. Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    I do not want to invalidate the arguments and facts in the article. No problem there. Just that they didn't include the OCed GTX560, which is going to be a major player. Reply
  • sinPiEqualsZero - Tuesday, January 25, 2011 - link

    Find me a factory-overclocked GTX 560 that is currently available in the market. Then we can have that discussion. Anandtech is testing what is currently available - something I'm not sure you understand. All of these reviews are snapshots of a moment in time.

    My searching shows one as "OC" on newegg, but no details about the core clock. That isn't a sign that the site is biased, it's called reality.
    Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    There are two already available (in Europe anyway): one from Asus and one from Gigabyte. Reply
  • omelet - Wednesday, January 26, 2011 - link

    They're working on a review for the overclocked card. I don't think they've ever released benchmarks for factory overclocks the same day that the card comes out, at least not in recent history, so it's not unexpected for them not to have included overclocked GTX 560 data yet. Wait for the 560 overclocked article in a few days. Reply
  • Melted Rabbit - Tuesday, January 25, 2011 - link

    The 850MHz GTX460 was rarely in stock at online retailers during its lifespan; at times, the 810MHz GTX460 was even hard to find. The overclocked GTX460s that had even lower clocks were generally available. With these previous supply constraints in mind, why should any review site review another highly overclocked card like the 900MHz GTX560 Ti, when its predecessor was a low volume card created specifically to deceptively improve the perceived benchmarks and perceived value of the rest of the cards in its series?

    I have no issue with anandtech or other websites reviewing or including in reviews a 900MHz GTX560 if the variant is still readily available in four to six weeks. It would mean the card existed in reasonable quantities and was not just another Geforce 8800 Ultra card that showed up for the reviews and then was never actually in retail.

    On the other hand, the 6870 Black is a modest overclock of the 6870 from AMD, who has not had and has no supply problems with the 6870 cards.
    Reply
  • GeorgeH - Tuesday, January 25, 2011 - link

    My personal experience is that EVGA's ~850MHz 460s have never been impossible to find, although it has occasionally been difficult to find ones that weren't the less desirable external exhaust model. I've never had any difficulty finding ~810MHz cards.

    I didn't do much shopping around Christmas/New Years, though, so my experiences might not be representative of the average.
    Reply
  • GeorgeH - Tuesday, January 25, 2011 - link

    GTX 560 cards are going to be a major player? Really? You know this how? Because your talking points memo from Nvidia marketing told you so?

    If you honestly care about 560 OC results, here's what you can say - "Hey Ryan, will you be getting a chance to test any overclocked 560's soon? How do you think they will perform?"

    Instead, here's what you went with - "OMG!! They didn't include every freaking card on the planet! BIAS! Sweet baby Jesus, I weep for the Anandtech that was!"

    The only thing really missing from this article was the inclusion of an overclocked 460, which from previous benchmarks should be very competitive for $50 less. Unfortunately the ridiculous shitstorm from the last time it was included means we can't have nice things anymore.
    Reply
  • prdola0 - Tuesday, January 25, 2011 - link

    The Gigabyte and Asus OCed cards were available even before stock clocked cards. How is that in any way "temporary" or "uninteresting"? You are trying to downplay it really hard, but for an apples-to-apples comparison, there should be overclocked competition as well. Reply
  • GeorgeH - Tuesday, January 25, 2011 - link

    Quotation marks - they do not mean what you think they mean. Nowhere on this page has anyone used either the words "temporary" or "uninteresting", nor any synonyms that I can see.

    No one is downplaying anything other than your ridiculous claim that AT and Ryan are biased because in two weeks they didn't manage to benchmark and write up every single card in the universe that might be relevant to your interests.
    Reply
  • silverblue - Tuesday, January 25, 2011 - link

    No. They already got roasted for doing it last time. Also, the 6870 Black Edition is an official AMD product that hasn't been shoved down Ryan's throat. So, whereas before all things may not have been exactly equal, they are now.

    The presence of the 6950 1GB in the 560 Ti review is quite natural, as the 6950 2GB was already there, and besides which, until you overload that memory, the 6950 1GB performs pretty much the same as its 2GB brother, albeit a tiny bit faster in places - it's not cheating to include it, as it's not an overclocked card. There's no other way you could handle it except to put the two AMD cards in separate articles from each other and not mention the 6950 1GB in the 560 Ti review (hardly sensible - we already knew it'd be almost identical to the 2GB variant), or not review the 6870 Black Edition at all. Also, think of the time it must've taken Ryan to handle these reviews - it certainly doesn't take just a day or so.

    With overclocked cards, the situation is that the standard product is reviewed and, usually, the 3rd party offerings are reviewed together in a separate article in short order. I fully expect this to happen as it's normal for a site like Anandtech to do so.

    If your beef is with the 6870 Black Edition, please remember that, as stated in the review, AMD fully intended it to be the 560 Ti's true competition, and that the 6950 1GB was due out in February. When it became apparent that the 6870 wasn't the answer, they released the 6950 1GB early. There's no sense in scrapping all those 6870 Black Editions, of which there have to be thousands, so AMD have not only brought out two cards at the same time, but offered two viable alternatives to nVidia's one. The only thing AMD will suffer is a lack of availability for those 6950s for the time being, which is only natural for an accelerated launch, plus nVidia will undoubtedly lose some sales, so well done on that.
    Reply
  • ritalinkid18 - Wednesday, January 26, 2011 - link

    Well said, GeorgeH...well said....

    QUOTE:
    "There is no bias at Anandtech, only well documented arguments and conclusions that you're free to disagree with. If you want to abandon one of the best tech review sites on the planet in favor of one that panders to your personal delusions about the fuzziness of a multinational corporation, knock yourself out."
    Reply
  • medi01 - Wednesday, January 26, 2011 - link

    Yeah, I won't talk about the cherry-picked OC'd 460, but "forgetting the iPhone in the pocket" on comparison pics, where it would look very pale (much lower contrast), is quite remarkable. Reply
  • sebanab - Tuesday, January 25, 2011 - link

    Dude give it a rest!
    Plus it's only fair that both makers got the same treatment...
    Reply
  • Menetlaus - Tuesday, January 25, 2011 - link

    Dude, this isn't the GTX560 Ti launch article. This is a picture of the market as you or I can go out and buy cards.

    I agree that the whole OC'd GTX460 "issue" was total bovine excrement from fanboys complaining that their poor nVidia was being compared to existing, non-reference cards that were widely available at the time of the 460's launch.

    That being said, the launch article for the GTX560Ti is one article down and contains nothing but reference cards in an effort to keep the whiners quiet.
    Reply
  • SandmanWN - Tuesday, January 25, 2011 - link

    Dudes, whatever. (You guys started it)
    The 460 article wasn't even about the 460. It was brought into the fray during an AMD release article. The only bovine excrement came from the drool of Nvidia fanboys who had the ridiculous notion that a cherry-picked overclocked card delivered by Nvidia belonged in a reference card release article for AMD. Which clearly raised red flags with those readers with common sense.

    And not only that, but the writer couldn't even finish the friggin article the way he wanted to because he was spending his time doodling around with the Nvidia card. That was complete BS.

    We tried to give some pointers on how it should have been handled.
    1) Reference vs reference on product release articles.
    2) Follow up articles with overclocked cards vs overclocked cards.

    It was a real simple freaking concept.
    Reply
  • Parhel - Wednesday, January 26, 2011 - link

    Exactly. Nobody said that Anandtech shouldn't review OC'ed cards. The point was that OC'ed cards hand selected by AMD or Nvidia shouldn't be included in the launch article for their competitor's new architecture. Had this card been included in the GTX 560 article, there would have been the same uproar as before. Reply
  • silverblue - Wednesday, January 26, 2011 - link

    Yes, but there's a big difference between a majorly overclocked 3rd party card promoted by nVidia and a slightly overclocked original AMD card. I can see your point, though. Reply
  • 7Enigma - Tuesday, January 25, 2011 - link

    I haven't read this article yet (just finished the GTX 560 Ti one) but wanted to say thank you for putting this article up. As many of us had asked, you properly kept the launch article about the card being launched and comparisons to stock cards, but in this article you are comparing other offerings, including OC'd cards.

    That's the way it's meant to be done and I thank you.
    Reply
  • Parhel - Wednesday, January 26, 2011 - link

    Seconded. Reply
  • rdriskill - Tuesday, January 25, 2011 - link

    Given that it is a lot easier to find a 1920 x 1080 monitor now than it is to find a 1920 x 1200 monitor, would that resolution make more sense to list in these kinds of comparisons? I realise it wouldn't make much of a difference, but it is kind of strange to not see what, at least in my area, is the most common native resolution. Reply
  • james.jwb - Tuesday, January 25, 2011 - link

    Wouldn't mind seeing the 27" res included at the high end (2560x1440), as up there it pushes the cards much harder and could make all the difference between playable and unplayable. I realize this is more work though :) Reply
  • Ryan Smith - Tuesday, January 25, 2011 - link

    As 16:9 monitors have 90% of the resolution of 16:10 monitors, the performance is very similar. We may very well have to switch from 19x12 to 19x10 because 19x12 monitors are becoming so rare, but there's not a lot of benefit in running both resolutions.

    The same goes for 25x14 vs. 25x16. Though in that case, 25x16 monitors aren't going anywhere.
    Reply
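Ryan's 90% figure is straightforward arithmetic on the pixel counts; a quick check (plain Python, purely for illustration):

```python
# Pixel counts behind the "90% of the resolution" figure above.
pixels_16_10 = 1920 * 1200   # 2,304,000 pixels
pixels_16_9  = 1920 * 1080   # 2,073,600 pixels
print(pixels_16_9 / pixels_16_10)     # ratio of pixels a 16:9 panel pushes

# The same relationship holds for the 25x14 vs. 25x16 pair he mentions:
print((2560 * 1440) / (2560 * 1600))
```

Both ratios come out to exactly 0.9, which is why performance at 19x10 tracks 19x12 so closely.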
  • Makaveli - Tuesday, January 25, 2011 - link

    Great review.

    As for the complaints GTFO, is it somehow affecting your manhood that there is an overclocked card in the review?

    Some of you really need to get a life!
    Reply
  • ctbaars - Tuesday, January 25, 2011 - link

    Hey! Don't you talk to Becky that way .... Reply
  • silverblue - Tuesday, January 25, 2011 - link

    It's overclocked, sure, but it's an official AMD product line. If AMD had named it the 6880, I don't think anyone would've questioned it really. Reply
  • Shadowmaster625 - Tuesday, January 25, 2011 - link

    The 6870 has 56 texture units and the 6950 has 88, or 57% more. Yet if you add up all the scores of each you find that the 6950 is only 8% faster on average. This implies a wasted 45% increase in SPs and/or texture units (which one?), as well as about 800 million wasted transistors. Clearly AMD needed to add more ROPs to the 6950. Also, since the memory clock is faster on the 6950, this implies even more wasted transistors. If both cards had the same exact memory bandwidth, they might very well only be 4% apart in performance! AMD's GPU clearly responds much more favorably to an increase in memory bandwidth than it does to increased texture units. It really looks like they're going off the rails and into the weeds. What they need is to increase memory bandwidth to 216GB/s, and increase their ROP-to-SIMD ratio to around 2:1.

    Yes I know about VLIW4... but where is the performance? Improvements should be visible by now - like what Nvidia did with Civ 5. I'm not seeing anything like that from AMD, and we should be, in spades.
    Reply
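The unit-count arithmetic in the comment above checks out, whatever one makes of the conclusion (and note the reply below about the clock-speed difference it ignores). A quick verification, with the 8% performance figure taken from the comment rather than measured here:

```python
# Check the texture-unit arithmetic from the comment above.
tex_6870, tex_6950 = 56, 88
extra = (tex_6950 - tex_6870) / tex_6870
print(f"{extra:.0%}")            # 6950 has ~57% more texture units

# If the aggregate benchmark gap really is ~8%, only a small fraction of
# the extra hardware is showing up as performance at these settings:
perf_gap = 0.08
print(f"{perf_gap / extra:.0%}")  # roughly 14% of the unit-count increase
```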
  • B3an - Wednesday, January 26, 2011 - link

    I like how you've completely missed the fact that the 6870 is clocked 100MHz higher on the core, and the 6870 Black is 140MHz higher. You list all these other factors, and memory speeds, but don't even mention or realise that the 6870/Black have considerably higher core clocks than the 6950. Reply
  • Shadowmaster625 - Wednesday, January 26, 2011 - link

    It is probably clocked higher because it has almost a billion fewer transistors. Which begs the question.... what the hell are all those extra transistors there for if they do not improve performance? Reply
  • DarkHeretic - Tuesday, January 25, 2011 - link

    This is my first post. I've been reading Anand for at least a year, and this concerned me enough to actually create a user and post.

    "For NVIDIA cards all tests were done with default driver settings unless otherwise noted. As for AMD cards, we are disabling their new AMD Optimized tessellation setting in favor of using application settings (note that this doesn’t actually have a performance impact at this time), everything else is default unless otherwise noted."

    While I read your concerns about where to draw the line on driver optimisation, Ryan, I disagree with your choice to disable select features from one set of drivers to the next. How many PC users play around with these settings, apart from the enthusiasts among us striving for extra performance or quality?

    Surely it would be far fairer for testing to leave drivers at default settings when benchmarking hardware and/or new sets of drivers? Essentially, driver profiles have been tweaking performance for a while now from both AMD and Nvidia, so where do you draw the line on altering the testing methodology in "tweaking drivers" to suit?

    I'll admit, regardless of whether disabling a feature makes a difference to the results or not, it actually made me stop reading the rest of the review, as from my own stance the results have been skewed. No two sets of drivers from AMD or Nvidia will ever be equal (I hope); however, deliberately disabling features meant for the benefit of the end users just seems completely the wrong direction to take.

    As you are concerned about where AMD is taking their driver features in this instance, equally I find myself concerned about where you are taking your testing methodology.

    I hope you can understand my concerns on this and leave drivers as intended in the future to allow a more neutral review.

    Regards
    Reply
  • 7Enigma - Tuesday, January 25, 2011 - link

    Here's the point. There is no measurable difference with it on or off from a framerate perspective. So in this case it doesn't matter. That should tell you that the only possible difference in this instance would be a possible WORSENING of picture quality, since the GPU wars are #1 about framerate and #2 about everything else. I'm sure a later article will delve into what the purpose of this setting is, but right now it clearly has no benefit from the test suite that was chosen.

    I agree with you though that I would have liked a slightly more detailed description of what it is supposed to do...

    For instance is there any power consumption (and thus noise) differences with it on vs. off?
    Reply
  • Ryan Smith - Tuesday, January 25, 2011 - link

    For the time being it's necessary that we use Use Application Setting so that newer results are consistent with our existing body of work. As this feature did not exist prior to the 11.1a drivers, using it would impact our results by changing the test parameters - previously it wasn't possible to cap tessellation factors like this, so we didn't run our tests with such a limitation.

    As we rebuild our benchmark suite every 6 months, everything is up for reevaluation at that time. We may or may not continue to disable this feature, but for the time being it's necessary for consistent testing.
    Reply
  • Dark Heretic - Wednesday, January 26, 2011 - link

    Thanks for the reply Ryan, that's a very valid point on keeping the testing parameters consistent with current benchmark results.

    Would it be possible to actually leave the drivers at default settings for both Nvidia and AMD in the next benchmark suite? I know there will be some inconsistent variations between both sets of drivers, but it would allow for a more accurate picture on both hardware and driver level (as intended by Nvidia / AMD when setting defaults).

    I use both Nvidia and AMD cards, and do find differences between picture quality / performance from both sides of the fence. However, I also tend to leave drivers at default settings to allow both Nvidia and AMD the benefit of knowing what works best with their hardware on a driver level; I think it would allow for a more "real world" set of benchmark results.

    @B3an, perhaps you should have used the phrase "lacking in cognitive function", it's much more polite. You'll have to forgive the oversight of not thinking about the current set of benchmarks overall as Ryan has politely pointed out.
    Reply
  • B3an - Wednesday, January 26, 2011 - link

    Your post is simply retarded for lack of a better word.

    Ryan is completely right in disabling this feature, even though it has no effect on the results (yet) in the current drivers. And it should always be disabled in the future.

    The WHOLE point of articles like this is to get the results as fair as possible. If you're testing a game and it looks different and uses different settings on one card to another, how is that remotely fair? What is wrong with you?? Bizarre logic.
    It would be the exact same thing as if AMD were to disable AA by default in all games even when the game settings were set to use AA, and we then had the nVidia card use AA in the game tests while the AMD card did not. The results would be absolutely useless; no one would know which card is actually faster.
    Reply
  • prdola0 - Thursday, January 27, 2011 - link

    Exactly. We should compare apples to apples. And let's not forget about the FP16 Demotion "optimization" in the AMD drivers, which reduces the render target format from R16G16B16A16 to R11G11B10, effectively cutting per-pixel bandwidth from 64 bits to 32 bits at the expense of quality. All this when Catalyst AI is turned on. AMD claims it doesn't have any effect on quality, but multiple sources have already confirmed that it is easily visible without much effort in some titles, while in others it isn't. It also affects performance by up to 17%. Just google "fp16 demotion" and you will find plenty of articles about it. Reply
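    A quick back-of-the-envelope sketch of the bandwidth claim in the comment above (illustrative arithmetic only, not AMD's actual driver logic; real savings depend on resolution, overdraw, and readback):

    ```python
    # Per-pixel size of the two render target formats mentioned above.
    FORMATS = {
        "R16G16B16A16F": 4 * 16,       # four 16-bit float channels = 64 bits/pixel
        "R11G11B10F":    11 + 11 + 10, # packed floats           = 32 bits/pixel
    }

    def frame_bytes(width, height, bits_per_pixel):
        """Bytes written for one full-screen pass into the render target."""
        return width * height * bits_per_pixel // 8

    w, h = 1920, 1200
    full    = frame_bytes(w, h, FORMATS["R16G16B16A16F"])
    demoted = frame_bytes(w, h, FORMATS["R11G11B10F"])
    # Demotion halves the bytes moved per full-screen pass:
    print(full // 2**20, "MiB vs", demoted // 2**20, "MiB")  # 17 MiB vs 8 MiB
    ```

    Halving render-target traffic is exactly the kind of change that can buy double-digit percentage gains in bandwidth-bound scenes, which is consistent with the "up to 17%" figure cited above.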
  • burner1980 - Tuesday, January 25, 2011 - link

    Thanks for not listening to your readers.

    Why do you have to include an apples-to-oranges comparison again?

    Is it so hard to test Non-OC vs. Non-OC and OC vs. OC?

    The article itself is fine, but please stop this practice.

    Proposal for another review: compare ALL current factory-stock graphics card models, at their highest "reasonable" overclock, against each other. What value does the customer get when taking OC into (buying) consideration?
    Reply
  • james.jwb - Tuesday, January 25, 2011 - link

    Quite a good idea if done correctly. Something like 460s and above would be nice to see. Reply
  • AnnonymousCoward - Thursday, January 27, 2011 - link

    Apparently the model number is very important to you. What if every card above 1MHz was called OC? Then you wouldn't want to consider them. But the 6970@880MHz and 6950@800MHz are fine! Maybe you should focus on price, performance, and power, instead of the model name or color of the plastic.

    I'm going to start my own comments complaint campaign: Don't review cards that contain any blue in the plastic! Apples to apples, people.
    Reply
  • AmdInside - Tuesday, January 25, 2011 - link

    Can someone tell me where to find a 6950 for ~$279? Sorry but after rebates do not count. Reply
  • Spoelie - Tuesday, January 25, 2011 - link

    If you look at the numbers, the 6870BE is more of a competitor than the article text would make you believe - in the games where the nvidia cards do not completely trounce the competition.

    Look at the 1920x1200 charts of the following games and tell me the 6870BE is outclassed:
    *crysis warhead
    *metro
    *battlefield (except waterfall? what is the point of that benchmark btw)
    *stalker
    *mass effect2
    *wolfenstein

    If you now look at the remaining games where the NVIDIA card owns:
    *hawx (rather inconsequential at these framerates)
    *civ5
    *battleforge
    *dirt2
    You'll notice in those games that the 6950 is just as outclassed, so you're better off with an nvidia card either way.

    It all depends on the games that you pick, but a blanket statement that the 6870BE does not compete is not correct either.
    Reply
  • Stuka87 - Tuesday, January 25, 2011 - link

    Why does every single ATI card get the EXACT same FPS in Civilization 5? Did the company that made it get paid off by nVidia to put a frame cap on ATI cards or what? It makes zero sense that 2 year old ATI cards would get the same FPS as just released ATI cards.

    So... What gives?
    Reply
  • Ryan Smith - Tuesday, January 25, 2011 - link

    Under normal circumstances it's CPU limited; apparently at the driver level. Just recently NVIDIA managed to resolve that issue in their drivers, which is why their Civ 5 scores have shot up while AMD's have not. Reply
  • Stuka87 - Wednesday, January 26, 2011 - link

    Hmm, interesting. Reply
  • WhatsTheDifference - Wednesday, January 26, 2011 - link

    Please forgive me, Ryan, as I know this sounds abrasive and a little too off-topic from your response here. But speaking of 'score', what's the absolutely mind-boggling delay with including the 4890? Quite frankly, if even one performance test over the relevant life of the 285 had omitted the 285, nvidia would burn this site to the ground right after your nvidia-supporting readers did some leveling of their own. So honestly, for every 2xx series card that finds its way into a benchmark, where in the world is ATI's top pick of that generation? Site regulars have posted about anandtech's nvidia-leaning ways a few times, and this particularly clear evidence rather deserves an explanation - in my opinion. Or perhaps my spotty attendance means I missed at least one fascinating story. Reply
  • 7Enigma - Wednesday, January 26, 2011 - link

    4870 has been in most of the tests (see launch article) which can be used as a slightly lower-performing card. Use that as a guide. Reply
  • erple2 - Thursday, January 27, 2011 - link

    I agree with 7Enigma - the difference between the 4870 and 4890 is no longer significant enough to warrant inclusion in the comparison. I seem to recall that the performance of the 4890 was between the 4870 (shown) and the nvidia 285 (also shown). Couple that with the relative trouncing (30%+ increase in performance) that the newer cards deliver to the GTX285, plus the fact that the frame rates of the GTX285 aren't that high (+30% of 20 fps is 26 fps, which is still "too slow to make it relevant"), and I'm not sure it's relevant anymore. Reply
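    The arithmetic in the comment above checks out; as a trivial sanity sketch:

    ```python
    # A 30% uplift on a 20 fps baseline is still only 26 fps - well below a
    # playable target of, say, 60 fps, which is the commenter's point.
    baseline_fps = 20
    uplift = 0.30
    print(round(baseline_fps * (1 + uplift)))  # 26
    ```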
  • cheese319 - Tuesday, January 25, 2011 - link

    It should be easily possible to unlock the card through manually editing the bios to unlock the extra shaders (like what is possible with the 6950 2gb) which is a lot safer for the ram as well. Reply
  • mapesdhs - Tuesday, January 25, 2011 - link


    Talk about not being consistent. :\ Here we have a review that
    includes an oc'd 6870, yet there was the huge row before about the 460
    FTW. Make your minds up guys, either include oc'd cards or don't.
    Personally I would definitely like to the see the FTW included since
    I'd love to know how it compares to the new 560 Ti, given the FTW often
    beats a stock 470. Please add FTW results if you can.

    Re those who've commented on certain tests reaching a plateau in some
    cases: may I ask, why are you running the i7 at such a low 3.33GHz
    speed?? I keep seeing this more and more these days, review articles on
    all sorts of sites using i7 CPU speeds well below 4, whereas just about
    everyone posting in forums on a wide variety of sites is using speeds
    of at least 4. So what gives? Please don't use less than 4, otherwise
    it's far too easy for some tests to become CPU-bound. You're reviewing
    gfx cards afterall, so surely one would want any CPU bottleneck to be
    as low as possible?

    Any 920 should be able to reach 4+ with ease. And besides, who on earth
    would buy a costly Rampage II Extreme and then only run the CPU at
    3.33? Heck, my mbd cost 70% less than a R/II/Extreme yet it would easily
    outperform your review setup for these tests (I use an i7 870 @ 4270).
    For a large collection of benchmark results, Google "SGI Ian", click
    the first result, follow the "SGI Tech Advice" link and then select, "PC
    Benchmarks, Advice and Information" (pity one can't post URLs here now,
    but I understand the rationale re spam).

    Lastly, it's sad to admit but I agree with the poster who commented on
    the use of 1920x1200 res. The 1080 height is horrible for general use,
    but the industry has blatantly moved away from producing displays with
    1200 height. I wanted to buy a 1920x1200 display but it was damn hard
    to find companies selling any models at this res at all, never mind
    models which were actually worth buying (I bought an HP LP2475W HIPS
    24" in the end). So I'm curious, what model of display are you using
    for the tests? (the hw summary doesn't say, indeed your reviews never
    say what display is being used - please do!). Kinda seems like you're
    still able to find 1200-height screens, so if you've any info on
    recommended models I'm sure readers would be interested to know.

    Thanks!!

    Ian.
    Reply
  • Ryan Smith - Tuesday, January 25, 2011 - link

    Our 920 is a C0/C1 model, not D0. D0s can indeed hit near 4GHz quite regularly, but for our 920 that is not the case. As for our motherboard, it was chosen first for benchmark purposes - the R2E has some features like OC Profiles that are extremely useful for this line of work.

    As for the monitor we're using, it's a Samsung 305T.
    Reply
  • mapesdhs - Wednesday, January 26, 2011 - link


    Ryan writes:
    > Our 920 is a C0/C1 model, not D0. D0s can indeed hit near 4GHz quite
    > regularly, but for our 920 that is not the case. ...

    Oh!! Time to upgrade. ;) Stick in a 950, should work much better. My point
    still stands though, some tests could easily be CPU-bottlenecking when
    things get tough.

    > As for our motherboard, it was chosen first for benchmark purposes - the
    > R2E has some features like OC Profiles that are extremely useful for this
    > line of work.

    So does mine. :D No need to spend oodles to have such features. Yours
    would of course be better though for 3-way/4-way SLI/CF, but that's a
    minority audience IMO.

    > As for the monitor we're using, it's a Samsung 305T.

    Sweeeet! I see it received good reviews, e.g. on 'trusted reviews' dot com.

    Hmm, do you know if it supports sync-on-green? Just curious.

    Ian.
    Reply
  • Ryan Smith - Wednesday, January 26, 2011 - link

    Indeed CPU bottlenecking is a concern, and we always try to remove it as much as possible. Replacing the CPU means throwing out our entire body of work, so as important as it is to avoid being CPU bottlenecked, we can't do it frequently.

    The issue for us right now is that SNB-E isn't due until late this year, and that's the obvious upgrade path for our GPU testbed since SNB has a limited amount of PCIe bandwidth.
    Reply
  • 7upMan - Wednesday, January 26, 2011 - link

    RYAN: Hi Ryan, while I usually find AnandTech articles quite entertaining and informative, I always wonder why the f*ck professional editors won't get it into their heads to test 2GB cards in the areas where they belong. Meaning: a 2GB vs. 1GB card test should be about graphically intensive games and game mods, like the Half-Life 2 FakeFactory mod or the STALKER Complete mod (Oblivion has such mods too). There are a number of other mods that put massive numbers of huge textures into graphics RAM, and I think those are the ones you need to test the cards with. After all, you can't expect games that were written with 1GB of VRAM in mind to utilize the full power of double the VRAM.

    So please, please run some tests with the above mentioned mods. Thanks in advance.
    Reply
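    To illustrate why high-resolution texture mods can outgrow a 1GB card, here is a rough VRAM estimate (illustrative assumptions: uncompressed RGBA8 textures and a full mip chain adding roughly one third; real games use compressed formats and stream textures, so actual numbers vary widely):

    ```python
    def texture_vram_mib(count, size, bytes_per_texel=4, mip_overhead=4/3):
        """Approximate VRAM in MiB for `count` square textures of size x size texels."""
        return count * size * size * bytes_per_texel * mip_overhead / 2**20

    # Even a modest set of 2048x2048 replacement textures passes 1 GiB:
    print(round(texture_vram_mib(50, 2048)))  # ~1067 MiB
    ```

    Under these assumptions, fifty 2048x2048 textures alone exceed a 1GB frame buffer before counting render targets, geometry, or the game's stock assets - which is the commenter's argument for testing 2GB cards with such mods.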
  • IceDread - Wednesday, January 26, 2011 - link

    Good review

    As a sidenote, it's sort of fun to see that my one-year-old 5970 is still the best when looking at single cards.
    Reply
  • IceDread - Wednesday, January 26, 2011 - link

    Not much talk about SLI and CrossFire, by the way. Comparing the AMD 6xxx series with the new nvidia cards, AMD's CrossFire value today is a fair bit higher than that of SLI solutions. Reply
  • Ryan Smith - Wednesday, January 26, 2011 - link

    We had to leave out our SLI & CF results for this article because of all the driver changes from both parties - as it stands I need to rerun most of those numbers.

    You'll be seeing a lot more on SLI and CF in a week or two; we have a trio of 580s and 6970s in house for some tri-SLI/CF testing.
    Reply
  • Scootiep7 - Wednesday, January 26, 2011 - link

    Not going to add much to the discourse at this point. I just want to say that I really liked this article and thank you for your time and due diligence in writing it. Reply
  • blackshard - Wednesday, January 26, 2011 - link

    Why are they so different from previous articles? NVIDIA's numbers have grown about three times, and some AMD numbers have grown too.

    Previous article about HD6950 and HD6970 showed this:
    http://www.anandtech.com/show/4061/amds-radeon-hd-...
    Reply
  • Ryan Smith - Wednesday, January 26, 2011 - link

    In the GTX 560 Ti article I explained what was going on.

    http://www.anandtech.com/show/4135/nvidias-geforce...

    "Small Lux GPU is the other test in our suite where NVIDIA’s drivers significantly revised our numbers. Where this test previously favored raw theoretical performance, giving the vector-based Radeons an advantage, NVIDIA has now shot well ahead. Given the rough state of both AMD and NVIDIA’s OpenCL drivers, we’re attributing this to bug fixes or possibly enhancements in NVIDIA’s OpenCL driver, with the former seeming particularly likely. However NVIDIA is not alone when it comes to driver fixes, and AMD has seem a similar uptick against the newly released 6900 series. It’s not nearly the leap NVIDIA saw, but it’s good for around 25%-30% more rays/second under SLG. This appears to be accountable to further refinement of AMD’s VLIW4 shader compiler, which as we have previously mentioned stands to gain a good deal of performance as AMD works on optimizing it."
    Reply
  • blackshard - Wednesday, January 26, 2011 - link

    Ok, got it. I have not read the gtx560 review. Thanks ;) Reply
  • ibudic1 - Wednesday, January 26, 2011 - link

    Don't forget that you can unlock the 6950 to a 6970, at which point Nvidia is just NOT competitive.

    http://www.techpowerup.com/137140/AMD-Radeon-HD-69...
    Reply
  • ibudic1 - Wednesday, January 26, 2011 - link

    you can unlock the 6950 to a 6970 for free.

    just google: unlocking HD6950

    and the NVIDIA value proposition goes down the toilet. End of discussion.
    Reply
  • Ananke - Wednesday, January 26, 2011 - link

    So much agree!!!
    5850 2Gb is the way to go.
    Reply
  • Ananke - Wednesday, January 26, 2011 - link

    6950 :) I mean Reply
  • Mr Perfect - Wednesday, January 26, 2011 - link

    None of the sites I frequent have said anything about a reduction in texture filtering quality with the new Catalyst versions. Could someone post some links to articles about the issue? Why didn't Anandtech mention it, anyhow? Reply
  • silverblue - Wednesday, January 26, 2011 - link

    Tom's did something...

    http://www.tomshardware.co.uk/radeon-catalyst-imag...

    I believe it has been mentioned that any such performance enhancements have been rolled back, so to speak.
    Reply
  • Mr Perfect - Thursday, January 27, 2011 - link

    Hmm, interesting. Tom's points back to an Nvidia post on the issue( http://blogs.nvidia.com/2010/11/testing-nvidia-vs-... ), who themselves quote four review sites I'm not familiar with. Guess I'll check out those sites and see what they're reporting. Reply
  • silverblue - Thursday, January 27, 2011 - link

    I think they were German sites or something? It's been some time since I read this. Reply
  • ezinner - Wednesday, January 26, 2011 - link

    How about putting the price of the cards in the benchmarks? It isn't enough to know how fast a card is; you also need to know at what cost. Reply
  • swaaye - Wednesday, January 26, 2011 - link

    8X SSAA :D Reply
  • GTVic - Wednesday, January 26, 2011 - link

    This looks nothing like the Black Edition sold at NCIX or Newegg. This one has two fans, so what is the difference? Reply
  • Ryan Smith - Thursday, January 27, 2011 - link

    Apparently there are 2 Black Edition cards. The one we looked at is the newer of them (687A-ZDBC), whereas the old one used the reference cooler. I'm not sure the newer Black Edition has as widespread availability as the older one, but it's been available at Newegg for as long as I've had the card in my hands. Reply
  • antifuchs - Saturday, March 05, 2011 - link

    That would be very interesting to note in the article - it could help prevent some annoying mis-purchases: Newegg doesn't list the newer one (-ZDBC) as a "Black Edition", and searching for "black edition" will only find the reference-cooled card, while this article doesn't mention the full model number.

    I would almost have bought the loud reference edition one. Thank god I re-read the comments and re-did the search on Newegg one final time.
    Reply
  • Hrel - Wednesday, January 26, 2011 - link

    So, when am I gonna start seeing 1080p in these charts? That's really all I care about. I was hoping 2011 would be the year of 16:9 only; to my great dismay, it isn't. Please update soon - 16:9 has been the standard for like 2 years at this point, longer depending on how you look at it. Reply
  • AnnonymousCoward - Thursday, January 27, 2011 - link

    16:9 sucks. Reply
  • ibudic1 - Thursday, January 27, 2011 - link

    It would help if you shared your experience.
    This is how you unlock:
    http://www.techpowerup.com/137140/AMD-Radeon-HD-69...

    http://forums.guru3d.com/showthread.php?t=335318
    Reply
  • erple2 - Thursday, January 27, 2011 - link

    The problem with this is that it's not guaranteed. While you can always flash back if problems arise, making buying decisions strictly based on what the card might be able to do (granted, there aren't a lot of cards in the general review cycle that haven't been shown to unlock) sounds an awful lot like "two in the bush". Reply
  • snuuggles - Friday, January 28, 2011 - link

    The chart for STALKER: Call of Pripyat is repeated on this page:

    http://www.anandtech.com/show/4137/amds-gtx-560-ti...

    it just shows the same chart twice...

    Also, can you post an edit in the article itself with a link to the actual XFX card, or a picture or something? The link you currently have seems to go to one with a stock cooler, and I don't want to just guess which one it is that you tested. Thank you!
    Reply
  • snuuggles - Friday, January 28, 2011 - link

    woopsies! I didn't see the *multiple* pictures you posted of the card. But it would still be awesome to get a corrected link to the exact card; I'm still not 100% clear on which one you tested. I apologize if I (again) missed this bit of information.

    Thanks!
    Reply
  • Figaro56 - Friday, March 04, 2011 - link

    For the same price of $500 you can get a couple of HD 5870s in Crossfire that will kick the GTX 580's teeth down its throat. Reply
  • recon300 - Tuesday, March 29, 2011 - link

    I haven't seen this question asked yet, so I'm hoping one of you may have the answer. I have noticed that the XFX 6950 is now available with the same dual-fan 3-pipe vapor chamber heatsink setup as is found on their 6870 Black Edition. They literally look identical. I'm wondering if anyone can confirm whether they also share the same acoustic properties... in other words, the XFX 6870BE (as reported by AnandTech) has an idle and load rating of 41.4dB. I wonder if the 6950 has identical (or close) acoustics. Because if the 6950 is as quiet as the 6870BE, I think I'll go with the 6950. Thanks. Reply
