197 Comments

  • Targon - Thursday, October 21, 2010 - link

    Since the 6870 cannot beat the 5870, shouldn't AMD leave the 5870 on the market until they have a true replacement ready? Price vs. performance is one thing, but dropping their high-end parts and replacing them with mid-range cards ($200ish) just doesn't have the "Wow!" factor that helps drive sales across the price ranges. Reply
  • Jansen - Thursday, October 21, 2010 - link

    That would be the 6900 series next month:
    http://www.dailytech.com/Radeon+6800+Series+Launch...
    Reply
  • Kyanzes - Friday, October 22, 2010 - link

    Just to be on the safe side I'd like to see minimum FPS results. Although there's very little doubt in my mind that it underperforms. Reply
  • animekenji - Saturday, December 25, 2010 - link

    It doesn't underperform. HD6970 replaces HD5870. HD6870 will be replacing HD5770, which it vastly outperforms. What about the new numbering scheme don't you get? Reply
  • Onyx2291 - Thursday, October 21, 2010 - link

    If I had a job and the money, one of these would be on its way to my house right now. Reply
  • Doctor_Possum - Thursday, November 11, 2010 - link

    One of these is on its way to my house right now. Can't wait. Reply
  • Onyx2291 - Thursday, December 22, 2011 - link

    Over a year later, and one is now on its way to my house right now :D Reply
  • Rasterman - Thursday, October 21, 2010 - link

    OK nVidia, ATI, Intel, enough with the shitty naming of your devices. A 5870 beats a 6870? Really? I mean come on! Really? Create a committee to agree on a group of benchmarks, the result of which is what you get to name your card. Score 100, you now have the Radeon 100; score 340, you now have the GeForce 340. Reply
  • Fleeb - Friday, October 22, 2010 - link

    Though I must agree with you, AMD gave a reason why they did that (marketing perspective) - they are not going to drop 5770 and 5750 yet but replace 5870 and 5850 with 6970 and 6950. Perhaps everything will go back to normal again in the 7xxx series. Reply
  • bennyg - Saturday, October 23, 2010 - link

    Maybe if it were something like 6810 and 6830 there wouldn't be so many complaints.

    But the wider issue is quasi-quantitative naming schemes in general; they'll never be a perfectly accurate representation of "performance" (or "value for money", or whatever other metric every individual buyer interprets it to be).

    There'll never be any standard like that, marketing needs wiggle-room that independently-derived pure numbers do not provide. So they'll never agree to it.
    Reply
  • GullLars - Saturday, October 23, 2010 - link

    One solution would be to move away from purely number-based naming and do something like:
    AMD/nVidia AG#S# ([Maker] [Architecture][G][# generation of architecture][Market Segment][relative performance within segment, 1-9])
    Or possibly AMD/Nvidia Architecture Gen# S#
    Example:
    AMD EG1E9 or Evergreen Gen1 E9 = 5970 (Enthusiast)
    nVidia FG1E9 = 480
    AMD Evergreen Gen2 G5(?) = 6850 (Gamer)
    AMD Evergreen Gen1 V7 = 5770 (Value)
    AMD Evergreen Gen1 M5 = 5350 (Media)

    These are just early floating thoughts, which could be refined by marketing monkeys.
    Reply
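As an illustrative aside, the scheme sketched above is mechanical enough to parse; everything here (the field letters, the segment map, the `parse_name` helper) is hypothetical, reconstructed from the commenter's own examples rather than any real product scheme:

```python
import re

# Hypothetical segment letters, taken from the examples above
SEGMENTS = {"E": "Enthusiast", "G": "Gamer", "V": "Value", "M": "Media"}

def parse_name(code: str) -> dict:
    """Split a code like 'EG1E9' into [Architecture][G][Gen#][Segment][Perf 1-9]."""
    m = re.fullmatch(r"([A-Z])G([0-9])([EGVM])([1-9])", code)
    if m is None:
        raise ValueError(f"not a valid scheme code: {code}")
    arch, gen, seg, perf = m.groups()
    return {
        "architecture": arch,       # e.g. E = Evergreen, F = Fermi
        "generation": int(gen),
        "segment": SEGMENTS[seg],
        "performance": int(perf),   # relative rank within the segment
    }

# "AMD EG1E9" would decode as Evergreen, Gen 1, Enthusiast, rank 9 (= 5970)
print(parse_name("EG1E9"))
```

The point of such a scheme is that every field is independently meaningful, so a "6870 is slower than a 5870" situation can't arise from the name alone.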
  • Exelius - Saturday, October 23, 2010 - link

    Marketing monkeys have no intention of making it simple to understand; if you don't know exactly what you're buying, it's easier to sell it to you for more than they would be able to otherwise.

    It's not an accident that the numbering is confusing; if you don't know what you're looking at then a 6870 at a lower price than a 5870 looks like a great deal.
    Reply
  • MonkeyPaw - Friday, October 22, 2010 - link

    Big deal, I say. The card is a few % slower, but is more efficient and is cheaper. People who will actually notice the drop off will probably read reviews first. Besides, if the x900 series is for dual GPU designs, then the naming might not be far off.

    Also, if I had to pick between the 5800 or the 6800, I'd probably get a 6800.
    Reply
  • therealnickdanger - Friday, October 22, 2010 - link

    Don't forget improved image quality!

    It's only disappointing because with a new moniker, I expect new tech, but then again, how long did NVIDIA push G92? 3 generations as different products? LOL
    Reply
  • Rafterman - Friday, October 22, 2010 - link

    What exactly has NVidia got to do with this? No fanboyism, please. Reply
  • morphologia - Friday, October 22, 2010 - link

    They are a comparable company with comparably ridiculous naming conventions. How do you go from 9000 to 200?

    Talk about fanboyism...claiming irrelevancy when it's totally relevant reveals your fanboy decoder ring quite clearly.
    Reply
  • Alilsneaky - Friday, October 22, 2010 - link

    I hated nvidia for doing it, why should amd now be forgiven for resorting to doing the same bullshit just because nvidia did it before them?

    I had someone tell me earlier 'that's business'.

    WHAT? No it's bloody not. A scam is a scam; when people start equating questionable practices like these to business, something is really wrong with today's society.
    Reply
  • Mr Perfect - Friday, October 22, 2010 - link

    What Nvidia did was simply rename the 8800 cards to 9800 cards. Same exact chip, same exact clocks, same exact board (at least initially). There were owners of 8800GTs who simply edited the name in the BIOS of their card and had a 9800GT!

    The reason AMD is getting a pass from most people is because this isn't a purely renamed card. It's a redesigned chip on a new PCB with a poor name. If, on the other hand, AMD renames the 5750 and 5770 to the 6750 and 6770 you can expect them to get nailed to the wall right next to Nvidia.
    Reply
  • pcfxer - Saturday, October 23, 2010 - link

    It was very clear why he mentioned NVIDIA. You should read his post... Reply
  • snarfbot - Friday, October 22, 2010 - link

    at least all the iterations of g92 improved performance over their predecessors.

    compare this launch to the x1xxx series of ati products: the x1800 was replaced by the x1900, which was replaced by the x1950, all of which improved performance over their predecessors, all the while on the same 90nm process (save for the x1950 pro and gt, which were mainstream parts).

    imagine if they had named the x1900 the x2900, and somehow it actually performed worse than the x1800.

    that's what they did here, and that's why it fails imo.

    if they had just called it the hd5790 and kept it at the same price, people would've gobbled it up anyway, without sacrificing their integrity.

    it's just a bunch of numbers, but what it means in mindshare is important, and all most people will remember about this generation is that it was worse than the 5 series and worse than nvidia's.

    all aboard the fail boat. honk honk.
    Reply
    Reply
  • pcfxer - Saturday, October 23, 2010 - link

    The problem with that is that GPUs are much more complex than a single score can convey. The technology is complex, and thus explaining performance across the board is also complex. It very much is the nature of the beast.

    The only way to go is to scour the web for reviews of the videocards that you are looking at specifically and for the applications you would like to run. It is still true though, that a 5870 will outperform a 5850 or a 5770 so they made that simple.

    AMD definitely has ruined the simple 5850 5870 5890 nomenclature though...
    Reply
  • Krich420 - Tuesday, October 26, 2010 - link

    I think if they just named it 6850/6830 instead of 6870/6850 they could have saved themselves a lot of negative sentiment. Reply
  • Sparks_IT - Thursday, October 21, 2010 - link

    Any information on Eyefinity? I thought there was to be an update/improvement? And is an active adapter still needed? Reply
  • Jansen - Thursday, October 21, 2010 - link

    There are connections for 2 mini DisplayPort, 1 HDMI 1.4a, and 2 DVI.

    http://www.dailytech.com/Radeon+6800+Series+Launch...

    There are some pretty cheap mini-DP adapters out now.
    Reply
  • Jansen - Thursday, October 21, 2010 - link

    My point should have been that you can now use 4 monitors natively with a single card. Reply
  • Stuka87 - Friday, October 22, 2010 - link

    Actually it's still limited to two displays at once, as I recall. It has four interfaces, but they cannot function simultaneously. Reply
  • mino - Friday, October 22, 2010 - link

    4 it is.
    The DP interfaces are independent of the DVI/HDMI ones.

    So yes, you can use any 2 of the DVI/DVI/HDMI outputs plus those 2 DP interfaces.
    Reply
  • AnnihilatorX - Friday, October 22, 2010 - link

    No way, that's not how Eyefinity works.
    Eyefinity allows 3 monitors to be driven by a single card, and I don't think they would make it any less with the new cards. It may not be 4, but 3 should be alright.
    Reply
  • Stuka87 - Friday, October 22, 2010 - link

    Ahh yeah, you are right. For some reason that bit of detail was not in mind at the time that I posted. Guess that's what I get for responding so late at night :) Reply
  • ninjaquick - Monday, October 25, 2010 - link

    Actually, Barts can push 6 screens... as could Cypress, but it was crippled to three most of the time, with the exception being Eyefinity-series cards that had 6 DP on the back. Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    In case it isn't obvious from the slipshod organization of the article, we didn't quite get it done on time. We had less than a week to put this article together when normally for an article of this size it takes 2 weeks. Check back in the morning, everything on Eyefinity, DP1.2, etc will be up by then. Reply
  • counter03 - Friday, October 22, 2010 - link

    I am quite interested in that 'MST hub'. Is there any available product now? Well, I just find nothing with Google. Reply
  • wolrah - Friday, October 22, 2010 - link

    The card still has two TMDS clock generators. This means only two DVI/HDMI displays can be driven off the card at one time, no matter the connection. With that in mind, I wouldn't be surprised if the card doesn't even support passive adapters, as there is literally no good reason to ever use them. You already have two native ports, so with a maximum of two DVI displays, that's that.

    Regardless of whether passive adapters are supported, you'll still need active adapters or native DisplayPort on your display to run three or four monitors off this card.

    Supposedly it also supports DisplayPort daisy chaining, so when monitors that have an out port arrive it may handle more than four displays, but again only a maximum of two can be DVI/HDMI without active adapters.
    Reply
  • ninjaquick - Monday, October 25, 2010 - link

    DP support shouldn't be a problem since I've seen quite a few monitors coming out with DP in and some with I/O. Reply
  • Abot13 - Saturday, October 23, 2010 - link

    With the 6800 series the cards can support up to 6 monitors in Eyefinity. The DP ports can use a DP hub or the monitors can be chained with DP. Either way the max is 6 monitors per card. Reply
  • ol1bit - Thursday, October 21, 2010 - link

    The naming stinks, but I can see how these cards will make for a big Christmas season for AMD.

    So people that don't know the new naming scheme will rush out and buy the bigger numbers. Smooth marketing move.

    Well, if the 6900's launch next month, that will be fun to see.
    Reply
  • Zokudu - Thursday, October 21, 2010 - link

    You removed Left 4 Dead from your benchmarks. Does this mean you won't be replacing it with another Source game (i.e. L4D2, or maybe Portal 2 when that releases)?

    I know it's not 100% perfect, but 3 of the top 10 selling games on Steam right now run on the Source engine, and the only other ones that share an engine are BF:BC2 and MoH, both using Frostbite. I would think that having some form of a Source game in there would be a good idea considering the vast number of popular games that run on it. I always used the L4D benchmark to compare performance for TF2 and CS:S in the past.

    Otherwise good review and I'm excited for the HD6900 launch next month.
    Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    By the time Portal 2 comes out, it'll be about the time we refresh our suite anyhow. For the time being the existing Source games run on anything (even the GT 430 got L4D playable at 1680 with 4xAA) so it's not a useful benchmark, especially since there's no guarantee Portal will run that well. Reply
  • AmdInside - Thursday, October 21, 2010 - link

    In the 3DTV Play article, you mentioned AMD would challenge NVIDIA in 3D with the 6000 series, but I didn't read any info on it in this article. What gives? Reply
  • StevoLincolnite - Friday, October 22, 2010 - link

    Also didn't see any mention of the improved crossfire performance either... Reply
  • Assimilator87 - Friday, October 22, 2010 - link

    Yeah, this article was sorely lacking in details, especially considering Ryan specifically mentioned that these cards are more about features than performance. You missed four-display Eyefinity, UVD 3, and HD3D, and an in-depth look at DisplayPort 1.2 would be nice as well. Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    In case it isn't obvious from the slipshod organization of the article, we didn't quite get it done on time. We had less than a week to put this article together when normally for an article of this size it takes 2 weeks. Check back in the morning, all of that will be here by the time you wake up (assuming I don't pass out at the keyboard first). Reply
  • StevoLincolnite - Friday, October 22, 2010 - link

    No offense Ryan, but the sole reason why I visit Anandtech is because it usually does provide a lot of accurate information on the product that is being reviewed.

    This article... I couldn't help but want more as I walked away. It almost feels simplified.

    I can understand the whole deadline issues and what-not, but in this case wouldn't it have been better to delay it a day or two?
    Reply
  • Taft12 - Friday, October 22, 2010 - link

    <i>I can understand the whole deadline issues and what-not, but in this case wouldn't it have been better to delay it a day or two?</i>

    Absolutely not - if you don't get a review out on NDA-lift day, you are dead dead dead, even for the cream of the crop that is Anandtech.

    I am certain you busted your ass getting this article as good as it is Ryan and I for one appreciate it. Bravo!
    Reply
  • DoktorSleepless - Friday, October 22, 2010 - link

    Will you eventually be exploring overclocking? Reply
  • jglisso3 - Friday, October 22, 2010 - link

    http://www.techpowerup.com/reviews/HIS/Radeon_HD_6... Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    Yes. I have additional 6850 samples arriving next week for a roundup, which will give us enough cards to do a proper test of overclocking. Reply
  • hamiltenor - Friday, October 22, 2010 - link

    All the info I like, and more. With such a timely review, I don't know how you guys stand being the best. Reply
  • Byte - Friday, October 22, 2010 - link

    Very interesting. I was thinking of upgrading my GTX260 to a GTX460, but all I play is StarCraft 2, and this handily beats the 460. If this bad boy goes down to $150ish, looks like I'm going team red. Reply
  • hechacker1 - Friday, October 22, 2010 - link

    I know Anandtech probably wanted to get this article out ASAP, which is why I don't see thorough testing; but I would like to see UVD3 and other aspects of this new GPU tested. Video quality for an HTPC is important, and with this card drawing so little at idle, it could be a nice HTPC card at the low end.

    It's kind of curious why the newer generation cards lose to a 4870 doing transcoding. I'm guessing the compute performance has barely changed? Or MediaEspresso is a worthless test?
    Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    Quality is unchanged. UVD 3 adds a few fixed function blocks, but quality is a matter of post-processing and hence affected by the drivers once you have sufficient shader power to do all the post-processing. Reply
  • Pastuch - Friday, October 22, 2010 - link

    I posted about this earlier but my post was deleted.

    Ryan, there are a ton of HTPC users on this site.

    1. Exactly how long is the Radeon 6870/6850 vs. the GTX 460?

    2. How does the GTX460 compare to the Radeon 6 series regarding bitstreaming high-def audio?

    3. How does UVD3 post-processing compare to Nvidia's?
    Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    It's exactly the same as this: http://www.anandtech.com/show/3973/nvidias-geforce... Reply
  • HigherGround - Friday, October 22, 2010 - link

    Why was an EVGA card included in this test? The rest of the field is generic (non-OC, no brand), so why include an OC card, which skews the reader's perspective? Pretty sure EVGA paid you to include its top OC card in this review... Reply
  • Parhel - Friday, October 22, 2010 - link

    No, NVidia paid them to include it. NVidia sends "guidelines" to all the hardware review sites, telling them what settings to use and which cards to use in the comparison. In the guidelines for today's review was to use the EVGA GTX 460 FTW, and any site you see using it is essentially a paid NVidia shill.

    I couldn't care less about ATI vs. NVidia, as I'm not really a gamer, but I'm very disappointed today to see my long-time favorite hardware site stooping to this level. In the end, it gives consumers bad information, which should be antithetical to the purpose of a site like this.
    Reply
  • AtwaterFS - Friday, October 22, 2010 - link

    I agree. This site is typically class-leading, but this article gives AnandTech a bit of a black eye, and the results don't particularly jibe with "un-biased" sites like HardOCP. Reply
  • DrKlahn - Friday, October 22, 2010 - link

    I was going to post the same thing. As a long time reader of this site, I was very disappointed with the decision to include the overclocked card. Either the ATI cards should have been overclocked and their results provided in every test or it should have been excluded as per the normal benchmarking guidelines.

    I would have no issue with a followup or side article comparing factory overclocked offerings. But this is clearly bowing to pressure from Nvidia and I expected better of this site.
    Reply
  • aungee - Saturday, October 23, 2010 - link

    To include the EVGA GTX 460 FTW was unfair, and whether intentional or not, it did spoil the launch party for AMD on this site to some degree. It would have been more appropriate to make a small mention of its existence and to benchmark it in the future against any factory OC 6800 cards.

    After getting your head around the naming, AMD needs to be credited for bringing such performance on only a 255 mm2 package (it even caused the price drop for the 530 mm2 GTX 470). AMD has headroom to drop the price of the 6800 cards, so let's hope they do soon.
    Reply
  • tigersty1e - Friday, October 22, 2010 - link

    I couldn't find the clocks, but if you do include an OC'd card in your benches, you should give us the clocks. Reply
  • dertechie - Friday, October 22, 2010 - link

    850 MHz core, 1700 MHz shaders, 4 GHz memory, up from 675 MHz core, 1350 MHz shaders, 3.6 GHz memory.

    That's a 26% core OC and an 11% memory OC. However, the cost has been OC'd too: the FTW card costs the same $240 as the stock Radeon 6870.
    Reply
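The percentages quoted above check out; a quick back-of-the-envelope sketch, using the clock figures as given in the comment:

```python
# Reference GTX 460 1GB vs. EVGA FTW clocks (MHz), as quoted above
stock_core, ftw_core = 675, 850
stock_mem, ftw_mem = 3600, 4000

core_oc = 100 * (ftw_core - stock_core) / stock_core  # ~25.9%, i.e. "26%"
mem_oc = 100 * (ftw_mem - stock_mem) / stock_mem      # ~11.1%, i.e. "11%"

print(f"core OC: {core_oc:.1f}%, memory OC: {mem_oc:.1f}%")
```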
  • Chris Peredun - Friday, October 22, 2010 - link

    Not bad, but consider that the average OC from the AT GTX 460 review was 24% on the core. (No memory OC was tried.)

    http://www.anandtech.com/show/3809/nvidias-geforce...
    Reply
  • thaze - Friday, October 22, 2010 - link

    German magazine "PC Games Hardware" states the 68xx cards need "high quality" driver settings in order to reach 58xx image quality. Supposedly AMD confirmed changes regarding the driver's default settings.
    Therefore they've tested in "high quality" mode and got less convincing results.

    Details (german): http://www.pcgameshardware.de/aid,795021/Radeon-HD...
    Reply
  • Ryan Smith - Friday, October 22, 2010 - link

    Unfortunately I don't know German well enough to read the article, and Google translations of technical articles are nearly worthless.

    What I can tell you is that the new texture quality slider is simply a replacement for the old Catalyst AI slider, which only controlled Crossfire profiles and texture quality in the first place. High quality mode disables all texture optimizations, which would be analogous to disabling CatAI on the 5800 series. So the default setting of Quality would be equivalent to the 5800 series setting of CatAI Standard.
    Reply
  • thaze - Saturday, October 30, 2010 - link

    "High quality mode disables all texture optimizations, which would be analogous to disabling CatAI on the 5800 series. So the default setting of Quality would be equivalent to the 5800 series setting of CatAI Standard."

    According to computerbase.de, this is the case with Catalyst 10.10. But they argue that the 5800's image quality suffered in comparison to previous drivers and the 6800 just reaches this level of quality. Both of them now need manual tweaking (6800: high quality mode; 5800: CatAI disabled) to deliver the Catalyst 10.9's default quality.
    Reply
  • tviceman - Friday, October 22, 2010 - link

    I would really like more sites (including Anandtech) to investigate this. If the benchmarks around the web using default settings with the 6800 cards are indeed NOT apples to apples comparisons vs. Nvidia's default settings, then all the reviews aren't doing fair comparisons. Reply
  • thaze - Saturday, October 30, 2010 - link

    computerbase.de also subscribes to this view after having invested more time into image quality tests.

    Translation of a part of their summary:
    " [...] on the other hand, the textures' flickering is more intense. That's because AMD has lowered the standard anisotropic filtering settings to the level of AI Advanced in the previous generation. An incomprehensible step for us, because modern graphics cards provide enough performance to improve the image quality.

    While there are games that hardly show any difference, others suffer greatly from flickering textures. After all, it is (usually) possible to reach the previous AF quality with the "High Quality" function. The Radeon HD 6800 can still handle the quality of the previous generation after manual switching, but the standard quality is worse now!

    Since we will not support such practices, we decided to test every Radeon HD 6000 card with the about five percent slower high-quality settings in the future, so the final result is roughly comparable with the default setting from Nvidia."

    (They also state that Catalyst 10.10 changes the 5800's AF-quality to be similar to the 6800's, both in default settings, but again worse than default settings in older drivers.)
    Reply
  • Computer Bottleneck - Friday, October 22, 2010 - link

    The boost at low tessellation factors really caught my eye.

    I wonder what kind of implications this will have for game designers if AMD and Nvidia decide to take different paths on this.

    I have been under the impression that boosting low-tessellation-factor performance is good for system-on-a-chip development, because tessellating a low-quality model out to a high-quality model saves memory bandwidth.
    Reply
  • DearSX - Friday, October 22, 2010 - link

    Unless the 6850 overclocks a good 25%, which reference 460s seem to overclock on average, it seems no better overall to me. Less noise, heat, price and power, but also less overclocked performance? I'll need to wait and see. Overclocking a 460 presents a pretty good deal at current prices, which will probably continue to drop too. Reply
  • Goty - Friday, October 22, 2010 - link

    Did you miss the whole part where the stock 6870 is basically faster than (or at worst on par with) the overclocked 460 1GB? What do you think is going to happen when you overclock the 5870 AT ALL? Reply
  • DominionSeraph - Friday, October 22, 2010 - link

    The 6870 is more expensive than the 1GB GTX 460. Apples to apples would be DearSX's point -- the 6850 vs. the 1GB GTX 460. They are about the same performance at about the same price -- ~$185 for the 6850 w/ shipping and ~$180 for the 1GB GTX 460 after rebate.
    The 6850 has the edge in price/performance at stock clocks, but the GTX 460 overclocks well. The 6850 would need to consistently overclock ~20% to keep its advantage over the GTX 460.
    Reply
  • Goty - Friday, October 22, 2010 - link

    Other reviews show 6850s hitting 1GHz+ with software voltage modification, so I don't think that will be an issue. Reply
  • karlostomy - Monday, October 25, 2010 - link

    The question then is, why did anandtech choose to include the EVGA card that NVIDIA no doubt hand picked and delivered?

    Including the OC 460 card is one thing, but at the very least some 'attempt' at oc'ing the 6850 would have retained a semblance of reviewer impartiality.
    Reply
  • wyvernknight - Friday, October 22, 2010 - link

    According to this article I just read, it can do 6-way Eyefinity.

    http://www.semiaccurate.com/2010/10/21/amds-6870-b...

    The diagram is close to the bottom.
    Reply
  • notty22 - Friday, October 22, 2010 - link

    The reviewer addressed why the 460 OC was included. Owners/gamers are reporting the ability to clock their 460s to the 810/820/850 MHz clocks that various "special" models come at, with stock voltage. I agree, it's more a question of why Nvidia did this. Imho, it was to position the card without competing with/obsoleting the GTX 465/470. Now that Nvidia has lowered the prices, and with the good price point the new AMD cards launched at, this is an exciting time for the gamer.
    Now let's get some new, more powerful DX11 games!
    Thanks for the COMPLETE review!
    Reply
  • Kyanzes - Friday, October 22, 2010 - link

    I could have sworn that AvP had been mentioned as a future standard test game on Anandtech. I could be wrong ofc. Reply
  • 3DVagabond - Friday, October 22, 2010 - link

    I'm really surprised you went along with using the EVGA (OC) card nVidia sent you. They sent you what is commonly referred to as "a ringer", and you went along. You should have used the stock 460 (both models) and a stock 470, IMO. Why let nVidia name the conditions? They are obviously going to do everything they can to tilt the playing field. Was there anything else they wanted that you did for them? Reply
  • AnandThenMan - Friday, October 22, 2010 - link

    Well in the article, they basically admitted to "caving in" to Nvidia by including the overclocked card. Obviously Nvidia was very keen to have a specific card included, seems dubious.

    "However with the 6800 launch NVIDIA is pushing the overclocked GTX 460 option far harder than we've seen them push overclocked cards in the past – we had an EVGA GTX 460 1GB FTW on our doorstep before we were even back from Los Angeles."

    I mean stating, "a matter of editorial policy" then ignoring that policy outright seems pretty sketchy to me. Like you said, makes one question the results in general.
    Reply
  • DominionSeraph - Friday, October 22, 2010 - link

    If AMD's official segmentation strategy were to put a factory overclocked 6870 against the GTX 470, what would be the issue with AnandTech comparing the two? Granted, it doesn't mean much to enthusiasts who would just buy a stocker and overclock it to pocket the price differential, but I'd wager a card bought by your average idiot buying off the shelves of Best Buy isn't going to see anything other than the factory clocks. Reply
  • bji - Friday, October 22, 2010 - link

    Actually the people overclocking their video cards and then dealing with overheating and loud-as-an-aircraft-engine fan noise are the idiots.

    Just thought that if you were going to go around saying disparaging things about people who have different values than you do, that you might appreciate some of the same.
    Reply
  • spigzone - Friday, October 22, 2010 - link

    Maybe if you had dropped testing the FTW 460 for the time being, saving it for your 'overview' test next week, you would have had enough time to release a fully fleshed out and organized review instead of letting Nvidia jerk you around, compromising your own 'editorial policy' on only using stock cards in the initial review and saving you the time and trouble of coming up with lame @$$ rationalizations. Reply
  • Parhel - Friday, October 22, 2010 - link

    That's the truth. And even worse is that, after this review, I can no longer trust Anandtech as an unbiased review site. Along with the cards arriving on their doorsteps, NVidia tells the review sites which settings to use for both their own cards and AMD's. If the FTW edition card was included, I have to assume that the 'special' settings were used as well, which invalidates this whole article. Cementing that position is that HardOCP, a site I trust 100% but which is not one of my favorites, shows the new AMD cards performing MUCH better than we see them on Anandtech. Reply
  • spigzone - Friday, October 22, 2010 - link

    I doubt Nvidia even tried to roll Kyle.

    What's more pathetic than someone knowing they are being rolled and trying to rationalize why it's okay as it's happening, as if to say 'see, I'm TELLING you I'm getting rolled, so I DO have integrity... you can see that, can't you???'

    Thank god for the Kyles of the world to provide integrity benchmarks.
    Reply
  • Lolimaster - Sunday, October 24, 2010 - link

    It's not even unbiased towards AMD CPUs. This explains everything.

    Truly PATHETIC
    Reply
  • Will Robinson - Friday, October 22, 2010 - link

    It actually doesn't matter that much. After reading all the reviews out, it's pretty clear both the 6850 and 6870 are damn good cards and have some great new features.
    You can run 6 screens off one card, each at different resolutions, refresh rates and orientations.
    That's pretty awesome.
    NVDA obviously prefers a highly overclocked card to be used in the benchmarking, but it's pretty clear who the winners are.
    Crossfire scaling and performance looks very nice... these new cards are the new mid-range champs.
    Reply
  • Manu64 - Friday, October 22, 2010 - link

    So far I've always valued Anandtech as a neutral PC site; now I'm losing my faith... A whole article written in favor of NVDA because of a heavily overclocked card? You are losing your standards :-( Reply
  • Jamahl - Friday, October 22, 2010 - link

    I must agree. The whole front page bangs on about how Anandtech never uses overclocked cards blah blah, then throws it out the window.

    Anandtech hits an all time low.
    Reply
  • kmmatney - Friday, October 22, 2010 - link

    I don't mind an overclocked card - it's a card you can buy right now on Newegg, so it's a valid option. HOWEVER, it would have been much better to at least give an idea of how the new ATI cards overclock. Reply
  • mac2j - Friday, October 22, 2010 - link

    I also don't think the 460 (OC) belonged in this article.

    Compare reference to reference.

    Compare the custom 6850s to custom 450s/460s.

    Compare the custom 6870s which are coming later to OC 460s/470s.

    Are the 6950/6970 going to have to beat the Point of View TGT GTX480 beast or the N480GTX Lightning from MSI? Cards built on limited numbers of hand-selected chips and custom overclocked?

    It's pretty ridiculous what you did ... and didn't even mention the possible future advantages of these cards thanks to them having Displayport 1.2 support.
    Reply
  • mindbomb - Friday, October 22, 2010 - link

    I disagree.
    When a GTX 460 1GB OC model's price is around the price of a regular 6870, you can compare them.
    Reply
  • bji - Friday, October 22, 2010 - link

    Finally some reasonable logic in this panties-in-a-bunch fest. Reply
  • GeorgeH - Friday, October 22, 2010 - link

    WRT comments complaining about the OC 460 -

    It's been clear from the 460 launch that a fully enabled and/or higher clocked 460 would compete very well with a 470. It would have been stupid for NVIDIA to release such a card, though - it would have made the already expensive GF100 even more so by eliminating a way to get rid of their supply of slightly defective GF100 chips (as with the 465) and there was no competitive reason to release a 460+.

    Now that there is a competitive reason to release one, do you really think Nvidia is going to sit still and take losses (or damn close to it) on the 470 when it has the capability of launching a 460+? Do you really think that Nvidia still can't make fully functional GF104 chips? Including the OC 460 is almost certainly Ryan's way of hinting without hinting (NDAs being what they are) at what Nvidia is prepping for release.

    (And if you really think AT is anyone's shill, you're obviously very new to AT.)
    Reply
  • AnandThenMan - Friday, October 22, 2010 - link

    "And if you really think AT is anyone's shill, you're obviously very new to AT."

    Going directly against admitted editorial policy doesn't exactly bolster your argument now, does it? As for your comment about a 460+ or whatever you were trying to say, who cares? Reviews are supposed to be about hardware that is available to everyone now, not some theoretical future card.
    Reply
  • MGSsancho - Friday, October 22, 2010 - link

    A vendor could just as easily sell an overclocked 470 as an overclocked 480. But I think you made the right assumption: team green might release overclocked cards, all with a minimum of 1GB of RAM, to make their cards look faster than team red's. Maybe at near-equal price points the green cards will all be 20~30% overclocked to look 10% faster than the red offerings. Red cards could be sold overclocked as well (we have to wait a bit more to see how well they overclock). None of this really matters much. At the end of the day, buyers will look at the fastest product they can purchase at their price point; maybe secondly they will notice that the thing gets hot and is very loud and just blindly blame the green/red suits; and thirdly they will look at features. Who really knows.

    Personally, I purchase the slightly slower product and then overclock it myself if I find a game that needs it. I would rather have the headroom than buy a card that is always going to be hot enough to rival a volcano, even if it is factory warrantied.
    Reply
  • Golgatha - Friday, October 22, 2010 - link

    The nVidia volcanoes comment is really, really overstated. I have a mid-tower case with a 120mm exhaust and 2x92mm intakes (Antec Solo for reference), and a GTX 480. None of these case fans are high performance fans. Under very stressful gaming conditions, I hit in the 80-85°C range, and Folding@Home's GPU3 client will get it up to 91°C under 100% torturous load.

    Although I don't like the power consumption of the GTX 480 for environmental reasons, it is rock solid stable, has none of the drawbacks of multi-GPU setups (I actually downgraded from a Crossfire 5850 setup due to game crashing and rendering issues), and it seems to be top dog in a lot of cases when it comes to minimum FPS (even when compared to multi-GPU setups).
    Reply
  • Parhel - Friday, October 22, 2010 - link

    "And if you really think AT is anyone's shill, you're obviously very new to AT"

    I think you're referring to me, since I'm the one who used the word "shill." Let me tell you, I've been reading AT since before Tom's Hardware sucked, and that's a loooong time.

    If I were going to buy a card today, I'd buy the $180 GTX 460 1GB, no question. I'm not an AMD fan, nor am I an NVidia fan. I am, however, an Anandtech fan. And their decision to include the FTW edition card in this review means I can no longer come here and assume I'm reading something unbiased and objective.
    Reply
  • GeorgeH - Friday, October 22, 2010 - link

    It was actually more of a shotgun blast aimed at the several silly posts implying AT was paid off by EVGA or Nvidia.

    If you've been reading AT for ~10 years, why would you assume that Ryan (or any other longtime contributor) suddenly decided to start bowing to outside pressure? If you stop lighting the torches and sharpening the pitchforks for half a second, you might realize that Ryan probably has a very good reason for including the OC card.

    Even if I'm smoking crack WRT a GTX460+, what's the point of a review? It's not to give AMD and Nvidia a "fair" fight, it's to give us an idea of the best card to spend our money on - and if AMD or Nvidia get screwed in the process, I'm not going to be losing any sleep.

    Typically, OC cards with a significant clock bump are fairly rare "Golden samples" and/or only provide marginal performance benefits without significantly increasing heat, noise, and power consumption. With the 460, Nvidia all but admitted they could've bumped the stock clocks quite significantly, but didn't want to threaten their other cards (*cough* 470 *cough*) if they didn't have to. This is reflected in what you can actually buy at Newegg - of the ~30 1GB 460's, only ~5 are running stock. 850MHz is still high, but is also right in line with the average of what you can expect any 460 to get to, so I don't think it's too far out of place.

    Repeating what I said above, including the OC card was unfair to AMD, but is highly relevant to me and my wallet. I couldn't care less if AMD (or Nvidia) get screwed by an AT review - I just want to know what's best for me, and this article delivers. If the tables were turned, I'm sure that Ryan would have no problem including an OC AMD card in a Nvidia review - because it isn't about being a shill, it's about informing me, the consumer.
    Reply
  • SandmanWN - Friday, October 22, 2010 - link

    What? Put the crack down... Really, if you are short on time to review a product, and you steal time away from that objective just to review a specially delivered, hand-selected competitor's card instead of completing your assignment, then you've not exactly been genuine to your readers, or in this case to AMD.

    If you have time to add in an overclocked card, then you need to do the same with the review card; otherwise the OC'd cards need to wait another day.

    I have no idea how you can claim some great influence on your wallet when you have no idea of the OC capabilities of the 6000 series. If you actually bought the 460 off this review, you are banking that the overclock will hold up against an unknown variable. That's not exactly relevant to anyone's wallet.
    Reply
  • GeorgeH - Friday, October 22, 2010 - link

    An OC'd 460 competes with the 6870, and the 6870 doesn't really overclock at all.

    Even overclocked, a 6850 isn't going to touch a 6870, unless you're going to well over 1GHz (which short of a miracle isn't going to happen.)

    It was disappointing that the review wasn't fleshed out more, but I'd say what's missing isn't as relevant to my buying decisions as how well the plethora of OC'd 460s compare to the 6870.
    Reply
  • Parhel - Saturday, October 23, 2010 - link

    "the 6870 doesn't really overclock at all"

    What? You're talking out of your ass. No review site has even attempted a serious overclock yet. It's not even possible, as far as I know, to modify the voltage yet! We have no way to gauge how these cards overclock, and won't for several weeks.

    "850MHz is still high, but is also right in line with the average of what you can expect any 460 to get to"

    Now you're sounding like the shill. 850Mhz is not a realistic number if we're talking about 24/7 stability with stock cooling. No way.
    Reply
  • GeorgeH - Saturday, October 23, 2010 - link

    850MHz unrealistic? Nvidia flat out admitted that most cards are capable of at least ~800MHz (no volt mods, no nothing) and reviews around the web have backed this up, showing low to mid 800's on most stock cards, at stock voltages, running stock cooling. If you're worried about reliability, grab one of the many cards that come factory OC'd with a warranty.

    The 6870 doesn't now and never will overclock much at all, at least not in the way the 460 does. As with any chip, there will be golden sample cards that will go higher with voltage tweaks and extra cooling, but AMD absolutely did not leave ~20-25% of the 6870's average clockspeed potential on the table. The early OC reviews back this up as well, showing the 6870 as having minimal OC'ing headroom at stock voltages.

    If you're waiting to compare the maximum performance that you can stretch out of a cherry-picked 6870 with careful volt mods and aftermarket cooling, you're going to be comparing it with a 460 @ ~950MHz, not ~850MHz.

    As a guess, I'd say that your ignorance of these items is what led you to be so outraged at the inclusion of the OC 460 in the review. The magnitude of the OC potential of the 460 is highly atypical (at least in mid-range to high end cards), which is why I and many other posters have no issue with its similarly atypical inclusion in the review.
    Reply
  • StriderGT - Friday, October 22, 2010 - link

    I agree with you that the inclusion of the FTW card was a complete cave-in and casts a shadow over the so-far excellent reputation of Anandtech. I believe the whole motivation was PR-related, retaining a workable relationship with NVIDIA, but was it worth it?!

    Look how ugly this sort of thing can get; they do not even include the test setup... Quote from techradar.com:

    We expected the 6870 to perform better than it did – especially as this is essentially being pitched as a GTX 460 killer.
    The problem is, Nvidia's price cuts have made this an impossible task, with the FTW edition of the GTX 460 rolling in at just over £170, yet competently outperforming the 6870 in every benchmark we threw at it.
    In essence, therefore, all the 6870 manages is to unseat the 5850 which given its end of life status isn't too difficult a feat. We'd still recommend buying a GTX 460 for this sort of cash. All tests ran at 1,920 x 1,080 at the highest settings, apart from AvP, which was ran at 1,680 x 1,050.

    http://www.techradar.com/reviews/pc-mac/pc-compone...
    Reply
  • oldscotch - Friday, October 22, 2010 - link

    ...where a Civilization game would be used for a GPU benchmark. Reply
  • AnnihilatorX - Friday, October 22, 2010 - link

    It's actually quite taxing on large maps. It lags on my HD 4850.

    The reason is that it uses DX11 DirectCompute for texture decompression. Performance is noticeably better on DX11 cards.
    Reply
  • JonnyDough - Friday, October 22, 2010 - link

    "Ultimately this means we’re looking at staggered pricing. NVIDIA and AMD do not have any products that are directly competing at the same price points: at every $20 you’re looking at switching between AMD and NVIDIA."

    Not when you figure in NVidia's superior drivers, or power consumption...depending on which one matters most to you.
    Reply
  • Fleeb - Friday, October 22, 2010 - link

    I looked at the load power consumption charts and saw that the Radeon cards are better in this department, so I don't clearly understand your statement. Did you mean that the nVidia cards in these tests should be better because of superior power consumption, or that their power consumption is "superior" in the sense that nVidia cards consume more power? Reply
  • jonup - Friday, October 22, 2010 - link

    I think he meant the nVidia has better drivers but worse power consumption. So it all depends on what you value most. At least that's how I took it. Reply
  • zubzer0 - Friday, October 22, 2010 - link

    Great review!

    If you have the time, I would be very happy if you could test how well these boards do in Age of Conan DX10.

    Some time ago (Feb. 2009) you included Age of Conan in your reviews, but since then DX10 support was added to the game. I have yet to see a proper review of how current graphics cards perform in AoC DX10.

    Btw. With the addon "Rise of the godslayer" the graphics in the new Khitai zone are gorgeous!
    Reply
  • konpyuuta_san - Friday, October 22, 2010 - link

    In my case (pun intended), the limiting factor is the physical size of the card. I've abandoned the ATX formats completely, going all out for mini-ITX (this one is Silverstone's sugo sg06). The king of ITX cases might still be the 460, but this is making me feel a bit sore about the 460 I'm just about to buy. Especially since the 6870 is actually only $20 more than the 6850 where I live and the 6850 is identically priced to the 460. There's just no way I can fit a 10.5 inch card into a 9 inch space. The 9 inch 6850 would fit, but there's a large radiator mounted on the front of the case, connected to a cpu water cooling block, that will interfere with the card. I've considered some crazy mods to the case, but those options just don't feel all that attractive. The GTX460 is a good quarter inch shorter and I'm getting a model with top-mounted power connectors so there's ample room for everything in this extremely packed little gaming box. I'm still kind of trying to find a way to put a 6850 in there (bangs and bucks and all that), which leads to my actual question, namely:

    The issue of rated power consumption; recommended minimum for the 460 is 450W (which I can support), but for the 6850 it's 500W (too much). How critical are those requirements? Does the 6850 really require a 500W supply? Despite having lower power consumption than the 460?! Or is that just to ensure the PSU can supply enough amps on whatever rail the card runs off? If my 450W SFF PSU can't supply the 6850, it really doesn't matter how much better or cheaper it is ....
    Reply
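The 450W-vs-500W question above can be sanity-checked with a back-of-the-envelope sketch. All figures below are assumptions for illustration (AMD's quoted ~127W typical board power for the 6850, a guessed 95W CPU and a 60W allowance for the rest of an ITX build), not measurements; vendor minimums also bake in margin for weak rails and cheap units, so treat this as a rough guide only:

```python
# Rough PSU headroom check for a small-form-factor build.
# All numbers are estimates, not measurements: the ~127 W board power
# AMD quotes for the HD 6850, plus guessed draws for the rest of the rig.

def psu_headroom(psu_watts, component_watts, derate=0.8):
    """Usable capacity after derating the PSU to `derate` of its label
    (staying under ~80% load keeps efficiency and ripple sane)."""
    usable = psu_watts * derate
    draw = sum(component_watts.values())
    return usable - draw

build = {
    "HD 6850": 127,            # AMD's quoted typical board power
    "CPU": 95,                 # hypothetical 95 W quad-core at load
    "board/RAM/SSD/fans": 60,  # rough allowance for everything else
}

print(psu_headroom(450, build))  # 78.0 -> ~78 W of slack on a 450 W unit
```

On these assumed numbers a quality 450W unit clears the bar with headroom; what actually matters in practice is the combined 12V rail amperage, which is where the blanket 500W recommendation comes from.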
  • joshua4000 - Friday, October 22, 2010 - link

    Let me get this straight: Fermi was once too expensive to manufacture due to its huge die and such, but its stripped-down versions sell for less and outpace the newly released AMD cards (by a wide margin in the 470's case).

    AMD's cheaper-to-manufacture cards (5xxx), on the other hand, came in overpriced once the 460 had been released (if they hadn't been overpriced all along...); still, the price did not drop to levels at which NVIDIA could not sell products without making a loss.

    AMD has optimized an already cheap product, price-wise, that does not outperform the 470 or an OC'd 460 while selling for the same amount of money.

    Considering the manufacturing cost and pricing of the 4870 in its last days, I guess AMD will still be making money on these 6xxx cards even when dropping the price by 75% of MSRP.
    Reply
  • NA1NSXR - Friday, October 22, 2010 - link

    Granted, there have been a lot of advancements in the common feature set of today's cards and improvements in power/heat/noise, but absolute 3D performance has been stagnant. I am surprised the competition was called alive and well in the Final Words section.

    I built my PC back in 7/2009 using a 4890, which cost $180 then. Priced against the cards in question today, it would slot into roughly the same spot, meaning pretty much no performance improvement at all since then. Yes, I will repeat myself to ward off what is certainly coming: I know the 4890 is a pig (loud, noisy, power hungry) compared to the cards here. However, ignoring those factors, 3D performance has barely budged in more than a year.

    Price drops on the 5xxx series were a massive disappointment for me. They never came in the way I thought was reasonable to expect after the 4xxx series. I am somewhat indifferent because in my own PC cycle I haven't been in the market for a card, but like I said, I'm disappointed in the general market, and I wouldn't agree with the statement that competition is alive and well, at least in any sense that benefits people who weight performance more heavily in their criteria. Reply
  • Finally - Friday, October 22, 2010 - link

    Did you have a look at the games market lately? Noticed all those shabby console ports? There is no progress because the graphics power of an Xbox or PS3 is exactly the same as when it was introduced.

    Then again, who wants to play dumbed-down console games, made by illiterates for illiterates, running on antique hardware that severely limits innovation in the graphics sector?
    Reply
  • jimhsu - Friday, October 22, 2010 - link

    "I know the 4890 is a pig (loud, noisy, power hungry) compared to the cards here"

    And hence your point. Essentially, major COMPUTER manufacturers (not just video card makers) simply are less concerned about maximum performance anymore -- for 95% of the population, what we have now is "good enough", and for the remaining 5%, getting more of the cheap stuff is also "good enough" (HPC builders, SLI/CrossFire, etc). Instead, people look at things like "is this quiet" (heat production, fans) or "what does this mean for my bottom line" (power consumption, replacability). The age of the monolithic "fast chip" is over.
    Reply
  • Jamahl - Friday, October 22, 2010 - link

    AMD naming the cards the 6800 series or Anandtech changing their policy of not reviewing overclocked cards. Reply
  • spigzone - Saturday, October 23, 2010 - link

    AMD renaming their cards = more confusing
    Anandtech 'changing' their policy = more inexplicable.
    Reply
  • softdrinkviking - Friday, October 22, 2010 - link

    I join the throngs of disgruntled consumers who object to the new naming convention of the 6800 series.
    It's silly and stupid, and you should be ashamed of your collective AMD selves.
    Reply
  • spigzone - Saturday, October 23, 2010 - link

    I didn't like it either ... until I saw the release prices ...

    Then I didn't much care anymore.
    Reply
  • gorg_graggel - Friday, October 22, 2010 - link

    Why the heck didn't they just call them 6850 and 6830? According to the numbers, those are the true internal competitors...
    That would also fit with the premise that a next-gen card with the same name is at least a bit faster...
    The upcoming 6950 and 6970 cards could accordingly be named 6870 and 6890, respectively...
    And the next dual-chip variant could have the 69xx namespace to itself, as it clearly wouldn't be justified to append an X2 to it, for the same reasons the 5000-series dual-chip cards don't do this...
    Because of different chips? John Doe doesn't know about such distinctions and just cares about performance (compared to older generations) when upgrading.
    He's just confused why the 6870 is slower than a 5870, and the guy who knows more about the tech behind it is pissed, because he has to explain to him why the names are not analogous to performance and why they're not kept consistent for at least a few generations...

    The explanation AMD has given for this is not satisfying and gives me the impression that they deliberately want to confuse customers... however, I can't think of a logical "why"...
    Reply
  • jonup - Friday, October 22, 2010 - link

    To answer your question, Because that makes too much sense! Reply
  • Donkey2008 - Friday, October 22, 2010 - link

    I agree 100%. The new naming scheme is misleading; 6750 and 6770 would have been much more accurate IMO. In AMD's releases over the last several years, the performance of next-gen second-tier cards has been roughly equal to the previous top-tier cards. This is the first time in a long while that AMD has strayed from its naming scheme, and it has all the makings of a marketing department telling the engineers what to call their cards.

    Like you pointed out, 95% of consumers (the ones who waddle into Best Buy and tell someone at the Geek Squad counter to "install a gaming card") won't know the difference. Most of these average consumers will believe that a 6870 is a much better card performance-wise than the previous-generation 5870, so they will see the price and think it is a steal. AMD is playing the numbers game with uneducated consumers ("higher numbers are better, right?") and it is sort of disappointing IMO. I expect more from them as a pseudo-fanboy (I currently own a 4850 and two 4890s).

    I am still anxious to see what the 69xx has to offer, but some of the excitement of the entire 6xxx series launch has faded because of the new naming scheme. I just don't like marketing games and being played. Tech people are not only sharp but HIGHLY biased, and any deviation from outright perfection usually gets punished (i.e. Microsoft Vista, iPhone 4 antenna, Nvidia GT 250, etc.). AMD should have known better.
    Reply
  • gorg_graggel - Friday, October 22, 2010 - link

    Hmmm, well, on second thought it could make more sense in the future, as the new scheme reorganizes the naming to better fit the performance categories...
    If they had been named 6750/6770 there would be riots, because AMD dared to raise prices in the midrange segment, as the author of the article already said, I think, which would be even worse...
    In the past AMD changed strategies a few times on how big and fast its chips are, but the naming didn't change... they just didn't have single chips that deserved a name in the x900 range...
    So now that the Cayman chip is in the 300W class, AMD has changed its sweet-spot-only strategy back to a more standard one with low-end, midrange, performance and high-end cards, and the new naming fits perfectly here... at least that's my impression...
    So it may be confusing now, but depending on how future products turn out, it will make more sense again...
    Reply
  • Finally - Friday, October 22, 2010 - link

    Thank God morons don't compare prices.
    Naming is irrelevant as long as you actually get more performance for half the price the HD 5850 was introduced at.
    Reply
  • softdrinkviking - Friday, October 22, 2010 - link

    The fact that all of these people are complaining about the naming proves that it isn't irrelevant.
    Names are important to some people.
    Not to you, clearly, but you're not everybody.
    Reply
  • krumme - Friday, October 22, 2010 - link

    I was wondering beforehand whether Anandtech was going to use the overclocked 460 card. This day was a test for the new cards from AMD, but in my view it was more a test of Anandtech.

    What a mess for the consumer, Anand and Ryan! - I know you must have discussed this.

    - Where does this lead?

    1. More aggressive intervention from AMD and Nvidia on the review sites

    2. More OC cards on launch dates

    This is not good for transparency for the consumer.

    Therefore it's a sad day. And I guess from your own writing, you don't feel quite comfortable about it yourself. Why the f... didn't you listen more to your own doubt?

    - Next time, listen to yourself.

    Otherwise a fine review - worth criticizing.
    Reply
  • SandmanWN - Friday, October 22, 2010 - link

    Exactly. If I were controlling the media for AMD, I would start shipping out hand-selected overclocked 5970s with every Nvidia review and demand they be used, or the site would no longer receive free review samples.

    Starting a bad trend here.
    Reply
  • Mygaffer - Friday, October 22, 2010 - link

    Other sites didn't bow to the pressure and include the OC'd GTX 460. Guru3D is one that comes to mind. Not only that, but after admitting it's your policy not to include them, you include the very fastest OC'd GTX 460 on the market?
    LAME. At least OC the 6850 so you can show that an OC'd 6850 beats an OC'd GTX 460.
    I've lost some respect for you with that decision.
    Reply
  • AnnihilatorX - Friday, October 22, 2010 - link

    The CF HD 6850 setup seems to be quite a good value for high-end users.
    They seem to have improved CrossFire performance in this generation.

    A single HD 5870 still retails at twice the price of an HD 6850,
    but two HD 6850s are 50-70% faster than a single HD 5870.
    Reply
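Those ballpark figures can be turned into a rough perf-per-dollar comparison. The prices below ($380 for a 5870, $180 per 6850) and the 60% CrossFire speedup are illustrative numbers assumed for the sake of the arithmetic, not official pricing:

```python
# Rough performance-per-dollar: one HD 5870 vs. a CrossFire pair of HD 6850s.
# Prices and the scaling factor are assumed ballpark figures, not quotes.

def perf_per_dollar(relative_perf, total_price):
    return relative_perf / total_price

single_5870 = perf_per_dollar(1.00, 380)      # baseline: 5870 = 1.0x perf
cf_6850s    = perf_per_dollar(1.60, 2 * 180)  # pair assumed ~60% faster

print(round(cf_6850s / single_5870, 2))  # 1.69 -> ~1.7x the perf per dollar
```

Even at the low end of the quoted 50-70% scaling range, the pair comes out well ahead on these assumptions, which is the commenter's point; the usual multi-GPU caveats (frame pacing, driver profiles) don't show up in a ratio like this.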
  • MeanBruce - Friday, October 22, 2010 - link

    Notice the idle noise levels in this comparison are all in the 40dB range, with load noise in gaming up in the 50 and 60dB range. Anyone interested in gaming or working in the 10dB range? It is very possible; I am doing it now with an older ATI 4850 - talk about peaceful computing and late-night gaming. Just add an uber-efficient aftermarket heatsink. I have tried a few from Arctic Cooling and Thermalright; the best one so far is the MK-13 from Prolimatech! Clip on a Noctua NF-S12B ULN fan at 6dB, or a 140mm Noctua FLX attenuated to 10dB, and you are there, baby! Total upgrade cost: $85. Peace of mind: priceless. Bruce out! Reply
  • Ryantju - Friday, October 22, 2010 - link

    I used to play Crysis with an HD 4830, which is not very good, and I can't see it in the benchmarks. Since the HD 4870 has such an outstanding price/performance ratio, can it run Crysis 2? Reply
  • shiznit - Friday, October 22, 2010 - link

    Anand, I thought you tested the 5870 in WoW? The ugly texture transitions were blatantly obvious from the start. Imagine my dismay when upgrading from an 8800GT to a brand new, just-released 5870 and seeing worse texture filtering... Reply
  • Techman123 - Friday, October 22, 2010 - link

    I got my 5870 over a year ago and have been enjoying great frame-rates on my 30in monitor at 2560x1600. Even though it wasn't cheap, it has to have been one of the best buys I ever made, as this card is still one of the top tier of cards on the market. It's not often that a video card over a year old is still that competitive. Plus I have the option of adding a 2nd card once they are relegated to 2nd tier status.

    It is interesting the way they are introducing this card. With the 58xx series, they came out with the high end card first. It makes it seem that although the 6900 series will improve over the 5800 series, it won't be the huge step the 5800 was.
    Reply
  • Setsunayaki - Friday, October 22, 2010 - link

    There was a graph where a 4xxx series card beat the 6xxx series card... there were many where the 5xxx series was higher... Tessellation performance is higher on the GTX 460, and SLI scales better than CrossFire...

    What the tessellation performance graph really means is that if you were to take a GTX 460 and a 6870 and play a game with tessellation off, the 6870 gets a higher framerate; but if you turn tessellation on for both cards and go full force with tessellation and other features (considering that Nvidia supports PhysX and most games now have some physics implementation), the 6870 takes such a performance hit that, as far as framerates go, a 460 actually matches it or beats it outright.

    What ATI/AMD really needs to work on is integrating more technologies into its cards to offer more options during a game. No physics processing, just an optimization of AA and AF... tessellation performance that doesn't come close to a 460, along with horrible Linux support... I really wonder, and hope, that their flagship card shows something stellar...

    Not to argue against it, but for the deserving ATI/AMD fans who have stuck with them over the years. ^_^
    Reply
  • Alilsneaky - Friday, October 22, 2010 - link

    Prices are high for both in my country (Belgium):

    199 euros for the 6850 and 279 euros (in the cheaper shops, up to 350 in others) for the 6870.

    A very bland release for us; nothing to get excited about at those prices.

    I also take offense at the naming scheme: why pick a name that will inevitably deceive many people into buying a sidegrade?
    Reply
  • Pastuch - Friday, October 22, 2010 - link

    There was not nearly enough discussion of DTS-HD MA and TrueHD passthrough in this article. Gaming is 50% of the reason to upgrade; the rest of my focus is HTPC use. Please compare the GTX 460 vs the 6870 on bitstreaming, video quality and hardware decoding.

    Thanks.

    P.S. Nvidia usually does a pathetic job on anything not related to gaming.
    Reply
  • Scootiep7 - Friday, October 22, 2010 - link

    I think you guys are a little off in calling the 6870 the $200 price-point king. The cheapest retail price for the card right now is $239.99 for any model, and then you have to add another $5-10 for shipping. That puts it at $245-$250, which is nowhere near the $200 price point. And with most GTX 460 1GBs sitting at about $170-$190 (w/ shipping), this card is not competing with them on price at all. Maybe in a few months if prices drop, but not now. It's more in the GTX 470 range, and that is much tougher competition. I'm sorry, but the 6870 is NOT the $200 price-point king. It's not even close. Reply
  • Lolimaster - Sunday, October 24, 2010 - link

    The HD 6850 offers better performance than the 460 1GB.
    The HD 6850 costs $175.

    The HD 6870 kills both of them, and also the 470 on performance/power consumption (80W less).
    Reply
  • Scootiep7 - Sunday, October 24, 2010 - link

    Ok, I'm sorry, but I have to laugh at this. Where the hell are you finding a 6850 for $175? The cheapest ANYWHERE is $199, and you still have to factor in ~$8 shipping. Re-read my post and realize that the prices I quoted are accurate, and you're still looking at a $30 price difference between the 6850 and the 460 1GB. Yes, the performance is better, but it's not amazingly better, and I don't think it justifies the price. Hey, I'm all for the red team this time around; I picked up a 5770, which is an amazing bang-for-the-buck card. I'm just saying that calling the 6870 or the 6850 the new $200 price-point king is wrong. Too many variables. Reply
  • orthancstone - Friday, October 22, 2010 - link

    I'm especially pleased to see the 4870 included in some benchmarks. As someone who owns one and was never impressed with the performance-boost/cost ratio of the 58/59xx lines, I've been wondering how the 6xxx line would compare to the two-generation-old stuff. I'd love to see it included in third-party 6xxx reviews. Reply
  • Edison5do - Friday, October 22, 2010 - link

    As an owner of an HD 4850, I was planning to get an HD 5770, but at this point the HD 6850 looks like a better option for a few more bucks... or I'll wait to see if the HD 5770 drops in price a little more. Reply
  • Sando_UK - Friday, October 22, 2010 - link

    Anandtech is one of my favourite review sites, and it's a real shame to see what's happened here. I don't know the reasons why you guys needed to include the 460 OC in this review (it does sound like a fine card, btw, but this wasn't the place for it) - I can't see any reason it wouldn't have been better compared in a separate article. The fact Tom's Hardware did a very similar thing makes the whole thing fishy...

    New generations/architectures don't come along very often and deserve proper comparison and coverage. I'm not an AMD or Nvidia fanboi (happy to go with whichever is best on price/performance/extras at the time), but we rely on you guys to give us the facts on a level playing field. I'm sure you have in this case, but even the suggestion of impropriety damages your (extremely good) reputation, and I think it's something you should really try to avoid in the future, be it in AMD or Nvidia reviews.

    Otherwise, thanks for all your hard work.
    Reply
  • Natfly - Friday, October 22, 2010 - link

    It's sad to say, but this review fucking sucks. UVD and the display controller have been overhauled, but you make no mention of any of the changes. Are there still only 2 RAMDAC clocks? Or can you now use passive DP converters while using both DVI ports?

    And including an OC'd card because nVidia pushed you into it? Way to take a shot to your credibility. And no mention of its clocks or price... AND no overclocking numbers for these new cards when you are specifically comparing them to an OC'd card? I mean, wtf, this review is not up to previous Anandtech standards.
    Reply
  • Donkey2008 - Friday, October 22, 2010 - link

    Can you provide a link to your website so I can read your review of the cards? That would be awesome. Reply
  • Natfly - Friday, October 22, 2010 - link

    Sure, right here: http://tinyurl.com/36ag36d Reply
  • BlendMe - Friday, October 22, 2010 - link

    So you're telling me I can get two 6870s and spend less money, use less power and have more performance than a GTX 480? I like the idea of going back to what made the 48xx cards so great: small, cheap and expandable.

    Can't wait for the rest of the line up.
    Reply
  • tpurves - Friday, October 22, 2010 - link

    how is it that the nVidia cards go UP in framerate when you increase the resolution from 1680x1050 to 1920x1200 and add 4xAA? Did you mix up some test run numbers? Reply
  • mapesdhs - Friday, October 22, 2010 - link


    It's a pity the charts don't include SLI results for the EVGA 460. I would like to have seen
    how close it came to 470 SLI, given the 470's inferior power, GPU load temp and noise
    results. The 470 GPU load temps under Crysis for just one card are particularly scary;
    the idea of using two 470s SLI, and even more so oc'ing them, seems like a recipe for
    thermal mayhem - alien astronomers with IR telescopes would wonder what the heck
    they've spotted. :D

    The price drop on the 470 is interesting, but the EVGA 460 still looks like a better buy
    because of the power/heat issues, especially so for those considering SLI (as I am),
    and also the fact that the EVGA is as good or better than the 6870. This graph is the
    one that interests me:

    http://images.anandtech.com/graphs/graph3987/33232...

    The stock 460 SLI is clearly nowhere near as good as 6870 CF or 470 SLI, but given
    a single EVGA 460 matches the 6870, I'd really like to know how two EVGAs perform.
    Any chance you could add the data later?

    On the other hand, one could assume the 6870 should have some oc'ing headroom,
    but Tom's review didn't show that much of a gain from oc'd 6870s.

    The 6870 here in the UK seems to be about 200 UKP (Aria, Scan), though the XFX
    version looks to be an exception (178 from Scan). The EVGA is 174 (Scan, but no
    stock yet). For those who don't want to spend that much, the 800MHz Palit Sonic
    Platinum 460 has dropped down to only 163 (last week it was 183). I almost bought
    two of the Palit cards last week, so I'm glad I waited.

    Obviously the pricing is all over the place atm, and likely to wobble all over again
    when the next 6xxx cards are released. Either way, despite the lack of major
    performance increases atm, at least there's finally some pricing/value competition.
    I think I'll wait until the dust settles re pricing, then decide. Quite likely many others
    will do the same.

    Ian.
    Reply
  • AtenRa - Friday, October 22, 2010 - link

    Why did you run at 1920x1200 and not 1920x1080 ??

    most 1920x1200 monitors have all but vanished from the market, and 1920x1080 is becoming the de facto resolution.
    Reply
  • Lunyone - Friday, October 22, 2010 - link

    Well, with bowing down to nVidia on the selection of "what" GPU to use, you have lost all credibility in my eyes. Even Tom's Hardware took a higher road: they agreed to use the "hand-picked" GPU, but limited the clocks to near-stock settings, so there was a more "real world" comparison. Who knows if this isn't the first time that this has happened at Anandtech. I notice no rebuttals on Anand's part, so I'm guessing they are quite amazed that people are seeing how one-sided this issue is. This article wouldn't affect my purchase, since I look at several sites to draw a conclusion from. But my confidence in quality and fair reporting from Anandtech's reviews has been compromised, IMHO. I don't know if I will put much stock in Anand's reviews; time will tell. Reply
  • Sunburn74 - Friday, October 22, 2010 - link

    Gee. You know what's with all this about Anandtech losing credibility? Nvidia specifically asked them to test one card, and the consumer benefits from having this information. It's not like Anandtech didn't include the reference GTX 460 as well. Anything that tells the consumer more about how valuable his dollar really is is a good thing, imo.

    I currently have an OC'ed Radeon 5850, and it annoyed the hell out of me trying to justify whether or not the extra 30 bucks I eventually ended up paying for it was worth it. There weren't any reviews at the time, you see...
    Reply
  • SandmanWN - Friday, October 22, 2010 - link

    You can't gauge value of an overclocked card against a stock card. You have no idea what the other card can do. What you are saying is nonsense if you really put two seconds into thinking about what you just said. Reply
  • mindbomb - Friday, October 22, 2010 - link

    we are talking about factory oc'd cards.
    It's not like Anand was playing around in rivatuner.
    Reply
  • krumme - Friday, October 22, 2010 - link

    No, it looks more like NV has been playing around with RivaTuner.

    Their cards at stock speed are a toaster. Overclocking again just before an AMD release does not make it better.

    If they can sell cards at higher speeds, do it. Release it. Don't talk about it, or cowardly hide behind factory-overclocked cards. Pathetic and weak.

    The sad thing about it is they made Anand change their methodology.

    What a sad result, when the start was just bad gfx cards.
    Reply
  • MrSpadge - Friday, October 22, 2010 - link

    Not sure it was mentioned before: Cypress and RV770 run FP64 at 2/5th the FP32 speed, not 1/5th as written on the first page.

    MrS
    Reply
  • Quantumboredom - Friday, October 22, 2010 - link

    It's 2/5 for addition, but 1/5 for multiplication and FMA as I recall it. Reply
  • krumme - Friday, October 22, 2010 - link

    If you want real-world testing for solid buying knowledge and no idiotic mixing of OC and non-OC cards, go to HardOCP.

    Anand needs to do a solid review of his own gfx reviews. This is stone-age methodology with a fishy smell.

    For the SSDs there was an excellent revision with the introduction of the real-world benchmark suite; a far deeper revision is needed here.
    Reply
  • chillmelt - Friday, October 22, 2010 - link

    I agree. Comparing a non-OC'd card to an OC'd one is in no way fair. The 6870 clearly clobbered the GTX 460 in virtually all benchmarks, at stock speeds. Reply
  • krumme - Friday, October 22, 2010 - link

    Yes. Think about this scenario: For this review anand (kyle) decided to use an overclocked Sapphire 6870 and compare it to a stock speed 460.

    What would happen? How would that look?

    Its the same thing happening here.
    Reply
  • mindbomb - Friday, October 22, 2010 - link

    if they are in the same price range, they should be compared.
    it gives the consumer more information on his purchasing decision.

    idk why this arbitrary oc/non-oc rule comes into play. It was a factory overclock, and required nothing on the part of the end user.
    Reply
  • krumme - Friday, October 22, 2010 - link

    Okay. Now take an OC'ed 6850 at 900MHz and compare it to a far more expensive stock 460 1GB?

    Where does this end? AMD and NV supplying the sites with factory-overclocked cards for reviews. It's going to be a f..king mess. No one knows about availability, time period and so on.

    Regarding purchasing decisions, the methodology is plain wrong. Go to HardOCP if you're considering buying a card. There you have real-world testing, not this old, bad - and now fishy - methodology of Anand's.

    Why does he change methodology just when he is getting the new cards? Hmm. That's simply very bad science.

    NV should concentrate on making cards that perform instead of using resources to put pressure on sites like Anand. If they have faster 460 models, release them, or stop spinning.

    It's a pain to read and watch. And it's a shame for a site that has the resources to do proper testing, but doesn't have the balls any more.

    We don't want AMD, Intel or NV to influence the reviews. Right now there is an obvious bad influence, especially from NV. And you, and everyone, can see it. Don't underestimate the consequences.
    Reply
  • bji - Friday, October 22, 2010 - link

    You COMPLETELY missed the point. *IF* the OC 6850 was a factory overclocked card, sold at a price that puts it in the range of cards reviewed, then it *WOULD* make sense to compare it.

    So your example just further makes the point.
    Reply
  • krumme - Friday, October 22, 2010 - link

    Well, as it stands now they don't release OC'ed cards on launch day, but that will soon change... if Anand and others don't stick to their policy Reply
  • campbbri - Friday, October 22, 2010 - link

    Thanks for the great review. I don't know why everyone is complaining about mixing OC and Non-OC cards when you were extremely explicit in pointing it out. Reply
  • krumme - Friday, October 22, 2010 - link

    I think you do know why everyone is complaining.

    First, to be fair, it's far from everyone :), unfortunately, because Anand is surrounded by far too many yes-sayers. All positive. Great in many ways. But it does not develop the site as it could. There is a great, huge community, and there are plenty of resources for getting ideas for new methodology.

    It's good - if not vital - that Kyle is explicit about it. Otherwise it wouldn't be worth criticizing; then it would just look like a paid job, and nobody would care. It's not. But being explicit is not enough, even if it's most important and a huge quality. You need to have a good case. And Anand has a very bad case.

    Read what Kyle wrote again. Do you think this is his best and most sound decision in his life? Does he feel comfortable about it?

    He did betray himself a little bit. And he shouldn't do it. He should listen to his own doubt.
    Reply
  • snarfbot - Friday, October 22, 2010 - link

    Yes, I understand that, but I can't see how you can call a direct replacement that fails to outperform its predecessor a success.

    Especially when you consider that the prices have increased after launch, as opposed to decreasing as is normal, and have remained artificially high since, due to limitations at TSMC, which renders the cost argument pretty much moot.

    How about an analogy:

    6870 is to 5870 as 4770 is to 4870.

    And it's on the same process, which makes it even worse, although you can't really blame AMD for that.

    You can very much blame their marketing department for making such a terrible decision, though.

    It's a terrible name, that's the whole point; at whatever price, you can't call it a 6870 if it can't beat a 5870.
    Reply
  • Trefugl - Friday, October 22, 2010 - link

    yes i understand that, but i cant see how you can call a direct replacement that fails to outperform its predecessor as a success.


    But the issue is that the 68xx series alone aren't really replacing the 58xx series. I think they are really splitting what the direct replacement to that market would have been into two - the 69xx (high-end enthusiast) and the 68xx (high-end mid-range).

    I agree that the naming scheme isn't the best, but I think a lot of that could have been mitigated (and maybe even made a non-issue) if the 68xx's weren't the first to launch. If the 69xx came out first people would have accepted them and been happy, but instead we have b*tching because of naming confusion...
    Reply
  • Targon - Sunday, October 24, 2010 - link

    I missed this too until someone pointed out what I missed. The Radeon 6900 series will replace the 5800 series at the high end, and IS the proper high end part you are looking for.

    Back when DirectX 9 first came out, ATI only had DirectX 9 support in the old Radeon 9500 and 9700. When the X300, X600, and X800 came out, notice that AMD took the cards and started at 600 and 800, rather than 500 and 700 for the mid ranged and high end cards. This has continued a bit. In the HD 2000 series, you even had the HD 2900XT on the high end of the series, but then they went to the 3800, 4800, and 5800 series to mark the high end cards.

    So, AMD/ATI has been tweaking the names a fair bit. What initially threw me off is that the next generation high end cards are not the first cards to show up, and we have the mid-ranged cards showing up first.

    If the article said clearly, "We are reviewing the next generation mid range cards with the high generation 6900 due out next month" right up front in the article instead of buried in the text somewhere on page 2(or was it 3), there would have been less confusion.

    I don't mind the change in numbers if all parts come out at the same time, but for now, there is ONLY confusion because we have yet to see the 6970.
    Reply
  • GaMEChld - Friday, October 14, 2011 - link

    I love how people are arguing over this naming change. As if people who buy discrete cards or look at video card specs don't know what they're doing. If you don't know what you're buying, it serves you right.

    I don't know why this was so hard for people to understand. The 5700 was incredibly successful. AMD wanted to preserve that card for its performance and value. Thus, the 6700 name was taken. The 6800 model is a new line that sits BETWEEN where the 5700 and 5800 lines sat. If you recall, there was a MASSIVE performance gap between those lines, and AMD felt they should have something to bridge that gap.

    The new 6800 line bridges that gap. It offers NEAR 5800 power at a significant price reduction.

    And now ALL of the top-tier cards are housed under the 6900 bracket, with the 6990 taking the dual-GPU slot. If I had anything to complain about, it's the abandonment of the X2 designation on dual-GPU cards.

    In fact, the only thing people should be angry about is the fact that the 6700 is virtually identical to the 5700 and offers little performance advantage. THAT is what is reminiscent of the 8800GT -> 9800GT transition. However, since the 5700 was a midrange product, maybe it received less attention than it should have.
    Reply
  • DanaG - Friday, October 22, 2010 - link

    Now, if the 6870 is what should've been a 6770, and a 6970 is what should've been a 6870... then what'll they call what should've been a 6970? 6-10-70 / 6ten70? 6X70? 6999? Or will they go to 6970 X2? Reply
  • spigzone - Saturday, October 23, 2010 - link

    6990 ... that wasn't so hard now, was it? Reply
  • AMD_Pitbull - Saturday, October 23, 2010 - link

    Gotta say, I agree 100%. I really don't understand why everyone is getting so bloody upset with this. New product, new line. You couldn't predict what was going to happen? Sorry. Companies like to keep people guessing.

    Also, if you really want to get technical, this 6870 DOES beat the 5870 in a few things as well. Overall a more effective product AND cheaper? Win in my books. Sorry QQ'ers.
    Reply
  • dvijaydev46 - Saturday, October 23, 2010 - link

    I tried converting a video file using my 5770 Hawk with MediaEspresso 6 (with hardware acceleration enabled of course), I wasn't impressed but Mediashow 5 properly utilized the GPU power and the speed difference in converting was clear. I'm not sure if there was a problem in the installation of my copy of MediaEspresso 6, but I think you guys can use Mediashow 5 to see if there is any difference in video conversion time with an AMD GPU as I don't have any other card. Reply
  • 529th - Saturday, October 23, 2010 - link

    The marketers wanted to differentiate themselves from Nvidia; that's why they are putting their second-place cards in the same category as Nvidia's second-place cards.

    If you are shopping for a top-of-the-line card you should know at least a little bit about them, although the uneducated video-card shopper would think that a 470 and a 5870 or 6870 are on the SAME performance level, WHICH ISN'T TOO FAR FROM THE TRUTH, but I think it's here where AMD's marketers are trying to make a statement.

    I could be wrong, I had very little sleep last night. Cedar Point was a blast!
    Reply
  • SininStyle - Saturday, October 23, 2010 - link

    Can I just say THANK YOU for adding an OC edition of the 460. Don't know why everyone is whining. If you don't want to know how an OC edition compares, then ignore the stupid bench for it. Why is it such a huge deal?
    I personally am glad they included it, and this is why: the 460 1GB stock is 675MHz and can OC "reliably" to 850MHz. That's a 175MHz gain, and it's noticeable. Stock volts, stock fan. And for those who want to claim heat, mine shows 64C at 75% fan in OCCT. The 6870 gets a 50MHz OC at stock volts/fan. SEE why this is important, people? $180 vs $240 with the same results.

    Now with voltage changes I'm sure they both have room to go; I'm not sure how much. I tend to shy away from higher voltages, at least for now.

    The 6850 is the better buy between the 2 68xx cards. That has a lot of headroom to OC. That would even be a better comparison to the 460 due to the price. And owning the 460 doesn't make me a fanboy, and I will say you can flip a coin for value on these 2.

    So again, thanks for the added information. Can't see why anyone would complain about more info. If you don't like the info, ignore it if that makes you feel better. Feel free to add OC'ed 6850s and 6870s; I look forward to the comparison.
    Reply
  • Parhel - Saturday, October 23, 2010 - link

    "The 460 1gb stock is 675mhz and can OC "reliably" to 850mhz"

    No, it absolutely cannot. The FTW card is a "golden sample", which is why there are so few available. Stock cooling on a stock card will not get you to 850MHz with 24/7 reliability. You *might* get to 800MHz, probably a bit less. That's a great value, IMO. If I were in the market at the moment, I'd pick a base-model GTX 460 and OC it. Not arguing that point at all. But presenting this card in the 6870 launch article is a sham and a major black eye to Anandtech's credibility.
    Reply
  • rom0n - Saturday, October 23, 2010 - link

    Is it possible to post the GPU-Z of the HD6850? It seems there are numerous cases where HD6850 press samples with 1120 stream processors were sent out to reviewers. See
    http://benchmarkreviews.com/index.php?option=com_c... If this happens to be one of them, the results may be a little misleading. If not, then it'll reaffirm the results.
    Reply
  • GullLars - Saturday, October 23, 2010 - link

    This means a 6870 with an open-air fan optimized for noise will be my early winter solstice present for myself, together with the 4x C300 64GB I just got :D
    I went for a value upgrade of my old rig with a P2 X6 1090T, 8GB Kingston Value DDR3, and an AM3 mobo with SB850, so once I get both the SSDs in RAID-0 and the GPU, I'll be a happy camper (or rusher) <3
    It'll tide me over until I can get Bulldozer or a next-gen Intel (high-end/workstation) around winter 2011/2012.
    Reply
  • poohbear - Saturday, October 23, 2010 - link

    "Apparently a small number of the AMD Radeon HD 6850 press samples shipped from AIB partners have a higher-than-expected number of stream processors enabled.

    This is because some AIBs used early engineering ASICs intended for board validation on their press samples. The use of these ASICs results in the incorrect number of stream processors. If you have an HD 6850 board sample from an AIB, please test using a utility such as GPU-z to determine the number of active stream processors. If that number is greater than 960, please contact us and we will work to have your board replaced with a production-level sample.

    All boards available in the market, as well as AMD-supplied media samples, have production-level GPUs with the correct 960 stream processors."

    So which one did Anandtech get? False marketing is such BS; I just wanna be sure your benchmarks for the 6850 are reliable and we're not getting overrated numbers due to a cherry-picked review sample.
    Reply
  • lakrids - Saturday, October 23, 2010 - link

    The review ended up looking like an advertisement for EVGA at page 7 and beyond. Why EVGA? Why not some other brand?
    Why include that brand at all? Just mark the card "GTX 460 OC'd 850MHz".

    At the very first benchmark: Crysis 2560x1600, you didn't include the reference GTX 460, you pitched the HD6870 against the EVGA overclocked version. EVGA here, EVGA there, EVGA everywhere.

    Would you blame me if I suspect you of being on EVGA's paycheck?
    Reply
  • Lolimaster - Sunday, October 24, 2010 - link

    When I call you an Intel/Nvidia-biased site I'm saying the truth. Are you reviewing the HD6000, or doing an EVGA product review?

    This is an insult.

    Message:
    Nvidia will disappear like the dodo, just a bit more time and at that time all this sh1t will end.
    Reply
  • SininStyle - Sunday, October 24, 2010 - link

    You do understand that if Nvidia vanishes, the price of GPUs goes through the roof, right? Nvidia isn't going to vanish any earlier than Radeon. Saying either just translates into "I'm a fanboy."

    Stop defending a sticker and start shopping price/performance. Neither company would hesitate to rape your wallet if the other would allow it. Case in point: look at the price of the 57xx and 58xx cards 2 months ago. Then look at the price of the same cards, including the 68xx cards, now. Do any of these cards perform less than they did 2 months ago? But the price is a whole lot cheaper, isn't it? Well, you can thank the 460 for that. Competition results in better pricing for the same performance. You should be thanking Nvidia, not hating them.
    Reply
  • Super_Herb - Sunday, October 24, 2010 - link

    I love it - "as a matter of policy we do not include overclocked cards on general reviews"..........but this time nVidia said pretty please so we did. But because our strict ethical policy doesn't allow us to include them we'll just tell you we did it this one special time because a manufacturer specifically sent us a special card and then our integrity is still 100% intact......right? Besides, the "special" card nVidia sent us was so shiny and pretty!

    Back to [H]ard to get the real story.
    Reply
  • Will Robinson - Sunday, October 24, 2010 - link

    It's amazing how many sites have used the FTW overclocked card as their NVDA comparison card.
    Think it's a fluke? ...Even websites in Sweden were sent one, with directions from NVDA to use it... too bad the FTW card isn't even available in that country...
    Reply
  • mcnels1 - Sunday, October 24, 2010 - link

    On the power use page, the article states :

    Because we use a 1200W PSU in our GPU test rig our PSU efficiency at idle is quite low, leading to the suppression of the actual difference between cards.

    Actually, an inefficient power supply has the opposite effect, increasing the apparent difference. If the power supply were 100% efficient, then each additional watt used by the graphics card would show up as one additional measured watt used. But if the power supply is only 50% efficient, each additional watt used by the graphics card is measured as two additional watts.
    Reply
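    [Editor's note] mcnels1's correction above is just arithmetic: wall-measured power equals DC load divided by PSU efficiency, so a lower efficiency inflates the measured gap between two cards rather than suppressing it. A minimal sketch with made-up wattages (the 150W/160W figures are hypothetical, not taken from the review):

    ```python
    # Sketch of the PSU-efficiency arithmetic with hypothetical numbers:
    # wall power = component power / efficiency, so a LESS efficient PSU
    # makes the measured difference between two cards LARGER, not smaller.

    def wall_power(component_watts, efficiency):
        """Power drawn at the wall for a given DC load and PSU efficiency (0-1)."""
        return component_watts / efficiency

    card_a, card_b = 150.0, 160.0  # hypothetical DC draw of two video cards

    for eff in (1.00, 0.80, 0.50):
        delta = wall_power(card_b, eff) - wall_power(card_a, eff)
        print(f"efficiency {eff:.0%}: measured difference {delta:.1f} W")
    # A 10 W real difference appears as 10 W at the wall at 100% efficiency,
    # 12.5 W at 80%, and 20 W at 50%.
    ```

    So a low-efficiency operating point exaggerates, rather than suppresses, the idle gap between cards, which is the commenter's point.
    
    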
  • JEEPMON - Sunday, October 24, 2010 - link

    Please lose the 'Mexican standoff'. Just use 'standoff'; it's much more professional and politically correct. Reply
  • Wolfpup - Monday, October 25, 2010 - link

    Regarding texture filtering/mip-map levels: is it possible this deficiency has existed since at least 2005 in AMD parts? I notice that in a LOT of Xbox 360 games. Reply
  • Sabresiberian - Monday, October 25, 2010 - link

    I'm a gamer, video is a far second for me, but AMD just put the crown firmly on their head and if that's what I was after I'd be thrilled with these cards at these prices.

    ;)
    Reply
  • DarkUltra - Wednesday, October 27, 2010 - link

    Have you checked in-game texture quality during motion? My GeForce GTX 285 has shimmering textures at max texture quality and 16xAF, which are very noticeable when walking around in a game. As the GTX 285 has the same type of AF quality, is this shimmering also present on the 480 or the 6870?

    http://www.youtube.com/watch?v=JVjM_vrsBO8
    Reply
  • Quidam67 - Friday, October 29, 2010 - link

    Well that's odd.

    After reading about the EVGA FTW, and its mind-boggling factory overclock, I went looking to see if I could pick one of these up in New Zealand.

    Seems you can, or maybe not. As per this example http://www.trademe.co.nz/Browse/Listing.aspx?id=32... the clocks are 763Mhz and 3.8 on the memory?!?

    What gives? How can EVGA give the same name to a card and then have different specifications on it? So good thing I checked the fine print, or else I would have been bummed out if I'd bought it and then realised it wasn't clocked like I thought it would be.
    Reply
  • Murolith - Friday, October 29, 2010 - link

    So..how about that update in the review checking out the quality/speed of MLAA? Reply
  • CptChris - Sunday, October 31, 2010 - link

    As the cards were compared to the OC nVidia card I would be interested in seeing how the 6800 series also compares to a card like the Sapphire HD5850 2GB Toxic Edition. I know it is literally twice the price as the HD6850 but would it be enough of a performance margin to be worth the price difference? Reply
  • gochichi - Thursday, November 04, 2010 - link

    You know, maybe I hang in the wrong circles, but I by far keep up to date on GPUs more than anyone I know. Not only that, but I am eager to update my stuff if it's reasonable. I want it to be reasonable so badly because I simply love computer hardware (more than games per se, or as much as the games... it's about hardware for me in and of itself).

    Not getting to my point fast enough. I purchased a Radeon 3870 at Best Buy (Best Buy had an oddly good deal on these at the time, Best Buy doesn't tend to keep competitive prices on video cards at all for some reason). 10 days later (so I returned my 3870 at the store) I purchased a 4850, and wow, what a difference it made. The thing of it is, the 3870 played COD 4 like a champ, the 4850 was ridiculously better but I was already satisfied.

    In any case, the naming... the 3870 was no more than $200.00, I think it was $150.00. And it played COD4 on a 24" 1920x1200 monitor with a few settings not maxed out, and played it so well. The 4850 allowed me to max out my settings. Crysis sucked, Crysis still sucks and Crysis is still a playable benchmark. Not to say I don't look at it as a benchmark. The 4850, on the week of its release, was $199.99 at Best Buy.

    Then gosh oh golly there was the 4870 and the 4890, which simply took up too much power... I am simply unwilling to buy a card that uses more than one extra 6-pin connector just so I can go out of my way to find something that runs better. So far, my 4850 has left me wanting more in GTA IV (notice again how it comes down to hardware having to overcome bad programming; the 4850 is fast enough for 1080p but it's not a very well ported game, so I have to defer to better hardware). You can stop counting the ways my 4850 has left me wanting more at 1920x1200. I suppose maxing out Starcraft II would be nice also.

    Well, then came out the 5850, finally a card that would eclipse my 4850... but oh wait, though the moniker was the same (3850 = so awesome, so affordable, the 4850 = so awesome, so affordable, the 5850 = two 6-pin connectors, so expensive, so high end) it was completely out of line with what I had come to expect. The 4850 stood without a successor. Remember here that I was going from 3870 to 4850, same price range, way better performance. Then came the 5770, and it was marginally faster but just not enough change to merit a frivolous upgrade.

    Now, my "need" to upgrade is as frivolous as ever, but finally, a return to sanity with the *850 moniker standing for fast, and midrange. I am a *850 kind of guy through and through, I don't want crazy power consumption, I don't want to be able to buy a whole, really good computer for the price of just a video card.

    So, anyhow, that's my long story basically... the strange and utterly upsetting name was the 5850; the 6850 is actually right in line with what the naming should always have stayed as. I wouldn't know why the heck AMD tossed me a curve ball via the 5850, but I will tell you that it's been a really long time coming to get a true successor in the $200-and-under range.

    You know, around the time of the 9800GT and the 4850, you actually heard people talk about buying video cards while out with friends. The games don't demand much more than that... so $500 cards that double their performance is just silly silly stuff and people would rather buy an awesome phone, an iPad, etc. etc. etc.

    So anyhow, enough of my rambling, I reckon I'll be silly and get the true successor to my 4850... though I am assured that my Q6600 isn't up to par for Starcraft II... oh well.
    Reply
  • rag2214 - Sunday, November 07, 2010 - link

    The 6800 series may not beat the 5870 yet, but it is the start of HDMI 1.4 for 3D HD, which is not available on any other ATI graphics card. Reply
  • Philip46 - Monday, November 15, 2010 - link

    The review asked why there was any reason to buy a (non-OC'ed) 460.

    How about benchmarks of games using PhysX?

    For instance, Mafia 2 hits 32fps @ 1080p (i7-930 CPU) when using PhysX on high, while the 5870 manages only 16.5fps; I tested both cards myself.

    How about a GTA IV benchmark? The Zotac 2GB GTX 460 runs the game more smoothly (the same avg fps, except the min fps on the 5850 are lower in the daytime) than the 5850 (2GB).

    How about even a Far Cry 2 benchmark?

    C'mon, Anandtech! Let's get some real benchmarks that cover all aspects of gaming features.

    How about adding in driver stability? Etc.

    And before anyone calls me biased, I had both the Zotac GTX 460 and the Sapphire 5850 2GB a couple weeks back, and overall I went with the Zotac 460. I play Crysis/Stalker/GTA IV/Mafia 2/Far Cry 2, etc. @ 1080p, and the 460 just played them all more stably... even if Crysis/Stalker were some 10% faster on the 5850.

    BTW: Bad move by Anandtech to include the 460 FTW!
    Reply
  • animekenji - Saturday, December 25, 2010 - link

    Barts is the replacement for Juniper, NOT Cypress. Cayman is the replacement for Cypress. If you're going to do a comparison to the previous generation, then at least compare it to the right card. HD6850 replaces HD5750. HD6870 replaces HD5770. HD6970 replaces HD5870. You're giving people the false impression that AMD knocked performance down with the new cards instead of up when HD6800 vastly outperforms HD5700 and HD6900 vastly outperforms HD5800. Stop drinking the green kool-aid, Anandtech. Reply
