
173 Comments


  • calumhm - Friday, September 11, 2009 - link

    I mean, ATI invented the unified pixel/shader architecture, or so I believe, and this generation they've had DX10.1-level hardware for something like seven months and counting before Nvidia has any.

    Also, I once read an article about SLI and CrossFire: SLI has only one rendering mode, scissor (meaning the screen is divided in two), whereas ATI has scissor, tiled (so the more demanding areas of the screen are better divided among the cards), and others.
    Also, you can combine any HD-series ATI card with any other HD-series card! That's way better than having to, say, buy another 7800 for SLI because just one isn't doing it anymore, even though the 9800 series is out. With CrossFire you could add a 4870 to your old 3870!

    I'm currently with Nvidia (a 9600GT (found one for £60!)) but am frequently impressed by ATI.

    -IS- ATI the smart customer's choice?
    Reply
  • billywigga - Friday, August 29, 2008 - link

    I'm pretty sure the 4870 is low profile. I've been looking everywhere for a low-profile graphics card and I think I found a high-end one, unlike the GeForce 8400 and the 8600; those aren't very good and they don't look good either. But where do I buy the 4870? Reply
  • Hrel - Thursday, August 21, 2008 - link

    Why has there been no article comparing the 8800GT to the 9800GT? Is it just a rebrand, or are there noticeable performance differences? It's 9-series, so I assume it has Hybrid Power, but I don't know. Anandtech, PLEASE! Do an article on this. Reply
  • billywigga - Friday, August 29, 2008 - link

    Bang for the buck, I'd get the 9800 because it's newer. Also, all the difference is that it has more integrated RAM; you're saving a lot if you just add more RAM to your computer. Reply
  • firewolfsm - Friday, August 01, 2008 - link

    I'm trying to do the same benchmark for Crysis for my 4850 as I have a similar system and a fresh vista install. Just wondering what kind of driver settings you used. Reply
  • spikeysting - Saturday, July 19, 2008 - link

    I just got it for $179 at Frys. Such a good deal. Reply
  • Yangorang - Tuesday, July 08, 2008 - link

    Anyone tried this mod?
    http://www.hardforum.com/showthread.php?t=1319658
    Reply
  • jALLAD - Friday, July 04, 2008 - link

    I would for sure go for the 4870. (I am a Quake Wars fan boy u see :P)
    But I was unsure how these would perform on Linux. Is the driver support reliable? Right now I use an NVIDIA 7950, and their support on Linux is almost shite. I was wondering whether it's better or worse...

    Anyone ?
    Reply
  • KriegenSchlagen - Monday, July 07, 2008 - link

    If you are a Quake Wars fan, aren't the scores higher in 3-4 GeForce SLI configs vs. Crossfire mode 4870? Reply
  • jALLAD - Wednesday, July 09, 2008 - link

    well I am looking forward to a single card setup. SLI or CF is beyond the reach of my pockets. :P

    Reply
  • Grantman - Friday, July 04, 2008 - link

    Thank you very much for including the 8800gt sli figures in your benchmarks. I created an account especially so I could thank Anand Lal Shimpi & Derek Wilson as I have found no other review site including 8800gt sli info. It is very interesting to see the much cheaper 8800gt sli solution beating the gtx 280 on several occasions. Reply
  • Grantman - Friday, July 04, 2008 - link

    When I mentioned "no other review site including 8800gt sli info" I naturally meant in comparison with the gtx280, gx2 4850 crossfire etc etc.

    Thanks again.
    Reply
  • ohodownload - Wednesday, July 02, 2008 - link

    computer-hardware-zone.blogspot.com/2008/07/ati-radeon-hd4870-x2-specification.html
    Reply
  • DucBertus - Wednesday, July 02, 2008 - link

    Hi,

    Nice article. Could you please add the amount of graphics memory on the cards to the "The Test" page of the article. The amount of memory matters for the performance and (not unimportant) the price of the cards...

    Cheers, DucBertus.
    Reply
  • hybrid2d4x4 - Sunday, June 29, 2008 - link

    Hello!
    Long-time reader here that finally decided to make an account. First off, thanks for the great review Anand and Derek, and hats off to you guys for following up to the comments on here.
    One thing that I was hoping to see mentioned in the power consumption section is if AMD has by any chance implemented their PowerXpress feature into this generation (where the discrete card can be turned off when not needed in favor of the more efficient on-board video- ie: HD3200)? I recall reading that the 780G was supposed to support this kind of functionality, but I guess it got overlooked. Have you guys heard if AMD intends to bring it back (maybe in their 780GX or other upcoming chipsets)? It'd be a shame if they didn't, seeing as how they were probably the first to bring it up and integrate it into their mobile solutions, and now even nVidia has their own version of it (Hybrid Power, as part of HybridSLI) on the desktop...
    Reply
  • AcornArmy - Sunday, June 29, 2008 - link

    I honestly don't understand what Nvidia was thinking with the GTX 200 series, at least at their current prices. Several of Nvidia's own cards are better buys. Right now, you can find a 9800 GX2 at Pricewatch for almost $180 less than a GTX 280, and it'll perform as well as the 280 in almost all cases and occasionally beat the hell out of it. You can SLI two 8800 GTs for less than half the price and come close in performance.

    There really doesn't seem to be any point in even shipping the 280 or 260 at their current prices. The only people who'll buy them are those who don't do any research before they buy a video card, and if someone's that foolish they deserve to get screwed.
    Reply
  • CJBTech - Sunday, June 29, 2008 - link

    Hey iamap, with the current release of HD 4870 cards, all of the manufacturers are using the reference ATI design, so they should all be pretty much identical. It boils down to the individual manufacturer's warranty and support. Sapphire, VisionTek, and PowerColor have all been great for me over the years, and VisionTek is offering a lifetime warranty on these cards. I've had poor experiences with HIS and Diamond, but probably wouldn't hesitate to get one of these from either of those manufacturers on this particular card (or the HD 4850) because they are the same card, ATI reference. Reply
  • Paladin1211 - Saturday, June 28, 2008 - link

    Now that the large, monolithic, underperforming chip is out, leaving AMD free to grab market share, I'm so excited about what happens next. As nVidia's strategy goes, they're now scaling down the chip. But pardon me, cut the GTX 280 in half and then price it at $324.99? That sounds so crazy!

    Does anyone remember AMD's shock treatment with the codename "Thunder"? DAAMIT has just opened "a can of whoop ass" on nVidia!
    Reply
  • helldrell666 - Friday, June 27, 2008 - link

    AnandTech, why didn't you use an AMD 790FX board to bench the Radeon cards instead of using an nVidia board for both nVidia and ATI cards? It would be more accurate to bench those cards on compatible boards.
    I think those cards would have worked better on an AMD board based on the Radeon Express 790FX chipset.
    Reply
  • iamap - Friday, June 27, 2008 - link

    I'm looking to buy the 4870 from Newegg when they get back in stock next week, but I'm not familiar with any of the manufacturers, except for Diamond, and I had problems with Diamond years ago.

    Diamond
    HIS
    Powercolor
    Sapphire Technology Limited
    VisionTek

    Any advice, especially ones to avoid?
    Reply
  • feelingshorter - Friday, June 27, 2008 - link

    Go with the one with the warranty, which would be VisionTek's lifetime warranty. Asus also offers a 3-year warranty. Reply
  • Nehemoth - Friday, June 27, 2008 - link

    Check this one
    http://www.tgdaily.com/content/view/38145/135/
    Reply
  • Gannon - Thursday, June 26, 2008 - link

    No supreme commander? :-O Reply
  • designerfx - Thursday, June 26, 2008 - link

    http://bensbargains.net/deal/69638/


    $195 -> $20 Newegg rebate -> $20 bensbargains rebate = $155!

    To think this card will get cheaper yet!

    I'm buying one asap. This is a freakin steal at 150 bucks.
    Reply
  • QEFX - Thursday, June 26, 2008 - link

    Heck, pick up 2. $310 for CF 4850s! Now there's "bang for the buck" on games that actually work properly with crossfire. Reply
  • Jjunior130 - Thursday, June 26, 2008 - link

    can i haz quantum physix? lol Reply
  • Mustanggt - Thursday, June 26, 2008 - link

    I was watching the 8800GT SLI and it was in the top two in most of the tests. For less than the price of two 4870s I could pick up an SLI board and another 8800GT, and perhaps also an E8400, to equal the $600 for two of these 4870s in CF. I'm talking about the resolution I use, 1680x1050, where the 8800GT was looking very good in SLI. Reply
  • BusterGoode - Thursday, June 26, 2008 - link

    I'd like to see the difference the GDDR5 made, and since clocking the 4850 up may not be possible right now, it would be nice to see the 4870 slowed down. If this has been asked or done already, sorry; so much info is pouring out now that it's hard to keep up. Thanks! Reply
  • DerekWilson - Sunday, June 29, 2008 - link

    this is an interesting request ... we'll look at the possibility ... Reply
  • BusterGoode - Sunday, June 29, 2008 - link

    Thanks, great article by the way Anandtech is my first stop for reviews. Reply
  • jay401 - Wednesday, June 25, 2008 - link

    Good but I just wish AMD would give it a full 512-bit memory bus bandwidth. Tired of 256-bit. It's so dated and it shows in the overall bandwidth compared to NVidia's cards with 512-bit bus widths. All that fancy GDDR4/5 and it doesn't actually shoot them way ahead of NVidia's cards in memory bandwidth because they halve the bus width by going with 256-bit instead of 512-bit. When they offer 512-bit the cards will REALLY shine. Reply
  • Spoelie - Thursday, June 26, 2008 - link

    Except that when R600 had a 512bit bus, it didn't show any advantage over RV670 with a 256bit bus. And that was with GDDR3 vs GDDR3, not GDDR5 like in RV770 case. Reply
  • JarredWalton - Thursday, June 26, 2008 - link

    R600 was a 512-bit ring bus with a 256-bit memory interface (four 64-bit interfaces); read about it here for a refresher: http://www.anandtech.com/showdoc.aspx?i=2552&p... Besides being more costly to implement, it used a lot of power and didn't actually end up providing provably better performance. I think it was an interesting approach that turned out to be less than perfect... just like NetBurst was an interesting design that turned out to have serious power limitations. Reply
  • Spoelie - Thursday, June 26, 2008 - link

    Except that it was not, that was R520 ;) and R580 is the X19x0 series. That second one proved to be the superior solution over time.

    R600 is the x2900xt, and it had a 1024bit ring bus with 512bit memory interface.
    Reply
  • DerekWilson - Sunday, June 29, 2008 - link

    yeah, r600 was 512-bit

    http://www.anandtech.com/showdoc.aspx?i=2988&p...">http://www.anandtech.com/showdoc.aspx?i=2988&p...

    looking at external bus width is an interesting challenge ... and gddr5 makes things a little more crazy in that clock speed and bus width can be so low with such high data rates ...

    but the 4870 does have 16 memory modules on it ... so that's a bit of a barrier to higher bit width busses ...
    Reply
  • JarredWalton - Wednesday, June 25, 2008 - link

    I'd argue that the 512-bit memory interface on NVIDIA's cards is at least partly to blame for their high pricing. All things being equal, a 512-bit interface costs a lot more to implement than a 256-bit interface. GDDR5 at 900MHz is effectively the same as GDDR3 at 1800MHz... except no one is able to make 1800MHz GDDR3. Latencies might favor one or the other solution, but latencies are usually covered by caching and other design decisions in the GPU world. Reply
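Jarred's equivalence above is just data-rate arithmetic: GDDR5 moves four bits per pin per quoted clock where GDDR3 moves two, so 900MHz GDDR5 and 1800MHz GDDR3 land on the same peak bandwidth. A quick sketch (the 256-bit width and 900MHz clock are the 4870's published figures; the helper function is just for illustration):

```python
# Peak memory bandwidth = bus width (bytes) x clock x transfers per clock.
# GDDR5 transfers 4 bits per pin per quoted clock; GDDR3 transfers 2.
def bandwidth_gbs(bus_width_bits, clock_mhz, transfers_per_clock):
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_width_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

gddr5 = bandwidth_gbs(256, 900, 4)    # HD 4870: 256-bit bus, 900MHz GDDR5
gddr3 = bandwidth_gbs(256, 1800, 2)   # the hypothetical 1800MHz GDDR3 part
print(gddr5, gddr3)  # both come out to 115.2 GB/s
```

So a 256-bit GDDR5 card can match the raw bandwidth of a 512-bit GDDR3 design running at half the per-pin data rate, with half the traces on the board.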
  • geok1ng - Wednesday, June 25, 2008 - link

    The tests showed what I feared: my 8800GT is getting too old to drive my Apple display at 2560x1600, even without AA! But the tests also showed that the 512MB of GDDR5 on the 4870 justifies the higher price tag over the 4850, something the 3870/3850 pair failed to demonstrate. The question remains: will 1GB of GDDR5 dethrone NVIDIA and rule the 30-inch realm of single-GPU solutions? Reply
  • IKeelU - Wednesday, June 25, 2008 - link

    "It is as if AMD and NVIDIA just started pulling out hardware and throwing it at each other"

    This makes me crack up...I just imagine two bruised and sweaty middle-aged CEO's flinging PCBs at each other, like children in a snowball fight.
    Reply
  • Thorsson - Wednesday, June 25, 2008 - link

    The heat is worrying. I'd like to see how aftermarket coolers work with a 4870. Reply
  • Final Destination II - Wednesday, June 25, 2008 - link

    http://www.techpowerup.com/reviews/Powercolor/HD_4...

    Look! Compare the Powercolor vs. the MSI.
    Somehow MSI seems to have done a better job with 4dB less.
    Reply
  • Final Destination II - Wednesday, June 25, 2008 - link

    Try ASUS, 7°C cooler. Reply
  • Justin Case - Wednesday, June 25, 2008 - link

    I thought it was only Johan, and it was sort of understandable since he's not a native English speaker, but it seems most Anandtech writers don't know the difference between "its" and "it's".

    "It's" means "it is" or "it has" (just as "he's" or "she's"). When you're talking about something that belongs to something else, you use "its" (or "his" / "her").

    In a sentence such as "RV770 in all it's [sic] glory.", you're clearly not saying "in all it is glory" or "in all it has glory"; you are saying "in all the glory that belongs to it". So you should use "its", not "it's".

    Even if you can't understand the difference (which seems pretty straightforward, but for some reason confuses some people), modern grammar checkers will pick this up 9 times out of 10.
    Reply
  • CyberHawk - Thursday, June 26, 2008 - link

    I am not a native English speaker, but I am well aware of the difference. I am sure the reviewers are as well... it's just that, with all this text, we can forgive them, can't we?

    I have a bachelor of computer science, studying for higher degree, but: I look at the technical side of the article, so I don't even notice the errors :D (although I can tell the difference I simply don't see it while reading)
    Reply
  • CyberHawk - Thursday, June 26, 2008 - link

    Oh, I forgot: maybe I'm just being too enthusiastic ;) Reply
  • JarredWalton - Wednesday, June 25, 2008 - link

    More likely is that with a 10000 word article and four lengthy GPU reviews in two weeks, errors slipped into the text. I know at one point I noticed Derek says "their" instead of "there" as well, and I can assure you that he knows the difference. I know I use Word's grammar checker, but I'm not sure Derek even uses Word sometimes. :) Reply
  • araczynski - Wednesday, June 25, 2008 - link

    slickdeals has posted a sale on the 4850's; between rebate and coupon, it's $150 each. can't beat that bang/$ with anything from nvidia.

    first ati cards that will ever be in my computers since i started with the voodoo/riva tnt :)
    Reply
  • Denithor - Wednesday, June 25, 2008 - link

    Page 15: first reference to "GTX 280" should be "GTX 260" instead.

    Page 19: I think you meant "type" not "time" in this paragraph.
    Reply
  • natty1 - Wednesday, June 25, 2008 - link

    This review is flawed. It shows greater than 100% scaling for Crossfire 4870 in Call of Duty 4. Why don't they just give us the raw numbers for both single and dual cards in the same scenario? Why use a method that will artificially inflate the Crossfire results? Reply
  • Denithor - Wednesday, June 25, 2008 - link

    If you read the comments before yours, you'd see the answer.

    Experimental error and/or improved scaling for each card versus a single card. Read the earlier comment for more details.
    Reply
  • natty1 - Thursday, June 26, 2008 - link

    There's no good reason to pull that garbage. People assume they are seeing raw numbers when they read these reviews. Reply
  • DerekWilson - Sunday, June 29, 2008 - link

    i don't understand what you mean by raw numbers ... these are the numbers we got in our tests ...

    we can't do crossfire on the nvidia board we tested and we can't do sli on the intel board we tested ...

    we do have another option (skulltrail) but people seemed not to like that we went there ... and it was a pain in the ass to test with. plus fb-dimm performance leaves something to be desired.

    in any case, without testing every solution on two different platforms we did the best we could in the time we had. it might be interesting to look at testing single card performance on two different platforms for all cards, but that will have to be a separate article and would be way too tough to do for a launch.
    Reply
  • Denithor - Wednesday, June 25, 2008 - link

    In Bioshock in the multiGPU section the SLI 9800GTX+ seems to fall down on the job. In all other benches this SLI beats out the GTX 280 easily, here it fails miserably. While even the SLI 8800GT beats the GTX 280. Methinks something's wrong here. Reply
  • jamstan - Wednesday, June 25, 2008 - link

    Egg's got them for $309.99. I'm gonna run two 4870s in CF. I planned on using a P45 board, but I am wondering if the P45's x8 per card will bottleneck the bandwidth and if I should go with an X48 board instead. When I research CF, all I seem to find is that losing any bandwidth at x8 versus x16 is "debatable". What I'm thinking is that 8 lanes can handle 4GB/s, so if I look at the 4870's 3.6Gbps of memory bandwidth, then x8 should be able to handle the 4870 without any performance hits. Is that correct, or am I all wet? Reply
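A unit check helps with the question above: the 4870's "3.6" figure is gigabits per second per memory pin, not the card's total bandwidth, and PCIe lane bandwidth is specced separately. A rough sketch, using the standard PCIe 2.0 rate of 500MB/s per lane per direction (the names here are just for illustration):

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> 500 MB/s per lane per direction.
PCIE2_LANE_MBS = 500

def pcie2_gbs(lanes):
    """Peak PCIe 2.0 bandwidth per direction, in GB/s."""
    return lanes * PCIE2_LANE_MBS / 1000

slot_x16 = pcie2_gbs(16)     # 8.0 GB/s
slot_x8 = pcie2_gbs(8)       # 4.0 GB/s

# The 4870's *local* memory bandwidth: 256-bit bus at 3.6 Gbps per pin.
local_mem = 256 / 8 * 3.6    # ~115.2 GB/s
print(slot_x16, slot_x8, local_mem)
```

The card's local memory bandwidth dwarfs even an x16 slot, so the GPU never streams at memory speed over PCIe anyway; whether x8 versus x16 measurably hurts a particular game is the empirical question the "debatable" answers are dancing around.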
  • jamstan - Friday, June 27, 2008 - link

    I contacted ATI and they said I was correct. A P45 board only running X8 per card in CF will bottleneck the massive DDR5 bandwidth of the 4870s. If you're gonna CF 2 4870s use an X38 or X48 board. Reply
  • SVM79 - Wednesday, June 25, 2008 - link

    I created an account just to say how awesome this article was. It was really nice to see all the technical details laid out and compared to the competition. I was lucky to get in on that $150 HD4850 price at Best Buy last week, and I am hoping future drivers will improve performance even more. Please keep up the good work on these articles!!! Reply
  • DerekWilson - Sunday, June 29, 2008 - link

    Wow, Anand and I are honored.

    We absolutely appreciate the feedback we've gotten from all of you guys (even the bad stuff cause it helps us refine our future articles).

    of course we enjoy the good stuff more :-)

    thanks again, everyone.
    Reply
  • D3SI - Wednesday, June 25, 2008 - link

    Long time reader, first time poster

    great article, very informative

    looks like the 4870 is the card to get, can't be beat at that price

    and yes, a lot of posters are reading way too much into it: "you're biased waaa waaa boo hoo"

    just get the facts from the article (that's what the charts and graphs are for) and then make your decision. if you can't do the simple math and come to the conclusion yourself that the $300 card is a better buy than the $650 one, then you deserve to get ripped off.
    Reply
  • joeschleprock - Wednesday, June 25, 2008 - link

    nVidia just got their pussy smoked. Reply
  • kelectron - Wednesday, June 25, 2008 - link

    a very important comparison is missing. for those who want to go in for a multi-GPU setup, the 260 SLI vs 4870 CF is a very important consideration since SLI scaling has always been better than CF, and the 260 scales very very well.

    in that case, if nvidia responds by reducing the price on the 260, the 260 SLI could be the real winner here. but sadly there were no 260 SLI benches.

    please give us a 260 SLI vs 4870 CF review.
    Reply
  • Final Destination II - Wednesday, June 25, 2008 - link

    Dear girls and guys,

    does anyone know of a manufacturer who offers an HD4850 with a better cooler? I'm desperately searching for one...


    Please reply!
    Reply
  • Graven Image - Wednesday, June 25, 2008 - link

    Asus recently announced a 4850 with a non-stock cooler, though their version still doesn't expel the air out the back like a dual-slot design (http://www.asus.com/news_show.aspx?id=11871). It's not available yet, though. My guess is we'll probably start seeing a couple of different fan and heatsink designs by mid-July. Reply
  • strikeback03 - Thursday, June 26, 2008 - link

    Only dual-slot card I've ever used was an EVGA 8800GTS 640, it sucked air in the back and blew it into the case. Reply
  • Final Destination II - Wednesday, June 25, 2008 - link

    Nice! 7°C cooler, that's a start! I guess I'll wait a bit more, then. Reply
  • Spacecomber - Wednesday, June 25, 2008 - link

    Although I'm somewhat dubious about dual card solutions, I keep looking at the benchmarks and then at the prices for a couple of 8800 GTs.

    Perhaps, if the 4870 forces Nvidia to reduce their prices for the GTX 260 and the GTX 280, they will likewise bring down the price for the 9800 GX2. This is already the fastest single card solution, and it sells for less than the GTX 280. If this card starts selling for under $400 (maybe around $350), will this become Nvidia's best answer to the 4870?

    Given the performance and the prices for the 4870 and the 9800 GX2 will Nvidia be able to price the GTX 280 competitively, or will it simply be vanity product - ridiculously priced and produced only in very small numbers?

    It should be interesting to see where the prices for video cards end up over the course of the next few weeks.
    Reply
  • kelmerp - Wednesday, June 25, 2008 - link

    Better HD knickknacks? Better offloading/upscaling? Reply
  • chizow - Wednesday, June 25, 2008 - link

    The HD4000 series has better HDMI sound support with 8-channel LPCM over HDMI, but still can't pass uncompressed bitstreams. Image quality hasn't changed, as there isn't really any room to improve. Reply
  • kelmerp - Wednesday, June 25, 2008 - link

    It would be nice to have a video card, where it doesn't matter how weak the current-gen processor is (say the lowliest celeron available), the card can still output 1080p HDTV without dropping any frames. Reply
  • Chaser - Wednesday, June 25, 2008 - link

    Good to have ATI back at the FRONT of the finish line. Reply
  • JPForums - Wednesday, June 25, 2008 - link

    Regarding the SLI scaling in The Witcher:
    The GTX 280 SLI setup may be running into a bottleneck or driver issues, rather than seeing inherent scaling issues. Consider: the 9800 GTX+ SLI setup scales from 22.9 to 44.5, so the problem isn't inherent to SLI scaling. Though it may point to scaling issues specific to the GTX 280, it is more likely that the problem lies elsewhere. I do, however, agree with your general statement that when CF is working properly, it tends to scale better. In my systems, it seems to require less CPU overhead.
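The scaling quoted above is easy to sanity-check (22.9 and 44.5 FPS are the figures from the comment; the factor of 2 assumes two GPUs):

```python
def multi_gpu_scaling(single_fps, multi_fps, num_gpus=2):
    """Return (speedup, efficiency in %) of a multi-GPU setup vs. one card."""
    speedup = multi_fps / single_fps
    return speedup, speedup / num_gpus * 100

speedup, efficiency = multi_gpu_scaling(22.9, 44.5)
print(f"{speedup:.2f}x speedup, {efficiency:.1f}% of perfect scaling")
# -> 1.94x speedup, 97.2% of perfect scaling
```

By this measure the 9800 GTX+ pair scales almost perfectly, which supports the argument that the GTX 280's weaker result is a bottleneck or driver issue rather than anything inherent to SLI.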
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    it looks like the witcher hits an artificial 72fps barrier ... not sure why as we are running 60hz displays, but that's our best guess. vsync is disabled, so it is likely a software issue. Reply
  • JarredWalton - Wednesday, June 25, 2008 - link

    Again, try faster CPUs to verify whether you are game limited or if there is a different bottleneck. The Witcher has a lot of stuff going on graphically that might limit frame rates to 70-75 FPS without a 4GHz Core 2 Duo/Quad chip. Reply
  • chizow - Wednesday, June 25, 2008 - link

    It looks like there seems to be a lot of this going on in the high-end, with GT200, multi-GPU and even RV770 chips hitting FPS caps. In some titles, are you guys using Vsync? I saw Assassin's Creed was frame capped, is there a way to remove the cap like there is with UE3.0 games? It just seems like a lot of the results are very flat as you move across resolutions, even at higher resolutions like 16x10 and 19x12.

    Another thing I noticed was that multi-GPU seems to avoid some of this frame capping but the single-GPUs all still hit a wall around the same FPS.

    Anyways, 4870 looks to be a great part, wondering if there will be a 1GB variant and if it will have any impact on performance.
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    the only test i know where the multi-gpu cards get past a frame limit is oblivion.

    we always run with vsync disabled in games.

    we tend not to try forcing it off in the driver as, interestingly, that decreases performance in situations where it isn't needed.

    we do force it off where we can, but assassins creed is limiting the frame rate even in the absence of vsync.

    not sure about higher memory variants ... gddr5 is still pretty new, and density might not be high enough to hit that. The 4870 does have 16 memory chips on it for its 256-bit memory bus, so space might be an issue too ...
    Reply
  • JarredWalton - Wednesday, June 25, 2008 - link

    Um, Derek... I think you're CPU/platform limited in Assassin's Creed (http://www.anandtech.com/video/showdoc.aspx?i=3320...). You'll certainly need something faster than 3.2GHz to get much above 63FPS in my experience. Try overclocking to 4.0GHz and see what happens. Reply
  • weevil - Wednesday, June 25, 2008 - link

    I didnt see the heat or noise benchmarks? Reply
  • gwynethgh - Wednesday, June 25, 2008 - link

    No info from Anandtech on heat or noise. The info on the 4870 is most needed as most reviews indicate the 4850 with the single slot design/cooler runs very hot. Does the two slot design pay off in better cooling, is it quiet? Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    a quick, not really well controlled test shows the 4850 and 4870 to be on par in terms of heat ... but i can't really go more into it right now.

    the thing is quiet under normal operation but it spins up to a fairly decent level at about 84 degrees. at full speed (which can be heard when the system powers up or under ungodly load and ambient heat conditions) it sounds insanely loud.
    Reply
  • legoman666 - Wednesday, June 25, 2008 - link

    I don't see the AA comparisons. There is no info on the heat or noise either. Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    the aa comparison page had a problem with nested quotes in some cases in combination with some google ads on firefox (though it worked in safari ie and opera) ...

    this has been fixed ...

    for heat and noise our commentary is up, but we don't have any quantitative data here ... we just had so much else to pack into the review that we didn't quite get testing done here.
    Reply
  • araczynski - Wednesday, June 25, 2008 - link

    ...as more and more people are hooking up their graphics cards to big HDTVs instead of wasting time with little monitors, i keep hoping to find out whether the 9800gx2/4800 lines have proper 1080p scaling/synching with the tvs? for example the 8800 line from nvidia seems to butcher 1080p with tv's.

    anyone care to speak from experience?
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    i havent had any problem with any modern graphics card (dvi or hdmi) and digital hdtvs

    i haven't really played with analog for a long time and i'm not sure how either amd or nvidia handle analog issues like overscan and timing.
    Reply
  • araczynski - Wednesday, June 25, 2008 - link

    interesting, what cards have you worked with? i have the 8800gts512 right now and have the same problem as with the 7900gtx previously. when i select 1080p for the resolution (which the drivers recognize the tv being capable of as it lists it as the native resolution) i get a washed out messy result where the contrast/brightness is completely maxed (sliders do little to help) as well as the whole overscan thing that forces me to shrink the displayed image down to fit the actual tv (with the nvidia driver utility). 1600x900 can usually be tolerable in XP (not in vista for some reason) and 1080p is just downright painful.

    i suppose it could be my dvi to hdmi cable? it's a short run, but who knows... i just remember reading a bit on the nvidia forums that this is a known issue with the 8800 line, so i was curious as to how the 9800 line or even the 4800 line handle it.

    but as the previous guy mentioned, ATI does tend to do the TV stuff much better than nvidia ever did... maybe 4850 crossfire will be in my rig soon... unless i hear more about the 4870x2 soon...
    Reply
  • ChronoReverse - Wednesday, June 25, 2008 - link

    ATI cards tend to do the TV stuff properly Reply
  • FXi - Wednesday, June 25, 2008 - link

    If Nvidia doesn't release SLI to Intel chipsets (and on a $/perf ratio it might not even help if it does), the 4870 in CF is going to drive sales of the 260 into the ground.

    Releasing SLI on Intel and easing the price might help, but of course they won't do it. It looks like ATI hasn't just come back; they've got a very, very good chip on their hands.
    Reply
  • Powervano - Wednesday, June 25, 2008 - link

    Anand and Derek

    What about temperatures of the HD4870 under IDLE and LOAD? Page 21 only shows power consumption.
    Reply
  • iwodo - Wednesday, June 25, 2008 - link

    Given how greatly ATI's architecture relies on maximizing its shader use, wouldn't driver optimization be much more important than it is for Nvidia in this regard?

    And what is ATI doing about Nvidia's CUDA? CUDA now has much bigger exposure than whatever ATI is offering... CAL or CTM... I don't even know anymore.
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    getting exposure for AMD's own GPGPU solutions and tools is going to be tough, especially in light of Tesla and the momentum NVIDIA is building in the higher performance areas.

    they've just got to keep at it.

    but i think their best hope is in Apple right now with OpenCL (as has been mentioned above) ...

    certainly AMD need to keep pushing their GPU compute solutions, and trying to get people to build real apps that they can point to (like folding) and say "hey look we do this well too" ...

    but in the long term i think NVIDIA's got the better marketing there (both to consumers and developers) and it's not likely going to be until a single compute language emerges as the dominant one that we see level competition.
    Reply
  • Amiga500 - Wednesday, June 25, 2008 - link

    AMD are going to continue to use the open-source alternative, OpenCL.


    In a relatively fledgling program environment, it makes all the sense in the world for developers to use the open source option, as compatibility and interoperability can be assured, unlike older environments like graphics APIs.


    OS X v10.6 (Snow Leopard) will use OpenCL.
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    OpenCL isn't "open source" ...

    Apple is trying to create an industry standard heterogeneous compute language.

    What we need is a compute language that isn't "owned" by a specific hardware maker. The problem is that NVIDIA has the power to redefine the CUDA language as it moves forward to better fit their architecture. Whether they would do this or not is irrelevant in light of the fact that it makes no sense for a competitor to adopt the solution if the possibility exists.

    If NVIDIA wants to advance the industry, eventually they'll try and get CUDA ANSI / ISO certified or try to form an industry working group to refine and standardize it. While they have the exposure and power in CUDA and Tesla they won't really be interested in doing this (at least that's our prediction).

    Apple is starting from a standards centric view and I hope they will help build a heterogeneous computing language that combines the high points of all the different solutions out there now into something that's easy to develop for and that can generate code that runs well on all architectures.

    but we'll have to wait and see.


    Reply
  • Amiga500 - Wednesday, June 25, 2008 - link

    Apple has passed control of OpenCL over to the Khronos Group, which manages open standards.

    To all intents and purposes, it is open source. :-)
    Reply
  • emergancyexit - Wednesday, June 25, 2008 - link

    i hope you test what 3x crossfire can do. maybe a 4x 4850 vs 3x GTX 260 just to satisfy us readers for the moment would be lovely! Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    i'm not sure if this is supported out of the box ... ill have to check it out ... Reply
  • emergancyexit - Wednesday, June 25, 2008 - link

    i would really like to know what type of performance these cards could get in an MMO (and hopefully compare them to some cheaper cards). Games im interested in are some of the newer titles like Age of Conan (i hear its graphics are great and a workout for even an 8800 Ultra) and EVE Online (their new graphics engine works cards pretty hard too).

    MMO graphics usually get pretty intensive, with some odd 200+ characters flying around shooting fireballs everywhere with missiles sailing through the air in a land of hundreds of monsters as far as the eye can see. it can get pretty demanding on a gaming computer, just as much (if not more) as a hit new title.

    for example, on my current rig i can get around 50FPS steady at 1440x900, but on EVE Online i get 35 at the most at peaceful times and 20 or even 15 in a large fight with FEW graphics options selected.
    Reply
  • MIP - Wednesday, June 25, 2008 - link

    Great review, the 4870 looks to be fantastic value. However, we're missing the 'heat and noise' part. Reply
  • skiboysteve - Wednesday, June 25, 2008 - link

    Not only do these cards rock, but I wouldn't be surprised if AMD has an ace up its sleeve with the 4870x2... with that crossfire interconnect directly connected to the data hub that you showed on the chart. That and the fact that they have been looking forward to this crossfire strategy of attacking the high end for quite some time so they might have some tricky driver stuff coming with it.

    I have been disappointed with the heat and power consumption of these cards. But:
    1) Someone said PowerPlay is getting a driver tweak, and I can always clock them lower in 2D than 500/1000 (which is insane for 2D)
    2) That hardware site someone linked earlier showed a more than 50% reduction in temperatures with an aftermarket cooler! That's insane!!

    And finally, if I can get #1 and #2 fixed... I want to know how well these babies overclock. If I can get a 4850 running like a 4870 or better... yum. And in that case, how high will a 4870 OC? And I want to know this with a non-stock cooler, because apparently the stock ones suck. With a non-stock cooler, if the 4850 clocks up to 4870 level, but the 4870 clocks way up too... i'm gonna have to grab a 4870.

    So yeah, fix #1 and #2 and find me non-stock cooler OC #s and I'll go buy one (maybe two?) when Nehalem comes out
    Reply
  • Powered by AMD - Wednesday, June 25, 2008 - link

    Impressive review, Thanks :)
    A few glitches:
    It says "Power Consumption, Heat and Noise", but the graphs only show power consumption.
    On Page 17 (The Witcher), in the second paragraph, it says 390X2 instead of 3870.

    Thanks again.
    Cheers from Argentina.
    Reply
  • Conscript - Wednesday, June 25, 2008 - link

    at least that was the title of the second to last page... but I only see two power consumption graphs? Reply
  • Proteusza - Wednesday, June 25, 2008 - link

    I quote one Kristopher Kubricki regarding whether the RV770 is inferior to the GT200:

    "It is. Even AMD isn't going to tell you otherwise. You can debate this all you want, but it's still a $200 video card."

    So, please tell me now why I should pay $650 for a GTX280. I'm struggling to see the logic here.

    Source: http://www.dailytech.com/Update+AMD+Preps+Radeon+4...">http://www.dailytech.com/Update+AMD+Pre...50+Launc...
    (near the bottom)
    Reply
  • AbRASiON - Wednesday, June 25, 2008 - link

    I can live with a greedier card than my 8800GT but I refuse to put up with a noisy machine.

    Any comments on the heat and noise please? would be nice!
    Reply
  • 0g1 - Wednesday, June 25, 2008 - link

    In the article it says the GT200 doesn't need to do ILP. It only has 10 threads. Each of those threads needs ILP for each of the SPs. The problem with AMD's approach is that each SP has 5 units and is aimed directly at processing x,y,z,w matrix-style operations. Doing purely scalar operations on AMD's SPs would use only 1 out of the 5 units. So, if you want to get the most out of AMD's shaders, you should be doing vector calculations. Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    The GT200 doesn't worry about ILP at all.

    a single thread doesn't run width-wise across all execution units. instead, different threads execute the exact same single scalar op on their own unique bit of data (there is only one program counter per SM for a context). this is all TLP (thread level parallelism) and not ILP.

    AMD's compiler can pack multiple scalar ops into a 5-wide VLIW operation.

    on purely scalar code with many independent ops in a long program, AMD can fill all their units and get close to peak performance. explicit vector instructions are not necessary.
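    to illustrate the idea (this is just a toy sketch i'm making up for the comment, not AMD's actual compiler or ISA), here's roughly how a greedy packer could bundle independent scalar ops into 5-wide VLIW slots:

    ```python
    # Toy illustration of VLIW packing: greedily bundle independent scalar ops
    # into 5-wide instruction words. The ops and dependence model are invented
    # for demonstration; this is not AMD's real ISA or compiler.

    def pack_vliw(ops, width=5):
        """ops: list of (dest, srcs) scalar ops in program order.
        Returns a list of bundles; ops within a bundle are independent."""
        bundles, done = [], set()
        while len(done) < len(ops):
            bundle, written = [], set()
            for i, (dest, srcs) in enumerate(ops):
                if i in done or len(bundle) >= width:
                    continue
                # every earlier op that produces one of our inputs must already
                # sit in a previous bundle (no same-bundle read-after-write)
                ready = all(j in done for j, (d, _) in enumerate(ops)
                            if j < i and d in srcs)
                if ready and dest not in written:  # also avoid write conflicts
                    bundle.append(i)
                    written.add(dest)
            done.update(bundle)
            bundles.append([ops[i] for i in bundle])
        return bundles

    # five independent scalar ops fill one full-width bundle; the two ops that
    # consume their results land in a second bundle
    ops = [
        ("a", ["x"]), ("b", ["y"]), ("c", ["z"]), ("d", ["w"]), ("e", ["x", "y"]),
        ("f", ["a", "b"]),  # needs results from the first bundle
        ("g", ["c"]),
    ]
    bundles = pack_vliw(ops)
    print(len(bundles), [len(b) for b in bundles])  # 2 [5, 2]
    ```

    the point being: as long as a long scalar program has enough independent ops in flight, the packer keeps all 5 slots busy without the shader author ever writing a vector instruction.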
    Reply
  • gigahertz20 - Wednesday, June 25, 2008 - link

    http://www.hardwarecanucks.com/forum/hardware-canu...">http://www.hardwarecanucks.com/forum/ha...870-512m...



    The site above mounted an aftermarket cooler on it and got awesome results. Either the Thermalright HR-03 GT is just that great of a GPU cooler, or the standard heatsink/fan on the 4870 is just that horrible. Going from 82C to 43C at load and 55C to 33C at idle, just from an aftermarket cooler, is crazy! I was hoping to see some overclocking scores after they mounted the Thermalright on it, but nope :(
    Reply
  • Matt Campbell - Wednesday, June 25, 2008 - link

    The HR-03GT really is that great. Check it out: http://www.anandtech.com/casecoolingpsus/showdoc.a...">http://www.anandtech.com/casecoolingpsus/showdoc.a...

    Our 8800GT went from 81 deg. C to 38 deg. C at load, 52 to 32 at idle. That's also with the quietest fan on the market at low speed. And FWIW, I played through all of The Witcher (about 60 hours) with the 8800GT passively cooled in a case with only 1 120mm fan.

    -Matt
    Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    I see no fan on that thing??! PASSIVE?? :O ?? Reply
  • jeffreybt2 - Wednesday, June 25, 2008 - link

    "Please note that this is with a single Zalman 92MM fan operating at 1600RPM along with Arctic Cooling MX-2 applied to the base." Reply
  • magnusr - Wednesday, June 25, 2008 - link

    Does the audio part of the card support PAP? If not all blu-ray audio will be downsampled to 16/48... Reply
  • NullSubroutine - Wednesday, June 25, 2008 - link

    I would just like to point out that the 4870 falls behind the 3870 X2 in Oblivion while in every other game it runs circles around it. To me it appears to be a driver problem with Oblivion rather than an indication of the hardware not doing well there. Unless of course the answer lies in the ring bus of the R680? Reply
  • orionmgomg - Wednesday, June 25, 2008 - link

    I would love to see more benchmarks with the CPU OCed to at least 4.0

    All the CPUs you use can hit it NP.

    Also, what about at least 2 GTX 280 cards and their numbers? Noticed that you did have them in SLI because the power consumption comparisons had them, but you held back the performance numbers...

    Let's see the top 4 cards from ATI and Nvidia compete in dual GPU (no pun intended) on an X48 with DDR3 1600 and an FSB of 400x10!

    That would be really nice for the people who have performance systems but may still be rocking out a pair of EVGA 8800 Ultras, because they're waiting for real numbers and performance to come out - and they're still paying off their systems lol... :]
    Reply
  • Ilmarin - Wednesday, June 25, 2008 - link

    You're probably aware of these already, but I'll mention them just in case:

    * Page 10 (AA comparison) is malformed with no images
    * Page 21 (Power, Heat and Noise) is missing the Heat and Noise stuff.

    Heat is a big issue with these 4800 cards and their reference coolers, so it would be good to see it covered in detail. My 7800 GTX used to artifact and cause crashes when it hit 79 degrees, before I replaced it with an aftermarket cooler. Apparently the 4870 hits well over 90 degrees at load, and the 4850 isn't much better. Decent aftermarket coolers (HR-03 GT, DuOrb) aren't cheap... and if that's what it takes to prevent heat problems on these cards, some people might consider buying a slower card (like a 9800 GTX+) just because it has better cooling.

    Anand, you guys should do a meltdown test... pit the 9800 GTX+ against the 4850, and the 4870 against the GTX 260, all with reference coolers, in a standard air-cooled system at a typical ambient temp. Forget timedemos/benchmarks... play an intensive game like Crysis for an hour or two, and see if you encounter glitches and crashes. If the 4800 cards can somehow remain artifact/crash free at those high temps, then I'd more seriously consider buying one. Heat damage over time may also be a concern, but is hard to test for.

    Sure, DAAMIT's partners will eventually put non-reference coolers on some cards, but history tells us that the majority of the market in the first few months will be stock-cooled cards, so this has got be of concern to consumers... especially early adopters.
    Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    Did you know these chips can do up to 125C? 90C is so common for ATI cards; I haven't had one since 2005 that didn't blow my hair dry. Your NV card was just a bad chip, I suppose. Why do you think NV or ATI would spend a billion dollars on research work, then let its product burn away due to some crappy cooling? They won't give you more cooling than you actually need. These are the very same cards that go to places like Abu Dhabi, where room temps easily hit 50C+. Reply
  • soloman02 - Wednesday, June 25, 2008 - link

    Sorry, but no human would survive a temp of 50C.
    http://en.wikipedia.org/wiki/Thermoregulation#Hot">http://en.wikipedia.org/wiki/Thermoregulation#Hot
    In fact the highest temp a human has survived was recorded by the Guinness book of world records as: 46.5C (115.7F). Keep in mind that was the internal temp of the guy. The temp on that day was 32.2C (90F).
    http://www.powells.com/biblio?show=0553587129&...">http://www.powells.com/biblio?show=0553587129&...
    http://www.time.com/time/magazine/article/0,9171,9...">http://www.time.com/time/magazine/article/0,9171,9...

    If it is 50C in those rooms, the people inside are dead or dying.

    The cards are probably fine. All it takes is to search google to back up your figures (or to disprove them like I just did).
    Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    You're just a dumb, pissed-off loser. There's a big difference between internal human temperature and its surroundings. In places like the Sahara, temperatures routinely hit 45C and max out @ 55C. But does that mean people living there just die? No, they don't, because they drink a lot of water, which helps their bodies get rid of excess heat to keep their internals at a normal temperature (37C). You didn't have this knowledge to share, so you decided to Google it instead and make a fool out of yourself. Here, let me break it down for you,

    You said: "Keep in mind that was the internal temp of the guy"

    Exactly, the guy was sick, and when you're sick, your body temperature rises, in which case 46C is the limit of survival. I suggest you take biochemistry in college to learn more about the human body, which is another 4 years before you finish school.
    Reply
  • Ilmarin - Wednesday, June 25, 2008 - link

    I'm not talking about chips failing altogether... just stability issues, similar to what you experience from over-zealous overclocking. Lots of people have encountered artifacting/crashes with stock-cooled cards over the years. If these are just 'bad chips' that are experiencing stability issues at high temps, then there are a lot of them getting through quality control. Of course NV and ATI do enough to make most people happy... but many of us have good reason to be nervous about temperature. I think they can and should do better. Dual slot exhaust coolers should be mandatory for the enthusiast/performance cards, with full fan control capability. Often it's up to the partners to get that right, and often it doesn't happen for at least a couple of months. Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    I think it's more profitable for board partners to just roll out a stock card rather than go through the trouble of investing time/money into performance cooling. What I've seen thus far, and it's quite apparent, is that newer companies tend to go with exotic cooling to get themselves heard. Once they're in the game, it's back to stock cooling. For example, Palit and ECS came up with nice coolers for their 9600s. Remember Leadtek from past years? They don't even do custom coolers any more. ASUS, Powercolor, Gigabyte, Sapphire etc just find it easier to throw in a 3rd party cooler from ZM, TT, or TR and call it a day. Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    you know we actually received an updated bios for a certain vendor's 4850 that speeds the fan up a bit and should reduce heat ...

    i suspect a lot of vendors will start adjusting their fan tables actually ...
    Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    I think this reply was meant for the guy right above me. I'm all for stock cooling :). Reply
  • ImmortalZ - Wednesday, June 25, 2008 - link

    "Quake Wars once again shows the 4870 outperforming the GTX 280, but this time it offers essentially the same performance as the GTX 280 - but at half the price. "

    You mean the 260 in the first instance?

    No text in The Witcher page. I assume this is intentional.

    Also, I've heard on the web that the 48xx series has dual-link on only one of its DVI ports. Is this true?

    Oh and another thing - why is the post comment page titled "Untitled Page"? :P
    Reply
  • rahat5810 - Wednesday, June 25, 2008 - link

    Nice cards and nice article. But I would like to point out that there are some mistakes in the article, nothing fatal though. Like not mentioning the 4870 in the list of cards, writing 280 instead of 260, and clicking on the picture to enlarge not working for some of the figures. Reply
  • feelingshorter - Wednesday, June 25, 2008 - link

    AMD almost has a perfect card, but the fact that the 4870 idles at 46.1 watts more than the 260 means the card will heat up people's rooms. At load, the difference of 16.1 watts more for the 4870 is forgivable.

    If its possible to overclock a card using software (without going into BIOS screen), then why isn't it possible to underclock a card also using software when the card's full potential isn't being used? I'd really be interested in knowing the answer, or maybe someone just hasn't asked the question?

    I hardly care about Crysis; it's more a matter of will it run StarCraft II with 600 units on the map without overheating. Why doesn't AnandTech also test how hot the 4870 runs? Although the 4850 numbers aren't pretty at all, the 4870 has a dual slot cooler and might give better numbers, right? I only want to know because, like a lot of readers, I have doubts as to whether a card like the 4850 can run super hot and not die within 1+ years of hardcore gaming.
    Reply
  • FITCamaro - Wednesday, June 25, 2008 - link

    Yes, I noticed it used quite a bit at idle as well. But its load numbers were lower. And as the other guy said, they're probably still finalizing the drivers for the new cards. I'd expect both performance and idle power consumption to improve in the next month or two. Reply
  • derek85 - Wednesday, June 25, 2008 - link

    I think ATI is still fixing/finalizing PowerPlay; it should be much lower when the new Catalyst comes out. Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    If a $200 card can play all your games @ 30+fps, does a $600 card even make sense knowing it'll do no better to your eyes? I see quite a few NV-biased elements in your review this time around, and what's all that about the biggest die size TSMC's ever produced? GTX's die may be huge, but compared to AMD's, it's only half as efficient. Your review title, I think, was a bit harsh toward AMD. By limiting AMD's victory only up to a price point of $299, you're essentially telling consumers that NV's GTX 2xx series is actually worth the money, which is terribly biased consumer advice in my opinion. From a $600 GX2 to a $650 GTX 280, Nvidia has actually gone backwards. You know when we talk about AMD's financial struggle, and that the company might go bust in the next few years... part of the reason that may happen is because media fanatics try to keep things on an even keel, and in doing so they completely forget about what consumers actually want. No offence to AT, but I've been in media myself, and I can tell when even professionals sound biased. Reply
  • paydirt - Wednesday, June 25, 2008 - link

    You're putting words into the reviewers' mouths and you know it. I am pretty sure most readers know that bigger isn't better in the computing world; AnandTech never said big was good, they are simply pointing out the difference, duh. YOU need to keep in mind that nVidia hasn't done a die shrink yet with the GTX 2XX...

    I also did not read anything in the review that said it was worth it (or "good") to pay $600 on a GPU, did you? Nope. Thought so. Quit trying to fight the world and life might be different for you.

    I'm grateful that both companies make solid cards that are GPGPU-capable and affordable, and that we have sites like AnandTech to break down the numbers for us.

    Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    Are you speaking on behalf of the reviewers? You've obviously misunderstood the whole point I was trying to make. When you say in your other post that AT is a reviews site and not a product promoter, I feel terribly sorry for you, because review sites are THE best product promoters around, including AT, and Derek pointed out earlier that AT's too influential for companies to ignore. Well, if that is truly the case, why not type in block letters how NV's trying to rip us off, for consumers' sake? Maybe just for once do it; it'll definitely teach Nvidia a lesson. Reply
  • DaveninCali - Wednesday, June 25, 2008 - link

    I completely agree. Anand, the GTX 260/280 are a complete waste of money. You are not providing adequate conclusions. Your data speaks for itself. I know you have to be "friendly" in your conclusions so that you don't arouse the ire of nVidia but the launch of the 260/280 is on the order of the FX series.

    I mean you can barely test the cards in SLI mode due to the huge power constraints and the price is ABSOLUTELY ridiculous. $1300 for SLI GTX 280. $1300!!!! You can get FOUR 4870 cards for less than this. FOUR OF THEM!!!! You should be screaming how poorly the GTX 280/260 cards are at these performance numbers and price point.

    The 4870 beats the GTX 260 in all but one benchmark at $100 less. Not to mention the 4870 consumes less power than the GTX 280. Hell, the 4870 even beats the GTX 280 in some benchmarks. For $350 more, there shouldn't be even ONE game that the 4870 is better at than the GTX 280. Not one, at more than double the price.

    I'm not quite sure what you are trying to convey in this article, but at least the readers at AnandTech are smart enough to read the graphs for themselves. Given what has been written on the conclusion page (3/4 of it GPGPU jargon that is totally unnecessary), could you please leave the page blank instead.

    I mean come on. Seriously! $1300 compared to $600 with much more performance coming from the 4870 SLI. COME ON!! Now I'm too angry to go to bed. :(
    Reply
  • DaveninCali - Wednesday, June 25, 2008 - link

    Oh, and one other thing. I thought AnandTech was a review site for the consumer. How can you not warn consumers against spending $650, much less $1300, on a piece of hardware that isn't much faster, and in some cases not faster at all, than another piece of hardware priced at $300/$600 in SLI? It's borderline scam.

    When you can't show SLI numbers because you can't even find a power supply that can provide the power, at least an ounce of criticism should be noted to try and stop someone from wasting all that money.

    Don't you think that consumers should be getting better advice than this? $1300 for less performance. I feel so sad now. Time to go to sleep.
    Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    It reminds me of that NV scam from yesteryears... I'm forgetting a good part of it, but apparently NV and "some company" recruited some forum/blog gurus to promote their BS, including a guy on AT forums who was eventually gotten rid of due to his extremely biased posts. If AT can do biased reviews, I can pretty much assure you the rest of the reviewers out there are nothing more than misinformed, over-arrogant media puppets. To those who disagree w/ me or the poster above, let me ask you this... if you were sent $600 hardware every other week, or in AT's case, every other day (GTX 280s from NV board partners), would you rather delightfully, and rightfully, piss NV off, or shut your big mouth to keep the hardware and cash flowing in? Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    Wow ...

    I'm completely surprised that you reacted the way you did.

    In our GT200 review we were very hard on NVIDIA for providing less performance than a cheaper high end part, and this time around we pointed out the fact that the 4870 actually leads the GTX 260 at 3/4 of the price.

    We have no qualms about saying anything warranted about any part no matter who makes it. There's no need to pull punches, as what we really care about are the readers and the technology. NVIDIA really can't bring anything compelling to the table in terms of price / performance or value right now. I think we did a good job of pointing that out.

    We have mixed feelings about CrossFire, as it doesn't always scale well and isn't as flexible as SLI -- hopefully this will change with R700 when it hits, but for now there are still limitations. When CrossFire does work, it does really well, and I hope AMD work this out.

    NVIDIA absolutely need to readjust the pricing of most of their line up in order to compete. If they don't then AMD's hardware will continue to get our recommendation.

    We are here because we love understanding hardware and we love talking about the hardware. Our interest is in reality and the truth of things. Sometimes we can get overly excited about some technology (just like any enthusiast can), but our recommendations always come down to value and what our readers can get from their hardware today.

    I know I can speak for Anand when I say this (cause he actually did it before his site grew into what it is today) -- we would be doing this even if we weren't being paid for it. Understanding and teaching about hardware is our passion and we put our heart and soul into it.

    there is no amount of money that could buy a review from us. no hardware vendor is off limits.

    in the past companies have tried to stop sending us hardware because they didn't like what we said. we just go out and buy it ourselves. but that's not likely to be an issue at this point.

    the size and reach of AnandTech today is such that no matter how much we piss off anyone, Intel, AMD, NVIDIA, or any of the OEMs, they can't afford to ignore us and they can't afford to not send us hardware -- they are the ones who want and need us to review their products whether we say great or horrible things about them.

    beyond that, i'm 100% sure nvidia is pissed off with this review. it is glowingly in favor of the 4870 and ... like i said ... it really shocks me that anyone would think otherwise.

    we don't favor level playing fields or being nice to companies for no reason. we'll recommend the parts that best fit a need at a price if it makes sense. Right now that's 4870 if you want to spend between $300 and $600 (for 2).

    While it's really really not worth the money, GTX 280 SLI is the fastest thing out there and some people do want to light their money on fire. Whatever.

    i'm sorry you guys feel the way you do. maybe after a good night sleep you'll come back refreshed and see the article in a new light ...
    Reply
  • formulav8 - Wednesday, June 25, 2008 - link

    Even in the review you claim the 4870 is a $400 performer. So why don't you reflect that in the article's title by adding it after the $300 price? Would be better to do so, I think, anyways. :)

    Maybe say the 4870 wins up to the $400 price point, and likewise the 4850 up to the $250 price that you claimed in the article...

    This tweak could be helpful to some buyers out there with a specific budget and could help save them some money in the process. :)


    Jason
    Reply
  • paydirt - Wednesday, June 25, 2008 - link

    This is a review site. This isn't a site to market/promote products. Reply
  • formulav8 - Thursday, June 26, 2008 - link

    They do recommend hardware for different price points and such. So they do market, in a way. Have you seen Anand's picks links? That is promoting products, and he does it through his referral links as well, to get paid to do so. :)

    Anyways, mentioning something as a better buy up to a certain price point would be helpful to someone who is not really in the know.



    Jason
    Reply
  • shadowteam - Wednesday, June 25, 2008 - link

    You've got excellent writing skills, buddy, and I can't help thinking you're actually better at reviews than your m8 (no offence Anand), but what I truly meant by my post above is what you summed up rather well in your concluding lines, quote: "You can either look at it as AMD giving you a bargain or NVIDIA charging too much, either way it's healthy competition in the graphics industry once again (after far too long of a hiatus)"

    Either way? Why should anyone look the other way? NV is clearly shitting all over the place, and you can tell that from the email they sent you (or Anand) a couple days back. So they ripped us off for 6 months, and now suddenly decide the 9800GTX is worth $200?

    Healthy competition? Could you please elaborate on this further?
    $199 4850 vs $399 GTX260.... yup! that's healthy

    GTX+ vs 4850?
    Does that mean the GTX260 is now completely irrelevant? In fact, the 2xx series is utterly pointless no matter how you look at it.

    To bash on AMD: the 4870 is obviously priced high. For $100 extra, all you get is an OC'ed 4850 w/ GDDR5 support. I don't think anyone here cares about GDDR5; all that matters is performance, and the extra bucks are plainly not worth it. From a consumer's perspective, the 4850 is the best buy; the 4870 isn't.
    Reply
  • mlambert890 - Sunday, July 13, 2008 - link

    "200 series is utterly pointless"

    Yep... pointless unless you want the fastest card (280), then it has a point.

    Pointless to YOU, possibly, because you're focusing on perf per dollar. Good for you. Nice of you to presume to force that view on the world.

    Absolute performance? The GTX 280 seems near the top of every benchmark there, bud. Both in single card and in SLI where, last I checked, it gives up maybe TWO instances to 4870 CF - Bioshock and CoD - and in both cases framerates are north of 100 at 2560. The 4870, on the other hand, falls WELL short of playable at that res in CF in most other benches.

    High res + high perf = 200 series. Sorry if that's offensive to the egos of those who can't afford the cards.

    There's a lot in life we can and can't afford. That should have ZERO impact on ABSOLUTE PERFORMANCE discussions.
    Reply
  • FITCamaro - Wednesday, June 25, 2008 - link

    AMD/ATI has to make some money somewhere. And regardless, at $300, the 4870 is a hell of a deal compared to the competition. Yes the 4850 is probably the best value. But the 4870 is still right behind it if you want a decent amount of extra performance at a great price.

    Nvidia may have the fastest thing out there. But only the richest, most brain dead idiots who have not a care in the world about how they spend their (or their parents) money will buy it with cards like the 4850 and 4870 available.

    And it's pretty sad when your new $650 high-end card is routinely beaten by two of your last-generation cards (8800GT) that you can get for $150 each or less. It wouldn't be as big a deal if the new card was $300-350, but at $650 it should be stomping on them.

    I think Nvidia is in for a reality check on what people want. If their new chips are only going to cater to the top 1% of the market, they're going to find themselves in trouble quickly. Especially with all the issues their chipsets have for 6 months after release. And their shoddy drivers. I mean, this past Friday I decided to try to set up some profiles so that when I started up Age of Conan, it would apply an overclock to my GPU and unapply it after I exited. It ended up locking up my PC continuously. I had to restore my OS from a backup disc because not even completely uninstalling and reinstalling my nvidia chipset and video drivers fixed it. And in my anger, I didn't back up my "My Documents" folder, so I lost 5 years worth of stuff, largely pictures.
    Reply
  • mlambert890 - Sunday, July 13, 2008 - link

    "Nvidia may have the fastest thing out there. But only the richest, most brain dead idiots who have not a care in the world about how they spend their (or their parents) money will buy it with cards like the 4850 and 4870 available."

    You just summed it up in that first sentence there, bud. NVidia has the fastest thing out there. The rest is just opinion, bitterness, and noise.

    I notice that the tone of the "enthusiast" community seems to be laser-focused on cost now. This is like car discussions. People want to pretend to be "Xtreme", but what they really want to see is validation of whatever it is THEY can afford.

    Have fun with the 4870 by all means; it's a great card. But the GTX 280 IS faster. Did NVidia price it too high? Don't know and don't care.

    These are PERFORMANCE forums, to all of the people that don't get that. Maybe even the editors need to be reminded.

    If I want to see an obsession with "bang for the buck", I'll go to Consumer Reports.

    I mean seriously. How much of a loser are you when you're taking a shot like "your PARENTS' money"? LOL...

    Personally, I treat the PC hobby as an expensive distraction. I've been a technology pro for 15 years now, and this is my vice. As an adult earning my own money, I can decide how I spend it, and the difference between $500 and a grand isn't a big deal.

    The rhetoric on forums is really funny. People throw the "kid/parents" insult around a lot, but I think it's more likely that the people who take prices beyond what they can afford as some kind of personal insult are the kids here.
    Reply
  • formulav8 - Thursday, June 26, 2008 - link

    "Nvidia may have the fastest thing out there. But only the richest, most brain dead idiots who have not a care in the world about how they spend their (or their parents) money will buy it with cards like the 4850 and 4870 available."


    Yuk Yuk Yuk :)

    Jason
    Reply
  • drpepper128 - Wednesday, June 25, 2008 - link

    To be honest, while I was reading the article it felt a little ATI-biased, but I guess that goes to show that two different people can get drastically different opinions from the same article.

    The real reason I’m posting this is I want to thank you guys for writing some of the best articles that Anandtech has ever written. I read every page and enjoyed the whole thing. Keep up the great work guys and I look forward to reading more (especially about Nehalem and anything relating to AMD’s future architecture).

    Also, is GDDR5 ever coming to the 4850? If so, it might be a drastically better buy.

    Thank you,
    drpepper128
    Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    Damn, You R pissed!! :O

    OK, get some sleep and wake up smiling tomorrow, knowing that It's ATI needing to raise prices - - - and go get that 4870 :))
    Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    OH, " ... that It's NOT ATI needing to ... "

    BTW: I actually read the review as pretty neutral, making a hint here and there that the further potential of the HD4870 is quite big :)
    Reply
  • paydirt - Wednesday, June 25, 2008 - link

    You guys are reading into things WAY too much. Readers understand that just because something is a top performer (right now), doesn't mean that is the appropriate solution for them. Do you honestly think readers are retards and are going to plunk down $1300 for an SLI setup?! Let's leave the uber-rich out of this, get real.

    So a reader reads the reviews, goes to a shopping site and puts two of these cards in his basket, realizes "woah, hey this is $1300, no way. OK what are my other choices?"

    This review doesn't tell people what to do. It's factual. You (the AMD fanbois) are the ones being biased.
    Reply
  • Jovec - Wednesday, June 25, 2008 - link

    "This fact clearly sets the 4870 in a performance class beyond its price."

    Or maybe the Nvidia card is priced above its performance class?
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    it could be both :-) Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    I think you are right. nVidia had the market to themselves a little too long, setting prices as they saw fit. Now that AMD/ATI are harvesting the fruits of the merger, overcoming the TLB bug, financial matters (?), etc., it seems the HD48xx series is right where they needed it.

    This is bound to be a success for them, with so much (tamable) raw power for the price asked.
    Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    Yeah! Nice to see competition get into the game again.

    Reply
  • gigahertz20 - Wednesday, June 25, 2008 - link

    Page 21 is labeled "Power Consumption, Heat and Noise" in the drop down page box, but it only lists power consumption figures. What about the heat and noise? Is it loud, quiet? What did the temperatures measure at idle and load? Reply
  • abzillah1 - Wednesday, June 25, 2008 - link

    I am in love Reply
  • 0g1 - Wednesday, June 25, 2008 - link

    "NVIDIA's architecture prefers tons of simple threads (one thread per SP) while AMD's architecture wants instruction heavy threads (since it can work on five instructions from a single thread at once). "
    Yeah, they both have 10 thread arbiters, but nV's feed 24 SPs each while AMD's feed 80. Performance will probably be similar, though, because both arbiters run at about the same speed and nV's SPs run at about double the clock, effectively making 48 SPs per arbiter (and in some special cases 96).
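    A back-of-envelope sketch of that arithmetic (a hypothetical illustration; the cluster counts and the ~2x shader clock ratio are the figures quoted in this thread, not vendor specs):

```python
# Effective shader throughput, using the figures from this discussion.
# All numbers are assumptions quoted above, not official specifications.
nv_clusters, nv_sps_per_cluster = 10, 24
nv_shader_clock_ratio = 2.0    # NV's SPs run at roughly double the clock
amd_clusters, amd_sps_per_cluster = 10, 80
amd_shader_clock_ratio = 1.0

nv_effective = nv_clusters * nv_sps_per_cluster * nv_shader_clock_ratio      # 480.0
amd_effective = amd_clusters * amd_sps_per_cluster * amd_shader_clock_ratio  # 800.0

# AMD's 80 SPs per cluster are grouped 5-wide, so delivered throughput
# depends on how many independent instructions get packed per thread.
for packed in (5, 4, 3):
    print(packed, amd_effective * packed / 5)
```

So on paper the peak numbers differ, but AMD's advantage shrinks whenever its compiler can't fill all five slots per thread.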
    Reply
  • ChronoReverse - Wednesday, June 25, 2008 - link

    Perhaps it's drivers but if AMD intends for the 4870x2 to compete as the "Fastest Card", they better fix their drivers ASAP. Reply
  • FITCamaro - Wednesday, June 25, 2008 - link

    With a few driver revisions it will likely improve. Reply
  • NullSubroutine - Wednesday, June 25, 2008 - link

    It scaled more than 100% in a few games? Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    Greater than 100% scaling is, in the vast majority of cases, due to the combined margin of error of the single-card and dual-card tests.

    We also tested single-card performance on an NVIDIA system and CrossFire performance on an Intel system, so the different computers add margin of error as well.

    Two-card solutions generally don't scale at greater than 100% except in extraordinarily odd situations (where rebalancing loads might help with scaling on both individual cards -- but that's odd and rare).
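    As a hypothetical illustration of that point, made-up fps numbers with only ±3% run-to-run noise are enough to push a true 1.9x scaling past 2x:

```python
# Made-up numbers: true dual-card scaling is 1.9x, but +/-3% benchmark
# noise on each measurement can make it look like more than 2x (>100%).
true_single = 50.0                     # fps, one card
true_dual = 95.0                       # fps, two cards -> true scaling 1.9x
measured_single = true_single * 0.97   # single-card run came in 3% low
measured_dual = true_dual * 1.03       # dual-card run came in 3% high
apparent_scaling = measured_dual / measured_single
print(round(apparent_scaling, 3))      # above 2.0 despite true 1.9x
```

Two independent measurements with small opposite-sign errors are all it takes; no real super-linear scaling is needed.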
    Reply
  • Sind - Wednesday, June 25, 2008 - link

    Why no 260 and 280 SLI? Reply
  • ImmortalZ - Wednesday, June 25, 2008 - link

    Because, with that kind of money, one can an entire system with one 48xx :P

    Also, page 10 appears to be broken.
    Reply
  • Lifted - Wednesday, June 25, 2008 - link

    No 260 or 280 SLI in the benchmarks, but they included them in the power charts. Odd. Reply
  • Anand Lal Shimpi - Wednesday, June 25, 2008 - link

    The power data was simply taken from the GTX 280 review, we just added to the list.

    As for the GTX 280 SLI numbers, we didn't include them as it's mostly out of the price range of the Radeon HD 4870 ($1300 vs. $600 for two 4870s). We can always go back and redo the graphs to include them if you guys would like, but in the interim I would suggest looking at the GTX review to get comparison numbers.

    Take care,
    Anand
    Reply
  • DerekWilson - Wednesday, June 25, 2008 - link

    We actually only have one GTX 260, so we can't test that. Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    Yes, and the click-to-enlarge doesn't work.
    And believe it or not, I'm posting right now from an AT page that looks like 1994...!
    Reply
  • ImmortalZ - Wednesday, June 25, 2008 - link

    Insert a buy in there. Need edit! Reply
  • TonyB - Wednesday, June 25, 2008 - link

    but can it play crysis? Reply
  • StevoLincolnite - Wednesday, June 25, 2008 - link

    Of course it can; there are benchmarks, aren't there?
    Seriously, ANY DirectX 9 card can run Crysis. The quality and performance are a different matter.
    Reply
  • Inkjammer - Wednesday, June 25, 2008 - link

    I have a 9800 GX2 in my primary gaming rig, but I've been debating what card to drop into my Photoshop/3DS Max art rig. I've been waffling over it for some time and was going to settle on an 8800GT... but after seeing this, my mind's set on the 4850. It definitely appears to offer more than enough power to handle my art apps, and allow me to use my second PC as a gaming rig if need be... all without breaking the bank.

    This'll mark my return to buying ATI hardware since the X800 was king.
    Reply
  • weaksideblitz - Wednesday, June 25, 2008 - link

    this is a welcome development although im only buying a 4850 :) Reply
  • Locutus465 - Wednesday, June 25, 2008 - link

    Very much so. Actually, from where I sit, I think all AMD really needs to do is get a Socket AM2+ CPU out there that can compete with Intel at least similarly to how this card competes with nVidia, and they'd have one hell of a total platform solution right now. As for upgrading my vid card... I just finished upgrading to the Phenom X4 and Radeon 3870, so I'll be sticking with that for a while. Quite honestly, that platform can pretty much run anything out there already as it is, so I'm feeling pretty confident my current setup will last a couple years at least. Reply
  • Lifted - Wednesday, June 25, 2008 - link

    Ditto. If I can get a 4850 for ~$150 or so, that's what I'm doing as well. Reply
  • billywigga - Friday, August 29, 2008 - link

    where are you getting it from best buy or something Reply
  • Clauzii - Wednesday, June 25, 2008 - link

    That leaves 50 for a better cooler ;) Reply
  • Lifted - Wednesday, June 25, 2008 - link

    Is there any reason the first pages of benchmarks have SLI setups included in the charts, but you wait until the end of the article to add the CF? I'd think it would make the most sense to either include both from the start or hold both until the end. Reply
  • Anand Lal Shimpi - Wednesday, June 25, 2008 - link

    The original idea was to format it like the 4850 preview, keep things simple early on but offer SLI/CF graphs later in the article for those who wanted them.

    It looks like in the mad rush to get things done it didn't work out that way, I'll see if it's possible to clean it all up but right now we've got a lot of other minor touchups to do first :)

    Take care,
    Anand
    Reply
  • TechLuster - Wednesday, June 25, 2008 - link

    Anand,

    I really like your idea of "keeping things simple early on" by only including configurations that us mere mortals can afford at first (say, all single-GPU configs plus "reasonable" multi-GPU configs less than ~$400 total), and then including numbers for ultra high-end multi-GPU configs at the end (mainly just for completeness and also for us to drool over--I doubt too many people can afford more than one $650 card!).

    Anyway, great job on the review as always. I think you and Derek should get some well-deserved rest now!

    -TL
    Reply
  • wilkinb - Wednesday, June 25, 2008 - link

    Then include SLI for the 280... let the consumer care about what the value is or isn't; we all value different things. Please provide the costs and the performance (SLI).

    It makes this very incomplete to not have included SLI for the 280/260. I for one will more than likely get 2 x GTX280s; not all of us worry about a few $$. But if the CF 4870s are that good, then I want to know, as I don't care about the brand and will go with the best performance.

    Can you please include them soon so we can make our own judgements on what's good or not?
    Reply
  • Anand Lal Shimpi - Wednesday, June 25, 2008 - link

    I've added the GTX 280 SLI numbers to all of the bar charts in the Multi-GPU section, enjoy :)

    Note, I didn't add them to the line graphs simply because we didn't have data for 280 SLI at lower resolutions. It only really makes sense at 2560 x 1600 anyways so this shouldn't be an issue.

    Take care,
    Anand
    Reply
  • wilkinb - Wednesday, June 25, 2008 - link

    thank you :)

    I really appreciate the response

    Now I just need work out what to order.
    Reply
  • paydirt - Thursday, June 26, 2008 - link

    9800 GTX is $200. I wonder what price the GTX+ debuts at. Reply
