
  • pwnedbygary - Sunday, July 06, 2008 - link

    This card is absolutely a BEAST at folding. Now that Stanford U. has released the GPU2 client for XP/2003 (I'm running it on Vista, however), it can complete a 10,000,000-piece work unit in about 2-3 hours. I'd like to see the PS3 do THAT hehe.

    Here's a screenshot:
  • marone - Tuesday, June 24, 2008 - link

    ATI to Nvidia: Im at ur base, ste@ling your customers
  • Matrixfan - Tuesday, June 24, 2008 - link

    Hello! Please excuse me if it is obvious, but what kind of fps figures are in the test? Do these figures represent minimum or average fps numbers?
  • flexy - Monday, June 23, 2008 - link

    Truly, truly amazing. A high-end card which you can get for $149 at BB. I applaud AMD this time; after some years of disappointment, since we didn't see anything exciting after the 9700/9800, this card will be a killer. Price/performance is actually unreal.
  • jamstan - Monday, June 23, 2008 - link

    I find it odd that a 1000W OCZ power supply wasn't big enough when the manufacturers only recommend a 550W PSU for two 4850s in CF. Sounds like that 1000W PSU has a bad rail or something.
  • HOOfan 1 - Monday, June 23, 2008 - link

    It is complete HOGWASH.

    As I stated elsewhere, the card can pull NO MORE THAN 75W from the PCI-E slot and NO MORE THAN 75W from its single 6-pin PCI-E connector. Two of them can draw no more than 300 watts. The OCZ EliteXstream uses a single 12V rail, so there can be no excuse of overcurrent shutting the PSU down. The PSU is also good for another 660 watts of 12V power... that could power a few Peltiers and tons of fans and hard drives.

    It is sad to think that some people will read this article and actually believe they need a $200+ 1200W PSU to run dual HD4850s, when a $100 Corsair VX550 would do.
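    The arithmetic behind this argument can be sanity-checked in a few lines. A sketch in Python: the 75 W ceilings are the PCI Express spec limits, while the 960 W 12 V capacity is an assumption back-derived from the 300 W + 660 W figures in the comment above, not a measured number.

```python
# Sanity check of the power-budget argument above.
SLOT_W = 75        # PCIe spec: max draw from the x16 slot
SIX_PIN_W = 75     # PCIe spec: max draw from one 6-pin connector

def max_card_draw(six_pin_connectors=1):
    """Worst-case draw for one card: slot plus its aux connectors."""
    return SLOT_W + six_pin_connectors * SIX_PIN_W

crossfire_budget = 2 * max_card_draw(1)   # two HD 4850s, one 6-pin each
psu_12v_capacity = 960                    # single 12 V rail (assumed)
headroom = psu_12v_capacity - crossfire_budget

print(crossfire_budget, headroom)   # 300 660
```

    Whatever the exact rail rating, the point stands: the worst-case GPU draw is bounded by the connectors, not by marketing guidance.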
  • solog - Monday, June 23, 2008 - link

    Why would the manufacturers claim 550W if it were nowhere near enough? Derek Wilson stated that the load draw wasn't stressing the CPU, RAM and hard drive, but even if you factor those in, you still aren't anywhere close to 1000 watts (let alone the 1200 watts they claim is really needed to run it!)

    Maybe someone else should redo the power consumption test with different power supplies, including 550W units that are known to be functional. Has anyone else seen any review that claims anything like this?
  • BigDaddyCF - Monday, June 23, 2008 - link

    Yes, you could say that
    "While it is true that two RV770s can outperform a single GT200 in many cases, you could also make the argument that two GT200s could outperform anything that AMD could possibly concoct."
    However, concocting that dual GT200 solution will cost you
    $640 x 2 = $1280
    That's what I'd call the lunatic fringe of gamers, and it has to be a small portion of the market.
  • jhb116 - Sunday, June 22, 2008 - link

    For the official review - can we get the real sound numbers? Also, on power - are the 4850's power-saving features enabled?

    I'm also looking for this type of info when you get the GTX+ review(s). Is there any further info on the HybridPower feature? Last time I read about it, it seemed this feature wasn't working.

    Could be game changing for either competitor if they got this type of feature to work. I'm willing to sacrifice a bit of performance to keep my system somewhat green during downtime/web surfing.
  • Bobattack - Sunday, June 22, 2008 - link

    While this is a preview of the 4850, I'm sure a lot of us can't wait to see how the 4870 will compare. The test scores don't match up between THIS review (June 19) and the GTX 280 review (June 18), but your hardware stats are exactly the same.

    Bioshock 1920x1200

    Card       GTX 280 review (06.18)   4850 preview (06.19)
    GTX 260    69.0                     50.4
    9800GTX    64.6                     42.3
    ATI 3870   64.6                     41.0

    But I looked at the chart again and noticed the problem while typing this.

    The WRONG screen resolutions are in the grid! (Also, some numbers don't match between the BASIC chart and the detailed multi-res chart.)
    So you have 1280x1024 / 1600x1200 / 1920x1200, which is wrong.
    It should be 1600x1200 / 1920x1200 / 2560x1600!

    It's a preview, and it was kind of last minute, so it's understandable.
  • Final Destination II - Sunday, June 22, 2008 - link

    Well done!
    Good to see not everyone here is asleep :)
  • Wwhat - Saturday, June 21, 2008 - link

    I noticed BioShock was not tested at all with AA, so I'm curious, what's the status on that? From what I read, you could initially enable AA on ATI by renaming the executable to oblivion.exe, and AA was a DX9-only feature - but from what I gather, DX10.1 should support AA in DX10 games; that's part of the .1 improvements, I thought.
    And I heard Nvidia was, back when BioShock came out, working on a driver trick to enable AA in it.
    So I'm wondering, what's the real-life status of such issues in DX10 games?
  • Griswold - Saturday, June 21, 2008 - link

    How come you need a 1200W PSU when this German guy runs them in CF with a 750W PSU, and even triple CF with 2x 4850 and a 3870?

    Piece of shit PSU on your test bench?
  • geogaddi - Saturday, June 21, 2008 - link

    That, AND my 328xi gets 28 mpg. Those ever-efficient Germans...

    'ze goggles! zey do nothing!'
  • Glock24 - Saturday, June 21, 2008 - link

    Radeon HD4850
    956 million transistors on 55nm fabrication process
    40 texture units
  • Straputsky - Friday, June 20, 2008 - link

    I think there is a difference in strategy. Nvidia produces one large chip and that's their high-end product. AMD uses smaller chips but puts two of them on one board. There is no need for a CrossFire-capable motherboard, because it's only one single card. Nvidia can't do that with their GT200 (due to its huge die size and power consumption). That means if Nvidia wanted to counteract, they would have to use their G92 chip. I think it's a different thing having CrossFire/SLI realized on one board versus two, due to the motherboard restrictions.
    Next thing is cost: it seems that producing two chips on one board is cheaper than one big chip. And it should be easier to cool two smaller hotspots than one big one.
  • TheJian - Saturday, June 21, 2008 - link

    Nvidia can't do it today, correct. But the die shrink has already taped out and should be out in two months or so, just about the same time AMD does it with the 4870X2 - which should make a GTX280X2 doable.

    For the next two months you DO need CF for the 4870 (which isn't even out yet to begin with) or the 4850. Also, these two-in-one cards have driver issues, as shown by many sites. You are better off much of the time getting 100% of a single chip than trying to get the same out of two chips. No driver tricks etc. are needed with a single large chip, and no timing issues or stuttering either.

    They can just swap the current GX2's chips for these new + chips to speed it up. That should produce a nice boost, and no new design work is even needed - just swap chips and the design is done. I'm sure they're already working on it, or maybe even done and just waiting for the need after the 4870. A die shrink of the GX2 should make a good match for the 4870. The same in two months on the GTX 280 (the shrink will cheapen it, speed it up, and allow a dual GTX 280) should easily dominate any 4870X2 (even a single one will be problematic; two just give Nvidia serious bragging rights).

    I worry AMD didn't quite do enough AGAIN (Phenom, mumble grumble). They can't take much more in losses before we end up with Intel boning us on CPUs and Nvidia boning us on video cards. I'd hate to see the low end go to crap like it used to be. If AMD dies, Nvidia will spread the high and low end further apart again and force us to pay more. Right now $100-200 gets you a damn fine gaming experience, and with a little overclocking these cards are superb for the money. I guess we can always hope Intel might actually put out a decent card if AMD dies.

    Cost would not have been a problem for Nvidia at 55nm, and that gap goes away in 2 months. Going 55nm now would have made Nvidia late to the party, so from a stockholder's standpoint the current plan is a great idea as long as you don't lose money on each card: limp along for 2 months until the die shrink, then make your money from back-to-school/Xmas while holding AMD down. Barcy is a prime example of why AMD should have just taken the simple route, glued 2 chips together, and put a quad in the market a year early. Heck, even Hector admits this now. With the 4850 at $170 I wonder how much AMD is making on these. But hey, get 'em while they're hot :) That's a good deal.
  • Final Destination II - Saturday, June 21, 2008 - link

    Your crystal-ball future-foretelling skills are astonishing. Maybe we have an Nvidia insider here?

    I doubt it...

    Nvidia (living up to its name) will do its best to stop the HD 4850, but this time ATI has the Borg card: resistance is futile. It's an overall nicer, faster, cheaper package they've assembled.

    Plus, I won't just sit here and talk - I will buy one (or the HD4870 - gotta wait a bit till that thing is benchmarked).
  • TheJian - Saturday, June 21, 2008 - link

    No crystal ball, just common sense. No Nvidia insider either. The taping out of the die-shrunk GT200 has been reported elsewhere already; I'm not forecasting anything. It's widely known you can have a card in hand 12 weeks from tapeout. What's surprising about Nvidia wanting to shrink a FREAKY huge, expensive chip? If you need a crystal ball to predict what I said, you should just give it up. What's "overall nicer"? The 4850 is probably faster than a 9800GTX (but I wonder about minimums - see below). Cheaper? Not by much after the price cut and the 9800GTX+ announcement.
    Even AMD sees the 4850 competing against the 8800, NOT the 9800GTX, which is reserved for the 4870 in their slide. Now the GTX is priced competitively with the 4850, raining on AMD's slide. The 4870 is expected to be $299 or so, and it's now competing against a $229 9800GTX+ which got a pretty healthy performance boost. No crystal ball is needed to see tough times for a $299 4870 vs a $229 9800GTX+ - just as tough as for the 4850 vs the 9800GTX now. Is AMD going to revamp that slide? AMD KNEW the performance of all these cards (except maybe the GTX 280/260) before making it.

    Expect another price cut once all the 9800GTXs are gone and every one is a die-shrunk GTX+, allowing much cheaper pricing. AMD isn't going to become profitable at these prices, especially not when Nvidia is shrinking the price deficit across its whole lineup in the span of 2 months. With the GX2 going for $415 after rebate at Newegg, and a die shrink undoubtedly coming shortly, how much will it cost then? Nvidia cut the GTX from $269 to $200 - that's $70, and it only has one chip to shrink. What will the GX2 get with two chips shrinking? Even if you just take $70 off, that puts it REALLY close to a single $299 4870. Which would you take? The GX2+ will walk away with that one. I could see myself upgrading from an 8800GT OC to a GX2+ for $280-300 at Xmas this year :) I figure it will debut at $340-350 and should easily hit $300 by Xmas.

    Anand's 4850 vs 9800GTX numbers favor the 4850, but those are just averages. When you look at the minimum fps, things change a bit.
    Anand shows the 4850 at 43fps vs the GTX at 40.7 at 2560x1600. PCPer shows the GTX winning minimums at 2048x1536 and up, and the GTX+ does it from above 1600x1200. I'm interested in seeing extensive hardocp/PCPer 4850 minimums now; PCPer has only done 2 games so far.

    Do you need a crystal ball to predict everything will shrink and get cheaper? Punch "GT200 die shrink taped out" into Google; it was reported in May, and it's only 2 months out. The GX2+ just needs to swap the old 65nm chips for the already-shrunk GTX+ chips - nothing special there. Why is this confusing to you? You don't think the price will drop from $415 when the 4870 arrives? With two chips each shrinking by 100mm² or so? Both will be running faster also. OUCH. The 4870 is a tough sell vs a FASTER/CHEAPER GX2+.

    As it stood when the 4850 reviews hit, AMD looked pretty good. But cheaper + models with more speed across ALL of Nvidia's lineup in 2 months sucks for AMD. Even if Nvidia takes an extra month on all this, it still sucks for AMD. Nvidia will be in time for back-to-school/Xmas, which pretty much makes either side's fiscal year (heck, Intel's too for that matter - H2 is where the money is made). Nothing I said is news; I just repeated it. The GT200 will drop from 576mm² to 400mm² - quite a savings in cost/heat/watts, and it allows cranking it up more. AMD can't afford another price war. They just started nosing around AGAIN for more money from Dubai last week. They're already $5B in debt and are expected to show a FY09 loss AGAIN. Can they survive another 1.5 years without profit?
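    The claimed drop from 576mm² to roughly 400mm² is at least consistent with a straight optical shrink from 65nm to 55nm, since die area scales with the square of the linear feature size. A rough sketch of that idealized scaling (real shrinks rarely achieve the full factor, so treat this as an upper bound on the savings):

```python
# Idealized die-area scaling for an optical shrink:
# area goes as the square of the linear feature size.
old_node_nm = 65
new_node_nm = 55
old_area_mm2 = 576            # GT200 at 65 nm (figure from the comment)

scale = (new_node_nm / old_node_nm) ** 2   # ~0.716
new_area_mm2 = old_area_mm2 * scale

print(round(new_area_mm2))    # ~412, close to the ~400 mm^2 claimed
```

    Smaller dies also mean more candidates per wafer and better yield, which is where the cost savings actually come from.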
  • Miggle - Tuesday, June 24, 2008 - link

    The 4850 can be had for as low as $170. That's cheaper than the new price on the 9800GTX. You've confirmed yourself that the 4850 is generally faster than the 9800GTX (except for the min fps at uber-high resolutions that most wouldn't/couldn't play at). The new price on the 9800GTX hasn't been implemented yet as far as I know. AMD wins here.

    Then we have the GTX+, which has been well received in available reviews and is a bit faster than the 4850, priced at $230. I've read somewhere that a GDDR5 version of the 4850 will be released for $230. It boasts a much faster memory speed, but the core would be clocked the same. Which would be the better buy, we've yet to find out.

    Finally, we have the 4870, which AMD has placed against the GTX. But knowing that even the GTX+ is about on par with the 4850, I wouldn't agree with AMD. The 4870 should be about 20-30% faster than the current 4850 (based on released numbers from unconfirmed sources) and will (hopefully) have dual-slot cooling, so it's in a league of its own once released. We've yet to see how NV will respond to this one.
  • Final Destination II - Saturday, June 21, 2008 - link

    Btw, I have a 7600GT. I don't have to upgrade each week, because I have a 1280x1024 display and don't care at all for 2500xquadrizillion resolutions.

    Plus, you constantly keep forgetting that the HD4850 rules in quality settings.

    Anyway. There are 5 things I will never do:
    - burn my money
    - install SLI
    - install Crossfire
    - burn my money while achieving 5% more performance ("yay! that's worth it!" says the enthusiast. "what a moron" say I...)
    - install one of those ridiculous dual-chip power burners - thanks, I already got me a nice heater for the winter.

    If you check statistics you will see that you are quite alone on your enthusiast throne, looking down on people who "only" get 85% of your performance for 33% of the money.
    In fact, I would call those the "intelligent people"...
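    The "85% of the performance for 33% of the money" jab works out to a large value gap. A throwaway sketch of the performance-per-dollar ratio (the percentages are the commenter's rhetorical figures, not measured numbers):

```python
# Performance-per-dollar comparison using the commenter's figures.
enthusiast_perf, enthusiast_cost = 1.00, 1.00   # normalized baseline
value_perf, value_cost = 0.85, 0.33             # "85% for 33% of the money"

ratio = (value_perf / value_cost) / (enthusiast_perf / enthusiast_cost)
print(round(ratio, 2))   # ~2.58x the performance per dollar
```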
  • superkdogg - Monday, June 23, 2008 - link

    I am similar to this guy ^^^.

    I have three display options: 12x10 19" LCD monitor, 37" 720p TV, and a VGA projector. Granted the projector kinda sucks, but there's nothing quite like using the garage door as your gaming screen.

    Not having a super display (nor really wanting one) makes having a usable video card really, really simple. I actually still have my X800 GTO2 flashed to an X850XT and overclocked. Laugh if you want, but other than being old and not supporting newer shader models (a big deal for some, not to me), it still puts together playable framerates in many games.

    So, now that I've explained where I come from, I can say that the 4850 has my attention. I'm never going to be the $600 graphics card guy, but being the $200 graphics card guy and being able to turn up all the detail settings for most games and being 'bottlenecked' at the monitor sounds good to me.
  • Final Destination II - Saturday, June 21, 2008 - link

    "Expect another price cut after all 9800GTX's are gone and every one is a GTX+"

    Great. That's the way to kick your own customers in the ass. Remember the price slide on the iPhone? That's what will happen to Nvidia. Frustrated 9800GTX buyers will realize that the precious $300 card they bought 2 weeks ago is suddenly worth $100 less and looking worse than ever.

    Sorry, guy - but that's no reason to buy another one. That's a reason to stick with it and feel sorry every time an HD4850 passes you.
  • Final Destination II - Saturday, June 21, 2008 - link

    55nm NV280? Could it be that you are confusing the 9800GTX+ with your personal wishes?
  • Straputsky - Saturday, June 21, 2008 - link

    Oh, I already expected that Nvidia will use the 55nm process ASAP. But when I look at the ATI chip (about 250mm²) and then the Nvidia one (as you mentioned, about 400mm²), I still wonder if it's possible to use two of them on one board. The power consumption might be less, but I think it's still far too high.
    Have a look at the G92b. It's shrunk, but due to the higher clock it uses more power than the old one. ATI's new X2 consumes a bit more power than one GTX 280, and I think that's the upper end of what might be possible with air cooling. If you want to use two GTX 280s you have to reduce power consumption by nearly 50%. I think that's more than you can get from the 55nm process. Maybe you can put together two GTX 260s with lower clock speeds - but at which price tag? >400€? At that price level people tend to buy two GTX 280s, because price doesn't matter anymore.
    Maybe you're right, but until I see it, I won't believe it.
  • magnusr - Friday, June 20, 2008 - link

    Does it support 7.1 LPCM at 192kHz/24-bit? Good enough to disable onboard sound / audio cards?

    Has it been tested with receivers that support LPCM?

    Are ATI's HDCP handling and drivers just as stable as Nvidia's regarding connections through receivers? I had some trouble with the old 3870, while my current 9600GT works fine.
  • madgonad - Friday, June 20, 2008 - link

    Come on Derek and Anand!

    Give us gaming/HTPC lovers some info.

    What kind of audio can you get out of a BluRay?
    Will games fully utilize all seven channels, or just default to stereo?
    Are any levels of EAX supported?
    Does it have the equivalent of DD Live or DTS Connect so that games will be played in 5.1/7.1?

    These are important questions that nobody has answered due to the overwhelming infatuation with Crysis scores.

    Also, how well does it upscale DVDs? In theory that teraflop of processing power could do a lot.
  • rgsaunders - Saturday, June 21, 2008 - link

    You are wasting your time asking AnandTech to do balanced reviews with something other than gaming scores. I have tried on numerous occasions over the last several years to get them to broaden their testing, but it seems all the reviewers are adolescent gamers. They still haven't figured out that gamers are a minority of users.
  • DerekWilson - Monday, June 23, 2008 - link

    we're always open to new test applications and investigative directions.

    specific suggestions will get the best results. especially if you can point us to a reliable, repeatable, fair test to use for a certain real world application.
  • christianspoer - Friday, June 20, 2008 - link

    Transistor count at HIS' homepage
  • darkryft - Friday, June 20, 2008 - link

    Now this is more like it! Single-slot card with a decent price point that provides excellent performance when doubled up. I like that AMD/ATI have most certainly not caved to simply let Nvidia dictate how it works.

    Mostly though, I like to see cards like this come out to force the price point of the ultra-high end down. With this card or the 8800GT doubled up being able to outrun the GTX 280, to me it pretty much invalidates a $250 surcharge on the component.

    Bang for the buck. Like it should be.
  • Dobs - Friday, June 20, 2008 - link

    To see how these new cards from Nvidia and ATI (AMD) perform with the new Intel processor will be the REAL test, IMHO.
    Also wondering which 3870X2 is being used... Isn't the GDDR4 version (by PowerColor) worth a try? Along the same lines, the 4870X2 with GDDR5 will surely run at cooler temps than a GDDR3 version... and king or not, I'll be getting one just because I know DX10.1 compatibility means so much more than just Nvidia's DX10. (I only upgrade about once every 7 years, so I wanna make it last.)
    Excuse my brain dump!
  • TheJian - Friday, June 20, 2008 - link

    90% of us do NOT have SLI/CF, so we're all after the best single-card solution. HardOCP shows far different results for the GTX 280/260: it blows away all other cards, including the 9800GX2. They turn EVERYTHING on though, which is what I think most would do if they spent $650 on a card (which they won't be charging the day the 4870X2 ships in 8 weeks or so). You'd crank up AA AND AF. I see no mention of AF in the benchmarks here; HardOCP clearly uses it, and they show that both the GTX 260 and GTX 280 can run at 2560x1600 with both AA and AF turned on, commenting that all other Nvidia cards could NOT do this. None of the others were playable, including the 9800GX2 and 9800GTX SLI. This card was built for 24/30in monitors and dominates there. Not to mention the SLI/CF issues you don't have to deal with (stuttering, or not working at all as shown here - how many other games will two cards get NOTHING in?). You need to change your benchmarks to show PLAYABLE resolutions like HardOCP does. I read here and at Tom's Hardware and thought the GTX 260/280 sucked; then I read over at HardOCP and found a completely different story, with these cards taxed MORE and all cards pushed to their unplayable LOWS. It was quite enlightening. Also, what happens when we hit PhysX games in the future?

    Add to this that there will be a die shrink in two months from Nvidia on both of these cards (it already taped out a few weeks back - 12 weeks from tapeout you have cards in hand), and AMD's 4870X2 (which looks like the same month of availability) is in trouble and takes TWO slots to compete. How many of you have SLI/CF? I don't know many; that does cost more, you know. Are all of you running out to buy new boards to run CrossFire? So NV will shrink the chips, likely up the GPU 100MHz, and charge $100-150 less for both cards. So a 4870X2 ($600? it comes with GDDR5, so not cheap) will be facing a faster GTX 280 (Ultra?) than today, likely at $500. We've already heard they'll cut prices when the 4870 hits (I figure $50 from both in the next 2-3 weeks), so another $100 after the die shrink seems doable. The 9800GTX+ is an even bigger problem for AMD. I'm sure an ULTRA version is right behind it if the 4870 is pretty good. A die shrink on G92 should net more than 60MHz, so expect another card based on it but faster shortly, for say $259/269? It should also knock off about 20% power at load, which should come out at around or below the 8800GT. AMD has a good card - no argument there - but that all seems about to change based on pricing, die shrinks, etc. Nvidia seems to have an answer to it all.

    The only thing good here for AMD is audio. But I don't know anyone with an HTPC, and NOBODY with their regular PC hooked to their big-screen TV. On top of that, DTS 6.1-channel (generally just DolbyD 6.1, actually) is the best audio I know of amongst people I know. I don't even know a single person with 8ch audio for their home theater system. That's about as rare as the idea of me buying any card that's $650. It's almost pointless to brag about something none of us has, or will have without spending a shedload of money to upgrade our living rooms. You guys buying 30-50ft HDMI cables and running them through your walls? I question the quality of audio coming from this card anyway. My Klipsch PC speakers make anything but a high-end sound card pointless. Maybe the new AMD cards do an adequate job, but I'd want to hear about that a bit more; I'm no audio expert though. Besides that, $300 for a card just for an HTPC makes me want an Xbox 360/PS3 instead, especially since you can already do 8ch audio with integrated chipsets (8200/G35). Do I really want to play PC games on my 61in TV? In theory, cool. But not in practice - too many downsides. I like them fine on my 24in Dell LCD.

    So nice job AMD, but Nvidia has easy answers. Unfortunately for AMD, they can't start a price war with a $4 billion loss last year, while Nvidia makes $850 mil/year. AMD has to make $250-300 mil just to pay the interest on their $5B debt. FYI, I'm an AMD fanboi... LOL. But I admit I was lured to the dark side by Intel's E4300 (hey, it hit 3GHz easy!), and now the Xeon 3110 (3.6GHz and it runs cold! How can I pass that up?). I hope AMD starts making money soon. Intel's killing them on one side, and Nvidia is keeping them down on the other. Don't forget 2x GTX 280s destroy everything. Because NVDA actually MAKES money and has NO DEBT, they can price to hurt AMD if needed. AMD can't do this. I would buy NVDA (buy in May-June, sell in mid-December... ROFL, 30-80% for the last 5 years... ROFL); I would not buy AMD stock :)
  • goinginstyle - Saturday, June 21, 2008 - link

    "You'd crank up AA AND AF. I see no mention of AF in the benchmarks here; HardOCP clearly uses it, and they show that both the GTX 260 and GTX 280 can run at 2560x1600 with both AA and AF turned on, commenting that all other Nvidia cards could NOT do this. None of the others were playable, including the 9800GX2 and 9800GTX SLI."

    Why even worry about AA at 2560x1600? Have you tried 4xAA at 2560x1600 (probably not, since you have a 24")? It is difficult to tell any difference with AA in most games past 1920x1200, not to mention it is near impossible to tell the difference between 8xAF and 16xAF at that resolution unless you do screen captures for a living or watch slide shows. So who cares about cranking the settings up at high resolutions? You do it at lower resolutions, where it makes a difference. Not to mention, how many people actually have 30" monitors, or really want to spend more on their video card than the processor, RAM, HD, and board combined? I have a 30" monitor and regret it for gaming - great for other things, though.

    "You need to change your benchmarks to show PLAYABLE resolutions like HardOCP does. I read here and at Tom's Hardware and thought the GTX 260/280 sucked; then I read over at HardOCP and found a completely different story, with these cards taxed MORE and all cards pushed to their unplayable LOWS. It was quite enlightening. Also, what happens when we hit PhysX games in the future?"

    I am sorry, but the video card reviews at H suck, although with all of your posts it seems as though you want to increase their hit count. Their idea of a playable resolution is a joke. Also, their benchmark results are not repeatable or comparable between video card reviews, and knowing most of their advertisers sell nothing but NV-based product, I do not trust them at all. I have not since they started falsifying results a few years ago and got caught doing it. PhysX is only good at this point for increasing 3DVantage scores, which is really important for somebody like you. Anyway, I would hope a $650 card could beat a $199 card at high resolutions, and bravo to NV for designing a card that could do it while draining my bank account.

    "But I don't know anyone with an HTPC, and NOBODY with their regular PC hooked to their big-screen TV. On top of that, DTS 6.1-channel (generally just DolbyD 6.1, actually) is the best audio I know of amongst people I know. I don't even know a single person with 8ch audio for their home theater system."

    All the people you know own 7-channel systems? I will give you a hint: 6.1 is 7 channels, dude. Standard DTS and DD are 5.1 formats that are the most widely used behind good old stereo. You need DTS-ES or DD-EX for 6.1 output in those formats (or fake it with DPL IIx or DTS Neo:6), which sucks compared to true 8-channel output. It is obvious to me by now that you are some 13-year-old with a crush on the H who actually does not know what he is talking about. Just spewing trash to up hit counts, same as what Kyle did in his us-versus-AT video thread a few months ago. So collect your $10, go buy this month's Teen Throb, and play a game of Bejeweled on your 8800GT at 4xAA/16xAF to improve your gaming experience.

    "Because NVDA actually MAKES money and has NO DEBT, they can price to hurt AMD if needed. AMD can't do this."

    Why not price the GTX 280 at $199 and run AMD out of the video card business, if they are so smart? NV will not do it, as they hope idiots like you continue to spend $650 for a video card.

    "So nice job AMD, but Nvidia has easy answers."
    Answers like: spend more money than you need to when buying our products, data corruption, crappy mb chipsets that overheat, buggy drivers under Vista, video freezing, image quality reductions in order to over-inflate benchmark scores, LAN problems, crappy product support, viral marketing, and classic answers like "since we do not make a CPU, the CPU is not important". I guess they do have an answer for everything.
  • TheJian - Saturday, June 21, 2008 - link

    The point is I want to see the cards taxed more, so I have a better idea of what I'll deal with at some point in the future. Games are getting more complicated, or have you completely missed that over the years? I don't need my Xeon 3110 to run at 4GHz, but it sure is nice to know I can when I have to in the future. You might not think it benefits us to see benchmarks with the details on high, but if I can run there and you can't, my card is faster and can handle more workload, correct? In the end the extra power is always desired; benchmarking without showing the extra power the cards have seems kind of pointless. There are plenty of pig games like Crysis still to come, so why not see which cards can handle more detail? What do you think benchmarks are for? Did you even read my posts? I don't advocate buying one, and won't until a price drop on the die shrink, if I even do then. I clearly stated that. I have an 8800GT and you're calling me an idiot who spends $650 on cards... LOL. I paid $199. AFAIK the 8800GT has never been $650... ROFL. Get your facts straight.

    I didn't just quote HardOCP, I also quoted PC Perspective. Apparently you really can't read. AnandTech can do the same type of benchmarking to prove them wrong, if they can. It won't happen; PC Perspective got the same results. You dislike HardOCP, so you freaked out (did you miss the part where I said you can search their forums for me dissing them?)... LOL. If you paid attention to my post you'd see PhysX is worth more than you're claiming - GRAW2 is a prime example of how promising it is. I never mentioned 3DVantage, and I only care about game scores. But feel free to dream up some more evidence about me if needed... ROFL. Apparently you missed the entire point of my posts: REAL PLAYABLE SCORES are important to me. Or did you miss that point? The only reason AMD isn't charging $650 for their card is that it is NOT as fast as Nvidia's. The leader charges the most for whatever their product is. Get over it. Don't buy it if you think it's not worth it. I made no claims about whether any card was worth X dollars. Lamborghini charges more than Ford. Sorry, it's a better car, so they charge more... you'd be driving one if you had the money.

    No idea why you're giving me hints about 6.1ch audio being 7 channels... ROFL? I never said it wasn't. I didn't say all the people I know own 6.1. READ IT AGAIN... jeez. I said that's the best anyone I know personally has. Pay more attention to my post and stop blathering. Still foaming at the mouth over HardOCP?... LOL. You go on to prove my point that 8ch audio isn't popular (I know - that was my point... are you blind?): "Standard DTS and DD are 5.1 formats that are the most widely used behind good old stereo." I TOTALLY agree, 8ch isn't NORMAL. Again, that was my point.

    I'm guessing Nvidia doesn't price the GTX280 at $199 because they wouldn't please their shareholders if they didn't make money. I don't even understand how you can make stupid comments like this while calling me an idiot in the same sentence...WTF? I don't buy $650 cards and actually stated a price drop and die shrink are needed to get me interested. I state I own a 8800GT and you accuse me of being an idiot buying $650 cards...Whatever. Nvidia has pricing power because they have the better cards. Sorry. If it wasn't the case they wouldn't charge it. The market dictates pricing. When AMD comes out with a better product, Nvidia responds (GTX+ and a price cut today proves my point).

    Buggy products? Everyone is guilty of that. No company has a perfect record. ATI's drivers used to be ATROCIOUS. AMD has had tons of problems with chipsets, cpu bugs, patches needed etc. I guess you selectively read news. Continue to live in your dream world. IF Nvidia produced a crappy product they wouldn't be king of the hill for long. The only time this isn't the case is when monopolies exist (Intel/MS). When Nvidia released the 5800fx we ran to ATI. Every company out there inflates benchmarks when they can...Even ATI...Sorry. I think iPods suck, but they still sell. Apple's stock flies higher etc. My points about "ANSWERS" to AMD's products were more financially related, but I guess you missed that. Performance generally dictates pricing, which is why I think Nvidia will make more money than AMD this year :) Feel free to prove me wrong at the next quarter...ROFLMAO. AMD is doing a good job of running themselves out of business; Nvidia doesn't need to help them. How many quarters have they lost money for now? FYI I'm 37, not 13. Calling someone else an idiot for buying a $650 card just means you can't afford one, so you complain about people who can and do buy one. Don't forget our bargain cards are made on the backs of those $650 buyers. I love these people :) Don't forget you wouldn't even have a JOB without the rich people creating one for you...As for these $650 people being idiots: if they can afford that card, they must be doing better than you, eh? :) Flame on.
  • sigsauer228 - Friday, June 20, 2008 - link

    What a fucking moron
  • TheJian - Friday, June 20, 2008 - link

    Spoken like a true lover. Quit loving websites and pay attention to the data, no matter where it comes from.

    Let me know when you can argue with the data I just put up from hardocp or pc perspective.

    It appears you have nothing.

    Now take a good look in the mirror and repeat yourself. :) Go ahead and grin about it now. You're right...LOL.
  • DerekWilson - Friday, June 20, 2008 - link

    with the exception of crysis, all our benchmarks are tested with the highest possible quality settings in game with the AA setting noted for each test.

    higher than 4xAA at 2560x1600 with a 30" panel doesn't get you a good ROI for lost performance and we wouldn't recommend it even if you spent $1300 on 2x GTX 280 cards.

    as for AF, we set it to the highest level possible in game where there is an option. for games that don't give us the option we force 16xAF in the driver (this pertains to oblivion for this article).
  • TheJian - Friday, June 20, 2008 - link

    I haven't seen anyone testing above 4xAA at 2560x1600. The numbers are still odd for even the avgs. Take COD4 at 1600x1200. Your test shows 64fps 9800GTX vs 78fps for 4850. At PcPerspective 77fps vs 79 respectively. FAR closer. I'd really rather see minimums anyway as those numbers actually affect my gameplay the most. As you can see from the other two sites things REALLY look different if you're examining playability instead of just avg fps.

    The 9800GTX+ is already benchmarked and basically shows them dead even. In Crysis it's a victory for the old one AND the new one.
    I pray for minimums in your future articles. :)

    I wouldn't recommend anyone spending $1300 on video cards... :)
  • Spoelie - Friday, June 20, 2008 - link

    SLI/CF <-> single slot solution. I think you mean single card. The 4850 is a single slot, the GTX cards are dual slot, so no, the title of best single slot belongs to ATi this round, not NVIDIA.

    Do not compare testing methodologies of hardocp with anandtech, there have been enough discussions about this in the past. Each has its values, some prefer one, some the other. Physx is an unknown regarding future use and performance, so no factor in evaluating these cards now.

    The 4870x2 is most likely a single card, dual slot solution. The current GT-200 and most likely the die shrink will be a single card, dual slot solution. The form factors are the same. You do not need a crossfire board to run a 4870x2.

    The numbers (prices, clockspeeds, performance) you spew forth are all grabbed out of thin air (shoulda coulda woulda), biased towards nvidia. Wait till the reviews instead of inventing a plausible scenario.

    Your comments about audio are all conjecture and baseless ('I question the quality anyway' based on what??).

    Nvidia has no easy answer, AMD did target the right price ranges at the right time, I'm guessing they will reap the benefits of that the coming 2 months.
  • TheJian - Friday, June 20, 2008 - link

    Does PC Perspective count? They produced the same results and more showing domination for GTX280. And it's not a dual slot solution, it's ONE card/chip as in you don't have to use driver tricks etc to get your performance. The single chip king of the hill. I was meaning 2 x 4870's because you can't get the 4870X2 until later (per AMD's own statement here at anandtech, it will come in 2 months IIRC). So until then we're talking GTX280 with no driver tricks vs 2 AMD's or 2 Nvidia's previous gens.

    Tough luck, I'm comparing methodologies when one sucks. Averages, as shown at both sites in my previous post (pcper and hardocp), do NOT tell the TRUE story. What good does an avg score do me if my game is unplayable at the tested resolution? Sure you can give it to me, but you'd better also give me what to expect at home, which is the MINIMUM I can expect. GTX280 clearly dominated 9800GX2/9800GTX SLI at both sites in minimums, at higher quality while doing it.

    Of what value is the avg I can get if I can't play it there?

    Is x264 encoding that is 5x faster than an X6800 Core 2 not worth mentioning? Should we not futureproof ourselves for apps like Photoshop CS4, which will use the GPU? Should we not think PhysX will take off, with Intel/AMD/Nvidia all trying to do it in one way or another?

    We've seen PhysX speed things up, even if not much uses it currently, but it is PROVEN to work and fast:
    When they let the machine do the work without PhysX they couldn't get 5fps...LOL. With the card on it hits the 50's! This was GRAW2 last october. What will Xmas 2008 hold? Since all 3 Intel/AMD/Nvidia are headed there what will we get? From the link "The question is the same people asked when new features like HDR, AA or AF were introduced into games: "Is this addition worth it?" While it may not have been impressive in past games, the answer today should be yes." This was rolling up on a year ago...ROFL. Don't forget they are even being sold in Dell's laptops (XPS 1730) and a bunch of OEM pc's from the likes of Medion, Acer, Dell and HP. By xmas it will probably be on cheap cards. It's something you must consider when purchasing. Even 8800 series cards can run physx code. This all puts a lot of cards in users pc's by xmas. An audience that's this big will be written for.

    Price not from thin air. The current 9800GTX is dropping about $30-40 and the die shrink is dropping the chip size by 40%. It's reasonable to think they'd drop the current $429 price of the 9800GX2 card if they're saving that 40% die size 2 times on a card, correct? I'm thinking $370 or so for the 9800GX2+ is easy. Don't forget they have to price competitively to AMD's offerings. This isn't thin air, pal.

    Speeds aren't from thin air either. The speeds of the 9800 GTX+ are known (738MHz core, 1836MHz shader, 1000MHz GDDR3), and overclocking headroom is expected. Meaning an ultra should be easy if needed. Have you ever seen a DIE SHRINK that wouldn't net 10% clock speed or more? Nvidia is clocking it low. Most get 15-20% from a shrink; after the shrink matures maybe more. Shoulda coulda woulda what? Read my above reply for all the performance numbers for minimums etc.

    Can you prove anything about the quality of the audio coming from AMD's cards? I base my opinion on listening to many audio sources in other areas, such as car amps. They are not equal. A high watt amp can distort all to crap for less money than a low watt, high current, expensive amp, and the high current one blows the other out of the water. I've heard wind in the UT flyby on Klipsch V2-400's that wasn't there on MANY other speaker brands. I never knew wind existed in the game until I played it on Klipsch. Maybe I heard it, but it sounded like distortion if I did, NOT WIND. Onboard audio does NOT sound like an Audigy2/4/Xfi etc. I'm an ex PC business owner (8 yrs) and a certified PC tech. I've seen a lot of different audio solutions, built a ton of pc's and know some don't sound like the others. It's not baseless. I simply said I NEED TO KNOW MORE, and nobody has proven anything yet. Unless you can point me to someone saying "it's tested, and I love the audio". They are using their OWN codec you know, or are you NOT aware of this? I'm unaware of AMD's super audio prowess that you seem to think they have. Your statement about what I said is BASELESS CONJECTURE.

    I answered your last part above in my other reply. Nvidia has all the answers. Sorry. Care to place a bet on which one makes money this year? Care to place a bet on FY09?...Don't make me laugh. Care to take a look at what happens to nvidia stock from May 1st to Dec 15th EVERY YEAR FOR THE LAST 5 YEARS (probably longer, I just didn't go back farther)? 75%, 75%, 30%, 80%, 80%. Care to look at AMD's last 5 years? Pull up the charts. It's not conjecture. :) I'm not saying AMD doesn't have a good card. I just think Nvidia has better stuff. They shot too low, just like phenom vs Core2. I'll buy whatever is the best for the buck for my monitor (1920x1200). Note I owned a 9700 and 9800pro, and many AMD chips (only when they oc'ed like crazy). Now it's an 8800GTOC and an Intel Xeon 3110. Whatever works the best.

    Buy some AMD stock...I dare you...LMAO.
  • Final Destination II - Friday, June 20, 2008 - link

    If reality doesn't fit in his brain, the average Nvidia-cocksucker has to talk till bubbles come out of his mouth.
    Well done, Sir!
    Unfortunately that doesn't change reality.

    Plus, you completely miss the point of this article. It's about the HD 4850 (which beats the performance-per-price crap out of your beloved green company's products). Not the 9800GTX, not the 280GTX, not your childish notions of computer hardware for "real men" (really, that was the biggest laugh besides your infantile over-usage of acronyms for 12-year-olds...).

    The HD 4850 destroys any over-priced card NVIDIA tried to sell us and that's why it will be bought and loved.
    If it wasn't for ATI, NVIDIA would still charge us fantasy prices.
    Fortunately, the HD 4850 also scores on power, performance, features and price.

    Just. Accept. It.

    This round Nvidia has lost, no matter what they do; although I have to admit that I like it when you get out your crystal ball and foretell us the future... look! We have an NVIDIA insider with us! Jian is directing the company...

    BTW, even Anandtech is biased towards Nvidia: While using the latest Geforce driver, they refuse to utilize the latest Catalyst, which promises lots of performance gains (ranging from 10%-20%) as far as I've read the changelog...
    If that holds true, the HD 4850 will be an even better deal, thus earning only _one_ word:

  • DerekWilson - Monday, June 23, 2008 - link

    we used drivers later than the 8.6 public version -- we used beta press drivers given to us by AMD for performance testing with 4850.
  • madgonad - Friday, June 20, 2008 - link

    I am only going to question one of your points.

    The 'sound' coming out of the HDMI port on these cards is digital. That is, zeros and ones. There is no 'improved quality' with a digital signal. It is either there, or not there. I have Klipsch speakers too, all Reference BTW. All you should be concerned about in maximizing your audio experience is the quality of the device that is doing the DAC (Digital to Analog Conversion) and what quality audio your software is putting out. The only thing this card can't do is the most recent EAX profiles, but Microsoft killed EAX with Vista anyway. This 'video' card will fully support 7.1 LPCM channels of HD audio. The only way to beat that is to buy Asus's soon-to-be released HDAV sound card that will cost more than a HD4850.
  • Proteusza - Friday, June 20, 2008 - link

    You aren't an AMD fanboy, you just love nvidia.

    1. If you use a res of 2560 x 1600, I think you can afford a better graphics card than an 8600GT. So I think it's reasonable to assume that people who buy screens capable of such resolutions will put some thought into using either SLI or crossfire, or a dual-gpu solution such as a 9800GX2 or 4870X2.
    2. This is the 4850, not the 4870, and it comes close to the GTX 260 in some benchmarks while costing half the price. That's not bad.
    3. The 9800GX2 beats the GTX 280, and I don't know which benchmarks you have seen. Every benchmark that I have seen, including those at Anandtech, shows this to be the case. After HardOCP recommended the 8600GTS over the X1950 Pro, I don't put much stock in what they think anymore.
    4. 4850 CrossFire beats the GTX 280 in most cases. What do you think a 4870X2 will do to it? Yes you can SLI GTX 280's, but for how much? And what kind of PSU will you need to run it?

    Every review/preview of this card says basically the same thing - for the price, it can't be beat. Let's face facts - nvidia's card has 1.4 billion transistors and is made on a 65nm process. That makes it hugely expensive to make. The RV770, on the other hand, is made on a 55nm process and has 500 million fewer transistors. That makes it much cheaper to produce. Yes Nvidia can move their process to 55nm, but that is hugely expensive to do, not to mention it wastes the 65nm GTX 280's. They are in a tough place - their card doesn't really perform as well as its 1.4 billion transistors would imply, and nvidia has no option but to produce it, no matter how expensive it is.
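    For what it's worth, the die-economics argument above can be put into rough numbers. A minimal sketch, assuming an idealized linear shrink (real process transitions never scale perfectly, so treat the output as illustrative only):

    ```python
    # Idealized die-area scaling between process nodes.
    # Assumes a perfect linear shrink, which real transitions never quite achieve.
    def area_ratio(old_nm: float, new_nm: float) -> float:
        # Die area scales with the square of the linear feature size.
        return (new_nm / old_nm) ** 2

    ratio = area_ratio(65, 55)
    print(round(ratio, 3))  # ~0.716 -> roughly a 28% smaller die, best case
    ```

    Even under this best case, a 65nm-to-55nm move only buys back part of the cost gap created by a 500-million-transistor difference, which is the point being made about the GTX 280's economics.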

    Do you know that nvidia has just announced a 9800 GTX+ manufactured on a 55nm process? Why would they do that if they weren't threatened by these cards?

    I own an 8800 GTS 640 right now, and I love it. But I'm still glad that AMD is competitive again. Just look what it made nvidia do - drop prices. That's good for everyone.
  • TheJian - Friday, June 20, 2008 - link

    I don't love nvidia, just the money they make me. Used to love AMD for the same reason :)

    1. I never said anything about an 8600GT. I want AA and AF both full blast in all games, I think anand needs to rerun some benchmarks doing this which shows the benefits of 512bit bus. They didn't turn on enough stuff to push the cards like hardocp did (to separate the men from the boys so to speak). You might not like the guy (you can dig into his forums and find a post from me ripping him a new one for a few of his statements about AMD...LOL), but the benchmarks don't lie. Anand just gives avgs, which tells me nothing about when a game dips below playability. AMD got AA correct this time, no doubt about that, but I'd like to see AA and AF turned up before making judgements and that's just what hardocp did. Why no reports of how 4xAA/16x AF changed the picture (both on)? My current 8800GT can't do what I want in ALL games at 1920x1200 (native on Dell 24in). Hardocp showed these cards get there.

    2. I don't think they get that close if you turn up the AF and AA together. Hardocp turned on AF+AA and saw the GX2 and 9800GTX lose to both new cards in some cases (GTX260 and 280).
    3. Check hardocp, it's on their front page.
    Crysis has more turned on with the GTX280 (16xAF) vs the GX2 (8xAF) and even has more details on HIGH while beating it. The GX2 drops to a low of 12fps, while the GTX280 drops only to 17, with more turned up (1 setting at medium for the 280 vs 6 at medium for the GX2).
    Assassin's Creed was better on both the GTX260/280. The 280 had EVERY detail maxed out, which the GX2/9800GTX SLI couldn't do playably. They go into detail about the 512bit bus being the reason why it's playable.
    Look at the apples-to-apples COD4 scores, where the 9800GTX SLI drops to 11fps with 4xAA+16xAF, while the 280 holds at 32fps. Even the GTX260 only drops to 24fps. That's the difference between playable and NOT, at 11fps. They noticed the large TANKING of fps in large grassy areas.
    Age of Conan showed the GTX280 was the only card playable at 2560x1600, and at 1920x1200 it ran with higher details than the older cards.
    They show the same thing. AA+AF=higher minimums for GTX280 vs. 9800GX2. So it's in more than one place. "The average frame rate doesn’t change much but the minimum is significantly faster – a big plus for a fast paced shooter like COD4. "
    Check page 9 for Call of Juarez, where the GTX280 has a min of 16 vs the 9800GX2's min of 9. That's almost 2x the minimum fps. NOT playable on the GX2 at 9fps.
    Check page 10 for Company of Heroes: "Wow, talk about some improvements here! The GTX 280 simply walks away with this win and even at 1600x1200 the results are impressive. It is 42% faster than the 9800 GX2, 56% faster than the 9800 GTX and 76% faster than the HD 3870 X2 at 1920x1200!"

    Uh....DOMINATION! Note these guys use min/max/avg same as hardocp. Is an arbitrary avg # telling us anything about our playing experience? NOPE. We at least need a minimum. Another quote, for the GTX260 instead: "The Company of Heroes test points out some additional benefits of the GTX 260 compared to other three cards it's compared against. It easily outperforms the HD 3870 X2 as well as the 9800 GX2 in all resolutions."
    Heck, even the GTX260 dominated GX2! They saw 100% minimum fps improvement! Lost planet for them showed GX2 going down in defeat again. UT3, another victory for GTX280. Slaughter in min fps at 1600x1200.
    World In Conflict from page 13 at pcperspective "Wow, what a difference a generation makes. World in Conflict has always been accused of being too CPU bound to be any good, but these results paint an interesting picture of the situation. At 2560x1600 the competition can’t even get in the same hemisphere as the GTX 280 is 4x faster than the GX2 and 3870 X2." At 1920x1200 it was 1.6x faster! NOTE the AA+AF is on high!
    Crysis? "At first glance the numbers from the GTX 280 aren’t that much more impressive than that GX2 card in Crysis – the real key though is to look at the minimum frame rates and how the GTX 280 is consistently more stable graphically than the dual-GPU cards. At 1920x1200 the minimum mark that the GTX 280 hits is 44% higher than the HD 3870 X2 or 9800 GX2."

    Getting the picture yet? MINIMUMS are important NOT AVG fps. I love PLAYABLE gaming. The rest of these sites need to catch up to this line of thinking. People should buy a card for PLAYABLE fps at the res they want for their monitor. NOT for avgs that won't cut it in the real world.
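    The min-vs-average argument is easy to demonstrate with made-up frame times (a sketch; the numbers below are invented for illustration, not taken from any of the reviews cited):

    ```python
    # Why an average fps number can hide an unplayable dip.
    # Frame times in milliseconds for two hypothetical cards over the same run.
    card_a = [16, 17, 16, 18, 16, 17, 16, 18, 16, 17]  # steady, ~60 fps
    card_b = [7] * 9 + [100]                           # faster on average, but stutters

    def avg_fps(frame_times_ms):
        # frames rendered divided by total seconds elapsed
        return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

    def min_fps(frame_times_ms):
        # worst instantaneous rate comes from the slowest single frame
        return 1000.0 / max(frame_times_ms)

    print(round(avg_fps(card_a)), round(min_fps(card_a)))  # 60 56
    print(round(avg_fps(card_b)), round(min_fps(card_b)))  # 61 10
    ```

    card_b posts the better average yet dips to 10 fps during the run, which is exactly the failure mode a min/avg report catches and a bare average hides.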
    A GPU encoding x264 video 5x faster than a Core 2 X6800?
    Photoshop CS4 will use this stuff. More than a gaming GPU eh?

    Yes I know of the GTX+ (with a 9800GX2+ probably right behind it for $350...). You did catch that I mentioned the die shrink of the GT200 that already taped out and should be out in two months anyway, right? That will bring the cost down and allow a GTX280x2 card. I didn't say Nvidia wasn't worried. I said they have an answer for all of AMD's cards. The + versions of both the 9800GTX and GX2 cards will fill the gaps nicely for nvidia until the GT200 die shrink in two months in the high range. The 4870x2 won't be out until about then anyway. So as I said, the 4870x2 will be facing a die shrunk/sped up GTX280, and a month later the GTX280x2 card, likely, if it's even needed. I'm glad AMD is in the game, don't get me wrong. But just like phenom, they needed MORE here also. I already said I'd wait for a price drop and a die shrunk version before I'd bite. I just hoped for more from AMD and a bigger nvidia drop...LOL. It's NOT hugely expensive for nvidia to switch to 55nm on the GTX280. It's already taped out and should be out in 2 months or so. They don't own the fabs you know. TSMC already produces for them at 55nm.

    One more note: look at the reviews to understand VISTA sucks. XP benchmarks dominate Vista. Extremetech shows this with MANY games (especially DX10 vs. DX9) and the maker of DirectX (Alex St. John, in his review at extremetech) says DX10 SUCKS. It's a slow pig vs DX9 because it's further from talking DIRECT to the hardware. You want 25% performance back? Switch to XP. :) Let the flames begin...ROFL.
  • formulav8 - Friday, June 20, 2008 - link

    It would be interesting to see if these new drivers will do anything for the new card. The article looks like it uses version 3.5. So rerun the benches with 3.6 maybe?? :)

  • DerekWilson - Friday, June 20, 2008 - link

    we did not use the 3.5 catalyst drivers.

    we used the very latest beta driver that ATI can provide.
  • formulav8 - Sunday, June 22, 2008 - link

    Now that I've looked into it more, AMD's release notes don't specifically say the 8.6 version supports 4850/4870 cards...
  • formulav8 - Sunday, June 22, 2008 - link

    I know this is a late reply but I will do it anyways. :)

    I am just going by what your testbed specs say...


    Video Drivers

    Catalyst 8.5
    ForceWare 177.34 (for GT200)
    ForceWare 175.16 (everything else)


    If you used Beta drivers you probably should have updated that in the specs and whether they are based on 3.5 or 3.6. It would still be nice to see some results with the new official 3.6 driver though. :)

  • formulav8 - Sunday, June 22, 2008 - link

    I meant to say 8.5 and 8.6 not 3.5 and 3.6 :)

  • goinginstyle - Friday, June 20, 2008 - link

    Is the beta based on 8.5 or 8.6?
  • jpeyton - Friday, June 20, 2008 - link

    Newegg had the Asus 4850 for $199 with a $30 MIR to bring the total to $169.

    People are reporting seeing the Visiontek 4850 on Best Buy shelves for $199, but this week's ad has all Visiontek video cards at 25% off, so that brings it down to $149.

    $149 for a 9800 GTX killer? AMD is turning the GPU price/performance market on its head overnight with this release.
  • jovdes018 - Friday, June 20, 2008 - link

    If only amd/ati could polish performance scaling with future drivers, a 4870x2 in crossfire mode would really drop the hammer on every nvidian. So excited!!!
  • BikeDude - Friday, June 20, 2008 - link

    and how do these cards perform playing movies? Without resorting to the malware package aka "PowerDVD"?
  • msgclb - Friday, June 20, 2008 - link

    On your Crysis benchmarks why don’t you tell us if you’re using AA. You do for most of the other benchmarks. Does the lack of any AA = No AA? I was going to ask DX9/DX10, 32-bit/64-bit but you do list your operating system as Windows Vista Ultimate 64-bit SP1 so I’m guessing that you’re using 64-bit DX10.

    I get lousy Crysis results using Vista 64-bit DX10 with my dual 8800 GT SLI system but I get better scores than yours if I use DX9.

    I appreciate all the work you put in to keep us informed.
  • DerekWilson - Friday, June 20, 2008 - link

    we do 64-bit dx10 noAA for crysis.
  • bob4432 - Friday, June 20, 2008 - link

    hopefully this will drop the price of the 3850 :)
  • GlassHouse69 - Friday, June 20, 2008 - link

    When is a tech company going to stop sucking NDA manchicken?

    anandtech has lots of bruises on their knees.

    "stay tuned kids for the 4870 while we polish off this knob!"
  • Makaveli - Friday, June 20, 2008 - link

    wow GlassHouse69,

    Do you even know what an NDA is, you twit? Let's start breaking NDAs left, right and center. Say goodbye to access to all this prereleased hardware and the articles. I won't even mention the fact that you can get into legal trouble as well.

  • strikeback03 - Friday, June 20, 2008 - link

    Well, the other option is to not be given any information until after the release of the hardware, as Canon did with DPReview a few times when they were concerned about early leaks. Of course, they could always practice such high-level journalism as publishing rumors.
  • crackedwiseman - Thursday, June 19, 2008 - link

    Is it possible that the scaling issues at 2560 x 1600 have to do with the bandwidth of the crossfire links? I have a pair of Radeon HD 3870s, and the ATI Catalyst control center notes that the cards must be connected by 2 crossfire links in order for full crossfire support of 2560 x 1600 resolutions.
  • rudolphna - Thursday, June 19, 2008 - link

    It's so nice to see AMD finally competitive again. :) My next PC will feature an AMD Phenom 8450, an AMD 770 motherboard, and an AMD Radeon 4850 :) (yes, I'm an AMD fanboy, sue me lol) Now they just need to get the Phenom clock speeds up and they will be back to full competitiveness. This is a good sign, keep it up AMD! =D
  • anindrew - Thursday, June 19, 2008 - link

    Very well written article. I liked how you poked fun at the fact that you couldn't give all the details about the card. At first, I felt that you were being a bit pro-Nvidia, but as I continued reading, it was very clear that you were being objective by letting the numbers (FPS) do the talking. You even compared their market strategies!

    The 4850 for $199 is an incredible price/performance ratio. As someone who is getting ready to build a new PC in the next few weeks, I was extremely interested to see this article. Considering how well the 4850 performs, I'm really interested to see how the estimated $299 4870 will perform. Hopefully, it will have a similar price/performance ratio.
  • Clauzii - Thursday, June 19, 2008 - link

    It was a long march towards this for AMD/ATI. The coming together. The whole Crossfire stuff with cables etc. The 'not as good as expected' 2900XT etc. Paper launches. Seems they are getting into shape. Let's see what COU's AMD comes up with, to pair these beasts.

    I especially loved the straight line at the 60FPS-limit in AC. Brilliant, considering every other card was at a falling slope :)) And the more than 100% scaling in COD4. Believe me, I'll be in shock even after sending this comment :O

    OK, two 4850 cards for what I would have paid for one card 6 months ago. Haven't I waited long enough? Alright!!
  • Clauzii - Thursday, June 19, 2008 - link

    COU's???? What was I seeing :-/

    "CPUs", of course :)
  • Pottervilla - Thursday, June 19, 2008 - link


    Running GPU-Z we see that the Radeon HD 4850 shows up as having 800 stream processors, up from 320 in the Radeon HD 3800 series. Remember that the Radeon HD 3800 was built on TSMC's 55nm process and there simply isn't a smaller process available for AMD to use, so the 4800 most likely uses the same manufacturing process. With 2.5x the stream processor count, the RV770 isn't going to be a small chip, while we can't reveal transistor count quite yet you can make a reasonable guess.

    So much for not telling us--your GPU-Z screen shot says 55nm right on it. :)
  • gochichi - Thursday, June 19, 2008 - link

    Until new games come out, I see no reason to have more than one HD 4850 card. (The obvious exception being having a 30" ultra-high res LCD, which is kind of nutty when you could have a 32" 1080P HDTV for gaming, or one of the awesome 24" LCDs for a handful of bills.) I am really excited about AMD's continued improvements on the HDMI audio, I think HDMI is absolutely the best way to deliver audio in the future and AMD is making it really hard to select any other company if you're planning on connecting your computer to an HDTV. MY computer monitor supports HDMI, and then I can just connect headphones to it which is really nice because it prevents clutter. Why the heck would you get a pricey Sound Blaster when you can just use one simple solution for all of it (not that I believe in sound cards, I think they've been silly for 5+ years).

    The performance is really spot on, it really fits the consumer's needs. I still don't think that given the number of games out and the great performance of a 8800GT or even an HD 3870 there would be too much reason to upgrade. I'm a hypocrite though, because my mouth waters at the difference in COD4 at 1200P between my HD 3870 and HD 4850... though I simply don't use 4X AA (I disable AA). If I were buying today, I think I'd choose the HD 4850 at $200 over the 3870 at $130.00 but it's just because the 4850 is so likable (it's unpretentious, single slot, feature rich).

    Because there is "no need" for these products, AMD is right to price these things at $200.00 and not at $650.00. NVIDIA is really dropping the ball IMHO. Though their price drops on old tech will protect them for a bit, the total lack of 10.1 support, the lack of awesome HDMI audio (AMD now has the best available on the PC, it sounds like) and a less complete video decoder than AMD all make NVIDIA simply uncompetitive at this point. Why settle for fewer features at the same performance?

    The other thing is that when you're talking $150 vs $200, people are going to just go for a nice HD 4850. Now, $200 to $650??? Well... I just don't think that the games nor the CPUs are out yet that would merit this expense.

    Let's see... COD4 is still the best game on the PC and it runs fantastic on the HD 4850. Crysis is a benchmark (that still porks out every single configuration you throw at it IMHO). The 8800GT and now the HD 4850 are important as well because these cards give PCs room to outperform the PS3 and XBOX 360 even at 1200P (1920x1200) when compared to the consoles' 720P. There is absolutely no PS4 generation console on the horizon. These cards now ensure that PC gaming is affordable to everyone interested and put PC gaming in the most affordable place I've ever seen it in my life. Absolutely never could you have a near perfect video card (given the games in the market, and the highest resolution displays) for $200.00 (though the 8800GT was an exception during the week it launched).

    This is really an exciting release, I can't fathom picking up a console right now. CPU's are so nice and cheap, as is RAM, and now these cards, it's just great. NVIDIA: $700-800 is the target price FOR THE WHOLE gaming PC, not just the video card, get it right.
  • Forbin Rhodes - Thursday, June 19, 2008 - link

    I can't wait until ATI comes out with the 4870 X2. I think NVIDIA can throw their TRI SLI in the dustbin?
  • Clauzii - Thursday, June 19, 2008 - link

    Crossfire two of those ;)

    (starts looking for a humongous PSU...)
  • rudolphna - Thursday, June 19, 2008 - link

    lol oohhh yeah.. I'll be looking for Anandtech to be reviewing PCP&C's newest 2kW power supply with 200 amps on the 12V rail :)
  • rudolphna - Thursday, June 19, 2008 - link

    PS. (to PCP&C) Switch to 120mm fans; imagine how loud a 2000 watt psu will be with an 80mm fan cooling it :)
  • xsilver - Friday, June 20, 2008 - link

    the 80mm fan would require its own psu ;)
  • Clauzii - Thursday, June 19, 2008 - link

    There goes the carrot cutter :))
  • Devo2007 - Thursday, June 19, 2008 - link

    I can walk into a local retailer and pick one up right now (yes, they are actually showing stock on three different cards).
  • Goty - Thursday, June 19, 2008 - link

    Something is VERY wrong if a 1000W rated power supply can't boot a system that draws less than 500W at load. Most sites recommend a 500W-600W power supply to run a 4850 CF system, which should be PLENTY of power.
  • Creig - Thursday, June 19, 2008 - link

    That's exactly what I was thinking when I read that part of the article. A 4850 supposedly only pulls 110w. So if I was conducting the review, I would have immediately suspected a defective power supply, not an inadequate one.
  • bob4432 - Friday, June 20, 2008 - link

    exactly what i was thinking....ocz quality????
  • JarredWalton - Friday, June 20, 2008 - link

    I believe the 1000W PSU having problems was specifically in regards to GeForce GTX 280 SLI - though Anand or Derek would have to confirm. The other factor that I don't know is whether the PSU is the problem or perhaps Derek just has really bad electricity in his house. I know I've had no difficulties with even 550W PSUs and 3870 CrossFire (with a Q6600 overclocked to 3.30GHz).
  • Sunrise089 - Friday, June 20, 2008 - link

    Derek should really clarify the source of the problem then Jarred. We all know on forums everyone says you need a 600 watt PS to even run integrated graphics, but one reason I love AT's real power draw numbers is that they show how little power most sane systems really need. But casually mentioning a 1KW unit isn't enough for even 4850 CF and not explaining further is about as close to pure FUD as I've seen here. Reply
  • DerekWilson - Friday, June 20, 2008 - link

    all these tests have been done at Anand's place and at-the-wall power should not be a problem for any of these recent articles.

    we did have problems with our 1kW thermaltake and our 1kW ocz PSUs with the GTX 280 in SLI. we couldn't get through a crysis run.

    in testing 4850 crossfire, the 1kW ocz power supply (elite xtreme) failed during call of duty.

    we had no problems with the 1200 W pcp&c turbo cool PSU we now have installed.

    our peak power numbers were shown using one of 3dmark's GPU-only feature tests. this is in order to isolate GPU power as much as possible for comparison between different graphics cards.

    power draw at the wall will be MUCH larger when playing an actual game. this is because the CPU will be under load and system memory will likely be hit harder as well. we will see the hard disk active too.

    i do apologize for not explaining it further. knowing what app we used to test power would probably have done enough to explain why the PSU crashed under game tests but not under our power test with a 1kW PSU ...

    4850 crossfire and up and gt200 sli and up need absolutely massive amounts of power to run. we would be the first to say that a 1kW PSU was enough if it were -- but it is not.
  • semo - Saturday, June 21, 2008 - link

    so how much are you drawing at the wall? just saying "MUCH larger" doesn't mean anything.

    this also doesn't make much difference as power ratings refer to how much can be delivered to the system - not how much can be pulled from the socket.

    in other words, there seems to be some confusion. could we get some clarification the next time you do a review for GPUs (e.g. at the 4870's launch)?
  • flagpole - Saturday, June 21, 2008 - link

    I have a 650w Silverstone Zeus ST650ZF powering my system right now, and it's handling a pair of 4850's Crossfire'd fine.

    Not to mention the 4 harddrives, 5x 120mm fans, Swiftech water pump, an AMD 64X2 4400+ @ 2.7 Ghz, plus various other things like LED's and Cathode tubes sucking back power as well.
  • HOOfan 1 - Friday, June 20, 2008 - link

    How about the fact that nvidia has 2 CWT-built 1000W units certified on SLIzone for dual GTX 280?

    It really perplexes me that you guys think a 1kW PSU wouldn't be enough for GTX 280 SLI or for 4850 Crossfire. An 800+ watt PSU should be enough for either. Nvidia even certified the Zalman 850 watt for dual 9800GX2. Jonnyguru stated that there was a problem specific to the GTX 280 that was not the fault of the PSUs.

    I think you guys really ought to have a talk with nVidia and ATI about this before you claim that a 1kW PSU isn't enough for dual GPU with these two cards... because quite honestly that claim sounds rather preposterous to me.
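    The power math works out roughly like this (a back-of-the-envelope sketch; the 75 W figures are the PCI-E spec limits cited above, while the rest-of-system number is purely an illustrative assumption):

    ```python
    # Worst-case PCI-E power budget for HD 4850 CrossFire.
    # The spec caps the x16 slot at 75 W and one 6-pin auxiliary
    # connector at 75 W, so each card simply cannot draw more than 150 W.

    SLOT_LIMIT_W = 75        # max draw through the PCI-E x16 slot
    SIX_PIN_LIMIT_W = 75     # max draw through one 6-pin connector
    CARDS = 2

    gpu_worst_case = CARDS * (SLOT_LIMIT_W + SIX_PIN_LIMIT_W)

    # Assumed (not measured): CPU, motherboard, RAM, drives, fans.
    rest_of_system = 250

    total = gpu_worst_case + rest_of_system
    print(f"worst-case GPU draw: {gpu_worst_case} W")   # 300 W
    print(f"estimated system total: {total} W")         # 550 W
    ```

    Even with a generous allowance for the rest of the system, the total sits around 550 W, nowhere near 1000 W.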
  • strikeback03 - Friday, June 20, 2008 - link

    I was wondering the same - the review says they had power supply problems with 2 4850s in CF, even though the table directly above says that configuration drew 335.7W total system power. Reply
  • Sunrise089 - Thursday, June 19, 2008 - link

    Why the heck are you guys having power supply failures with this card? I know it draws a decent amount of power, but when your load numbers are less than HALF the rating of the power supply something seems fishy. Reply
  • BPB - Thursday, June 19, 2008 - link

    I thought these cards are supposed to be better than current ATI cards for HD movies. Did you get a chance to play any movies? And if so, how was the audio? Reply
  • jay401 - Thursday, June 19, 2008 - link

    75C idle, 90C load is insane, i don't care how well the components can tolerate it. It's like an oven inside your case, and -something- will give eventually on it because those temps are nuts. Why does AMD/ATI have such trouble putting out reasonably-temped cards even after yet another die shrink? :( Reply
  • Clauzii - Saturday, June 21, 2008 - link

    They used the die-shrink to ramp up performance, which they needed AND achieved :)

    I hope some Arctic cooling solution will show up even though two slots might be used.
  • docmilo - Thursday, June 19, 2008 - link

    I browsed on over to the Egg and did a search on 4850. A whole bunch of cards popped up at $199.99 and one even has a rebate! I wonder how long until it stops saying "Buy Now" and goes to "Autonotify". Reply
  • chizow - Thursday, June 19, 2008 - link

    You guys did a nice job of covering both the pros and cons of the 4850 and CF, showing some of the pitfalls of relying on multi-GPU solutions for performance. You also made mention that similar performance gains were seen long ago with the 8800GT.

    That said, the 4850 is certainly a good part from AMD and there's definitely some very interesting things they've done with this card. You hinted at a lot of them with the architectural changes, and a few other sites have hinted at more of them. It's clear ATI has drastically improved their memory controllers and cache design along with their render back ends for AA performance.

    I think the real thing to keep an eye on though is how AMD managed to get near 100% scaling with CF. Extremetech hinted at improved memory controllers and a gpu communications "Hub" for improved performance between GPUs. I'm sure you guys will cover these improvements in detail in your complete review, but it looks like that hyper transport mechanism you alluded to.
  • MadBoris - Thursday, June 19, 2008 - link

    Nice to see AMD staying competitive, plus keeping prices down.
    I think the days of me spending $400+ on a video card are behind me, at least for the foreseeable future. You have to provide a lot more than 10% performance increases for an extra $250, NVIDIA.

    I'm rather surprised NVIDIA has not really capitalized on taking a huge performance lead and crown with all the AMD post merger dust settling.

    I'm pleasantly surprised that AMD is continuing to excel with HW. If only they would bring back an AIW card, I'd buy one, but my current 8800GTS is not so outmatched that it is worth upgrading to anything this generation.
    Good article Anand.
  • fungmak - Thursday, June 19, 2008 - link

    Looking at the CF performance from other sites that used Cat 8.6, IIRC the results were a lot better than the current AT numbers.

    Just wondering if there is any intention to update using Cat 8.6?
  • derek85 - Friday, June 20, 2008 - link

    I second this, I'm sure 8.6 came with some nice optimizations on 770s. Reply
  • DerekWilson - Friday, June 20, 2008 - link

    we did not use catalyst 8.5 drivers.

    we used the very latest beta drivers ATI could get us.
  • Wirmish - Friday, June 20, 2008 - link

    And did you use the Radeon HD 4800 Series Hotfix (6/20/2008)?

  • Nighteye2 - Thursday, June 19, 2008 - link

    The big question for the comparison between this card in CF and the GT200 will not be the classic framerates here - but the performance of games that use the GPU for part of the physics processing. The GT200 has lots of compute power to spare for physics; can two 4850s in CF match that?
  • FITCamaro - Friday, June 20, 2008 - link

    With 800 shaders it wouldn't surprise me. Reply
  • Wirmish - Friday, June 20, 2008 - link

    He's talking about CF...

    So it's 1600 shaders!
  • npp - Thursday, June 19, 2008 - link

    This is a fantastic card, and if it really sells for around 150€ here in Europe, this is the way to go. Very impressive offering from AMD, I'm simply amazed!

    The 4870 will often outperform the latest nVidia card, I guess... Combined with the supposed refusal of Intel to grant nVidia rights to design chipsets featuring QPI links, this may mean hard times for the guys in green. They were a little early announcing a war on so many fronts, it seems. Honestly speaking, I was tired of their domination and insane prices anyway, so this is a very nice turning point.
  • js01 - Thursday, June 19, 2008 - link

    I knew there was a reason why I've stuck with amd for the last few years, because I always get my money's worth. It must be painful to be an early owner of a 9800gtx right now considering the huge price drop. Reply
  • BikeDude - Friday, June 20, 2008 - link

    Why do you have to stay with a particular brand? Are you not free to pick and choose the best deal when you need it?

    Those who bought the 9800GTX a couple of months ago have been able to get a lot of good quality gaming going while some of us have been waiting for the next great thing. Personally I have been waiting for a while now, I still have a 7800GTX! Was the waiting worth it? I doubt it.

    That said, I would really welcome some insight into the driver quality of nVidia drivers vs ATI. nVidia has been disappointing for the past two years. Instability, features gone missing, anomalies introduced (like resetting the colour profile in Vista after login), etc...etc... At one point I was unable to play a DVD for more than 30 minutes before my computer froze. Downgraded the driver and all was fine again. Luckily nVidia fixed that in a later release, but come on... That one was bad, and there's nowhere I can file bug reports.
  • brentpresley - Thursday, June 19, 2008 - link

    Would be great if someone can run Folding At Home on this card and publish the performance.

    The new NV client is smoking right now, and lots of us are pondering buying cards to fold on.
  • KCjoker - Thursday, June 19, 2008 - link

    it gets way too hot for a single slot card dumping the hot air inside the PC. Should be better when some aftermarket cards come out. But why does it draw that much power when the chip is much smaller than Nvidia's? Reply
  • epsilonparadox - Thursday, June 19, 2008 - link

    From the hints in the article, I would assume this is a very large chip that nullifies any benefits the smaller process would have provided. Plus AMD chose a single slot solution which probably isn't doing a good job of cooling the chip. Reply
  • fungmak - Thursday, June 19, 2008 - link

    Well the power that is drawn is about the same as a 9800 GTX.

    However, the 4850 is rumoured to have 950 million transistors compared to the 750 million of the GTX. Also, the die size is rumoured to be around 275mm² compared to 330mm², so a higher density, though I would imagine this is offset by the reduction in power from the move from 65nm to 55nm. So all in all the power draw is actually not too bad.
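    A quick sanity check of the density figures above (all of these numbers are the rumoured values quoted in the comment, not confirmed specs):

    ```python
    # Transistor-density comparison from the rumoured figures above.
    # None of these values are confirmed specs.
    rv770_transistors = 950e6   # rumoured, HD 4850 (RV770)
    rv770_die_mm2 = 275
    g92_transistors = 750e6     # rumoured, 9800 GTX (G92)
    g92_die_mm2 = 330

    rv770_density = rv770_transistors / rv770_die_mm2
    g92_density = g92_transistors / g92_die_mm2

    print(f"RV770: {rv770_density / 1e6:.2f} M transistors/mm^2")
    print(f"G92:   {g92_density / 1e6:.2f} M transistors/mm^2")
    print(f"ratio: {rv770_density / g92_density:.2f}x")
    ```

    By those rumoured numbers the density gap is closer to 1.5x than "slightly higher", which is roughly what a 65nm-to-55nm shrink would suggest.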
  • andreschmidt - Thursday, June 19, 2008 - link

    956 million transistors on 55nm fabrication process ;) Reply
  • ImmortalZ - Thursday, June 19, 2008 - link

    The Multi-GPU pages seem to be broken - they go straight to the search page. Reply
  • derek85 - Thursday, June 19, 2008 - link

    No the links are fine actually... Anandtech took this article offline for a very short moment for some reason and I hit the same problem during that time. Reply
  • sapiens74 - Thursday, June 19, 2008 - link

    A couple of these sure beats the $650 Nvidia solution Reply
  • ElFenix - Friday, June 20, 2008 - link

    already there on the egg Reply
  • FITCamaro - Friday, June 20, 2008 - link

    For $170-175 after rebate no less. I just got a pair of 8800GTS 512s for $170 each. I kinda wish I'd waited now because while the performance is about the same, I wouldn't have had to buy a new motherboard since my P5WDH Deluxe could run Crossfire. Reply
  • BPB - Friday, June 20, 2008 - link

    $149.99 at BestBuy. Just got 2! They are on the shelves and already marked on sale. VisionTek cards are 25% off this week, so the VisionTek 4850 is $149.99. Reply
