ATI Radeon HD 4850 Preview: AMD Delivers Performance for the Masses
by Anand Lal Shimpi & Derek Wilson on June 19, 2008 5:00 PM EST - Posted in GPUs
Enemy Territory: Quake Wars
This benchmark is performed using version 1.5 of ET:QW, and we recorded our own demo on the Island level. The demo isn't very long, but it tries to stress the GPU a fair amount. We tested with all settings at their highest levels except for AA, which was set to 4x in-game.
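For those who want to script a similar run themselves, here is a minimal sketch of what such a benchmark pass could look like. The install path and demo name are placeholders, and the cvar and console command spellings (r_multiSamples, timeNetDemo) follow common id Tech 4 conventions; treat them as assumptions rather than a documented interface.

```python
import subprocess

# Hypothetical ET:QW benchmark harness. The executable path and demo name
# are placeholders; the cvar/command names follow id Tech 4 conventions
# and should be verified against the game before use.
ETQW_EXE = r"C:\Games\ETQW\etqw.exe"   # placeholder install path
DEMO_NAME = "island_bench"             # placeholder name for a recorded demo

subprocess.run([
    ETQW_EXE,
    "+set", "r_multiSamples", "4",     # 4x AA, matching the in-game setting
    "+set", "r_mode", "-1",            # -1 = use a custom resolution...
    "+set", "r_customWidth", "1920",   # ...such as 1920x1200
    "+set", "r_customHeight", "1200",
    "+timeNetDemo", DEMO_NAME,         # play the recorded demo back as a benchmark
], check=True)
```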
Even more so than in Call of Duty, the 4850 clobbers the 9800 GTX and performs dangerously close to NVIDIA's upper echelon of hardware. Performance at 2560x1600 tapers off quite a bit, but the 4850 still does quite well comparatively. With ET:QW being our only OpenGL benchmark, this is quite a surprising result, as NVIDIA has always been the go-to choice for OpenGL value. This time around they lose out quite handily.
114 Comments
Spoelie - Friday, June 20, 2008 - link
SLI/CF <-> single slot solution: I think you mean single card. The 4850 is single slot, the GTX cards are dual slot, so no, the title of best single slot belongs to ATI this round, not NVIDIA.

Do not compare the testing methodologies of HardOCP with AnandTech's; there have been enough discussions about this in the past. Each has its value; some prefer one, some the other. PhysX is an unknown regarding future use and performance, so it's no factor in evaluating these cards now.
The 4870 X2 is most likely a single card, dual slot solution. The current GT200, and most likely its die shrink, will be a single card, dual slot solution. The form factors are the same. You do not need a CrossFire board to run a 4870 X2.
The numbers (prices, clock speeds, performance) you spew forth are all grabbed out of thin air (shoulda coulda woulda) and biased towards NVIDIA. Wait for the reviews instead of inventing a plausible scenario.
Your comments about audio are all conjecture and baseless ('I question the quality anyway' based on what??).
NVIDIA has no easy answer; AMD targeted the right price ranges at the right time. I'm guessing they will reap the benefits of that over the coming two months.
TheJian - Friday, June 20, 2008 - link
Does PC Perspective count? They produced the same results and more, showing domination for the GTX 280. And it's not a dual-GPU solution; it's ONE card/chip, as in you don't have to use driver tricks etc. to get your performance. The single-chip king of the hill. I meant 2 x 4870s, because you can't get the 4870 X2 until later (per AMD's own statement here at AnandTech, it will come in 2 months IIRC). So until then we're talking a GTX 280 with no driver tricks vs. two of AMD's or two of NVIDIA's previous-gen cards.

Tough luck, I'm comparing methodologies when one sucks. The averages CLEARLY shown at both sites in my previous post (PC Perspective and HardOCP) do NOT tell the TRUE story. What good does an average score do me if my game is unplayable at the tested resolution? Sure, you can give it to me, but you'd better also give me what to expect at home, which is the MINIMUM I can expect. The GTX 280 clearly dominated the 9800 GX2 / 9800 GTX SLI at both sites in minimums, at higher quality while doing it.
Of what value is an average I can't actually play at?
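To make that concrete, here's a toy sketch with made-up frame times: two runs that report essentially the same average fps, where one of them dips into unplayable territory.

```python
# Toy illustration with invented frame times (in milliseconds): both runs
# render 60 frames in about one second, so the usual average-fps number
# is nearly identical, but one run bottoms out near 11 fps.
steady = [16.7] * 60                  # locked at ~60 fps
spiky  = [10.0] * 55 + [90.0] * 5     # fast most of the time, with hard stalls

def fps_stats(frame_times_ms):
    """Average fps the usual way (frames / elapsed time) plus the worst frame."""
    avg = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst = 1000.0 / max(frame_times_ms)
    return avg, worst

for name, run in (("steady", steady), ("spiky", spiky)):
    avg, worst = fps_stats(run)
    print(f"{name}: avg {avg:.1f} fps, min {worst:.1f} fps")
# steady: avg 59.9 fps, min 59.9 fps
# spiky:  avg 60.0 fps, min 11.1 fps -- same average, very different game
```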
Is x264 encoding that is 5x faster than a Core 2 X6800 not worth mentioning? Should we not future-proof ourselves for apps like Photoshop CS4, which will use the GPU? Should we not think PhysX will take off, with Intel, AMD and NVIDIA all trying to do it in one way or another?
We've seen PhysX speed things up; just not much uses it currently, but it is PROVEN to work, and fast:
http://www.techpowerup.com/reviews/BFG/Ageia_PhysX...
When they let the machine do the work without PhysX they couldn't get 5fps...LOL. With the card on it hits the 50s! This was GRAW2 last October. What will Xmas 2008 hold? Since all three of Intel/AMD/NVIDIA are headed there, what will we get? From the link: "The question is the same people asked when new features like HDR, AA or AF were introduced into games: 'Is this addition worth it?' While it may not have been impressive in past games, the answer today should be yes." That was rolling up on a year ago...ROFL. Don't forget they're even being sold in Dell's laptops (XPS 1730) and a bunch of OEM PCs from the likes of Medion, Acer, Dell and HP. By Xmas it will probably be on cheap cards. It's something you must consider when purchasing. Even 8800 series cards can run PhysX code. This all puts a lot of capable cards in users' PCs by Xmas. An audience this big will be written for.
Price isn't from thin air. The current 9800 GTX is dropping about $30-40 and the die shrink cuts the chip size by 40% (see http://www.pcper.com/comments.php?nid=5817). It's reasonable to think they'd drop the current $429 price of the 9800 GX2 if they're saving that 40% die size twice on one card, correct? I'm thinking $370 or so for a 9800 GX2+ is easy. Don't forget they have to price competitively against AMD's offerings. This isn't thin air, pal.
Speeds aren't from thin air either. The speeds of the 9800 GTX+ are known (738MHz core, 1836MHz shader, 1000MHz GDDR3), and overclocking headroom is expected, meaning an Ultra should be easy if needed. Have you ever seen a DIE SHRINK that didn't net 10% clock speed or more? NVIDIA is clocking it low. Most get 15-20% from a shrink; after the shrink matures, maybe more. Shoulda coulda woulda what? Read my reply above for all the performance numbers, minimums, etc.
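Putting quick numbers on that headroom claim (the uplift percentages are the assertion above, not measured figures):

```python
# Quick arithmetic on the claimed die-shrink headroom, applied to the
# 9800 GTX+'s known 738 MHz core clock. The percentages are the claim
# made above, not measurements.
base_core_mhz = 738
for uplift in (0.10, 0.15, 0.20):
    print(f"+{uplift:.0%}: {base_core_mhz * (1 + uplift):.0f} MHz")
# +10%: 812 MHz, +15%: 849 MHz, +20%: 886 MHz
```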
Can you prove anything about the quality of the audio coming from AMD's cards? I base my opinion on listening to many audio sources in other areas, such as car amps. They are not equal. A high-watt amp can distort all to crap for less money than a low-watt, high-current expensive amp, and the high-current one blows the other out of the water. I've heard wind in the UT flyby on Klipsch V2-400s that wasn't there on MANY other speaker brands. I never knew wind existed in the game until I played it on the Klipsch set. Maybe I heard it before, but it sounded like distortion if I did, NOT WIND. Onboard audio does NOT sound like an Audigy 2/4/X-Fi etc. I'm an ex PC business owner (8 yrs) and a certified PC tech; I've seen a lot of different audio solutions, built a ton of PCs, and know some don't sound like the others. It's not baseless. I simply said I NEED TO KNOW MORE, and nobody has proven anything yet, unless you can point me to someone saying "it's tested, and I love the audio." They are using their OWN codec, you know... or are you NOT aware of this? I'm unaware of AMD's super audio prowess that you seem to think they have. Your statement about what I said is BASELESS CONJECTURE.
I answered your last part above in my other reply. NVIDIA has all the answers. Sorry. Care to place a bet on which one makes money this year? Care to place a bet on FY09?... Don't make me laugh. Care to take a look at what happens to NVIDIA stock from May 1st to Dec 15th EVERY YEAR FOR THE LAST 5 YEARS (probably longer, I just didn't go back farther)? 75%, 75%, 30%, 80%, 80%. Care to look at AMD's last 5 years? Pull up the charts. It's not conjecture. :) I'm not saying AMD doesn't have a good card. I just think NVIDIA has better stuff. They shot too low, just like Phenom vs. Core 2. I'll buy whatever is the best for the buck for my monitor (1920x1200). Note I owned a 9700 and a 9800 Pro, and many AMD chips (only when they OC'd like crazy). Now it's an 8800 GT OC and an Intel Xeon 3110. Whatever works best.
Buy some AMD stock...I dare you...LMAO.
Final Destination II - Friday, June 20, 2008 - link
If reality doesn't fit in his brain, the average Nvidia-cocksucker has to talk till bubbles come out of his mouth.

Well done, Sir!
Unfortunately that doesn't change reality.
Plus, you completely miss the point of this article. It's about the HD 4850 (which beats the performance-per-price crap out of your beloved green company's products), not the 9800 GTX, not the GTX 280, and not your childish notions of computer hardware for "real men" (really, that was the biggest laugh, besides your infantile overuse of acronyms for 12-year-olds...).
The HD 4850 destroys any overpriced card NVIDIA has tried to sell us, and that's why it will be bought and loved.
If it weren't for ATI, NVIDIA would still be charging us fantasy prices.
Fortunately, the HD 4850 also scores on power, performance, features and price.
Just. Accept. It.
This round NVIDIA has lost, no matter what they do; although I have to admit I like it when you get out your crystal ball and foretell the future... look! We have an NVIDIA insider with us! Jian is directing the company...
BTW, even AnandTech is biased towards NVIDIA: while using the latest GeForce driver, they refuse to use the latest Catalyst, which promises lots of performance gains (ranging from 10-20%) as far as I've read in the changelog...
If that holds true, the HD 4850 will be an even better deal, thus earning only _one_ word:
Bought.
DerekWilson - Monday, June 23, 2008 - link
We used drivers later than the 8.6 public version -- we used beta press drivers given to us by AMD for performance testing with the 4850.

madgonad - Friday, June 20, 2008 - link
I am only going to question one of your points. The 'sound' coming out of the HDMI port on these cards is digital. That is, zeros and ones. There is no 'improved quality' with a digital signal; it is either there or not there. I have Klipsch speakers too, all Reference BTW. All you should be concerned about in maximizing your audio experience is the quality of the device doing the DAC (digital-to-analog conversion) and the quality of the audio your software is putting out. The only thing this card can't do is the most recent EAX profiles, but Microsoft killed EAX with Vista anyway. This 'video' card will fully support 7.1 channels of LPCM HD audio. The only way to beat that is to buy ASUS's soon-to-be-released HDAV sound card, which will cost more than an HD 4850.
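For a sense of scale, here's a back-of-envelope sketch of what an uncompressed 7.1 LPCM stream costs in bandwidth. The 24-bit/96kHz format is an assumed example, not a spec for this card.

```python
# Rough bitrate of an uncompressed 7.1 LPCM stream. The 24-bit/96 kHz
# format is an assumed example; actual streams depend on the source.
channels = 8            # 7.1 = eight discrete channels
bit_depth = 24          # bits per sample
sample_rate = 96_000    # samples per second, per channel

bitrate_mbps = channels * bit_depth * sample_rate / 1e6
print(f"~{bitrate_mbps:.1f} Mbit/s")   # ~18.4 Mbit/s, well within what HDMI can carry
```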
Proteusza - Friday, June 20, 2008 - link
You aren't an AMD fanboy, you just love NVIDIA.

1. If you use a res of 2560x1600, I think you can afford a better graphics card than an 8600GT. So I think it's reasonable to assume that people who buy screens capable of such resolutions will put some thought into using either SLI or CrossFire, or a dual-GPU solution such as a 9800 GX2 or 4870 X2.
2. This is the 4850, not the 4870, and it comes close to the GTX 260 in some benchmarks while costing half the price. That's not bad.
3. The 9800 GX2 beats the GTX 280; I don't know which benchmarks you have seen. Every benchmark I have seen, including those at AnandTech, shows this to be the case. After HardOCP recommended the 8600 GTS over the X1950 Pro, I don't put much stock in what they think anymore.
4. 4850 CrossFire beats the GTX 280 in most cases. What do you think a 4870 X2 will do to it? Yes, you can SLI GTX 280s, but for how much? And what kind of PSU will you need to run them?
Every review/preview of this card says basically the same thing: for the price, it can't be beat. Let's face facts: NVIDIA's card has 1.4 billion transistors and is made on a 65nm process. That makes it hugely expensive to make. The RV770, on the other hand, is made on a 55nm process and has 500 million fewer transistors. That makes it much cheaper to produce. Yes, NVIDIA can move to a 55nm process, but that is hugely expensive to do, not to mention a waste of 65nm GTX 280s. They are in a tough place: their card doesn't really perform as well as its 1.4 billion transistors would imply, and NVIDIA has no option but to keep producing it, no matter how expensive it is.
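Here's a rough sketch of the economics behind that point, using approximate 2008 die sizes (GTX 280 around 576mm² at 65nm, RV770 around 256mm² at 55nm) and ignoring yield and wafer pricing, so treat it as an illustration of scale rather than real cost accounting.

```python
import math

# Back-of-envelope: candidate dies per 300 mm wafer. Die areas are
# approximate published 2008 figures; defect yield, edge geometry and
# wafer cost are all simplified away.
wafer_area_mm2 = math.pi * (300 / 2) ** 2      # ~70,686 mm^2

for name, die_area_mm2 in (("GTX 280 (65nm)", 576), ("RV770 (55nm)", 256)):
    candidates = int(wafer_area_mm2 * 0.9 / die_area_mm2)  # ~10% lost at the edge
    print(f"{name}: ~{candidates} candidate dies per wafer")
# GTX 280: ~110, RV770: ~248 -- and defect yield punishes the larger die
# even further, which is exactly the cost gap described above.
```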
Did you know that NVIDIA has just announced a 9800 GTX+ manufactured on a 55nm process? Why would they do that if they weren't threatened by these cards?
I own an 8800 GTS 640 right now, and I love it. But I'm still glad that AMD is competitive again. Just look at what it made NVIDIA do: drop prices. That's good for everyone.
TheJian - Friday, June 20, 2008 - link
I don't love NVIDIA, just the money they make me. Used to love AMD for the same reason :)

1. I never said anything about an 8600GT. I want AA and AF both at full blast in all games; I think Anand needs to rerun some benchmarks doing this, which would show the benefits of the 512-bit bus. They didn't turn on enough stuff to push the cards like HardOCP did (to separate the men from the boys, so to speak). You might not like the guy (you can dig into his forums and find a post from me ripping him a new one for a few of his statements about AMD...LOL), but the benchmarks don't lie. Anand just gives averages, which tell me nothing about when a game dips below playability. AMD got AA correct this time, no doubt about that, but I'd like to see AA and AF turned up before making judgements, and that's just what HardOCP did. Why no reports of how 4xAA/16xAF (both on) changed the picture? My current 8800 GT can't do what I want in ALL games at 1920x1200 (native on a Dell 24in). HardOCP showed these cards get there.
2. I don't think they get that close if you turn up the AF and AA together. HardOCP turned on AF+AA and saw the GX2 and 9800 GTX lose to both new cards (GTX 260 and 280) in some cases.
3. Check HardOCP; it's on their front page: http://enthusiast.hardocp.com/article.html?art=MTU...
Crysis has more turned on with the GTX 280 (16xAF) vs. the GX2 (8xAF), and it even has more details on HIGH while beating it. The GX2 drops to a low of 12fps, while the GTX 280 drops only to 17 (with more turned up! One setting at Medium for the 280 vs. six at Medium for the GX2).
Assassin's Creed was better on both the GTX 260 and 280. The 280 had EVERY detail maxed out, which the GX2 / 9800 GTX SLI couldn't do playably. They go into detail about the 512-bit bus being the reason it's playable.
Look at the apples-to-apples COD4 scores, where the 9800 GTX SLI drops to 11fps with 4xAA+16xAF while the 280 holds at 32fps. Even the GTX 260 only drops to 24fps. That's the difference between playable and NOT, at 11fps. They noticed the large TANKING of fps in large grassy areas.
Age of Conan showed the GTX 280 was the only card playable at 2560x1600, and at 1920x1200 it ran with higher details than the older cards.
http://www.pcper.com/article.php?aid=577&type=...
They show the same thing: AA+AF = higher minimums for the GTX 280 vs. the 9800 GX2. So it's in more than one place: "The average frame rate doesn’t change much but the minimum is significantly faster – a big plus for a fast paced shooter like COD4."
Check page 9 for Call of Juarez, where the GTX 280 has a minimum of 16fps vs. the 9800 GX2's 9fps. That's almost 2x the minimum fps; NOT playable on the GX2 at 9fps.
Check page 10 for Company of Heroes: "Wow, talk about some improvements here! The GTX 280 simply walks away with this win and even at 1600x1200 the results are impressive. It is 42% faster than the 9800 GX2, 56% faster than the 9800 GTX and 76% faster than the HD 3870 X2 at 1920x1200!"
Uh....DOMINATION! Note these guys use min/max/avg, same as HardOCP. Is an arbitrary average telling us anything about our playing experience? NOPE. We at least need a minimum. Another quote, for the GTX 260 instead: "The Company of Heroes test points out some additional benefits of the GTX 260 compared to the other three cards it’s compared against. It easily outperforms the HD 3870 X2 as well as the 9800 GX2 in all resolutions."
Heck, even the GTX 260 dominated the GX2! They saw a 100% minimum-fps improvement! Lost Planet for them showed the GX2 going down in defeat again. UT3: another victory for the GTX 280. A slaughter in min fps at 1600x1200.
World in Conflict, from page 13 at PC Perspective: "Wow, what a difference a generation makes. World in Conflict has always been accused of being too CPU bound to be any good, but these results paint an interesting picture of the situation. At 2560x1600 the competition can’t even get in the same hemisphere as the GTX 280 is 4x faster than the GX2 and 3870 X2." At 1920x1200 it was 1.6x faster! NOTE the AA+AF is on high!
Crysis? "At first glance the numbers from the GTX 280 aren’t that much more impressive than that GX2 card in Crysis – the real key though is to look at the minimum frame rates and how the GTX 280 is consistently more stable graphically than the dual-GPU cards. At 1920x1200 the minimum mark that the GTX 280 hits is 44% higher than the HD 3870 X2 or 9800 GX2."
Getting the picture yet? MINIMUMS are important, NOT AVG fps. I love PLAYABLE gaming. The rest of these sites need to catch up to this line of thinking. People should buy a card for PLAYABLE fps at the res they want for their monitor, NOT for averages that won't cut it in the real world.
A GPU that encodes x264 video 5x faster than a Core 2 X6800?
http://www.pcper.com/article.php?type=expert&a...
Photoshop CS4 will use this stuff. More than a gaming GPU, eh?
Yes, I know of the GTX+ (a 9800 GX2+ probably right behind it for $350...). You did catch that I mentioned the die shrink of the GT200 that has already taped out and should be out in two months anyway, right? That will bring the cost down and allow a GTX 280 X2 card. I didn't say NVIDIA wasn't worried; I said they have an answer for all of AMD's cards. The + versions of the 9800 GTX and GX2 cards will fill the gaps nicely for NVIDIA at the high end until the GT200 die shrink in two months. The 4870 X2 won't be out until about then anyway. So, as I said, the 4870 X2 will be facing a die-shrunk, sped-up GTX 280, and likely a month later the GTX 280 X2 card, if it's even needed. I'm glad AMD is in the game, don't get me wrong. But just like Phenom, they needed MORE here too. I already said I'd wait for a price drop and a die-shrunk version before I'd bite. I just hoped for more from AMD and a bigger NVIDIA drop...LOL. It's NOT hugely expensive for NVIDIA to switch to 55nm on the GTX 280; it's already taped out and should be out in two months or so. They don't own the fabs, you know. TSMC already produces at 55nm for them.
One more note: look at techpowerup.com's review to understand that Vista sucks. XP benchmarks dominate Vista's. ExtremeTech shows this with MANY games (especially DX10 vs. DX9), and a maker of DirectX (Alex St. John, in his review at ExtremeTech) says DX10 SUCKS. It's a slow pig vs. DX9 because it's further from talking DIRECTLY to the hardware. You want 25% of your performance back? Switch to XP. :) Let the flames begin...ROFL.
formulav8 - Friday, June 20, 2008 - link
It would be interesting to see if these new drivers will do anything for the new card. The article looks like it uses Catalyst 8.5, so rerun the benches with 8.6, maybe?? :)

Jason
DerekWilson - Friday, June 20, 2008 - link
We did not use the 8.5 Catalyst drivers. We used the very latest beta driver that ATI can provide.
formulav8 - Sunday, June 22, 2008 - link
Now that I've looked into it more, AMD's release notes don't specifically say the 8.6 version supports the 4850/4870 cards...