The Cards and The Test

In the AMD department, we received two cards. One was an overclocked part from HIS and the other was a stock-clocked part from ASUS. Guess which one AMD sent us for the review. No, it's no problem; we're used to it. We get the same thing from NVIDIA all the time: they argue and argue for the inclusion of overclocked numbers in GPU reviews when it's their GPU we're looking at, and of course when the tables are turned, so are the opinions. We sincerely appreciate ASUS sending us this card, and we used it for our tests in this article. The original intent in getting hold of two cards was to run CrossFire numbers, but we only have one GTX 275, and we would prefer to wait until we can compare CrossFire against SLI before getting into that angle.

The ASUS card also includes a utility called Voltage Tweaker that allows gamers to increase some voltages on their hardware to help improve overclocking. We didn't have the chance to play with the feature ourselves, but more control is always a nice feature to have.

For the Radeon HD 4890, our hardware specs are pretty simple. Take a 4870 1GB and overclock it: crank the core up 100 MHz to 850 MHz and the memory clock up 75 MHz to 975 MHz. That's the Radeon HD 4890 in a nutshell. However, to reach these clock levels, AMD revised the core by adding decoupling capacitors, implementing new timing algorithms, and altering the ASIC power distribution for enhanced operation. These slight changes increased the transistor count from 956M to 959M. Otherwise, the core features/specifications (texture units, ROPs, z/stencil) remain the same as in the HD 4850/HD 4870 series.
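
To put those clock bumps in perspective, here is a quick back-of-the-envelope sketch of the theoretical peaks they translate into. The 800 stream processors, 256-bit bus, and GDDR5's quad data rate are our assumptions about the RV770/RV790 design rather than figures quoted above, so treat the output as rough math, not official specs.

    # Rough peak compute and bandwidth from the clocks discussed above.
    # Assumes 800 SPs (2 FLOPs each per clock), a 256-bit bus, and GDDR5's
    # 4x data rate; these are our assumptions, not numbers from the article.
    def radeon_peaks(core_mhz, mem_mhz, sps=800, bus_bits=256, data_rate=4):
        gflops = sps * 2 * core_mhz / 1000                         # MAD = 2 FLOPs/clock
        bandwidth = mem_mhz * data_rate * (bus_bits // 8) / 1000   # GB/s
        return gflops, bandwidth

    for name, core, mem in [("HD 4870 1GB", 750, 900), ("HD 4890", 850, 975)]:
        gflops, bw = radeon_peaks(core, mem)
        print(f"{name}: ~{gflops:.0f} GFLOPS, ~{bw:.1f} GB/s")

Under those assumptions the overclock works out to roughly a 13% compute bump and an 8% bandwidth bump: about 1.2 TFLOPS and 115.2 GB/s for the HD 4870 1GB versus about 1.36 TFLOPS and 124.8 GB/s for the HD 4890.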

Most vendors will also be selling overclocked variants that run the core at 900 MHz. AMD would like to treat these overclocked parts as if they were a separate entity altogether, but we will continue to treat them as enhancements of the stock version whether they come from NVIDIA or AMD. In our eyes, the difference between, say, an XFX GTX 275 and an XFX GTX 275 XXX is XFX's call; the latter is XFX's enhancement of the stock part. We aren't going to look at the XFX 4890 and the XFX 4890 XXX any differently. In reviews of vendors' cards, we'll consider overclocked performance closely, but for a GPU launch, we will be focusing on the baseline version of the card.

On the NVIDIA side, we received a reference version of the GTX 275. Its design looks similar to that of the other GT200-based hardware.

Under the hood is the same setup as half of a GTX 295, but with higher clock speeds. That means the GTX 275 has the memory size and bandwidth of the GTX 260 (448-bit wide bus), but the shader count of the GTX 280 (240 SPs). On top of that, the GTX 275 posts clock speeds closer to the GTX 285 than the GTX 280: core clock is up 31 MHz from the GTX 280 to 633 MHz, shader clock is up 108 MHz to 1404 MHz, and the memory data rate is also up 108 MHz to 2322 MHz. This means that in shader-limited cases we should see performance closer to the GTX 285, while in bandwidth-limited cases we'll still be faster than the GTX 260 core 216 because of the clock speed boost across the board.
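
The same sort of napkin math shows where the GTX 275 should land relative to its GT200 siblings. The GTX 275 figures (240 SPs at 1404 MHz, 2322 MT/s on a 448-bit bus) come straight from the paragraph above; the reference clocks we plug in for the other cards are from memory and should be taken as approximations.

    # Rough peak shader throughput and memory bandwidth for the GT200 parts
    # discussed here. GTX 275 numbers are from the article; the other cards'
    # reference clocks are assumed and may be slightly off.
    cards = {
        #                    SPs, shader MHz, mem MT/s, bus bits
        "GTX 260 core 216": (216, 1242, 1998, 448),
        "GTX 275":          (240, 1404, 2322, 448),
        "GTX 280":          (240, 1296, 2214, 512),
        "GTX 285":          (240, 1476, 2484, 512),
    }

    for name, (sps, shader_mhz, mem_mts, bus_bits) in cards.items():
        gflops = sps * 3 * shader_mhz / 1000          # MAD + MUL = 3 FLOPs/clock
        bandwidth = mem_mts * (bus_bits // 8) / 1000  # GB/s
        print(f"{name}: ~{gflops:.0f} GFLOPS, ~{bandwidth:.1f} GB/s")

On that math the GTX 275's peak shader throughput (just over 1 TFLOPS) sits within about 5% of the GTX 285, while its roughly 130 GB/s of bandwidth clears the GTX 260 core 216 by a wide margin but still trails the 512-bit cards, which is exactly the split behavior described above.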

Rather than just an overclock of a pre-existing card, this is a blending of two configurations, combined with a clock bump over both of the parts from which it was born. And sure, it's also half a GTX 295, which is convenient for NVIDIA. It's not just that it's different; this setup should have a lot to offer, especially in games that aren't bandwidth limited.

That wraps it up for the cards we're focusing on today. Here's our test system, which is the same as for our GTS 250 article except for the addition of a couple of drivers.

The Test

Test Setup
CPU: Intel Core i7-965 (3.2GHz)
Motherboard: ASUS Rampage II Extreme (X58)
Video Cards: ATI Radeon HD 4890
             ATI Radeon HD 4870 1GB
             ATI Radeon HD 4870 512MB
             ATI Radeon HD 4850
             NVIDIA GeForce GTX 285
             NVIDIA GeForce GTX 280
             NVIDIA GeForce GTX 275
             NVIDIA GeForce GTX 260 core 216
Video Drivers: Catalyst 8.12 hotfix, 9.4 Beta for HD 4890
               ForceWare 185.65
Hard Drive: Intel X25-M 80GB SSD
RAM: 6 x 1GB DDR3-1066 7-7-7-20
Operating System: Windows Vista Ultimate 64-bit SP1
PSU: PC Power & Cooling Turbo Cool 1200W

Comments

  • papapapapapapapababy - Thursday, April 2, 2009 - link

    It's the quality of the drivers, ATI. Call me when: A) you fix your broken drivers; B) you decide to finally ditch that bloated Microsoft Visual C++ just so I can have the enormous privilege of using your (also terrible and also bloated) CCC panel; C) you stop polluting my PC with your useless extra services. Until then I'll carry on with NVIDIA. Thanks. > Happy NVIDIA user (and frustrated ex-ATI customer)
  • josh6079 - Thursday, April 2, 2009 - link

    The quality of drivers can be argued either way and the negative connotations associated with "Drivers" and "ATi" are all but ancient concerns in the single GPU arena.
  • papapapapapapapababy - Thursday, April 2, 2009 - link

    BS. I have a pretty beefy PC, but that doesn't mean I'm going to stop demanding efficiency when it comes to memory usage, or stop asking for a reduction in the sheer number of stupid services required to run a simple application. These are all FACTS about ATI. But hey, you are free to use Vista, buy ATI, and end up with a system that is inferior and slower than mine (performance- and feature-wise).

    btw, to all the people claiming that CUDA and PhysX are gimmicks... give me a break! You hypocrites. These cards ARE HUGE GIMMICKS! BEHOLD the MEGA POWAAR! For what? Crysis? That's just ONE GAME. ONE. UNO. 1. Then what?... Console ports. At the end of the day 99.9% of games today are console ports. The fact is, you don't need these monstrosities in order to run that console crap. Trust me, you may get a boner comparing 600 fps vs 599, but the rest of the (sane) people here don't give a rat's ass, especially when the original console game barely runs at 30fps to begin with.

  • SiliconDoc - Monday, April 6, 2009 - link

    The red roosters cannot face reality, my friend. They are insane, as you point out.
    CUDA, PhysX, Badaboom, vReveal, the Mirror's Edge that addicted Anand - none of it matters to red roosters. The excessive heat at idle from ATI? Also thrown out with the bathwater. Endless driver issues? Forget it. No forced multi-GPU? Never mind. No gamer profiles built into the driver for 100 popular games like NVIDIA has? Forget it. Better folding performance? Forget it. NOT EVEN CURING CANCER MATTERS WHEN A RED ROOSTER FANBOI IS YAKKING ALONG THAT THEY HAVE NO BIAS.
    Buddy, be angry, as you definitely deserve to be - where we got so many full-of-it liars is beyond me, but I suspect the 60's had something to do with it. Perhaps it's their tiny NVIDIA-assaulted wallets - and that chip on their shoulder is just never going away.
    I see someone mentioned Nvidia "tortured" the reviewers. LOL
    hahahaahahahahahahaaa
    There is no comparison... here's one they just hate:
    The top ATI core, without ddr5, and neutered for the lower tier, is the 4830.
    The top NVIDIA core, without ddr5, and neutered for the lower tier, is the 260/192.
    Compare the 260/192 to the 4830 - and you'll see the absolute STOMPING it takes.
    In fact go up to the 4850, no ddr5 - and it's STILL a kicking to be proud of.
    Now what would happen if NVIDIA put ddr5 on its HATED G92 "rebrand" core? LOL. We both know - the 9800gtx+ and its flavors actually compete evenly with the 4850 - if NVIDIA put ddr5 on it, it WOULD BE A 4870.
    Now, that G80/G92/G92b "all the same" (according to the raging red roosters who SCREAM rebranded) is SEVERAL-YEARS-OLD NVIDIA technology... that - well - is the same quality as the current top core ATI has produced.
    So ATI is 2 years behind on core - luckily they had the one company that was making the ddr5 - to turn that 4850 into a 4870 - same core, mind you!
    So, the red roosters kept screaming for a GT200 "in the lower mid range" --- they kept whining that NVIDIA had to do it - as if the 8800/9800 series wasn't there.
    The real reason, of course, would be that it could prove to them, deep down inside, that the GT200 "brute force" was really as bad or worse than the 770 core - bad enough that they could make something with it as lowly as the 4830...
    Ahh, but it just hasn't happened - that's what the 2-year-old-plus rebrand of NVIDIA is for - competing with the ATI core that doesn't have the ddr5 on it.
    Well, this reality has been well covered up by the raging red rooster fanboys for quite some time. They are so enraged, and so deranged, and so filled with endless lies and fudging, that they just simply missed it - or drove it deep within their subconscious, hoping they would never fully, consciously realize it.
    Well, that's what I'm here for! :)
    To spread the good word, the word of TRUTH.
  • josh6079 - Friday, April 3, 2009 - link

    I just went from the Cat 9's to the 182+'s when I upgraded to nVidia.

    The "efficiency when it comes to memory usage" is a non-issue -- especially on a "beefy pc."

    The Windows Task Manager is not a benchmark leading to conclusive comparisons regarding quality. My nVidia GPU can occupy (and has occupied) more memory, especially when I utilize nHancer so as to tap the super-sampling capabilities.

    Also, it's worth noting that nVidia's latest driver download is 77.0 MB in size, yet ATi's latest is only 38.2 MB.
  • papapapapapapapababy - Saturday, April 4, 2009 - link

    1) nHancer is just an optional utility, optional. If I want to check the GPU temps I just use GPU-Z or Everest, if I want to overclock I just use RivaTuner, and for the rest I have the NVIDIA control panel.

    The CCC is not only bloated crap, it also requires the MS .NET Framework and spawns like 45 extra services running non-stop ALL THE TIME, clogging my PC, and the thing doesn't even work! GREAT WORK ATI! CCC is stupidly slow and broken. See, I don't need a massive mega all-in-one solution that doesn't work and runs like ass.

    2) YOUR NVIDIA GPU. YOUR. That's the key word here. Your fault. Just like that Windows Task Manager of yours, it seems to me you just didn't know how to use that NVIDIA GPU. And you need that knowledge in order to form conclusive comparisons regarding efficiency.

    3) I made a gif, just for you. Here. Try not to hurt yourself reading it.

    http://i39.tinypic.com/5ygu91.jpg

    4) Upgrade your ADSL? BTW, the NVIDIA driver includes extra functionality that ATI doesn't even have (and hint, it doesn't pollute your PC!).



  • tamalero - Sunday, April 5, 2009 - link

    Sorry to burst your bubble, but your screenshot is no proof; it's clear you removed a LOT OF PROCESSES just to take the screenshot. How about you take a FULL desktop screenshot that shows the NVIDIA panel loaded?
    Because it doesn't seem to be in the process list.

    Also, you're lying. I've got an ATI 3870, and I only have 3 ATI processes, one of them being the "AI" tool for game optimizations (using the latest drivers).
    And I agree with AnandTech for the first time ever: PhysX is just not ready for the "big thing".
    Most of the things they tout are just "tech demos" or very weak stuff; Mirror's Edge's PhysX, on the other hand, does indeed add a lot of graphical feel.

    And funny you mention the Framework, because a lot of new games and programs NOW NEED the .NET Framework foundation, or at least the C++ redistributable packages.

    Also, lately I've been reading a lot of fuss related to the 275's optimizations, which in many games force lower AA levels than the ones chosen, and thus give the "edge" to the 275 vs the 4890 (MSAA vs TrSS).
    I suppose NVIDIA has again been playing the dirty way.
    And it gets annoying how NVIDIA has been doing that for quite a while to keep up the dumb thing of "opening a can of whoop-ass".
  • papapapapapapapababy - Monday, April 6, 2009 - link

    "you removed a LOT OF PROCESSES", dumbass... if its not a necessary service, its turned off REGARDLESS OF THE videocard . maybe you should try to do the same? (lol just 3 ati services, run the ccc and see) btw, if i CHOOSE to use the nvidia control panel, THEN a new nvidia service starts, THEN as soon as I CLOSE THE control panel ( the process disappears from the task manager. THAT 3 ATI SERVICES OF YOURS ARE RUNNING ALL THE FRINGIN TIME, DUMMY, Remaining resident in YOUR memory REGARDLESS IF YOU USE OR NOT THE CCC. AND THEY GET BIGGER, AND BIGGER, AND BIGGER. ALSO YOU HAVE TO USE THE NET CRAP. (EXTRA SERVICES!) AND FINALLY, THE CCC RUNS LIKE ASS. so there, i WIN. YOU LOOSE. END OF STORY.
  • tamalero - Thursday, April 9, 2009 - link

    Hang on, since when do you need the CCC panel to be ON (i.e., loaded and not in the tray) to play games?
    Are you a bit dumb?

    Second, why didn't you filter out the services then?
    Your screenshot is bull.
    It's almost like you ran WinXP in safe mode just to take the screenshot and claim your "memory superiority".

    Like I said, show us a full screenshot that shows the NVIDIA panel LOADED.

    Your argument is stupid.
    4 MB of RAM must be a LOT for you? (That's what my ATI driver currently uses on Vista x64.)
    BTW, there's also an option on the ATI side to remove the panel from the tray.
    The tray serves a similar function to ATI Tool (i.e., fast resolution, color depth and frequency changes).

    Compare apples with apples if you want to have a smart conversation.
    "Runs like ass" makes me wonder how old you are. 14 years old?
    And my CCC runs just fine, thank you! Not a single error.

    Also, I've got all the Frameworks installed, and even with programs loaded I don't see any "Framework" services or applications running, so please, get your head out of your ass.
    You're just like this pseudo SiliconDr who spreads only FUD and insults.
  • SiliconDoc - Friday, April 24, 2009 - link

    Besides all your errors and the corrections issued, it comes down to you claiming "Don't load the software that came with the ATI card because it's a fat bloated pig that needs to be gotten rid of".
    Yes, and most people who want the performance have to do that.
    Sad, isn't it ?
    You do know you got spanked badly, and used pathetic 3rd-grader whines like "your screencap is fake" after he had to correct you on it all....
    Just keep it shut until you have a valid point - stop removing all doubt.
