The Cards and The Test

In the AMD department, we received two cards. One was an overclocked part from HIS and the other was a stock clocked part from ASUS. Guess which one AMD sent us for the review. No, it's no problem, we're used to it. This is what happens when we get cards from NVIDIA all the time. They argue and argue for the inclusion of overclocked numbers in GPU reviews when it's their GPU we're looking at. Of course, when the tables are turned, so are the opinions. We sincerely appreciate ASUS sending us this card, and we used it for our tests in this article. The original intent in trying to get hold of two cards was to run CrossFire numbers, but we only have one GTX 275, and we'd prefer to wait until we can compare the two setups before getting into that angle.

The ASUS card also includes a utility called Voltage Tweaker that allows gamers to increase some voltages on their hardware to help improve overclocking. We didn't have the chance to play with the feature ourselves, but more control is always a nice feature to have.

For the Radeon HD 4890 our hardware specs are pretty simple. Take a 4870 1GB and overclock it. Crank the core up 100 MHz to 850 MHz and the memory clock up 75 MHz to 975 MHz. That's the Radeon HD 4890 in a nutshell. However, to reach these clock levels, AMD revised the core by adding decoupling capacitors, implementing new timing algorithms, and altering the ASIC power distribution for enhanced operation. These slight changes increased the transistor count from 956M to 959M. Otherwise, the core features/specifications (texture units, ROPs, z/stencil) remain the same as in the HD 4850/HD 4870 series.
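For reference, the memory overclock's effect on theoretical bandwidth is simple arithmetic. Here's a quick sketch, assuming (as on the 4870) a 256-bit GDDR5 bus moving 4 bits per pin per memory clock:

```python
def gddr5_bandwidth_gbps(mem_clock_mhz, bus_width_bits=256):
    """Theoretical peak bandwidth in GB/s: clock x 4 transfers/clock x bus bytes."""
    return mem_clock_mhz * 1e6 * 4 * (bus_width_bits / 8) / 1e9

print(gddr5_bandwidth_gbps(900))  # HD 4870 1GB at 900 MHz -> 115.2 GB/s
print(gddr5_bandwidth_gbps(975))  # HD 4890 at 975 MHz -> 124.8 GB/s
```

So the 75 MHz memory bump buys roughly 9.6 GB/s of theoretical bandwidth on top of the core overclock.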

Most vendors will also be selling overclocked variants that run the core at 900 MHz. AMD would like to treat these overclocked parts as if they were a separate entity altogether. But we will continue to treat these parts as enhancements of the stock version, whether they come from NVIDIA or AMD. In our eyes, the difference between, say, an XFX GTX 275 and an XFX GTX 275 XXX is XFX's call; the latter is their enhancement of the stock part. We aren't going to look at an XFX 4890 and an XFX 4890 XXX any differently. In reviews of vendors' cards, we'll consider overclocked performance closely, but for a GPU launch we will be focusing on the baseline version of the card.

On the NVIDIA side, we received a reference version of the GTX 275. It looks similar to the design of the other GT200 based hardware.

Under the hood here is the same setup as half of a GTX 295, but with higher clock speeds. That means the GTX 275 has the memory amount and bandwidth of the GTX 260 (448-bit wide bus) but the shader count of the GTX 280 (240 SPs). On top of that, the GTX 275 posts clock speeds closer to the GTX 285 than to the GTX 280. Core clock is up 31 MHz from the GTX 280 to 633 MHz, shader clock is up 108 MHz to 1404 MHz, and the effective memory data rate is also up 108 MHz to 2322 MHz. This means that in shader limited cases we should see performance closer to the GTX 285, and in bandwidth limited cases we'll still be faster than the GTX 260 core 216 because of the clock speed boost across the board.
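Those figures translate directly into theoretical bandwidth. A rough sketch using the effective (double data rate) numbers quoted above; the GTX 260's 1998 MT/s rate is our assumption from its stock 999 MHz memory clock:

```python
def bandwidth_gbps(effective_mtps, bus_width_bits):
    """Theoretical peak bandwidth in GB/s: effective transfer rate x bus bytes."""
    return effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9

print(round(bandwidth_gbps(2322, 448), 1))  # GTX 275 -> 130.0 GB/s
print(round(bandwidth_gbps(1998, 448), 1))  # GTX 260 core 216 -> 111.9 GB/s
```

On paper, then, the memory clock bump alone gives the GTX 275 a healthy bandwidth edge over the GTX 260 core 216 on the same bus.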

Rather than just an overclock of a pre-existing card, this is a blend of the two configurations from which it was born, combined with an overclock over both of them. And sure, it's also half a GTX 295, which is convenient for NVIDIA. It's not just that it's different; it's that this setup should have a lot to offer, especially in games that aren't bandwidth limited.

That wraps it up for the cards we're focusing on today. Here's our test system, which is the same as in our GTS 250 article except for the addition of a couple of drivers.

The Test

Test Setup

CPU:              Intel Core i7-965 (3.2GHz)
Motherboard:      ASUS Rampage II Extreme (X58)
Video Cards:      ATI Radeon HD 4890
                  ATI Radeon HD 4870 1GB
                  ATI Radeon HD 4870 512MB
                  ATI Radeon HD 4850
                  NVIDIA GeForce GTX 285
                  NVIDIA GeForce GTX 280
                  NVIDIA GeForce GTX 275
                  NVIDIA GeForce GTX 260 core 216
Video Drivers:    Catalyst 8.12 hotfix (9.4 Beta for the HD 4890)
                  ForceWare 185.65
Hard Drive:       Intel X25-M 80GB SSD
RAM:              6 x 1GB DDR3-1066 7-7-7-20
Operating System: Windows Vista Ultimate 64-bit SP1
PSU:              PC Power & Cooling Turbo Cool 1200W



  • SiliconDoc - Tuesday, April 7, 2009 - link

    Hey, you're the one with the lies and the cover-ups for ATI, and now the anti-semitic conspiracy theories.
    Even with all the spewing you've got going there, you couldn't just say " ati is really the one who lost money, not nvidia with the GT200".
    Oh well, it's more important to spread FUD and now, conspiracy against "Jews".
    Amazing. I had no idea the rabbit hole goes that deeply. rofl
  • helldrell666 - Wednesday, April 8, 2009 - link

    Check for yourself. It's not a conspiracy, these are facts.
    In fact, the 4800 series is the most successful generation of cards ATI has ever produced. The 4890, which measures about half the size of the GTX 285, beats the latter in most games at full HD resolutions.

    Btw, where are you from?
  • SiliconDoc - Friday, April 24, 2009 - link

    I guess you forgot about the pci mach64, and dummy there in between doesn't have a clue what that is.
    Let's see, another lie you told - ati is huge blah blah blah nvidia only 5,000 jobs...
    3 billion, 4 billion, 3.5 billion SALES WITH PROFITS!
    VERY BAD NUMBERS FOR AMD (ati only being a portion far less than half)

    Now tell me about employment or jobs ? Is that in the communist inflation reprint economy that costs us taxpayers trillions - the fantasy world where CONSTANT billion dollar losses on just a billion dollar company is "sustainable" ?

    But in your imaginary world filled with HATRED and LIES, it's just the opposite... isn't it.
    How pathetic.

  • tamalero - Thursday, April 9, 2009 - link

    I don't know, the 9700-9800 from ATI were amazing as well.
  • SiliconDoc - Tuesday, April 7, 2009 - link

    You cite "the last quarter", but of course only a fool would use that as a future indicator concerning quality and viability of the company. It's another pathetic attempt, fella. Global downturn means nothing to you, and you FAILED to cite the ati numbers, the two quarters in question, so you really have no point. You must have been afraid to tell the truth ?
    If Nvidia has one low quarter in the midst of massive global downturn, while ati had at least 9 quarters where they suffered losses in a row, who is really in danger of playing on the "competitors" chip ?
    You see, that's WHY the ati red roosters had to SCREAM endlessly about nvidia's GT200 die size - because THE WHOLE TIME BEHIND THE SCENES OF THEIR FLAPPING RED ROOSTER BEAKS - THEIR BELOVED ATI WAS LOSING BILLIONS....
    See bub, that's what has been going on for far too long.
    It's really sad and sick, that people can't be HONEST.
    All the red roosters had to do was say " hey buy ati, they're in financial trouble and have been, we all want competition to continue so let's pitch in, because the brands are about equivalent. "
    See, that would have been honest and respectable and manly.
    Instead the raging red roosters lied and covered up and FALSELY ACCUSED their competition of imaginary losses while their little toy was bleeding half to death - like little lying brats, they couldn't help but spew in the midst of IMMENSE BILLION DOLLAR losses for ati, how the gt200 was "hurting nvidia" and how "ati could crush them" with PRICE DROPS. lol - man alive I'm telling you - all those know it all red rooster jerks - it was and is still amazing.
    That's fine, just be aware that it --- has been pathetic behavior.
  • Jamahl - Wednesday, April 8, 2009 - link

    Actually, you were the one throwing around $billion losses and FAILED to mention Nvidia's own horrible financial situation. Did you say anything about the global downturn while ranting like a fanatic on AMD's losses?

    What was it you were saying about HONESTY again? Yes, in caps.

    Nvidia hasn't had one low quarter - they've lost 2/3rds of their share value in a year. That doesn't happen in one quarter, same as it didn't happen to AMD in one quarter either.

    Nvidia are a horrible little company who hold back progress, and more and more people are wising up to their methods. Articles like this on Anand show what they are like. Nvidia CANNOT COMPETE with ATI on performance so instead they bribe with more cash than ATI use on R&D, and those that don't accept the bribes get cajoled or threatened instead.

    All the while sad sycophants like you are banging on about PhysX and cuda as if they make a difference to anyone. What does make a difference is their pathetic rebadging of ancient tech, catching out the people who don't know any better.

    That just proves how far ahead the r700 is vs the g200b. ATI put money in research in order to improve the experience, while Nvidia put money into bribes in an attempt to hold onto whatever slender lead they have. It's only a financial lead, in tech terms ATI are a country mile ahead and only the worst Nvidia fanboi cant see that.

  • SiliconDoc - Friday, April 24, 2009 - link

    " That just proves how far ahead the r700 is vs the g200b."
    That rv700 can't compete AT ALL with the GT200 UNLESS it has DDR5 on it.
    That is a FACT. That is REALITY.
    Without DDR5 it is the full core 4850 that competes with the "old technology" at the DDR3 level on both cards, the 4850 and the 9800 series and flavors.
    That's the truth, YOU LIAR.
    Case closed, no takebacks, no uncrossing your fingers, no removing your red raging horns - like - forever.
    The r700 CANNOT COMPETE WITH THE GT200 - unless DDR5 is added as an advantage for the r700, which actually competes with the g80/g92/g92b.
    NOW, if you screamed and screeched that DDR5 is awesome and ati rocks because they used it, I wouldn't disagree or call you the liar you are.
    Got it son ?
    Figure it out, or go take a college class in logic, and skip the communist training if you possibly can. Might get an estrogen emotion reduction as well while you're at it.
  • SiliconDoc - Friday, April 24, 2009 - link

    Check the five year stock charts before you keep lying, and then as far as your idiotic rant about nvidia, it just goes to show there is no such thing as a fair performance comparison from you people, you will lie your red rooster rooter butts off because you have a big twisted insane hatred for Nvidia, based upon some communist like rage that profit is a sin, and money in the industry is BAD, except PEOPLE get paid with all that money you claim NV throws around. lol
    Dude, you're a red rooster rager, look in the mirror and face it, since you can't face the facts otherwise. Embrace it, and own it.
    Don't be a liar, instead - or rather if all you're going to do is lie, at least admit it - you're body painted red, no matter what.
    The really serious issue is ati has a really bad continuous loss, and might go under.
    However, I can understand you communist like raging red roosters screaming for more price drops as you declare the much better off financially NVidia the one "to be destroyed", and demand more price drops, as you scream "profiteering".
    Well, the basic fact is plain and apparent, ati had to lose 2 billion dollars to provide their competitive price, and ati purchasers are sitting on that loss, their gain, huh.
    Like I said, if Obama and co. give ati/amd a couple billion in everyones taxes, it might work out ok, otherwise bankruptcy is looming - or some massive new investor relations are required.
    Either way, you people don't tell the truth, and that of course is the point, over and over again.
  • JNo - Saturday, April 4, 2009 - link

    Well having trawled through all 16 pages of comments I have to say that as much as power & temp benches, I really want NOISE benchmarks. Yes power usually comes at the expense of noise and although I'm primarily a gamer, I hate fan noise too.

    I happen to have a 8800GT which was great value when it first came out but it becomes a whirlwind in most games and it drives me crazy, breaks the immersion and only in ear headphones help.

    When the scores are this close, I err on the side of silence, and (from other sites) it sounds like the GTX 275 is noticeably quieter than the 4890 under load.

    Also, the GTX275 may suck up more juice under load but it is also the same amount more economical when idle and as I spend way less than 50% of my computer time gaming, that is much more useful to me...

    Agree that PhysX is overhyped promises at the moment. So, for sound and power efficiency, I think the GTX275 just sways the vote *for me*. And it can overclock a bit even if the impression I'm getting is not quite as much as the 4890.

    Then again, here in the UK the prices are different. The new parts are £200+ and that's 33% more than the GTX260 55nm core216 which can be had for only £150 now and is only a little less powerful than the GTX275 and will surely last fine till the DX11 parts come out... choices... choices...
  • helldrell666 - Saturday, April 4, 2009 - link

    You can edit your VGA BIOS using the Radeon BIOS Editor v1.12, which is the one I'm using now on my 4870, and adjust the frequencies in different modes. By downclocking your Radeon card in idle mode, you can get it to operate properly without sucking so much power. You can use ATI Tray Tools for the same purpose as well.

    As for the noise, i definitely recommend you to wait a little bit until the non-reference cards get released.

    According to some sites, AMD is going to release its DX11 cards in Q3 this year, so if you're planning to upgrade, you'd better consider waiting a little bit and getting a far better card than the currently available ones.

    From my personal experience, a 4870 1GB is more than enough to play most current games at 24" resolutions with all the settings on their highest, including the eye candy... except for Crysis and Stalker: Clear Sky. If you have a smaller monitor than mine, then you might as well consider a 4850 or a GTS 250.

