The Cards and The Test

In the AMD department, we received two cards: an overclocked part from HIS and a stock-clocked part from ASUS. Guess which one AMD sent us for the review. No, it's no problem; we're used to it, since we see the same behavior from NVIDIA all the time. GPU makers argue and argue for the inclusion of overclocked numbers in reviews when it's their GPU we're looking at; of course, when the tables are turned, so are the opinions. We sincerely appreciate ASUS sending us the stock card, and we used it for our tests in this article. The original intent in trying to get hold of two cards was to run CrossFire numbers, but we only have one GTX 275, and we would prefer to wait until we can compare the two setups before getting into that angle.

The ASUS card also includes a utility called Voltage Tweaker that allows gamers to increase some voltages on their hardware to help improve overclocking. We didn't have the chance to play with the feature ourselves, but more control is always a nice feature to have.

For the Radeon HD 4890, our hardware specs are pretty simple: take a 4870 1GB and overclock it. Crank the core up 100 MHz to 850 MHz and the memory clock up 75 MHz to 975 MHz, and that's the Radeon HD 4890 in a nutshell. However, to reach these clock levels, AMD revised the core, adding decoupling capacitors and new timing algorithms and altering the ASIC power distribution. These slight changes increased the transistor count from 956M to 959M. Otherwise, the core features/specifications (texture units, ROPs, z/stencil) remain the same as the HD 4850/HD 4870 series.
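As a rough sketch of what those clock bumps buy you, the arithmetic below works out the HD 4890's uplift over the HD 4870 1GB. The clocks come from the paragraph above; the 256-bit bus and GDDR5's 4 transfers per pin per clock are the 4800 series' reference specs, quoted here from memory rather than from the article.

```python
# Back-of-the-envelope uplift of the HD 4890 over the HD 4870 1GB.
CORE_4870, CORE_4890 = 750, 850   # core clocks in MHz
MEM_4870, MEM_4890 = 900, 975     # base GDDR5 memory clocks in MHz
BUS_BITS = 256                    # memory bus width shared by both cards

def gddr5_bandwidth_gbs(mem_mhz, bus_bits=BUS_BITS):
    """Theoretical bandwidth in GB/s: GDDR5 moves 4 bits per pin per clock."""
    return mem_mhz * 4 * 1e6 * (bus_bits // 8) / 1e9

core_gain = CORE_4890 / CORE_4870 - 1     # fractional core clock uplift
bw_4870 = gddr5_bandwidth_gbs(MEM_4870)   # HD 4870 1GB theoretical bandwidth
bw_4890 = gddr5_bandwidth_gbs(MEM_4890)   # HD 4890 theoretical bandwidth

print(f"core uplift: {core_gain:.1%}")
print(f"bandwidth:   {bw_4870:.1f} -> {bw_4890:.1f} GB/s")
```

That works out to roughly a 13% core clock gain and about 115 GB/s rising to about 125 GB/s of theoretical memory bandwidth, which is why the 4890 behaves like a mildly hot-rodded 4870 rather than a new part.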

Most vendors will also be selling overclocked variants that run the core at 900 MHz. AMD would like to treat these overclocked parts as if they were a separate entity altogether, but we will continue to treat them as enhancements of the stock version, whether they come from NVIDIA or AMD. In our eyes, the difference between, say, an XFX GTX 275 and an XFX GTX 275 XXX is XFX's call; the latter is their part enhancing the stock version. We aren't going to look at the XFX 4890 and the XFX 4890 XXX any differently. In doing reviews of vendors' cards, we'll consider overclocked performance closely, but for a GPU launch, we will be focusing on the baseline version of the card.

On the NVIDIA side, we received a reference version of the GTX 275. It looks similar to the design of the other GT200 based hardware.

Under the hood is the same setup as half of a GTX 295, but with higher clock speeds. That means the GTX 275 has the memory amount and bandwidth of the GTX 260 (448-bit bus), but the shader count of the GTX 280 (240 SPs). On top of that, the GTX 275 posts clock speeds closer to the GTX 285 than the GTX 280: core clock is up 31 MHz from a GTX 280 to 633 MHz, shader clock is up 108 MHz to 1404 MHz, and the effective memory data rate is also up 108 MHz to 2322 MHz. This means that in shader-limited cases we should see performance closer to the GTX 285, while in bandwidth-limited cases we'll still be faster than the GTX 260 Core 216 because of the clock speed boost across the board.
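To see where that 448-bit bus leaves the GTX 275 relative to its siblings, here is the same kind of quick bandwidth arithmetic. The 2322 MT/s effective rate and bus widths come from the discussion above; the GTX 260's 1998 MT/s and the GTX 285's 2484 MT/s effective rates are NVIDIA's reference specs, quoted from memory, so treat those two numbers as assumptions.

```python
# Theoretical memory bandwidth: effective data rate times bus width in bytes.
def bandwidth_gbs(effective_mts, bus_bits):
    """GB/s given an effective data rate in MT/s and a bus width in bits."""
    return effective_mts * 1e6 * (bus_bits // 8) / 1e9

gtx275 = bandwidth_gbs(2322, 448)  # GTX 275: rate from the article
gtx260 = bandwidth_gbs(1998, 448)  # GTX 260 Core 216: reference-spec rate (assumed)
gtx285 = bandwidth_gbs(2484, 512)  # GTX 285: reference-spec rate (assumed)

print(f"GTX 275: {gtx275:.1f} GB/s")
print(f"GTX 260: {gtx260:.1f} GB/s")
print(f"GTX 285: {gtx285:.1f} GB/s")
```

Under those assumptions the GTX 275 lands around 130 GB/s: well above the GTX 260's roughly 112 GB/s, but still short of the GTX 285's roughly 159 GB/s, which matches the expectation that bandwidth-limited games will separate the 275 from the 285.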

Rather than just an overclock of a pre-existing card, this is a blend of two configurations, combined with a clock speed bump over both of the parts from which it was born. And sure, it's also half a GTX 295, which is convenient for NVIDIA. It's not just that it's different; this setup should have a lot to offer, especially in games that aren't bandwidth limited.

That wraps it up for the cards we're focusing on today. Here's our test system, which is the same as in our GTS 250 article except for the addition of a couple of new drivers.

The Test

Test Setup

CPU: Intel Core i7-965 3.2GHz
Motherboard: ASUS Rampage II Extreme X58
Video Cards: ATI Radeon HD 4890
             ATI Radeon HD 4870 1GB
             ATI Radeon HD 4870 512MB
             ATI Radeon HD 4850
             NVIDIA GeForce GTX 285
             NVIDIA GeForce GTX 280
             NVIDIA GeForce GTX 275
             NVIDIA GeForce GTX 260 Core 216
Video Drivers: Catalyst 8.12 hotfix, 9.4 Beta for HD 4890
               ForceWare 185.65
Hard Drive: Intel X25-M 80GB SSD
RAM: 6 x 1GB DDR3-1066 7-7-7-20
Operating System: Windows Vista Ultimate 64-bit SP1
PSU: PC Power & Cooling Turbo Cool 1200W

294 Comments


  • SiliconDoc - Friday, April 24, 2009 - link

    I agree but you'll never get that here since ati gets stomped in fs9 and fsx even more.
    This is red rooster ragers central - at the reviewer level for now.
    Put in the acceleration pack, and go for nvidia - the GT200 chips do well in FS9 - and dual is out for FSX so....
    A teenage friend just got a 9800GTX (evga egg) and is running ddr800 2 gigs with a 3.0 P4 HT, on a G35 Asrock and gets 25-30 fps in FSX on a 22" flat Acer with everything cranked.
    He oc'ed the cpu to 3.4 and pulled like 5 more frames per.
    That's what he wanted, very playable on ultra - on his former 8600GTS he couldn't even give it a go for fsx.
    However, moving up from 8800 I'm not certain what the gain is specifically. I've seen one or two reviews on HardOcp for fsx with a few cards. Try them.
    Reply
  • 8KCABrett - Friday, May 8, 2009 - link

    Well, now that I've picked up a GTX 285SC, I've done some rudimentary benchmark comparisons between it and my 8800GTS in FSX and IL-2. I will add BlackShark results soon as well.

    http://www.txsquadron.com/forum/index.php?topic=26...">http://www.txsquadron.com/forum/index.php?topic=26...
    Reply
  • SiliconDoc - Monday, June 22, 2009 - link

    Very interesting, and not a great increase - Tom's lists FSX benchies in most of his card charts - the 9800GTX+ is way up there(3rd I believe), as are some of the 8800 series.
    It's weird.
    The old HD2900 (Pro even) does well with a good overclock - even the strange Sapphire version which was 256-bit with 320 shaders - on a 25% OC it makes FSX quite playable. (Another friend on an E4500 w/ 4 gigs/800.)
    I saw the ATI 1950XTX at HardOCP does pretty well - but the 1950GT does NOT.
    ---
    That 8800 chip is still - well, if not the best, still darn close.
    Reply
  • lk7900 - Monday, April 27, 2009 - link

    http://www.anandtech.com/video/showdoc.aspx?i=3539...">http://www.anandtech.com/video/showdoc.aspx?i=3539... Reply
  • asq - Monday, April 13, 2009 - link

    A friend of mine works for AnandTech and told me that ATI pays for articles favorable to them and disadvantageous to NVIDIA, which we can clearly see in this article. Reply
  • SiliconDoc - Friday, April 24, 2009 - link

    Ahh, yeah well people have to get paid.
    It's nice to see the reaction there from the red rooster though, huh - cheering it on while he spews his brainwashed communist-like hatred of nvidia.
    It's amazing.
    Good for you noticing, though.
    Reply
  • joeysfb - Tuesday, April 28, 2009 - link

    I don't hate Nvidia. I own 5 Nvidia cards and 1 ATI card. I'm buying whatever gives me the best value; to me, it's ATI for now. I think AnandTech did a good job reporting on what happens behind the scenes. They just report it, and it's up to individuals to form their own thoughts.

    You obviously only buy Nvidia which is good... no fuss on deciding on what to get next!! hahaha....
    Reply
  • SiliconDoc - Monday, June 22, 2009 - link

    Well, incorrect entirely. They didn't do a good job reporting on what goes on behind the scenes, because they left out the ATI prodding and payment parts.
    Furthermore, ati is still in a world of hurt losing billions in consecutive years.
    If you were to be HONEST about things, if all the people here were to be, the truth would be: " WHERE THE HECK WAS ATI FOR SO LONG ? !! THEY'VE ALWAYS BEEN AROUND, BUT NVIDIA SPANKED THEM FOR SO LONG, WE HATE NVIDIA FOR BEING TOP DOG AND TOP PRICED - BUT IT'S REALLY ATI'S FAULT, WHO ENTIRELY BLEW IT FOR SO LONG.."
    ---
    See, that's what really happened. ATI fell off the gaming fps wagon, and only recently got their act back together. They shouldn't be praised, they should be insulted for blowing competition for so long.
    If you're going to praise them, praise them for losing 33% on every card they sell, in order to have that 5-10 dollar price point advantage, because if ATI were to JUST BREAK EVEN, they'd have to raise all their gaming card prices about $75 EACH.
    So they're losing a billion a year... by destroying themselves.
    Nvidia has made a profit all along, however. I think the last quarter they had a tiny downturn - while ati was still bleeding to death.
    PRAY that Obama and crew has given or will give ati a couple billion in everyone else's tax money and inflation for everyone printed out of thin air dollars, to save them. You better so, or for a multi-billion dollar sugar daddy corporateer.
    Reply
