The Cards and The Test

In the AMD department, we received two cards. One was an overclocked part from HIS and the other was a stock-clocked part from ASUS. Guess which one AMD sent us for the review. No, it's no problem; we're used to it. This is what happens when we get cards from NVIDIA all the time: they argue and argue for the inclusion of overclocked numbers in GPU reviews when it's their GPU we're looking at. Of course, when the tables are turned, so are the opinions. We sincerely appreciate ASUS sending us this card, and we used it for our tests in this article. The original intent in getting hold of two cards was to run CrossFire numbers, but with only one GTX 275 on hand, we'd prefer to wait until we can compare the two setups before getting into that angle.

The ASUS card also includes a utility called Voltage Tweaker that allows gamers to increase some voltages on their hardware to help improve overclocking. We didn't have the chance to play with the feature ourselves, but more control is always a nice feature to have.

For the Radeon HD 4890, our hardware specs are pretty simple: take a 4870 1GB and overclock it. Crank the core up 100 MHz to 850 MHz and the memory clock up 75 MHz to 975 MHz. That's the Radeon HD 4890 in a nutshell. However, to reach these clock levels, AMD revised the core by adding decoupling capacitors and new timing algorithms, and altered the ASIC power distribution for enhanced operation. These slight changes increased the transistor count from 956M to 959M. Otherwise, the core features/specifications (texture units, ROPs, z/stencil) remain the same as the HD 4850/HD 4870 series.
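As a back-of-the-envelope sketch (our own arithmetic, not AMD's numbers), assuming the 4870 1GB's 256-bit GDDR5 bus and its stock 750 MHz core / 900 MHz memory clocks, the bumps work out roughly like this; GDDR5 transfers four bits per pin per clock:

```python
# Illustrative only: peak bandwidth implied by the clock bumps above.
# Assumes a 256-bit GDDR5 bus and stock 750/900 MHz 4870 1GB clocks.

def gddr5_bandwidth_gbs(memory_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth in GB/s (GDDR5 moves 4 bits per pin per clock)."""
    data_rate_mts = memory_clock_mhz * 4                    # mega-transfers/s
    return data_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9  # bytes/s -> GB/s

hd4870_bw = gddr5_bandwidth_gbs(900, 256)   # ~115.2 GB/s
hd4890_bw = gddr5_bandwidth_gbs(975, 256)   # ~124.8 GB/s
core_uplift = (850 - 750) / 750             # ~13.3% core overclock
print(hd4870_bw, hd4890_bw, core_uplift)
```

In other words, the memory bump buys a little over 8% more peak bandwidth, while the core gains roughly 13%.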

Most vendors will also be selling overclocked variants that run the core at 900 MHz. AMD would like to treat these overclocked parts like they are a separate entity altogether. But we will continue to treat these parts as enhancements of the stock version whether they come from NVIDIA or AMD. In our eyes, the difference between, say, an XFX GTX 275 and an XFX GTX 275 XXX is XFX's call; the latter is their part enhancing the stock version. We aren't going to look at the XFX 4890 and the XFX 4890 XXX any differently. In doing reviews of vendor's cards, we'll consider overclocked performance closely, but for a GPU launch, we will be focusing on the baseline version of the card.

On the NVIDIA side, we received a reference version of the GTX 275. It looks similar in design to the other GT200-based hardware.

Under the hood is the same setup as half of a GTX 295, but with higher clock speeds. That means the GTX 275 has the memory amount and bandwidth of the GTX 260 (448-bit wide bus) but the shader count of the GTX 280 (240 SPs). On top of that, the GTX 275 posts clock speeds closer to the GTX 285 than the GTX 280. Core clock is up 31 MHz from the GTX 280 to 633 MHz, shader clock is up 108 MHz to 1404 MHz, and memory clock is also up 108 MHz to an effective 2322 MHz. This means that in shader-limited cases we should see performance closer to the GTX 285, while in bandwidth-limited cases it will still be faster than the GTX 260 Core 216 because of the clock speed boost across the board.
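To put those numbers in perspective, here's a rough sketch (again our own arithmetic, not NVIDIA's spec sheet) of the theoretical peaks they imply; the 3 FLOPs/SP/clock figure assumes GT200's dual-issue MAD+MUL:

```python
# Illustrative theoretical peaks for the GTX 275's quoted clocks.

def memory_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s from the effective (data-rate) clock."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

def shader_gflops(shader_clock_mhz: float, sp_count: int,
                  flops_per_sp: int = 3) -> float:
    """Peak shader throughput; 3 FLOPs/SP/clock assumes GT200's MAD+MUL."""
    return shader_clock_mhz * 1e6 * sp_count * flops_per_sp / 1e9

gtx275_bandwidth = memory_bandwidth_gbs(2322, 448)  # ~130 GB/s
gtx275_gflops = shader_gflops(1404, 240)            # ~1011 GFLOPS
print(gtx275_bandwidth, gtx275_gflops)
```

Real-world throughput will land well below these peaks, but they illustrate why the GTX 275 should sit between the GTX 260 Core 216 and the GTX 285.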

Rather than just an overclock of a pre-existing card, this is a blending of the two configurations from which it was born, combined with an overclock. And sure, it's also half a GTX 295, which is convenient for NVIDIA. It's not just that it's different; it's that this setup should have a lot to offer, especially in games that aren't bandwidth limited.

That wraps it up for the cards we're focusing on today. Here's our test system, which is the same as for our GTS 250 article except for the addition of a couple drivers.

The Test

Test Setup
CPU: Intel Core i7-965 (3.2GHz)
Motherboard: ASUS Rampage II Extreme (X58)
Video Cards: ATI Radeon HD 4890
             ATI Radeon HD 4870 1GB
             ATI Radeon HD 4870 512MB
             ATI Radeon HD 4850
             NVIDIA GeForce GTX 285
             NVIDIA GeForce GTX 280
             NVIDIA GeForce GTX 275
             NVIDIA GeForce GTX 260 Core 216
Video Drivers: Catalyst 8.12 hotfix, 9.4 Beta for HD 4890
               ForceWare 185.65
Hard Drive: Intel X25-M 80GB SSD
RAM: 6 x 1GB DDR3-1066 7-7-7-20
Operating System: Windows Vista Ultimate 64-bit SP1
PSU: PC Power & Cooling Turbo Cool 1200W