I'm not really sure why we have NDAs on these products anymore. Before we even got our Radeon HD 4890, before we were even briefed on it, NVIDIA contacted us and told us that if we were working on a review, we should wait. NVIDIA wanted to send us something special.

Then, in the middle of our Radeon HD 4890 briefing, what do we see but a reference to the GeForce GTX 275 in the slides. We hadn't even laid hands on the 275, but AMD knew what it was and where it was going to be priced.

If you asked NVIDIA what the Radeon HD 4890 was, you'd probably hear something like "an overclocked 4870". If you asked AMD what the GeForce GTX 275 was, you'd probably get "half of a GTX 295".

The truth of the matter is that neither one of these cards is particularly new; each is simply a different balance of processors, memory, and clock speeds at a new price point.

As prices fell on the cards that already offered very good value, higher-end and dual-GPU cards remained priced significantly higher. This created a gap in pricing between roughly $190 and $300. AMD and NVIDIA saw this as an opportunity to release cards that fall within that gap, and they are battling intensely over price. Both companies withheld final pricing information until the very last minute; in fact, when I started writing this intro (Wednesday morning) I still had no idea what the prices for these parts would actually be.

Now we know that both the Radeon HD 4890 and the GeForce GTX 275 will be priced at $250. This has historically been a pricing sweet spot, offering a very good balance of performance and cost before we start to see hugely diminishing returns on our investment. What we hope for here is a significant performance bump over the GTX 260 Core 216 and Radeon HD 4870 1GB class of performance. We'll wait until we get to the benchmarks to reveal whether that's what we actually get, or whether we should just stick with what's already good enough.

At a high level, here's what we're looking at:

                            GTX 285   GTX 275   GTX 260 Core 216  GTS 250 / 9800 GTX+
Stream Processors           240       240       216               128
Texture Address / Filtering 80 / 80   80 / 80   72 / 72           64 / 64
ROPs                        32        28        28                16
Core Clock                  648MHz    633MHz    576MHz            738MHz
Shader Clock                1476MHz   1404MHz   1242MHz           1836MHz
Memory Clock                1242MHz   1134MHz   999MHz            1100MHz
Memory Bus Width            512-bit   448-bit   448-bit           256-bit
Frame Buffer                1GB       896MB     896MB             512MB
Transistor Count            1.4B      1.4B      1.4B              754M
Manufacturing Process       TSMC 55nm TSMC 55nm TSMC 65nm         TSMC 55nm
Price Point                 $360      ~$250     $205              $140

 

                        ATI Radeon HD 4890                 ATI Radeon HD 4870                 ATI Radeon HD 4850
Stream Processors       800                                800                                800
Texture Units           40                                 40                                 40
ROPs                    16                                 16                                 16
Core Clock              850MHz                             750MHz                             625MHz
Memory Clock            975MHz (3900MHz data rate) GDDR5   900MHz (3600MHz data rate) GDDR5   993MHz (1986MHz data rate) GDDR3
Memory Bus Width        256-bit                            256-bit                            256-bit
Frame Buffer            1GB                                1GB                                512MB
Transistor Count        959M                               956M                               956M
Manufacturing Process   TSMC 55nm                          TSMC 55nm                          TSMC 55nm
Price Point             ~$250                              ~$200                              $150
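
To put the two tables on a common footing, here's a quick back-of-envelope sketch in Python that turns the listed bus widths and clocks into theoretical peak memory bandwidth and shader throughput. This is purely illustrative: it assumes NVIDIA's listed memory clocks are GDDR3 command clocks (doubled for the effective data rate), 3 FLOPs per SP per clock on GT200 (dual-issue MAD + MUL), and 2 FLOPs per SP per clock on RV790 (one MAD); these are marketing-style peaks, not measured performance.

```python
# Back-of-envelope peaks derived from the spec tables above. Theoretical only;
# the effective data rates and FLOPs-per-clock factors are assumptions, not measurements.

def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x effective data rate (MHz) / 8 bits per byte."""
    return bus_width_bits * data_rate_mhz * 1e6 / 8 / 1e9

def peak_shader_gflops(stream_processors: int, shader_clock_mhz: float, flops_per_clock: int) -> float:
    """Peak programmable shader throughput in GFLOPS."""
    return stream_processors * shader_clock_mhz * 1e6 * flops_per_clock / 1e9

# GeForce GTX 275: 448-bit bus, 1134MHz GDDR3 (assumed 2268MHz effective), 240 SPs at 1404MHz.
print(f"GTX 275: {memory_bandwidth_gb_s(448, 1134 * 2):.1f} GB/s, "
      f"{peak_shader_gflops(240, 1404, 3):.0f} GFLOPS")   # ~127.0 GB/s, ~1011 GFLOPS

# Radeon HD 4890: 256-bit bus, 975MHz GDDR5 (3900MHz data rate), 800 SPs at the 850MHz core clock.
print(f"HD 4890: {memory_bandwidth_gb_s(256, 3900):.1f} GB/s, "
      f"{peak_shader_gflops(800, 850, 2):.0f} GFLOPS")    # ~124.8 GB/s, ~1360 GFLOPS
```

On paper the two $250 cards land within a few GB/s of each other on memory bandwidth despite very different bus widths; GDDR5 is what lets AMD get there on a 256-bit bus.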

 

We suspect that this will be quite an interesting battle, and we might have some surprises on our hands. NVIDIA has been talking about its new drivers, which will be released to the public early Thursday morning. These new drivers offer performance improvements across the board as well as some cool new features. Because it's been a while since we talked about them, we will also explore PhysX and CUDA in a bit more depth than we usually do in GPU reviews.

We do want to bring up availability. This will be a hard launch for AMD but not for NVIDIA (though some European retailers should have the GTX 275 on sale this week). As for AMD, we've seen plenty of retail samples from its partners and we expect good availability starting today; if that ends up not being the case, we will certainly update the article to reflect it. NVIDIA won't have availability until the middle of the month (we are hearing April 14th).

NVIDIA hasn't been hitting their launches as hard lately, and we've gotten on them about that in past reviews. This time, we're not going to be as hard on them for it. The fact of the matter is that they've got a competitive part coming out in a time frame very near the launch of an AMD part at the same price point. We are very interested in not getting back to the "old days" of paper-launched parts that only ended up being seen in the pages of hardware review sites, but we certainly understand the need for companies to get their side of the story out there when launches are this close to one another, and we're not going to fault anyone for that. Not being available for purchase is its own problem.

From the summer of 2008 to today we've seen one of the most heated and exciting battles in the history of the GPU. NVIDIA and AMD have been pushing back and forth with differing features, good baseline performance with strengths in different areas, and incredible pricing battles in the most popular market segments. While AMD and NVIDIA fight with all their strength to win customers, the real beneficiary has consistently been the end user. We certainly feel this launch is no exception. If you've got $250 to spend on graphics and were wondering whether you should save up for the GTX 285 or save money and grab a sub-$200 part, your worries are over. There is now a card for you. And it is good.


294 Comments


  • SiliconDoc - Friday, April 24, 2009 - link

    You failed to read his post, and therefore the context of my response, you IDIOT.
    Can you run a second ATI card for PhysX - NO.
    Can you run an ATI card and a second NV card for PhysX - not without a driver hack - check techpowerup for the how-to and files, as I've already mentioned.
    So, THAT'S WHAT WE WERE TALKING ABOUT, DUMMY.
    Now you can take your stupidity along with you, no one can stop it.
  • pizzimp - Friday, April 03, 2009 - link

    From an objective point of view there is not really a clear winner. At the lower resolutions, do you really care if you are getting 80 FPS vs. 100 FPS?

    IMO it is the higher resolutions that matter. I would think any real gamer is always looking to upgrade their monitor :).

    I wonder how old you guys are that are posting? Who cares if something is "rebadged" or just an OC version of something? Bottom line is how does the card play the game?

    IMO both cards are good. It comes down to price for me.
  • SiliconDoc - Monday, April 06, 2009 - link

    Ahh, you just have to pretend framerates you can't see or notice, and only the top rate or the average, never the bottom framerate...
    Then you must discount ALL the OTHER NVIDIA advantages, from CUDA, to Badaboom, to better Folding scores, to PhysX, to game-release-day EVGA drivers ready to go, to forced game profiles in the NVIDIA panel (none for ATI) - and on and on and on...
    Now, after 6 months of these red roosters screaming ATI wins it all because it had the top resolution on the 30" monitor sewed up and lost at lower resolutions, these red roosters have done a 180-degree about-face... now the top resolution just doesn't matter -
    Dude, the red ragers are lying loons, it's that simple.
    The 2-year-old 9800 GTX core is the 4870 without GDDR5. Think about that, and how deranged they truly are.
    I bet they have been fervently praying to their red god hoping that change doesn't come in the form of GDDR5 on that old g80/g92/g92b core - because then instead of it competing with the 4850 - it would be a 4870 - and THAT would be an embarrassment - a severe embarrassment. The crowing of the red roosters would diminish... and they'd be bent over sucking up barnyard dirt and chickseed - for a long, long time. lol
    Oh well, at least ati might get 2 billion from Obama to cover it's losses ... it's sad when a red rooster card could really use a bailout, isn't it ?
  • helldrell666 - Friday, April 03, 2009 - link

    Well, you have a point there. But the card is still not operating on a WHQL driver, and the percentage of those who use 30" monitors is negligible compared to the owners of 22" / 24" monitors.

    I think this is probably due to the 256-bit memory interface compared to the 448-bit that the GTX 275 has. Even at Xbitlabs the 4890 drops significantly in performance compared to the GTX 285.
  • 7Enigma - Friday, April 03, 2009 - link

    From a subjective point of view you may feel that way, but from an objective point of view there is a clear winner, and it is the 4890. Left 4 Dead and Call of Duty are the only two 30" display tests where the 275 significantly defeated the 4890. In all of the other tests the 4890 either dominated (G.R.I.D., Fallout 3) or was within 4% of the 275, which I would call a wash. At all other resolutions the 4890 was the undisputed leader. So I find it difficult to say there is no clear winner.

    What NVIDIA shouldn't have done was nerf performance at 22" and 24" resolutions with the latest drivers for the very few people who game at 30". To be honest, I wish the article had included all of the results from the 182 drivers (they show just G.R.I.D. but allude to other games also having similarly reduced results except at the highest res). It could very likely be a wash then if the 275 is more competitive at the resolutions 99% of the people buying this level/price of card are going to be playing at.

    Anand, is there any way you could post, even just in the comments, the numbers for the rest of the games with the 182 NVIDIA drivers? I don't mind doing the comparison work to see how much closer the 275 would be to the 4890 if they had kept the earlier drivers.
  • 7Enigma - Friday, April 03, 2009 - link

    Ah, I see now that the 185s are specifically required to enable support for the 275 card, so you can't run the 275 with the 182 drivers. It would still be interesting to see all the data for what happened to the 285 using the newest drivers that decrease performance at lower resolutions.
  • minime - Friday, April 03, 2009 - link

    First, thanks for your review(s). I've been a silent reader and word-of-mouth spreader for years.

    Second, don't you think reviewers should point their fingers a little more aggressively at power consumption? Not because it's trendy nowadays, but because it's just not sane to waste that much energy in idle (2D, anyone remember?) mode. I was thrilled by what you alone (don't take it as disrespect) were able to achieve on the SSD issue.
  • SiliconDoc - Monday, April 06, 2009 - link

    PSST! The ATI cards have like 30 watts more power usage in idle - and like 3 watts less in 3D - so the power thing - well they just declare ATI the winner... LOL
    They said they were "really surprised" at the 30 watts less in idle for the NVIDIA - they just couldn't figure it out - and kept rechecking... but yeah... the 260 was kicking butt... but... that doesn't matter - ATI takes the win using 1-3 watts less in 3D.
    So, you know, the red roosters shall not be impugned!
    capiche?
  • VulgarDisplay - Friday, April 03, 2009 - link

    It appears that you may have had Vsync turned on in some of the CoD: W@W tests, which caps the game at 60fps. It's pretty apparent something is up when the NVIDIA card has the same FPS at 1680x and 1920x. Either way, it still seems like the 4890 wins at those resolutions, which is different from most sites that pretty much say it's a wash across the board. I'll take NVIDIA's drivers over ATI's any day.
  • SiliconDoc - Monday, April 06, 2009 - link

    Hey, any little trick that smacks NVIDIA down a notch is not to be pointed out.
