NVIDIA Strikes Back: The GTX Gets a Dose of Reality

NVIDIA was still living in the days of G80 when it launched its 1.4-billion-transistor GT200 GPU and the GeForce GTX 280/260 that were based on it. Not only did NVIDIA's own GeForce 9800 GX2 outperform the GTX 280 at a lower price, but once AMD launched its Radeon HD 4800 series it became very clear that NVIDIA's pricing was completely out of whack. NVIDIA was pricing its GPUs for a reality that just didn't exist.

The first step to get things back in line was to drop the price of the GeForce 9800 GTX, which NVIDIA did. Next up were the new GTX cards: the GTX 280 now sells for $450, and the GTX 260 is a $299 part. In the conclusion of our Radeon HD 4800 launch article we wrote:

"The fact of the matter is that by NVIDIA's standards, the 4870 should be priced at $400 and the 4850 should be around $250."

It looks like NVIDIA's standards changed, largely thanks to AMD, and now the key players in NVIDIA's lineup are priced more realistically. Today we'll take a look at how the landscape has been reshaped as a result of NVIDIA's price cuts. At the same time, AMD's literally hot GPUs have seen their prices fall: the Radeon HD 4870 is now a $270 - $280 GPU, slightly down from $299, and the Radeon HD 4850 is a $170 - $180 card. These are very slight changes in price, but at least they are in the right direction.

AMD Prices the Radeon HD 4870 X2

When we previewed the Radeon HD 4870 X2 we weren't given a target price point; we just knew that it'd be more than $500. Today we have a price: $549.

At $549 the X2 isn't exactly a bargain. It's slightly cheaper than two Radeon HD 4870s, and you don't need a motherboard with two PCIe x16 slots to use it, which helps lower overall system cost. With NVIDIA's GeForce GTX 280 price drops, the Radeon HD 4870 X2 is now the most expensive current GPU on the market - pretty impressive for a company that swore off building huge GPUs.

The competing product from NVIDIA is, well, there isn't exactly one. NVIDIA doesn't have a single-card multi-GPU GT200 product, so we have to rely on comparing the 4870 X2 to the GeForce GTX 280 (priced at $450) as well as the GeForce GTX 260 in SLI (priced at $300 x 2).

93 Comments

  • Spoelie - Tuesday, August 12, 2008 - link

    How come 3dfx was able to have a transparent multi-GPU solution back in the '90s - granted, memory still was not shared - when it seems impossible for everyone else these days?

    Shader functionality problems? Too much integration (a single-card Voodoo2 was a three-chip solution to begin with)?
  • Calin - Tuesday, August 12, 2008 - link

    The SLI from 3dfx used scan-line interleaving (Scan-Line Interleave, to be exact). The new SLI still has a scan-line-interleaving mode, among other modes.
    The reason 3dfx was able to use this is that the graphics library was their own, and it was built specifically for the task. Now, Microsoft's DirectX is not built for this SLI thing, and it shows (see the CrossFire profiles, selected for the best performance for a game, depending on that game).

    Also, 3dfx's SLI had a dongle feeding the video signal from the second card (slave) into the first card (master), and the video from the two cards was interleaved. This uses lots of bandwidth, and I don't think DirectX is able to generate scenes in "only even/odd lines", so much of the geometry work must be done by both cards (meaning that if your game engine is geometry-bound, SLI doesn't help you).
  • mlambert890 - Friday, August 15, 2008 - link

    Great post... Odd that people seem to remember 3DFX and don't remember GLIDE or how it worked. I'm guessing they're too young to have actually owned the original 3D cards (I still have my dedicated 12MB Voodoo cards in a closet), and they just hear something on the web about how "great" 3DFX was.

    It was a different era and there was no real unified 3D API. Back then we used to argue about OpenGL vs GLIDE, and the same types of malcontents would rant and rave about how "evil" MSFT was for daring to think of creating DirectX.

    Today a new generation of ill-informed malcontents continues to rant and rave about Direct3D and slam NVidia for "screwing up" 3DFX, when the reality is that time moves on and NVidia used the IP from 3DFX that made sense to use (OBVIOUSLY - sometimes the people spending hundreds of millions and billions have SOME clue what they're buying/doing, and actually have CS PhDs rather than just "forum posting cred").
  • Zoomer - Wednesday, August 13, 2008 - link

    Ah, I remember wanting to get a Voodoo5 5000, but ultimately decided on the Radeon 32MB DDR instead.

    Yes, 32MB DDR framebuffer!
  • JarredWalton - Tuesday, August 12, 2008 - link

    Actually, current SLI stands for "Scalable Link Interface" and has nothing to do with the original SLI other than the name. Note also that 3dfx didn't support anti-aliasing with SLI, and they had issues going beyond the Voodoo2... which is why they're gone.
  • CyberHawk - Tuesday, August 12, 2008 - link

    nVidia bought them... and is now incapable of taking advantage of the technology :D
  • StevoLincolnite - Tuesday, August 12, 2008 - link

    They could have at least included support for 3dfx Glide so all those Glide-only games would continue to function.

    Also, ATI had a "Dual GPU" card for many years (Rage Fury MAXX) before nVidia released one.
  • TonyB - Tuesday, August 12, 2008 - link

    can it play Crysis though?

    two of my friends' computers died while playing it.
  • Spoelie - Tuesday, August 12, 2008 - link

    no it can't, the Crysis benchmarks are just made up

    stop with the bearded comments already
  • MamiyaOtaru - Wednesday, August 13, 2008 - link

    Dude was joking. And it was funny.

    It's apparently pretty dangerous to joke around here. Two of my friends died from it.
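The scan-line-interleaving scheme described in the comments above (each GPU renders alternate scanlines, and the master card merges the two streams into one frame) can be sketched in a few lines of Python. This is purely illustrative: the `render` function is a stand-in for a real rasterizer, not anything from an actual driver.

```python
# Sketch of 3dfx-style scan-line interleaving.
# GPU 0 (master) renders even lines; GPU 1 (slave) renders odd lines;
# the master merges them into the final frame.

HEIGHT, WIDTH = 8, 4

def render(gpu_id, y):
    # Stand-in for a real rasterizer: tag each pixel with the GPU that drew it.
    return [f"gpu{gpu_id}"] * WIDTH

frame = []
for y in range(HEIGHT):
    gpu = y % 2          # even lines -> GPU 0, odd lines -> GPU 1
    frame.append(render(gpu, y))

# Every scanline in the merged frame comes from the expected GPU.
assert all(row[0] == f"gpu{y % 2}" for y, row in enumerate(frame))
```

The key point the sketch makes concrete is why this split only scales fill rate: both "GPUs" still iterate over the whole scene's geometry, exactly as Calin notes for geometry-bound engines.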
