Introduction

A vast expanse of destruction lies before you. Billowing blue smoke rises from the ashes of the destroyed city, and flames continue to lick towards the sky. The horizon shimmers from the heat waves and smoke emanating from the rubble. As you proceed into the wreckage, your boots splash through puddles, sending out ripples and churning up the ashes. One of the buildings appears to have escaped most of the force of the blast, so you head towards it hoping to find some shelter and a place to relax for a moment.

A glint of light reflects off of the cracked windows, and you instinctively dive to the ground. A split second later, the glass shatters and fragments rain down around you as the bullet misses its intended mark. You roll to the side and watch as dirt and rubble plume into the air from the spot you so recently occupied. As you marvel at the small particles of dirt scattering into the air, you realize it's already too late; you're too far from cover and the sniper is skilled. As your body slams into the ground and the scene fades to black, you're glad to know that this was only a game, regardless of how lifelike it appears...


That's not a description of any actual game, but it could be in the very near future judging by the progress we continue to see on the graphics front. The attempt to bring such visions to life is reason enough for us to encourage and revere continued excellence in the field of computer graphics. The ongoing struggle between ATI and NVIDIA to bring forth the most parallel and powerful GPUs at reasonable prices opens new possibilities to developers, pushing them to create content beyond the realm of dreams and move onto ground where angels fear to tread: reality. With each successive generation we work our way closer to blurring the line between reality and rendering, while every step leaves us wanting more. Once again it is time to check in on our progress down the infinite road to graphical perfection.

The latest offering from NVIDIA does not introduce a host of new features or upgraded shader model support as the past few generations have. The NV4x architecture remains a solid base for this product, as the entire DirectX 9 feature set was already fully supported in hardware. Though the G70 (yes, the name change was just to reconcile code and marketing names) is directly based on the NV4x architecture, there are quite a few changes to the internals of the pipelines as well as an overall increase in the width and clock speed of the part. This update much resembles what we saw when ATI moved from R300 to R420, in that most of the features and block diagrams are the same as last year's part, with a few revisions here and there to improve efficiency.

One of the most impressive aspects of this launch is that the part is available now. I mean right now. Order it today and plug it in tomorrow. That's right, not only has NVIDIA gotten the part to vendors, but vendors have gotten their product all the way to retailers. This is unprecedented for any graphics hardware launch in recent memory. In the midst of all the recent paper launches in the computer hardware industry, this move is a challenge to all other hardware design houses.

ATI is particularly on the spot after today. Their recent history of announcing products that don't see any significant volume in the retail market for months is disruptive in and of itself. Now that NVIDIA has made this move, ATI absolutely must follow suit. Over the past year, the public has been getting quite tired of failed assurances that product will be available "next week". This very refreshing blast of availability is long overdue. ATI cannot afford to have R520 availability "soon" after launch; ATI must have products available for retail purchase at launch.

We do commend NVIDIA for getting product out there before launching it. But now we move on to the least pleasant side of this launch: price. The GeForce 7800 GTX will cost a solid $600. Of course, we do expect retailers to charge a premium for the early adopters. Prices we are seeing at launch are on the order of $650. This means those who want to build an SLI system based on the GeForce 7800 GTX will be paying between $1200 and $1300 just for the graphics component of their system.

So, what exactly is bigger, better, and faster this time around? And more importantly, what does that mean for game performance and quality (and is it worth the price)? This is the right place to find the answers. As developers continue to grow in shader prowess, we expect to see hardware of this generation stretch its legs even more, as NVIDIA believes this is the point where pure math and shader processing power will become the most important factor in graphics hardware.

The Pipeline Overview
127 Comments

  • VIAN - Wednesday, June 22, 2005 - link

    "NVIDIA sees texture bandwidth as outweighing color and z bandwidth in the not too distant future." That was a quote from the article after saying that Nvidia was focusing less on Memory Bandwidth.

    Do these two statements not match, or is there something I'm not aware of?
  • obeseotron - Wednesday, June 22, 2005 - link

    These benchmarks are pretty clearly rushed out and wrong, or at least improperly attributed to the wrong hardware. SLI 6800s show up faster than SLI 7800s in many benchmarks, in some cases much more than doubling single 6800 scores. I understand NDAs suck with the limited amount of time to produce a review, but I'd rather it have not been posted until the afternoon than ignore the benchmarks section.
  • IronChefMoto - Wednesday, June 22, 2005 - link

    #28 -- Mlittl3 can't pronounce Penske or terran properly, and he's giving out grammar advice? Sad. ;)
  • SDA - Wednesday, June 22, 2005 - link

    QUESTION

    Okay, allcaps=obnoxious. But I do have a question. How was system power consumption measured? That is, was the draw of the computer at the wall measured, or was the draw on the PSU measured? In other words, did you measure how much power the PSU drew from the wall or how much power the components drew from the PSU?
  • Aikouka - Wednesday, June 22, 2005 - link

    Wow, I'm simply amazed. I said to someone as soon as I saw this "Wow, now I feel bad that I just bought a 6800GT ... but at least they won't be available for 1 or 2 months." Then I look and see that retailers already have them! I was shocked to say the least.
  • RyDogg1 - Wednesday, June 22, 2005 - link

    But my question was "who" was buying them. I'm a hardware goon as much as the next guy, but everyone knows that in 6-12 months, the next gen is out and the price is lower on these. I mean the benches are presenting comparisons with cards that, according to the article, are close to a year old. Obviously some sucker lays down the cash, because the "premium" price is way too high for a common consumer.

    Maybe this is one of the factors that will lead to the Xbox 360/PS3 becoming the new gaming standard, as opposed to the Video Card market pushing the envelope.
  • geekfool - Wednesday, June 22, 2005 - link

    What, no Crossfire benchies? I guess they didn't want Nvidia to lose on their big launch day.
  • Lonyo - Wednesday, June 22, 2005 - link

    The initial 6800U's cost lots because of price gouging.
    They were in very limited supply, so people hiked up the prices.
    The MSRP of these cards is $600, and they are available.
    MSRP of the 6800U's was $500, the sellers then inflated prices.
  • Lifted - Wednesday, June 22, 2005 - link

    #24: In the Wolfenstein graph they obviously reversed the 7800 GTX SLI with the Radeon.

    They only reversed a couple of labels here and there, chill out. It's still VERY OBVIOUS which card is which just by looking at the performance!

    WAKE UP SLEEPY HEADS.
  • mlittl3 - Wednesday, June 22, 2005 - link

    Derek,

    I know this article must have been rushed out but it needs EXTREME proofreading. As many have said in the other comments above, the results need to be carefully gone over to get the right numbers in the right place.

    There is no way that the ATI card can go from just under 75 fps at 1600x1200 to over 100 fps at 2048x1536 in Enemy Territory.

    Also, the Final Words heading is part of the paragraph text instead of a bold heading above it.

    There are other grammatical errors too but those aren't as important as the erroneous data. Plus, a little analysis of each of the benchmark results for each game would be nice but not necessary.

    Please go over each graph and make sure the numbers are right.
