Introduction

A vast expanse of destruction lies before you. Billowing blue smoke rises from the ashes of the destroyed city, and flames continue to lick towards the sky. The horizon shimmers from the heat waves and smoke emanating from the rubble. As you proceed into the wreckage, your boots splash through puddles, sending out ripples and churning up the ashes. One of the buildings appears to have escaped most of the force of the blast, so you head towards it hoping to find some shelter and a place to relax for a moment.

A glint of light reflects off the cracked windows, and you instinctively dive to the ground. A split second later, the glass shatters and fragments rain down around you as the bullet misses its intended mark. You roll to the side and watch as dirt and rubble plume into the air from the spot you so recently occupied. As you marvel at the small particles of dirt scattering into the air, you realize it's already too late; you're too far from cover and the sniper is skilled. As your body slams into the ground and the scene fades to black, you're glad to know that this was only a game, regardless of how lifelike it appeared...


That's not a description of any actual game, but it could be in the very near future judging by the progress we continue to see on the graphics front. The attempt to bring such visions to life is reason enough for us to encourage and revere continued excellence in the field of computer graphics. The ongoing struggle between ATI and NVIDIA to bring forth the most parallel and powerful GPUs at reasonable prices opens new possibilities to developers, pushing them to create content beyond the realm of dreams and onto ground where angels fear to tread: reality. With each successive generation we work our way closer and closer to blurring the line between reality and rendering, while every step leaves us wanting more. Once again it is time to check in on our progress down the infinite road to graphical perfection.

The latest offering from NVIDIA does not introduce a host of new features or upgraded shader model support as the past few generations have. The NV4x architecture remains a solid base for this product, as the entire DirectX 9 feature set was already fully supported in hardware. Though the G70 (yes, the name change was just to reconcile code and marketing names) is directly based on the NV4x architecture, there are quite a few changes to the internals of the pipelines as well as an overall increase in the width and clock speed of the part. This update closely resembles what we saw when ATI moved from R300 to R420, in that most of the features and block diagrams are the same as last year's part with a few revisions here and there to improve efficiency.

One of the most impressive aspects of this launch is that the part is available now. I mean right now. Order it today and plug it in tomorrow. That's right, not only has NVIDIA gotten the part to vendors, but vendors have gotten their product all the way to retailers. This is unprecedented for any graphics hardware launch in recent memory. In the midst of all the recent paper launches in the computer hardware industry, this move is a challenge to all other hardware design houses.

ATI is particularly on the spot after today. Their recent history of announcing products that don't see any significant volume in the retail market for months is disruptive in and of itself. Now that NVIDIA has made this move, ATI absolutely must follow suit. Over the past year, the public has been getting quite tired of failed assurances that product will be available "next week". This very refreshing blast of availability is long overdue. ATI cannot afford to have R520 availability "soon" after launch; ATI must have products available for retail purchase at launch.

We do commend NVIDIA for getting product out there before launching it. But now we move on to the least pleasant side of this launch: price. The GeForce 7800 GTX will cost a solid $600. Of course, we do expect retailers to charge a premium for the early adopters. Prices we are seeing at launch are on the order of $650. This means those who want to build an SLI system based on the GeForce 7800 GTX will be paying between $1200 and $1300 just for the graphics component of their system.

So, what exactly is bigger, better, and faster this time around? And more importantly, what does that mean for game performance and quality (and is it worth the price)? This is the right place to find the answers. As developers continue to grow in shader prowess, we expect to see hardware of this generation stretch its legs even more, as NVIDIA believes this is the point where pure math and shader processing power will become the most important factor in graphics hardware.

Comments

  • WaltC - Thursday, June 23, 2005 - link

    I found this remark really strange and amusing:

    "It's taken three generations of revisions, augmentation, and massaging to get where we are, but the G70 is a testament to the potential the original NV30 design possessed. Using the knowledge gained from their experiences with NV3x and NV4x, the G70 is a very refined implementation of a well designed part."

    Oh, please...nV30 was so poor that it couldn't even run at its factory speeds without problems of all kinds--which is why nVidia officially cancelled nV30 production after shipping a mere few thousand units. JHH, nVidia's CEO, went on record saying, "nV30 was a failure" [quote, unquote] at the time. nV30 was *not* the foundation for nV40, let alone the G70.

    Indeed, if anything could be said to be foundational for both nV40 and G70, it would be ATi's R3x0 design of 2002. G70, imo, has far more in common with R300 than it does nV30. nV30, if you recall, was primarily a DX8 part with some hastily bolted on DX9-ish add-ons done in response to R300 (fully a DX9 part) which had been shipping for nine months prior to nV30 getting out of the door.

    In fact, ATi owes its meteoric rise to #1 in the 3d markets over the last three years precisely to the R3x0 products which served as the basis for its later R4x0 architectures. Good riddance to nV3x, I say.

    I'm always surprised at the short and selective memories displayed so often by tech writers--really makes me wonder, sometimes, whether they are writing tech copy for their readers or PR copy at the behest of specific companies, if you know what I mean.
  • JarredWalton - Thursday, June 23, 2005 - link

    98 - As far as I know, the power was measured at the wall. We use a device called "Kill A Watt", and despite the rather lame name, it gives accurate results. It's almost impossible to measure the power draw of any single component without some very expensive equipment - you know, the stuff that AMD and Intel use for CPUs. So under load, the CPU and GPU (and RAM and chipset, probably) are using far more power than at idle.
  • PrinceGaz - Thursday, June 23, 2005 - link

    I agree, starting at 1600x1200 for a card like this was a good idea. If your monitor can only do 1280x1024, you should consider getting a better one before buying a card like the 7800gtx. As a 2070/2141 owner myself, I know that a good monitor capable of high resolutions is a great investment that lasts a helluva lot longer than graphics cards, which are usually worthless after four or five years (along with most other components).

    I'm surprised that no one has moaned about the current lack of an AGP version, to go with their Athlon XP 1700+ or whatever ;)
  • Johnmcl7 - Thursday, June 23, 2005 - link

    I think it was spot on to have 1600x1200 as the minimum resolution, given the power of these cards I think 1024x768, no AA/AF results for 3Dmark2003/2005 which have been thrown around are a complete waste of time.

    John
  • Frallan - Thursday, June 23, 2005 - link

    Good review... And re: the NDA deadlines and the sleepless nights - don't sweat it if a few mistakes are published. The readers here have their heads screwed on the right way and will find the issues soon enough. And for everyone that does not do 12*16 or 15*20 the answer is simple - U Don't Need The Power!! Save your hard-earned money and get a 6800GT instead.
  • Calin - Thursday, June 23, 2005 - link

    Maybe if you could save the game, change the settings and reload it, you could obtain images from exactly the same positions. In one of the fence images, the distance to the fence is quite a bit different in different screenshots.
  • Calin - Thursday, June 23, 2005 - link

    You had a 7800 SLI setup? I hate you all
    :p
  • xtknight - Thursday, June 23, 2005 - link

    Edit: last post correction: actually 21-page report!
  • xtknight - Thursday, June 23, 2005 - link

    Jeez...a couple spelling errors here and there...who cares? I'd like to see you type up a 12-page report and get it out the door in a couple days with no grammatical or spelling errors, especially when your main editor is gone. Remember that English study that showed the human brain interpreted words based on patterns and not spelling?

    I did read the whole review, word-for-word, with little to no trouble. There was not a SINGLE thing I had trouble comprehending. It's a better review than most sites have done which test lower resolutions. I love the non-CPU-limited benchmarks here.

    One thing that made me chuckle was "There is clearly a problem with the SLI support in Wolfenstein 3D". That MS-DOS game is in dire need of SLI. (It's abbreviated Wolfenstein: ET. Wolf3D is an oooold Nazi game.)
  • SDA - Thursday, June 23, 2005 - link

    Derek or Jarred or Wesley or someone:

    Did you measure system power consumption as how much power the computer drew from the wall, or how much power the innards drew from the PSU?


    #95, it's a good thing you know enough about running a major hardware site to help them out with your advice! :-)
