Introduction

A vast expanse of destruction lies before you. Billowing blue smoke rises from the ashes of the destroyed city, and flames continue to lick towards the sky. The horizon shimmers from the heat waves and smoke emanating from the rubble. As you proceed into the wreckage, your boots splash through puddles, sending out ripples and churning up the ashes. One of the buildings appears to have escaped most of the force of the blast, so you head towards it hoping to find some shelter and a place to relax for a moment.

A glint of light reflects off the cracked windows, and you instinctively dive to the ground. A split second later, the glass shatters and fragments rain down around you as the bullet misses its intended mark. You roll to the side and watch as dirt and rubble plume into the air from the spot you so recently occupied. As you marvel at the small particles of dirt scattering into the air, you realize it's already too late; you're too far from cover and the sniper is skilled. As your body slams into the ground and the scene fades to black, you're glad to know that this was only a game, regardless of how lifelike it appears...


That's not a description of any actual game, but it could be in the very near future judging by the progress we continue to see on the graphics front. The attempt to bring such visions to life is reason enough for us to encourage and revere continued excellence in the field of computer graphics. The ongoing struggle between ATI and NVIDIA to bring forth the most parallel and powerful GPUs at reasonable prices opens new possibilities to developers, pushing them to create content beyond the realm of dreams and move onto ground where angels fear to tread: reality. With each successive generation we work our way closer and closer to blurring the line between reality and rendering, while every step leaves us wanting more. Once again it is time to check in on our progress down the infinite road to graphical perfection.

The latest offering from NVIDIA does not bring a host of new features or upgraded shader model support, as the past few generations have. The NV4x architecture remains a solid base for this product, as the entire DirectX 9 feature set was already fully supported in hardware. Though the G70 (yes, the name change was just to reconcile code and marketing names) is directly based on the NV4x architecture, there are quite a few changes to the internals of the pipelines, as well as an overall increase in the width and clock speed of the part. This update much resembles what we saw when ATI moved from R300 to R420: most of the features and block diagrams are the same as last year's part, with a few revisions here and there to improve efficiency.

One of the most impressive aspects of this launch is that the part is available now. I mean right now. Order it today and plug it in tomorrow. That's right, not only has NVIDIA gotten the part to vendors, but vendors have gotten their product all the way to retailers. This is unprecedented for any graphics hardware launch in recent memory. In the midst of all the recent paper launches in the computer hardware industry, this move is a challenge to all other hardware design houses.

ATI is particularly on the spot after today. Their recent history of announcing products that don't see any significant volume in the retail market for months is disruptive in and of itself. Now that NVIDIA has made this move, ATI absolutely must follow suit. Over the past year, the public has grown quite tired of failed assurances that product would be available "next week". This very refreshing blast of availability is long overdue. ATI cannot afford to have R520 availability "soon" after launch; ATI must have products available for retail purchase at launch.

We do commend NVIDIA for getting product out there before launching it. But now we move on to the least pleasant side of this launch: price. The GeForce 7800 GTX will cost a solid $600. Of course, we do expect retailers to charge a premium for the early adopters. Prices we are seeing at launch are on the order of $650. This means those who want to build an SLI system based on the GeForce 7800 GTX will be paying between $1200 and $1300 just for the graphics component of their system.

So, what exactly is bigger, better, and faster this time around? And more importantly, what does that mean for game performance and quality (and is it worth the price)? This is the right place to find the answers. As developers continue to grow in shader prowess, we expect to see hardware of this generation stretch its legs even more; NVIDIA believes this is the point where pure math and shader processing power become the most important factors in graphics hardware.

The Pipeline Overview
127 Comments

  • Alx - Wednesday, June 22, 2005 - link

    Face it, this launch isn't gonna hurt anyone except people with minds too small to accept that there is simply one more option than there was before. If you liked pc gaming yesterday, then there is no reason why this launch should make you stop liking it today. Unless you're a retarded buttbaby who can't handle choices. In that case please get a console and stop coming to this site.
  • mlittl3 - Wednesday, June 22, 2005 - link

    #82, Wesley

    Well that sucks that ya'll have lost your web editor for awhile. Especially when there is so much cool hardware coming out around now. In our research lab, we pass around our publications and conference posters to others in the group so that a fresh pair of eyes see them before they go live or to the journal editor. But of course, everyone else at AT is also busy so oh well.

    Good work guys and I look forward to the "new CPU speed bump" article (or FX-57 for those not under NDAs).

    Mark

    PS. If ya'll have an opening for another web editor, you should hire #84 (ironchefmorimoto). I hear he can cook really well.
  • AtaStrumf - Wednesday, June 22, 2005 - link

    Nicely corrected Derek, I think there are just a few typos left, like this one (**):

    Page 20
    Power Consumption
    We measured power consumption between the power supply and the wall. This multiplies essentially amplifies any differences in power draw because the powersupply is not 100% efficient. Ideally we would measure power draw of the card, but it is very difficult **determine** to determine the power draw from both the PCIe bus and the 12V molex connector.

    AND a few double "Performances" in the title (Performance Performance) starting with page 10.

    Nice card nVidia!!! I hope ATi isn't too far behind though. Crossfire --> cheap SLi ;-) I need a nice midrange product out by September when it'll be time to upgrade to a nice E6 stepping S939 A64 and something to take the place of my sweet old GF2 MX (I'm not kidding, I sold my 6600GT AGP, and now I'm waiting for the right time to move to PCIe).
  • IronChefMoto - Wednesday, June 22, 2005 - link

    Amen -- you guys work hard on your articles. Keep up the great work. And don't f*cking bother the web editor. We...er...they don't get enough vacation as it is.

    IronChefMorimoto
    (another web editor who needs a break)
  • Wesley Fink - Wednesday, June 22, 2005 - link

    Derek was too modest to mention this in his comments, but I think you should know all the facts. Our Web Editor is on vacation and we are all doing our own HTML and editing for the next 10 days. In our usual process, the article goes from an Editor to the Web Editor who codes the article, checks the grammar, and checks for obvious content errors. Those steps are not in the loop right now.

    The next thing is NDA's and launches. We are always under the gun for launches, and lead times seem to get shorter and shorter. Derek was floating questions and graphs last night at 3 to 4 AM with an NDA of 9AM. Doing 21 pages of meaningful commentary in a short time frame, then having to code it in HTML (when someone else normally handles that task), is not as easy as it might appear.

    I do know Derek as a very conscientious Editor and I would ask that you please give him, and the rest of us, a little slack this next week and a half. If you see errors please email the Editor of the article instead of making it the end of the world in these comments. I assure you we will fix what is wrong. That approach, given the short staff, would be a help to all of us. We all want to bring you the information and quality reviews you want and expect from AnandTech.
  • IronChefMoto - Wednesday, June 22, 2005 - link

    #79 -- But why wouldn't it be a high quality article, mlittl3? I thought you told me that AT was infallible? Hmmm? ;-)
  • Houdani - Wednesday, June 22, 2005 - link

    Thanks for the refresh, Derek. I went back and took a peek at the revised graphs. I have a couple of comments on this article before you move on to the next project.

    >> When the Splinter Cell page was refreshed, the graph for 20x15x4 apparently disappeared.

    >> When you removed the SLI's from the Guild War page, it looks like the 7800GTX changed from 50.5 to 55.1 (which is the score previously given to the 6800 Ultra SLI).

    >> Several of the pages have scores for no AA benches listed first, while other pages have scores for the 4xAA listed first. While the titles for the graphs are correct, it's a little easier to read when you stay consistent in the ordering. This is a pretty minor nit-pick, though.

    >> Thanks for updating the transparency images to include mouseover switches ... quite handy.
  • fishbits - Wednesday, June 22, 2005 - link

    "They priced themselves into an extremely small market, and effectively made their 6800 series the second tier performance cards without really dropping the price on them. I'm not going to get one, but I do wonder how this will affect the company's bottom line."

    The 6800s were "priced into an extremely small market." How'd that line turn out? I don't imagine they've released this product with the intention of losing money overall. Why do you think retailers bought them? Because they know the cards won't sell and they're happy to take the loss? It's already been proven that people will pay for you to develop and sell a $300, wait $400, wait $500 video card. It's already been proven that people will pay a $100+ premium for cards that are incrementally better, not just a generation better. Sounds like this target is a natural, especially knowing it'll eventually fall into everyone else's purchasing ability.

    Being able to say you have the bar-none best card out there by leaps and bounds is certainly worth something. Look at all the fanboys that are out there. Every week or month you're able to stay on top of the benches means you get more people who'll swear by your products no matter what for years to come. Everyone you can entice into buying your card who sees it as a good product will buy your brand in the future as a preference, all other options being equal. I could be wrong, but suspect Nvidia's going to make money off this just fine.
    -----------------------
    "I am proud that our readership demands a quality above and beyond the norm, and I hope that that never changes. Everything in our power will be done to assure that events like this will not happen again."

    See... that's why I'm a big fan of the site.

  • mlittl3 - Wednesday, June 22, 2005 - link

    #78, I bet you didn't even read the article. How do you know it demonstrated editorial integrity?
  • IronChefMoto - Wednesday, June 22, 2005 - link

    #23 (mlittl3) still can't pronounce "Penske" and "terran" right, regardless of the great editorial integrity demonstrated by the AT team today. Thanks!
