Introduction

Today marks the launch of NVIDIA's newest graphics cards: the 7900 GTX, 7900 GT, and 7600 GT. These cards are all based on an updated version of the original G70 design and offer higher performance per dollar. Today we will see just how much faster the new NVIDIA flagship part is, but first, let's take a look at what makes it different.

At the heart of this graphics launch is a die shrink. The functionality of the new parts NVIDIA is introducing is identical to that of the original G70-based lineup. Of course, to say that this is "just a die shrink" would be selling NVIDIA a little short. In the future, if either NVIDIA or ATI decides to move to TSMC's newly introduced 80nm half-node process, all that would be involved is a simple lithographic shrink. Sure, things might get tweaked a little here and there, but the move from 90nm to 80nm doesn't involve any major change in the design rules. Moving from 110nm to 90nm, by contrast, requires NVIDIA to change quite a bit of its register transfer logic (RTL), update the layout of the IC, and verify that the new hardware works as intended.
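
As a rough illustration of why the shrink matters economically, an ideal lithographic shrink scales die area with the square of the feature-size ratio. The quick calculation below is a back-of-the-envelope sketch with idealized numbers, not NVIDIA's actual figures:

```python
# Idealized area scaling for a 110nm -> 90nm shrink. Real designs never shrink
# perfectly (I/O pads, analog blocks, and routing don't scale linearly), so treat
# this as an upper bound on the savings.
old_node_nm = 110
new_node_nm = 90

linear_scale = new_node_nm / old_node_nm   # ~0.82x in each dimension
area_scale = linear_scale ** 2             # ~0.67x total die area

print(f"Linear scale: {linear_scale:.2f}x")
print(f"Area scale:   {area_scale:.2f}x "
      f"(~{(1 - area_scale) * 100:.0f}% smaller die for the same transistor count)")
```

A die that is roughly a third smaller means more chips per wafer, which is the main economic payoff of going through all of this work.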

The basic design rules used to build ICs must be updated between major process shrinks because the characteristics of silicon circuits change at smaller and smaller sizes. As transistors and wires get smaller, problems like power density and leakage get worse. Design tools often rely on standard cell libraries tailored to a specific fab process, and it isn't always possible to drop in a replacement that fits the new design rules. These and other issues mean that parts of the design and layout must change to ensure signals get from one part of the chip to another intact, without interfering with anything else. Clock routing, power management, hot-spot avoidance, and many other details must be painstakingly reworked.
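
To make the power-density point more concrete, the sketch below applies the classic first-order dynamic-power relation (P ≈ α·C·V²·f) to a hypothetical chip before and after a shrink. Every number in it is an illustrative placeholder, not a measured G70 or G71 value:

```python
# First-order dynamic (switching) power per node: activity * capacitance * V^2 * frequency.
# All inputs below are made-up placeholders chosen only to show the trend.
def dynamic_power_watts(alpha, cap_farads, volts, freq_hz):
    return alpha * cap_farads * volts ** 2 * freq_hz

NODES = 300e6  # hypothetical number of switching nodes on the chip

# Before the shrink: larger transistors, higher voltage, lower clock, bigger die.
p_old = dynamic_power_watts(0.15, 2.0e-15, 1.4, 430e6) * NODES
# After the shrink: lower capacitance and voltage, higher clock, ~67% of the area.
p_new = dynamic_power_watts(0.15, 1.6e-15, 1.2, 650e6) * NODES

area_old_mm2, area_new_mm2 = 330.0, 220.0  # hypothetical die areas
print(f"Total power:   {p_old:.0f} W -> {p_new:.0f} W")
print(f"Power density: {p_old / area_old_mm2:.2f} W/mm^2 -> {p_new / area_new_mm2:.2f} W/mm^2")
```

Even when total power drops, roughly the same watts are being dissipated from a smaller area, so hot spots and leakage (which this simple model ignores entirely) become the harder problems to manage.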

In reworking the hardware for a new process, a company must balance what it wants from the chip against what it can afford. At smaller and smaller geometries, yield concerns increasingly reach into the RTL of a circuit, and even its high-level design plays a part. Decisions that favor speed and performance can hurt yield, die size, and power consumption; conversely, maximizing yield, minimizing die size, and keeping power consumption low can hurt performance. It isn't enough to come up with a circuit that simply works: an IC design must work efficiently. Not only has NVIDIA had the opportunity to rebalance these characteristics however it sees fit, but the rules for doing so have changed from those that applied at 110nm.
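
The yield side of that balance is easiest to see with a toy model. The sketch below uses a textbook Poisson defect model with purely hypothetical die areas and defect density; it is not how NVIDIA or TSMC actually estimate yield, but it shows why a smaller die pays off twice, with more candidate dies per wafer and a higher fraction of them working:

```python
import math

def good_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300, defects_per_mm2=0.002):
    """First-order estimate: gross dies per wafer times Poisson yield, Y = exp(-A*D).
    Ignores edge losses, scribe lines, and redundancy/repair; illustration only."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    gross_dies = wafer_area / die_area_mm2
    yield_fraction = math.exp(-die_area_mm2 * defects_per_mm2)
    return gross_dies * yield_fraction

# Hypothetical large 110nm die versus the same logic shrunk to roughly 67% of the area.
for area_mm2 in (330, 220):
    print(f"{area_mm2} mm^2 die: ~{good_dies_per_wafer(area_mm2):.0f} good dies per wafer")
```

In this toy example the smaller die nearly doubles the number of good dies per wafer, which is exactly why decisions that add area or push frequency purely for performance's sake get weighed so carefully.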

After the design of the IC is updated, it still takes quite a bit of time to get from the engineers' desks to a desktop computer. After the first spin of the hardware comes back from the fab, it must be thoroughly tested. If any performance, power, or yield issues are noted from this first run, NVIDIA must tweak the design further until they get what they need. Throughout this entire process, NVIDIA must work very closely with TSMC in order to ensure that everything they are doing will work well with the new fab process. As microelectronic manufacturing technology progresses, fabless design houses will have to continue to work more and more closely with the manufacturers that produce their hardware in order to get the best balance of performance and yield.

We have made quite a case for the difficulty involved in making the switch to 90nm. So why go through all of this trouble? Let's take a look at the benefits NVIDIA is able to enjoy.

NVIDIA's Die Shrink: The 7900 and 7600

Comments

  • Z3RoC00L - Thursday, March 9, 2006 - link

    Anandtech doesn't favor ATi over nVIDIA. Have you checked out the majority of reviews? The only site that's giving nVIDIA a decisive win is HardOCP. If you want fanboism and retardation (yes, new word I invented) please feel free to visit http://www.HardOCP.com. But if you want solid benchmarks, only a few places offer them: Beyond3D, Anandtech and Firing Squad. You can also check The Tech Report & Hot Hardware. Want a list?
    - Anandtech (GeForce 7600 and 7900 series)
    - Beyond 3D (GeForce 7600 series)
    - Bjorn 3D (GeForce 7600 and 7900 series)
    - ExtremeTech(GeForce 7600 and 7900 series)
    - Firing Squad (GeForce 7900 series)
    - Firing Squad (GeForce 7600 series)
    - Guru 3D (GeForce 7600 and 7900 series)
    - Hard OCP (GeForce 7900 series)
    - Hardware Zone (ASUS GeForce 7900 GT)
    - HEXUS (GeForce 7600 and 7900 series)
    - Hot Hardware (GeForce 7600 and 7900 series)
    - Legit Reviews (XFX GeForce 7900 GTX XXX Edition)
    - NV News (eVGA GeForce 7900 GT CO)
    - PC Perspective (GeForce 7600 and 7900 series)
    - PenStar Systems (eVGA GeForce 7600 CO)
    - The Tech Report (GeForce 7600 and 7900 series)
    - Tom's Hardware Guide (GeForce 7600 and 7900 series)
    - Tweak Town (BFG GeForce 7900 GTX)
    - Club IC (French) (GeForce 7900 GT)
    - iXBT (Russian) (GeForce 7600 and 7900 series)
    - Hardware.FR (GeForce 7900 series)
    - Hardware.FR (GeForce 7600 series)

    All in all the x1900XTX comes out the winner in the high end segment when HIGH END features are used (AA and AF) and when heavy Shaders are used as well. But it's not a clear victory. Results go both ways and much like the x800XT PE vs. 6800 Ultra (with roles reversed) there will never be a clear winner between these two cards.

    I for one prefer the X1900XTX, I like the fact that it will last a tad longer and offer me better Shader performance, better performance under HDR, Adaptive AA, High Quality AF, HDR + AA, AVIVO and the AVIVO converter tool. But that's just my opinion.
  • Fenixgoon - Thursday, March 9, 2006 - link

    You do realize that the x1900 XT and XTX beat the 7800 series, right? That's all Nvidia has had until now. I'm glad to see the 7900 take the lead (though the few frames it gains generally don't matter). What concerns me is the budget market. I'd like to see both ATI and Nvidia do some more work in producing better budget cards. My x800pro is still an awesome mid-range card that can hang with many of these new series cards, minus SM3 (I bought it some months ago as a final AGP upgrade). In the end of course, stiff competition = better price/performance for us
  • Spoonbender - Thursday, March 9, 2006 - link

    Been living under a rock for the last 3 years? ATI's drivers are fine these days. I still prefer NVidia's drivers, but that's a matter of preference mainly. Quality-wise, there's only the slightest difference these days.
    And NVidia isn't all that compatible either. They've ditched support for everything up to (and including) Geforce 2 in their newer drivers. But really, who cares? I doubt you'd get much more performance out of a GF2 by using newer drivers.

    As for the bias, I'm surprised NVidia does so well in this test. I was expecting them to take a beating performance-wise.


    But geez, what you're saying is really "I don't know anything about ATI, but the fact that AT includes their cards in benchmarks means they must be evil liars..."
  • Spinne - Thursday, March 9, 2006 - link

    If you've never had experience with an ATI GPU, how qualified are you to judge their software? I've used cards made by both companies and I wouldn't badmouth ATI's drivers anymore. Ever since the Catalyst series came out, their drivers have been pretty decent. The 'Driver Gap' is highly overrated and, in my experience, untrue, at least under Windows. Under Linux, my apartment mate tells me ATI's drivers suck; then again, he's never used them, but I'd give some weight to his opinion. In any case, there's no point in buying a high-end card like this for a Linux box.
  • rgsaunders - Thursday, March 9, 2006 - link

    First of all, let me say that Anandtech is usually the first place I visit when looking for information on new hardware; however, I find that your video card reviews seem to have fallen prey to the same pattern as other review sites. Although it's nice to know how these cards perform for gaming, the vast majority of users do more than game with their machines. It would be very beneficial to those of us looking for a new video card to see results of comparative video quality for text use and photo editing as well as the normal gaming tests. In the past, I have returned video cards because of their extremely poor text quality, even though they were good for gaming. The gaming community is a vocal minority online; the vast majority of users spend a lot of time using their machines for textual processing or photo editing, etc., and a small portion of their time gaming.

    Please include the requested tests in upcoming video card reviews so as to provide a balanced, professional review of these products and stand out from all the other review sites that seem to concentrate primarily on gaming.
  • Spinne - Thursday, March 9, 2006 - link

    Can you specify what cards you've had to return due to poor text quality? As far as I know, no cards have had problems with 2D in a very, very long time. In any case, you'd have to be insane and very rich to splurge on a G71 or R580 class card for Photoshop or 2D desktop performance. It's like buying a '70 Dodge Challenger for driving to work in. I do, however, feel that AT needs to talk about image quality in 3D some. With all the different modes of AF and AA out there, and the cores themselves performing so well, IQ becomes a large factor in the decision-making process.
  • rgsaunders - Thursday, March 9, 2006 - link

    In the past I have had to return Asus and Abit GeForce-based cards due to their dubious text/2D quality. There are differences between the various cards, ATI and nVidia, dependent upon the actual manufacturer and their filter designs, and this has a noticeable effect at times on the quality of the text. I agree that IQ in 3D is important, but I do think that text and 2D IQ are also important. A G71 or R580 class card may be overkill if all you do with your computer is Photoshop or MS Office; however, for some of us the computer is a multipurpose device, used for the full gamut of applications, including occasional gaming. In the main, I usually stay a step behind the bleeding edge of video performance, as do many others. Today's bleeding edge is tomorrow's mainstream card, and unless you review everything the first time around, there is no information with respect to text and 2D IQ.
  • Zoomer - Monday, March 13, 2006 - link

    These are most likely reference cards, and reference cards from nVidia have in the past proven to output a much better signal than what will be produced later on, especially once the price cutting starts.
  • Zoomer - Monday, March 13, 2006 - link

    One more thing.

    Derek, why don't you guys take the time required to produce a nice review? Is it really necessary to get the article up on the day of the launch? If you got the cards late, bash the company for it, and take all the time you need to do a proper review like those AT has done in the past.

    Reviews with just benchmarks and paraphrased press release info are REALLY boring and a turn-off. For example, I couldn't bear to look at the graphs as they weren't relevant. I skipped right to the end.

    Whatever happened to overclocking investigations? Testing for core/memory bottlenecks by tweaking the frequencies? Such information is USEFUL, as it means all those with the same card out there DO NOT have to repeat it for themselves. Recall AT's TNT/GF2-era articles. If my memory is correct, there were pages of such investigation, and a final recommendation was made to push the memory clock to its limit first and then raise the core clock.

    Image quality comparisons like those done for the Radeon 32 DDR, R200, etc. are almost absent.

    Quality of components used? Granted, this is moot for engineering sample cards, but an investigation of the cooling solution would be good. Reliability and noise of the cooling solution should be included. Are these ultra-fine fins dust traps? Is that small high-RPM screamer a possible candidate for early failure?

    Performance is only one small part of the whole picture. Everyone and their dog publishes graphs. However, only a select few go beyond that, and even fewer of those have the trust of many.
  • Questar - Thursday, March 9, 2006 - link

    According to Hardocp, the 7900 has horrible texture shimmering issues.
