The Definitive Fall Refresh

After NVIDIA released the TNT2 Ultra, we saw the first incarnation of the now common “6-month product cycle.” The strategy is the same one used to dethrone 3dfx, and is based on a very simple principle: parallel design teams. If you have three design teams (one working on the current-generation product, one on the 6-month follow-up, and one on the next-generation solution), then, assuming all teams work efficiently, you should be able to maintain a stream of GPU releases at 6-month intervals.

To make the job a bit easier, you only work on inventing new architectures every 12 months, giving you a little break in the hectic lifestyle of a GPU design engineer. But in order to remain competitive you have to have a product every 6 months, so in the time between architectures you simply “refresh” your current-generation architecture. A refresh is generally a higher-clocked GPU, potentially with some faster memory, made possible by greater experience manufacturing that particular GPU (yields improve over time) and the availability of faster memory. Sometimes we also get advancements in process technology that allow for a boost in clock speed.

When NVIDIA introduced the 6-month product cycle the idea was that new architectures would debut in the Fall, and refresh products would hit in the Spring. The delay of NV20 (GeForce3) changed things around a bit and the GeForce2 Ultra became the first “Fall refresh” product. Since then, little attention has been paid to when various GPUs hit, as long as we get something new every 6 months we’re happy. Earlier this year we heard that both ATI and NVIDIA would be releasing their true next-generation hardware next Spring, leaving this Fall as the refresh cycle.

ATI’s high-end refresh was the Radeon 9800 XT, and as you can guess, their midrange refresh is the new Radeon 9600 XT. Much like the Radeon 9800 XT, the 9600 XT adds only two things: a higher clock speed and support for OverDrive.

The Radeon 9600 XT GPU now runs at 500MHz, a 25% increase in clock speed over the 9600 Pro’s 400MHz clock. The memory speed of the Radeon 9600 XT remains at 300MHz DDR (effectively 600MHz), so there is no increase in memory bandwidth over its predecessor.
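As a quick sanity check on those numbers, peak memory bandwidth can be computed from the memory clock, the DDR factor, and the bus width. The sketch below assumes the 9600 series' 128-bit memory bus; the function name and structure are illustrative, not from any vendor tool.

```python
# Minimal sketch: peak memory bandwidth from clock, DDR factor, and bus width.
# Assumes a 128-bit bus for the 9600 series; numbers are illustrative.

def peak_bandwidth_gbps(mem_clock_mhz: float, ddr_factor: int, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    bytes_per_transfer = bus_bits / 8            # 128 bits -> 16 bytes per transfer
    transfers_per_sec = mem_clock_mhz * 1e6 * ddr_factor
    return transfers_per_sec * bytes_per_transfer / 1e9

# Radeon 9600 XT and 9600 Pro: 300MHz DDR (600MHz effective) on a 128-bit bus
print(peak_bandwidth_gbps(300, 2, 128))  # 9.6 GB/s for both cards
```

Since the memory clock and bus width are unchanged from the 9600 Pro, the XT's bandwidth works out identical, which is why the extra core clock can go to waste in bandwidth-limited scenarios.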

The hefty increase in clock speed is due to improvements in process technology as well as the introduction of a low-k dielectric. As we briefly explained in our 9800 XT review, the benefits of a low-k dielectric are mainly related to shielding from crosstalk in high transistor density chips, which gives us the clock speed boost we see with the 9600 XT. Because we’re just talking about an increase in core clock speed, the games to receive the biggest performance boost from the XT would be those that are GPU-limited, which unfortunately are few and far between these days. Games that are largely shader bound, such as Half-Life 2, will definitely enjoy the 9600 XT’s increase in clock speed, but for now we’ll see most of the performance benefits go to waste.

We explained OverDrive technology in our Radeon 9800 XT review and tested it in our Catalyst 3.8 driver update. The Radeon 9600 XT includes an on-die thermal diode that measures the temperature of the core; when the temperature is cool enough the driver will instruct the core to overclock itself by a set margin. The Radeon 9600 XT will run at one of three speeds depending on its temperature: 500MHz, 513MHz or 527MHz. The combination of this driver and hardware support makes up ATI’s OverDrive feature.
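The driver's side of that mechanism amounts to a simple temperature-to-clock lookup. A minimal sketch of the idea follows; note that ATI has not published the actual trip-point temperatures, so the two thresholds here are purely hypothetical placeholders, and only the three clock steps come from the article.

```python
# Hedged sketch of OverDrive's clock selection. The two temperature
# thresholds are HYPOTHETICAL; only the 500/513/527MHz steps are from ATI.

COOL_C = 50   # assumed: below this, the driver picks the top clock step
WARM_C = 60   # assumed: below this, the driver picks the middle step

def overdrive_clock(core_temp_c: float) -> int:
    """Pick one of the 9600 XT's three OverDrive clock steps (MHz)."""
    if core_temp_c < COOL_C:
        return 527
    if core_temp_c < WARM_C:
        return 513
    return 500  # fall back to the stock clock when the core runs hot

print(overdrive_clock(45))  # 527
print(overdrive_clock(70))  # 500
```

In the real driver the temperature is read from the on-die thermal diode and re-evaluated periodically, so the clock can step back down as the core heats up under load.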

OverDrive is currently not enabled for the Radeon 9600 XT in the Catalyst 3.8 drivers; we will have to wait for the Catalyst 3.9s before we can test the 9600 XT with OverDrive. If you’re curious about the performance implications of enabling OverDrive, have a look at our Catalyst 3.8 review – it’s nothing to get too excited about.


70 Comments


  • Anonymous User - Thursday, October 16, 2003 - link

    There is something in the review that isn't clear to me. The 9800XT article made much of the dynamic overclocking feature, while one of the benefits touted for the 9600XT was its .13 process, making it run cooler, which should help overclocking.

    Yet the article mentioned neither dynamic overclocking nor made any attempt to overclock. This should have been done!

    rms
  • Anonymous User - Thursday, October 16, 2003 - link

    #29, get a life. All big web sites use flash, you'd be idiotic not to. This isn't slashdot, where you can bitch and moan about how evil MS is, how great Linux, and how your pimples pop every time you eat too quickly.
  • AlteX - Thursday, October 16, 2003 - link

    I think there's a problem with testing all these cards on the same machine. Of course, this gives a good competitive analysis, but is this what we really want to see?

    I, for one, want to buy a value gaming system in a month or two. Being a value SYSTEM (not a high-end system with just a value Gfx card), it definitely won't include anything near an Athlon64 FX or even DDR400. It will most probably include some mid-range Athlon XP (2500+ or so), DDR333, etc. A Radeon 9600 class card would be a perfect fit for such a system.

    And what I'd really like to know is not how the Radeon 9600XT compares to the Radeon 9800XT on a high-end machine, but how it compares to other mainstream cards on a mainstream machine. Also, I'd like to know how playable each game is on each card. Meaning: what are the maximum IQ settings (resolution, AA/AF settings) that I can use and still get a MINIMUM framerate of at least 25-30 FPS?

    Thanks.
  • Anonymous User - Thursday, October 16, 2003 - link

    I have returned 2 different 9600 Pro cards (Club3D and Hercules) because at high resolutions they show a dark shadow to the right of (black on white) text. At 2048x1536@85 it is terrible. A 9000 Pro does not have this problem. I wonder whether the 9600 XT has this same problem, or whether it has been fixed, maybe because of the new process.

    The problem is most visible using a pattern of alternating black and white pixels, like this:
    ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
  • Anonymous User - Thursday, October 16, 2003 - link

    The truth of the matter is that the nvidia cards are more technologically advanced than the ati cards...(btw I originally owned the 5900u, sold it when I saw the hl2 benchmarks and bought a 9800p at best buy, returned it 4 days ago and picked up the 9800xt :) )...I have owned the tnt2, geforce2pro, geforce 4 ti4400, and the 5900u over the course of the last 4 years. In the case of the 5900u vs the 9700/9800 series, the 9800 series is a better card than the 5900u. Why? Nvidia dropped the ball; even with a better manufacturing process and higher core and memory speeds, they weren't able to match ati's performance due to the fact that they render at 32bit instead of 24...basically they render at a higher precision but they are too slow at 32 for it to help. If they redesign the core of the chipset to have more shaders I think the nv40 will be an awesome card. I'm hoping they do, because ATI's customer service is the worst I have ever experienced.
  • Pointwood - Thursday, October 16, 2003 - link

    What about noise? This is more or less the most important info and I didn't find any info about that.

    If the card isn't close to silent, it's worthless to me.
  • Anonymous User - Thursday, October 16, 2003 - link

    The 5200 provides DirectX 9 at a low price.
  • Anonymous User - Thursday, October 16, 2003 - link

    #37
    I couldn't agree with you more...I want the same for my 9700.

    I have to agree with #5
    "Telling people to wait on the 5700 Ultra doesn’t make much sense."

    Seems like paid advertisement to me.

    Wait...blah... I heard that a lot when the 9700 came out and people said wait for NV30. Then again when 9800 came out and people said, wait for NV 40.

    If people are going to buy a card you can wait for something... 1-2 weeks maybe, but damn, if I got the money NOW and I plan to buy a solution NOW, why in the world can't we get a good recommendation of what is available NOW, or in the immediate future? I can understand the 9600pro vs XT dilemma, but not when the other option is still a ghost without any presence as we speak.
  • Anonymous User - Wednesday, October 15, 2003 - link

    I just want to add my vote to include the 9800SE in future benchmarks. This is looking like the card I will buy to play DX8 and DX9 games, and is within my budget (~$170). Actually, I can't find a better performance/price ratio.
  • Anonymous User - Wednesday, October 15, 2003 - link

    Not that I want to steal the topic of the thread, but I was wondering about those highly promoted cards from XGI with Volari GPU(s). Has anyone had a chance to use them? If so, how do they fare in comparison to the market leaders?
