The Newcomers

As we briefly mentioned, there are three new products to talk about today: the Radeon 9800 XT, the Radeon 9600 XT and NVIDIA's NV38.

The XT line of Radeon 9x00 cards is targeted squarely at the very high end of the gaming market. With AMD pushing the Athlon 64 FX and Intel the Pentium 4 Extreme Edition, it's not too surprising to see even more companies going in this direction. With an ultra-premium part like the Radeon 9800 XT, the profit margins are high and, more importantly, the PR opportunities are huge; claiming the title of world's fastest desktop GPU never hurts.

The effort required to produce a part like the Radeon 9800 XT is much lower than that of a serious redesign. When making any kind of chip (CPU, GPU, chipset, etc.), the design team is usually given a cutoff point after which no more changes can be made; that frozen design is what goes into production. However, it is very rare that manufacturers get things right on the first try. Process improvements and the optimization of critical paths within a microprocessor are both time-intensive tasks that require a good deal of experience.

Once ATI's engineers had more time and experience with the R350 core, they began to see where the limits on the GPU's clock speed lay; remember that a processor can only run as fast as its slowest path, so it makes a great deal of sense to rework the layout and optimize the use of transistors to speed up those slow paths within the GPU. This, in oversimplified terms, is what ATI and their foundry engineers have been working on, and the results are embodied in the R360 – the core of the Radeon 9800 XT.
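
To make the critical-path point concrete, here is a minimal sketch of how the slowest timing path sets the ceiling on clock speed; the path names and delays are invented for illustration and are not ATI's actual figures.

```python
# Illustrative only: hypothetical timing paths inside a GPU pipeline stage.
# The clock period must be at least as long as the slowest (critical) path,
# so trimming delay from that one path is what raises the attainable clock.
path_delays_ns = {
    "texture_address": 2.10,
    "shader_alu":      2.43,   # the critical path in this made-up example
    "raster_setup":    1.85,
}

critical_path_ns = max(path_delays_ns.values())
f_max_mhz = 1000.0 / critical_path_ns          # period in ns -> frequency in MHz

print(f"Critical path: {critical_path_ns:.2f} ns -> max clock ~{f_max_mhz:.0f} MHz")

# Reworking the layout to shave 0.1 ns off the shader_alu path raises the ceiling:
optimized_ns = critical_path_ns - 0.10
print(f"After optimization: {optimized_ns:.2f} ns -> ~{1000.0 / optimized_ns:.0f} MHz")
```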

The Radeon 9800 XT is able to run at a slightly higher core frequency of 412MHz, quite impressive for ATI's 0.15-micron chip (yes, this is the same process the original R300 was built on). Keep in mind that the Radeon 9800 Pro ran at 380MHz; this 8% increase in clock speed suggests ATI is beginning to reach the limits of what the 0.15-micron process can do.

The Radeon 9800 XT receives a boost in memory speed as well, now boasting a 365MHz DDR memory clock (730MHz effective) – an increase of 7% over the original Radeon 9800 Pro and 4% over the 256MB 9800 Pro. ATI was much prouder of the core clock improvements, since it is raw GPU speed that we will begin to crave once more shader-intensive games arrive.
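
For the curious, the percentages quoted above check out against the commonly listed 9800 Pro memory clocks of 340MHz (128MB) and 350MHz (256MB); a quick sanity check, with those figures taken as the assumed baselines:

```python
# Sanity check of the clock-speed increases quoted above. The 380MHz core clock
# and the 340/350MHz memory clocks of the two Radeon 9800 Pro variants are the
# assumed baselines; results are rounded to whole percentages.
def pct_increase(new_mhz, old_mhz):
    return (new_mhz - old_mhz) / old_mhz * 100

print(f"Core:   380 -> 412 MHz = +{pct_increase(412, 380):.0f}%")  # ~8%
print(f"Memory: 340 -> 365 MHz = +{pct_increase(365, 340):.0f}%")  # ~7% vs 128MB 9800 Pro
print(f"Memory: 350 -> 365 MHz = +{pct_increase(365, 350):.0f}%")  # ~4% vs 256MB 9800 Pro
```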

The Radeon 9800 XT does have a thermal diode (mounted on-package, but not on-die) with a driver interface that allows the card to automatically increase its core speed when thermal conditions are suitable. The GPU will never drop below its advertised 412MHz clock speed, but it can reach speeds of up to 440MHz as far as we know. The important thing to note here is that ATI fully warrants this overclocking support, an interesting move indeed. Obviously, they only guarantee the overclock when it is performed automatically by the drivers, as they do not rate the chips to run at the overclocked speed under all conditions.

The OverDrive feature, as ATI likes to call it, will be enabled through the Catalyst 3.8 drivers and we’ll be sure to look into its functionality once the final drivers are made available.
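
ATI hasn't detailed exactly how the driver decides when to raise the clock, but the behavior described above (never below the advertised 412MHz, up to roughly 440MHz when the card runs cool enough) maps onto a simple temperature-gated control loop. Below is a minimal sketch with invented thresholds, step size and driver hooks; it is illustrative only, not ATI's actual implementation.

```python
# Illustrative sketch of an OverDrive-style, temperature-gated clock loop.
# Thresholds, step size and the two driver hooks below are invented for the
# example; ATI's actual implementation details are not public.
import time

BASE_CLOCK_MHZ = 412   # advertised clock; never drop below this
MAX_CLOCK_MHZ  = 440   # upper bound reported for OverDrive
STEP_MHZ       = 7     # how far to move the clock per adjustment
RAISE_BELOW_C  = 60    # raise the clock while the diode reads cooler than this
LOWER_ABOVE_C  = 75    # back off toward the base clock above this

def read_thermal_diode_c():
    """Hypothetical hook: read the on-package thermal diode, in Celsius."""
    raise NotImplementedError

def set_core_clock(mhz):
    """Hypothetical hook: reprogram the core clock through the driver."""
    raise NotImplementedError

def overdrive_loop():
    clock = BASE_CLOCK_MHZ
    while True:
        temp = read_thermal_diode_c()
        if temp < RAISE_BELOW_C and clock < MAX_CLOCK_MHZ:
            clock = min(clock + STEP_MHZ, MAX_CLOCK_MHZ)
        elif temp > LOWER_ABOVE_C and clock > BASE_CLOCK_MHZ:
            clock = max(clock - STEP_MHZ, BASE_CLOCK_MHZ)
        set_core_clock(clock)
        time.sleep(1.0)   # poll roughly once per second
```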

The Radeon 9800 XT will be available in the next month or so and will be sold in a 256MB configuration at a price of $499 – most likely taking the place of the Radeon 9800 Pro 256MB.


263 Comments


  • Anonymous User - Saturday, October 4, 2003 - link

    The MS flight sim tests might have v-sync enabled. That would explain the strange test results.
  • dswatski - Saturday, October 4, 2003 - link

    AND: Age of Mythology. AND: Rendering with Adobe Premiere Pro with support for a second monitor.
  • Anonymous User - Saturday, October 4, 2003 - link

    gf
  • Rogodin2 - Saturday, October 4, 2003 - link

    That was a pathetic review because there were way too many variables, and the fact that Anand stated that there were no valid premises to reach a conclusion should have been taken to heart before he decided to publish such a POS as this.

    rogo
  • Anonymous User - Friday, October 3, 2003 - link

    This info is simply unofficial, as DX doesn't want to stir up the industry more than has already been done. As some might recall, 3dfx was given the same ultimatum back in '99, yet the news wasn't even released until 2 years later after
  • Anonymous User - Friday, October 3, 2003 - link

    So by all means, Do Not Download the Detonator 50 Drivers!!! Along with this, NV has been caught cheating on benchmarks, as they usually do, over at AnandTech. Notice that all of the real-world benchmarks perform better on ATi, yet all synthetic benchmarks perform better by a large margin on NV hardware. "These violations are inexcusable," said a DX employee, and I'd have to agree. So without the inside drive on DX10, NV will not be able to even optimize their cards as ATi can, and will probably fall into bankruptcy just as 3dfx did before them...
  • Anonymous User - Friday, October 3, 2003 - link

    NVIDIA out of DX10? Discuss
    There's an interesting link on Gearbox Software's forums that claims NVIDIA has been shunned by Microsoft's DirectX team for future versions of the API - Thanks SidiasX!

    Nvidia's NV38 (along with the rest of the FX series) has been dubbed a substandard card by team DX. This means that DX will not include NV in its development range for DirectX 10. Team DX made the decision "as a favor to the graphics industry". Team DX claims that NV violated their partnership agreement by changing the DX9 code with their latest set of drivers, as caught by Xbit Labs recently. This violates the licensing agreement and compromises DX's quality in order to make it seem as if ATi and NV cards alike display the same image quality (which would be really bad in this case). This can only be fixed by reinstalling DX9b.

    ATI's "Development Agreement"


    It's looking bad for Nvidia..
  • Anonymous User - Friday, October 3, 2003 - link

    The CPU is not out.
    The NV38 is not out.
    The new 52.14 drivers are not out,
    and these drivers have issues and probably IQ degradation.

    The tests should go up to at least 1600 x 1200;
    we should stress video cards, not CPUs.
    DX9 needs to be included in the benches.

    I know what my next card will be:
    ATI will be replacing my Nvidia soon.

    I want to play HL2 and TR: AoD (I love the game).

    I remember, years ago, when ATI came out with a faster card and the next day Nvidia had a new driver that increased performance by 25%.

    I'm still disgusted by the cheat drivers with bad IQ and poor DX9.




