DX9 Gaming Performance


[Benchmark charts: Gaming Performance]

We first saw the Far Cry anomaly in past testing of nVidia PCI Express graphics cards. nVidia's PCI Express cards do not perform as well in Far Cry as their comparable AGP cards. This is not the pattern we have seen with ATI, whose cards perform similarly in Far Cry in either AGP or PCIe flavors.

Keeping in mind that all other Athlon 64 benchmarks were run with an FX53, it is interesting to see that the nVidia PCIe card on nForce4 tops the Aquamark 3 results. It appears that this benchmark responds well to the PCIe interface.

Other DX9 benchmarks are essentially the same as seen on the nForce3 Ultra chipset with the same CPU and an AGP 6800 Ultra.

Comments

  • geogecko - Tuesday, October 19, 2004 - link

    #68

    Hmm...that search result at newegg.com pulls up 12595 results. Far too many for me to look through...

    Did you copy the link correctly?

    Thanks for the information. If the link won't work, an official part number from newegg (or vendor part number) will work for me.

    J.
  • thebluesgnr - Tuesday, October 19, 2004 - link

    I'm a little disappointed that the original article didn't say anything about sound, and that it still doesn't in the "Final words..." page. No, I don't mean SoundStorm.

    From AT's previous article on CK8-04: "Vanilla flavored CK8-04 is very much the same as nForce3 250Gb, with the addition of 7.1 high definition audio and PCI Express".

    So, they dropped the high def audio?
    If that's the case, both Intel and VIA (if the information on the VT8251 is confirmed) are ahead in this area, which is, for many, much more important than some silly hardware firewall.

    In closing, I'm disappointed at AnandTech for:

    1) being excessively positive about nForce4 (no mention of the lack of high def audio, no mention of any disadvantages of SLI, like the higher price of the motherboard and the power consumption of two cards, or the lack of PCI-E x1 in that MSI mobo);

    2) completely ignoring the release of VIA K8T890 and KT880 chipsets.

    The KT880 has been out for months, and there are motherboards in retail (the K7V88 in particular seems to be doing very well, given the number of user reviews and their ratings on newegg).
    Also, you reviewed the [b]nVidia[/b] nForce2 Ultra 400Gb chipset, so "socket A is dead" is not really an answer I'd understand.


  • haris - Tuesday, October 19, 2004 - link

    What's the big deal about SLI? The average increase in performance will probably be around 50-60%. That's nothing to be ashamed of, but at what cost do you get it? Two 6600s still cost almost as much as one high-end card, so there is little to no cost savings. What about the power requirements and noise level? That machine has got to be a freaking monster to work/play on.
  • KristopherKubicki - Tuesday, October 19, 2004 - link

    Geogecko:

    http://www.newegg.com/app/SearchProductResult.asp?...

    Kristopher
  • mrdudesir - Tuesday, October 19, 2004 - link

    #62
    First off, each slot is physically an x16 slot but carries only x8 of actual bandwidth. However, that still means that each slot has 4GB/s of bandwidth, way more than any modern card uses. There will not be any performance hit, simply because the slots have plenty of excess bandwidth.
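
For anyone who wants to check that arithmetic, here is a minimal C++ sketch under the assumption of first-generation PCI Express signaling (2.5 GT/s per lane with 8b/10b encoding, roughly 250 MB/s per lane, per direction); the 4 GB/s figure above corresponds to an x8 link counting both directions.

// Back-of-the-envelope PCIe bandwidth check (illustrative only).
// Assumes PCIe 1.0: ~250 MB/s per lane, per direction, after 8b/10b encoding.
#include <cstdio>
#include <initializer_list>

int main() {
    const double gbPerLanePerDirection = 0.25;  // GB/s for one lane, one way
    for (int lanes : {16, 8, 1}) {
        double oneWay = lanes * gbPerLanePerDirection;
        std::printf("x%-2d link: %.2f GB/s each way, %.2f GB/s both directions\n",
                    lanes, oneWay, 2.0 * oneWay);
    }
    return 0;
}

Under those assumptions, an x8 link still moves about 2 GB/s in each direction, which supports the point that the halved electrical width is not the bottleneck for 2004-era cards.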
  • geogecko - Tuesday, October 19, 2004 - link

    Can I get an exact part number for the Corsair 3200XL memory you are talking about on the test platform? I've been looking for it, but I'm not seeing this 3208v1.1 number anywhere...

    Thanks. By the way, which memory is better, the OCZ or the Corsair?
  • knitecrow - Tuesday, October 19, 2004 - link

    All you guys saying Doom 3 doesn't need hardware sound should read:
    Http://www.beyond3d.com/forum/viewtopic.php?topic=14459&forum=9
    http://www.theinquirer.net/?article=17525

    Basically, Creative said it invented a particular 3D positioning method, and id was forced to license and support EAX HD.


    #62, Not unlike a CPU, a GPU is programmable to a certain degree. I am sure you can make it do almost anything.... but a dedicated solution will always be more efficient.



    #63 -- "A card based around the VIA Envy 24HT is all anyone needs."

    Rubbish.


    Envy24 cards do jack for 3D positional audio. If you compare a software-based vs. a hardware-based solution, the hardware-based stuff (SoundStorm, Creative noiseblaster stuff) always wins out. They are more accurate in their positioning and reproduction.
  • quanta - Tuesday, October 19, 2004 - link

    Actually, id licensed EAX HD for use with Doom 3. Even without EAX HD support, Doom 3 will just send the audio streams to the DirectSound3D engine for mixing, which will take advantage of 3D audio acceleration if any is present.
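
To make that hand-off concrete, below is a hedged C++ sketch of how a game can create a DirectSound buffer flagged for 3D control and let the runtime decide between a hardware voice and software mixing; the function name and parameters are illustrative, not id's actual code, and it assumes the DirectX 8 era dsound.h header and linking against dsound.lib.

// Illustrative only: a 3D-capable DirectSound buffer with deferred voice
// allocation, so the runtime can use hardware mixing if the sound card
// provides it and fall back to software mixing otherwise.
#include <windows.h>
#include <dsound.h>

IDirectSoundBuffer* CreatePositionalBuffer(IDirectSound8* ds,
                                           WAVEFORMATEX* monoFormat,
                                           DWORD bufferBytes)
{
    DSBUFFERDESC desc = {};  // zeroed guid3DAlgorithm means DS3DALG_DEFAULT
    desc.dwSize = sizeof(desc);
    // DSBCAPS_CTRL3D marks the buffer for 3D positioning (the source must be mono);
    // DSBCAPS_LOCDEFER defers the hardware-vs-software decision until play time.
    desc.dwFlags = DSBCAPS_CTRL3D | DSBCAPS_LOCDEFER;
    desc.dwBufferBytes = bufferBytes;
    desc.lpwfxFormat = monoFormat;

    IDirectSoundBuffer* buffer = nullptr;
    if (FAILED(ds->CreateSoundBuffer(&desc, &buffer, nullptr)))
        return nullptr;
    return buffer;  // can be queried for IDirectSound3DBuffer8 to set position
}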
  • PrinceGaz - Tuesday, October 19, 2004 - link

    Doom 3 doesn't use hardware accelerated sound, so SoundStorm has no benefit. If you are having problems with the sound, you might want to adjust hardware acceleration or something.

    Sound only takes a tiny amount of CPU power when you've got a processor like a 3800+, so it doesn't really matter whether or not it's hardware accelerated. It's even less important when you consider that games are increasingly GPU-bound, and that there's plenty of CPU power to spare for processing sound. A card based around the VIA Envy 24HT is all anyone needs.
  • quanta - Tuesday, October 19, 2004 - link

    #54/57, the decision to dump dedicated SoundStorm hardware actually made a lot of sense, because NVIDIA already has a powerful VPU that can be used as an APU if the company wanted. In fact, NVIDIA can just license AVEX[1], which currently only works on NVIDIA processors, and if NVIDIA plays its cards right, it can just buy the BionicFX company now or soon and keep an edge over the competition all to itself.

    As for SLI, I think it will be too confusing for end users, and the dual-slot design will likely be short-lived. Think about it: there are only 20 PCIe lanes on nForce4, and each video card uses 16, so at least one card runs at only a fraction of the speed, crippling performance. It may be technically correct that current apps don't need all 32 lanes, but it will be a tech support nightmare for video card manufacturers from users who expected full-blown performance. It would be much easier to just build a 16/20/32/etc-lane PCIe video card with two VPUs on it. That way users don't have to worry about upgrade restrictions and performance issues, and it's easier for video card makers to sell dual-VPU products. Sure, you lose the upgradability, but without the tech support problems, card makers don't have to worry about people buying fewer cards because they want to wait for cheaper, more user-friendly SLI solutions.

    [1] http://www.bionicfx.com/
