Neverwinter Nights: Shadow of the Undrentide Performance


The 3D Diablo-style graphics are standard for a game of this type, but the spell effects are really exquisite. There are times when the screen lights up with almost blindingly brilliant fire effects, and the deep shadows seem to pull the gamer into the darkness. These effects are what make a good graphics card important for this game.




We see some more good indications of scaling here as well. NWN does rely heavily on CPU power, but with 4xAA and 8xAF enabled, the cards do separate themselves.
77 Comments

  • Pete - Monday, April 19, 2004 - link

    Shinei,

    I did not know that. </Johnny Carson>

    Derek,

    I think it'd be very helpful if you listed the game version (you know, what patches have been applied) and map tested, for easier reference. I don't even think you mentioned the driver version used on each card, quite important given the constant updates and fixes.

    Something to think about ahead of the X800 deadline. :)
  • zakath - Friday, April 16, 2004 - link

    I've seen a lot of comments on the cost of these next-gen cards. This shouldn't surprise anyone... it has always been this way. The market for these new parts is small to begin with. The best thing the next gen does for the vast majority of us non-fanbois-who-have-to-have-the-bleeding-edge-part is that it brings *today's* cutting-edge parts into the realm of affordability.
  • Serp86 - Friday, April 16, 2004 - link

    Bah! My almost 2 year old 9700pro is good enough for me now. i think i'll wait for nv50/r500....

    Also, a better investment for me is to get a new monitor since the 17" one i have only supports 1280x1024 and i never turn it that high since the 60hz refresh rate makes me go crazy
  • Wwhat - Friday, April 16, 2004 - link

    that was to brickster, neglected to mention that
  • Wwhat - Friday, April 16, 2004 - link

    Yes you are alone
  • ChronoReverse - Thursday, April 15, 2004 - link

    Ahem, this card has been tested by some people with a high-quality 350W power supply and it was just fine.


    Considering that anyone who could afford a 6800U would have a good power supply (Thermaltake, Antec, or Enermax), it really doesn't matter.


    The 6800NU uses only one molex.
  • deathwalker - Thursday, April 15, 2004 - link

    Oh my god... $400 and you can't even put it in 75% of the systems on people's desks today without buying a new power supply, at a cost of nearly another $100 for a quality PSU. I think this just about has to push all the fanatics out there over the limit. There's no way in hell you're going to notice the performance improvement in a multiplayer game over a network... when does this madness stop?
  • Justsomeguy21 - Monday, November 29, 2021 - link

    LOL, this was too funny to read. Complaining about a bleeding edge graphics card costing $400 is utterly ridiculous in the year 2021 (almost 2022). You can barely get a midrange card for that price and that's assuming you're paying MSRP and not scalper prices. 2004 was a great year for PC gaming, granted today's smartphones can run circles around a Geforce 6800 Ultra but for the time PC hardware was being pushed to the limits and games like Doom 3, Far Cry and Half Life 2 felt so nextgen that console games wouldn't catch up for a few years.
  • Shinei - Thursday, April 15, 2004 - link

    Pete, MP2 DOES use DX9 effects; mirrors are disabled unless you have a PS2.0-capable card. I'm not sure why, since AvP1 (a DX7 game) had mirrors, but it does nonetheless. I should know, since my Ti4200 (DX8.1 compatible) doesn't render mirrors as reflective even though I checked the box in the options menu to enable them...
    Besides, it does have some nice graphics that can bog a card down at higher resolutions/AA settings. I'd love to see what the game looks like at 2048x1536 with 4xAA and maxed AF with a triple buffer... Or even a more comfortable 1600x1200 with the same graphical settings. :D
