With so much going on I've become delinquent (as usual) in updating you all, but hopefully I'm at the turning point for that. The Pentium D 805, Socket-AM2 and the MacBook Pro were all very interesting, but right now the main thing I'm working on is performance under Bethesda's hottest title: Oblivion.

Oblivion is turning out to be the most stressful game I've run on the latest hardware, and its performance stats make that very clear. Even with a pair of Radeon X1900 XTs you still can't run at 1600 x 1200 with everything turned up and get smooth frame rates everywhere. Although the majority of my testing thus far has been with GPUs, later this week I will start to look at the impact of the platform as a whole on Oblivion, eventually leading into CPU performance using the title.

The game itself is quite possibly the best I've played in several months, at the very least; that's a pretty big compliment coming from someone who historically isn't the biggest RPG fan, but Oblivion is accessible enough to make it fun for just about anyone.

My Oblivion testing put me in a situation where I had to deal with basically every GPU released over the past 2 - 3 years, which in turn left me with a handful of gripes that I don't think I've properly voiced here - most of them involving CrossFire/SLI.

Just so you all know what happens behind the scenes, whenever we write a story even remotely hinting at the idea of running SLI on a non-NVIDIA platform, we usually get several angry emails from NVIDIA. And while I would never recommend purchasing a non-NVIDIA motherboard with hopes of running SLI on it, NVIDIA's reaction does highlight a much larger problem. NVIDIA, and to a lesser extent ATI, are far too focused on delaying the transition to a truly seamless multi-GPU environment in order to try to lock customers into purchasing one chipset or another.

While NVIDIA's policies don't really hurt it in the mainstream, at the very high end they are nothing but annoying. Although ATI's CrossFire Xpress 3200 (RD580) chipset is, finally, competitive with NVIDIA's SLI offerings, the latter is simply found on more motherboards that have been around for much longer. So I can see why a lot of users would still prefer to go the safe route with an NVIDIA SLI platform instead of the ATI chipset. But if you're looking for the absolute highest performance in Oblivion, you'll want a Radeon X1900 XT(X), and if you can afford it and want even better performance, you'll want two. Unfortunately that means you've got to change your purchase around to either buy NVIDIA GPUs or an ATI motherboard, which may not be what you originally wanted to do.

It hurts ATI as well; take the example above. ASUS' RD580 board has gotten some great reviews and is a very solid motherboard, making it very good competition for ASUS' own SLI x16 motherboard. But if you want to use NVIDIA GPUs with it in SLI (or with the hopes of someday upgrading to SLI), you're out of luck. Once again your GPU choice ties you into a particular chipset choice.

Although there is a lot of validation that goes into testing and certifying SLI/CrossFire platforms, there's no harm (to the end user) in at least offering "at your own risk" support and seeing what sort of response there is. While publicly the problem is always stated to be about validation and guaranteeing an excellent user experience, the real goal is to force exclusivity within a computer.

With a tremendous install base of SLI platforms, NVIDIA is far less likely to just wake up one day and offer support for ATI and Intel chipsets (unless one of them ponies up and pays a lot of money), so instead I turn to ATI. ATI has already enabled support for CrossFire on Intel 975X platforms, and if it wants to gain further acceptance of its multi-GPU solution it should do the same on NVIDIA SLI platforms. While that could potentially hurt its chipset sales, it also has the potential to increase GPU sales if ATI can be the only company to offer a multi-GPU solution that works on any chipset.

Ideally it would also push NVIDIA to do the same, and hopefully mean an earlier end to what is truly a silly situation. By not offering universal multi-GPU solutions that work on any platform equipped with the right number of PCIe slots, ATI and NVIDIA are not working in the best interests of their customers and are rather publicly operating in the best interests of their own pockets. Although it's unfortunately rare for the customer to come first, the current multi-GPU platform situation is a bit more pronounced than usual.

ATI has already taken the first step by offering support for Intel 975X platforms, unfortunately at a time when Intel's platforms aren't very popular among gamers. While it's a nice (and perhaps calculated) gesture, I want more. The question is, will ATI go the rest of the way?

I'm headed off to the airport now; tomorrow Vinney and I have two meetings that should hopefully be the last two things we'll need to sign off on before the house can be finished up. We're finalizing our hardwood floor stain color and I'm doing a final walkthrough with the structured wiring guy to make sure that the excessive amounts of CAT5e are where they should be. With those two things done, we hope to close on the house during the second week of May. The trip down to NC will be a short one because of work; I should be back on Wednesday with Oblivion as my top priority. If there's anything in particular you'd like to see, let me know and as always I'll do my best to include it.

Once again, I'm sorry for falling behind on keeping you all updated on what's going on, but I hope to change that once this move back home finally happens. There are a lot of changes here at AT that are in the process of happening as well, which I will update you all on at another time. Until then, take care and have a great week :)
Comments

  • punko - Tuesday, April 18, 2006 - link

    quote:

    There's a lot of changes here at AT that are in the process of happening as well


    Guides! Guides!

    I hope we get back to a regular rhythm for system guides. I'm itchin' for a new system, and the landscape sure has changed since the last mid/high guide came out.

    Of course with AM2 just around the corner, I can see an argument for a delay . . .

    But I'm still awaitin' for my Guides!
  • crimson117 - Tuesday, April 18, 2006 - link

    Anand,

    While you're doing games I'd love to see an update to the WoW performance guide - especially WoW on a MacBook Pro. Since the Mac version of WoW is a universal binary, comparing it under Windows and OS X on exactly the same hardware could be very interesting, and would be something of a first.

    -Jim
  • UlricT - Tuesday, April 18, 2006 - link

    When you talk about ATi providing support for CrossFire on Nvidia chipsets, would only the video drivers need to support it? Wouldn't the Nvidia chipset drivers also need to support this?
  • taemun - Tuesday, April 18, 2006 - link

    It'd be great if you could compare a 7900GT and 7900GTX (256MB and 512MB respectively) at the same clocks, at varying resolutions (up to really high ones)... just to see how much difference there is there.

    Thanks
  • bwmccann - Tuesday, April 18, 2006 - link

    Anand,
    How long have you guys been building that house? It seems like years...either it is the size of the White House or you guys are doing the construction. :-)

    Brian
  • SLCentral - Monday, April 17, 2006 - link

    Hey Anand,

    In your Oblivion performance evaluation, are you going to include some kind of comparison to the Xbox 360 version? If it takes SLI/CrossFire to run it at 1600x1200 with eye candy on the PC, I'd much rather invest in an Xbox 360. It's not fun to play at 1024x768 (I don't have SLI) on a 2560x1600 screen, but I'd love to see some 720p action on an Xbox 360 and my DLP.
  • nordicpc - Monday, April 17, 2006 - link

    I highly agree. While there are a few mid-range vs. elite hardware reviews, I'd like to know how much better the PC version can be against the 360. Has the PC already overcome the coveted 360 and how much hardware does it take? I personally have a 32" LCD HDTV I game on with my PC, and although it can be a PITA sometimes, it's usually worth it to have AA and AF at 1280x720.

    Ohh yeah, don't forget the widescreen support for us!
  • smitty3268 - Monday, April 17, 2006 - link

    You might want to check out this article in the meantime: http://www.gamespot.com/features/6147028/index.htm...

    Short story is that the PC version looks better than the 360, but only if you have really top of the line hardware.
  • nordicpc - Tuesday, April 18, 2006 - link

    Thanks for the link. I was kinda hoping for a reason to buy the 360, but I guess since my PC would already look better, there's no point.
