Doom 3 Performance

There's a 2.7% difference in frame rate between the 3D1 and the 2 x 6600GT SLI solution at 1600x1200 under Doom 3 without AA. This improvement is due entirely to the 3D1's memory clock speed increase over the stock 6600 GT. The extra 120MHz on each GPU's memory clock helps to make up for the limited bandwidth available to each chip. So right off the bat, we don't see any performance gains inherent in going with a single-card SLI solution.
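
To put that bandwidth claim in perspective, here is a minimal back-of-the-envelope sketch in Python. It assumes the published 6600 GT reference specs (a 128-bit memory bus per GPU and a 1000MHz effective GDDR3 clock); these figures come from NVIDIA's spec sheet, not from this review's test configuration:

# Rough sketch: estimating the 3D1's per-GPU memory bandwidth advantage over a
# stock 6600 GT. Assumes reference 6600 GT specs (128-bit bus, 500MHz GDDR3 =
# 1000MHz effective); the +120MHz effective bump is the figure cited above.

BUS_WIDTH_BITS = 128                   # memory bus width per 6600 GT GPU
STOCK_MEM_MHZ = 1000                   # stock effective (DDR) memory clock
X3D1_MEM_MHZ = STOCK_MEM_MHZ + 120     # 3D1's overclocked effective memory clock

def bandwidth_gbps(mem_mhz: int, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak theoretical bandwidth in GB/s: effective clock (MHz) x bus width (bytes)."""
    return mem_mhz * 1e6 * (bus_bits / 8) / 1e9

stock = bandwidth_gbps(STOCK_MEM_MHZ)  # 16.00 GB/s per GPU
x3d1 = bandwidth_gbps(X3D1_MEM_MHZ)    # 17.92 GB/s per GPU
print(f"stock 6600 GT: {stock:.2f} GB/s, 3D1: {x3d1:.2f} GB/s")
print(f"uplift: {(x3d1 / stock - 1) * 100:.1f}%")  # 12.0%, the figure cited below

The 12% output matches the memory clock advantage discussed in the next paragraph.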

[Graph: Doom 3, 1600x1200, no AA]

With about a 5.3% performance increase, the 3D1's lead over the stock 2 x 6600 GT solution is again simply due to its 12% memory clock speed advantage. That the gain is smaller than the clock increase suggests that these tests aren't purely bandwidth-limited. Of course, it is good to confirm that no negatives come from going with a single-card SLI solution here.

[Graph: Doom 3]

Throughout this test, the Intel SLI solution performs very poorly, putting in numbers between one quarter and three quarters of the potential that the same cards show on the AMD platform. The fact that the Intel system is not as swift a performer under Doom 3 in general doesn't help here either, but we are working with GPU-limited tests, which helps to negate that factor.

Comments

  • sprockkets - Friday, January 7, 2005 - link

    Thanks for the clarification. But some were also using the server Intel chipset, since it had two x16 slots, instead of the desktop chipset to use SLI. Like the article said, though, the latest drivers only like the nvidia SLI chipset.
  • ChineseDemocracyGNR - Friday, January 7, 2005 - link

    #29,

    The 6800GT PCI-E is probably going to use a different chip (native PCI-E) than the broken AGP version.

    One big problem with nVidia's SLI that I don't see enough people talking about is this:
    http://www.pcper.com/article.php?aid=99&type=e...
  • Jeff7181 - Friday, January 7, 2005 - link

    Why is everyone thinking dual-core CPUs and dual-GPU video cards are so far-fetched? Give it 6-12 months and you'll see it.
  • RocketChild - Friday, January 7, 2005 - link

    I seem to recall ATi was frantically working on a solution like this to bypass Nvidia's SLI, but I am not reading anything about their progress. Given that the article points to BIOS hurdles, does it look like we are going to have to wait for ATi to release their first chipset that supports a multi-GPU ATi card? Anyone here have any information or speculation?
  • LoneWolf15 - Friday, January 7, 2005 - link

    #25, the reason you'd want to buy two 6600GTs instead of one 6800GT is that PureVideo functions work completely on the 6600GT, whereas they are partially broken on the 6800GT. If this solution worked in more than just Gigabyte boards, I'd certainly consider it myself.
  • skiboysteve - Friday, January 7, 2005 - link

    I'm confused as to why anyone would buy this card at all. You're paying the same price as a 6800GT and getting the same performance, with all the issues that go with Gigabyte SLI. That's retarded.
  • ceefka - Friday, January 7, 2005 - link

    Are there any cards available for the remaining PCI-E slots?
  • Ivo - Friday, January 7, 2005 - link

    Obviously, the future belongs to matrix CPU/GPU (IGP?) solutions with optimized performance/power consumption ratios. But there is still a relatively long way (2 years?) to go. NVIDIA's recent NF4-SLI game is more marketing than technical in nature. They are simply testing the market, the competition, and … the enthusiastic IT society :-) The response is moderate, as is the challenge. But the excitement is predetermined.
    Happy New Year 2005!
  • PrinceGaz - Friday, January 7, 2005 - link

    I don't understand why anyone would want to buy a dual-core 6600GT rather than a similarly priced 6800GT.
  • DerekWilson - Friday, January 7, 2005 - link

    I apologize for the omission of pictures from the article at publication.

    We have updated the article with images of the 3D1 and the K8NXP-SLI for your viewing pleasure.

    Thanks,
    Derek Wilson
