Do we see the same thing with World in Conflict?

World in Conflict is also system bound. In our overclock test, we did not see any improvement in performance using the built-in benchmark. This indicates that the limitation is coming from somewhere else. Our platform swap did give us almost 10% improvement but our performance “curve” was still just as flat, meaning we’re still limited somewhere.

With the type of high-end hardware now in our hands that has enabled us to expose such limitations, we still have a lot of investigating to do before we better understand the issues. It is clear that the benchmark for World in Conflict is much harder on the system than actual gameplay, so we will try out some in-game tests to see if we can't make any more sense of this one. For now, here's what we see on Skulltrail.
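The "flat curve" reasoning above boils down to a simple check: if average fps barely moves as GPU clocks rise, the bottleneck lies elsewhere. Here's a minimal sketch of that heuristic; the function name, threshold, and sample numbers are illustrative assumptions, not our actual test methodology:

```python
# Heuristic: if average fps barely changes as the GPU clock rises,
# the limitation is probably elsewhere (CPU, memory, or platform).

def is_gpu_bound(clocks_mhz, avg_fps, tolerance=0.05):
    """Return True if fps scales with GPU clock beyond `tolerance`."""
    clock_gain = clocks_mhz[-1] / clocks_mhz[0] - 1.0
    fps_gain = avg_fps[-1] / avg_fps[0] - 1.0
    # GPU-bound: fps gain tracks a meaningful share of the clock gain.
    return clock_gain > 0 and fps_gain > tolerance

# A flat "curve" like the one described for World in Conflict:
print(is_gpu_bound([600, 650, 700], [48.1, 48.3, 48.2]))  # False
# A GPU-bound title would scale with clocks instead:
print(is_gpu_bound([600, 650, 700], [40.0, 43.0, 46.5]))  # True
```

The same logic applies to the platform swap: a ~10% gain with an equally flat curve just means the ceiling moved, not that it disappeared.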



  • DDH III - Thursday, October 23, 2008 - link

So you're saying that when my first 9800GX2 gets here in the mail next week, it will actually help my landlord with his heating bill? That's good, because he pays for the electric too, and I feel kinda bad already. While heating bills in Interior Alaska are nasty, the electric is just as bad... but anyways.

Great review. Though I won't be able to run these in quad on my current mobo. But then, I never liked FPS games much since Quake, which brings me to my point:

    These days I play insane amounts of Supreme Commander. It is CPU intensive. If you are looking for another Benchmark when you start cranking up the GHz, ...well just give it a look.

But to say a bit: 8-player is CPU pain. I only have dual cores, but my first dual cores were AMD (my games never needed the most rocking FPS). I had to go C2D just for game speed... forget the eye-candy.

    In short I read this article and went..huh...me too.

Anyways, I am one of those Dell 3007 owners, and I learned a lot. TY. And I don't live in North Pole, AK; I only work there.




  • luther01 - Friday, May 9, 2008 - link

Hi guys. I have a quad 9800 GX2 system as follows:

    Q6600 processor @ 2.8GHz
    4GB PC2-6400 RAM
    780i motherboard
    2x 9800 GX2

    I must say that so far I haven't noticed any performance gains over one 9800 GX2, and in some games like The Witcher I have actually had a performance drop of 10-15 fps. In Crysis, the frame rate at Very High settings is disappointingly low, sometimes dropping to 15 fps in large environments. I think maybe there is another bottleneck in my system.
  • Das Capitolin - Sunday, April 27, 2008 - link

It appears that NVIDIA is aware of the problem, yet there's no word on how or when it might be fixed.

http://benchmarkreviews.com/index.php?option=com_c...
  • Mr Roboto - Sunday, March 30, 2008 - link

Over at TweakTown they're coming up with the same results as Anand did, even with the 9800GTX.
    Is it possible a lot of the problems are because the G92/G94 has a smaller 256-bit memory interface? Could it cause that much of a difference, especially combined with less VRAM? Just a thought.

http://www.tweaktown.com/articles/1363/1/page_1_in...
  • ianken - Thursday, March 27, 2008 - link

...logging CPU load shows it evenly loaded across four cores at about 50%. But then I'm running a lowly 8800GTS (640MB) SLI setup.

    The numbers in this review just seem kinda off.
  • DerekWilson - Friday, March 28, 2008 - link

What settings were you testing on?

    We see high quality and very high quality settings have an increasing impact on the CPU... the higher the visual quality, the higher the CPU impact. It's kind of counterintuitive, but it's a fact.

    You actually might not be able to hit the CPU limit with Very High settings, because the GPU limit might be below the CPU limit... which seems to be somewhere between 40 and 50 fps (depending on the situation) with a 3.2GHz Intel processor.
  • seamusmc - Wednesday, March 26, 2008 - link

I like HardOCP's review because while the Quad SLI solution does put up higher numbers than an 8800 Ultra SLI solution, they pointed out some serious problems in all games besides Crysis.

    It appears that memory is a bottleneck, and many games have severe momentary drops in FPS at high resolutions and/or with AA, making the gaming experience worse than with an 8800 Ultra SLI solution. I strongly recommend folks take a look at HardOCP's review.

    AnandTech's review only covers average FPS, which neither addresses nor reveals the kinds of issues the Quad SLI solution is having.
  • B3an - Wednesday, March 26, 2008 - link

Thanks for mentioning the HardOCP review. A lot better than Anand's.

Very disappointed with Anand on this article. I posted a comment asking why I was getting such bad frame rates at high res with AA, and Anand did not even address this. They must have run into it at 2560x1600, and just about all the games they tested at 2560x1600 get killed with AA because of the memory bottleneck. I'm talking from trying this myself. If I had known about it I wouldn't have gotten a GX2, as it's pretty pointless with a 30" monitor.
    So are they getting paid by NV or what?

    Very disappointed.
  • AssBall - Wednesday, March 26, 2008 - link

    "Very disappointed with Anand on this article. I posted a comment asking why i was getting such bad frame rates at high res with AA, and Anand did not even address this."


    They are trying to answer their own questions and solve their own problems right now. This comment section is for comments on the article, not your personal technical support line to Anand.
  • B3an - Thursday, March 27, 2008 - link

When I said "I posted a comment asking why I was getting such bad frame rates at high res with AA, and Anand did not even address this,"

    I meant: why has this website not addressed this issue in any of their GX2 articles? Any decent website would, especially as it's not hard to run into. I mean, a high-end card like this is for people with 24" and 30" monitors like myself, as at anything lower than 1920x1200 a GTX would be more than enough. Yet the card has massive memory bandwidth problems and/or not enough usable VRAM, so frame rates get completely crippled when a certain amount of AA is used in games.
    Making the card pretty pointless.
