World in Conflict

Much like Crysis, World in Conflict just doesn't run well on anything. We had to fall back to the High Quality defaults without any AA, and even then the frame rates we saw were hardly consistently smooth. The X2 stays ahead of the 8800 GTS 512, but not by much: the performance difference is only a few percentage points.

The X2's advantage does grow at 2560 x 1600 to 25%, but at lower resolutions it's hardly noticeable.
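To be explicit about how a relative advantage like that 25% figure is computed from benchmark results, here is a minimal sketch. The frame rates below are hypothetical placeholders, not numbers taken from the article's charts:

```python
# Hypothetical frame rates chosen only to illustrate the math;
# they are NOT results from this review's benchmark charts.
x2_fps = 25.0    # placeholder: Radeon HD 3870 X2
gts_fps = 20.0   # placeholder: GeForce 8800 GTS 512

# Relative advantage of the X2 over the GTS, as a percentage
advantage = (x2_fps - gts_fps) / gts_fps * 100
print(f"X2 advantage: {advantage:.0f}%")
```

Note the baseline matters: a 25% lead for the X2 over the GTS is not the same as the GTS trailing the X2 by 25%.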

If WIC is on your favorites list, you're going to have to wait for something faster from NVIDIA to really push higher frame rates.

World in Conflict - Built in Benchmark


74 Comments

  • poohbear - Monday, January 28, 2008 - link

    well it's about time, good job amd, let's see you maintain the performance lead damn it!
  • boe - Monday, January 28, 2008 - link

    Howdy,

    I appreciate any benchmarks we can get, but if you do a follow-up on this card with newer drivers, I hope you will consider the following:

    1. A comparison with a couple of older cards x1900 and 7900

    2. A sound measurement of the cards, e.g. dB at full utilization from 2'

    3. Crossfire performance if this card supports it.

    4. Benchmarking on FEAR with all bells and whistles turned on

    5. DX10 vs. DX9 performance.


    Thanks again for creating this article - I'm considering this card.
  • perzy - Monday, January 28, 2008 - link

    Am I the only one tired of all these multicores? I guess programming gets even more complex now. I guess in the future all games will have development cycles like Duke Nukem Forever (10+ years)...?

    Are the GPUs hitting the heat wall too now?

    Soon I'll stop reading these hardware sites. The only reports in the near future will be 'yet another core added.' Yippee.
  • wien - Monday, January 28, 2008 - link

    Coding for a multi-GPU setup is not really any different than coding for a single-GPU one. All the complexity is handled by the driver, unlike with multi-core CPUs.
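    The contrast the comment above draws can be sketched in code: with multi-core CPUs the programmer must partition the work explicitly, whereas with SLI/CrossFire the driver performs the equivalent split (e.g. alternate-frame rendering) behind a single-GPU-style API. The function and data below are purely illustrative, not part of any real game or driver:

    ```python
    # Illustrative only: explicit work partitioning, the kind of thing a
    # multi-core CPU programmer must do by hand (and a multi-GPU driver
    # hides from the graphics programmer).
    from concurrent.futures import ThreadPoolExecutor

    def simulate_chunk(entities):
        # Placeholder for per-chunk game-state work (hypothetical)
        return sum(e * e for e in entities)

    entities = list(range(1000))
    # The programmer must choose how to split the work: here, 4 strided chunks
    chunks = [entities[i::4] for i in range(4)]

    with ThreadPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(simulate_chunk, chunks))

    print(total)  # same answer as the single-threaded version
    ```

    A graphics API call, by contrast, looks identical whether one GPU or two sit behind the driver; that transparency is exactly why multi-GPU demands little from the application programmer.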
  • FXi - Monday, January 28, 2008 - link

    Have to say they did a good job, not great, but very good. We do need to see the 700 though, as this won't hold them for long.

    The other thing both camps need to address is dual monitors using SLI/CF. It's been forever since this tech has been out and it hasn't been fixed. Dual screens are commonplace and people like them. Could be one large and one smaller, or dual midrange but people want the FPS without losing their 2nd screen.

    I'm sure there will be a rash of promises to fix this that won't materialize for years :) (as before)
  • ChronoReverse - Tuesday, January 29, 2008 - link

    Actually, that was one of the things that was fixed by ATI. Dual screens will work even if it's in a window and _spanning_ the monitors. I'll see if I can find the review that showed that.
  • murphyslabrat - Monday, January 28, 2008 - link

    Come on AMD, don't die until we get the Radeon 4870 x2!
  • Retratserif - Monday, January 28, 2008 - link

    Third to last paragraph.

    The fact "hat" both?

    Overall good article. Too bad we didn't get to see temps or overclocks.
  • PeteRoy - Monday, January 28, 2008 - link

    It seems Nvidia and ATI will just put out more of the same instead of innovating new technologies.

    Wasn't that what killed 3dfx? Nvidia should know.
  • kilkennycat - Monday, January 28, 2008 - link

    The next-gen GPU family at nVidia is in full development. Hold on to your wallet till the middle of this year (2008). You may be in for a very pleasant surprise.
