The Witcher

To measure performance in The Witcher, we ran FRAPS during the game's first major cutscene at the start of play. We started recording frame rates as the cutscene faded in and stopped right before Geralt grabs his sword.
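For the curious, here is a minimal sketch of how a FRAPS recording can be reduced to the average frame rates reported below. It assumes a simple two-column per-frame log (frame index, cumulative time in milliseconds); the file name and column layout are illustrative assumptions, not FRAPS' exact output format.

# Minimal sketch: average and worst-frame FPS from a per-frame timestamp log.
# Assumes columns "Frame, Time (ms)" with cumulative milliseconds; the file
# name and layout are assumptions for illustration, not FRAPS' exact format.
import csv

def fps_stats(path):
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))  # cumulative time in ms

    # Per-frame durations from consecutive timestamps
    frame_ms = [b - a for a, b in zip(times_ms, times_ms[1:])]
    duration_s = (times_ms[-1] - times_ms[0]) / 1000.0

    avg_fps = len(frame_ms) / duration_s  # frames rendered per second
    min_fps = 1000.0 / max(frame_ms)      # slowest single frame
    return avg_fps, min_fps

if __name__ == "__main__":
    avg, low = fps_stats("witcher_frametimes.csv")  # hypothetical file name
    print(f"Average FPS: {avg:.1f}, worst-frame FPS: {low:.1f}")

Average FPS falls out of total frames over elapsed time, while the single slowest frame gives a rough minimum-FPS figure.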

The Witcher really likes the Radeon HD 3870 X2, as it was faster than even a pair of 8800 GTs running in SLI except at 2560 x 1600. Once again, we're talking about a 40%+ performance advantage over the closest single-card NVIDIA solution.

[The Witcher performance charts]

Comments (74)

  • poohbear - Monday, January 28, 2008 - link

    Well, it's about time. Good job AMD, let's see you maintain the performance lead, damn it!
  • boe - Monday, January 28, 2008 - link

    Howdy,

    I appreciate any benchmarks we can get, but if you do a follow-up on this card with newer drivers, I hope you will consider the following:

    1. A comparison with a couple of older cards, e.g. the X1900 and 7900

    2. A sound measurement of the cards, e.g. dB at full utilization from 2 feet

    3. Crossfire performance, if this card supports it

    4. Benchmarking FEAR with all bells and whistles turned on

    5. DX10 vs. DX9 performance.


    Thanks again for creating this article - I'm considering this card.
  • perzy - Monday, January 28, 2008 - link

    Am I the only one tired of all these multicores? I guess programming gets even more complex now. I guess in the future all games will have development cycles like Duke Nukem Forever's - 10+ years?

    Are the GPUs hitting the heat wall now, too?

    Soon I'll stop reading these hardware sites. The only reports in the near future will be "yet another core added." Yippee.
  • wien - Monday, January 28, 2008 - link

    Coding for a multi-GPU setup is not really any different than coding for a single-GPU one. All the complexity is handled by the driver, unlike with multi-core CPUs.
  • FXi - Monday, January 28, 2008 - link

    Have to say they did a good job, not great, but very good. We do need to see the 700 though, as this won't hold them for long.

    The other thing both camps need to address is dual monitors with SLI/CF. It's been forever since this tech came out and it still hasn't been fixed. Dual screens are commonplace and people like them. It could be one large and one smaller, or dual midrange, but people want the FPS without losing their 2nd screen.

    I'm sure there will be a rash of promises to fix this that won't materialize for years :) (as before)
  • ChronoReverse - Tuesday, January 29, 2008 - link

    Actually, that was one of the things that was fixed by ATI. Dual screens will work even if it's in a window and _spanning_ the monitors. I'll see if I can find the review that showed that.
  • murphyslabrat - Monday, January 28, 2008 - link

    Come on AMD, don't die until we get the Radeon 4870 x2!
  • Retratserif - Monday, January 28, 2008 - link

    Third to last paragraph.

    The fact "hat" both?

    Overall, good article. Too bad we didn't get to see temps or overclocks.
  • PeteRoy - Monday, January 28, 2008 - link

    Will Nvidia and ATI just put out more of the same instead of innovating new technologies?

    Wasn't that what killed 3dfx? Nvidia should know.
  • kilkennycat - Monday, January 28, 2008 - link

    The next-gen GPU family at nVidia is in full development. Hold on to your wallet till the middle of this year (2008). You may be in for a very pleasant surprise.
