Call of Duty World at War Analysis



[Benchmark charts: Call of Duty: World at War results at 1680x1050, 1920x1200, and 2560x1600]

44 Comments

  • JarredWalton - Sunday, March 1, 2009 - link

    Fixed, thanks. Note that it's easier to fix issues if you can mention a page, just FYI. :)
  • askeptic - Sunday, March 1, 2009 - link

    This is my observation based on their reviews over the last couple of years.
  • ssj4Gogeta - Sunday, March 1, 2009 - link

    It's called being fair and not being biased. They did give due credit and praise to AMD for the RV770 and Phenom II. You probably haven't been reading the articles.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    He's a red fan freak-a-doo, with his tenth+ name, so anything he sees is biased against ati.
    Believe me, that one is totally goners, see the same freak under krxxxx names.
    He must have gotten spanked in a fps by an nvidia card user so badly he went insane.
  • Captain828 - Sunday, March 1, 2009 - link

    In the last couple of years, nVidia and Intel have had better-performing hardware than the competition, so I don't see any bias, and the charts don't show any either.
  • lk7200 - Wednesday, March 11, 2009 - link


    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Another name so soon raging red fanboy freak ? Going to fantasize about murdering someone again, sooner rather than later ?
    If ati didn't suck so badly, and be billion dollar losers, you wouldn't be seeing red, huh, loser.
  • JonnyDough - Tuesday, March 3, 2009 - link

    Hmm...X1900 series ring a bell? Methinks you've been drinking...
  • Razorbladehaze - Sunday, March 1, 2009 - link

    Wow, what I was really looking forward to here disappeared entirely. I was expecting more commentary on the subjective image quality of the benchmarks, and there was even less discussion of that than in the past two articles, which is kind of a bummer.

    On a side note, what was shown was about what I expected from piecing together a number of other reviews. Nice to see it all combined in one place, though.

    The only nugget of information I found disturbing is the impression that CUDA is better than what ATI has promoted. This is in light of my understanding that NVIDIA just hired a head technical officer from the university where stream computing (what ATI uses) took root. Even though CUDA is an offshoot of that work, this hiring leads me to believe that NVIDIA will migrate toward stream computing rather than the opposite, especially if GPGPU computing is to become commonplace.

    I think it would be in NVIDIA's best interest to do this. I am afraid Intel is right and NVIDIA's future may be bleak if GPGPU computing does not take hold; migrating toward their rival AMD's GPGPU approach is one strategy to explore this tech while reducing the resources spent on it.

    Well yeah... I think I went way off on a tangent on this one, so... yeah, I'm done.
  • DerekWilson - Monday, March 2, 2009 - link

    Sorry about the lack of image quality discussion. It's our observation that image quality is not significantly impacted by multi-GPU. There are some instances of stuttering here and there, but mostly in places where performance is already bad or borderline; otherwise, we did note where there were issues.

    As far as GPGPU / GPU computing goes, CUDA is a more robust and more widely adopted solution than ATI Stream. CUDA has made more inroads in the consumer space, and especially in the HPC space, than Stream has. There aren't that many differences in the programming model, but CUDA for C does have some advantages over Brook+ (a minimal sketch of what a CUDA for C kernel looks like follows the comments below). I do prefer the fact that ATI opens up its ISA down to the metal (alongside a virtual ISA), while NVIDIA only offers a virtual ISA.

    The key is honestly adoption, though: the value of the technology only exists as far as the end user has a use for it, and CUDA leads here. OpenCL, in our eyes, will close the gap between NVIDIA and ATI and should put them both on the same playing field.
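
A note on the CUDA for C programming model discussed above: the following is a minimal, illustrative sketch, not code from the article. It shows a vector-add kernel plus the host code that allocates device buffers, copies data over, launches the kernel, and copies the result back. The kernel name vecAdd, the buffer names, and the 256-thread launch configuration are arbitrary choices made for this example.

    // vecadd.cu -- build with: nvcc vecadd.cu -o vecadd
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Kernel: each thread computes one element of c = a + b.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host buffers
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device buffers
        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);

        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f (expected 3.0)\n", hc[0]);

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

Roughly speaking, Brook+ expresses the same operation as a kernel applied elementwise over streams, with the indexing left implicit, which is the programming-model difference referred to in the comment above.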
