Crysis: Warhead

Being our most demanding game, Crysis also tends to generate some of our most interesting results. Right off the bat at 2560 we have the 5830 beating the GTX 275 by 9% and the 4870 by 5%, while losing to the 4890 by 6%. But lower that resolution and we fall into another pattern: 10% under the 4890, 6% under the GTX 275, 3% over the 4870, and 20% under the 5850.

Crysis is a game where we expected the ROP loss to hurt the most, but that clearly isn’t the case. The 5830 does, however, have the unfortunate distinction of being the first 5800 series card to drop below 30fps at 1920, a resolution the 5850 and 5870 can sustain.

Comments

  • MadMan007 - Thursday, February 25, 2010 - link

    Since when is ATi taking marketing technique pointers from nVidia?

    "...the 5830 has some very useful advantages over the 4890 – DX/DirectCompute 11, Eyefinity, better OpenCL support, and bitstreaming audio..."

    Substitute PhysX, CUDA, and 3D display and that would be an NV marketing line.

    (btw why does using quote tags always throw an error in article comments?)
  • Ramon Zarat - Saturday, February 27, 2010 - link

    I beg to differ. There are very clear distinctions between the technologies you mentioned!!!


    CUDA: Proprietary API, a closed platform strictly regulated by Nvidia that will soon be obsolete due to broad OpenCL adoption. Market penetration is still limited to vertical market niches.
    Stream: Based on OpenCL, an open platform supported by the whole community and representing the future of the industry, which should let any properly coded and compiled application or game benefit from it.

    PhysX: Proprietary API supported by only a dozen games out of which 10 are very bad.
    Havok: Will transparently use the OpenCL open standard to do in-game physics, which will ensure wide adoption.

    Nvidia: Bitstreaming *REGULAR* audio over HDMI
    ATI: Bitstreaming *TrueHD/DTS-HD Master Audio* audio over HDMI

    3D Vision: Proprietary API. Needs one of *ONLY* 4 Nvidia-approved 120Hz LCDs ( http://www.nvidia.com/object/3D_Vision_Requirement... ) and the games must be supported in the driver. Costly setup, low market penetration.
    Eyefinity: Actually works out of the box in a 2D environment. You only need any 2 LCDs/CRTs plus 1 LCD with DisplayPort (any brand), or a DVI/HDMI panel with an active converter. A 6-port version is launching in a couple of weeks. For 3-panel gaming, game profiles now live outside the drivers and are available almost as soon as new games come out. Drivers for games still need some polishing.

    I try very hard to be objective, but the facts speak for themselves. ATI has the better technology right now and shouldn't be ashamed to publicize its superiority. By contrast, Nvidia's totalitarian TWIMTBP program, dictatorial proprietary stuff everywhere, and deceptive general attitude as of late ("late", as in the last 5 years...), are ethically highly questionable. The day ATI does the same, I will denounce them as well.
  • piroroadkill - Thursday, February 25, 2010 - link

    Exactly, nobody gives a shit.

    The 4890 is faster and cheaper, the end.
  • ImSpartacus - Thursday, February 25, 2010 - link

    No kidding. I am so thankful that I got my 4890 when it came out. I only paid $225 for it too.

    It still hasn't been topped at its price point.
  • Makaveli - Thursday, February 25, 2010 - link

    I picked up my 4890 in Oct for $189 and still laughing about it.

    I won't bother upgrading until the successor to the 5xxx series comes out.
  • kmmatney - Thursday, February 25, 2010 - link

    I jumped on the MSI HD4890OC deal for $180 a year ago, and actually received the rebate after 4 months. Amazing that you can't spend the same amount of money and get something that performs better a year later.
  • strikeback03 - Thursday, February 25, 2010 - link

    Not really that amazing; it's what happens when there is no real competition. If Nvidia can shock the world and drop something new and good at the $200 price point, it's a good bet you will see the whole market adjust quickly.
  • Deville - Thursday, February 25, 2010 - link

    Exactly. It's silly to offer another card that performs in the range of last gen's cards. What's the point of "upgrading" if there's no upgrade?
    If it can barely keep up with last year's models, how can we expect it to do the DX11 stuff? And isn't the DX11 stuff pretty much the only reason to upgrade anyway?

    Here's the problem when comparing new versions of 5000 series cards:
    The numbering system helps, but we have precious little data to show us how DX11 even performs on these new cards.

    I love reading your shootouts, but give us DX11 benchies, please.
  • san1s - Thursday, February 25, 2010 - link

    that's exactly what I was thinking
  • gumdrops - Thursday, February 25, 2010 - link

    Where are all the DX11 game tests like DiRT 2 or Aliens vs. Predator? BattleForge is the only one, and it's unclear if the game was even run in DX11 mode on cards that support it.
