Crysis: Warhead

Kicking things off as always is Crysis: Warhead, the toughest game in our benchmark suite. The GeForce GTX 465 trails the Radeon HD 5850 by about 4fps at every resolution. This translates to between 91% and 82% of the 5850's performance, with the gap widening at higher resolutions; ultimately NVIDIA just misses the sweet spot at lower resolutions. Meanwhile the minimum framerates are almost tied with the 5850, which is roughly what we expect given that the GTX 465 doesn't have the memory capacity advantage of the GTX 480 and 470.

Meanwhile compared to the GTX 470, the GTX 465 is between 20% and 27% slower in average FPS, and 18% to 32% slower in minimum framerates.



Comments

  • MadMan007 - Monday, May 31, 2010 - link

    Yeah, I've got to agree. It's one thing not to rerun benchmarks with new drivers on older cards, or on cards well outside the intended competitive envelope, but not redoing new cards that might benefit greatly from the new drivers just seems lazy.
  • Greenhell6 - Monday, May 31, 2010 - link

    My 4870X2 is still rocking out!!! Faster than the 5870 in some tests and on par with the GTX 480. Can't complain since I picked it up for next to nothing.
  • SunSamurai - Monday, May 31, 2010 - link

    What's the power cost to run those again? ;)
  • Greenhell6 - Monday, May 31, 2010 - link

    Very little for me, since I only use my 4870X2 for gaming and nothing else - I have another rig built from low-power components for everything else. And now with two kids, my gaming computer gets turned on less and less. You have to admit it, though: the 4870X2 numbers are still very impressive. Rocks your face off....
  • Nighteye2 - Monday, May 31, 2010 - link

    In cases like this, it would be good if power costs are also taken into account. If a card is cheaper but uses a lot more power, you may still end up paying more through your electricity bill.

    Why not make a comparison based on lifetime costs, rather than only purchase price? You can estimate lifetime costs by adding the power usage over about 1000 hours on full load and 500 hours idle - and that'd probably still be a low estimate for some people.
  • C'DaleRider - Monday, May 31, 2010 - link

    OK.....but whose power costs are you going to use as a metric? Massachusetts? California? Idaho? Georgia? Michigan?

    Rural or urban?

    Or should they be limited to U.S. figures only as this site is read internationally?

    So, I guess AT should post your request with power figures from every state in the U.S., urban and rural averages per state, Canada by province, Mexico, Germany, Italy, England, France, Spain, and Turkey. Hope that's enough coverage for you.

    On the other hand, I'd personally think that anyone with two functional brain cells can make a determination that consuming 100W of extra power will cost more to run. Simple point to make as this review made it.
  • Nighteye2 - Monday, May 31, 2010 - link

    Just use the US average cost/kWh. The prices in the articles are also in USD, so that would be most convenient. It doesn't need to be accurate to the cent; just give a decent indication/estimate.

    Oh, and I suspect global power costs are similar enough for such a figure to be meaningful to all readers globally (I'm not from the US, btw).
  • Apocy - Monday, May 31, 2010 - link

    If you are so interested in power costs, here is the basic calculation (bear in mind I live in Bulgaria):
    Daytime power cost - $0.12/kWh (US dollars)
    Idle power gap - 9W
    Load power gap - 105W

    Given we use the system only during daytime, the break-even points are:
    Idle - 185,185.2 hours
    Load - 15,873.0 hours

    So on average you will need 700 days to pay off that $20 by saving power with the 5850.
  • rbnielse - Monday, May 31, 2010 - link

    Your calculations are off by an order of magnitude.

    It's actually 1587 hours (load) to pay off the $20 price gap, and frankly that's not a lot.

    Most people who buy a high-end graphics card like this are going to play at least 20 hours per week, which means they'd even out the costs after a year and a half at most. So for the majority of buyers, I'd say the GTX 465 is actually more expensive than the HD 5850, in addition to being slower and louder.
  • Apocy - Tuesday, June 1, 2010 - link

    Yep, you are right. I entered $20 as 20,000 cents instead of 2,000, which is why my numbers are 10 times too high.
    So OK then, roughly 70 days :)

    In reality, if you consider the power savings, yes, the 5850 will become cheaper over time. Guessing the average user will hit that 70-day playtime mark around the 8th month if they play an average of 6-8 hours a day :)

    So yeah, in conclusion, this card is totally useless unless you want PhysX and 3D Vision.
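The break-even arithmetic the commenters settle on can be sketched in a few lines. The figures are the thread's own assumptions, not measured data: a $20 price gap, a 9W idle and 105W load power gap, and $0.12/kWh electricity; the `break_even_hours` helper is purely illustrative.

```python
def break_even_hours(price_gap_usd, power_gap_watts, cost_per_kwh):
    """Hours of use before the extra power draw costs as much as the price gap."""
    kwh_per_hour = power_gap_watts / 1000.0       # convert W to kW
    return price_gap_usd / (kwh_per_hour * cost_per_kwh)

# Thread's assumptions: $20 price gap, 105W load / 9W idle gap, $0.12/kWh
load_hours = break_even_hours(20.0, 105.0, 0.12)
idle_hours = break_even_hours(20.0, 9.0, 0.12)

print(f"Load: {load_hours:.0f} hours")   # ~1587 hours, matching rbnielse's correction
print(f"Idle: {idle_hours:.0f} hours")   # ~18519 hours
```

At 20 hours of gaming per week, 1587 hours works out to about a year and a half, which is where the "more expensive over its lifetime" conclusion comes from.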
