Crysis, Metro, DiRT 3, Shogun 2, & Batman: Arkham City

Our first graphics test is Crysis: Warhead, which in spite of its relatively high system requirements is the oldest game in our test suite. Crysis was the first game to really make use of DX10, and it set a very high bar for modern games that still hasn't been completely cleared. And while its age means it's not heavily played these days, it's a great reference for how far GPU performance has come since 2008.

NVIDIA’s internal guidance on GT 640 DDR3 performance is that it should beat the Radeon HD 6670 by around 20%; however, Kepler’s poor performance under Crysis means that isn’t going to happen here. At 31fps at 1680 with Mainstream quality the GT 640 is just barely playable, with the Radeon HD 7750-800 (a similarly sub-75W card) more than doubling its performance. Worse, the GT 640 actually loses to the GT 240, with NVIDIA’s two-generation-old card beating it by 14%. To NVIDIA’s credit this happens to be the only test where that occurs, but it does a great job driving home the point that the GT 640 is heavily handicapped by DDR3.

On that note, given what we’ve seen from Kepler so far with the GK104 cards and now with the GK107-based GT 640, this further reinforces the idea that Crysis is above all else a memory bandwidth hungry test. The GTX 680 failed to greatly improve upon the GTX 580 when the two had similar amounts of memory bandwidth, and while the GT 640 does improve upon the GT 440 by quite a bit at times, the fact that it loses to the GDDR5 GT 240 lends further credence to that theory. It will be interesting to see what happens here once we do see a GDDR5 card.
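
For a rough sense of the gap, peak memory bandwidth is simply the effective memory data rate multiplied by the bus width. The back-of-the-envelope sketch below uses commonly listed reference specs for these cards (128-bit buses across the board, ~1.8GHz effective DDR3 on the GT 640 and GT 440, ~3.4GHz effective GDDR5 on the GT 240); exact clocks vary by board partner, so treat the results as approximations rather than measured figures.

```python
# Back-of-the-envelope: peak bandwidth (GB/s) = data rate (MT/s) * bus width (bytes) / 1000.
# Card specs below are assumed from commonly listed reference figures; partner boards may differ.
cards = {
    "GT 640 DDR3":  {"data_rate_mtps": 1782, "bus_bits": 128},
    "GT 440 DDR3":  {"data_rate_mtps": 1800, "bus_bits": 128},
    "GT 240 GDDR5": {"data_rate_mtps": 3400, "bus_bits": 128},
}

for name, spec in cards.items():
    gbps = spec["data_rate_mtps"] * (spec["bus_bits"] / 8) / 1000
    print(f"{name:13s} ~{gbps:.1f} GB/s")

# GT 640 DDR3   ~28.5 GB/s
# GT 440 DDR3   ~28.8 GB/s
# GT 240 GDDR5  ~54.4 GB/s
```

With roughly half the memory bandwidth of the GDDR5-equipped GT 240, it's easy to see how a bandwidth-hungry title like Crysis can leave the GT 640's extra shaders and ROPs underfed.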

Finally, while we’ve focused thus far on the GT 640’s poor performance relative to its current competition, it’s not all bad news for NVIDIA. At the Performance and Gamer quality settings in particular the GT 640 improves upon the GT 440 by a rather impressive 48% despite the two cards having similar memory bandwidth, reflecting just how much of an impact doubling the shader performance and quadrupling the ROP count can have.

On that note, as this happens to be one of only a couple of games where our test settings overlap our iGPU test settings, we’ve also thrown in our Intel HD Graphics numbers. The CPUs aren’t identical (all of our dGPU testing is on SNB-E), but we’re GPU limited to such a large degree that it doesn’t make a practical difference. NVIDIA wants to sell the GT 640 as an upgrade to i3/i5 systems with Intel’s HD Graphics, and while its performance may be lacking compared to its competition, at the very least the GT 640 handily surpasses any iGPU. Intel’s decision to ship most desktop IVB CPUs with the HD 2500 means that the GT 640 can nearly quadruple the IVB GPU’s performance under Crysis.

Looking at the minimum framerates the story is much the same. The GT 640 is well behind the 7750 and similar cards. At best it manages to beat the GT 240, most likely due to the latter’s limited VRAM (it only has 512MB).

Metro: 2033

Paired with Crysis as our second behemoth FPS is Metro: 2033. Metro gives up Crysis’ lush tropics and frozen wastelands for an underground experience, but even underground it can be quite brutal on GPUs, which is why it’s also our new benchmark of choice for looking at power/temperature/noise during a game. If its sequel due this year is anywhere near as GPU intensive then a single GPU may not be enough to run the game with every quality feature turned up.

Metro: 2033 - 1680x1050 - DX10 Medium Quality + 16xAF

Relative to its competition the GT 640 improves slightly over what we saw in Crysis, but it’s still trailing the Radeon HD 6670, never mind the 7750. Performance has improved by over 50% relative to the GT 440, which is enough to push past 30fps at 1680 with medium quality settings, but that’s as much as NVIDIA is going to get out of the GT 640 here.

DiRT 3

DiRT 3 is our next DX11 game. Developer Codemasters Southam added DX11 functionality to their EGO 2.0 engine back in 2009 with DiRT 2, and while it doesn't make extensive use of DX11, it does use it to good effect, applying tessellation to certain environmental models and utilizing a better ambient occlusion lighting model. As a result DX11 functionality is very cheap from a performance standpoint.

DiRT 3 is traditionally a game that favors NVIDIA, and while the GT 640 finally surpasses the 6670, it’s by no means a great showing. Again it’s handily beaten by the 7750 and GTS 450, and for as light as DiRT 3 is, we still can’t even break 40fps at 1680 with Ultra quality and no AA. To achieve 60fps it’s necessary to turn things down to Medium quality. Elsewhere performance relative to the GT 440 has increased by nearly 60%, which is a big jump for NVIDIA but not enough to surpass their competition.

Total War: Shogun 2

Total War: Shogun 2 is the latest installment of the long-running Total War series of turn-based strategy games, and alongside Civilization V it is notable for just how many units it can put on screen at once.

Under Shogun 2 the story is much the same. The GT 640’s performance relative to the anemic GT 440 has greatly improved, jumping by upwards of 50%, but it trails everything faster than the Radeon HD 6670. At the very least Shogun 2 is a relatively non-intensive game at 1680, so even at high quality the GT 640 still achieves better than 40fps.

Batman: Arkham City

Batman: Arkham City is loosely based on Unreal Engine 3, with the DirectX 11 functionality apparently developed in-house. With the addition of these features Batman is a far more GPU-demanding game than its predecessor was, particularly with tessellation cranked up to high.

Batman: Arkham City - 1680x1050 - High Quality + FXAA-Low

Batman: Arkham City is another game that traditionally favors NVIDIA’s GPUs, but again that isn’t much of a help here. At 1680 with Very High quality the GT 640 can just crack 30fps, and if we drop down to High quality that becomes a far more playable 51fps. At the same time it greatly trails the usual suspects in every configuration, and even the 6670 pulls ahead at High quality. Even the performance gains relative to the GT 440 have tapered off a bit, with the GT 640 only picking up 34% at High quality.

Comments

  • cjs150 - Thursday, June 21, 2012

    "God forbid there be a technical reason for it.... "

    Intel and Nvidia have had several generations of chips to fix any technical issues and didn't (HD4000 is good enough though). AMD have been pretty close to the correct frame rate for a while.

    But it is not enough to have the capability to run at the correct frame rate if you make it too difficult to change the frame rate to the correct setting. That is not a hardware issue, just bad software design.
  • UltraTech79 - Wednesday, June 20, 2012

    Anyone else really disappointed in 4K still being standardized around 24 fps? I thought 60 would be the minimum standard by now, with 120 in higher-end displays. 24 is crap. Anyone that has seen a movie recorded at 48+ FPS knows what I'm talking about.

    This is like putting shitty unleaded gas into a super high-tech racecar.
  • cjs150 - Thursday, June 21, 2012

    You do know that Blu-ray is displayed at 23.976 FPS? That looks very good to me.

    Please do not confuse screen refresh rates with frame rates. Screen refresh on most large TVs runs at between 60 and 120 Hz; anything below 60 tends to look crap. (If you want real crap, try running American TV on a European PAL system - I mean crap in a technical sense, not creatively!)

    I must admit that having an fps of 23.976 rather than some round number such as 24 (or higher) FPS is rather daft, and some new films are coming out with much higher FPS. I have a horrible recollection that the reason for such an odd FPS is very historic - something to do with the length of 35mm film that would be needed per second. The problem is I cannot remember whether that was simply because 35mm film was expensive and it was the minimum to provide smooth movement, or whether it goes right back to the days when film had a tendency to catch light and it was the maximum speed you could put a film through a projector without friction causing the film to catch light. No doubt there is an expert on this site who could explain precisely why we ended up with such a silly number as the standard.
  • UltraTech79 - Friday, June 22, 2012

    You are confusing things here. I clearly said 120 (fps) would need higher-end displays (120Hz). I was rounding up 23.976 FPS to 24, give me a break.

    That it looks good /to you/ is wholly irrelevant. Do you realize how many people said "it looks very good to me" referring to SD when resisting the HD movement? Or how many will say it again referring to 1080p, thinking 4K is too much? It's a ridiculous mindset.

    My point was that we are upping the resolution but leaving another very important aspect in the dust that we need to improve. Even audio is moving faster than framerates in movies, and now that most places are switching to digital, the cost to go to the next step has dropped dramatically.
  • nathanddrews - Friday, June 22, 2012

    It was NVIDIA's choice to only implement 4K @ 24Hz (23.xxx) due to limitations of HDMI. If NVIDIA had optimized around DisplayPort, you could then have 4K @ 60Hz.

    For computer use, anything under 60Hz is unacceptable. For movies, 24Hz has been the standard for a century - all film is 24fps and most movies are still shot on film. In the next decade, there will be more and more films that will use 48, 60, even 120fps. Cameron was cock-blocked by the studio when he wanted to film Avatar at 60fps, but he may get his wish for the sequels. Jackson is currently filming The Hobbit at 48fps. Eventually all will be right with the world.
  • karasaj - Wednesday, June 20, 2012

    If we wanted to use this to compare a 640M or 640M LE to the GT 640, is this doable? If it's built on the same chip (both have 384 CUDA cores), can we just reduce the numbers by a rough % of the core clock speed to get rough numbers that the respective cards would put out? I.e. the 640M LE has a clock of 500MHz and the 640M is ~625MHz. Could we expect ~55% of this for the 640M LE and 67% for the 640M? Assuming DDR3 on both so as not to have that kind of difference.
  • Ryan Smith - Wednesday, June 20, 2012

    It would be fairly easy to test a desktop card at a mobile card's clocks (assuming memory type and functional unit count were equal), but you can't extrapolate performance like that because there's more to performance than clockspeeds. In practice performance shouldn't drop by that much since we're already memory bandwidth bottlenecked with DDR3.
  • jstabb - Wednesday, June 20, 2012

    Can you verify if creating a custom resolution breaks 3D (frame-packed) Blu-ray playback?

    With my GT430, once a custom resolution has been created for 23/24Hz, that custom resolution overrides the 3D frame-packed resolution created when 3D Vision is enabled. The driver appears to use simple fall-through logic: if a custom resolution is defined for the selected resolution/refresh rate it is always used; failing that it will use a 3D resolution if one is defined; failing that it will use the default 2D resolution.

    This issue made the custom resolution feature useless to me with the GT430 and pushed me to an AMD solution for their better OOTB refresh rate matching. I'd like to consider this card if the issue has been resolved.

    Thanks for the great review!
  • MrSpadge - Wednesday, June 20, 2012

    It consumes just about as much power as the HD 7750-800, yet performs miserably in comparison. This is an amazing win for AMD, especially comparing the GTX 680 and HD 7970!
  • UltraTech79 - Wednesday, June 20, 2012

    This performs about as well as an 8800 GTS for twice the price, or offers half the performance of a GTX 460 for the same price.

    These should have been priced at $59.99.
