Left 4 Dead

Finally we have Left 4 Dead. Based on the venerable Source engine, it can run well even on the slowest GPUs. In the case of the 5450, it can technically stay above 30fps even with 4x anti-aliasing at 1024x768. However, it's skirting the line closely enough that disabling AA for an 18% performance boost is the better choice.

Among our low-end cards the results have been very consistent, and that doesn't change here. The 4550 comes out on top, followed by the 5450, and then the G210. The 5450 continues to deliver around half the performance of the GT 220.


  • Lifted - Thursday, February 4, 2010 - link

    The first graph on each of the benchmark pages lists a 5670, the second graph lists a 4670. Typo or are you actually using different cards?
  • Ryan Smith - Thursday, February 4, 2010 - link

    It's not a typo. We never ran the 5670 at 1024x768, there was no reason to. It's more than fast enough for at least 1280.

    The 4670 data is from the GT 240 review, which we used 1024 on (because GT 240 couldn't cut the mustard above 1024 at times).
  • 8steve8 - Thursday, February 4, 2010 - link

    should have had the clarkdale igp in there for good measure, if you aren't gaming I'd guess that igp would be the way to go
  • MrSpadge - Thursday, February 4, 2010 - link

    Would have been interesting to compare idle power consumption: Clarkie + IGP vs. Clarkie + 5450.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Testing a Clarkie requires switching out our test rig, so the results wouldn't be directly comparable since it means switching out everything including the processor. Plus, we only have a couple of Clarkies, which are currently in use for other projects.

    At this point Clarkie (and any other IGP) is still less than half as fast as 5450.
  • strikeback03 - Thursday, February 4, 2010 - link

    That brings up the point though that with a card this low on the totem pole it might be nice to include a benchmark or two of it paired with similarly low-priced hardware. I understand the reason for generally using the same testbed, but when it is already borderline playable it would be nice to know that it won't get any slower when actually paired with a cheap processor and motherboard.
  • kevinqian - Thursday, February 4, 2010 - link

    Hey Ryan, I'm glad you are the first reviewer to utilize Blaubart's very helpful deinterlacing benchmark. I would just like to note that with ATI, memory bandwidth seems to play a big part in which deinterlacing method gets used as well. For example, the HD 4650 DDR2 can only perform MA deinterlacing, even though it has the same shaders as the (VA-capable) 4670. The only bottleneck there seems to be the DDR2 memory bandwidth. On the other hand, the HD 4550 has DDR3 but is limited to a 64-bit memory interface, so that seems to be the limiting factor there.

    I have an old HD 2600Pro DDR2 AGP. When I OC the memory from 800mhz stock to 1000mhz, VA gets activated by CCC and confirmed in Cheese slices.

    Nvidia's deinterlacing algorithm seems to be less memory intensive, as even the GT220 with DDR2 is able to perform VA-like deinterlacing.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Yeah, I've seen the bandwidth idea thrown around. Unfortunately I don't have any additional suitable low-end cards for testing it.
  • ET - Thursday, February 4, 2010 - link

    I think I remember reading that the interpolation of input values in the pixel shader was moved from fixed function units to being done by the shaders.
