The Test

For the 5700 series launch, AMD issued new drivers, as the previous 8.66 driver set did not include support for these cards. The driver set we used for these cards is 8.66.6, which is from the same branch as the earlier drivers. In our own testing we haven't seen any performance differences between these drivers and the previous ones on the 5800 series cards, but AMD did note that certain configurations might see a small performance boost. As such, our results still use the original 8.66 driver for the 4000 and 5800 series.

Also, as AMD sent us a pair of 5770s, we have tested these cards in a Crossfire configuration. This configuration is largely academic, as two 5770s cost just shy of the price of a 5870 and bring with them all of the limitations of multi-GPU scaling as compared to single-GPU scaling.

On a final note, our 5750 sample is a 1GB card.

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: Intel DX58SO (Intel X58)
Chipset Drivers: Intel (Intel)
Hard Disk: Intel X25-M SSD (80GB)
Memory: Patriot Viper DDR3-1333 3 x 2GB (7-7-7-20)
Video Cards:

ATI Radeon HD 5870
ATI Radeon HD 5850
ATI Radeon HD 5770
ATI Radeon HD 5750
ATI Radeon HD 4870 X2
ATI Radeon HD 4890
ATI Radeon HD 4870 1GB
ATI Radeon HD 4850
ATI Radeon HD 3870
ATI Radeon HD 4670 512MB
NVIDIA GeForce GTX 295
NVIDIA GeForce GTX 285
NVIDIA GeForce GTX 275
NVIDIA GeForce GTX 260 Core 216
NVIDIA GeForce GTS 250
NVIDIA GeForce 8800GT

Video Drivers:

NVIDIA ForceWare 190.62
ATI Catalyst Beta 8.66
ATI Catalyst Beta 8.66.6
ATI Catalyst 9.9

OS: Windows 7 Ultimate 64-bit




  • GrizzlyAdams - Tuesday, October 13, 2009 - link

    That may be due to some architectural improvements in the 5770's shaders. The drop in performance in other games may be due to the decreased memory bandwidth, which may not matter with regards to Far Cry 2.
  • papapapapapapapababy - Tuesday, October 13, 2009 - link

    these cards are super lame... 5750, now with +80 stream processors! XD that 5750 is basically a (lower clocked!) 4770... guess what, ATI? that cost me $85 six months ago! but who cares, right? nvidia is dead, so why bother? just slap a DX11 sticker on it and raise the price, ATI?
  • The0ne - Tuesday, October 13, 2009 - link

    Just wanted to say I like the conclusion, and the suggestions and advice are dead spot on.

    I'm very surprised almost no one is talking about or bringing up the subject of DirectX. DX11 has a better chance to succeed, yet gets less attention. It's amazing how badly DX10 failed to sway consumers.
  • kmmatney - Tuesday, October 13, 2009 - link

    The problem with DX10 was that you had to buy Vista to get it...
  • MadMan007 - Tuesday, October 13, 2009 - link

    DX10 rendering paths of games that were also DX9 (meaning all of them at the time, and even now) were also *slower* and provided little to no IQ improvement. So even if it hadn't been Vista-only (and only morans keep up the Vista FUD after SP1), there was no real benefit. DX11 looks to be different in all respects.
  • Lifted - Wednesday, October 14, 2009 - link

    Yeah, get a brain!
  • Zool - Tuesday, October 13, 2009 - link

    Quite strange that with a die size of 166mm2 against 260mm2 (RV770), and with 128-bit memory, it costs this much. And the 5750 has one SIMD disabled, which even increases the number of usable chips (but maybe it's disabled just for differentiation, or else the two cards would be exactly the same except for clocks).
    Is the tessellation part with fixed units exactly the same as the 5800 series, or tuned down?
  • philosofool - Wednesday, October 14, 2009 - link

    I chalk it up to lowish 40nm yields at TSMC.
  • Spoelie - Wednesday, October 14, 2009 - link

    + higher cost per wafer than a 55nm one
    + GDDR5 prices
  • Mint - Tuesday, October 13, 2009 - link

    Unless you absolutely need to take advantage of the lower power requirements of the 40nm process (e.g. you pay a ton for power)...

    According to your tests, the 5770 consumes a whopping 48W less idle power than the 4870, and other reviews have comparable results. If your computer is out of standby a modest 10 hours a day, that works out to 175 kWh per year. That's easily $15/year even for people with cheap electricity.

    The funny thing is that I usually see people overstating the savings from power efficiency...
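The arithmetic in the comment above checks out; a quick sketch of the calculation (the electricity rate here is an illustrative assumption, not a figure from the article):

```python
# Idle-power savings cited in the comment: 5770 vs. 4870 per the review's tests.
watts_saved = 48       # W of idle-power difference
hours_per_day = 10     # assumed time the PC is out of standby

# Energy saved over a year, converted from watt-hours to kilowatt-hours.
kwh_per_year = watts_saved * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.1f} kWh/year")  # 175.2

# Illustrative rate for "cheap electricity" (assumption): ~$0.086/kWh
rate_usd_per_kwh = 0.086
print(f"${kwh_per_year * rate_usd_per_kwh:.2f}/year")  # ~$15
```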
