The Witcher

The Witcher continues the trend of the 4870 surpassing the 9800 GTX+ by a wide margin and edging out the more expensive GTX 260. It even blows well past its own brother, the 4850. Here the 4870's additional memory bandwidth makes itself very prominent, delivering a 39% boost in performance over the 4850, well beyond what the improvement in core speed alone would account for. Although both cards offer framerates we'd consider playable at our stock resolution of 1920x1200, the 4870 is definitely the much more comfortable choice, with plenty of headroom for features such as anti-aliasing beyond just 2x.
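
As a quick sanity check on that claim, here is a back-of-the-envelope sketch (illustrative only; the 625MHz and 750MHz core clocks are the cards' reference specifications, and the 39% figure is the result measured above) showing that the core clock difference explains only about half of the gain:

    # Rough check: how much of the 4870's gain over the 4850 does core clock explain?
    # Reference core clocks in MHz; the 39% gain is the measured result above.
    hd4850_core = 625
    hd4870_core = 750

    core_clock_gain = hd4870_core / hd4850_core - 1   # ~0.20, i.e. roughly a 20% uplift
    measured_gain = 0.39                               # observed gain of the 4870 over the 4850

    # The remainder is largely attributable to the 4870's GDDR5 memory bandwidth advantage.
    print(f"Core clock alone: {core_clock_gain:.0%}; measured: {measured_gain:.0%}")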

Finally, it's interesting to note that the 4870 and the 3870 X2 are neck-and-neck until we finally crank up the resolution to 2560x1600, at which point the 3870 X2 pulls ahead. This is not what we would have expected; the HD4000 series seems to scale just a bit worse with resolution than either NVIDIA's cards or the HD3000 series.



Comments

  • Final Destination II - Wednesday, June 25, 2008 - link

    Dear girls and guys,

    does anyone know of a manufacturer who offers an HD4850 with a better cooler? I'm desperately searching for one...


    Please reply!
  • Graven Image - Wednesday, June 25, 2008 - link

    Asus recently announced a 4850 with a non-stock cooler, though their version still doesn't expel the air out the back like a dual-slot design (http://www.asus.com/news_show.aspx?id=11871). It's not available yet, though. My guess is mid-July we'll probably start seeing a couple of different fan and heatsink designs.
  • strikeback03 - Thursday, June 26, 2008 - link

    The only dual-slot card I've ever used was an EVGA 8800GTS 640; it sucked air in at the back and blew it into the case.
  • Final Destination II - Wednesday, June 25, 2008 - link

    Nice! 7°C cooler, that's a start! I guess I'll wait a bit more, then.
  • Spacecomber - Wednesday, June 25, 2008 - link

    Although I'm somewhat dubious about dual card solutions, I keep looking at the benchmarks and then at the prices for a couple of 8800 GTs.

    Perhaps, if the 4870 forces Nvidia to reduce their prices for the GTX 260 and the GTX 280, they will likewise bring down the price for the 9800 GX2. This is already the fastest single card solution, and it sells for less than the GTX 280. If this card starts selling for under $400 (maybe around $350), will this become Nvidia's best answer to the 4870?

    Given the performance and prices of the 4870 and the 9800 GX2, will Nvidia be able to price the GTX 280 competitively, or will it simply be a vanity product, ridiculously priced and produced only in very small numbers?

    It should be interesting to see where the prices for video cards end up over the course of the next few weeks.
  • kelmerp - Wednesday, June 25, 2008 - link

    Better HD knickknacks? Better offloading/upscaling?
  • chizow - Wednesday, June 25, 2008 - link

    The HD4000 series has better HDMI sound support with 8ch LPCM over HDMI, but still can't pass uncompressed bitstreams. Image quality hasn't changed, as there isn't really any room to improve.
  • kelmerp - Wednesday, June 25, 2008 - link

    It would be nice to have a video card where, no matter how weak the current-gen processor is (say the lowliest Celeron available), the card can still output 1080p HDTV without dropping any frames.
  • Chaser - Wednesday, June 25, 2008 - link

    Good to have AMD back at the FRONT of the finish line.
  • JPForums - Wednesday, June 25, 2008 - link

    Regarding the SLI scaling in The Witcher:
    The GTX 280 SLI setup may be running into a bottleneck or driver issues rather than inherent scaling problems. Consider that the 9800 GTX+ SLI setup scales from 22.9 to 44.5, so this isn't an inherent SLI scaling problem. Though it may point to scaling issues specific to the GTX 280, it is more likely that the problem lies elsewhere. I do, however, agree with your general statement that when CF is working properly, it tends to scale better. In my systems, it seems to require less CPU overhead.
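
    For reference, the 9800 GTX+ numbers quoted above work out to nearly ideal two-card scaling (a quick illustrative calculation, assuming 22.9 and 44.5 are the single-card and SLI results):

        # SLI scaling implied by the 9800 GTX+ figures quoted above (fps).
        single_card_fps = 22.9
        sli_fps = 44.5

        scaling = sli_fps / single_card_fps   # ~1.94x, close to the ideal 2.0x
        print(f"9800 GTX+ SLI scaling: {scaling:.2f}x over a single card")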
