Now that we have some hardware in our hands and NVIDIA has formally launched the GeForce GTX 295, we are very interested in putting it to the test. NVIDIA's bid to reclaim the halo is quite an interesting one. If you'll remember from our earlier article on the hardware, the GTX 295 is a dual-GPU card whose two chips combine aspects of the GTX 280 and the GTX 260. The expectation is that this card will fall between GTX 280 SLI and GTX 260 Core 216 SLI.

As for the GTX 295, the GPUs have the TPCs (shader hardware) of the GTX 280 with the memory and pixel power of the GTX 260. This hybrid design gives it lots of shader horsepower but less RAM and raw pixel-pushing capability than GTX 280 SLI. It should perform faster than GTX 260 SLI but slower than GTX 280 SLI. Here are the specs:

Our card looks the same as the one in the images provided by NVIDIA that we posted in December. It's notable that the GPUs are built at 55nm and are clocked at the speed of a GTX 260 despite having the shader power of the GTX 280 (x2).

We've also got another part coming down the pipe from NVIDIA. The GeForce GTX 285 is a 55nm part that amounts to an overclocked GTX 280. Although we don't have any in house yet, this new card was announced on January 8 and will be available for purchase on January 15, 2009.

There isn't much to say on the GeForce GTX 285: it is an overclocked 55nm GTX 280. The clock speeds compare as follows:

 

           Core Clock (MHz)   Shader Clock (MHz)   Memory Data Rate (MHz)
GTX 280    602                1296                 2214
GTX 285    648                1476                 2484

 

We don't have performance data for the GTX 285 yet, but expect it (like the GTX 280 and GTX 295) to be necessary only with very large displays.
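To put those clocks in perspective, here is a quick sketch (in Python; the variable names are our own) of the percentage uplift the GTX 285's clocks represent over the GTX 280, using the figures from the table above:

```python
# Clock speeds from the GTX 280 vs. GTX 285 table (MHz).
gtx280 = {"core": 602, "shader": 1296, "memory_data_rate": 2214}
gtx285 = {"core": 648, "shader": 1476, "memory_data_rate": 2484}

def uplift_pct(old, new):
    """Percent increase of `new` over `old`."""
    return (new - old) / old * 100.0

for key in gtx280:
    print(f"{key}: +{uplift_pct(gtx280[key], gtx285[key]):.1f}%")
```

Running the numbers gives roughly a 7.6% core, 13.9% shader, and 12.2% memory uplift, so "overclocked GTX 280" is a fair characterization: same silicon, meaningfully higher clocks across the board.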

                         GTX 295       GTX 285     GTX 280      GTX 260 Core 216   GTX 260      9800 GTX+
Stream Processors        2 x 240       240         240          216                192          128
Texture Address/Filter   2 x 80 / 80   80 / 80     80 / 80      72 / 72            64 / 64      64 / 64
ROPs                     28            32          32           28                 28           16
Core Clock               576MHz        648MHz      602MHz       576MHz             576MHz       738MHz
Shader Clock             1242MHz       1476MHz     1296MHz      1242MHz            1242MHz      1836MHz
Memory Clock             999MHz        1242MHz     1107MHz      999MHz             999MHz       1100MHz
Memory Bus Width         2 x 448-bit   512-bit     512-bit      448-bit            448-bit      256-bit
Frame Buffer             2 x 896MB     1GB         1GB          896MB              896MB        512MB
Transistor Count         2 x 1.4B      1.4B        1.4B         1.4B               1.4B         754M
Manufacturing Process    TSMC 55nm     TSMC 55nm   TSMC 65nm    TSMC 65nm          TSMC 65nm    TSMC 55nm
Price Point              $500          $???        $350 - $400  $250 - $300        $250 - $300  $150 - $200
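One derived figure worth pulling out of the table is peak memory bandwidth. The sketch below (Python; function and card names are our own) computes it from the memory clock and bus width; GDDR3 transfers data on both clock edges, so the effective data rate is twice the listed clock:

```python
# Peak theoretical memory bandwidth from the spec table above.
# bandwidth = (memory clock x 2 for DDR) x (bus width in bytes)
def mem_bandwidth_gbps(clock_mhz, bus_bits):
    """Peak bandwidth in GB/s (1 GB/s = 1000 MB/s here)."""
    return clock_mhz * 2 * (bus_bits / 8) / 1000.0

cards = {
    "GTX 295 (per GPU)": (999, 448),
    "GTX 285": (1242, 512),
    "GTX 280": (1107, 512),
    "GTX 260": (999, 448),
}
for name, (clk, bus) in cards.items():
    print(f"{name}: {mem_bandwidth_gbps(clk, bus):.1f} GB/s")
```

This works out to about 111.9 GB/s per GPU for the GTX 295 versus 141.7 GB/s for a GTX 280, which illustrates the trade-off: each GTX 295 GPU has the GTX 280's shader count but only the GTX 260's memory subsystem.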

This article will focus heavily on the performance of the GeForce GTX 295, as we've already covered the basic architecture and specifications. We will recap them and cover the card itself on the next page, but for more detail see our initial article on the subject.

The Test

Test Setup
CPU                Intel Core i7-965 3.2GHz
Motherboard        ASUS Rampage II Extreme X58
Video Cards        ATI Radeon HD 4870 X2
                   ATI Radeon HD 4870 1GB
                   NVIDIA GeForce GTX 295
                   NVIDIA GeForce GTX 280 SLI
                   NVIDIA GeForce GTX 260 SLI
                   NVIDIA GeForce GTX 280
                   NVIDIA GeForce GTX 260
Video Drivers      Catalyst 8.12 hotfix
                   ForceWare 181.20
Hard Drive         Intel X25-M 80GB SSD
RAM                6 x 1GB DDR3-1066 7-7-7-20
Operating System   Windows Vista Ultimate 64-bit SP1
PSU                PC Power & Cooling Turbo Cool 1200W

Age of Conan Performance
100 Comments

  • Gasaraki88 - Monday, January 12, 2009 - link

    It's stupid to get this card if you don't have a 30" monitor and a high-end CPU. They are testing the video card here, not CPUs. Testing on a slower CPU would just show every card pegged at the same frame rate.

    This review was fine, thanks. =)
  • SiliconDoc - Thursday, January 15, 2009 - link

    Gee, suddenly the endlessly bragged-about "folding" means absolutely zero (ATI cards suck at it BTW)... and you've discounted CUDA, and forgotten about PhysX... and spit in the face of hundreds of thousands of single PCI-e 16x motherboard owners.
    Now go to the 4870x2 reviews and type that same crap you typed above - because I KNOW none of you were saying it THEN on the 4870x2 reviews...
    In fact, I WAS THE ONLY ONE who said it at those reviews... the main reason BEING THE 4870X2 WAS RECEIVING ENDLESS PRAISE FOR WINNING IN THAT ONE 2560X resolution....
    Yes, they couldn't stop lauding it up over how it excelled at 2560x - oh the endless praise and the fanboys drooling and claiming top prize...
    Now all of a sudden, when the red card gets SPANKED hard....
    ____________________________________________________________

    Yes, when I posted it at the red reviews - I didn't have folding or PhysX to fall back on... to NEGATE that... not to mention EXCELLENT dual-GPU usage and gaming profiles OUT OF THE BOX.

    The facts are, the 4870x2 had better be at least 100 bucks cheaper, or more - who wants all the hassles?
  • TheDoc9 - Tuesday, January 13, 2009 - link

    It wasn't fine for me, and I don't believe that this card should only be purchased by those with a 30" monitor and a bleeding-edge CPU. Someone with a fast Core proc might be able to find some use for this product vs. the next slowest alternative. Prove to me I'm wrong.
  • Nfarce - Tuesday, January 13, 2009 - link

    Ok. I can look at many data results on this website with the GTX280 paired with an i7 and a stock clocked E8500 and do some interpolation of said data into the results here.

    See Exhibit A from the Jan. 8 article on AMD's Phenom II X4 940 & 920 using a single GTX280:

    Crysis Warhead @ 1680x1050 (mainstream quality, enthusiast on) results:

    Stock E7200 @ 2.53 GHz-> 66.2 fps
    Stock E8600 @ 3.30 GHz-> 84.0 fps
    Stock i965 @ 3.20 GHz-> 86.8 fps

    Now back to this report with the same game resolution (but using gamer quality with enthusiast on) with a single GTX280:

    Stock E7200 @ 2.53 GHz -> ???
    Stock E8600 @ 3.30 GHz -> ???
    Stock i965 @ 3.20 GHz -> 36.6 fps

    Now using the GTX295:

    Stock E7200 @ 2.53 GHz -> ???
    Stock E8600 @ 3.30 GHz -> ???
    Stock i965 @ 3.20 GHz -> 53.1fps

    With the above data, it shouldn't take an M.I.T. PhD to come up with a reasonable estimate of the potential with slower CPUs and lower resolutions.
  • TheDoc9 - Wednesday, January 14, 2009 - link

    That actually is informative.
  • A5 - Monday, January 12, 2009 - link

    If you're not playing at 25x16, this card isn't going to make anything playable that isn't already on an existing, cheaper solution.

    In that same vein, the people who will drop $500 on a video card will most likely have a high-end CPU - there isn't a reason to test it on systems that aren't the top of the heap. They're testing the card, not the whole system - the conclusion to be made is that on any given set of hardware, Card X will outperform Card Y.
  • SiliconDoc - Monday, January 12, 2009 - link

    Just like the bloated pig 4870x2 has been for so many months - a costly, outlandish, unnecessary, wild good-for-nothing at a bad price point, unless you're playing at 2560x - and let's add, at that level it's sometimes not even playable framerates anyway.
    Glad to see people have finally come to realize what a piece of crap the 4870x2 solution is - thanks NVidia for finally straightening so many out.
    This is great.
  • Hxx - Tuesday, January 13, 2009 - link

    you obviously have no clue about video cards, or you cannot afford one, which explains your attitude. First off, the 4870 X2 is an awesome card, much faster than any other card available except the 295. Second, it is reasonably priced at $400 after MIR, which is not bad for a high-end product. This card can run every game out there just as well as the GTX 295. There is no difference between the two, because once a game runs at 35 fps or above you will not notice a difference. In other words, the 16 fps difference in COD5 between the cards has no value because the game plays fine at 40 fps. The GTX 295 priced at $500 is a waste of money. And BTW, I am not an ATI fanboy - I own a GTX 280.
  • strikeback03 - Wednesday, January 14, 2009 - link

    The 4870x2 launched at $550, so unless you need a new card RIGHT NOW you can wait until the initial rush on the GTX295 is over and the price settles down some.
  • SiliconDoc - Tuesday, January 13, 2009 - link

    So tell me about the forced dual mode in ATI ? Oh that's right mr know it all, they don't have that.
    SLI does.
    Yes, I obviously know nothing, but you made a fool of yourself.
    BTW - please give me the link for the $400.00 4870x2 - because I will go buy one - then post all the driver and crash issues.

    Waiting....
