Now that we have hardware in hand and NVIDIA has formally launched the GeForce GTX 295, we are eager to put it to the test. NVIDIA's bid to reclaim the halo is an interesting one. As you may remember from our earlier article on the hardware, the GTX 295 is a dual-GPU card whose two chips combine aspects of the GTX 280 and the GTX 260. The expectation is that this card will fall between GTX 280 SLI and GTX 260 Core 216 SLI.

As for the GTX 295, its GPUs have the TPCs (shader hardware) of the GTX 280 with the memory and pixel power of the GTX 260. This hybrid design gives it lots of shader horsepower, but less RAM and raw pixel-pushing capability than GTX 280 SLI. The card should therefore perform faster than GTX 260 SLI but slower than GTX 280 SLI. Here are the specs:

Our card looks the same as the one in the images NVIDIA provided, which we posted in December. Notably, the GPUs are built at 55nm and are clocked at the speed of a GTX 260 despite having the shader power of the GTX 280 (x2).

We've also got another part coming down the pipe from NVIDIA. The GeForce GTX 285 is a 55nm part that amounts to an overclocked GTX 280. Although we don't have one in house yet, the card was announced on January 8th and will be available for purchase on January 15, 2009.

There isn't much to say about the GeForce GTX 285: it is an overclocked 55nm GTX 280. The clock speeds compare as follows:

 

          Core Clock (MHz)   Shader Clock (MHz)   Memory Data Rate (MHz)
GTX 280   602                1296                 2214
GTX 285   648                1476                 2484

 

We don't have performance data for the GTX 285 yet, but expect it (like the GTX 280 and GTX 295) to be necessary only with very large displays.
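For a rough sense of how much headroom those new clocks buy, the percentage gains can be computed directly from the table above. A quick sketch (purely illustrative, using only the figures already listed):

```python
# Percent clock increases of the GTX 285 over the GTX 280,
# computed from the clock table above (all values in MHz).
gtx280 = {"core": 602, "shader": 1296, "memory_data_rate": 2214}
gtx285 = {"core": 648, "shader": 1476, "memory_data_rate": 2484}

for domain in gtx280:
    gain = (gtx285[domain] / gtx280[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
```

This works out to roughly +7.6% core, +13.9% shader, and +12.2% memory, so the shader and memory domains got a noticeably larger bump than the core.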

  GTX 295 GTX 285 GTX 280 GTX 260 Core 216 GTX 260 9800 GTX+
Stream Processors 2 x 240 240 240 216 192 128
Texture Address / Filtering 2 x 80 / 80 80 / 80 80 / 80 72 / 72 64 / 64 64 / 64
ROPs 28 32 32 28 28 16
Core Clock 576MHz 648MHz 602MHz 576MHz 576MHz 738MHz
Shader Clock 1242MHz 1476MHz 1296MHz 1242MHz 1242MHz 1836MHz
Memory Clock 999MHz 1242MHz 1107MHz 999MHz 999MHz 1100MHz
Memory Bus Width 2 x 448-bit 512-bit 512-bit 448-bit 448-bit 256-bit
Frame Buffer 2 x 896MB 1GB 1GB 896MB 896MB 512MB
Transistor Count 2 x 1.4B 1.4B 1.4B 1.4B 1.4B 754M
Manufacturing Process TSMC 55nm TSMC 55nm TSMC 65nm TSMC 65nm TSMC 65nm TSMC 55nm
Price Point $500 $??? $350 - $400 $250 - $300 $250 - $300 $150 - $200
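The memory clocks and bus widths in the table translate directly into peak memory bandwidth: GDDR3 transfers data twice per clock, so bandwidth is clock × 2 × (bus width in bytes). A small sketch using only the figures from the table (per GPU for the GTX 295):

```python
# Peak memory bandwidth from the spec table. GDDR3 moves two transfers
# per clock, so bandwidth (GB/s) = clock_mhz * 2 * (bus_bits / 8) / 1000.
def bandwidth_gbps(clock_mhz: int, bus_bits: int) -> float:
    return clock_mhz * 2 * (bus_bits / 8) / 1000

cards = {
    "GTX 285":           (1242, 512),
    "GTX 280":           (1107, 512),
    "GTX 295 (per GPU)": (999, 448),
    "GTX 260":           (999, 448),
    "9800 GTX+":         (1100, 256),
}

for name, (clock, bus) in cards.items():
    print(f"{name}: {bandwidth_gbps(clock, bus):.1f} GB/s")
```

That puts the GTX 285 at about 159 GB/s versus 141.7 GB/s for the GTX 280, while each GPU on the GTX 295 gets the GTX 260's 111.9 GB/s.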

This article will focus heavily on the performance of the GeForce GTX 295, as we've already covered the basic architecture and specifications. We will recap them and cover the card itself on the next page, but for more detail see our initial article on the subject.

The Test

Test Setup
CPU Intel Core i7-965 3.2GHz
Motherboard ASUS Rampage II Extreme X58
Video Cards ATI Radeon HD 4870 X2
ATI Radeon HD 4870 1GB
NVIDIA GeForce GTX 295
NVIDIA GeForce GTX 280 SLI
NVIDIA GeForce GTX 260 SLI
NVIDIA GeForce GTX 280
NVIDIA GeForce GTX 260
Video Drivers Catalyst 8.12 hotfix
ForceWare 181.20
Hard Drive Intel X25-M 80GB SSD
RAM 6 x 1GB DDR3-1066 7-7-7-20
Operating System Windows Vista Ultimate 64-bit SP1
PSU PC Power & Cooling Turbo Cool 1200W

Age of Conan Performance
100 Comments

  • MadMan007 - Monday, January 12, 2009 - link

    The HD4000s were and are certainly great bang for the buck, but as for not using too much power... not so much. The idle power draw was behind NV, and the load power draw is generally in proportion to performance.
  • SiliconDoc - Monday, January 12, 2009 - link

    The GTX260 BEATS the 4870 in power consumption - as in IT'S LOWER for the GTX260.
    Just like the GTX295 BEATS the 4870x2 in power usage.
    In the 260/4870 case, full 3D was within 1-3 watts, and in 2D/idle the 260 was 30 watts lower - taking the CROWN.
    Similar in this case - although NVidia declares the 298 watt max, which red fans love to cite, the actual draw is less, as the sites that have tested it all show.
    Oh well. More FUD from the reds will be all over the place.
    Tonight I learned that this 2 GPU thing on one card, with framerates like the raved and glorified 4870x2, is just one big waste without 2560x rez - but before, the 4870x2 was praised beyond measure for actually winning very often at JUST THAT SINGLE HIGHEST REZ in the benchmarks.. lol
    It's like politics, total spin and twist, and forget anything else.
  • SlyNine - Tuesday, January 13, 2009 - link

    OK, you're an Nvidia fanboi. We get it.

    The 4870 was the better deal. Get over it.
  • SiliconDoc - Tuesday, January 13, 2009 - link

    Merely a fan of the truth. Is the truth that hard for you tools to face?
    Apparently a lot of freaks have decided to face the truth now - as in get a 30" or forget this level of card, which DOES INCLUDE the 4870x2 - only even more so, as you see disabling x2 with that card is a no go...
    So it took NVidia releasing the slightly better card to wake up the ATI fanboys. 6 months of lies and fanning their red flames all over the place, and now Nvidia has brought most of them to their senses.
    "No one needs a card this good" is the half-hearted response from the red tools now. Or better yet, since the 4870x2 just dropped from 500 bucks to a lower price point because it got slain, now they want the no-profiles, driver-issue red piece of crap anyway.
    Whatever...
    Now, WOULD YOU LIKE TO REFUTE ANYTHING? Including the power consumption I brought up? Please, you're on the internet, go look, before you make a fool of yourself calling me a fanboy.
    GO LOOK.
  • kk650 - Wednesday, January 14, 2009 - link

    too bad a fatarse burger eating yank like yourself didn't die on 9/11 along with several thousand of your fucking countrymen, you cunt. Now go die of a heart attack, retard
  • DJMiggy - Wednesday, January 14, 2009 - link

    That comment was very much uncalled for. You are a very crass individual. I hope you never have to lose a friend or loved one to something that could have been prevented, and I wish you and your family well.
  • kk650 - Tuesday, January 13, 2009 - link

    He's a fucking moron
  • mhouck - Monday, January 12, 2009 - link

    I have to agree with the others that this article was disappointing. I would have thought the NVIDIA dual-GPU card would be compared to their last dual-GPU card, the GX2. I was really expecting to see a comparison between the last-generation solution and this generation - maybe a look at driver support for the GX2 and how it's doing compared to ATI's driver support for their X2s. From all the articles I've read, we are constantly asking whether the driver support will be there to make dual GPUs in SLI and Crossfire worthwhile. WELL, TAKE A LOOK AND REPORT BACK! These solutions have been out for a year now!! Maybe I expected too much. :-(
  • MadMan007 - Monday, January 12, 2009 - link

    Which GTX 260 is included in the charts, GTX 260-192 or GTX 260-216?
  • BSMonitor - Monday, January 12, 2009 - link

    Don't you guys usually show us system power consumption charts for these hefty GPUs? Curious where it stands on that front against the 260, 280 SLI and 4870x2.
