Now that we have some hardware in our hands and NVIDIA has formally launched the GeForce GTX 295, we are very interested in putting it to the test. NVIDIA's bid to reclaim the halo is an interesting one. If you'll remember from our earlier article on the hardware, the GTX 295 is a dual-GPU card whose two chips combine aspects of the GTX 280 and the GTX 260. The expectation is that this card will fall between GTX 280 SLI and GTX 260 Core 216 SLI.

As for the GTX 295, each GPU pairs the TPCs (shader hardware) of the GTX 280 with the memory subsystem and ROP configuration of the GTX 260. This hybrid design gives the card lots of shader horsepower but less RAM and less raw pixel-pushing capability than GTX 280 SLI; it should land faster than GTX 260 SLI but slower than GTX 280 SLI. Here are the specs:

Our card looks the same as the one in the images provided by NVIDIA that we posted in December. It's notable that the GPUs are built on a 55nm process and are clocked at the speed of a GTX 260 despite having the shader power of a GTX 280 (times two).

We've also got another part coming down the pipe from NVIDIA. The GeForce GTX 285 is a 55nm part that amounts to an overclocked GTX 280. Although we don't have any in house yet, the new card was announced on January 8th and will be available for purchase on January 15, 2009.

There isn't much to say on the GeForce GTX 285: it is an overclocked 55nm GTX 280. The clock speeds compare as follows:


          Core Clock (MHz)   Shader Clock (MHz)   Memory Data Rate (MHz)
GTX 280   602                1296                 2214
GTX 285   648                1476                 2484
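
Expressed as relative uplifts, the clock bumps work out as follows (a quick sketch using only the figures from the table above):

```python
# GTX 285 uplift over the GTX 280, per clock domain.
# All figures in MHz, taken from the comparison table.
gtx280 = {"core": 602, "shader": 1296, "memory": 2214}
gtx285 = {"core": 648, "shader": 1476, "memory": 2484}

for domain in gtx280:
    uplift = (gtx285[domain] / gtx280[domain] - 1) * 100
    print(f"{domain}: +{uplift:.1f}%")
```

That works out to roughly +7.6% core, +13.9% shader, and +12.2% memory, so the GTX 285 is a meaningful but not dramatic bump over the GTX 280.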


We don't have performance data for the GTX 285 yet, but expect it (like the GTX 280 and GTX 295) to be necessary only with very large displays.

                            GTX 295      GTX 285    GTX 280      GTX 260 Core 216   GTX 260      9800 GTX+
Stream Processors           2 x 240      240        240          216                192          128
Texture Address / Filtering 2 x 80 / 80  80 / 80    80 / 80      72 / 72            64 / 64      64 / 64
ROPs                        28           32         32           28                 28           16
Core Clock                  576MHz       648MHz     602MHz       576MHz             576MHz       738MHz
Shader Clock                1242MHz      1476MHz    1296MHz      1242MHz            1242MHz      1836MHz
Memory Clock                999MHz       1242MHz    1107MHz      999MHz             999MHz       1100MHz
Memory Bus Width            2 x 448-bit  512-bit    512-bit      448-bit            448-bit      256-bit
Frame Buffer                2 x 896MB    1GB        1GB          896MB              896MB        512MB
Transistor Count            2 x 1.4B     1.4B       1.4B         1.4B               1.4B         754M
Manufacturing Process       TSMC 55nm    TSMC 55nm  TSMC 65nm    TSMC 65nm          TSMC 65nm    TSMC 55nm
Price Point                 $500         $???       $350 - $400  $250 - $300        $250 - $300  $150 - $200
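
To put those numbers in context, theoretical peak figures can be sketched from the table. This is a back-of-the-envelope estimate, assuming GT200-class hardware can dual-issue a MAD plus a MUL (3 FLOPS per SP per clock) and that GDDR3 transfers data twice per memory clock:

```python
# Rough peak shader throughput and memory bandwidth from the spec table.
# Assumes 3 FLOPS per SP per clock (MAD + MUL dual issue on GT200-class
# parts) and double-pumped GDDR3 (two transfers per memory clock).
def peak_gflops(sps, shader_mhz, flops_per_clock=3):
    return sps * shader_mhz * flops_per_clock / 1000

def bandwidth_gbps(bus_bits, mem_mhz):
    return bus_bits / 8 * mem_mhz * 2 / 1000  # GB/s

# GTX 280: 240 SPs @ 1296MHz, 512-bit bus @ 1107MHz
print(peak_gflops(240, 1296))      # ~933 GFLOPS
print(bandwidth_gbps(512, 1107))   # ~141.7 GB/s

# GTX 295 (per GPU): 240 SPs @ 1242MHz, 448-bit bus @ 999MHz
print(peak_gflops(240, 1242))      # ~894 GFLOPS per GPU
print(bandwidth_gbps(448, 999))    # ~111.9 GB/s per GPU
```

In other words, each GTX 295 GPU gives up only about 4% of a GTX 280's shader throughput but roughly 21% of its memory bandwidth, which is exactly the GTX-280-shaders-with-GTX-260-memory hybrid described above.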

This article will focus heavily on the performance of the GeForce GTX 295, as we've already covered the basic architecture and specifications. We will recap them and cover the card itself on the next page; for more detail, see our initial article on the subject.

The Test

Test Setup
CPU               Intel Core i7-965 3.2GHz
Motherboard       ASUS Rampage II Extreme X58
Video Cards       ATI Radeon HD 4870 X2
                  ATI Radeon HD 4870 1GB
                  NVIDIA GeForce GTX 295
                  NVIDIA GeForce GTX 280 SLI
                  NVIDIA GeForce GTX 260 SLI
                  NVIDIA GeForce GTX 280
                  NVIDIA GeForce GTX 260
Video Drivers     Catalyst 8.12 hotfix
                  ForceWare 181.20
Hard Drive        Intel X25-M 80GB SSD
RAM               6 x 1GB DDR3-1066 7-7-7-20
Operating System  Windows Vista Ultimate 64-bit SP1
PSU               PC Power & Cooling Turbo Cool 1200W

Age of Conan Performance
100 Comments

  • SiliconDoc - Monday, January 12, 2009 - link

    Pssst ! The GTX295 wins hands down in both those departments...that's why it's strangely "left out of the equation".
    (most all the other sites already reported on that - heck it was in NVidia's literature - and no they didn't lie - oh well - better luck next time).
  • Amiga500 - Wednesday, January 14, 2009 - link

    Well... to be honest...


    If leaving out power consumption pisses people like you off - good one anandtech!


    (I guess your nice and content your nvidia e-penis can now roam unopposed?)
  • SiliconDoc - Thursday, January 15, 2009 - link

    First of all, it's "you're" when you are referring to my E-PENIS. (Yes, please also capitalize ALL the letters to be properly indicative of size.)
    Second, what were you whining about ?
    Third, if you'd like to refute my points, please make an attempt, instead of calling names.
    Fourth, now you've fallen to the lowest common denominator, pretending to hate a fellow internet poster, and supporting shoddy, slacker work, with your own false fantasy about my temperament concerning power and the article.
    What you failed to realize is me pointing out the NVidia advantage in that area, has actually pissed you off, because the fanboy issue is at your end, since you can't stand the simple truth.
    That makes your epeenie very tiny.
  • kk650 - Tuesday, January 13, 2009 - link

    Remove yourself from the gene pool, fuckwit americunt
  • darckhart - Monday, January 12, 2009 - link

    from all the games where the gtx295 beats the 4870x2, it's only a 3-5 fps win. i don't see how that "gets the nod even considering price." at best, that's $10 per frame. i think we need to see thermals and power draw (i don't recall if you talked about these in the earlier article) to better justify that extra $50.
  • JarredWalton - Tuesday, January 13, 2009 - link

    I bought a 4870X2 a couple months back... if I had had the option of the GTX 295, it would have been my pick for sure. I wanted a single-card, dual-GPU solution (because I've got a decent platform but am tired of dual video cards). I cannot overstate the importance of drivers, and frankly ATI's drivers still disappoint on a regular basis. Until and unless ATI can get driver profiles into their drivers for CrossFire, I don't think I can ever feel happy with the solution. Also, 4870X2 drivers need to bring back the "CrossFire disable" option in the drivers; we all know there are two GPUs, and there are still occasional games where CrossFire degrades performance over a single GPU.
  • TheDoc9 - Monday, January 12, 2009 - link

    Definitely a half-ass review, something I don't expect from Anandtech. Something more to come later?

    Many questions on these cards can still be asked:
    -Testing at other resolutions, not just a recommendation to stay away unless playing at 2560 res. on a 30" monitor.
    -Testing on other rigs, such as a mid-range quad core and dual core, to give us an idea of how it might perform on our rigs (for those of us who don't own a mega-clocked i7).

    I don't like to sound negative, but honestly there was no enthusiasm written in this preview/review/snapshot/whatever it's supposed to be. Kind of disappointing how every other major site has had complete reviews since launch day. Was this written on the plane trip home?
  • bigonexeon - Friday, January 16, 2009 - link

    i don't see the point of using an intel i7, as intel released an article that the i7's L3 cache has a memory leak whose only attempted fix is a software patch, which they're not going to fix until the second generation of i7s. also, why compare standard cards to a newer card? why not put the older cards' newer designs (the superclocked, XXX editions, which are the latest editions of the old models) up against the newer cards?
  • theprodigalrebel - Monday, January 12, 2009 - link

    Might want to take a second look at the line graph and the table below it.
  • Iketh - Monday, January 12, 2009 - link

    This article was just right. I had no enthusiasm to read about this card because there isn't anything to get excited about. Apparently Derek didn't either. I'm sure that enthusiasm will return when a next-gen card appears and there is something new to talk about.

    It's also having to follow the Phenom II article.
