Just last week, NVIDIA announced both the GTX 295 and GTX 285. Today we have availability on both and test results for the GTX 285. As we weren't able to get power tests done in time to include in the GTX 295 review, we also have those available today.

EVGA was kind enough to provide the hardware for this review. They sent us two GTX 285s for single and SLI testing. They provided us with overclocked cards, but for this article we underclocked them to stock GTX 285 speeds in order to learn what we can expect from non-overclocked variants.

The hardware looks the same as the current GeForce GTX 280; aside from the GPU itself (and the sticker on the card), nothing appears different.

We've already indicated the changes that have gone into the GTX 285, but here's another look at the updated clock speeds and the test setup.

                              GTX 295       GTX 285     GTX 280     GTX 260 Core 216   GTX 260       9800 GTX+
  Stream Processors           2 x 240       240         240         216                192           128
  Texture Address / Filtering 2 x 80 / 80   80 / 80     80 / 80     72 / 72            64 / 64       64 / 64
  ROPs                        2 x 28        32          32          28                 28            16
  Core Clock                  576MHz        648MHz      602MHz      576MHz             576MHz        738MHz
  Shader Clock                1242MHz       1476MHz     1296MHz     1242MHz            1242MHz       1836MHz
  Memory Clock                999MHz        1242MHz     1107MHz     999MHz             999MHz        1100MHz
  Memory Bus Width            2 x 448-bit   512-bit     512-bit     448-bit            448-bit       256-bit
  Frame Buffer                2 x 896MB     1GB         1GB         896MB              896MB         512MB
  Transistor Count            2 x 1.4B      1.4B        1.4B        1.4B               1.4B          754M
  Manufacturing Process       TSMC 55nm     TSMC 55nm   TSMC 65nm   TSMC 65nm          TSMC 65nm     TSMC 55nm
  Price Point                 $500          $400        $350 - $400 $250 - $300        $250 - $300   $150 - $200

The price point for the GTX 285 is $400, but Newegg has parts for $380 right now and overclocked variants for not too much more.
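The table's clock and bus-width numbers translate directly into theoretical peak memory bandwidth, which is where the GTX 285's faster GDDR3 pays off over the GTX 280. A quick back-of-the-envelope sketch (assuming the usual convention that the listed memory clock is the base clock and GDDR3 transfers on both edges, i.e. double data rate):

```python
# Theoretical peak memory bandwidth from the spec table above.
# GDDR3 is double data rate, so effective transfer rate = 2x the listed clock.
def mem_bandwidth_gbps(mem_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: clock * 2 (DDR) * bus width in bytes."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

cards = {
    "GTX 285": (1242, 512),
    "GTX 280": (1107, 512),
    "GTX 260": (999, 448),
}
for name, (clk, width) in cards.items():
    print(f"{name}: {mem_bandwidth_gbps(clk, width):.1f} GB/s")
# GTX 285: 159.0 GB/s, GTX 280: 141.7 GB/s, GTX 260: 111.9 GB/s
```

So the memory clock bump alone gives the GTX 285 roughly 12% more bandwidth than the GTX 280 on the same 512-bit bus.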

The Test

Test Setup
CPU Intel Core i7-965 3.2GHz
Motherboard ASUS Rampage II Extreme X58
Video Cards ATI Radeon HD 4870 X2
ATI Radeon HD 4870 1GB
NVIDIA GeForce GTX 295
EVGA NVIDIA GeForce GTX 285 SLI
NVIDIA GeForce GTX 280 SLI
NVIDIA GeForce GTX 260 SLI
EVGA NVIDIA GeForce GTX 285
NVIDIA GeForce GTX 280
NVIDIA GeForce GTX 260
Video Drivers Catalyst 8.12 hotfix
ForceWare 181.20
Hard Drive Intel X25-M 80GB SSD
RAM 6 x 1GB DDR3-1066 7-7-7-20
Operating System Windows Vista Ultimate 64-bit SP1
PSU PC Power & Cooling Turbo Cool 1200W

Age of Conan Performance
76 Comments

  • nyran125 - Friday, January 16, 2009 - link

    even the 8800GTS 512MB runs all these games with a decent frame rate of 30 - 80 fps. There's no point in upgrading at the moment until the next big graphical enhancement or graphical revolution in games, like the next big Crysis or Oblivion type enhancement. Till then, it's a complete waste of money buying a newer card if you already have an 8800. Because you know that when the next big graphics game comes out, like a new Elder Scrolls (Oblivion) or something, the newest card out at that time won't even be enough to run it until 6 months to a year after the game is out. Sometimes it takes even longer for the graphics cards to catch up to the games on maximum settings. So stop wasting your money lol.
  • Captain828 - Friday, January 16, 2009 - link

    so you're telling me you're getting >30FPS when playing Crysis @ Very High, 1680x1050 + 2xAA + 4xAF ??!
  • ryedizzel - Thursday, January 15, 2009 - link

    Not sure if anyone has said this yet but why the heck are you guys running all your benchmarks at 2560x1600? I mean seriously, how many people are really using monitors that big? AT THE LEAST please show some benchmarks for 1680x1050!
  • Iketh - Friday, January 16, 2009 - link

    lol they are there... are you unable to read a line chart?
  • JarredWalton - Friday, January 16, 2009 - link

    You don't buy parts like the GTX 285/295 or 4870X2 to play at 1680x1050 for the most part. In fact, 2560x1600 and 30" LCDs is the primary reason that I bought a 4870X2 around the time Fallout 3 came out. You can see that even at higher resolutions, there are several titles that are somewhat system limited (i.e. the GPU is so powerful that the benchmark isn't fully stressing the GPU subsystem).
  • MadMan007 - Friday, January 16, 2009 - link

    That's certainly true, and I think we understand why the charts are for the highest resolution, but it's nice to provide data for lower resolutions too. Aside from making a graph for each resolution, perhaps it would be possible to make them interactive somehow...say I click on 1920x1200 below the graph, then that data is charted. What would be really top notch is if I could choose which cards and which resolutions to compare.
  • GhandiInstinct - Friday, January 16, 2009 - link

    MadMan,

    I only wish they did that. Then their reviews would be my #1 source.
  • Stonedofmoo - Friday, January 16, 2009 - link

    But that's just the point though. Most people are still running 22" monitors at 1680x1050 res. We don't NEED top end powerful cards that Nvidia and ATI seem only interested in building.

    What we're looking for are upper midrange parts like a hypothetical GTX 240/220 if they were to exist to replace the aging and now redundant Geforce 9 series.

    Seriously:-
    ATI have more midrange parts than nvidia but really need to work on their rubbish power consumption, especially at idle.
    Nvidia need to actually have some midrange parts but have the power consumption sorted.

    Both need to refocus. I've never seen Nvidia go for so long without releasing a completely new series of cards from top to bottom end.
  • SiliconDoc - Monday, January 19, 2009 - link

    Well more cards are always better and keep us entertained and interested, but this continuous call for "midrange" from NVidia is somewhat puzzling to me.
    Since the 9800xxxx takes on the 4850, then there's the 9600 on down and the 8800GTX -- I mean it's all covered...
    ATI just released the 4830, their newest core crippled to take on the 9800(GT), or so they claim...
    I guess if I were NVidia I wouldn't waste my company time or money on people wanting to read "new technology reviews" based upon cards that REFILL an already filled space where the competition just a bit ago, after 2 years of near NADA, finally brought some competition to market.
    Since ATI was dang out of it for so long - why should NVidia retool the GT200 core to STOMP all their 9800, 8800, 9600 and the like pieces?
    You want them to destroy themselves and their own line so you can say "Hey, new tech in the midrange - now I can decide if I want a 4830 or 4670 or 4850 or one of these crippled GT200's" - then moments later you'll say to yourself "Wait a minute, why should I get rid of my 9800gtx?!"...
    I mean REALLY...
    Would someone please explain to me what I'm missing?
    It is all "I want a crippled cheap GT200 core", isn't it?
    Maybe part of it is why are we still here when the 8800 was released in Nov 2006?
    Maybe the question should be why is ATI still taking on 2 year old NVidia tech.
    AMD just took another huge charge loss from its ATI division, and I'm not certain NV is doing much better (though gpu-z shows 65% of the market is NV's) - so why would NV do an expensive die/core rollout that crushes their already standing cores that compete with ATI midrange just fine?
    It just does not make any sense.
