New Release 275 Drivers & The Test

Launching alongside the GTX 560 will be the newest branch of NVIDIA’s GeForce drivers: Release 275 beta. This actually comes hot on the heels of Release 270, which came out only a month and a half ago.

Unlike Release 270, NVIDIA isn’t making a lot of performance promises for Release 275, so this is mostly a feature release. The big item in 275 for most gamers will be further refinements to the auto-update mechanism first introduced in 270. NVIDIA has finally fully ported over Optimus’ auto-update feature, meaning that NVIDIA’s drivers can now automatically find and install profile updates in the background. However, whereas Optimus profile updates were necessary for switchable graphics, for desktop users the primary purpose of auto-updating profiles is SLI and anti-aliasing compatibility, as NVIDIA uses compatibility flags in their profiles to make those features work.

Automatic profile updates won’t completely absolve SLI of periods of incompatibility, but they should help. NVIDIA has released out-of-band profile updates for SLI before, but these were rather rare. If they now release profile updates much more frequently, this will be a boon for SLI users, particularly GTX 295/590 owners. Otherwise SLI is mostly limited by what can be done with a profile – if NVIDIA has to update the driver itself, then users will still need to wait for a new driver release. On that note, NVIDIA hasn’t changed the auto-update procedure for the drivers themselves: profiles will auto-download and install, but driver updates must still be manually approved.
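The split behavior NVIDIA describes – profiles install silently, while driver packages only get queued for manual approval – can be sketched in a few lines. To be clear, this is purely an illustrative model: the function and update names are our own invention, not NVIDIA’s actual updater API.

```python
# Illustrative sketch of the Release 275 update flow: profile updates
# (SLI/AA compatibility bits) apply in the background, driver updates
# wait for explicit user approval. All names here are hypothetical.

def apply_updates(available, approve_driver):
    """Install profile updates automatically; drivers need approval."""
    installed, pending = [], []
    for item in available:
        if item["kind"] == "profile":
            installed.append(item["name"])      # silent background install
        elif item["kind"] == "driver":
            if approve_driver(item["name"]):    # user must opt in
                installed.append(item["name"])
            else:
                pending.append(item["name"])    # held until approved
    return installed, pending

updates = [
    {"kind": "profile", "name": "sli_profile_may2011"},
    {"kind": "driver",  "name": "275.20_beta"},
]
# Simulate a user who declines the driver prompt
done, waiting = apply_updates(updates, approve_driver=lambda name: False)
print(done, waiting)
```

Run with an always-declining approval callback, the profile lands in the installed list while the driver stays pending – mirroring the behavior described above.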

NVIDIA tells us that in the future they will also be able to deliver 3D Vision compatibility updates with profiles, but this will probably require a bit of rearchitecting of their drivers and profiles. Currently NVIDIA’s profiles contain a few flags for 3D Vision (mainly whether it’s allowed), but there aren’t any sweeping compatibility bits as there are for SLI and AA.

Moving on, the other big functionality update with 275 is a new resizing and scaling UI in the NVIDIA control panel. The core functionality of scaling hasn’t changed much, as NVIDIA has offered these controls for quite some time, but scaling controls are now available for VGA and HDMI displays, versus just DVI and DisplayPort previously. There’s also a new override option for Windows 7, for forcing misbehaving programs to use NVIDIA’s scaling options instead of their own. (Ed: We’ve never actually encountered this before. If you know of any games/applications that need this option, please let us know in the comments.)

As for resizing, NVIDIA has tweaked the UI to better guide users through overscan correction and/or disabling overscan on their TVs. The ideal way of dealing with overscan is to disable it on the TV (thereby ensuring 1:1 pixel mapping), which is what NVIDIA now directs users toward first. Users that can’t disable overscan can then unlock NVIDIA’s resizing controls. NVIDIA tells us that they’ve also done some work to improve resizing compatibility for games/applications that try to force standard resolutions, but we have not had an opportunity to test this yet.
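For reference, the arithmetic behind overscan compensation is straightforward: if a TV crops a given percentage from each edge, the driver has to render the desktop into a correspondingly smaller rectangle. A minimal sketch follows; the 2.5%-per-edge figure is just an example, not anything NVIDIA specifies.

```python
# Shrink the output rectangle so the full desktop survives a TV that
# overscans by `overscan_pct` percent per edge. Hypothetical math only;
# NVIDIA's actual resizing controls may work differently.

def underscan_resolution(width, height, overscan_pct):
    # Each axis loses overscan_pct from both edges
    scale = 1.0 - 2 * overscan_pct / 100.0
    return round(width * scale), round(height * scale)

print(underscan_resolution(1920, 1080, 2.5))  # → (1824, 1026)
```

This also shows why disabling overscan on the TV is preferable: any underscanned desktop is no longer mapped 1:1 to the panel’s pixels.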

The Release 275 betas should be available later today, with WHQL drivers appearing within a month.

The Test

As we mentioned in our introduction, the lack of any reference-clocked cards means that the GTX 560’s clocks – and thereby the GTX 560’s performance – are not well rooted. As a result we’ve tested our ASUS GTX 560 DirectCU II Top at a few different clockspeeds: NVIDIA’s reference clocks of 810/4004 (GTX 560 Base), along with the slowest "mid-grade" card on NVIDIA’s release list at 850/4104 (GTX 560 Mid). NVIDIA is pitching the GTX 560 as their $199 card, so for the purposes of our review we’ll be focusing primarily on the mid-clocked GTX 560, as this is the approximate speed of most of the $199 cards. If you buy a $199 GTX 560 today, this should closely represent the speed of the card you’re buying.

Ideally, 810/4004 cards will be relegated to the OEM market, but in case they aren’t we have also included the base clocks for appropriate consideration. It goes without saying that we’d rather NVIDIA just create two different product lines than put so many cards under the same umbrella, but at this point we’re just repeating ourselves.

We’ve also included our overclocking results with the ASUS GTX 560 DirectCU II Top, colored in orange. As we were only able to reach 950/4400, the performance gains are limited.
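For perspective, the raw clockspeed deltas between the tested configurations can be worked out in a couple of lines; keep in mind that real-world performance rarely scales fully with clocks.

```python
# Core/memory clock deltas between our GTX 560 test configurations
# (core MHz, memory MT/s as quoted in this review).

configs = {"Base": (810, 4004), "Mid": (850, 4104), "OC": (950, 4400)}

def pct_gain(new, old):
    return 100.0 * (new - old) / old

core_gain = pct_gain(configs["OC"][0], configs["Base"][0])
mem_gain = pct_gain(configs["OC"][1], configs["Base"][1])
print(f"OC vs Base: +{core_gain:.1f}% core, +{mem_gain:.1f}% memory")
# → OC vs Base: +17.3% core, +9.9% memory
```

In other words, even our modest overclock carries a 17% core clock advantage over the base configuration, while the Mid configuration is only about 5% ahead of Base on the core.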

For drivers, on the NVIDIA side we’re using the 275.20 beta for the GTX 560, the GTX 460 1GB, and the GTX 560 Ti. In practice the average performance difference between Release 275 and Release 270 is around 1% in favor of 275. On the AMD side we’re using the Catalyst 11.5a hotfix; note, however, that in our testing we’ve found its performance to be identical to the 11.4 drivers.

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: ASUS Rampage II Extreme (X58)
Chipset Drivers: Intel 9.1.1.1015
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3 x 2GB (7-7-7-20)
Video Cards: AMD Radeon HD 6970
AMD Radeon HD 6950 2GB
AMD Radeon HD 6870
AMD Radeon HD 6850
AMD Radeon HD 5870
AMD Radeon HD 5850
AMD Radeon HD 5770
AMD Radeon HD 4870
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 570
NVIDIA GeForce GTX 560 Ti
ASUS GeForce GTX 560 DirectCU II Top
NVIDIA GeForce GTX 550 Ti
NVIDIA GeForce GTX 480
NVIDIA GeForce GTX 470
NVIDIA GeForce GTX 460 1GB
NVIDIA GeForce GTS 450
NVIDIA GeForce GTX 285
NVIDIA GeForce GTX 260 Core 216
Video Drivers: NVIDIA ForceWare 262.99
NVIDIA ForceWare 270.51 Beta
NVIDIA ForceWare 275.20 Beta
AMD Catalyst 10.10e
AMD Catalyst 11.4
AMD Catalyst 11.5a
OS: Windows 7 Ultimate 64-bit

 


66 Comments


  • L. - Thursday, May 19, 2011 - link

    You're almost there:

    You need an n-level classification with multipliers on scores i.e. :

    Game A, favors nVidia (score * .9), is demanding (score * 1) -> score = base score *.9

    Game B favors AMD (score * .9), is not demanding (score * .5) -> score = base score * .45

    And so on.
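(For what it’s worth, this multiplier scheme is easy to express in code; the 0.9 and 0.5 weights below are just the example values from this comment, not any established methodology.)

```python
# Weighted benchmark score per the scheme sketched above: penalize games
# that strongly favor one vendor, and discount undemanding games.
# Multiplier values are the commenter's examples only.

MULTIPLIERS = {"favors_vendor": 0.9, "not_demanding": 0.5}

def weighted_score(base, favors_vendor=False, demanding=True):
    score = base
    if favors_vendor:
        score *= MULTIPLIERS["favors_vendor"]
    if not demanding:
        score *= MULTIPLIERS["not_demanding"]
    return score

# Game A: favors a vendor, demanding    -> base * 0.9
# Game B: favors a vendor, undemanding  -> base * 0.45
print(weighted_score(100, favors_vendor=True))                   # 90.0
print(weighted_score(100, favors_vendor=True, demanding=False))  # 45.0
```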

    Of course money and politics play, but that has no importance to gamers, they just want the best out of what they pay, and if some company plays better politics to offer a better result, so be it.
  • Ryan Smith - Tuesday, May 17, 2011 - link

    The test suite is due for a refresh, and will either be updated at the end of this month or next month once we build our new SNB testbed.
  • L. - Wednesday, May 18, 2011 - link

    Thanks :)
  • TheJian - Wednesday, May 18, 2011 - link

    OK. I get it. If a game runs better on Nvidia we should just consider it old trash and never benchmark it. Regardless of the fact that it's arguably one of the most important franchises in history. CIVILIZATION 5 (released Sept 21, 2010, expansion packs released even later I'm sure, always usually 2 packs released). Even though it was release 2 years after your crysis warhead (2008). Civ5 will continue to be relevant for another year or two (if not more, people play these for years).

    Sid Meier didn't make Nvidia's cards run better on it either. Nvidia let out MULTITHREADED RENDERING. Umm, that's a DirectX 11 feature isn't it? But isn't Civ5 an old junk game we shouldn't benchmark? LOL.
    http://forums.anandtech.com/showpost.php?p=3152067...
    Ryan Smith (from here at AT) explaining just a month ago (exactly today) how Nvidia sped up Civ5. It doesn't favor them. They are just making their cards RUN THE WAY THEY SHOULD. Full performance, with multithreading. NOTE: Ryan says it's NOT a sweet cheat they came up with. The cards should run like this!

    "For example here, on the important/demanding/modern games". Again I guess we have a different definition of modern. 2008 vs. 2010. I'd say you got it backwards, they didn't test crysis 2 here. Read the comments from Ryan. This game can harness 12 threads from the cpu. Firaxis thinks they're gpu limited! Check this:

    "At this point in time we appear to be GPU limited, but we may also be CPU limited. Firaxis says Civ V can scale to 12 threads; this would be a hex-core CPU with hyperthreading. Our testbed is only a quad-core CPU with HT, meaning we probably aren't maxing out Civ V on the CPU side. And even with HT, it's likely that 12 real cores would improve on performance relative to 6 cores + HT. Firaxis believes they're GPU limited, but it's hard to definitively tell which it is."

    Nope not very modern. Dump this game AT. ;) Obviously hawx is old (Tom's tests hawx2, but the game sucks). Personally I'd rather benchmark based on popularity. How many ever played hawx or hawx2? If a large portion of us are playing it, you should benchmark it (of course you can only bench so many, the point is don't bench it if nobody plays it). I agree you should just NOT benchmark anything over 200fps (draw the line where you want). At this speed nobody has to worry about playing on even a $100 card most likely so nobody cares (can you even call yourself a gamer if you don't have at least discrete card? $50 $100 card?).

    Metro 2033 is 6 months OLDER than Civ 5 and scored an avg of 10pts less at metacritic than Civ5. Nuff said? Will you still be complaining about Civ5 when AMD turns on Multithreading on AMD's cards? LOL.

    Maybe they should test crysis 2? Head over to Tomshardware where the 560 beats your beloved 6870 by a good 10% in 1920x1200, even more when you look at the MINIMUM FPS. Which to me is far more important than any other number as games suck below 30fps (AMD's will dip here where Nvidia usually dips less), but everyone can play fine at 200fps.

    Crimson117 pretty much said it best describing a games favorite features and why they run better on one architecture or another. IF memory serves, mass effect/2/3 are based off unreal 3 engine (unreal 4 coming 2012 I think, but totally aimed at Xbox720? whatever they call it, so says Sweeney). Many games in development are based on that engine. It's a good indicator of unreal 3 engine games, current and future releases.

    The 6950 is not the direct competition for the 560 (oh, and are you completely unaware of the fact that half of the cards for sale on newegg only have 1GB of memory just like the 560?...whatever). You do have to pay an extra amount for 2GB, which start at about $240. Meanwhile newegg has a good 1/2 dozen 560's for under $200. Apple's/Oranges? Oh, everyone that buys a new card hopes it will still be relevant at least a year later...I thought that went without saying?

    "The one real true fact is the 6950 gets a huge win in all demanding titles, has 1GB more vRAM and can even unlock+OC very easily to levels none of the 560 versions can reach."

    Umm, correct, because it's another $40, I'd expect that. That's 20% more cost. Does it perform 20% better in every game? Must not, or AT wouldn't be saying it "trades blows with 6950" correct? Dirt 2 from Dec2009 (just a 3 months older than metro2033) is dominated by nvidia. Sorry. Suspect dirt3 will too. Wolfenstein (representing Id's Tech4 engine, until RAGE) is again dominated by Nvidia (no surprise, it's OpenGL based which Nvidia has always been good at). Rage could change all that, but currently (meaning NOW and RELEVANT) it's tech4. Again wolf's not much older (Aug2009) than metro 2033 (about 7 months?). DirectCompute for DirectX 11 looks like Nvidia dominates a 6970 there too (even base model beats 6970). Hmm. They don't have DX12 yet right? At idle the card is great power wise (leads all), under crysis it looks fine at load (overclocking only raises it 10w, very nice - AMD obviously still has a watts advantage in my mind - but nice improvement over 460 perf/watt wise). Great idle noise (80% of my pc time isn't in games), leads all. Load noise blows away my 5850 and it's nowhere near driving me out of my room. I love the 5850. So I'd happily take a 560 at 3-5 less DB's of noise if I were in the market now.

    Have you seen Physx in Alice? Very cool. I don't think AMD has that (wish NV would give or lic it to them). Watch E3, looks like some of Nvidia's stuff isn't bad... :) DUKE in 3D? Nvidia pretty cool?

    FYI: as noted I own a radeon 5850 from Amazon.com :) (took them 7-8 months to fill my backorder but $260 was still $40 less at the time). I own it purely because of noise/heat was a winner at the time, price awesom, perf was a wash and really depended on your favorite games. Both sides have great tech. It just depends on what you play. I'll be buying something in Dec, but honestly have no idea which side will get my money NOW. It will depend on what I'm playing THEN...LOL. OH, some people do use linux. :) I'm more than happy about the fact that AT see fit not to make their benchmarking picks base on making AMD a winner every time...Thanks AT. Between here, toms and a few others I cover about 40 games and always know which card is best for me at the time I buy. :) That's the point of AT etc right? ahh....Trolls...Fanboys... :(
  • L. - Wednesday, May 18, 2011 - link

    Interesting reply, but I am no AMD fanboy, i'm a bang for the buck fanboy that's all.
    Quick reply to your stuff :
    Civ 5 is older tech than Metro 2033, nobody cares about the release date.
    Civ 5 is also NOT a very relevant game, as there are others in the RTS genre which have much much more success (SC2 anyone ? ).

    Dirt 2 is also irrelevant as it gets the "lots-of-fps" syndrome which does not help any card make a big difference at all.

    Wolfenstein is as relevant as it is on my hdd, sitting there not being played.

    Where I cite the 6950 as competition (I can buy the 2gig version for 215 bucks, thanks), I assume we are talking about the upper range 560OC and 560 TI which are the cards I talk about when comparing apples to apples.

    AS a summary, your pricing argument does NOT stand (1/2 cards on newegg = all shit cards, no 560ti and no asus 560OC).

    While it is important to keep an eye on games that favor one architecture or another, these 'wins' should definitely weight less, especially when they are temporary (as you said about civ5, nvidia released a special patch just for that game, maybe amd will do the same, who knows), be they amd or nvidia wins.

    I like nVidia, I like their fermi idea, I like a lot of the stuff they do, but that does not change a thing about the current gamer gfx market : nvidia = more watts and more dollars for the same average performance in modern games.

    And no, civ5 will not be relevant a year from now, it's part of those games that die quick, it's not made by blizzard and it's not named sc2, the next few decent rts will wash it out, as for all past civ's and "normal" rts's.
  • Iketh - Wednesday, May 18, 2011 - link

    hey now, respect to Age of Empires baby!!!
  • Stas - Wednesday, May 18, 2011 - link

    AoE is the shit
  • TheJian - Wednesday, May 18, 2011 - link

    Bang for buck boy wouldn't make an across the board statement about Nvidia's product line being all trash. There are more than a few great buys on either side and not a lot of competition in the $200 range for the reviewed card here (so say AT in this article). Steampowered shows stats, and Civ5 is in the top 10 EVERY NIGHT. Just checked, 16,000 playing NOW, and that usually doubles or so at night (last I checked that is, months ago - but day stats look pretty much the same. As soon as an expansion comes out these numbers will swell again. If people can be proved to be playing it NOW (every night), I think it's relevant. Note METRO 2033 isn't even there, oops my mistake....500 peak players (that's it down near bottom in like 98th place)...ROFL. Really? Civ4 Beyond the sword has more people playing right now. People just quit playing though, because Civ5 sucks. /sarcasm/

    Civ5's re-playability is awesome. People don't quit playing it quickly. Expansion packs add even more to it. Agreed, bench SC2 also. We're clear on Civ5 not being a forgotten game now right? Top 10 played games every night on steam. Nuff said? It's not Starcraft 2, you're right, won't be played for 10 years probably. But I'd venture to guess they'll play it at least as long as Civ4 (yep, still playing it MORE than metro2033...ROFL). It's first expansion isn't even out yet.

    Wolfenstein (like it or not) is the latest OpenGL game out. So it's a good indicator of OpenGL performance (the only indicator we can get now, until Tech5, or another OpenGL game comes out. This is it. It's not really about the game, so much as the engine under it. When possible they test games based on engines that are being used no and on many games in the future. Prey2 and Brink use it too. Prey 2 coming 2012, Brink of course just out. Tech 4 still relevant eh? Games still being made with it. We shouldn't bench games that will give us an idea of a future games performance though...This sites for benching games you like. Test Brink then. You're just not making a good argument yet :)

    Please provide the LINK for the $215 2GB 6950 radeon? I can't get it off newegg for less than $239 after rebates.
    "AS a summary, your pricing argument does NOT stand (1/2 cards on newegg = all shit cards, no 560ti and no asus 560OC)." I wonder about the clocks on your card if you actually give me a link to it.

    Um, I quoted the "$hit cards" as you call them because that's what AT is reviewing here today. It's a review of the GTX 560, not the 560TI (never compared the TI, it's not the reviewed product). The ASUS TOP card is only $220 at newegg. ASUS also has a higher than tested here MID card that's only $199! 850/4104 (GTX 560 Mid), but this card is 850/4200. Not much faster (100mhz on the memory) but not a BASE card by any means.

    Uh, it's completely UNIMPORTANT to keep an eye on games that favor one architecture over the other and then weight them. LOL. Your problem is as a fanboy you just can't get over any game running better on your enemies product. It's IMPORTANT to be testing games that are using today engines (that will be used tomorrow in games based on said engines), or at least the top games out now that are heavily played. I don't care who's better at what. And if one wins over the other after benching it, we don't just throw it out because NV happens to win it.

    NO, I Didn't say nvidia released a special patch. Nvidia FIXED their drivers to work properly, AMD is currently trying to do the same. They're late. Sorry. Nvidia got there first (and should affect many other games, it's not a patch for this game, it's fully in the DX driver info box now). AMD is failing to produce a multithreading driver for DX currently so NV looks good in Civ5 (for now). Jeez did you read Ryan's post? It's NOT a cheat patch or something.

    I guess you should just make up a list of games for them that run great on AMD. You'll be happy then :) I won't care as long as you can prove they're very popular and played a lot now, or based on an engine that will be in games coming up shortly. FYI: DIRT2 is a DirectX 11 racer. The EGO engine is used in RaceDriver:GRID,F1 2010 & Flashpoint:Dragon Rising as well. Oh, Dirt3 and GRID2, F1 2011 and Flashpoint:Red River will be running EGO 2.0. Yep, games not out yet. So by testing one game we have a decent idea of play in 8 EGO engine based games. Though F1 2010 is probably better to indicate this as it runs EGO 1.5. But still all in the family here and very relevant. GTX 580 gets just over 110fps. The tested card is around 72. I wouldn't call this a 500fps waste game. Don't forget they aren't posting MINIMUM's here (occasionally, but I wish they ALWAYS included them as it affects gameplay below 30fps).
  • L. - Thursday, May 19, 2011 - link

    There are no great buys on the nVidia side at the moment, just go read all the relevant reviews and compare.

    16.000 people, double at night ? lol ???

    When I played EvE Online we were over 40.000 at night, and it's surely grown bigger since.. your game is irrelevant, thanks.

    Metro 2033 being not popular because it's not a multi game makes no difference : it IS a demanding game and a GOOD benchmark, noone cares if it sucks or not. It's like far cry .. it wasn't a very good game but it was a perfect benchmark.

    A game cannot be considered relevant or "played" when you have less than 5.000 people playing it online.. otherwise you could go ahead and say Command and Conquer Generals : Zero Hour is still alive ... and i can assure you it's totally dead, swarmed by cheaters and unpatched flaws.

    Wolfenstein didn't seem all that beautiful to me .

    The fact that you are unable to find good suppliers is your problem not mine, I'm not going to give you everything I searched for just because you're flaming around.

    Err .. it's a gigabyte 6950 2gb .. clocks and hardware is of course perfect.

    Well, base cards, shit cards, whatever cards, all cards but the top one don't ever come close to a hd6950 that costs at most 20 bucks more. and if you're looking on the cheap side, the 6870 eats the lowish 560's .. done.

    Top games being played has nothing to do with benchmarking (cfr far cry, crysis ...).
    Do you play unigine a lot ? ^^
    Games favoring one architecture strongly should be weighted less because they are not representing the gfx but rather the driver / gfx calls picked for the engine.

    The fact that nVidia patched is great, AMD will too, and thus it is not very relevant to review a game w/ and w/o patched drivers to compare two HARDWARE pieces, even if it's clear AMD tends to be late on driver patches.
  • YukaKun - Tuesday, May 17, 2011 - link

    I know that card ain't aimed towards the "enthusiast" crowd, but doing a lil' tandem on multi display config would have been very nice. At least, let 'em be on SLI for larger resolutions.

    Cheers!
