New Release 275 Drivers & The Test

Launching alongside the GTX 560 will be the newest branch of NVIDIA’s GeForce drivers: Release 275 beta. This actually comes hot on the heels of Release 270, which came out only a month and a half ago.

Unlike with Release 270, NVIDIA isn’t making a lot of performance promises this time, so this is mostly a feature release. The big item in 275 for most gamers will be further refinements to the auto-update mechanism first introduced in 270. NVIDIA has finally fully ported over Optimus’ auto-update feature, meaning that NVIDIA’s drivers can now automatically find and install profile updates in the background. However, whereas Optimus profile updates were necessary for switchable graphics, for desktop users the primary purpose of auto-updating profiles is SLI and anti-aliasing compatibility, as NVIDIA uses compatibility flags in their profiles to make those features work.

Automatic profile updates won’t completely absolve SLI of periods of incompatibility, but they should help. NVIDIA has released out-of-band profile updates for SLI before, but these were rather rare. If NVIDIA now releases profile updates much more frequently, this will be a boon for SLI users, particularly GTX 295/590 owners. Otherwise SLI is mostly limited by what can be done with a profile: if NVIDIA has to update the driver itself, then users will still need to wait for a new driver release. On that note, NVIDIA hasn’t changed the auto-update procedure for the drivers themselves: profiles will auto-download and install, but driver updates must still be manually approved.

NVIDIA tells us that in the future they will also be able to deliver 3D Vision compatibility updates with profiles, but this will probably require a bit of rearchitecting of their drivers and profiles. Currently NVIDIA’s profiles contain a few flags for 3D Vision (mainly whether it’s allowed), but there aren’t any sweeping compatibility bits as there are for SLI and AA.
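
For the curious, the general mechanism is easy to model even though NVIDIA’s actual profile database is proprietary and opaque. Below is a minimal, purely hypothetical sketch in Python of how a profile store might map game executables to per-feature compatibility bits and merge out-of-band updates; every name and bit value here is invented for illustration and does not correspond to NVIDIA’s real profiles.

```python
# Hypothetical model of a driver profile store. All keys and bit values are
# invented for illustration; NVIDIA's real profile database is proprietary
# and is not structured like this.

# Each profile maps a game executable to per-feature compatibility flags.
profiles = {
    "game_a.exe": {"sli_bits": 0x02500005, "aa_bits": 0x00000041, "3d_vision": True},
    "game_b.exe": {"sli_bits": 0x080000F5, "aa_bits": 0x00000000, "3d_vision": False},
}

def apply_profile_update(update):
    """Merge a downloaded profile update into the local store.

    This is the out-of-band path: profile changes ship independently of
    driver releases, so SLI/AA fixes no longer wait on a new driver.
    """
    for exe, flags in update.items():
        profiles.setdefault(exe, {}).update(flags)

# A background updater would periodically fetch and merge updates like this:
apply_profile_update({"game_a.exe": {"sli_bits": 0x02500015}})
print(profiles["game_a.exe"]["sli_bits"])  # new bits apply on next launch
```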

Moving on, the other big functionality update with 275 is a new resizing and scaling UI in the NVIDIA control panel. The core functionality of scaling hasn’t changed much, as NVIDIA has offered these controls for quite some time, but scaling controls are now available for VGA and HDMI displays, versus just DVI and DisplayPort previously. There’s also a new override option for Windows 7, for forcing misbehaving programs to use NVIDIA’s scaling options instead of their own. (Ed: We’ve never actually encountered this before. If you know of any games/applications that need this option, please let us know in the comments)

As for resizing, NVIDIA has tweaked the UI to better guide users through using overscan correction and/or disabling overscan on their TVs. The ideal method of dealing with overscan is to disable it on the TV (thereby ensuring 1:1 pixel mapping), which is what NVIDIA now directs users toward first. Users who can’t disable overscan can then unlock NVIDIA’s resizing controls. NVIDIA tells us that they’ve also done some work to improve resizing compatibility for games/applications that try to force standard resolutions, but we have not had an opportunity to test this yet.
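
To illustrate what overscan correction actually does, here’s a quick sketch of the underlying math; the 5% overscan figure is a common but purely illustrative example, not a value taken from NVIDIA’s control panel.

```python
def underscan_resolution(width, height, overscan_pct):
    """Shrink the desktop so a TV's overscan crop doesn't cut off the edges.

    A TV with N% overscan only shows the central portion of the signal, so
    the driver renders a smaller desktop that survives the crop intact.
    """
    scale = 1.0 - overscan_pct / 100.0
    # Round down to even dimensions, as display modes generally use them.
    return int(width * scale) // 2 * 2, int(height * scale) // 2 * 2

# A 1080p TV with 5% overscan (an illustrative figure):
print(underscan_resolution(1920, 1080, 5))  # -> (1824, 1026)
```

Either way the desktop ends up being scaled rather than mapped 1:1 to the panel’s pixels, which is exactly why disabling overscan on the TV itself remains the preferable fix.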

The Release 275 betas should be available later today, with WHQL drivers appearing within a month.

The Test

As we mentioned in our introduction, the lack of any reference-clocked cards means that the GTX 560’s clocks, and thereby its performance, are not well rooted. As a result we’ve tested our ASUS GTX 560 DirectCU II Top at a few different clockspeeds. We’ve tested the card at NVIDIA’s reference clocks of 810/4004 (GTX 560 Base), along with the slowest “mid-grade” card on NVIDIA’s release list: 850/4104 (GTX 560 Mid). NVIDIA is pitching the GTX 560 as their $199 card, so for the purposes of our review we’ll be focusing primarily on the mid-clocked GTX 560, as this is the approximate speed of most of the $199 cards. If you buy a $199 GTX 560 today, this should closely represent the speed of the card you’re buying.

Ideally the 810/4004 cards will be relegated to the OEM market, but in case they aren’t, we’ve also included results at the base clocks for appropriate consideration. It goes without saying that we’d rather NVIDIA simply create two different product lines than put so many cards under the same umbrella, but at this point we’re just repeating ourselves.

We’ve also included our overclocking results with the ASUS GTX 560 DirectCU II Top, colored in orange. As we were only able to reach 950/4400, the performance gains are limited.
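
To put the various clock configurations in perspective, here’s a quick back-of-the-envelope calculation of theoretical shader throughput and memory bandwidth, based on the GTX 560’s 336 CUDA cores and 256-bit GDDR5 memory bus. Recall that Fermi’s shader clock runs at twice the core clock, and that the memory clocks quoted above are already effective data rates.

```python
CUDA_CORES = 336      # GTX 560 (GF114)
BUS_WIDTH_BITS = 256  # GDDR5 memory bus width

def theoretical_throughput(core_mhz, mem_effective_mhz):
    """Return (GFLOPS, GB/s) for a core/memory clock pair.

    Fermi shaders run at 2x the core clock and can issue one FMA
    (2 FLOPs) per core per shader clock; memory bandwidth is the
    effective data rate times the bus width in bytes.
    """
    gflops = CUDA_CORES * 2 * (core_mhz * 2) / 1000.0
    gbps = mem_effective_mhz * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9
    return gflops, gbps

for name, core, mem in [("Base", 810, 4004), ("Mid", 850, 4104), ("OC", 950, 4400)]:
    gflops, gbps = theoretical_throughput(core, mem)
    print(f"{name}: {gflops:.0f} GFLOPS, {gbps:.1f} GB/s")
```

On paper the jump from base to mid clocks is worth about 5% of shader throughput, while our 950MHz overclock adds roughly another 12% over the mid clocks; the memory overclock’s contribution is smaller still.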

For drivers, on the NVIDIA side we’re using the 275.20 beta for the GTX 560, the GTX 460 1GB, and the GTX 560 Ti. In practice the average performance difference between Release 275 and Release 270 is around 1% in favor of 275. On the AMD side we’re using the Catalyst 11.5a hotfix; however, note that in our testing we’ve found its performance to be identical to the 11.4 drivers.

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: ASUS Rampage II Extreme (X58)
Chipset Drivers: Intel 9.1.1.1015
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3 x 2GB (7-7-7-20)
Video Cards: AMD Radeon HD 6970
AMD Radeon HD 6950 2GB
AMD Radeon HD 6870
AMD Radeon HD 6850
AMD Radeon HD 5870
AMD Radeon HD 5850
AMD Radeon HD 5770
AMD Radeon HD 4870
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 570
NVIDIA GeForce GTX 560 Ti
ASUS GeForce GTX 560 DirectCU II Top
NVIDIA GeForce GTX 550 Ti
NVIDIA GeForce GTX 480
NVIDIA GeForce GTX 470
NVIDIA GeForce GTX 460 1GB
NVIDIA GeForce GTS 450
NVIDIA GeForce GTX 285
NVIDIA GeForce GTX 260 Core 216
Video Drivers: NVIDIA ForceWare 262.99
NVIDIA ForceWare 270.51 Beta
NVIDIA ForceWare 275.20 Beta
AMD Catalyst 10.10e
AMD Catalyst 11.4
AMD Catalyst 11.5a
OS: Windows 7 Ultimate 64-bit


Comments

  • DanNeely - Wednesday, May 18, 2011 - link

    Where exactly are you finding a $300ish 2560x monitor? IIRC even the best sales Dell had on refurb 3007's only dropped as low as $800ish, and with the potential inventory of off-lease 3007's mostly gone by now and the 3008's and 3011's being significantly more expensive, deals that good aren't likely to repeat themselves in the future.
  • L. - Thursday, May 19, 2011 - link

    My mistake you two ;)

    I was thinking about an LCD panel from idunnowho that had 2xyz * something resolution and that was dirt cheap... obviously 2560*1440 panels aren't common at all and are overpriced.

    On the other hand, you could make the argument for a dual monitor setup below 400 bucks that spans more pixels and thus makes more use of the gfx.
  • Stas - Wednesday, May 18, 2011 - link

    You fail once again. You plan on keeping this card until 20" screens hit 2560x1600/1440? It will probably only be... oh, idk... 10 years?
    And $330 for a decent 2560 screen? Links plz.
  • L. - Thursday, May 19, 2011 - link

    Yes sir, I would like to be aggressive to you too?

    On the other hand ... 10 years ??

    My AMOLED screen on my N900 has a pixel density high enough to cram more than 4x full HD onto a 20" panel, something that will happen soon enough as OLED processes mature.

    Again, my mistake on the monitor price, memory error.
  • L. - Thursday, May 19, 2011 - link

    Who would buy a 200-buck card to play on a single 150-buck monitor when the whole config costs 700+ bucks?

    200 bucks is A_DECENT_AMOUNT_OF_MONEY for a GFX, it means you're a gamer (maybe a poor one though) and it means you might be interested in dual screen (meh, you spent 700 bucks on the tower, why not 2*150 for dual 22" 1080p monitors?).
  • L. - Tuesday, May 17, 2011 - link

    I'm seeing quite a trend with AMD stuff getting better scores (relatively) on more recent and demanding games, and I'm wondering if it would be time to weight games differently for a better comparison.

    For example here, on the important/demanding/modern games (let's take Metro 2033 and Crysis to have indisputable arguments here), the 560 doesn't ever come close to a 6950, and even the best version only barely beats the 6870.

    If somebody buys a gfx today, it might be to use it for at least another year, and in that sense, the weight of less important games should be diminished a lot, including hawx-120fps-fest, other 100+ fps titles and the clearly nVidia-favoring Civ5.

    What is important here is that NO ONE has any interest in buying a GTX 560 today, because of the following extremely important points:

    -> AMD offerings do better in more demanding games, and will thus do better in future games
    -> AMD offerings (6950 for example) have more memory, which WILL be used in next-gen games for sure, as for every gen
    -> No one cares if they have 110 or 120 fps in HAWX, which is a console game anyway

    I believe the use of any PC component for gamers can be summarized to this in the end:

    -> can it play this game? 30+
    -> can it play it well? 60+

    Because any of those components will for most people still be in use 2 years from now, the fact that older / less-demanding games get 110 fps is completely irrelevant; might as well show 557 fps in Quake3 as a benchmark...

    As a summary, could you anandtech guys please tweak your test list / weighting in order to better inform the less-informed readers of your website ?

    It is utter nonsense to state today that a 560 Ti "trades blows" with a 6950, or that a factory OC'd 560 "comes close" to one.

    The one real true fact is the 6950 gets a huge win in all demanding titles, has 1GB more vRAM and can even unlock+OC very easily to levels none of the 560 versions can reach.

    nVidia has done some great stuff in the past, but one has to admit that outside of quad-sli gtx580 there is no use in buying anything nVidia this round, as AMD offers better performance + performance/watt at every price point this time around.

    There is one argument for nVidia and that argument (no, not the drivers, because you do NOT play on linux) is the nVidia goodies like 3d gaming and other minor stuff.
  • crimson117 - Tuesday, May 17, 2011 - link

    I half agree with you... some of your commentary is good (HAWX lol) but one particular conclusion is not tenable:

    "AMD offerings do better in more demanding games, and will thus do better in future games"

    When Mass Effect 3 comes out, I expect that like Mass Effect 2 it will strongly favor nVidia GPU's - unless they rewrote the entire engine.

    New games cannot be classified into demanding vs non-demanding - each game engine has its favorite factors, be it clock speed, memory bandwidth, stream processors, ROPs, etc., so I expect each game will have its favorite card.
  • formulav8 - Tuesday, May 17, 2011 - link

    The thing is, in games where people can actually use the horsepower, the Radeon is the best card.

    If you care about getting 500fps on Quake3 instead of 450fps, then the GTX is the better card.
  • lowlymarine - Tuesday, May 17, 2011 - link

    The problem is that if they DON'T completely rewrite the entire engine, Mass Effect 3 will continue to be a festival of even mid-range cards breaking 60 FPS. While there's nothing wrong with that per se - ME2 is one of the better-looking games out there despite not being particularly intensive, after all - it still means that nVidia's slight advantage over AMD in that game is meaningless. Compare that to Crysis, where even the 6970 falls short of 60 FPS at WUXGA, and the sizable lead AMD carries over the competition there has a real, noticeable impact on the game.
  • Stas - Wednesday, May 18, 2011 - link

    Correction:
    New games cannot be classified into demanding vs non-demanding - each game engine has its favorite chip developer. Money and politics will decide performance in certain games.
