With the highly anticipated The Witcher 3 being released today, NVIDIA has pushed out their requisite launch day Game Ready driver for the game, driver version 352.86.

As is typical for a Game Ready driver, the focus of this driver is enabling SLI and GeForce Experience support for The Witcher 3, which besides being a high-profile release is also one of the games in NVIDIA’s latest game bundle. Along with The Witcher, this driver includes profile updates and bug fixes for a handful of other games, including Civilization: Beyond Earth and Magicka 2.

Perhaps more interesting is the fact that this marks the first Windows 7/8 (WDDM 1.x) release of an R352 branch driver. NVIDIA’s previous release for Grand Theft Auto V was from the R349 branch, and in fact R352 comes barely a month after R349’s introduction. Unfortunately, beyond the handful of bug fixes and new profiles, we don’t yet know whether R352 includes any major updates, as NVIDIA’s release notes don’t mention much else for this release. New branches typically contain more significant feature and performance enhancements, so there may be a surprise or two in here. At any rate, what we do know is that NVIDIA has yet to merge their Windows 7/8 and Windows 10 drivers, so this release only supports Microsoft’s current OSes, while the in-development Windows 10 continues to receive its own driver updates.

As usual, you can grab the drivers for all current desktop and mobile NVIDIA GPUs over at NVIDIA’s driver download page.

Source: NVIDIA (via SH SOTN)

Comments

  • chizow - Monday, May 18, 2015

    Kepler was good for what it was expected to run for its time.
  • chizow - Monday, May 18, 2015

    So I guess all the reviews that show Nvidia managed to extract 1.6x more perf and 2x perf/w on Maxwell with nearly the same transistor count, die sizes and wattage compared to Kepler were all a lie huh?
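    To make the perf-per-watt arithmetic in the comment above concrete, here is a minimal sketch. The 1.6x performance figure is taken from the comment; the power figure is an illustrative placeholder, not a measured number, and the real ratio depends on which Kepler and Maxwell parts (and workloads) are compared.

    ```python
    # Rough perf-per-watt arithmetic for an architecture comparison.
    # Inputs are illustrative placeholders, not measured figures.

    kepler_perf = 1.0    # normalized performance of the older (Kepler) part
    kepler_power = 1.0   # normalized board power of the older part

    maxwell_perf = 1.6   # ~1.6x performance, per the comment above
    maxwell_power = 0.8  # assumed ~20% lower power draw (placeholder)

    perf_ratio = maxwell_perf / kepler_perf
    power_ratio = maxwell_power / kepler_power
    perf_per_watt_ratio = perf_ratio / power_ratio

    print(f"performance ratio:   {perf_ratio:.2f}x")
    print(f"power ratio:         {power_ratio:.2f}x")
    print(f"perf-per-watt ratio: {perf_per_watt_ratio:.2f}x")  # 2.00x with these inputs
    ```

    With these placeholder inputs the arithmetic lands at exactly 2x perf/W, which is how a 1.6x performance claim and a 2x perf/W claim can coexist when power also drops.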
  • hawtdawg - Monday, May 18, 2015

    When those comparisons are made against their own GPUs, I find it decidedly convenient for their perf/watt statistics that games suddenly started running like dogshit on Kepler around the time Maxwell launched.
  • chizow - Monday, May 18, 2015

    You never considered for a moment that the actual games are evolving, exposing some of Kepler's weaknesses and playing to Maxwell's strengths? Next-gen games from next-gen consoles and all that good stuff? I mean, as sad as it is that old archs aren't running new games as well as older games, that's kind of par for the course I guess, and why we eventually upgrade, is it not?
  • hawtdawg - Monday, May 18, 2015

    Why don't pure graphics benchmarks show this? Why is AMD's 4-year-old architecture not showing this? Maxwell was not a significant architectural change, and there is certainly nothing about the games released in the last year that is a huge departure from the previous games that Kepler demolishes (Frostbite 3, anyone?). The only thing that has changed is that GameWorks was introduced last year.
  • chizow - Monday, May 18, 2015

    AMD's GCN has actually always been better at compute than Kepler; you can go back to the Dirt games with Global Illumination to see Nvidia falling behind, but not without cost, as AMD has always been weaker at tessellation and that shows too. Kepler was also saddled with a lot of unused compute, and of course, as we know, Maxwell necessarily stripped this and focused completely on gaming.

    As for there not being major differences, you may want to check again and read up on the Maxwell Part 2 review (the 980 review). Not only does Maxwell have 2x the ROP-to-memory-controller ratio of Kepler (see the rough back-of-the-envelope sketch after this comment), it also has fewer shared resources per SM, meaning you can get more granularity from each SM when running parallel but less-similar code like compute. They said in some cases they were seeing a 90% increase in utilization per SM....kinda makes sense in this context, doesn't it?

    http://www.anandtech.com/show/8526/nvidia-geforce-...

    And now look at the Compute synthetics....kinda mirrors GameWorks game performance, doesn't it?
    http://www.anandtech.com/show/8526/nvidia-geforce-...

    Again, you acknowledge the fact that Nvidia kind of formalized all of these next-gen features and libraries into GameWorks, but then reject the idea that their next-gen architecture was designed to run that type of code better, while also rejecting all of the benchmarks that show Maxwell is at least 1.6x faster than Kepler, and possibly more in workloads that favor the Maxwell arch.
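    As a rough back-of-the-envelope on the ROP-to-memory-controller point in the comment above, here is a minimal sketch using published spec-sheet numbers for one Kepler and one Maxwell part; the choice of GTX 680 and GTX 980 as the representative parts is an assumption. Both cards use a 256-bit bus built from four 64-bit controllers, so the change in the ratio comes entirely from the ROP count.

    ```python
    # ROPs per 64-bit memory controller, using published spec-sheet numbers
    # for one Kepler part (GTX 680) and one Maxwell part (GTX 980).

    cards = {
        "GTX 680 (Kepler)":  {"rops": 32, "bus_width_bits": 256},
        "GTX 980 (Maxwell)": {"rops": 64, "bus_width_bits": 256},
    }

    for name, spec in cards.items():
        controllers = spec["bus_width_bits"] // 64  # each controller is 64 bits wide
        rops_per_controller = spec["rops"] / controllers
        print(f"{name}: {controllers} controllers, "
              f"{rops_per_controller:.0f} ROPs per controller")

    # GTX 980 ends up with twice the ROP-to-controller ratio of GTX 680,
    # which is the 2x figure cited in the comment above.
    ```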
  • chizow - Monday, May 18, 2015

    Edit: 1st para should read: Kepler was also saddled with a lot of unused *DP* compute
  • darkfalz - Monday, May 18, 2015

    I hope this fixed the bug where G-Sync stayed on even on the desktop.
