Last week I got an interesting email the day after NVIDIA dropped their 182.06 GeForce drivers to coincide with the launch of F.E.A.R. 2. To sum up the relevant bits of the email, the question was “Why are NVIDIA’s drivers so big?”

Although the most immediate answer to that is not particularly interesting – they’re as big as they are because they need to be – this made for a great excuse to do some exploratory dissection. The latest GeForce Windows Vista 64 drivers weigh in at 100MB, which is not necessarily huge given that broadband is (mostly) plentiful and disk space is cheap, but nonetheless it’s a bit of an infamous mark to hit, if only because it wasn’t that long ago that 100MB was the size of an entire operating system. Just to be clear, we’re not knocking NVIDIA here (we’ll get to why everything is so big in a moment), but the size of their drivers does make for an interesting discussion. What’s in a 100MB driver?


NVIDIA 182.06

We’re not going to list every single file (there are 59 of them, most of which are small files we don’t care about) but we’ll hit the highlights. Right off the bat, we run into the matter of redundancy. Anyone who has seen the structure of Vista 64 has seen that there are a number of seemingly redundant applications and libraries. The reason for this is that Vista 64 needs both 64bit and 32bit versions of most libraries so that it can run native 64bit applications and run 32bit applications through WoW64. This kind of redundancy sneaks into video drivers too due to the way the OS is structured. All told, the Vista 64 driver package is 18MB larger than the Vista 32 package due to the need for a separate 64bit OpenGL driver (nvoglv64), a separate 64bit Direct3D driver (nvd3dumx), and finally due to general binary bloat (64bit binaries are slightly larger due to longer instruction encodings and pointers).
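
For the curious, this split is easy to verify by hand. Below is a minimal Python sketch that reads the COFF Machine field of each binary in an extracted driver package to report whether it’s a 32bit or 64bit build; the extraction path is a hypothetical example, so point it at wherever the installer actually unpacked.

```python
import struct
from pathlib import Path

# IMAGE_FILE_MACHINE_* values for the two architectures we care about.
MACHINE_TYPES = {0x014C: "32bit (x86)", 0x8664: "64bit (x64)"}

def pe_architecture(path: Path) -> str:
    """Read a PE binary's COFF header and report its target architecture."""
    with path.open("rb") as f:
        if f.read(2) != b"MZ":                    # DOS header magic
            return "not a PE binary"
        f.seek(0x3C)                              # e_lfanew: offset of the PE header
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":            # PE signature
            return "not a PE binary"
        (machine,) = struct.unpack("<H", f.read(2))
        return MACHINE_TYPES.get(machine, hex(machine))

# Hypothetical path: point this at wherever the package was extracted.
package = Path(r"C:\NVIDIA\182.06")
for binary in sorted(package.rglob("*")):
    if binary.suffix.lower() in {".dll", ".sys", ".exe"}:
        print(f"{binary.name:<28} {pe_architecture(binary)}")
```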

Next on the list we have the NVIDIA control panel, which is spread amongst several files. We’ll call it 15MB, but it’s probably a bit larger than that. We’ll follow that up with the Vista kernel driver (nvlddmkm) at nearly 5MB, the 32bit OpenGL driver (nvoglv32) at 4.5MB, the display server (nvDispS) at 3.5MB, and the CUDA driver (nvcuda) well down the list at 1.2MB. There are a few other files bigger than this, and many more smaller than this, but it quickly adds up.
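
This kind of accounting is straightforward to reproduce. Here’s a short Python sketch that ranks the files in an extracted package by size – again assuming a hypothetical extraction path – which gets you a breakdown much like the one above.

```python
from pathlib import Path

# Hypothetical path: point this at the folder the installer unpacked to.
package = Path(r"C:\NVIDIA\182.06")

# Rank every file in the package by size, largest first.
files = [(f.stat().st_size, f.name) for f in package.rglob("*") if f.is_file()]
for size, name in sorted(files, reverse=True)[:10]:
    print(f"{size / 2**20:7.2f} MB  {name}")
```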

The single biggest file, however, is PhysX_9.09.0203_SystemSoftware, which as the name implies is the PhysX installer. The PhysX software is for all intents and purposes its own beast; it’s a CUDA application that doubles as its own drivers and libraries. Of the 100MB for the entire GeForce driver package, 40MB of that is just for PhysX. As for why it’s so big, most of this is due to the structure of the PhysX drivers as inherited by NVIDIA upon their purchase of AGEIA last year. Every version of the PhysX middleware requires its own version of the PhysX core library (PhysXCore), along with some object code for the still-supported AGEIA PCI and PCIe PPUs. There are 25 versions of the middleware included in NVIDIA’s PhysX drivers at this moment, which is what leads to the large size of the PhysX installer.
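
As a back-of-the-envelope figure (our arithmetic, not an official number), that works out to roughly 1.6MB of compressed payload for every bundled middleware version:

```python
# Back-of-the-envelope math from the figures above (our estimates,
# not official numbers): 25 bundled middleware versions, 40MB installer.
physx_installer_mb = 40
middleware_versions = 25
print(f"~{physx_installer_mb / middleware_versions:.1f} MB per bundled version")
# -> ~1.6 MB per bundled version
```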

Finally, there is the matter of the compressed and installed size of NVIDIA’s drivers. So far we’ve been quoting the compressed size of everything since that’s how the driver is downloaded, but it’s worth mentioning the installed (uncompressed) size of the drivers too. This isn’t quite as easy to track and add up as with the compressed drivers, but we’d estimate the total size is in the ballpark of 250MB. This is roughly split evenly between the GPU drivers and the PhysX drivers. Also keep in mind that this doesn’t include NVIDIA’s optional System Tools software needed for GPU overclocking, GPU monitoring, etc. That’s another 82MB compressed.
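
If you want to measure the installed footprint on your own machine, a simple directory walk does the job. The sketch below assumes a hypothetical install location – the real paths vary by OS and driver version – and also spells out the rough 2.5x expansion our estimates imply:

```python
from pathlib import Path

def dir_size_mb(root: str) -> float:
    """Total size of all files under root, in MB."""
    return sum(p.stat().st_size for p in Path(root).rglob("*") if p.is_file()) / 2**20

# Hypothetical install location; the real paths vary by OS and driver version.
install_dir = r"C:\Program Files\NVIDIA Corporation"
print(f"{dir_size_mb(install_dir):.0f} MB installed")

# The rough expansion factor implied by our estimates above.
print(f"~{250 / 100:.1f}x growth from download to install")
```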

We also took a quick look at some previous GeForce driver packages for Vista 64, to get an idea of how much the drivers have grown over the past couple of years. Drivers are like any other software: more computing power and more space eventually lead to everything getting a bit bigger, so there’s no real surprise here that they grew in size.


NVIDIA 163.75


NVIDIA GeForce Driver Installer Size (Vista 64)

Version     PhysX Installer     Total Size
163.75      N/A                 43MB
175.19      N/A                 50MB
178.13      50MB                103MB
180.48      35MB                91MB
182.06      40MB                100MB


The 163 drivers are from November of 2007, so this covers roughly a 1.25 year time span. The big leap in size came with the 178 drivers, when NVIDIA started including the PhysX installer with the GeForce drivers; that actually made those drivers slightly bigger than today’s, at 103MB (due in large part to a 50MB PhysX installer). Excluding the PhysX installer, there’s no specific pattern to the growth: between 163 and 182 virtually every driver file got bigger. The control panel and OpenGL drivers are the biggest culprits here due to their large size in the first place. As for the PhysX installer, the aforementioned need for it to include several versions of the PhysX libraries makes it an outlier. NVIDIA did beat 15MB out of it for the 180 drivers (down to 35MB), only for it to jump back up to 40MB for 182. While we only have a short window of time to judge from, so far it’s the single biggest reason that NVIDIA’s GeForce drivers are growing as much as they are, and it looks like that will be continuing into the future.
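
To make that PhysX-excluded trend concrete, here’s a quick Python sketch that subtracts the PhysX installer from each release in the table above; the GPU driver portion alone grows steadily from 43MB to roughly 60MB over the span:

```python
# (version, physx_mb, total_mb) taken from the table above; None = no PhysX.
releases = [
    ("163.75", None, 43),
    ("175.19", None, 50),
    ("178.13", 50, 103),
    ("180.48", 35, 91),
    ("182.06", 40, 100),
]

for version, physx_mb, total_mb in releases:
    gpu_only = total_mb - (physx_mb or 0)
    print(f"{version}: {gpu_only} MB excluding PhysX")
# -> 43, 50, 53, 56, 60: steady growth in the GPU driver alone
```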

Not to be left out, we also cracked open ATI’s latest Catalyst 9.2 drivers to take a look at their size. The 62MB driver package is a bit bigger on the driver side than the control panel side (40MB/22MB or so) and we’d estimate the total installed size is around 140MB, although it’s even harder to gauge than NVIDIA’s drivers. This makes them roughly as big as NVIDIA’s own GPU drivers and control panel; the difference between the two comes down to the PhysX software.

And for those of you curious about platform differences, the latest NVIDIA GeForce drivers for Linux x64 (180.29) are 20MB.


ATI Catalyst 9.2

Comments

  • nycromes - Thursday, February 26, 2009 - link

    This is the age-old question of which came first, the chicken or the egg. Or the question about alternative fuels for vehicles: how do we get stations to carry a new fuel type when few cars use it, and how do we get people to buy the cars when they can't get fuel for them in most locations?


    NVIDIA is spreading the code out through their platform automatically so that developers can start using it on a mass scale. Otherwise it would be hard to get developers to code for it and it would flop. In other words, they are building the infrastructure that will enable the success of their system/platform.
  • TA152H - Thursday, February 26, 2009 - link

    You don't think Nvidia's been knocked down to their knees already?

    Show me someone that owns Nvidia stock, and I'll show you an idiot. With Intel about to enter the GPU fray, their life is going to get a lot more difficult, and soon. More than that, Intel despises them for their obnoxious advertising and is going after them pretty aggressively now. They were much better off staying under Intel's radar.

    The reality is, with more and more getting built into processors (the memory interface, for example, and soon the GPU), what can be added outside of them becomes less, and that's where Nvidia survives. Since ATI makes a very competitive, if not generally better, GPU right now, it's even worse for them.

    Once Intel enters the scene, it's going to be ugly. They are an EXTREMELY formidable company when they make up their minds to compete in a market, not only because their design resources dwarf what is available to their competitors, but also because their manufacturing technology is much better than anything Nvidia or AMD can even approach. Couple that with their software prowess (greatly underestimated), and their greater ability to support their products, and induce other companies to use them with cash incentives, and you're whistling past the graveyard when they have you in their cross-hairs.

    I'm not saying Larrabee is going to be the greatest thing since Cheddar Cheese, but that's the whole point. It doesn't have to be. They are so much better with the things that surround it, that the design gets a lot of help. On top of this, even if it's not, the fact that Intel has entered the market very aggressively, and is taking it much more seriously means that whether this product is good or not, they will keep working on it, and they'll almost certainly get it right.

    On top of this, Nvidia's reputation has taken a beating with their faulty/defective products lately.

    Because AMD already makes CPUs, they can match Intel with regards to synergy between the two (for example, moving them on die), but Nvidia is out in the cold. I wouldn't use their stock to wrap fish with. But, I've always hated the company for the same reasons you do, and only once bought a card based on their technology (and was irritated by it, and will never buy another again).
  • garydale - Thursday, February 26, 2009 - link

    Intel entered the graphics arena a long time ago but have yet to do anything significant. Basically, Intel graphics run a couple of generations behind what NVidia and ATI put out, even comparing apples to apples on the integrated GPU front.

    Can Intel catch up? Possibly, but it's not a given. Let's face it, developing a top-notch GPU is difficult. I for one welcome their efforts, because having a third major player has got to make for even fiercer competition. However, I think it'll be a while before they get out of the basement, and between now and then business plans can change.
  • TA152H - Thursday, February 26, 2009 - link

    Thanks for your response, now I understand why people still own Nvidia stock, but I'll tell you why I disagree with it.

    Intel never entered the high performance graphics market, even after they ate up Real3D and came out with the i740. The point of these GPUs was to help make the platform better, and to leave the high end stuff to Nvidia, 3DFX, ATI, etc... Clearly, they saw the need to at least offer chipsets with 3D accelerators, and have dominated that market ever since. Clearly, the market they targeted they were very successful in, and for that reason I wouldn't use their history as an indictment against their success, but an example of it.

    Now their target is a different market, and a market they had left alone. It's not a second attempt at this market after a failure; it's a first. And, maybe it will miss the mark. Maybe it will not meet expectations. Maybe it will be late and by the time it comes out, be obsolete when compared to its competitors. Even in these cases, it won't be the last word on it. They'll come out with another one, and another, and they do have a lot of smart people there who do learn. With all their advantages, particularly with manufacturing, it's so difficult to imagine them never being able to compete with a weak company like Nvidia, which doesn't even make a processor. A single-die GPU/processor has some nice advantages, not the least of which is that communication between the two is much faster, and splitting instructions should be a lot more efficient. It's just not something Nvidia has the option to do, but both AMD and Intel clearly feel there's a lot of good in it.

    As far as Larrabee being a niche product goes, since when does Intel target niches? I don't think they'd spend so much time and money developing it for a niche market. Of course, not everything goes as planned, so it's possible, but it's clearly not what they're targeting, and should it fall into this role, that certainly doesn't mean the successor will. It's not like the Itanium, where you have to rewrite a program or operating system to run on it. If the product is good, and Intel gets the drivers right, it's going to sell well. They've got everything covered extraordinarily well.

    Bet against Intel at your own peril. They do make mistakes, but, invariably, they recover from them. If you think it's a good idea to assume they'll always screw up, you should apply to AMD for a job. But, can they afford to hire anymore? Hmmm, I wonder why not? It's not JUST the economy.
  • aj28 - Thursday, February 26, 2009 - link

    I would go so far as to doubt that Intel's presence in the GPU market will even increase competition in the mainstream. Larrabee is an entirely different beast than cards on the market right now, and will undoubtedly cater to a niche market. Personally the only chunk of the graphics pie I can see Intel even having a chance at is high-end workstations, where their name alone will turn the heads of large corporations regardless of performance, and X86 technology might just be seriously adopted.
  • Jaguar36 - Thursday, February 26, 2009 - link

    The dissection was great, but what I don't understand is why the actual driver is so big. Assuming there are no graphics in there, 18MB is an awful lot of code. What all is it doing that makes the code so big?
  • LinkedKube - Sunday, March 1, 2009 - link

    It's making your video card run?
  • 7Enigma - Thursday, February 26, 2009 - link

    Individual game/program "optimizations" to beat the Orb and the competition? :) I kid, I kid.....kind of.
  • Targon - Thursday, February 26, 2009 - link

    When it comes to the total size of the driver package, you have to figure that both AMD and NVIDIA are including drivers for not just YOUR video card, but also for all devices in the family.

    The ATI driver package includes drivers for the Radeon 9500-9800 family, the X300-X800, the X1300-X1800, and all cards going forward from there. Each of these product families needs its own drivers, even though the interface to those drivers may be the same (Catalyst Control Center).

    In addition to this, AMD also includes the chipset and AVIVO drivers in their driver packages, which adds to how large each release is.

    The idea behind these huge driver releases is that most consumers have no idea what sort of video card or GPU they have in their system, so AMD and NVIDIA make just one big package that has all the drivers included. Since there are SLI and Crossfire optimizations in the drivers as well, all of that extra stuff gets included in the same drivers.

    Finally, while the drivers themselves are important, the connection between the control application and the drivers also requires a bit of code. So, AMD just putting support for Catalyst Control Center into their drivers will increase the size. Being able to adjust the performance vs. quality slider, or force AA in the drivers rather than making the application turn it on or off, is a part of this. If all of that extra functionality were cut out, drivers would probably be a lot smaller than they are today.
  • Matt Campbell - Thursday, February 26, 2009 - link

    I was just pondering this a few days ago. Thanks, neat breakdown Ryan.
