Taken from Mystique G200 Review
To fully understand where the performance and image quality come from when looking at a video card such as the Millennium G200, we must turn our attention to the heart of the card, the chipset - in this case, the Matrox G200.
Imagine that you are on an 8-lane highway. The eight lanes allow more traffic to move from one end to the other, but there is a catch: the cars can only travel in one direction at a time, meaning all eight lanes must either carry traffic up the highway or down it, never both at once. This is the limitation of a single internal 128-bit data bus on a video card: on any given clock cycle, data can only flow in one direction (to the graphics engine), and only on the following clock cycle can data be transferred back the other way (from the graphics engine). While this approach does have its benefits, when dealing with 2D images and bitmaps, where the individual transfers remain quite small (less than 128 bits), there is a much more efficient way of approaching this.
Let's take that highway example from above, but instead of one 8-lane highway, split it into four lanes going and four lanes coming. Now four lanes of cars can travel in one direction at the same time as four lanes travel in the other (four lanes can be leaving the city while four lanes are entering it). If there is no need for all eight lanes to run in a single direction at once, the original 8-lane highway isn't as efficient as this modified 4/4-lane arrangement. The same theory applies to the Matrox G200.
Instead of occupying the entire width of a 128-bit bus to transfer data in 64-bit chunks, why not create a dual 64-bit setup, with one bus dedicated to sending data to the graphics engine and the other dedicated to receiving data from it? This is exactly what the G200's 128-bit DualBus architecture is: in essence, two 64-bit buses offering the same combined bandwidth as a single 128-bit data bus while allowing data to travel to and from the graphics engine in parallel. It is this technology that gives the G200 the edge over the competition in 2D performance, allowing it to drive a 24-bit desktop at a faster level of performance than most of the competition can manage at 16-bit color depths.
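The cycle-counting advantage can be sketched in a few lines. This is a simplified model, not Matrox's actual bus logic: it just counts how many clock cycles each layout needs to service a mix of small (64-bit or less) transfers in both directions.

```python
# Hypothetical model of the two bus layouts discussed above.

def cycles_single_128(sends, receives):
    # One 128-bit half-duplex bus: each cycle carries data in only one
    # direction, so sends and receives must take turns.
    return sends + receives

def cycles_dualbus(sends, receives):
    # Two dedicated 64-bit buses: a send and a receive can happen in
    # the same cycle, so only the longer queue sets the total time.
    return max(sends, receives)

# 100 small 2D transfers each way: the DualBus finishes in half the cycles.
print(cycles_single_128(100, 100))  # 200
print(cycles_dualbus(100, 100))     # 100
```

When traffic is lopsided (say, mostly sends), the dual-bus layout loses its advantage, which is why the article notes it pays off specifically for small, two-way 2D workloads.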
3D Performance and Image Quality
What good is a 2D/3D combo card without a powerful 3D punch to back up its 2D performance? The G200 doesn't lose any points here either: with a 100 million pixels/second fill rate, one would expect the G200 to hold its ground fairly well in 3D games and applications. Keep in mind that the G200 was never intended to be a Voodoo2 killer, but rather a lower-cost alternative for those who don't have the funds for a Voodoo2 plus a separate 2D accelerator, which justifies the sub-Voodoo2 levels of performance you'll be seeing from the G200.
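To put that fill-rate figure in perspective, here is a back-of-the-envelope estimate. The only number from the review is the 100 Mpixel/s rating; the resolutions and the overdraw factor are illustrative assumptions, and real frame rates also depend on geometry, textures, and bus traffic.

```python
# Rough ceiling on frame rate implied by fill rate alone (assumed workload).

FILL_RATE = 100_000_000  # pixels per second, as quoted for the G200

def max_fps(width, height, overdraw=1.0):
    # overdraw > 1 models pixels that get drawn more than once per frame
    return FILL_RATE / (width * height * overdraw)

print(round(max_fps(640, 480)))       # 326  (fill-rate ceiling only)
print(round(max_fps(800, 600)))       # 208
print(round(max_fps(800, 600, 2.5)))  # 83   (with an assumed 2.5x overdraw)
```

Even with generous overdraw, the fill rate leaves headroom at the resolutions common at the time, which matches the review's expectation that the G200 holds its ground without matching a Voodoo2.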
One advantage, aside from price, that the G200 holds over the Voodoo2 as well as all other 2D/3D combo cards is its top-notch image quality. Using Vibrant Color Quality (VCQ) Rendering, the G200 is capable of rendering images at 32 bits per pixel (8 bits each for red, green, blue, and alpha); even if rendering is set to 16 bpp, the internal calculations are performed at 32-bpp accuracy and dithered down to 16 bpp when the image is displayed. Matrox illustrated this difference quite nicely in their VCQ whitepaper, which is summarized beautifully by the two screenshots they provided below:
The difference is clearly noticeable in the 32-bit Rendered Incoming screenshot. Whether or not this will mean anything to you as the frames are flying past while playing is up to you to decide; however, once you experience the G200's image quality you'll never want to go back.
Another excellent example Matrox used was the following screenshot comparison of a 32-bit texture rendered at both 32 bits and 16 bits, visually illustrating the difference between the G200 and a chipset only capable of 16-bit rendering in a 32-bit scenario.
As more and more games begin to use 32-bit textures, you can expect to notice differences like the one illustrated above between a graphics card capable of 32-bit rendering and one limited to 16-bit rendering. Since the number of games that currently make use of 32-bit textures is extremely low, this feature doesn't carry as much weight as, say, performance. But you can rest assured knowing that one day you'll be able to make more use of the G200's advanced rendering capabilities - better safe than dithered, in this case.