If you go back far enough in the computer industry, there have been many successful video card companies. Before the 3D craze kicked off, some of the fastest 2D video cards came courtesy of Matrox, and while they made some attempts at producing compelling 3D graphics cards, they were never able to grab the performance crown from NVIDIA or ATI. Their last real attempt at the 3D graphics market came in 2002 with the Parhelia-512, and as was the case with previous efforts, it fell short. Interestingly, the Parhelia-512 supported "surround gaming" long before AMD's Eyefinity, and that may have opened the gates for what would become Matrox's core focus over the next decade: multi-display video cards.

Since 2002, there haven't been many reviews of Matrox cards, as the company's focus shifted to industries that need not just two or three but potentially a dozen or more displays, all running from a single system. Their last graphics card update was in 2009, and since then the top product has been the M9188, a single card capable of driving eight DisplayPort or DVI connections, with the possibility of using two cards to drive 16 displays. Who needs that many displays? The financial and security markets are two easy examples, as both have use cases where six or more displays are "reasonable", and digital signage is another category where Matrox can provide useful technology. These are all professional markets, and the M9188 is priced accordingly ($1500+), but if you were looking to build a system with good graphics performance, Matrox hasn't really been relevant, as their cards focus almost exclusively on 2D performance these days.

That might be changing with future products given today's announcement, as Matrox will be switching to AMD-designed GPUs for their next generation of multi-display products. These will continue to support Matrox's PowerDesk desktop management software, but what's not clear is whether Matrox will be doing much in the way of customized hardware. The announcement states that "key features of the selected AMD GPU include 28nm technology with 1.5 billion transistors; DirectX 11.2, OpenGL 4.4 and OpenCL 1.2 compatibility; shader model 5.0; PCI Express 3.0 and 128-bit memory interface."

From that we can surmise that Matrox will be using a variant of AMD's Cape Verde GCN core, one of the lower performance GCN parts. In fact, Matrox may be using AMD's FirePro W600 cards, only with custom Matrox-developed software. This would also mean Matrox is looking at a maximum of six display outputs per graphics card (compared to eight on the M9188), but AMD can already run up to six GPUs in a system with the appropriate motherboard, meaning up to 36 displays from a single system is theoretically possible.

The hardware is of course only part of the equation, and Matrox's PowerDesk software is something that benefits many businesses and professionals. Matrox notes that "critical productivity-enhancing features available with Matrox PowerDesk software will continue to be supported on the next line of Matrox graphics cards designed with AMD GPUs." These features include the ability to configure and manage multi-display setups, which can get tricky once you move past two or three displays. PowerDesk has tools to configure stretching, cloning, pivot, bezel management, and other items that are important for a professional multi-display configuration.

There are plenty of upsides to this announcement. For one, it allows Matrox to reallocate resources that are currently going into hardware development and instead focus on their core competency, which at this point is multi-display solutions. PowerDesk is well regarded in their target market, and this will allow Matrox to continue to improve the platform without trying to design their own hardware. AMD benefits as they're able to partner with Matrox and potentially sell their GPUs at higher "professional" prices, and they may also increase their share of digital signage and other multi-display markets.

And of course the customers that purchase the cards benefit, as they get to move to a modern platform with support for the latest DirectX, OpenGL, and OpenCL libraries. Long term, this also opens the door for Matrox to offer substantially higher performance 3D solutions from AMD for customers that need such features. Overall, this announcement isn't likely to affect most computer users, but it's good to see Matrox still hanging around after several decades in the computer graphics industry, something many of their competitors from the 90s didn't manage.

Source: Matrox PR

36 Comments


  • Wolfpup - Thursday, September 4, 2014 - link

    Makes sense to me that Matrox would refocus and not design GPUs themselves.

    Geez, I remember when Matrox was a big name for consumers. I remember some of their ads I think! Matrox Millennium? I think that's the Matrox GPU I had in a system in...I guess the late 90s?

    I remember their image quality was excellent back when not everyone's were. Unreal 1 ran great on it at least! :-D
  • xrror - Thursday, September 4, 2014 - link

It's kinda irritating to see the hand-wave that Matrox just didn't try hard enough or were fools when they made the Parhelia. No, even NVIDIA was blindsided by how spectacularly the ATi/ArtX Kahn (the Radeon 9700 Pro) came out of the gate.

    This isn't meant to be some fanboi rant, it's more that holy moly the 9700Pro really did redefine the game - and if you weren't living/gaming/LAN party during that era, that kinda gets missed.

    If anything, the Matrox Parhelia should go down as a case study of having about the most pathologically WORST timing in video card history. Matrox really tried, and put their R&D heart into releasing the card - nobody could have guessed a mere few months later the enormous jump ATi would push into the market.

    Remember everyone, the 9700pro was the first time that you could even consider having AA enabled and it not be a slide show.

    historical rant over. sorry all
  • wireframed - Saturday, September 6, 2014 - link

Yeah, it was more than double the performance of the previous generation, as I recall. It took YEARS before anything significantly outperformed the card, and it was a viable gaming card for years after its release. And that was at a time when games generally forced upgrades a lot more than they do today.
  • ecuador - Friday, September 5, 2014 - link

Ah, I remember when I got my first PC (around 94-95?) those Matrox MGA Millenniums were way out of my price range (I had to go for the cheapest S3 or CL), so when, a decade later, we were throwing away some workstations at my University, I pulled a Millennium with the 6MB daughter-card to keep as a souvenir. It felt a bit like throwing away a 10-year-old Ferrari, especially considering Ferrari no longer made sports cars by then (just farm tractors).
  • Daniklad - Monday, September 8, 2014 - link

I have to step in here. Matrox was an innovator, and I'm sorry they were forced out of the consumer market. Matrox had better quality hardware for VGA output but lost that advantage when digital displays became standard. Their superior color processing is another advantage they no longer have.

The G400 had some quite advanced capabilities for its time that neither NVIDIA nor ATI had, like environment-mapped bump mapping (EMBM; reflections on bumpy surfaces, basically).

The Parhelia had 16x antialiasing in a special mode (FAA) with basically no performance impact, because it only processed edge pixels (5-10% of the screen), which also led to crisper-looking texturing. Unfortunately, not all games were compatible.

They also had the first hardware that could do Hardware Displacement Mapping, which Anand reviewed in May 2002. Today we know this as hardware tessellation and geometry shaders.

    I hope AMD and Matrox can lean on each other and together avoid dropping into the abyss. We need them more than many people realize.
  • MrGoodBytes - Tuesday, October 7, 2014 - link

Glad I'm not the only one who was shocked to see Matrox news and take a deep trip down memory lane. As someone who's only run laptops for the past decade-plus, I really wish they moved to laptop docking stations/USB. I still struggle daily with MMS docking station quirks, and it's a constant reminder that "if Matrox did this it'd just be easy".

My last desktop still runs a Parhelia. Holy crap, that triple-monitor gaming was mind-blowing at the time. I feel bad for anyone who didn't get to experience Wolfenstein with that back then. But yes, it did also kinda suck on the 3D thing; not their fault, as stated earlier, just really unfortunate market timing. I know MMS is pretty much a given today, but they are the reason for that!!! And they just always did it effortlessly. I remember in college playing around with how many monitors I could attach to a single computer, having access to a stack of G200MMS (lol, ISA boards), and the answer was 12 CRTs before we ran into issues with throwing breakers and just having long enough VGA cords. The drivers handled it trivially... blew my mind. A year before, in 2001, I had some capacitors blow on my motherboard, and I'll be damned if it didn't fry every single component of my computer: RAM, CPU, HDs, MONITOR... but that G450 still works to this day.

The answer is that if Matrox comes out with a card with some remedial acceleration, and PCIe is going to stay compatible for a while, I'd buy it in a heartbeat regardless of price.
