
  • SeannyB - Wednesday, September 03, 2014 - link

    Now there's a name that evokes some nostalgia. A friend of mine had one of those Matrox Parhelia triple-head setups, which was pretty novel in the days before ubiquitous 27" 1440p displays. And the 64-bit color rendering was a sight to see, at least in Matrox's own tech demos. Of course the performance wasn't all that, but image quality was Matrox's emphasis. I owned a G400 Max (1999) myself and I remember that being the case as well: mediocre performance, fantastic image quality.
  • SeannyB - Wednesday, September 03, 2014 - link

    Or no, it was 10-bit-per-channel color, now that I'm skimming Anand's old review.
  • B3an - Thursday, September 04, 2014 - link

    I always thought one of Matrox's main selling points was better image and colour output compared to AMD or Nvidia (higher bit depth and more accuracy). This article doesn't mention anything about that. So how does colour fare now that they will use AMD GPUs?
  • Senti - Thursday, September 04, 2014 - link

    Likely all AMD cards are capable of 10-bit color, so there should be no concerns in this area. "Image quality" in the digital era is just bit-exactness (especially in 2D), so there should be no problems here either.

    I hope Matrox can give AMD some push to improve their quite pathetic workstation drivers. It's mostly bugs that annoy people today, not features or GPU speed.
  • silverblue - Thursday, September 04, 2014 - link

    They weren't bad (as you said, excellent image quality, and two RAMDACs); it's just that the price hurt. A friend had the G400 and it outperformed my Savage4 Pro quite nicely, which certainly helped in Unreal Tournament.

    The Parhelia... eek. Well, the potential was there, but underperforming most of the Ti-4xxx line wasn't exactly impressive. Drivers did improve things over time, but the card was too expensive to compete.
  • ddriver - Thursday, September 04, 2014 - link

    Is there any nostalgia for the abysmal 3D performance of an insanely expensive product?
  • LemmingOverlord - Wednesday, September 03, 2014 - link

    Matrox was tremendous when it came to 2D and image quality, never having actually broken into the 3D market proper. Their biggest asset was their software, rather than their hardware. Most corporate PCs came with inexpensive Matrox cards, which were good for basic computing but rubbish at any type of 3D.

    I really don't see how a company like Matrox remains competitive, tho'. By becoming an AMD vendor they could just be signalling they've reached the end.
  • ToniCipriani - Wednesday, September 03, 2014 - link

    It's kinda sad, actually. I still remember the MGA Millennium and Mystique cards. Having one of those back then was a luxury.
  • caseyse - Thursday, September 04, 2014 - link

    When I purchased my MGA Millennium with its 4MB VRAM expansion board (6MB RAM total), it was the top performer at the time. A couple of generations prior to that card, I was using their blazing-fast Mach32. I also remember the scandal Matrox found itself in when it was discovered they had written a routine into their firmware to recognize the leading graphics benchmark of the day and produce favorable results.
  • Creig - Thursday, September 04, 2014 - link

    Pretty sure Nvidia had them beat:
  • StevoLincolnite - Friday, September 05, 2014 - link

    Yep, I had the Millennium paired up with a 3DFX Voodoo 2, good times! I actually still have those cards boxed up somewhere.
  • tipoo - Wednesday, September 03, 2014 - link

    I think they still had a niche in the medical space, for highly accurate 2D renders of medical scans on high-res displays.
  • Guspaz - Thursday, September 04, 2014 - link

    I worked at Matrox once, and I'll just limit myself to saying that I would be surprised to hear somebody say that their software was an asset.
  • Guspaz - Thursday, September 04, 2014 - link

    Actually, to address the competitiveness thing, they do more than just graphics cards. They have an imaging division (frame grabbers and machine vision and stuff) and a professional video division (pro AV gear for broadcast and live video and stuff).
  • nerd1 - Wednesday, September 03, 2014 - link

    I still remember the days when you had to purchase a separate VGA card, and a Matrox card with dual VGA output was quite a popular choice (plus a Voodoo accelerator card).
  • Laxaa - Thursday, September 04, 2014 - link

    Blast from the past indeed!
  • MartinT - Thursday, September 04, 2014 - link

    Kind of odd to see them going with AMD, but I guess AMD's refusal to implement proper power saving modes for more than a single display (and resulting considerably higher power use in dual-display modes) becomes irrelevant when you're talking about using >3 displays at once, and the higher total number of outputs per GPU takes over as the main concern.

    As for Matrox, well, I enjoyed my Mystique 220/Voodoo² SLI combo back in '97. They never figured out how to do 3D, though, good thing they found a niche.
  • wireframed - Saturday, September 06, 2014 - link

    I have two displays on my R9 290, and it clocks down to 300/150MHz just fine. My GeForce GTX 570 originally didn't clock down from its 3D speeds with more than one display, but they kinda fixed that. (Kinda, because originally you had to have two identical-resolution displays for it to work. Not sure if that's still the case.)
  • iwod - Thursday, September 04, 2014 - link

    Ah... good old Matrox...
    My first thought on the story was: how does one card actually drive that many displays? Isn't there bandwidth contention? Or do they use a low resolution and scale up?
  • JarredWalton - Thursday, September 04, 2014 - link

    The current M9188 has 2GB RAM and is focused primarily on 2D use cases. It can drive all eight displays at up to 2560x1600 over DisplayPort, or 1920x1200 over DVI. How it works with eight displays for "normal" use is something I couldn't say, but apparently the markets that use that many monitors are okay with its performance.
  • wireframed - Saturday, September 06, 2014 - link

    They're just pushing 2D data, so it's not like there are gigabytes and gigabytes of data. You can easily calculate what's needed to push 2560x1600 at 32 bits per pixel (24-bit color padded to 4 bytes) for however many fps you need: around 16.4MB per frame, so 30fps or even 60fps wouldn't require much bandwidth, even for 8 or 16 displays. Sixteen displays at 60fps come to a little over 15GB/s, easily doable with a modern card; a cheap card like the R7 260 has around 90GB/s of memory bandwidth. You just need enough lanes to the display connectors.
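    A quick sanity check of the arithmetic above (a sketch in Python; the 60fps figure and the display counts are illustrative assumptions for the estimate, not Matrox specs):

    ```python
    # Back-of-the-envelope framebuffer scanout bandwidth estimate.
    # Assumes 32 bits (4 bytes) per pixel: 24-bit color padded to a full dword.
    def scanout_bandwidth_gbps(width, height, bytes_per_pixel, fps, displays):
        """Raw bytes per second needed to refresh `displays` screens, in GB/s."""
        return width * height * bytes_per_pixel * fps * displays / 1e9

    per_frame_mb = 2560 * 1600 * 4 / 1e6                     # ~16.4 MB per frame
    eight = scanout_bandwidth_gbps(2560, 1600, 4, 60, 8)     # ~7.9 GB/s
    sixteen = scanout_bandwidth_gbps(2560, 1600, 4, 60, 16)  # ~15.7 GB/s
    print(round(per_frame_mb, 1), round(eight, 1), round(sixteen, 1))
    ```

    So eight 2560x1600 displays at 60fps need roughly 7.9GB/s of raw scanout bandwidth and sixteen need about 15.7GB/s, comfortably under the ~90GB/s of even a budget card.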
  • iwod - Thursday, September 04, 2014 - link

    Another thing: why couldn't Matrox just license GPU IP from Imagination or ARM?
  • deltatux - Thursday, September 04, 2014 - link

    Probably because those parts might not deliver the graphics performance needed for professional displays? They probably had their reasons for not using PowerVR or Mali.
  • JarredWalton - Thursday, September 04, 2014 - link

    You'd need to create robust Windows drivers for PowerVR or Mali or any other GPU, and as we've seen from AMD, Intel, and NVIDIA over the years, that's not always easy to get right.
  • frenchy_2001 - Thursday, September 04, 2014 - link

    Why not license Kepler from NV then? ;)
  • deltatux - Thursday, September 04, 2014 - link

    Little-known fact: Dell sells their PowerEdge T320 with Matrox graphics. I use Matrox on a daily basis lol.
  • username609 - Tuesday, September 16, 2014 - link

    HP's servers have alternated between "embedded Matrox" and "embedded AMD" for years.
  • Stuka87 - Thursday, September 04, 2014 - link

    I had a G400+, and it was a fine card for the time, but it was costly. I later replaced it with a GeForce3 Ti 200, which blew it out of the water.

    But the Matrox card had some nice features and 2D image quality.
  • Syran - Thursday, September 04, 2014 - link

    I'll always remember Matrox cards fondly. They were the perfect pairing for the Voodoo 2s... *sigh* My friend and I each had a G200 + SLI Voodoo 2s; those were amazing setups.
  • digitalsolo - Thursday, September 04, 2014 - link

    This makes me reminisce about my old Matrox MGA Mystique 4MB back in the day. Wonderful little card that was; it played Mechwarrior 2 and Descent with the upgraded texture packs!
  • Wolfpup - Thursday, September 04, 2014 - link

    Makes sense to me that Matrox would refocus and not design GPUs themselves.

    Geez, I remember when Matrox was a big name for consumers. I remember some of their ads I think! Matrox Millennium? I think that's the Matrox GPU I had in a system in...I guess the late 90s?

    I remember their image quality was excellent back when not everyone's was. Unreal 1 ran great on it at least! :-D
  • xrror - Thursday, September 04, 2014 - link

    It's kinda irritating to see the hand-wave that Matrox just didn't try hard enough, or were fools, when they made the Parhelia. No, even NVIDIA was blindsided by how spectacularly the ATi/ArtX "Khan" Radeon 9700 Pro came out of the gate.

    This isn't meant to be some fanboi rant; it's more that, holy moly, the 9700 Pro really did redefine the game, and if you weren't living/gaming/LAN-partying during that era, that kinda gets missed.

    If anything, the Matrox Parhelia should go down as a case study in about the most pathologically WORST timing in video card history. Matrox really tried and put their R&D heart into releasing the card; nobody could have guessed the enormous jump ATi would push into the market a mere few months later.

    Remember, everyone: the 9700 Pro was the first time you could even consider having AA enabled without it being a slide show.

    historical rant over. sorry all
  • wireframed - Saturday, September 06, 2014 - link

    Yeah, it was more than double the performance of the previous generation, as I recall. It took YEARS before anything significantly outperformed it, and it remained a viable gaming card for years after its release. And that was in a time when games generally forced upgrades a lot more than today.
  • ecuador - Friday, September 05, 2014 - link

    Ah, I remember when I got my first PC (around '94-'95?), those Matrox MGA Millenniums were way out of my price range (I had to go for the cheapest S3 or CL), so when, a decade later, we were throwing away some workstations at my university, I pulled a Millennium with the 6MB daughter-card to keep as a souvenir. It felt a bit like throwing away a 10-year-old Ferrari, especially considering Ferrari no longer made sports cars by then (just farm tractors).
  • Daniklad - Monday, September 08, 2014 - link

    I have to step in here. Matrox was an innovator, and I'm sorry they were forced out of the consumer market. Matrox had better-quality hardware for VGA output but lost that advantage when digital displays became standard. Superior color processing is another advantage they no longer have.

    The G400 had some quite advanced capabilities for its time that neither NVIDIA nor ATI had, like environment-mapped bump mapping (EMBM: reflections on bumpy surfaces, basically).

    The Parhelia had 16x antialiasing in a special mode (FAA) with basically no performance impact, because it only processed edge pixels (5-10% of the screen), which also led to crisper-looking texturing. Unfortunately, not all games were compatible.
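    For what it's worth, that 5-10% edge-coverage figure makes the savings easy to estimate with a toy cost model (purely illustrative; the simple per-pixel multiplier below is my own assumption, not a Matrox number, and real FAA costs also depend on memory traffic):

    ```python
    # Toy cost model: edge pixels get `samples` AA samples, everything else gets 1.
    # Returns extra shading/fill work relative to no antialiasing at all.
    def aa_cost_multiplier(samples, edge_fraction):
        return edge_fraction * samples + (1 - edge_fraction) * 1

    brute_force = aa_cost_multiplier(16, 1.0)   # 16x on every pixel: 16x the work
    faa_low = aa_cost_multiplier(16, 0.05)      # 5% edge pixels: ~1.75x
    faa_high = aa_cost_multiplier(16, 0.10)     # 10% edge pixels: ~2.5x
    print(brute_force, round(faa_low, 2), round(faa_high, 2))
    ```

    So edge-only 16x costs roughly 1.75-2.5x the no-AA baseline in this model, versus 16x for brute-force supersampling, which is why the hit felt negligible in practice.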

    They also had the first hardware that could do Hardware Displacement Mapping, which Anand reviewed in May 2002. Today we know this as hardware tessellation and geometry shaders.

    I hope AMD and Matrox can lean on each other and together avoid dropping into the abyss. We need them more than many people realize.
  • MrGoodBytes - Tuesday, October 07, 2014 - link

    Glad I'm not the only one who was shocked to see Matrox news and take a deep trip down memory lane. As someone who has only run laptops for the past decade-plus, I really wish they had moved into laptop docking stations/USB. I still struggle daily with MMS docking-station quirks, and it's a constant reminder that "if Matrox did this, it'd just be easy".

    My last desktop still runs a Parhelia. Holy crap, that triple-monitor gaming was mind-blowing at the time. I feel bad for anyone who didn't get to experience Wolfenstein with that back then. But yes, it did also kinda suck at the 3D thing; not their fault, as stated earlier, just really unfortunate market timing. I know multi-monitor is pretty much a given today, but they are the reason for that!!! And they just always did it effortlessly. I remember in college playing around with how many monitors I could attach to a single computer, having access to a stack of G200 MMS boards, and the answer was 12 CRTs before we ran into issues with throwing breakers and just having long enough VGA cords. The drivers handled it trivially... blew my mind. A year before, in 2001, I had some capacitors blow on my motherboard, and I'll be damned if it didn't fry every single component of my computer: RAM, CPU, HDs, MONITOR... but that G450 still works to this day.

    The answer is that if Matrox comes out with a card with some remedial acceleration, and PCIe stays compatible for a while, I'd buy it in a heartbeat regardless of price.
