7 Comments

  • JarredWalton - Wednesday, September 29, 2004 - link

    I take it you're referring to the Matrox Parhelia, Saist. I guess I didn't make it entirely clear that I was talking about *high powered* graphics chips running more than two monitors. ;)

    If you just want multiple monitors for an application - like four or even eight - you can do that using PCI graphics cards. If you want to do some sort of 3D rendering across all the monitors, though, you need a powerful graphics processor. For gaming, I think multiple monitors are currently in the exotic computing realm, but businesses might find them useful.

    Anyway, ATI still has their AFR patents, which I wouldn't be surprised to see them use. We'll just have to wait and see. :p
  • Saist - Wednesday, September 29, 2004 - link

    [quote]I haven't heard anything about ATI supporting some variant of SLI, but it is a possibility. You also have companies like Alienware that have a different technology for rendering using more than one graphics card. However, there are also people that would like to use more than two monitors. Having two PCIe graphics slots would allow for four monitors, and in rendering applications like CAD/CAM it would be beneficial to have more than one graphics processor.
    [/quote]

    cough *surround view* cough
  • JarredWalton - Wednesday, September 29, 2004 - link

    I haven't heard anything about ATI supporting some variant of SLI, but it is a possibility. You also have companies like Alienware that have a different technology for rendering using more than one graphics card. However, there are also people that would like to use more than two monitors. Having two PCIe graphics slots would allow for four monitors, and in rendering applications like CAD/CAM it would be beneficial to have more than one graphics processor.

    Anyway, ATI did do the old Rage MAXX card in the past, so multiple PCIe slots with ATI's alternate frame rendering technology could be in the works. I have *no* inside information on this, though (and if AT has signed an NDA on anything about this, I haven't heard it). In other words, this is all rampant speculation on my part. :)
  • kalaap - Wednesday, September 29, 2004 - link

    if ati is supporting dual gfx card slots, does that mean they're going to have an sli variation also?

    i mean, why would they just support it for nvidia only?
  • JarredWalton - Tuesday, September 28, 2004 - link

    Actually, that's not correct, ROQSpock. The X300 is indeed DX9 hardware. The core is codenamed RV370, while the X600 is codenamed RV380. To be honest, other than clock speeds, I'm not entirely sure what the difference between those two cores is. It's sort of like the R350 and R360 cores (9800 Pro and 9800 XT - although the R360 was later used for both). Basically, the X300 and X300 SE are PCIe versions of the 9600 and 9600 SE, while the X600 and X600 Pro are PCIe versions of the 9600 Pro and 9600 XT. That's my understanding of things.
  • ROQSpock - Tuesday, September 28, 2004 - link

    I'm pretty sure they meant to say X600, not X300. The X300 is not DirectX 9 hardware.
  • ImJacksAmygdala - Monday, September 27, 2004 - link

    Am I the only one who sees the irony that, by the time HL2 comes out, ATI's integrated chipset graphics core will run HL2 as well as the 9600XT cards they were bundling HL2 vouchers with... LOL!

    I feel kind of bad for the guys who geared up just for HL2 back in Sept 2003 - now look at the GF6600 numbers in comparison... One long development cycle, one possible pending lawsuit, and one video card generation later, and HL2 still isn't here...

    Yawn... Bring out the nForce4 already! I want my dual GF6600s in SLI with a (939) 3200+ on an nForce4 already... If I said that's how HL2 was meant to be played, would I sound like a fanboy?