
38 Comments


  • Shark974 - Friday, December 23, 2005 - link

    Would you shut up about ATI availability!

    I haven't heard a peep from anybody about the fact that the 512 GTX has been out of stock at $750 basically SINCE LAUNCH.

    I'm getting tired of this biased CRAP.

    As somebody on another forum said about the CF review where Anand claimed a 512 GTX setup runs $1400: where? You can't buy them! Let alone for $700 each.

  • Slaimus - Tuesday, December 27, 2005 - link

    The GTX was the ONLY high end card that was available to buy on the day the NDAs expired.
  • DigitalFreak - Friday, December 23, 2005 - link

    ATI has pretty much paper launched everything since the X800XT PE, so I'm glad that AT keeps harping on them.

    The GTX 512 was a hard launch. There just weren't that many cards available, and they sold out quickly. Besides, how many people are going to purchase a $700 video card anyway?

    And yes, I've owned only ATI cards since the 8500...
  • Live - Friday, December 23, 2005 - link

    No way was the 7800 GTX 512 a real "hard launch". If so, then ATI "hard launched" the X1800 XL and XT as well; "there just weren't that many cards available".

    "Hard launch" and "There just weren't that many cards available" don’t go together.

    The 7800 GT(X) 256 launches were real hard launches, where cards were available from the get-go, availability just kept ramping up, and MSRP was soon reached. That’s just not the case for the 512 version.
  • bob661 - Monday, December 26, 2005 - link

    I guess you didn't bother to look at all when they were launched, because you could get 512 parts for about two weeks. Shit, Newegg had three 512 cards for about two weeks before they ran out. Nvidia launched and delivered actual product. Where was ATI's product?
  • bob661 - Monday, December 26, 2005 - link

    I wonder how many "launches" ATI will do on the X1900 parts?
  • Cygni - Friday, December 23, 2005 - link

    There is NOT a chipset called the 7800GTX 512. Memory size is decided by the BOARD MAKERS. If the board makers don't think there is enough demand, they won't make many.

    They hard launched the GTX, the end.
  • piroroadkill - Saturday, December 24, 2005 - link

    The GTX 512 is not at all the same as the GTX with more RAM. Do your homework.
  • Anton74 - Friday, December 23, 2005 - link

    The 7800 GTX 512 is in fact a different part than the 7800 GTX. The difference goes beyond the additional RAM.

    http://www.anandtech.com/video/showdoc.aspx?i=2607
  • ViRGE - Friday, December 23, 2005 - link

    Eh? They mentioned the 7800 GTX 512 just earlier today in the GDDR3 insider story.
  • Clauzii - Thursday, December 22, 2005 - link

    The R580 part starts to sound a little like the GPU for the XBOX 360 - hopefully the R580 will be a little better, then..

    Well, I'm ATI so it better be!
  • Furen - Thursday, December 22, 2005 - link

    I was under the impression that the R580 would be the first PC GPU to utilize unified shaders. If this were the case, then the 48 pixel shaders we hear about would instead be 48 general purpose shaders, which would make performance with them slightly lower than a 24 pixel pipeline part. I find it very doubtful that ATI would jump from 16 traditional pixel shaders to 48, since this would increase the die size tremendously.
  • coldpower27 - Saturday, December 24, 2005 - link

    No, R580 is still based on R520 technology, but more refined and enhanced; a change as large as unified shaders can only be implemented in R600/G80. So we're looking at shaders functionality-wise as capable as those used in R520.

    48 pixel shaders with 16 pipelines is believable on the 90nm process; 48 pixel shaders and 48 pipes isn't.
  • quasarsky - Thursday, December 22, 2005 - link

    ati went from 16 pixel shaders to 48?

    wow

    wonder what nvidia will go to from 24. :)

    if ati is as efficient with 48 shaders as they are with 16, that will be amazing!

    can you imagine 2 gpu dies on one card and then having two cards? that's like 192 shaders!!!

    i figure my x800xt aiw agp is enough for earth 2160 but i figured out a way to make that slow down. i had like 1,000 infantry go into battle in the lunar corporation campaign, and yeah it got choppy. lol :)
  • defter - Friday, December 23, 2005 - link

    "if ati is as efficient with 48 shaders as they are with 16, that will be amazing!"

    Unfortunately it won't be. RV530 has 12 pixel shaders and a 600MHz clock speed, yet it's slower than the GF6800GS with 12 pixel shaders and a 450MHz clock. RV530 isn't much faster than even the GF6600GT, which has only 8 pixel shaders @ 500MHz.

    The difference between R520 and R580 is equal to the difference between RV515 and RV530: the number of pixel shaders is tripled, but the number of texture units and ROPs isn't increased. Thus you won't be getting anywhere near 3x performance.
  • Shark974 - Friday, December 23, 2005 - link

    Nvidia is going to 32 pipes.

    And for the last time, R580 is FORTY-EIGHT TRUE PIPES. I've been arguing this on HardOCP for weeks and been RIGHT EVERY TIME.

    All the haters want to say it's all this other stuff, no.

    This will kill Nvidia in performance.
  • quasarsky - Saturday, December 24, 2005 - link

    i hope ur right dude and its 48 pipes :).

    i'd say it is possible. if you looked at an ati slideshow of their die map, everything they had on the bigger process was shrunk to fit in a way smaller space on the 90 nm die. so basically that leaves much more space for other stuff on the die, because what once took up the whole die now only takes up like 1/2 to 1/3 of that space. 32 extra pipes? yes. :)
  • coldpower27 - Saturday, December 24, 2005 - link

    Very doubtful it is 48 true pipes; if it's a 90nm product, it would simply be too expensive.
  • Griswold - Saturday, December 24, 2005 - link

    What does that have to do with the piece being 90nm?
  • coldpower27 - Saturday, December 24, 2005 - link

    The piece being on 90nm means that R580 is going to be more complex than the already large R520 die at 263 mm2; there is room to add some more functionality, but not an enormous amount more.

    48 pipes AND 48 shaders is possible if it is on 80nm.
  • DigitalFreak - Friday, December 23, 2005 - link

    So you work for ATI and know this for a fact?
  • shabby - Thursday, December 22, 2005 - link

    It can have 100 pixel shaders; it still only has 16 ROPs.
  • Clauzii - Thursday, December 22, 2005 - link

    Do you know that as a fact?

    And even if it only has 16 output pipelines, if the 48 shaders can finish the rest I don't think it's gonna be a big problem, 'cause the shaders do the dirty work...

    But yet again - we could then hope for R600 to be 32 pipelines with 64 shaders? :)
  • quasarsky - Thursday, December 22, 2005 - link

    i feel like a newb :-D lol.

    will ati ever move beyond 16 pixel pipelines?

    :(
  • xTYBALTx - Thursday, December 22, 2005 - link

    Your terminology is crossed.
  • SLIguy - Thursday, December 22, 2005 - link

    SLI/CrossFire are more than PR gimmicks. They give one the ability to run more games at higher resolutions and details than any single card solution, at least 90% of the time.

    F.E.A.R. and CoD2 are great examples of games that just don't run smoothly at maximum settings, even at just 1600x1200 and even with a single GTX 512, in my experience. I don't know about the X1800XT since I don't have one, but from the numbers I've seen, I doubt a single X1800XT can do any better.

    Now go SLI with GTX 512s, and now we're talking. F.E.A.R. and CoD2 just run like butter @ 1600x1200, maximum detail.

    Cost effective? No. Fun. Hell yeah!

    So while the dual card thing is probably never going to be something you pick up at Best Buy, it will be the choice of PC enthusiasts willing to spend the money.

    I too lament this master card setup. Plus, I want to see a motherboard with both SLI and CrossFire on it.

  • FordFreak - Friday, December 23, 2005 - link

    Not that many people run games at that resolution, so it's still more of a PR gimmick than anything else. If you have the money to buy a monitor that does 1600x1200, you have the money to buy two overpriced video cards.
  • alienz - Monday, December 26, 2005 - link

    LOL, are you kidding? My 6 year old Dell 19" CRT does 1600x1200 resolution, and it does it quite well.
  • bob661 - Monday, December 26, 2005 - link

    I was about to say my Samsung 19" CRT does 1600x1200 also, but I run it at 1280x960. I also play my games at 1280x960. A single 7800GT does it well even in COD2, but SLI 7800GTs would be REALLY nice.
  • Braznor - Thursday, December 22, 2005 - link

    LOL, this time the immovable object (ATI) will collide with the unstoppable force (Nvidia). Nvidia's upcoming G71 launch should keep the Canadians nervous.
  • coldpower27 - Monday, December 26, 2005 - link

    Yes, it might have been better to simply say "keep ATI nervous", unless everyone is all right with saying "ATI's upcoming R580 should keep the Americans nervous."
  • poohbear - Sunday, December 25, 2005 - link

    "the canadians nervous"???? wtf, is this the War of 1812?!? doesn't matter if they're Norwegian or South African, competition is good. this has nothing to do w/ nationality.
  • bob661 - Monday, December 26, 2005 - link

    Ho hum. ATI is a Canadian company.
  • Griswold - Saturday, December 24, 2005 - link

    Or the other way around.
  • Brian23 - Thursday, December 22, 2005 - link

    aw man, that sucks. I thought they were going to fix this whole master card/slave card thing with the R580.

    I hate the fact that you have to buy a special card if you want crossfire.
  • Chadder007 - Thursday, December 22, 2005 - link

    :werd:
    Same here, master card = lame
  • Shintai - Thursday, December 22, 2005 - link

    Agree, but on the other hand... how many of the consumer base are gonna use CF or SLI anyway? It's more a PR gimmick than something people buy.
  • MrSmurf - Tuesday, December 27, 2005 - link

    It's not a PR gimmick.

    It's an easy way to:

    1. Gain the performance crown (at least in Nvidia's case; ATI just had to respond)
    2. Get more people to buy their chipsets (mobos)
    3. Sell more GPUs
    4. Get people to notice their company

    SLI and CF are clearly designed for hardcore gamers, and they're a cheap way for everyone to upgrade their system without having to get rid of their old card.
