
76 Comments


  • Pete - Monday, April 19, 2004 - link

    Shinei,

    I did not know that. </Johnny Carson>

    Derek,

    I think it'd be very helpful if you listed the game version (you know, what patches have been applied) and map tested, for easier reference. I don't even think you mentioned the driver version used on each card, quite important given the constant updates and fixes.

    Something to think about ahead of the X800 deadline. :)
    Reply
  • zakath - Friday, April 16, 2004 - link

    I've seen a lot of comments on the cost of these next-gen cards. This shouldn't surprise anyone... it has always been this way. The market for these new parts is small to begin with. The best thing the next gen does for the vast majority of us non-fanbois-who-have-to-have-the-bleeding-edge-part is that it brings *today's* cutting-edge parts into the realm of affordability. Reply
  • Serp86 - Friday, April 16, 2004 - link

    Bah! My almost-2-year-old 9700 Pro is good enough for me for now. I think I'll wait for NV50/R500....

    Also, a better investment for me is a new monitor, since the 17" one I have only supports 1280x1024, and I never run it that high since the 60Hz refresh rate makes me go crazy.
    Reply
  • Wwhat - Friday, April 16, 2004 - link

    that was to Brickster, neglected to mention that Reply
  • Wwhat - Friday, April 16, 2004 - link

    Yes you are alone Reply
  • ChronoReverse - Thursday, April 15, 2004 - link

    Ahem, this card has been tested by some people with a high-quality 350W power supply and it was just fine.


    Considering that anyone who could afford a 6800U would have a good power supply (Thermaltake, Antec or Enermax), it really doesn't matter.


    The 6800NU uses only one molex.
    Reply
  • deathwalker - Thursday, April 15, 2004 - link

    Oh my god... $400 and you can't even put it in 75% of the systems on people's desks today without buying a new power supply at a cost of nearly another $100 for a quality PSU... I think this just about has to push all the fanatics out there over the limit... no way in hell are you going to notice the performance improvement in a multiplayer game over a network... when does this madness stop? Reply
  • Shinei - Thursday, April 15, 2004 - link

    Pete, MP2 DOES use DX9 effects; mirrors are disabled unless you have a PS2.0-capable card. I'm not sure why, since AvP1 (a DX7 game) had mirrors, but it does nonetheless. I should know, since my Ti4200 (DX8.1 compatible) doesn't render mirrors as reflective even though I checked the box in the options menu to enable them...
    Besides, it does have some nice graphics that can bog a card down at higher resolutions/AA settings. I'd love to see what the game looks like at 2048x1536 with 4xAA and maxed AF with a triple buffer... Or even a more comfortable 1600x1200 with same graphical settings. :D
    Reply
  • Da3dalus - Thursday, April 15, 2004 - link

    I'd like to see benchmarks of Painkiller in the upcoming NV40 vs R420 tests... Reply
  • Brickster - Thursday, April 15, 2004 - link

    Am I the only one who thinks Nvidia's Nalu is the MOST bone-able cartoon out there?

    Oy, get the KY!
    Reply
  • Warder45 - Thursday, April 15, 2004 - link

    Did any reviews try to overclock the card? Is it not possible with the test card? Reply
  • DonB - Thursday, April 15, 2004 - link

    Would have been better if it had a coax cable TV input + TV tuner. For $500, I would expect a graphic card to include EVERYTHING imaginable. Reply
  • Pete - Thursday, April 15, 2004 - link

    Shinei #37,

    "Speaking of DX9/PS2.0, what about a Max Payne 2 benchmark?"

    MP2 doesn't use DX9 effects. The game requires DX9 compatibility, but only DX8 compliance for full effects.

    Xbit-Labs has a ton of benches of next-gen titles as well, and is worth checking out. NV40 certainly redeems itself in the HL2 leak. :)
    Reply
  • Wwhat - Thursday, April 15, 2004 - link

    Anybody happen to know if it's possible to use a second (old) PSU to run it? You can pick up cheap 235-watt PSUs, and the extra connectors and power would help.
    I'm not sure it won't cause 'sync' problems though; a small difference between the rails of two PSUs would cause one to drain the other if the card's connectors aren't decoupled enough from the AGP port.



    Reply
  • Pumpkinierre - Thursday, April 15, 2004 - link

    Agree with you Trog #59 on the venting. Also, with DX9.0c having fp32 as spec, does this mean that the FX series cards redeem themselves? (The earlier DX9 spec was fp24, which wasn't present on the FX GPUs, causing a juggling act between fp16 and fp32 to match performance and IQ.) Still, full fp32 on the FX cards might be too slow.
    Reply
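The fp16/fp32 juggling described in the comment above comes down to mantissa width: IEEE-754 half precision (fp16) keeps roughly 3 decimal digits, single precision (fp32) roughly 7. A stdlib-only Python sketch of the gap (my own illustration; nothing here is from the article or comments):

```python
import struct

def roundtrip_fp16(x):
    """Round a Python float through IEEE-754 half precision (fp16)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def roundtrip_fp32(x):
    """Round a Python float through IEEE-754 single precision (fp32)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

val = 0.1234567
err16 = abs(roundtrip_fp16(val) - val)  # on the order of 1e-5
err32 = abs(roundtrip_fp32(val) - val)  # on the order of 1e-9
```

That difference in rounding error is the IQ-versus-speed tradeoff the FX drivers had to manage shader by shader.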
  • mrprotagonist - Thursday, April 15, 2004 - link

    What's with all the cheesy comments before the benchmarks? Anyone? Reply
  • Cygni - Thursday, April 15, 2004 - link

    "what mobo and mobo drivers were used? i hear that the nforce2 provides an unfair performance advantage for nvidia"

    The test was on an Athlon 64 3400+ system, so I doubt it was using an nForce2. But yeah, I agree, the system specs were short. More details are required.
    Reply
  • Brickster - Wednesday, April 14, 2004 - link

    Derek, what was that Monitor you used?

    Thanks!
    Reply
  • Deanz79 - Wednesday, April 14, 2004 - link

    AWESOME!! I wouldn't mind borrowing the card for the night :P Reply
  • TrogdorJW - Wednesday, April 14, 2004 - link

    There are a few things I take away from all the previews of the 6800 Ultra.

    One is that ATI is going to be hard pressed to actually top it. Both will have 16x1 designs, but I don't think ATI will have the 32x0 option, which might be important for games with lots of shadows. (I believe the ATI cards are going to be around 180 million transistors, which leads me to believe that they will not have quite as many features.) I also doubt that ATI will actually support fp32 this time around, which axes DX9.0c/PS3 support for them. That may or may not really matter.

    The next thing is sort of related to the first point: Nvidia now has more features than ATI, but there are still some bugs to work out. DX9 games that were optimized for NV3x seem to be dropping quality on the 6800U. Hopefully the fix to use fp32 instead of fp16 will be both easy and not result in a major performance drop. We'll have to wait and see, though. Other sites have shown quite a few areas that need driver revs, but that's nothing new. At least with Nvidia, I feel confident their driver team will fix any major issues and probably increase performance a decent amount as well.

    I also agree with someone else that said the previews might be lower clocked than the final release. First, the RAM is spec'ed for 600 MHz, which makes it odd that they're running at 550 MHz. They may not hit 600, but 575 or maybe 585 seems likely (or at the very least that should be an OC'ing option). The core is currently at 400 MHz, and I think they might be able to bump that up a bit more, but 222 million transistors at .13 micron might not go much higher. We'll have to see what some of the shipping cards from GB, A-bit, Asus, etc. offer in terms of OC'ing headroom, as they might offer better cooling solutions.

    Related to the heat and clockspeed, I'm a little shocked at the heatsink/fan design. If they're going to all the trouble of having a huge HSF, I can't see any reason to not switch the direction it blows and have the Ultra version vent the hot air outside the case. Maybe noise was the reason, or component placement, but I would really prefer to have anything that size making use of external venting. It would be like having your power supply sucking air into the case instead of blowing out... Sure, it might cool the PS better, but the case temp would jump dramatically.

    My final thought is that it will be very interesting to see what sort of price and performance can be had from the regular 6800 cards, and even the 6800XT. I didn't think there would be a "soft-mod" option for Nvidia this round, but it appears I was wrong. Unless NVidia has some way of preventing this from being done. Regardless, if the 6800U is going to start at $500 and the 6800 will go for $300, we could be looking at a 6800XT for $200 or so. It should also have at least the performance of the 5950U, and most likely better.

    Incidentally, I'm betting the mid-range cards (i.e. 6500 or 6600 or whatever) will not really be that great, though, as they'll likely trim them down to 2 or 4 vertex pipelines and 4 or 8 pixel pipelines, so they'll end up looking like something in between the 5700U and the 5900XT. And don't look for help from ATI here, as the X300 and X600 look to be renamed 9600SE and 9600XT parts, respectively (a la the Radeon 9000 to 9200 line).
    Reply
  • IamTHEsnake - Wednesday, April 14, 2004 - link

    Whoops! The Radeon 9800 XT only scored 6138 while NV40 scored 12350+ in 3DMark'03. That, Ladies and Gentlemen, is 2x as many points! Reply
  • IamTHEsnake - Wednesday, April 14, 2004 - link

    Wow, I read the review and all I can say is WoW. I read somewhere else that this card scored 12250+ in 3DMark'03 while the 9800 XT scored 8350 on the same system, same set-up. From one generation to the next, a ~47% increase is not bad. not bad at all.


    Come on ATi! I'm rootin' for you!!!
    Reply
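For reference, the generational gains quoted in the two comments above work out as follows (using the commenters' own 3DMark'03 scores, which may differ from the article's figures):

```python
# Percentage increase between two benchmark scores.
def pct_increase(new_score, old_score):
    return 100.0 * (new_score - old_score) / old_score

gain_doubled = pct_increase(12350, 6138)  # ~101%, i.e. roughly "2x as many points"
gain_second = pct_increase(12250, 8350)   # ~47% on the second pair of scores
```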
  • Schadenfroh - Wednesday, April 14, 2004 - link

    what mobo and mobo drivers were used? i hear that the nforce2 provides an unfair performance advantage for nvidia, even though the ati should run at the same speed as on a different motherboard, nvidia just gets an extra boost Reply
  • Warder45 - Wednesday, April 14, 2004 - link

    I want to see the multimedia benches. Hopefully another article with AMD vs Intel. Reply
  • AlexWade - Wednesday, April 14, 2004 - link

    The thing is freakin' huge! I'm willing to bet dollars-to-doughnuts that ATI's new card isn't the size of a football. Even if this huge beast tops in performance, the extra 20 pounds rules out LAN parties.

    I'll admit, the performance is great. But if ATI is smaller and performs near, or slightly below, then that is the one to buy.
    Reply
  • Reflex - Wednesday, April 14, 2004 - link

    Personally I'll wait to see the budget line on these; I refuse to spend more than $200 on a video card. Chances are I'll end up going ATI, however; the 2D video quality is just noticeably better, and most of my time on my PC is spent reading, not gaming.

    Oh well, at least the gamers can be happy again. Too bad the AGP slot is not at the bottom of the motherboard; you could build some interesting externally vented cases if the card could stick that fan outside the case. ;)
    Reply
  • Reliant - Wednesday, April 14, 2004 - link

    Any ideas how the Non Ultra version will perform? Reply
  • segagenesis - Wednesday, April 14, 2004 - link

    I can't agree with #45 more. People rush to judgment when it's no secret that ATI will be coming out with their goods very soon also. "Wow, look, this card is really fast!!! I can't believe it!" Well, this sounds like almost every other graphics card release from ATI or nVidia in the past. To me, nVidia had better have come out with something good after their lackluster GeForce FX 5800, which wasn't anything terribly special. I used to like nVidia a lot (heh, my Ti4600 still runs fine), but when it comes to looking for a new card, I'll pick whichever one is faster *and* has the features I want. With turnoffs like the 2-slot design and now even two power connections required, I'm not sure I'm ready to spend $500 just yet...

    Sorry if I'm obtuse, but if ATI comes out with a part that's either equal (note the key term there) in performance or maybe even slightly slower... I'd go for ATI and the better IQ that so impressed me in the Radeon 9700 series and made me wish for more out of my Ti4600. That and a single-slot/single-power design would probably put me in their boat.

    Fanboy ATI opinion? I've owned nVidia from the Riva TNT to the Ti4600 and many in between.
    Reply
  • Lonyo - Wednesday, April 14, 2004 - link

    #42, the jump from the Ti4600 to the 9700 Pro wasn't good for you? I would have thought finally-playable AA/AF was quite a jump.
    Personally, this seems less of a jump than the 4600 -> 9700.


    And I will reserve judgement on how much of an accomplishment nVidia have made until I see what ATi release.
    If it's of similar power, but maybe has 1 molex, or is a single slot solution, they will have accomplished more.
    It's not just raw performance, we'll have to see how it all stacks up, and how long it takes to release the things!
    Reply
  • ChronoReverse - Wednesday, April 14, 2004 - link

    Some site tested the 6800U on a 350W supply and it worked just fine.


    Myself, I think my Enermax 350W with its enhanced 12V rail will take it just fine as well.
    Reply
  • Regs - Wednesday, April 14, 2004 - link

    Yeah, Nvidia did make one hell of an accomplishment. They just earned a lot of respect back from both fan clubs. You have to respect the development and research that went into this card and the end result turns out to be just as we anticipated if not more.

    I really don't know how anybody could pick a "club" when seeing hardware like this perform so well.

    I'm hoping to see the same results from ATI.

    Just too bad they are some costly pieces of hardware ;)
    Reply
  • araczynski - Wednesday, April 14, 2004 - link

    nice to FINALLY see a universally quantifiable performance increase from one generation to the next.

    but the important thing is how it competes with the x800 from ati, not against older cards.

    as for the power supply, i think the hardcore crowd these are geared at already has more than enough power, and quite frankly i would be surprised if these wouldn't work fine on a solid 350W from a reputable source (i.e. not your 350W ps for $10 from some 'special' sale).

    They're being conservative, knowing that many people have crappy power supplies and don't know better.
    Reply
  • klah - Wednesday, April 14, 2004 - link

    "Anyone know when it ships to retail stores?"

    http://www.eetimes.com/semi/news/showArticle.jhtml...

    "GeForce 6800 Ultra and GeForce 6800 models, are currently shipping to add-in-card partners, OEMs, system builders and game developers. Retail graphics boards based on the GeForce 6800 models are scheduled for release in the next 45 days."
    Reply
  • Jeff7181 - Wednesday, April 14, 2004 - link

    This has me a bit curious... maybe I didn't read close enough... but is this the 6800 or the 6800 Ultra? Reply
  • saechaka - Wednesday, April 14, 2004 - link

    wow impressive. i really want one. wonder if it will run ok with my 380w powersupply Reply
  • Cygni - Wednesday, April 14, 2004 - link

    Personally, I'm very impressed, and I haven't had an Nvidia product in my main gaming rig since my GeForce256. The card may be huge, power hungry, hot, and loud (maybe), but that is some SERIOUS performance.

    How long has it been since Nvidia has had a top-end card that DOUBLED the performance of the last top-end card? Pretty awesome, I think. I don't have the money to pick one up, but hopefully the mid/low end gets some love from both ATI and Nvidia as well. The 9200/9600/5200/5600 don't really appeal to me... not enough of a performance leap over a $20 8500!
    Reply
  • Marsumane - Wednesday, April 14, 2004 - link

    This card owns... Anyone know when it ships to retail stores? Guesses even? Reply
  • SpaceRanger - Wednesday, April 14, 2004 - link

    I'd like to see what ATI comes up with before I make my decision. I rushed to judgement back when the GF4 TI4600 came out, and regretted making the quick call to buy. If I don't have to get a new PSU for the ATI solution, I'll consider it, even if performance is 5-10FPS slower. Adding 100 bucks to the already costly 500 for the card doesn't justify the expenditure. Reply
  • gordon151 - Wednesday, April 14, 2004 - link

    AtaStrumf is so right. More than likely you'll be able to buy the X800s before you can buy this. Reply
  • Shinei - Wednesday, April 14, 2004 - link

    Well, I'm sold. Yeah, that sounds fanboyish, but this thing is a solid performer and doesn't require me to completely replace my display drivers... Even if ATI wins by five FPS and has a lens flare in a forgotten corner of a screenshot that you have to stare at for ten minutes to spot, my money is going to NV40--assuming the prices come down a little. ;)
    Speaking of DX9/PS2.0, what about a Max Payne 2 benchmark? I'm curious what NV40 can do on that game with maxed out everything... :)
    Reply
  • skiboysteve - Wednesday, April 14, 2004 - link

    i love anandtech's deep technical reviews, but y'all did nowhere near enough testing; the xbit article does a hell of a lot more testing, 48 pages!

    http://www.xbitlabs.com/articles/video/display/nv4...

    the card absolutely demolishes everything.

    the anand tests don't show nearly the domination the xbit ones do...
    Reply
  • AtaStrumf - Wednesday, April 14, 2004 - link

    I find it really funny when people say that they will wait until ATi releases their X800 to make up their buying decisions.

    It's not like you can run out and BUY this card right now or tomorrow. Of course you will wait. You don't really have a choice :)
    Reply
  • ChronoReverse - Wednesday, April 14, 2004 - link

    The Tech Report tested the total power draw of this thing, and it was only slightly higher than the 5950's (both of which draw more than the 9800XT).


    So it seems the recommendation isn't actually necessary (and my Enermax's enhanced 12V lines will take it easily).
    Reply
  • Pete - Wednesday, April 14, 2004 - link

    mkruer #27, all the reviews I've read mention $500 for the 6800U, and $299 for a 12-pipe 128MB 6800. Reply
  • DerekWilson - Wednesday, April 14, 2004 - link

    #27,

    The 6800 Ultra (which we tested) will be priced at $500

    The 6800 (with 12 pipes rather than 16) will be priced at $300
    Reply
  • Pete - Wednesday, April 14, 2004 - link

    quikah #26: FarCry comparison screens are at HOCP.

    http://hardocp.com/article.html?art=NjA2LDU=

    Apparently PS3 wasn't enabled, but the 6800U looks better than the 5950U running PS2. It's still uglier than the 9800XT, sadly. Banding abounds, both here and in FiringSquad's Lock-On screens. Puzzling, really. If the 6800U really runs FP32 as fast as FP16 within memory limits, I wonder if all it will take to get IQ on a level with ATi is forcing the 6800U to run the ATi path or removing the NV3x path's _pp hints.
    Reply
  • mkruer - Wednesday, April 14, 2004 - link

    Well, I hope this card is on par with ATi's, or vice versa. ATi is planning to sell their best at a $500 pop and Nvidia is selling theirs at $400. How long do you think ATi is going to sell their card for that price if the performance is virtually identical? Finally, on the subject of power: makes me wonder why PCI-Express doesn't supply enough power from the slot. VPUs are getting to the point that they are just as powerful and complex as their CPU brethren, and will require the same power delivery as a CPU. Someone didn't do their homework, I guess. Well, here's hoping that it will be in the next specification.
    Reply
  • quikah - Wednesday, April 14, 2004 - link

    Can you post some screenshots of Far Cry? The demo at the launch event was pretty striking, so I am wondering if PS 3.0 was actually enabled, since you didn't see any difference. Reply
  • Novaoblivion - Wednesday, April 14, 2004 - link

    Wow, nice looking card. I just hope the new ATI doesn't kick its ass lol Reply
  • Rudee - Wednesday, April 14, 2004 - link

    When you factor in the upgrade price of a power supply and a top of the line CPU, this is going to be one heck of an expensive gaming experience. People will be wise to wait for ATI's newest flagship before they make any purchase decisions. Reply
  • Pete - Wednesday, April 14, 2004 - link

    Nice review, Derek. Some impressive performance, but now I'm expecting more from ATi in both performance (due to higher clockspeed) and IQ (I'm curious if ATi improved their AF while nV dropped to around ATi's current level). I also have a sneaking suspicion nV may clock the 6800U higher at launch, but maybe they're just giving themselves room for 6850U and beyond (to scale with faster memory). But a $300 12-pipe 128MB 6800 should prove interesting competition to a ~$300 256MB 9800XT.

    The editor in me can't refrain from offering two corrections: I'm pretty sure you meant to say Jen Hsun (not "Jensen") and well nigh (not "neigh").
    Reply
  • Mithan - Wednesday, April 14, 2004 - link

    Looks like a fantastic card, however I will wait for the ATI numbers first :)


    PS:
    Thanks for including the 9700 Pro. I own that and it was nice to see the difference.
    Reply
  • dawurz - Wednesday, April 14, 2004 - link

    Derek, could you post the monitor you used (halo at 2048 rez), and any comments on the look of things at that monstrous a resolution?

    Thanks.
    Reply
  • rainypickles - Wednesday, April 14, 2004 - link

    does the size and the power requirement basically rule out using this beast of a card in a SFF machine? Reply
  • Damarr - Wednesday, April 14, 2004 - link

    It was nice to see the 9700 Pro included in the benchmarks. Hopefully we'll see the same with the X800 Pro and XT so there can be a side-by-side comparison (should make picking a new card easier for 9700 Pro owners like myself :) ). Reply
  • DerekWilson - Wednesday, April 14, 2004 - link

    We are planning on testing the actual power draw, but until then, NVIDIA is the one that said we needed to go with a 480W PS ... even making that suggestion limits their target demographic.

    Though, it could simply be a limitation of the engineering sample we were all given... We'll just have to wait and see.
    Reply
  • Regs - Wednesday, April 14, 2004 - link

    Wow, very impressive. Yet very costly. I'm very displeased with the power requirements, however. I'm also hoping newer drivers will boost performance even more in games like Far Cry. I was hoping to see at least 60 FPS @ 1280x1024 w/ 4x/8x. Even though it's not really needed for such a game and might be overkill, it would have knocked me off my feet enough that I could overlook the PSU requirement. But ripping my system apart yet again for just a video card seems unreasonable at the asking price of 400-500 dollars. Reply
  • Verdant - Wednesday, April 14, 2004 - link

    i don't think the power issue is as big as some make it out to be; some review sites used a 350W PSU with two connectors on the same lead and had no problems under load Reply
  • dragonballgtz - Wednesday, April 14, 2004 - link

    I can't wait till December when I build me a new computer and use this card. But maybe by then the PCI-E version. Reply
  • DerekWilson - Wednesday, April 14, 2004 - link

    #11 you are correct ... i seem to have lost an image somewhere ... i'll try to get that back up. sorry about that. Reply
  • RyanVM - Wednesday, April 14, 2004 - link

    Just so you guys know, Damage (Tech Report) actually used a watt meter to determine the power consumption of the 6800. Turns out it's not much higher than a 5950.

    Also, it makes me cry that my poor 9700Pro is getting more than doubled up in a lot of the benchmarks :(
    Reply
  • CrystalBay - Wednesday, April 14, 2004 - link

    Hi Derek, What kind of voltage fluctuations were you seeing... just kinda curious about the PSU... Reply
  • PrinceGaz - Wednesday, April 14, 2004 - link

    A couple of comments so far...

    page 6 "Again, the antialiasing done in this unit is rotated grid multisample" - nVidia used an ordered grid before, only ATI previously used the superior rotated grid.

    page 8 - both pictures are the same, I think the link for the 4xAA one needs changing :)

    Can't wait to get to the rest :)
    Reply
  • ZobarStyl - Wednesday, April 14, 2004 - link

    dang, I've got a 450W... sigh. That power consumption is really gonna kill the upgradability of this card (but then again the x800 is slated for double molex as well). I know it's a bit strange, but I'd like to see which of these top-end cards can provide the best dual-screen capability... any GPU worth its salt comes with dual-screen capabilities, and my dually config needs a new vid card, and I don't even know where to look for that...

    and as for cost... these cards blow away the 9800XT and 5950... it won't be 3-4 fps above the other that makes me pick between an x800 and a 6800... it will be the price. Jeez, what are they slated to hit the market at, 450?
    Reply
  • Icewind - Wednesday, April 14, 2004 - link

    Upgrade my PSU? I think not Nvidia! Lets see what you got Ati Reply
  • LoneWolf15 - Wednesday, April 14, 2004 - link

    It looks like NVidia has listened to its customer base. I'm particularly interested in the hardware MPEG 1/2/4 encoder/decoder.

    Even so, I don't run anything that comes close to maxing my Sapphire Radeon 9700, so I don't think I'll buy a new card any time soon. I bought that card as a "future-proof" card like this one is, and guess what? The two games I wanted to play with it have not been released yet (HL2 and Doom3, of course), and who knows when they will be? At the time, Carmack and the programmers for Valve screamed that this would be the card to get for these games. Now they're saying different things. I don't game enough any more to justify top-end cards; frankly, an All-In-Wonder 9600XT would probably be the best current card for me, replacing the 9700 and my TV Wonder PCI.
    Reply
  • TheAudit - Wednesday, April 14, 2004 - link

    Nice! Reply
  • MemberSince97 - Wednesday, April 14, 2004 - link

    You guys shoulda done a mini review of that 510W PSU that was used.... Reply
  • Verdant - Wednesday, April 14, 2004 - link

    looks awesome congrats to nvidia on raising the bar!

    personally i don't game very much, and the only reason my Geforce2 was replaced was for the dual-heads of the Radeon 9000

    but as an enthusiast, any leaps make me excited :p

    can't wait to see ATI's new cards
    Reply
  • Lonyo - Wednesday, April 14, 2004 - link

    Not as impressive as other sites made it look in many circumstances.
    But still quite a boost in performance.
    Reply
  • NYHoustonman - Wednesday, April 14, 2004 - link

    Jesus... Ya, looks like I'll be upgrading before college... Reply
  • gordon151 - Wednesday, April 14, 2004 - link

    Damn, two independent cable lines and a 480W PSU. Good thing it kills in performance, but still too pricey for me. Bring on the 6800XT for us broke people :P. Reply
  • KristopherKubicki - Wednesday, April 14, 2004 - link

    Impressive, green one.

    Hope it doesn't cost $500.

    Kristopher
    Reply
