POST A COMMENT

127 Comments

Back to Article

  • multiblitz - Sunday, June 26, 2005 - link

    It would be great if you could do a comparison between the 6800 and the 7800 in video/DVD-playback quality similar to the comparison between the X800 and the 6800 you did last year. Reply
  • at80eighty - Saturday, June 25, 2005 - link

    OMG! I've never seen so many bitching whiners come outta the woodworks like this!!

    You A-holes oughta remember that this site has been kept free

    F
    R
    E
    E

    The editors owe YOU nothing. At all.

    AT team - accidents happen. Keep up the great work!

    /#121 : well said. Amazing how these turds don't realise that the knife cuts both ways...
    Reply
  • mrdeez - Friday, June 24, 2005 - link

    #124
    You can stfu too...j/k..point taken .

    I guess the real issue for me is that this card is a beast but ill never have it in my sli rig......i want all settings maxed at playable resolutions thats just me.........and i will not go back to crt...lol crt thats was lame dude
    Reply
  • Momental - Friday, June 24, 2005 - link

    #122 The problem with your solution regarding "all of us just getting two 6800U's" works perfectly for those with an SLI-capable board, yes? Some of us, like myself, anticipated the next generation of GPUs like the 7800 series and opted to simply upgrade to one of those when the dust settled and prices slid back a bit.

    Additionally, telling someone to "STFU" isn't necessary. We can't hold a conversation if we're all silent. Knowhuddamean, jellybean? Hand gestures don't work well over the internet, but here's one for you..........
    Reply
  • SDA - Friday, June 24, 2005 - link

    LCD gamers shouldn't be bothering with new graphics cards, they should get new monitors.

    kidding, I have nothing against LCDs. The real advantage of showing the card run at 2048x1536 is that it lets you see how well the card scales to more stressful scenarios. A card that suddenly gets swamped at higher resolutions will probably not fare well in future games that need more memory bandwidth.

    On a side note, you can get a CRT that will run 2048x1536 @ a reasonable refresh for about $200 shipped (any Sony G520 variant, such as the Dell P1130). The only things that would actually be small in games are the 2D objects that have set pixel sizes, everything else looks beautiful.
    Reply
  • mrdeez - Friday, June 24, 2005 - link

    #121
    lol ty for your insight....anyway like i said this card is not for lcd gamers as most have a 12x10 or 16x12.....so what purpose does this card have??answer me this batman and you have the group that should buy this card -otherwise, the rest of us should just get 2 6800u....this card is geared more for workstation graphics not gaming.....unless you game on a hi def crt and even then max res would be 1920 by 1080i..or something like that.....
    Reply
  • SDA - Friday, June 24, 2005 - link

    #116, if people in the comments thread are allowed to give their opinions, why shouldn't #114 give his too? Surely even an illiterate like you should realize that arguing that everyone is entitled to his or her own argument means that the person you're arguing with is too.

    #119, some people have different requirements than others. Some just want no visible blur, others want the best contrast ratio and color reproduction they can get.
    Reply
  • bob661 - Thursday, June 23, 2005 - link

    #118
    Oh yeah. The monitor goes up to 16x12.
    Reply
  • bob661 - Thursday, June 23, 2005 - link

    #118
    I play BF2 on a Viewsonic VP201b (20.1") at work and it's very good. No streaking or ghosting. Video card is a 6800GT. I play at 1280x960.
    Reply
  • Icehawk - Thursday, June 23, 2005 - link

    Well, I for one think 1280x1024 is pretty valid as that is what a 19" LCD can do. I'd just want to see a maxed out 12x10 chart to see how it does - I know a 6800 can't do it for every game with full AA and AF. Otherwise I agree - a 12x10 with no options isn't going to show much with current games.

    See, I'm considering swapping my two 21" CRTs for two 19" LCDs - and they won't do more than 12x10. I'd love to do two 20-21" LCDs but the cost is too high and fast panels aren't to be found. 19" is the sweet spot right now IMO - perhaps I'm wrong?

    Thanks AT for a nice article - accidents happen.
    Reply
  • mrdeez - Thursday, June 23, 2005 - link

    also: maybe gaming in hi def......on a big screen Reply
  • mrdeez - Thursday, June 23, 2005 - link

    #114
    Dude just stfu......we are here to comment what we want and say it freely......minus threats and name calling....as i said before this card is not for gamers...maybe elite gamers that have a monitor that does these resolutions but most gamers i know have went to lcd and i have yet to see any lcd[im sure there are some]do these resolutions so this card really is a card for crt elite gamers......lol with those resolutions on a 21 inch monitor you would need binoculars as glasses to play the game....the tanks on bf2 would be ant like small....
    Reply
  • bob661 - Thursday, June 23, 2005 - link

    #114
    I am SO glad that Anand remains in business despite all the bitches that are in these comment sections.
    Reply
  • Locut0s - Thursday, June 23, 2005 - link

    Those who are complaining that they should have reviewed at lower resolutions should think for a minute. First of all you are talking about a 600 buck card; most people who have that kind of money to spend on a card also have a monitor that is capable of 1600x1200 or better. Also, benchmarking at any lower resolution on a card like this in today's games is almost pointless, as you are almost entirely CPU bound at those resolutions. Do you really want to see page after page of 1024x768 charts that differ by only 4-5 percent at the most?

    Also give the editors a break when it comes to writing these articles. As others have said, this is not a subscription site, and given the number of visitors and the quality of the articles I'm amazed, and gratified, that the people of Anandtech keep putting out article after long article despite all the whining that goes on over spelling mistakes and graph errors that more often than not are corrected within a few hours.
    Reply
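The CPU-bound point above can be sketched with a toy model: observed frame rate is roughly the minimum of what the CPU and the GPU can each sustain, so at low resolutions fast cards all collapse to the same CPU ceiling. All frame rates below are made-up illustrative numbers, not review data.

```python
# Toy model of CPU-limited benchmarking: observed fps ~ min(cpu_fps, gpu_fps).
# Every number here is hypothetical, chosen only to illustrate the effect.

CPU_CEILING_FPS = 90.0  # hypothetical: frames/sec the CPU can feed the GPU

# Hypothetical GPU-only throughput by resolution
gpu_fps = {
    "card_A": {"1024x768": 240.0, "1600x1200": 110.0, "2048x1536": 70.0},
    "card_B": {"1024x768": 180.0, "1600x1200": 80.0,  "2048x1536": 45.0},
}

for res in ("1024x768", "1600x1200", "2048x1536"):
    a = min(CPU_CEILING_FPS, gpu_fps["card_A"][res])
    b = min(CPU_CEILING_FPS, gpu_fps["card_B"][res])
    print(f"{res}: card_A {a:.0f} fps, card_B {b:.0f} fps")
# At 1024x768 both cards hit the same 90 fps CPU wall; the gap between
# them only appears once the resolution makes the GPU the bottleneck.
```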
  • SDA - Thursday, June 23, 2005 - link

    That's a really great comparison, #112, especially seeing as how we pay for AnandTech and any problems with it could leave us stranded in the middle of nowhere. And so witty, too!

    Jarred, ah, thanks.
    Reply
  • Questar - Thursday, June 23, 2005 - link

    "Our Web Editor is on vacation and we are all doing our own HTML and editing for the next 10 days. In our usual process, the article goes from an Editor to the Web Editor who codes the article, checks the grammar, and checks for obvious content errors. Those steps are not in the loop right now."


    " do know Derek as a very conscientious Editor and I would ask that you please give him, and the rest of us, a little slack this next week and a half"


    Dear Mr. Fink,
    I am sorry to hear about the problems you have had with your vehicle breakdowns. You see, our quality inspector was on vacation that week, so we just shipped our vehicles straight off the assembly line. Please cut us a little slack, as we usually build much better vehicles.

    Sincerely,
    Buncha Crap,
    CEO
    Crappy Motors Inc.

    Reply
  • frisbfreek - Thursday, June 23, 2005 - link

    My question is: how did they do hi res 2048x1536 when the card is only single-link DVI? Shouldn't either an analog connection or dual-link be necessary? Reply
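The bandwidth question above can be checked with simple arithmetic: single-link DVI tops out at a 165 MHz pixel clock, so 2048x1536 at 60 Hz doesn't fit even with reduced blanking, which suggests the high-res testing ran over the analog VGA output. The blanking-overhead factors below are rough approximations, not exact VESA timings.

```python
# Rough pixel-clock check: does a display mode fit in single-link DVI (165 MHz)?
# Blanking overheads are approximate (~1.25 for CRT-style timings, ~1.12 for
# CVT reduced blanking as typically used by LCDs).

def pixel_clock_mhz(h, v, refresh_hz, blanking_overhead=1.25):
    """Estimate the required pixel clock in MHz for an h x v mode."""
    return h * v * refresh_hz * blanking_overhead / 1e6

SINGLE_LINK_DVI_MHZ = 165.0

for mode in [(1600, 1200, 60), (1920, 1200, 60), (2048, 1536, 60)]:
    clk = pixel_clock_mhz(*mode, blanking_overhead=1.12)  # best case
    verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}: ~{clk:.0f} MHz -> {verdict} single-link DVI")
```

Even in the best case, 2048x1536@60 needs roughly 210 MHz, well past the single-link limit, while a CRT's analog input has no such cap (the card's RAMDAC handles far higher pixel clocks).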
  • yacoub - Thursday, June 23, 2005 - link

    #108 - what do you want, a $2000 CPU to go with your $1200 in GPUs? =P Reply
  • CtK - Thursday, June 23, 2005 - link

    so dual display is still not available with dual 7800s?!?!?!?!? Reply
  • smn198 - Thursday, June 23, 2005 - link

    Come on Intel & AMD. Keep up! Reply
  • WaltC - Thursday, June 23, 2005 - link

    I found this remark really strange and amusing:

    "It's taken three generations of revisions, augmentation, and massaging to get where we are, but the G70 is a testament to the potential the original NV30 design possessed. Using the knowledge gained from their experiences with NV3x and NV4x, the G70 is a very refined implementation of a well designed part."

    Oh, please...nV30 was so poor that it couldn't even run at its factory speeds without problems of all kinds--which is why nVidia officially cancelled nV30 production after shipping a mere few thousand units. JHH, nVidia's CEO went on record saying, "nV30 was a failure" [quote, unquote] at the time. nV30 was [i]not[/i] the foundation for nV40, let alone the G70.

    Indeed, if anything could be said to be foundational for both nV40 and G70, it would be ATi's R3x0 design of 2002. G70, imo, has far more in common with R300 than it does nV30. nV30, if you recall, was primarily a DX8 part with some hastily bolted on DX9-ish add-ons done in response to R300 (fully a DX9 part) which had been shipping for nine months prior to nV30 getting out of the door.

    In fact, ATi owes its meteoric rise to #1 in the 3d markets over the last three years precisely to the R3x0 products which served as the basis for its later R4x0 architectures. Good riddance to nV3x, I say.

    I'm always surprised at the short and selective memories displayed so often by tech writers--really makes me wonder, sometimes, whether they are writing tech copy for their readers or PR copy at the behest of specific companies, if you know what I mean.
    Reply
  • JarredWalton - Thursday, June 23, 2005 - link

    98 - As far as I know, the power was measured at the wall. We use a device called "Kill A Watt", and despite the rather lame name, it gives accurate results. It's almost impossible to measure the power draw of any single component without some very expensive equipment - you know, the stuff that AMD and Intel use for CPUs. So under load, the CPU and GPU (and RAM and chipset, probably) are using far more power than at idle. Reply
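Since the readings come from the wall, any difference between two configurations includes PSU losses, so the at-the-wall delta overstates the DC-side delta. A minimal sketch of that correction, assuming an illustrative 75% PSU efficiency (the actual efficiency of the test rig's PSU is not stated):

```python
# Wall-socket readings (e.g., from a Kill A Watt) include PSU conversion
# losses, so a wall-power difference between two cards overstates the
# DC-side difference. The 75% efficiency here is an assumption for
# illustration, not a measured figure.

def dc_draw_watts(wall_watts, psu_efficiency=0.75):
    """Estimate DC power delivered to components from an AC wall reading."""
    return wall_watts * psu_efficiency

# Hypothetical wall readings for two graphics card configurations
wall_a, wall_b = 285.0, 245.0

delta_wall = wall_a - wall_b
delta_dc = dc_draw_watts(wall_a) - dc_draw_watts(wall_b)

print(f"Wall delta: {delta_wall:.0f} W, estimated DC delta: {delta_dc:.0f} W")
```

With these made-up numbers, a 40 W gap at the wall corresponds to only about 30 W of actual component draw, which is why per-component power is so hard to pin down without instrumenting the DC rails directly.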
  • PrinceGaz - Thursday, June 23, 2005 - link

    I agree, starting at 1600x1200 for a card like this was a good idea. If your monitor can only do 1280x1024, you should consider getting a better one before buying a card like the 7800gtx. As a 2070/2141 owner myself, I know that a good monitor capable of high resolutions is a great investment that lasts a helluva lot longer than graphics cards, which are usually worthless after four or five years (along with most other components).

    I'm surprised that no one has moaned about the current lack of an AGP version, to go with their Athlon XP 1700+ or whatever ;)
    Reply
  • Johnmcl7 - Thursday, June 23, 2005 - link

    I think it was spot on to have 1600x1200 as the minimum resolution, given the power of these cards I think 1024x768, no AA/AF results for 3Dmark2003/2005 which have been thrown around are a complete waste of time.

    John
    Reply
  • Frallan - Thursday, June 23, 2005 - link

    Good review... And re: the NDA deadlines and the sleepless nights - don't sweat it if a few mistakes are published. The readers here have their heads screwed on the right way and will find the issues soon enough. And for everyone that does not do 12*16 or 15*20 the answer is simple - U Don't Need The Power!! Save your hard-earned money and get a 6800GT instead. Reply
  • Calin - Thursday, June 23, 2005 - link

    Maybe if you could save the game, change the settings and reload it you could obtain images from exactly the same positions. In one of the fence images, the distance to the fence is quite a bit different in different screenshots Reply
  • Calin - Thursday, June 23, 2005 - link

    You had an 7800 SLI? I hate you all
    :p
    Reply
  • xtknight - Thursday, June 23, 2005 - link

    Edit: last post correction: actually 21-page report! Reply
  • xtknight - Thursday, June 23, 2005 - link

    Jeez...a couple spelling errors here and there...who cares? I'd like to see you type up a 12-page report and get it out the door in a couple days with no grammatical or spelling errors, especially when your main editor is gone. Remember that English study that showed the human brain interpreted words based on patterns and not spelling?

    I did read the whole review, word-for-word, with little to no trouble. There was not a SINGLE thing I had trouble comprehending. It's a better review than most sites have done which test lower resolutions. I love the non-CPU-limited benchmarks here.

    One thing that made me chuckle was "There is clearly a problem with the SLI support in Wolfenstein 3D". That MS-DOS game is in dire need of SLI. (It's abbreviated Wolfenstein: ET. Wolf3D is an oooold Nazi game.)
    Reply
  • SDA - Thursday, June 23, 2005 - link

    Derek or Jarred or Wesley or someone:

    Did you measure system power consumption as how much power the computer drew from the wall, or how much power the innards drew from the PSU?


    #95, it's a good thing you know enough about running a major hardware site to help them out with your advice! :-)
    Reply
  • mrdeez - Thursday, June 23, 2005 - link

    Derek-
    Please post benches with resolutions that are commonly used or this article becomes a workstation graphics card article and not one for gamers....I mean really 2046x3056 or whatever the hell...lol...#1 who games at that res??? While this card is powerful it should be mentioned that unless you use a res over 1600x1200 this card is unnecessary.......lol those were some ridiculous resolutions though.......and again post some benches with 1280x1024 for us lcd'ers.....
    Reply
  • Shinei - Thursday, June 23, 2005 - link

    #95: Did you pay to read this article? I know I didn't...

    #94: I guess you missed the part where they said that all resolutions below 1600x1200 were essentially identical in performance? If you only play in 1024x768, why are you reading a review about a $600 video card--go buy a 6600GT instead.
    Reply
  • jeffrey - Wednesday, June 22, 2005 - link

    #83

    Has the staff at Anandtech never heard of "Vacation Coverage"?

    The excuse of your Web Editor being on vacation is, in reality, an admission of improper planning.

    A major hardware site that is dedicated to cutting-edge technology should have planned better. NVIDIA launches new high-end GPUs only about 2-3 times a year at most.

    This was one of the HUGE launches of the year and it was messed up because the team didn't feel it was important enough to get some help for the article. There was damage done to Anandtech today due to the article errors and due to the casual admission in post #83 about not caring to properly cover a "Super Bowl" type of product launch today.

    Save your apologies to the message board, give them to Anand.
    Reply
  • geekfool - Wednesday, June 22, 2005 - link

    How about benchmarking some useful resolutions? This review was essentially useless. Reply
  • JarredWalton - Wednesday, June 22, 2005 - link

    86 - Trust me, most of us other editors saw the article, and quite a few of us offered a helping hand. NDAs are a serious pain in the rear, though. Derek was busy pulling all-nighters and functioning on limited sleep for several days, and just getting the article done is only half the battle. Getting the document and results into the document engine for a large article with a lot of graphs can take quite a while and is an error-prone process.

    The commentary on the gaming benchmarks, for instance, was written in one order and posted in a different order. So please pardon the use of "this is another instance" or "once again" when we're talking about something for the first time. Anyway, I've got a spreadsheet of the benchmarks from early this morning, and other than non-functional SLI in a few games, the numbers appeared more or less correct. The text also didn't have as many typos. Crunch time and getting the final touches put on a major article isn't much fun.

    Thankfully, I'm just the SFF/Guide man, so I'm rarely under NDA pressure. ;)
    Reply
  • robp5p - Wednesday, June 22, 2005 - link

    I would love to see someone start benchmarking in widescreen resolutions! 1920x1200 begs for a fast video card like this. As was pointed out, the only real benefits of the 7800 come at high resolutions, and many people buying high resolution monitors these days are getting widescreen LCD's

    and btw, my 2405fpw is sitting in a box right next to me in the office, begging me to open it up before I get home...this thing will be fun to get home on the subway
    Reply
  • patriot336 - Wednesday, June 22, 2005 - link

    Where is the Monarch and Tiger love?
    TigerDirect http://www.tigerdirect.com/applications/searchtool...

    Monarchcomputer http://www.monarchcomputer.com/Merchant2/merchant....

    Both are $599.00
    Reply
  • BikeDude - Wednesday, June 22, 2005 - link

    $600 for a card that only features single-link DVI outputs? Yeah right, pull the other one nVidia!

    --
    Rune
    Reply
  • ta2 - Wednesday, June 22, 2005 - link

    As a player of eve-online, I can tell you that the game is entirely CPU dependent. On that matter, it will peg any CPU you have at 100%. I mean ANY CPU. Also for the testing, you should use 1600x1200 max AA and AF and go into an area with many player ships on eve-online. I guarantee you will not get 60 FPS. Impractical and unscientific, but it would still give better results than this review. Reply
  • TinyTeeth - Wednesday, June 22, 2005 - link

    I am very impressed by the performance of the new chip. Nvidia seems to have fixed the problems SLI had during the 6800 generation.

    I am also pleased they have managed to deliver the cards so quickly. That also puts some pressure on ATI.
    Reply
  • Alx - Wednesday, June 22, 2005 - link

    Face it, this launch isn't gonna hurt anyone except people with minds too small to accept that there is simply one more option than there was before. If you liked pc gaming yesterday, then there is no reason why this launch should make you stop liking it today. Unless you're a retarded buttbaby who can't handle choices. In that case please get a console and stop coming to this site. Reply
  • mlittl3 - Wednesday, June 22, 2005 - link

    #82, Wesley

    Well that sucks that ya'll have lost your web editor for awhile. Especially when there is so much cool hardware coming out around now. In our research lab, we pass around our publications and conference posters to others in the group so that a fresh pair of eyes see them before they go live or to the journal editor. But of course, everyone else at AT is also busy so oh well.

    Good work guys and I look forward to the "new CPU speed bump" article (or FX-57 for those not under NDAs).

    Mark

    PS. If ya'll have an opening for another web editor, you should hire #84 (ironchefmorimoto). I hear he can cook really well.
    Reply
  • AtaStrumf - Wednesday, June 22, 2005 - link

    Nicely corrected Derek, I think there are just a few typos left, like this one (**):

    Page 20
    Power Consumption
    We measured power consumption between the power supply and the wall. This multiplies essentially amplifies any differences in power draw because the powersupply is not 100% efficient. Ideally we would measure power draw of the card, but it is very difficult **determine** to determine the power draw from both the PCIe bus and the 12V molex connector.

    AND a few double "Performances" in the title (Performance Performance) starting with page 10.

    Nice card nVidia!!! I hope ATi isn't too far behind though. Crossfire --> cheap SLi ;-) I need a nice midrange product out by September when it'll be time to upgrade to a nice E6 stepping S939 A64 and something to take the place of my sweet old GF2 MX (I'm not kidding, I sold my 6600GT AGP, and now I'm waiting for the right time to move to PCIe).
    Reply
  • IronChefMoto - Wednesday, June 22, 2005 - link

    Amen -- you guys work hard on your articles. Keep up the great work. And don't f*cking bother the web editor. We...er...they don't get enough vacation as it is.

    IronChefMorimoto
    (another web editor who needs a break)
    Reply
  • Wesley Fink - Wednesday, June 22, 2005 - link

    Derek was too modest to mention this in his comments, but I think you should know all the facts. Our Web Editor is on vacation and we are all doing our own HTML and editing for the next 10 days. In our usual process, the article goes from an Editor to the Web Editor who codes the article, checks the grammar, and checks for obvious content errors. Those steps are not in the loop right now.

    The next thing is NDA's and launches. We are always under the gun for launches, and lead times seem to get shorter and shorter. Derek was floating questions and graphs last night at 3 to 4 AM with an NDA of 9AM. Doing 21 pages of meaningful commentary in a short time frame, then having to code it in HTML (when someone else normally handles that task), is not as easy as it might appear.

    I do know Derek as a very conscientious Editor and I would ask that you please give him, and the rest of us, a little slack this next week and a half. If you see errors please email the Editor of the article instead of making it the end of the world in these comments. I assure you we will fix what is wrong. That approach, given the short staff, would be a help to all of us. We all want to bring you the information and quality reviews you want and expect from AnandTech.
    Reply
  • IronChefMoto - Wednesday, June 22, 2005 - link

    #79 -- But why wouldn't it be a high quality article, mlittl3? I thought you told me that AT was infallible? Hmmm? ;-) Reply
  • Houdani - Wednesday, June 22, 2005 - link

    Thanks for the refresh, Derek. I went back and took a peek at the revised graphs. I have a couple of comments on this article before you move on to the next project.

    >> When the Splinter Cell page was refreshed, the graph for 20x15x4 apparently disappeared.

    >> When you removed the SLI's from the Guild War page, it looks like the 7800GTX changed from 50.5 to 55.1 (which is the score previously given to the 6800 Ultra SLI).

    >> Several of the pages have scores for no AA benches listed first, while other pages have scores for the 4xAA listed first. While the titles for the graphs are correct, it's a little easier to read when you stay consistent in the ordering. This is a pretty minor nit-pick, though.

    >> Thanks for updating the transparency images to include mouseover switches ... quite handy.
    Reply
  • fishbits - Wednesday, June 22, 2005 - link

    "They priced themselves into an extremely small market, and effectively made their 6800 series the second tier performance cards without really dropping the price on them. I'm not going to get one, but I do wonder how this will affect the company's bottom line."

    The 6800s were "priced into an extremely small market." How'd that line turn out? I don't imagine they've released this product with the intention of losing money overall. Why do you think retailers bought them? Because they know the cards won't sell and they're happy to take the loss? It's already been proven that people will pay for you to develop and sell a $300, wait $400, wait $500 video card. It's already been proven that people will pay a $100+ premium for cards that are incrementally better, not just a generation better. Sounds like this target is a natural, especially knowing it'll eventually fall into everyone else's purchasing ability.

    Being able to say you have the bar-none best card out there by leaps and bounds is certainly worth something. Look at all the fanboys that are out there. Every week or month you're able to stay on top of the benches means you get more people who'll swear by your products no matter what for years to come. Everyone you can entice into buying your card who sees it as a good product will buy your brand in the future as a preference, all other options being equal. I could be wrong, but suspect Nvidia's going to make money off this just fine.
    -----------------------
    "I am proud that our readership demands a quality above and beyond the norm, and I hope that that never changes. Everything in our power will be done to assure that events like this will not happen again."

    See... that's why I'm a big fan of the site.

    Reply
  • mlittl3 - Wednesday, June 22, 2005 - link

    #78, I bet you didn't even read the article. How do you know it demonstrated editorial integrity? Reply
  • IronChefMoto - Wednesday, June 22, 2005 - link

    #23 (mlittl3) still can't pronounce "Penske" and "terran" right, regardless of the great editorial integrity demonstrated by the AT team today. Thanks! Reply
  • BenSkywalker - Wednesday, June 22, 2005 - link

    Derek-

    I wanted to offer my utmost thanks for the inclusion of 2048x1536 numbers. As one of the fairly sizeable group of owners of a 2070/2141 these numbers are enormously appreciated. As everyone can see 1600x1200x4x16 really doesn't give you an idea of what high resolution performance will be like. As far as the benches getting a bit messed up- it happens. You moved quickly to rectify the situation and all is well now. Thanks again for taking the time to show us how these parts perform at real high end settings.
    Reply
  • blckgrffn - Wednesday, June 22, 2005 - link

    You're forgiven, by me anyway :) It is also the great editorial staff that makes Anandtech my homepage on every browser on all of my boxes!

    Nat
    Reply
  • yacoub - Wednesday, June 22, 2005 - link

    #72 - Totally agree. Some Rome: Total War benches are much needed - but primarily to see how the game's battle performance with large numbers of troops varies between AMD and Intel more so than NVidia and ATi, considering the game is highly CPU-limited currently in my understanding. Reply
  • DerekWilson - Wednesday, June 22, 2005 - link

    Hi everyone,

    Thank you for your comments and feedback.

    I would like to personally apologize for the issues that we had with our benchmarks today. It wasn't just one link in the chain that caused the problems we had; many factors led to the results we had here today.

    For those who would like an explanation of what happened to cause certain benchmark numbers not to reflect reality, we offer you the following. Some of our SLI testing was done forcing multi-GPU rendering on for tests where there was no profile. In these cases, the default multi-GPU mode caused a performance hit rather than the increase we are used to seeing. The issue was especially bad in Guild Wars, and the SLI numbers have been removed from the offending graphs. Also, on one or two titles our ATI display settings were improperly configured: our Windows monitor properties, ATI "Display" tab properties, and refresh rate override settings were mismatched. Rather than push the display at the pixel clock we expected, ATI defaulted to a "safe" mode where the game is rendered at the resolution requested, but only part of the display is output to the screen. This resulted in abnormally high numbers in some cases at resolutions above 1600x1200.

    For those of you who don't care about why the numbers ran the way they did, please understand we are NOT trying to hide behind our explanation as an excuse.

    We agree completely that the more important issue is not why bad numbers popped up, but that bad numbers made it into a live article. For this I can only offer my sincerest of apologies. We consider it our utmost responsibility to produce quality work on which people may rely with confidence.

    I am proud that our readership demands a quality above and beyond the norm, and I hope that that never changes. Everything in our power will be done to assure that events like this will not happen again.

    Again, I do apologize for the erroneous benchmark results that went live this morning. And thank you for requiring that we maintain the utmost integrity.

    Thanks,
    Derek Wilson
    Senior CPU & Graphics Editor
    AnandTech.com
    Reply
  • Dmitheon - Wednesday, June 22, 2005 - link

    I have to say, while I'm extremely pleased with nVidia doing a real launch, the product leaves me scratching my head. They priced themselves into an extremely small market, and effectively made their 6800 series the second-tier performance cards without really dropping the price on them. I'm not going to get one, but I do wonder how this will affect the company's bottom line. Reply
  • OrSin - Wednesday, June 22, 2005 - link

    I'm not trying to be a butthole, but can we get a benchmark that's an RTS game? I see 10+ game benchmarks and most are FPS; the few that aren't might as well be. Those RPGs seem to use a similar type of engine. Reply
  • stmok - Wednesday, June 22, 2005 - link

    To CtK's question : Nope, SLI doesn't work with dual-display. (Last I checked, Nvidia got 2D working, but NO 3D)...Rumours say its a driver issue, and Nvidia is working on it.

    I don't know any more than that. I think I'd rather wait until Nvidia are actually demonstrating SLI with dual or more displays, before I lay down any money.
    Reply
  • yacoub - Wednesday, June 22, 2005 - link

    #60 - it's already to the point where it's turning people off to PC gaming, thus damaging the company's own market of buyers. It's just going to move more people to consoles, because even though PC games are often better games and much more customizable and editable, that only means so much and the trade-off versus price to play starts to become too imbalanced to ignore. Reply
  • jojo4u - Wednesday, June 22, 2005 - link

    What was the deal with the AF setting? Do I understand correctly that it was set to 8x when AA was set to 4x? Reply
  • Rand - Wednesday, June 22, 2005 - link

    I have to say I'm rather disappointed in the quality of the article. A number of apparently nonsensical benchmark results, with little to no analysis of most of the results.

    A complete lack of any low-level theoretical performance results, and no attempt to measure any improvements in efficiency or what may have caused such improvements.

    Temporal AA is only tested on one game, with image quality examined in only one scene. Given how dramatically differently games and genres utilize alpha textures, you're providing us with an awfully limited perspective of its impact.

    Reply
  • swatX - Wednesday, June 22, 2005 - link

    SLI is meant to be played at high res.. if you got money to burn on SLI then i am damn sure you got money to burn on a 19" monitor ;) Reply
  • CtK - Wednesday, June 22, 2005 - link

    can Dual Display be used in SLi mode?? Reply
  • Johnmcl7 - Wednesday, June 22, 2005 - link

    In general 6600GT SLI performance seems a bit random, in some cases it's really good as with BF2 but in others not as good as a 6800GT.

    John
    Reply
  • bob661 - Wednesday, June 22, 2005 - link

    Anyone notice how a SLI'd 6600GT is just as quick as a 6800 Ultra in BF2? Reply
  • R3MF - Wednesday, June 22, 2005 - link

    give me some details on the 7800 and 7800GT

    what, when, and how much?
    Reply
  • bob661 - Wednesday, June 22, 2005 - link

    #59
    I am more eager to see how the new midrange cards will perform than these parts but if I had a spare $600 I would jump all over this.
    Reply
  • bob661 - Wednesday, June 22, 2005 - link

    #56
    LMAO!!!!
    Reply
  • bob661 - Wednesday, June 22, 2005 - link

    #44
    And I thought paying $350 for a video card was too much then, or even before that there was the $200 high end, and before that the $100 high end. I balked at all of those prices but I understood why they were priced as such and didn't bitch every time the costs went up. The bar keeps being raised and the prices go with it. Inflation, more features, and the fact that most of us here can afford $350 video cards push the cost of new PREMIUM cards higher by the year. It's only going to go up unless either people quit buying the high-end cards or the manufacturers find a magical process to reduce costs dramatically.
    Reply
  • Johnmcl7 - Wednesday, June 22, 2005 - link

    You're quite right, there's always a premium for the best, and I don't see any difference here; no one is being forced to buy this graphics card. As usual, I'll wait until something offers me a better price/performance ratio over my current X850XT/6800 Ultra duo.

    John
    Reply
  • Avalon - Wednesday, June 22, 2005 - link

    Seems to be a problem with the last Knights of the Old Republic 2 graph. Both 7800GTX setups are "performing" worse than all the other cards benched. Despite all the mistakes, it still seems like I was right in that this card is made for those who play at high resolutions. Anyone with an R420 or NV40-based card who plays at 16x12 or less should probably not bother upgrading, unless they feel the need to. Reply
  • CrystalBay - Wednesday, June 22, 2005 - link

    Does this card play Riddick smoothly @ shader 2++ ????? Reply
  • fishbits - Wednesday, June 22, 2005 - link

    "In aboot 5 years I figure we'll be paying 1000 bucks for a video card. These prices are getting out of control; every generation is more expensive than the last. Don't make me switch to consoles damnit."

    Funny, I can't afford the very best TVs the minute they come out. Same for stereo components. But I don't cry about it and threaten "Don't make me switch to learning the ukulele and putting on my own puppet shows to entertain myself!" Every time a better component comes out, it means I get a price reduction and a feature upgrade on the items that are affordable/justifiable for my budget.

    Seriously, where does the sense of entitlement come from? Do these people think they should be able to download top-of-the-line graphics cards through BitTorrent? Do they walk around Best Buy cursing out staff, manufacturers and customers for being so cruel as to buy and sell big-ass plasma TVs?

    On second thought, get your console and give up PC gaming. That way you can stop being miserable, and we can stop being miserable hearing about your misery.
    Reply
  • tazdevl - Wednesday, June 22, 2005 - link

    Funny how the single card deltas here are higher than at any other site.

    Underwhelmed for the amount of money and the lack of performance increase.

    Have to commend nVIDIA for ensuring retail availability at launch.
    Reply
  • archcommus - Wednesday, June 22, 2005 - link

    Impressive, but I'm still happy with my X800 XL purchase for only $179. From what it seems, with a 1280x1024 display, I won't need the kind of power this card delivers for a very long time. And less than $200 compared to $600, with still excellent performance for now and the foreseeable future? Hmm, I'll take the former. Reply
  • Lonyo - Wednesday, June 22, 2005 - link

    I would have liked some 1280x1024 benchmarks with 8xAA from the nVidia cards and 6xAA from ATi, to see if it's worth getting something like a 7800GTX with 17/19" LCDs to run some super high quality settings in terms of AA/AF. Reply
  • segagenesis - Wednesday, June 22, 2005 - link

    I'm not disappointed. For one thing, the price of current cards will likely drop now, and there will also be mid-range parts soon to choose from. I think the transparency AA is a good idea for, say... World of Warcraft. The game is loaded with alpha textures, and too often you can see the blockiness of trees/grass/whatever.

    #44 - Actually are you new to the market? :) I remember when early "accelerated" VGA cards were nearly $1000. Or more.

    Everybody lambasted NVIDIA last year for the lack of product (6800GT/Ultra) in the market, so them actually making a presence this year instead of a paper launch should also be commended. Of course, now what is ATI gonna pull out of its hat?
    Reply
  • KeDaHa - Wednesday, June 22, 2005 - link

    "The screenshot shows very clearly that SSAA provides quite a quality improvement over no AA"

    The difference is bloody minuscule; perhaps if you used an image SLIGHTLY larger than 640x480 to highlight the difference?
    Reply
  • L3p3rM355i4h - Wednesday, June 22, 2005 - link

    Wowzers. Time to get rid of teh 9800... Reply
  • shabby - Wednesday, June 22, 2005 - link

    In aboot 5 years I figure we'll be paying 1000 bucks for a video card. These prices are getting out of control; every generation is more expensive than the last.
    Don't make me switch to consoles damnit.
    Reply
  • Xenoterranos - Wednesday, June 22, 2005 - link

    Hell, for the same price as an SLI setup I can go out and get a 23-inch Cinema Display... And since these cards can't handle the 30-incher's native resolution anyway, it's a win-win. And yeah, what's up with the quality control on these benchmarks! I mean really, I almost decided to wait for ATI's next-gen part when I saw this (GeForce man since the GeForce2 GTS!) Reply
  • Johnmcl7 - Wednesday, June 22, 2005 - link

    If they're too busy for the article, that's fair enough, the point is they should put it up when they've had time to check it over, rather than rush an article up that isn't ready to be published.

    John
    Reply
  • IronChefMoto - Wednesday, June 22, 2005 - link

    Regarding the "shame on Anandtech" comments -- y'all ever think they were too busy sh*tting themselves at the performance of this card to really pay that much attention to the article? ;-)

    IronChefMorimoto
    Reply
  • Johnmcl7 - Wednesday, June 22, 2005 - link

    The prices I've seen here in the UK for the 7800s are around 400 pounds; the 6800 Ultras are currently around 300 pounds. So quite an increase over the NV40s, but not unacceptable given the performance. I'm sure they'll come down in price once the early adopters have had their fill.

    John
    Reply
  • yacoub - Wednesday, June 22, 2005 - link

    #26 - You must be new to the market, relatively speaking. I remember quite well the days when high-end new videocards were at MOST $400, usually $350 or less when they debuted. It was more than a year or two ago though, so it might have been before your time as a PC gamer. Reply
  • rimshot - Wednesday, June 22, 2005 - link

    Not sure why the price is so high in North America; here in Aus you can get a 7800GTX for the same price as a 6800GT ($850AU).

    Reply
  • nitromullet - Wednesday, June 22, 2005 - link

    "What, no Crossfire benchies? I guess they didn't want Nvidia to lose on their big launch day."

    Ummm... maybe because CrossFire was paper-launched at Computex, and no one (not even AT) has a CrossFire rig to benchmark? nVidia is putting ATI to shame with this launch and the availability of the cards. Don't you think that if ATI had anything worth a damn to put out there, they would?

    All that aside... I was as freaked out as the rest of you by these benchmarks at first (well, more so than some actually, because I just pulled the $600 trigger last night on an eVGA 7800GTX from the egg). However, these graphs are clearly messed up, and some appear to have already been fixed. I guess someone should have cut Derek off at the launch party yesterday.
    Reply
  • blckgrffn - Wednesday, June 22, 2005 - link

    Very disappointed at the fit and finish of this article. Anandtech is supposed to have the best one, not a half-baked one :( I even liked HardOCP's better, even with their weird change-the-levels-of-everything approach - at least it has a very good discussion of the differences between MS and SS AA, and shows some meaningful results at high res as well.

    Shame on Anandtech :(
    Reply
  • fishbits - Wednesday, June 22, 2005 - link

    Good release.

    Can we get a couple of screen shots with the transparency AA?

    "Maybe this one of the factors that will lead to the Xbox360/PS3 becoming the new gaming standard as opposed to the Video Card market pushing the envelope."
    Yeah, because the graphics components in consoles don't require anything but three soybeans and a snippet of twine to make. They're ub3r and free! Wait, no, you pay for them too eventually even if not in the initial console purchase price. Actually I think the high initial price of next gen graphics cards is a sign of health for PC gaming. There are some folks not only willing to pay high dollars for bleeding edge performance, they're willing to pay even higher dollars than they were in the past for the top performers. Spurs ATI/Nvidia to keep the horsepower coming, which drives game devs to add better and better graphics, etc.

    "They only reversed a couple of labels here and there, chill out. It's still VERY OBVIOUS which card is which just by looking at the performance!"
    Eh, I use benchmarks to learn more about a product than what my pre-conceived notions tell me it "ought" to be. I don't use my pre-conceived notions to accept and dismiss scientific benchmarks. If the benches are wrong, it is a big deal. Doesn't require ritual suicide, just fixing and maybe better quality control in the future.
    Reply
  • Thresher - Wednesday, June 22, 2005 - link

    2x6800GT costs almost the same amount as this single card and gives up nothing in performance.

    The price of this thing is ridiculous.
    Reply
  • rubikcube - Wednesday, June 22, 2005 - link

    Just wanted to say thanks for starting your benchmarks at 1600x1200. It really makes a difference in the usability of the benchmarks. Reply
  • VIAN - Wednesday, June 22, 2005 - link

    "NVIDIA sees texture bandwidth as outweighing color and z bandwidth in the not too distant future." That was a quote from the article after saying that Nvidia was focusing less on Memory Bandwidth.

    Do these two statements not match, or is there something I'm not aware of?
    Reply
  • obeseotron - Wednesday, June 22, 2005 - link

    These benchmarks are pretty clearly rushed out and wrong, or at least attributed to the wrong hardware. SLI 6800s show up faster than SLI 7800s in many benchmarks, in some cases much more than doubling single 6800 scores. I understand NDAs suck given the limited amount of time to produce a review, but I'd rather it have not been posted until the afternoon than ignore the benchmarks section. Reply
  • IronChefMoto - Wednesday, June 22, 2005 - link

    #28 -- Mlittl3 can't pronounce Penske or terran properly, and he's giving out grammar advice? Sad. ;) Reply
  • SDA - Wednesday, June 22, 2005 - link

    QUESTION

    Okay, allcaps=obnoxious. But I do have a question. How was system power consumption measured? That is, was the draw of the computer at the wall measured, or was the draw on the PSU measured? In other words, did you measure how much power the PSU drew from the wall or how much power the components drew from the PSU?
    Reply
  • Aikouka - Wednesday, June 22, 2005 - link

    Wow, I'm simply amazed. I said to someone as soon as I saw this "Wow, now I feel bad that I just bought a 6800GT ... but at least they won't be available for 1 or 2 months." Then I look and see that retailers already have them! I was shocked to say the least. Reply
  • RyDogg1 - Wednesday, June 22, 2005 - link

    But my question was "who" is buying them. I'm a hardware goon as much as the next guy, but everyone knows that in 6-12 months the next gen is out and the price on these is lower. I mean, the benches are presenting comparisons with cards that, according to the article, are close to a year old. Obviously some sucker lays down the cash, because the "premium" price is way too high for a common consumer.

    Maybe this is one of the factors that will lead to the Xbox360/PS3 becoming the new gaming standard, as opposed to the Video Card market pushing the envelope.
    Reply
  • geekfool - Wednesday, June 22, 2005 - link

    What, no Crossfire benchies? I guess they didn't want Nvidia to lose on their big launch day. Reply
  • Lonyo - Wednesday, June 22, 2005 - link

    The initial 6800U's cost lots because of price gouging.
    They were in very limited supply, so people hiked up the prices.
    The MSRP of these cards is $600, and they are available.
    MSRP of the 6800U's was $500, the sellers then inflated prices.
    Reply
  • Lifted - Wednesday, June 22, 2005 - link

    #24: In the Wolfenstein graph they obviously reversed the 7800 GTX SLI with the Radeon.

    They only reversed a couple of labels here and there, chill out. It's still VERY OBVIOUS which card is which just by looking at the performance!

    WAKE UP SLEEPY HEADS.
    Reply
  • mlittl3 - Wednesday, June 22, 2005 - link

    Derek,

    I know this article must have been rushed out but it needs EXTREME proofreading. As many have said in the other comments above, the results need to be carefully gone over to get the right numbers in the right place.

    There is no way that the ATI card can go from just under 75 fps at 1600x1200 to over 100 fps at 2048x1536 in Enemy Territory.

    Also, the Final Words heading is part of the paragraph text instead of a bold heading above it.

    There are other grammatical errors too but those aren't as important as the erroneous data. Plus, a little analysis of each of the benchmark results for each game would be nice but not necessary.

    Please go over each graph and make sure the numbers are right.
    Reply
  • Regs - Wednesday, June 22, 2005 - link

    Yikes @ the graphs lol.

    I just came close to pushing the button to order one of these, but then I said... what games can't I play on a 6800GT at 16x12 res? There are none. Far Cry is the only game that comes close.

    Bravo to Nvidia, and hiss and boo @ lagging game developers.
    Reply
  • bob661 - Wednesday, June 22, 2005 - link

    #19
    Are you new to this market, or do you have a short memory? Don't you remember that the initial 6800 Ultras cost around $700-800? I sure as hell do. Why is everyone complaining about pricing? These are premium video cards, and you will pay a premium price to buy them.
    Reply
  • Barneyk - Wednesday, June 22, 2005 - link

    Yeah, not a single comment on any of the benchmarks, what is up with that?

    There were a lot of weird scenarios there; why is there NO performance increase from SLI some of the time?
    And why is 6800 Ultra SLI faster than 7800GTX SLI??

    A lot of weird stuff, and not a single comment or analysis about it. I always read most new tests here on AT first because it's usually the best, but this review was a double bogey to say the least...
    Reply
  • Dukemaster - Wednesday, June 22, 2005 - link

    @21: The score of the X850XT PE in Wolfenstein still looks messed up to me... Reply
  • shabby - Wednesday, June 22, 2005 - link

    Ya, some of the scores don't make much sense; 7800 SLI losing to a single 7800? Reply
  • yacoub - Wednesday, June 22, 2005 - link

    Hey, looks great! $350 and you've got a buyer here! Reply
  • Lifted - Wednesday, June 22, 2005 - link

    Guys, they simply reversed the 6800 Ultra SLI and 7800 GTX SLI in all of the 1600 x 1200 - 4x AA graphs.

    Now everything is kosher again.
    Reply
  • Johnmcl7 - Wednesday, June 22, 2005 - link

    To 18 - I have to admit, I didn't bother looking closely at them, seeing the X850XT supposedly beating all the other cards by such a margin at those resolutions showed they were completely screwed up! I didn't notice the performance increase as you go up the resolution, maybe it's something I missed on my own X850XT? ;) I wish...that would be a neat feature, your performance increases as your resolution increases.

    I agree it needs to be pulled down and checked. Not to be harsh on AT, but this isn't the first time the bar graphs have been wrong - I would rather wait for a review that has been properly finished and checked than read a rushed one. As it stands it's no use to me, because I have no idea if any of the performance figures are genuine.

    John
    Reply
  • RyDogg1 - Wednesday, June 22, 2005 - link

    Wow, who exactly is paying for these video cards to warrant the pricing? Reply
  • Lonyo - Wednesday, June 22, 2005 - link

    To #14: the X850XT performance INCREASED by 33% from 1600x1200 to 2048x1536 according to the graphs, so to me that just screams BULLSH!T.
    I think the review needs taking down, editing, and putting back up again.
    Or fixing VERY quickly.
    AT IMO has let people down a bit this time round, not the usual standard.
    Reply
  • Diasper - Wednesday, June 22, 2005 - link

    Oops, posted before I wrote anything. Some of the results are impressive, others aren't at all. In fact the results seem to be all over the board - I suspect drivers are something of the culprit and are to blame. Hopefully, as new drivers come out, we'll see some performance increases or at least a more uniform distribution of good results Reply
  • Diasper - Wednesday, June 22, 2005 - link

    Reply
  • Live - Wednesday, June 22, 2005 - link

    Derek, get cracking, the graphs are all messed up! And the Transparency AA Performance section could use some info on what game it is tested on, and some more comments. I also think that each benchmark warrants some comments, for all of us who have a hard time remembering two numbers at the same time. Keep it simple, folks…. Reply
  • Johnmcl7 - Wednesday, June 22, 2005 - link

    I agree something is wrong with these results, I thought they were odd but when I got to the Enemy Territory ones they seem completely wrong - at 2048x1536 and 4xAA the X850XT is apparently getting 104 fps, while the 6800 Ultra gets 48.3 and the SLI 6800 Ultras are only getting 34.6 fps! Especially bearing in mind it's an OpenGL game.

    John
    Reply
  • rimshot - Wednesday, June 22, 2005 - link

    This has got to be an error by Anandtech, all other reviews show the 7800GTX in SLI at those same settings hammering the 6800Ultra in SLI. Reply
  • Lonyo - Wednesday, June 22, 2005 - link

    The benchmarks are all a load of crap it seems.
    Check the Wolfenstein benchmarks.
    The X850XT goes from 74fps @ 1600x1200 w/4xAA to 103fps @ 2048x1536 w/4xAA
    A 33% increase when the res gets turned up. Good one.

    There also seem to be many other similar things which look like errors, but they could just be crappy nVidia drivers, or something wrong with SLI profiles.

    Who knows, but there are definitely a lot of things here that look VERY odd/suspicious.
    Reply
  • Dukemaster - Wednesday, June 22, 2005 - link

    My Iiyama VMP 454 does 2048 no prob so i'm game :p Reply
  • vanish - Wednesday, June 22, 2005 - link

    oh and in several of the benchmarks, the 6800U SLI more than doubles the performance over the single 6800U. Is that normal? I thought SLI gains were generally about 45% or so. Reply
  • rimshot - Wednesday, June 22, 2005 - link

    Is it just me or is it a little strange that the 6800Ultra SLI outperforms the 7800GTX SLI at 1600x1200 with 4xAA in every benchmark ??? Reply
  • PrinceXizor - Wednesday, June 22, 2005 - link

    No comment on the fact that in virtually every game it LOSES to the 6800 SLI at 1600x1200 with 4xAA.

    All the other scores look very impressive. But in this particular group of settings, the 6800 SLI eats it for lunch.

    P-X
    Reply
  • vanish - Wednesday, June 22, 2005 - link

    From what I'm seeing, the 6800U SLI beats the 7800GTX [SLI] at most normal resolutions. I don't know, but usually when a new generation comes out it should at least beat the previous generation. Sure, it works wonders at huge resolutions, but very few people actually have monitors that can display those resolutions. Most people don't have monitors above 1280x1024, much less 1600x1200. Reply
  • Live - Wednesday, June 22, 2005 - link

    What's up with the BF2 graphs? The 6800U SLI scores more at 1600x1200 4xAA (76.3) than it does at the same resolution without AA (68.3). That doesn't make sense, does it?

    Sorry for the extremely poor spelling…
    Reply
  • Dukemaster - Wednesday, June 22, 2005 - link

    Over 2.5x the performance of a 6800 Ultra in Battlefield 2 and Splinter Cell, how the hell is that possible?? Reply
  • vortmax - Wednesday, June 22, 2005 - link

    Good job Nvidia with the launch. Now lets see if ATI can match the performance and availability. Reply
  • ryanv12 - Wednesday, June 22, 2005 - link

    From what I see, the 7800GTX is really of benefit to you if you have a monitor that is higher than a 1600x1200 resolution. Fairly impressive though, I must say. I also wasn't expecting double the performance of the 6800's since it only has 50% more pipes. I can't wait to see the 32 piped cards! Reply
  • Live - Wednesday, June 22, 2005 - link

    Looks good. Too bad I have to wait a few months until ATI releases the competition. Reply
  • bpt8056 - Wednesday, June 22, 2005 - link

    First post!! Congrats to nVidia for pulling off an official launch with retail availability. Reply
