47 Comments

  • Anonymous User - Thursday, October 23, 2003 - link

    The Mobility Radeon 9600 with 128MB is available from Compaq/HP.

    MR9600 Pro:
    http://h10010.www1.hp.com/wwpc/us/en/sm/WF05a/3219...

    Mobility FireGL T2:
    http://h10010.www1.hp.com/wwpc/us/en/sm/WF05a/3219...

    So go get yourself one today! Coz I am!

    -Ad
    Reply
  • Anonymous User - Thursday, October 09, 2003 - link

    There are some people around here who are developers. I personally use OpenGL for all my CG projects, and for that there is no substitute. Traditionally nvidia has had the upper hand in OpenGL (my Golden Sample Ti4200 runs better than a Radeon 9700). I'm not favouring nvidia or ati. What I need is something that performs under OpenGL, not DX9...

    The Dell Inspiron 8600 is a great choice, but it comes with the 5650 Go. It is reasonably cheap and extremely powerful. Easy to get (online) or via a university (my case). Although ATI is faster under DX9, it is not supported by the big names (Dell, Toshiba, Compaq...). So if the 5650 is even 80% as fast as the 9600 under OpenGL, it IS a choice for me... If it is yet again 400% slower... NO

    Please give us some OpenGL numbers!!!

    Even Quake 3 will do, GLExcess, whatever...

    Thanks

    Yannis

    Norwich, UK
    Reply
  • Andrew Ku - Monday, September 29, 2003 - link

    #43 Well the different results aren't unexpected. You used a different resolution. :) Reply
  • Anonymous User - Saturday, September 27, 2003 - link

    I'd like to see a couple of OpenGL tests included in the comparison.

    Thanks.
    Reply
  • Anonymous User - Thursday, September 18, 2003 - link

    I get different results!

    I have a Dell Inspiron 8600 with the NVidia 5650 running AquaMark3. I'm using the driver that Dell ships with the 8600 (version 4.4.8.2). I get VASTLY better results than what's posted in this article. Below are my results against the article's numbers:

    Frames per second (FPS)

    Chapter   My Go5650   Article Go5650   Article Radeon 9600
    1           22.30         11.64              25.97
    2            9.38          4.23               6.68
    3           16.15          8.87              15.00
    4            6.52          5.15              11.27
    5           14.72          9.31              19.93
    6           14.28          8.47              17.96
    7           18.27          9.92              17.08
    8           13.00          6.63              12.56
    9            9.47          4.67               7.93

    I submitted my results to Aquatech's results board under my user name "RonSchaaf". I ran the test multiple times with the same results, using the Aquatech defaults.

    Big Note: I just double-checked everything. I ran my tests at 1024x768x32, No FSAA, 4x Anisotropy, Maximum Details, with the driver set to Maximum Quality. But I can't run at 1280x1024 as was done for the article, because the Aquatech program won't let me change settings without springing for the "Professional" version.
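    For anyone who wants to sanity-check this, here's a tiny script (numbers transcribed from my table above) that computes the per-chapter ratios between my runs and the article's:

```python
# Per-chapter ratio of my Go5650 runs to the article's Go5650 runs,
# using the AquaMark3 FPS numbers from the table above.
mine    = [22.30, 9.38, 16.15, 6.52, 14.72, 14.28, 18.27, 13.00, 9.47]
article = [11.64, 4.23,  8.87, 5.15,  9.31,  8.47,  9.92,  6.63, 4.67]

ratios = [round(m / a, 2) for m, a in zip(mine, article)]
# Every chapter is faster on my machine; most come out roughly 1.5x-2x the
# article's scores, which fits with my lower resolution (1024x768 vs 1280x1024).
```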
    Reply
  • Anonymous User - Wednesday, September 17, 2003 - link

    Nice review.
    It was a good idea to test the DX9 power of the cards and not some driver or game "optimisations".
    Reply
  • Anonymous User - Tuesday, September 16, 2003 - link

    #34 Said: "Download 51.75 and run the test. Then tell us what you see. What a bunch of CRAP."

    You're right... with Det 51.75 they'd see a bunch of CRAP. Take a look at these image quality results: http://www.gamersdepot.com/hardware/video_cards/at...


    BTW, according to nVidia, the Det 51.75 isn't ready to be installed on any machine yet. Kind of funny how that didn't stop them from saying it was the only valid version for benching HL2...
    http://www.techconnect.ws/modules.php?name=News&am...
    Reply
  • Anonymous User - Tuesday, September 16, 2003 - link

    I agree with shalmanese

    :)
    Reply
  • Anonymous User - Tuesday, September 16, 2003 - link

    I didn't see a mention of the speed of the CPU used.

    Anyone know? (I might have missed it, but it wasn't on the benchmark setup page)

    I know it's a P-M, but at what speed?
    Reply
  • Shalmanese - Monday, September 15, 2003 - link

    Eh, all in all I thought it was a pretty ordinary review, with lots of mistakes throughout.

    First of all, your graph numbers go out to 6 significant figures; round them to whole fps, or one decimal place at the very least.

    While theoretical comparisons laughing at how much the ATI card beat the nVidia card are all very pleasant, some indication of what sort of performance people who may have wanted to BUY these cards were in for might be nice as well. This means adding the NV3X and DX8 codepath figures for HL2, etc. Also, a Go4200 and a Mobility 9000 thrown in might have been good too, but I understand that time may not have been adequate.

    I also noticed that the CPU wasn't listed for the laptop. Is this part of the NDA info? Seems unusual as this is normally given.

    pg1:

    "Mobility Radeon 9600 in North America" should be "THE Mobility..."

    "...between Mobility Radeon 9600..." again, a missed THE... in fact, it's like this all throughout the article.

    " You may have seen other media report benchmark scores that have been called into question. In our time spent benchmarking the two mobile graphics processors, we have yet to be able to recreate a similar scenario."

    Huh? You've yet to recreate a benchmark scenario that was called into question? What are you trying to say?

    pg2:

    "specifies that the Mobility Radeon 9600 consumes 1.0V while running, and 0.5W in Windows idle." Is that V or W? There's no point telling us what voltage the chip runs at while working. Give us wattage figures.

    pg3: again, you give a V figure.

    Shalmanese
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #34, you clearly have no idea how reviews work. AnandTech isn't going to use BETA (I repeat, BETA) drivers for a review like this, or any review for that matter, unless the review specifically concentrates on the drivers themselves. In addition, the laptops tested, and any laptop you can find right now, are shipping with 44 or 45 series NVIDIA drivers.

    Besides, the BETA 50 series drivers already look suspicious, with slightly lower IQ and the absence of fog in HL2. What other little IQ degradations are in these drivers is anyone's guess.

    Point is, AnandTech did exactly what they should have done: not use the 50 series drivers until they're ready to go, or WHQL'ed in other words.

    By the way, get a clue NVIDIOT.
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #34 - You, sir, are a moron. Yeah, download the Det50 drivers and see what happens when nVidia converts all DX9 calls to DX8. Why not just set the game to DX8 yourself and save all the smoke and mirrors? Go ahead and pay $500 for a DX8 graphics card if you're that stupid. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #27 Probably you just can't get reasonable frame rates in DX9 environments. They both look amazing in DX8 games but can't handle high-quality DX9. That's why #25 is upset with the reviewer's lack of focus. Wait for the next train, because there is no first-class seat in this wagon. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Why can't you answer the question people have asked? Did you use the Det 50 driver? The answer is NO. Say it: NO! You used NVIDIA 44.82. Your results are invalid. Are you guys biased or what? Is ATYT paying you off as well? Download 51.75 and run the test. Then tell us what you see. What a bunch of CRAP. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #26 - if nVidia reduces video quality, they have to be penalized for that. You have to level the IQ playing field before you can compare frame rates. Nvidia lost, give it a rest. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    and a Voodoo PC M460 M10, a Targa M10 (in Germany), a Gericomm M10 (Germany), an Acetbis Peacock M10 (Germany), and ATI told me that there will be 5 more in North America by the end of October. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    The M10 won all the tests, and it's full DX9 (24-bit, according to Microsoft). What more DX9 do you want? Do you work at nVidia or something? Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Sxotty - you're such an immature little bonehead. Are you saying that Nvidia should have won this review? ATI is crushing nVidia at 24-bit, even when nVidia is running in 12/16. And do you really believe the BS spin coming from nVidia that turning off the fog was 'a bug'? Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Actually, the reviewer used nVidia's latest shipping mobile drivers -- and if you go to any NV31M notebook manufacturer's site, you'll see that the ones he used are in fact the ones posted. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    There is an 9600 PRO laptop available - it's the Sager NP5680. My father is buying one today. Here's the link: http://www.powernotebooks.com/products.php3?displa... Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #25 those benchmarks were performed at 1280x1024 with 4x AA and 8x AF. They provide a better theoretical test than lower settings because the test will be more GPU-limited than CPU-limited; but if you wanted to play the game at a good frame rate, you would not be using such high settings.

    However, I agree we need to see real-world numbers too: what settings are necessary to get reasonable frame rates (say 40-50 FPS) out of this?
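    A toy model of what "GPU-limited" means here (all numbers are made up for illustration): the slower of the CPU and GPU stage caps the frame rate, so only heavy GPU settings expose the gap between the two chips.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Simplistic pipeline model: the slower stage caps the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Light GPU load: both hypothetical cards hit the same CPU ceiling,
# so they look identical in the benchmark.
low_a, low_b = fps(10.0, 6.0), fps(10.0, 8.0)      # 100.0 vs 100.0 FPS
# Heavy GPU load (high res + AA/AF): the GPU difference finally shows.
high_a, high_b = fps(10.0, 40.0), fps(10.0, 80.0)  # 25.0 vs 12.5 FPS
```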
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #18 what are you talking about? Isn't that the opposite of what I (#17) said? Or did a post get deleted to cause numbers to get out of sync?

    #19 Andrew, so you admit these numbers don't actually tell you how the game will perform using the "appropriate" code path for the 56x0? Even though that path reduces image quality, so it shouldn't really be compared directly to the Radeon, it would still be nice to see real-world numbers for the sake of comparison.
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    "The scores that we achieved in AquaMark 3 are similarly reminiscent of our scores in Half-Life 2 but without such large margins. In AquaMark 3, the GeForce FX Go5650 achieves sub 10 fps scores in all but one of the scenarios. Meanwhile, the Mobility Radeon 9600 on the average is situated in the mid teens. Minimally, though, the Mobility Radeon 9600 shows its clear lead over the GeForce FX Go5650 with a 58% lead. At its best, the Mobility Radeon 9600 doubles the margin between its counterpart, and this just reinforces the GeForce FX Go5650’s trouble in true DX9 benchmarks."
    I really think there is a misinterpretation of the AquaMark 3 numbers. What is the point of one GPU outperforming the other by up to 58% if neither of them can push numbers above 24 fps? The reviewer should have noted that neither of these solutions will do when it comes to DX9 games, even at low-quality settings.
    The honest recommendation would be: wait for the next-gen DX9 mobile chips, because there is no such thing as a true DX9 mobile solution from either Nvidia or ATI.
    Reply
  • dvinnen - Monday, September 15, 2003 - link

    <<<We are currently revising our graphics benchmark suite in the anticipation of future DX9 stuff. These two GPUs are full DX9 parts, and we are benchmarking them accordingly. UT2003 and our current line of benchmarking titles are DX8, and therefore aren't specifically appropriate for this context. Why are our choices of benchmark titles odd? The Mobility and Go mobile graphics parts are no more than mobile version of desktop processors (clocked down, better power management features and in the M10 case integrated memory package).>>>

    I understand that. But the whole suite doesn't have to be DX9 to get an idea of how it will play. I agree with HL2, Warcraft 3, and Splinter Cell, because lots of people play them. (Or in the case of HL2, will play.) AquaMark3 is also a good choice. But not all games are going to be DX9. OpenGL is still a viable choice. Doom 3 is going to use it, and many games will be using the engine in the coming years. I also brought up UT2003 because lots of people play it. Quake 3 is rather useless now, I agree. It became outdated long ago and now people are pushing 500 fps in it. But an OpenGL benchmark (like RtCW:ET, and yes, I know it's still based on the Quake 3 engine, but you don't get the insane FPS) would be appreciated.
    Reply
  • Andrew Ku - Monday, September 15, 2003 - link

    #20: Please look at our test configuration page. The drivers we used were the newest available drivers for testing as of the time of the head-to-head. Remember, mobile drivers need to be validated by the mobile system vendor, not the graphics part vendor. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    No drivers are going to make up a 400% differential (!). Nvidia had better get their act together for NV4x. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Ugghh...
    I just purchased a Dell Inspiron 8600 with the 128mb Geforce Go 5650. Was looking forward to having a mobile platform to play some of the upcoming games. How disappointing to realize that the 5650 just won't be up to snuff. Nvidia should be ashamed of themselves...

    I would have liked to wait until an ATI Radeon 9600 came out for a Dell system, but I got a good deal on the laptop, and the 9600 just doesn't seem to have wide distribution yet except in some very expensive custom laptops. Maybe I'll be able to swap out my Nvidia card for an ATI card when it becomes available from Dell?
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    I would like to know if this was done using the Rel 50 drivers that haven't been "publicly released". I remember hearing comments that these drivers were made for DX9 games (I think), and that NVIDIA stopped DX9 work on the current drivers months ago. I really think that would skew the results, and the benchmarks would do about as much good as the HL2 benchmarks posted a couple of days ago (Sept 12th, I think).

    Also, that comment on openGL makes me think.

    Don't get me wrong, NVIDIA isn't looking too good,
    but if the drivers aren't the most recent ones for their card, I'd like to know how it's a head-to-head test.
    Reply
  • Andrew Ku - Monday, September 15, 2003 - link

    #17: The article was subtitled "Taking on DX9." Therefore we benchmarked in DX9 as we stated on the Half-Life 2 page. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #17: Yes you are correct. We should all run our monitors at 30Hz too, any more is a waste. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    That 30 FPS-eye-limit rubbish always comes up in these sort of threads - I can't believe there are people who think they can't tell the difference between a game running at 30 FPS and 60 FPS.

    Anyway, I'd like to ask about the HL2 benches - you mention the 5600 is supposed to drop down a code path, but don't specifically say which one was used in the tests. DX8? Mixed? The charts say "DX 9.0", so if that was indeed used then it's interesting from a theoretical point of view but doesn't actually tell us how the game will run on such a system, since the DX8 code path is recommended by Valve for the 5200/5600.
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    The "car wheels not rotating right" effect is caused by aliasing, and you'll still get that effect even if your video card is running at 2000fps.

    Besides, you're limited by your monitor's refresh rate anyhow.
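    To put some (made-up) numbers on the aliasing: the camera only samples the wheel once per frame, so a near-full rotation per frame reads as a small backward step, no matter how fast the card renders.

```python
def apparent_rev_per_frame(true_rev_per_s: float, fps: float) -> float:
    """Revolutions the wheel *appears* to turn between frames.

    Only the fractional part of the true per-frame rotation is visible;
    mapping it into [-0.5, 0.5) makes near-full turns look like small
    backward steps (the classic wagon-wheel effect).
    """
    step = true_rev_per_s / fps
    return ((step + 0.5) % 1.0) - 0.5

# Hypothetical example: a wheel spinning at 28 rev/s filmed at 29.97 FPS
# appears to rotate slowly backwards, at roughly -2 rev/s.
apparent = apparent_rev_per_frame(28.0, 29.97) * 29.97
```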
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #14 that is incorrect and totally misleading. Humans can tell the difference up to about 60fps (sometimes a little more).

    Have you ever seen a movie where the car's tires don't seem to rotate right? That's because at 29.97fps you notice things like that.

    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    #13, unless you're not human, the human eye can't see a difference at 30fps and up. 60fps is a goal for users because at that point, even if there is a slowdown to 30fps, you can't see the difference. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Overall, I liked the article...

    However, whilst I understand that you wanted to run everything at maximum detail to show how much faster one chipset may be than another, it would have been helpful if some lower resolution benchmarks could have been thrown in.

    After all, what good does it do you to know that chip B may perform at 30fps whilst chip A performs at 10fps if both are unplayable?

    I don't mind whether I can play a game at an astoundingly good detail level or not - I care more about whether I can play the game at all! :)

    In the end, we'd all love to be able to play all our games in glorious mega-detail looks-better-than-real-life mode at 2000fps, but it's not always possible.

    A big question should be can I play the game at a reasonable speed with a merely acceptable quality. And that's the sort of information that helps us poor consumers! :)

    Thanks for your time and a great article.
    Reply
  • Sxotty - Monday, September 15, 2003 - link

    Um, do you mean floating point (FP16) or 16-bit color? As opposed to FP32 on the NV hardware; ATI's doesn't even support FP32 (which is not 32-bit color), ATI supports FP24. LOL, and the no-fog thing was just funny; that is NV's fault, it is not like the fog has to be dropped, they did it to gain a tiny fraction of performance. Reply
  • rqle - Monday, September 15, 2003 - link

    I really like this comment:

    "Don’t forget that programmers are also artists, and on a separate level, it is frustrating for them to see their hard work go to waste, as those high level settings get turned off."

    Hope future articles on graphics cards/chipsets will offer more insight into how developers may feel.
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Please note: the Warcraft benchmark was done under Direct3D. Nvidia cards perform badly under Direct3D with Warcraft, whereas ATI does a very fine job. It's a completely different story, however, if you start Warcraft 3 with the warcraft.exe -opengl command. So please take note of that; only very few people know about this anyway. My Quadro4 700 Go GL gets about +10fps under OpenGL compared to D3D! Reply
  • Pete - Monday, September 15, 2003 - link

    Nice read. Actually, IIRC, UT2003 is DX7, with some DX7 shaders rewritten in DX8 for minimal performance gains. Thus, HL2 should be not only the first great DX9 benchmark, but also a nice DX8(.1) benchmark as well. Reply
  • Anonymous User - Monday, September 15, 2003 - link

    so valve let you guys test out half life 2 on some laptops eh? very nice. (great review too, well written) Reply
  • Andrew Ku - Monday, September 15, 2003 - link

    We are currently revising our graphics benchmark suite in anticipation of future DX9 titles. These two GPUs are full DX9 parts, and we are benchmarking them accordingly. UT2003 and our current line of benchmarking titles are DX8, and therefore aren't specifically appropriate for this context. Why are our choices of benchmark titles odd? The Mobility and Go mobile graphics parts are no more than mobile versions of desktop processors (clocked down, with better power management features and, in the M10's case, an integrated memory package). Reply
  • dvinnen - Monday, September 15, 2003 - link

    Where's UT2003 and other staples? Odd choice of benchmarks. I would have liked to see how they stood up to their desktop variants too. Reply
  • Andrew Ku - Monday, September 15, 2003 - link

    AgaBooga,

    Question 1: Actually, we were considering memory bandwidth as a possible issue. I will try and report back as soon as we sort this out.

    Question 2: We tested at 1600x1200 for benchmark purposes, as it shows degradation. Additionally, the newer desknotes and mobile multimedia notebooks are capable of this resolution and higher.
    Reply
  • Anonymous User - Monday, September 15, 2003 - link

    Great review, funny too. (And it wasn't just the horrible failure of the Go5650 to perform that I found amusing!) Reply
  • AgaBooga - Sunday, September 14, 2003 - link

    Wow, nice set of benchmarking applications! That is really something you've put together! My compliments to you!

    Do you think it is bound by something other than the GPU at 1024x768 in Splinter Cell 2_2_1 Set 1? Also, why was it tested at 1600x1200? Laptop users usually don't use resolutions that high on a screen smaller than a desktop's.
    Reply
  • Andrew Ku - Sunday, September 14, 2003 - link

    I am somewhat considered a new writer. My first article was the CEO Forum - Q3/2003. Reply
  • AgaBooga - Sunday, September 14, 2003 - link

    New article writer? Not bad, it seems pretty good! Reply
