More From CeBIT: New Mobile Parts

Both NVIDIA and AMD are announcing new mobile GPUs today; unfortunately, we were unable to get any notebooks with these new GPUs in for testing before the announcement, so for now we can only tell you about them.

NVIDIA's move parallels what's happening on the desktop: the newest additions to the mobile lineup are 55nm G92 based parts with names in the new style NVIDIA has chosen. In fact, the entire lineup of 9xxxM series parts is being replaced by parts with new names. A wholesale renaming like this is certainly more expected on the mobile side, as we usually see much more lag in this space than on the desktop.

As for the specifics, the new parts are the top of the line models. The GTX 280M is claimed to be up to 50% faster than the 9800M GTX, which is nice in theory, but final performance will still be up to notebook makers, who set final clocks on a per notebook basis to fit their power budgets. The GTX 260M is one step down from the 280M: it has 112 SPs enabled (like the original G92, introduced as the 8800 GT) and lower maximum clock speeds.
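How those clocks shake out matters, because peak shader throughput scales directly with SP count and shader clock. Here's a rough back-of-the-envelope sketch; the shader clocks below are placeholder values of our own (notebook makers set the final clocks), and we assume the commonly cited three flops per SP per clock for G92:

    # Rough theoretical shader throughput for G92-based mobile parts:
    # GFLOPS = SPs x shader clock (MHz) x flops per SP per clock / 1000.
    # The clocks used here are hypothetical; vendors set final clocks per notebook.

    def gflops(sps, shader_clock_mhz, flops_per_clock=3):
        """Peak single-precision throughput in GFLOPS."""
        return sps * shader_clock_mhz * flops_per_clock / 1000.0

    print(f"GTX 280M (128 SPs @ 1500MHz): {gflops(128, 1500):.0f} GFLOPS")
    print(f"GTX 260M (112 SPs @ 1375MHz): {gflops(112, 1375):.0f} GFLOPS")

With these made-up clocks the 260M lands around 20% behind the 280M, which illustrates why a notebook maker's clock choices can matter as much as the model number on the box.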

These two high end GTX parts replace the top end 9800M parts, while subbing for the 9800M GS is the GTS 160M, which should also offer improved performance, although we didn't get full specifications on this part. Rounding out the bottom of the lineup are the GT 130M and the G 110M.

On the AMD front, we see something a little more intriguing: the first 40nm GPUs in the mobile space. Smaller die sizes, lower power consumption, and better performance are promised, though the general naming will stay the same for AMD. The new 40nm 4800 series parts can be paired with DDR3, GDDR3, or GDDR5; the choice is up to the notebook maker. AMD touts the fact that it can get about double the processing power in the same area with the new process, which will only benefit the company going forward.
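That "double the processing power in the same area" claim roughly matches ideal geometric scaling: moving from 55nm to 40nm shrinks transistor area by the square of the linear ratio. A quick sanity check, assuming a perfect shrink (real designs never scale this cleanly):

    # Ideal transistor density gain from a full linear shrink, 55nm -> 40nm.
    # Area scales with the square of feature size, so density gains ~(55/40)^2.
    old_nm, new_nm = 55.0, 40.0
    density_gain = (old_nm / new_nm) ** 2
    print(f"Ideal density gain: {density_gain:.2f}x")  # ~1.89x, close to AMD's "double"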

NVIDIA paints the GDDR5 option as overkill, but we really won't know how either company's new parts perform until we have hardware to test.
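For context on the memory question, peak bandwidth is simply bus width times effective data rate, and GDDR5 roughly doubles the effective data rate of GDDR3 at a given clock. The bus width and data rates below are illustrative assumptions on our part, not announced specs:

    # Peak memory bandwidth in GB/s = bus width (bits) / 8 x effective rate (GT/s).
    # Bus width and data rates are illustrative assumptions, not announced specs.

    def bandwidth_gb_s(bus_bits, effective_rate_gts):
        return bus_bits / 8 * effective_rate_gts

    print(f"GDDR3 @ 1.8GT/s on a 128-bit bus: {bandwidth_gb_s(128, 1.8):.1f} GB/s")
    print(f"GDDR5 @ 3.6GT/s on a 128-bit bus: {bandwidth_gb_s(128, 3.6):.1f} GB/s")

Whether a mobile part can actually use that extra bandwidth is exactly the question we can't answer without hardware.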

The relative performance graphs NVIDIA and AMD supplied are nearly useless for sorting out how these parts should compare to each other, so we'll really have to save the head-to-head for a time when we have hardware in our hands. 40nm could be a big plus for AMD, but remember that NVIDIA has made the first move in making mobile drivers available from its website. The value of that is very high, as notebook OEMs tend not to update their drivers very often. Sure, it's possible to hack desktop drivers onto a mobile part, but it is a supreme headache, and we hope AMD will soon follow in NVIDIA's footsteps.

Back to the Tests at Hand

Now that we've covered all the announcements and introductory material, let's get to testing the hardware we've got in our hot little hands.

We got our card just a couple of days ago, so we haven't had time to test everything, and since we received only one 1GB card we haven't been able to test SLI with the 1GB version. Given more time we would also have included 1280x1024 in our benchmarks. That's a very important resolution for this class of hardware, but 1680x1050 should be a good enough indicator of relative performance in most cases.

Our comparisons will be a little lopsided, though. We've got two each (for single and dual card configurations) of the 512MB 4850 and the 512MB GTS 250 (the 9800 GTX+). Those comparisons we can do, and they're nice and neat, as both parts are now set at $130 (cutting recent street prices by about $15). We do have a GTS 250 1GB, but we don't have a 1GB 4850 to compare it to. On the flip side, since we've only got one GTS 250 1GB, we can't compare GTS 250 1GB SLI to the 4850 X2 2GB we have.

The test setup hasn't changed for this article, except that we've had to use the 182.08 driver for the GTS 250 1GB.

Test Setup

CPU: Intel Core i7-965 (3.2GHz)
Motherboard: ASUS Rampage II Extreme (X58)
Video Cards: Sapphire ATI Radeon HD 4850 X2 2GB
             ATI Radeon HD 4870 512MB CrossFire
             ATI Radeon HD 4850 CrossFire
             ATI Radeon HD 4870 512MB
             ATI Radeon HD 4850
             NVIDIA GeForce GTX 260 SLI
             NVIDIA GeForce 9800 GTX+ SLI
             NVIDIA GeForce GTX 260 core 216
             NVIDIA GeForce GTS 250 1GB
             NVIDIA GeForce 9800 GTX+
Video Drivers: Catalyst 8.12 hotfix (ATI)
               ForceWare 181.22 (NVIDIA; 182.08 for the GTS 250 1GB)
Hard Drive: Intel X25-M 80GB SSD
RAM: 6 x 1GB DDR3-1066 (7-7-7-20)
Operating System: Windows Vista Ultimate 64-bit SP1
PSU: PC Power & Cooling Turbo Cool 1200W
Comments

  • Leyawiin - Tuesday, March 3, 2009 - link

    Good refinement of an already good card. New more compact PCB, lower power consumption, lower heat, better performance, 1GB. If Nvidia feels that's worthy of a rename, why should anyone get their drawers in a bunch?

    But please, let the conspiracy theories fly if there was a rewrite of the conclusion. Could be it was just poorly done and wasn't edited, but that's not as fun as insinuating Nvidia must have put pressure on AT.
  • Gannon - Tuesday, March 3, 2009 - link

    Because it's lying; the core should always match the original naming scheme. Nvidia is just doing this to get rid of inventory and cause market confusion, so that dimwits who don't do their research go for the 'newer' ones when in fact they are the older ones.

    I hate this practice. Creative did the same thing with some of their SoundBlaster cards (the SoundBlaster PCI, I believe it was): it was some other chipset from a company they had bought out, and they merely renamed and rebadged the card as a "SoundBlaster".

    Needless to say, I hate the practice of deceiving customers. Imagine you're in a restaurant and you ordered something, but then they switched it on you to something else; you'd rightly get pissed off.

    If people weren't so clueless about technology, they wouldn't get away with this shit. This is where the market fails: when your customers are clueless, it's sheep to the slaughter.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Yeah, imagine, you ordered coke one day, and the next week you ordered coca cola off the same menu, and they even had the nerve to bring it in a different shaped glass. What nerve, huh !
    They squirted a little more syrup in the latter mix, and a bit less ice, and you screamed you were deceived and they tricked you, and then you went off wailing away that it's the same thing anyway, but you want the coca cola not the coke because it tasted just a tiny bit better, and you had darn better see them coming up with some honest names.
    Then ten thousand morons agreed with you - then the cops hauled you out in a straitjacket.
    I see what you mean.
    Coke is coca cola, and it should not be renamed like that - or heck people might buy it.
    I guess that isn't fair ... because people might buy it. It might even be a different price at a different restaurant, or even be called something else and taste different out of a can vs a glass - and heck that ain't "fair".
    You do know I think you're all pretty much whining lunatics, now, right ? Just my silly opinion, huh.
    Coke, coca cola, soda, pop, golly - what will people do but listen to the endless whiners SCREAM it's all the same and stop fooling people....
    I guess it was a slow news YEAR.
  • SunnyD - Tuesday, March 3, 2009 - link

    Since NVIDIA really wanted to push PhysX... I'm curious which if any of the tested titles have PhysX support and if it's enabled in those titles as tested. I'd be really interested to see what kind of performance hit the PhysX "holy grail" takes from this new/old card when trying to compare it to its competition.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I wonder why they haven't done a Mirror's Edge PhysX extravaganza test - they could use secondary PhysX cards alongside the primary, turn PhysX on and off and compare, etc.
    But not here - Derek would grind off all his tooth enamel, and Anand can't afford the insurance for him.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Derek CAN'T include PhysX, and NEVER SAYS whether or not he has it disabled in the nvidia driver panel - although this site USED to say that.
    If they even dare bring up PhysX - ati looks BAD.
    Hence, to keep as absolutely MUM as possible, is the best red fan rager course.
    You see of course, Derek the red has to admit that yes, even NVIDIA ITSELF brought this SAD BIAS up to Derek...
    Oh well, once a raging red rooster, always a red rooster - and NOTHING is going to change that. (or so it appears)
    Is that 10 or 15 points of absolute glaring bias now ?
    ____________________________________________________________
    " We're just trying to save the billions losing ati so we have competition and lower prices - so shut up SiliconDoc ! Do you want to pay more, ALL OVER AGAIN FOR NVIDIA CARDS !!?!"
    ____________________________________________________________

    PS, thanks for lying so much red roosters, you've done a wonderful job of endless bs and fud, hopefully now Obama can bail out amd/ati, and my nvidia CUDA badaboom low power game profiles, forced sli, PhysX cards will remain the best buy and continue to utterly dominate with only DDR3 memory in them.

    PSS - Yes, I can hardly wait for nvidia DDR5 - oh will that ever be fun - be ready to rip your red badges off your puny chests fellers - I'm sure you'll suddenly find a way to reverse 180 degrees after a few weeks of "hating nvidia for stealing ati ddr5 intellectual property".
    LOL
    Oh it's gonna be a blast.
  • C'DaleRider - Tuesday, March 3, 2009 - link

    Very early this morning, I stumbled upon this article when it was originally put up....and went directly to the conclusions page. Interesting read....and I should have saved that page.

    Subsequently, the entire review went down with this reasoning, "...ust we had some engine issues... missing images and such. I don't have the images or I'd put them on the server and set the article to "live" again. Anand and Derek have been notified; sorry for the delays."

    Well, it's back up and what do you know.....the conclusions have now become somewhat softer, or as a few others on another forum put it who also saw the "original" review...circumcised, censored, and bullied by nVidia.

    Shame that the original conclusion has been redone....would have liked others to actually see AT had some independence. Guess that's a lost ideal now...........
  • strikeback03 - Tuesday, March 3, 2009 - link

    Interesting, you mentioned in the comments in the other article that you didn't get to see any of the review, as when you clicked it went to the i7 system review.
  • JarredWalton - Wednesday, March 4, 2009 - link

    Thanks for the speculation, but I can 100% guarantee that the "pulling" of the article was me taking it down due to missing images. I did it, and I never even looked at the rest of the article, seeing that it was 3AM and I had just finished editing a different article.

    Was the conclusion edited before it was put back up? Yes, but not by me. That's not really unusual, though, since we typically have someone else read over things before an article goes live, and with a bit more discussion the wording can be changed around. It would have changed regardless, and not because of anything NVIDIA said.

    Is the 9800 GTX+ naming change stupid? I certainly think so. However, that doesn't make the current conclusion wrong. The card reworking does have benefits, and at the new price it's definitely worth a look as a midrange option.
  • RamarC - Tuesday, March 3, 2009 - link

    Please consider styling the resolution links so they stand out a bit or look button-ish. It took me a minute to realize they were clickable.
