More From CeBIT: New Mobile Parts

Both NVIDIA and AMD are announcing new mobile GPUs today. Unfortunately, we weren't able to get our hands on any notebooks with these new GPUs to test before telling you about them.

NVIDIA's move parallels what's happening on the desktop: the newest additions to the mobile lineup are 55nm G92-based parts carrying names in the new style NVIDIA has chosen. In fact, the entire lineup of 9xxxM series parts is being replaced by parts with new names. This is certainly more expected on the mobile side, as we usually see much more lag in this space than on the desktop.

As for the specifics, the new parts are the top-of-the-line models. The GTX 280M will be up to 50% faster than the 9800M GTX, which is nice in theory, but final performance will still be up to notebook makers, who set clocks on a per-notebook basis to fit their power budgets. The GTX 260M is one step down from the 280M, with 112 SPs enabled (like the original G92 introduced as the 8800 GT) and lower maximum clock speeds.

These two high-end GTX parts replace the top-end 9800M parts, while the GTS 160M subs in for the 9800M GS and should also offer improved performance, although we didn't get full specifications on this part. Rounding out the bottom of the lineup are the GT 130M and the G 110M.

On the AMD front, we see something a little more intriguing: the first 40nm GPUs in the mobile space. Smaller die sizes, lower power consumption, and better performance are promised, though AMD's general naming scheme will stay the same. The new 40nm 4800 series parts can be paired with DDR3, GDDR3, or GDDR5; the choice is up to the notebook maker. AMD touts the fact that the new process gives them about double the processing power in the same area, which will only benefit them going forward.
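
As a rough sanity check on that "about double" claim (a back-of-the-envelope estimate on our part, assuming ideal scaling that real processes never quite achieve), logic density scales roughly with the square of the ratio of linear feature sizes:

\[ \left(\frac{55\,\mathrm{nm}}{40\,\mathrm{nm}}\right)^2 \approx 1.9 \]

So roughly twice the shader hardware in the same area is about what a move from 55nm to 40nm should allow in the best case.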

NVIDIA paints the GDDR5 option as overkill, but we really won't know how either company's new parts perform until we have hardware to test.

The relative performance graphs NVIDIA and AMD supplied are nearly useless for sorting out how these parts should compare to each other, so we'll have to save the head-to-head for when we have hardware in hand. 40nm could be a big plus for AMD, but remember that NVIDIA has made the first move in offering mobile drivers directly from its website. That's worth a lot, as notebook OEMs tend not to update their drivers very often. Sure, it's possible to hack desktop drivers onto a mobile part, but it's a supreme headache, and we hope AMD will soon follow in NVIDIA's footsteps with this move.

Back to the Tests at Hand

Now that we've covered all the announcements and introductory material, let's get to testing the hardware we've got in our hot little hands.

We got our card just a couple of days ago, so we haven't had time to test everything, and since we've only received one card we haven't been able to test SLI with the 1GB version. Given more time, we would also have included 1280x1024 in our benchmarks. That's a very important resolution for this class of hardware, but 1680x1050 should be a good enough indicator of relative performance in most cases that the omission won't matter too much.

Our comparisons will be a little lopsided, though. We've got two each (for single and dual configurations) of the 512MB 4850 and the 512MB GTS 250 (the 9800 GTX+). These comparisons we can do, and it's nice and neat as both parts are now set at $130 (cutting recent street prices by about $15). We do have a GTS 250 1GB, but we don't have a 1GB 4850 to compare it to. On the flip side, since we've only got one GTS 250 1GB, we can't compare GTS 250 1GB SLI to the 4850 X2 2GB we have.

The test setup hasn't changed for this article, except that we had to use the 182.08 drivers for the GTS 250 1GB.

Test Setup
CPU: Intel Core i7-965 3.2GHz
Motherboard: ASUS Rampage II Extreme X58
Video Cards: Sapphire ATI Radeon HD 4850 X2 2GB
    ATI Radeon HD 4870 512MB CrossFire
    ATI Radeon HD 4850 CrossFire
    ATI Radeon HD 4870 512MB
    ATI Radeon HD 4850
    NVIDIA GeForce GTX 260 SLI
    NVIDIA GeForce 9800 GTX+ SLI
    NVIDIA GeForce GTX 260 core 216
    NVIDIA GeForce GTS 250 1GB
    NVIDIA GeForce 9800 GTX+
Video Drivers: Catalyst 8.12 hotfix
    ForceWare 181.22
Hard Drive: Intel X25-M 80GB SSD
RAM: 6 x 1GB DDR3-1066 7-7-7-20
Operating System: Windows Vista Ultimate 64-bit SP1
PSU: PC Power & Cooling Turbo Cool 1200W
Comments

  • sbuckler - Wednesday, March 4, 2009 - link

    I don't understand the hate. They rebranded but more importantly dropped the price too. This forced ati to drop the price of the 4850 and 4870. That's a straight win for the consumer - whether you want ati or nvidia in your machine.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Oh, now stop that silliness ! Everyone worthy knows only ati drops prices and causes the evil green beast to careen from another fatal blow. ( the evil beast has more than one life, of course - the death blow has been delivered by the sainted ati many times, there's even a shrine erected as proof ).
    Besides, increasing memory, creating a better core rollout, redoing the pcb for better efficiency and pricing, THAT ALL SUCKS - because the evil green beast sucks, ok ?
    Now follow the pack over the edge of the cliff into total and permanent darkness, please. You know when it's dark red looks black, yes, isn't that cool ? Ha ! ati wins again ! /sarc
  • Hrel - Wednesday, March 4, 2009 - link

    I can't wait to read your articles on the new mobile GPUs, and I'm REALLY looking forward to a comparison between 1GB 4850 and GTS250 cards, as well as a comparison between the new design for the GTS250 512MB and the HD4850 512MB.

    It seems to me, if Nvidia wanted to do right by their customers, that they'd just scrap the 1GB GTS250 and offer the GTX260 Core216 at the $150 price point, it has a little less RAM so there's a little savings for them there. But then, that's if they wanted to do the right thing for their customers.

    It's about time they introduced some new mobile GPUs; I hope power consumption and price come down as performance goes up!

    I look forward to AMD releasing a new GPU architecture that uses significantly less power, like the GT200 series cards do. 40nm should help with that a bit though.

    Finally, a small rant: When you think about it, we really haven't seen a new GPU architecture from Nvidia since the G80. I mean, the G90 and G92 are just derivatives of that and they only offer marginally better performance on their own; if you disregard the smaller manufacturing process the prices should even be similar at release. Then even the GT200 series cards, while making great gains in power efficiency, are still based on G92 and STILL only offer marginally better performance than the G92 parts; and worse, they cost a lot to make so they're overpriced for what they offer in performance. I sincerely hope that by the end of this year there has been an official press release and at least review samples sent out of completely new architectures from both AMD and Nvidia. Of course it'd be even better if those parts were released to market some time around November. Those are my thoughts anyway; congrats to you if you actually read through all of this:)
  • SiliconDoc - Wednesday, March 18, 2009 - link

    " It seems to me, if Nvidia wanted to do right by their customers, that they'd just scrap the 1GB GTS250 and offer the GTX260 Core216 at the $150 price point, it has a little less RAM so there's a little savings for them there. But then, that's if they wanted to do the right thing for their customers. "
    _________________

    So, they should just price their cards the way you want them to, with their stock in the tank, to satisfy your need to destroy them ?
    Have fun, it would be the LAST nvidia card you could ever purchase. "the right thing for you" - WHAT EVER YOU WANT.
    Man, it's just amazing.
    Get on the governing board and protect the shareholders with your scheme, would you fella ?
  • Hrel - Saturday, March 21, 2009 - link

    Hey, I know they can't do that. But that's their fault too; they made the GT200 die TOO BIG. I'm just saying, in order for them to compete in the marketplace well, that's what they'd have to do. I DO want them to still make a profit; cause I wanna keep buying their GPUs. It's just that compared to the next card down, that's what the GTX260 is worth, cause it's just BARELY faster; maybe 160. But that's their fault too. The GT200 DIE is probably the WORST Nvidia GPU die EVER made, from a business AND performance standpoint.
  • SiliconDoc - Saturday, March 21, 2009 - link

    PS - you do know you're insane, don't you ? The "GT200 is probably the worst die from a performance standpoint."
    Yes, you're a red loon rooster freak wacko.
  • Hrel - Thursday, April 9, 2009 - link

    you left out Business standpoint, so I guess you at least concede that GT200 die is bad for business.
  • SiliconDoc - Saturday, March 21, 2009 - link

    Now you claim you know, and now you ADMIT there is no place for it if they did, anyhow. Imagine that, but "you know" - even after all your BLABBERING to the contrary.
    Now, be aware - Derek has already stated - the 40nm is coming with the GT200 shrunk and INSERTED into the lower bracket.
    Maybe he was shooting off his mouth ? I'm sure "you know" -
    ( Like heck I am )
    Six months from now, or more, and 40nm, will be a different picture.
  • Hrel - Wednesday, April 1, 2009 - link

    seriously, what are you talking about?
    pretty sure I'm gonna just ignore you from now on; pretty certain you are medically insane!

    I'd respond to what you said, I honestly have no idea what you were TRYING to say though.
  • SiliconDoc - Wednesday, April 8, 2009 - link

    You don't need to respond, friend. You blabber out idiocies of your twisted opinion that no one in their right mind could agree with, so it's clear you wouldn't know what anyone else is talking about.
    You whine nvidia made the gt200 core too big, which is merely your stupid opinion.
    The g92 core (ddr3) with ddr5 would match the 4870 (ddr5), which is a 4850 (ddr3) core.
    So nvidia ALREADY HAS a 4850 killer, already has EVERYTHING the ati team has in that region - AND MORE BECAUSE OF THE ENDLESS "REBRANDING".
    But you're just too screwed up to notice it. You want a GT200 that is PATHETIC like the 4830 - a hacked down top core. Well, only ATI can do that, because only their core SUCKS THAT BADLY without ddr5.
    NVidia ALREADY HAS DDR3 ON IT.
    SHOULD THEY GO TO DDR2 TO MOVE THEIR GT200 CORE DOWN TO YOUR DESIRED LEVEL ?
    Now, you probably cannot understand ALL of that either, and being stupid enough to miss it, or so emotionally petrified, isn't MY problem, it's YOURS, and by the way, it CERTAINLY is not NVidia's - they are way ahead of your tinny, sourpussed whine, with JUST SOME VERY BASIC ELEMENTARY FACTS THAT SHOULD BE CLEAR TO A SIXTH GRADER.
    Good lord.
    the GT200 chips already have just ddr3 on them mr fuddy duddy, they CANNOT cut em down off ddr5 to make them as crappy as the 4850 or 4830, which BTW is matched by the two-year-old g80 revised core - right mr rebrand ?
    Wow.
    Whine whine whine whine.
    I bet nvidia people look at that crap and wonder how STUPID you people are. How can you be so stupid ? How is it even possible ? Do the red roosters completely brainwash you ?
    I know, you don't understand a word, I have to spell it out explicitly, just the very simple base drooling idiot facts need to be spelled out. Amazing.
