Gaming Performance

For this initial look, we've trimmed our usual battery of game tests down to eight titles. We still have one "simulation" (GRID), four first-person shooters, two role-playing games, and one strategy game. All of these gaming tests were run at 1920x1080 using High/Very High detail settings. For the Clarksfield system, this required the use of an external LCD since otherwise we would be limited to 1600x900.

[Benchmark charts: Assassin's Creed DX9 | Call of Duty: World at War | Crysis - High | Empire: Total War | Far Cry 2 DX10 0xAA | F.E.A.R. 2: Project Origin | Mass Effect | Race Driver: GRID 0xAA]

It's clear that we are GPU limited at these settings in some of the titles, but a few games are also clearly CPU limited. Falling into the CPU-limited category, Assassin's Creed and Call of Duty: World at War both show similar performance across the i7 systems and little if any benefit from SLI. The QX9300 in the Eurocom is 15-25% slower than the i7-920XM in these two titles.

Most of the games show a clear benefit from SLI, however, and likewise they show that a single GTX 280M is a bottleneck for gaming performance. SLI improved performance by 15% in GRID, 20% in Mass Effect, and almost 30% in Far Cry 2. The big winners are Crysis (~60% faster with SLI), F.E.A.R. 2 (90% faster), and Empire: Total War (99% faster).
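
To put those scaling figures in concrete terms, here's a minimal sketch (Python) of how the SLI improvement percentages are derived; the frame rates below are hypothetical placeholders, not our measured results.

    # Percent improvement from adding a second GPU in SLI.
    # All FPS values are made-up placeholders for illustration only.
    fps_single = {"Race Driver: GRID": 60.0, "Mass Effect": 50.0, "Far Cry 2": 40.0,
                  "Crysis": 25.0, "F.E.A.R. 2": 45.0, "Empire: Total War": 30.0}
    fps_sli = {"Race Driver: GRID": 69.0, "Mass Effect": 60.0, "Far Cry 2": 52.0,
               "Crysis": 40.0, "F.E.A.R. 2": 85.5, "Empire: Total War": 59.7}

    for game, single in fps_single.items():
        scaling = (fps_sli[game] / single - 1) * 100
        print(f"{game}: {scaling:.0f}% faster with SLI")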

If you're after optimal gaming performance, just getting the fastest CPU or the fastest GPU alone obviously won't cut it in every situation. You need a balanced platform, with the CPU and GPU matched to offer the best performance across a wide variety of situations. The QX9300 with GTX 280M SLI often tips the scales too far toward the GPU side of the equation, while the i7-975 is complete overkill for a single GTX 280M (roughly the level of a desktop 9800 GTX+/GTS 250).

We are definitely interested in seeing what the i7-920XM - as well as the i7-820QM and i7-720QM - can do with SLI graphics in the future. We're also looking forward to the day when we see mobile versions of stuff like the HD 5870, preferably with power gate transistors.

Comments

  • gstrickler - Thursday, September 24, 2009 - link

    I get the impression that Intel isn't really interested in making a low power quad core for laptops. My guess is that they see dual core + HT as the solution for users who need good battery life, and anyone needing a quad core "portable" won't be running off battery (or will have a much larger battery). I don't agree with that; I think there is a small but growing percentage of users who could benefit from quad core performance on a notebook and still want/need good battery runtime. It's not for mainstream users yet, but it will be in a few years. For now, we have to choose between extra CPU power, extra battery life, or extra weight.
  • Pirks - Thursday, September 24, 2009 - link

    People will keep buying gaming laptops in droves 'cause they are very convenient - you can game wherever you want and you are not bound to your big ol' immovable desktop tower. And they are cheap too. Take a look at this one for example: http://www.bestbuy.com/site/olspage.jsp?skuId=9366...

    After my Alienware M17, which I'm VERY happy with, I'd get this one - veeery sweet machine and cheap too, just $1k. Why bother with a stationary desktop when this beautiful Asus lappy can get everything done, every game including Crysis?

    Jarred just can't understand the beauty of running your games literally anywhere. Sad, just sad. Grow up AT guys, desktop is the past and lappys are the future :P
  • JarredWalton - Thursday, September 24, 2009 - link

    I review laptops, I run tests on them, and gaming performance is still at least two generations behind desktops. That $1000 ASUS is a good gaming laptop, to be sure, and a much better value than a $3000+ Alienware. However, a single GTX 260M is equivalent to a desktop 9800 GTX (or GTS 250 if you prefer the latest name change).

    There's a big reason I don't test with AA enabled on *any* of the laptops: they can't handle it. Heck, these $3000+ laptops struggle to run 1080p at times. Is it childish to think that my 30" display with 4870X2 provides a far more compelling experience than a laptop?

    If you love gaming laptops, I've got full reviews of these three (and your M17x) in the works. And for every comment like yours stating that I need to grow up and get past the desktop, I'll get 10 comments on my high-end reviews saying, "Who in their right mind buys these things!?" The answer is "Pirks" apparently, along with people that want to go to LAN parties (a very small minority of gamers, considering the millions playing games) who also have the money to spend on gaming laptops (an even smaller minority).

    Anyway, I loved the Gateway FX series kicking off the $1300 gaming laptop era, and the ASUS G50 series (including your linked laptop) is great as well... for what it is. A convenient laptop, to me, doesn't get an hour of battery life, have an extremely hot bottom (no, not like that...), or weigh 12 pounds. Okay, that ASUS probably weighs 7 pounds and gets 90 minutes of battery life, and it's not as toasty as a Clevo DTR setup. That's why it will sell a lot more units than the W87CU, though.

    The problem is, Intel's "gaming laptops" slide includes stuff like this VAIO: http://www.bestbuy.com/site/olspage.jsp?skuId=9379... GeForce 9600M is NOT a gaming solution by any stretch of the imagination. It wasn't even a gaming solution when it launched, with its paltry 32 SPs. It can handle WoW at moderate detail settings and resolution, sure, and it's faster than any IGP, but Crysis and many other games will still struggle.
  • GeorgeH - Thursday, September 24, 2009 - link

    My main laptop has a Quadro FX 770M (roughly equivalent to the 9600M GT you linked to) powering a 15.4" 1920x1200 screen and I have to disagree with you a little bit there.

    It's very true that when playing games like Far Cry 2 zebras look more like dirty white horses and some of the jagged edges could put your eye out, but at the end of the day the gaming experience isn't really all that different from my desktop. Other titles are similar; yes, graphics sliders sometimes need to be set to the low end of the scale, but most games are still very playable and you often really don't miss the "shiny" graphics options.

    At this point, it'd take a really solid implementation of something like AMD's Eyefinity to make me want to build another super high-end gaming desktop. Maybe I'm just getting old, but current advances in graphics technology just don't seem to wow as much as they did 10 years ago - subjectively they often feel more like baby steps than giant leaps.

    The reason I bring all this up is to suggest that a short paragraph on your subjective gaming impressions going between the laptops and your desktop might be a good idea. Just noting that AA is off often isn't enough for us mortals that don't have a few $3000 laptops and $2000 towers just lying around. ;)
  • ltcommanderdata - Wednesday, September 23, 2009 - link

    Are the 45W/55W TDPs for Clarksfield really that much of a concern compared to previous generation CPUs? Penryn processors may have had 35W/45W TDPs, but they also required a separate northbridge, with the PM45 rated at 7W. Clarksfield, with its integrated memory and PCIe controllers, basically has the northbridge integrated and absorbs its TDP rating. Once the northbridge is taken into account, the 45W/55W TDP of Clarksfield is only 3W higher than previous generation Penryn + northbridge combinations.

    I do agree though that the low clock speeds, relatively high TDPs, and high prices of Clarksfield make it unlikely that we will see quad cores break into the mainstream mobile market in this generation. While it may not be needed on the desktop, where thermal constraints are relaxed, I think a 32nm quad core Westmere derivative is definitely needed in the mobile market. It's unfortunate that we'll likely have to wait at least another year before we see a 32nm mobile quad core with Sandy Bridge.
  • gstrickler - Thursday, September 24, 2009 - link

    You're correct: a current C2Q at 2.0-2.53GHz with a 45W TDP plus PM45 + ICH9M (9.5W combined TDP) is 54.5W TDP, vs. a mobile Core i7 (45W/55W TDP) plus PM55 (3.5W TDP) at 48.5W/58.5W TDP, each without an IGP. Put the C2Q with an Nvidia 9400M (G) chipset (12W TDP) and it looks a bit better at 57W TDP including a good IGP, but it's still in the same power range as the i7 and, as shown in the article, it's notably slower.

    However, you're overlooking something. Intel currently offers a C2D @ 1.6GHz with a 20W TDP and one at 2.13GHz with a 17W TDP, indicating that they should be able to make 20W and 34W TDP C2Q parts on their current 45nm process. In fact, they've got a Xeon L5408 (2.13GHz quad core) @ 40W TDP, so clearly they can hit the numbers I suggest. Couple those "possible" CPUs with an Nvidia 9400M (G) chipset @ 12W TDP and you're looking at a 1.6GHz C2Q w/ 32W TDP or a 2.13GHz C2Q w/ 46W TDP for the complete CPU, chipset, and a good IGP. Or go with the PM45 + ICH9M (9.5W TDP combined) for 29.5W or 43.5W TDP for complete systems without a GPU.

    Compare that to a mobile i7 + PM55 at 48.5W - 58.5W TDP, without GPU, that's 34%-64% more peak load power than what Intel could theoretically do with C2Q today with similar base clock speeds. Yes, the i7 has power gating that will allow it to use lower power at idle, and it has Turbo mode which will allow it to perform better when only 1 or 2 cores are in use, but Turbo mode will still use more peak power than these hypothetical C2Q. Of course, you can turn off Turbo mode, but then you're back to performance much closer to the C2Q.
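
    As a rough sketch of the platform TDP arithmetic above (Python; these are the rated TDPs quoted in this thread, and the pairings are illustrative rather than a list of shipping configurations):

        # Sum CPU + chipset (+ IGP where applicable) TDPs to compare platforms.
        # Figures are rated TDPs from this discussion, not measured power draw;
        # the "hypothetical" C2Q parts are the ones proposed above.
        platforms = {
            "C2Q 2.0-2.53GHz + PM45/ICH9M":          45 + 9.5,   # 54.5W
            "Hypothetical 1.6GHz C2Q + PM45/ICH9M":  20 + 9.5,   # 29.5W
            "Hypothetical 2.13GHz C2Q + PM45/ICH9M": 34 + 9.5,   # 43.5W
            "Hypothetical 1.6GHz C2Q + 9400M(G)":    20 + 12,    # 32W
            "Hypothetical 2.13GHz C2Q + 9400M(G)":   34 + 12,    # 46W
            "Core i7-720QM/820QM + PM55":            45 + 3.5,   # 48.5W
            "Core i7-920XM + PM55":                  55 + 3.5,   # 58.5W
        }
        for name, tdp in platforms.items():
            print(f"{name}: {tdp:.1f}W platform TDP")

        # e.g. 48.5 / 29.5 - 1 = ~64% and 58.5 / 43.5 - 1 = ~34%,
        # which is where the 34%-64% peak power delta comes from.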

    Compared to what Intel actually offers today, the new chips are an improvement for those who need quad core performance in a portable. However, compared to what they've shown that they could offer today if they chose to, the Clarksfield chips don't look like they're much of an improvement. If Intel applied the power gating and/or turbo features to the C2Q, Clarksfield might not look like an improvement at all. Of course, since Intel isn't doing that, it's all speculation.

    Bottom line: Clarksfield gives more performance in a notebook, but at a notable cost in power usage (and a corresponding cost in battery life) versus what Intel could do today using C2 based systems. If battery life and weight are important to you, Clarksfield is no big deal, and it leaves you waiting for Arrandale or lower power Clarksfield CPUs. If top performance is your concern and you can live with shorter battery life and/or more weight, then Clarksfield gives you a new option.
  • jcompagner - Friday, September 25, 2009 - link

    But you are overlooking something.

    These are not replacements for the ultra low voltage parts, or even for the mobile Pxxxx Core 2 Duos.

    These processors are replacements for the high end Core 2 Duos that have 35W TDPs and the Core 2 Quads that have 45W TDPs.

    Please do compare them with what they are aiming to replace.

    Yes, the 1.6GHz C2D at 20W is much lower, but now compare that with the performance you get from a Core i7-720QM.

    It completely depends on what you do. If you use 2-4 cores for your daily work, then I think the performance per watt is way better with the Core i7 for what you get. The i7 finishes its work much faster than the C2D, so it can drop back to idle much sooner. And looking at the review, it seems to save more power at idle than the Core 2 Duos!
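
    To illustrate that "finishes faster, idles sooner" point with made-up numbers (a quick Python sketch; the power draws and task times below are invented purely for illustration, not figures from the review):

        # Energy over a fixed window: load power while the task runs, idle power after.
        # Every number here is a hypothetical placeholder.
        def energy_wh(load_w, idle_w, task_hours, window_hours):
            return load_w * task_hours + idle_w * (window_hours - task_hours)

        # Slower CPU: lower load power, but the task takes longer.
        c2q = energy_wh(load_w=45, idle_w=12, task_hours=1.0, window_hours=2.0)
        # Faster CPU: higher load power, but it finishes sooner and idles lower.
        i7 = energy_wh(load_w=55, idle_w=9, task_hours=0.6, window_hours=2.0)
        print(f"C2Q: {c2q:.1f} Wh vs. i7: {i7:.1f} Wh over the same 2-hour window")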
  • gstrickler - Thursday, September 24, 2009 - link

    In the 2nd paragraph, it should be "C2D @ 1.6GHz with 10W TDP"
  • justme2009 - Wednesday, September 23, 2009 - link

    "It's good to finally see an official Nehalem CPU for the mobile sector. Power gate transistors have the potential to seriously improve battery life, and we can't wait to see that sort of technology begin making its way into CPUs as well as processors. In terms of performance, things are a little bit of a mixed bag."

    Don't you mean GPUs as well as processors? :p

    I'm still waiting for Arrandale; I'll be skipping this generation and upgrading in a year or two.
  • rbbot - Wednesday, September 23, 2009 - link

    simply for the increased RAM capacity in a normal mainstream chassis.
