Heat and Noise

Given the relatively similar configurations of the two M18x units we've reviewed, it should come as no surprise that thermals are remarkably close between them as well. For reference we're including the results from the NVIDIA GeForce GTX 580M-equipped system to compare against the pair of Radeons in this unit, though it bears mentioning that HWMonitor could only read the temperatures from one Radeon HD 6990M. The RAID 0 configuration also prevents HWMonitor from getting temperature readings from the hard drives.


Single Radeon HD 6990M (Left) and CrossFire (Right)


Single GTX 580M (Left) and SLI (Right)

So what can we gather from this information? First, it seems that the pair of GeForces run ever so slightly hotter than the Radeons, and that produces warmer peak core temperatures on the overclocked i7-2920XM. While the GeForces are really pushing the i7-2920XM to its thermal limits, the Radeons still get peak temperatures above 90C on the cores, more than anyone wants to see.

On the plus side, the i7-2920XM's overclock doesn't seem to exact much of a penalty in terms of heat. At stock speeds, thermals under load are essentially comparable when only a single GPU is enabled. End users of the M18x may want to keep an eye on temperatures just to be on the safe side, but they should still be reasonably comfortable with a solid overclock on the processor.

As for noise, the fans run too slowly to be particularly noticeable when the M18x is idling, but under load noise picks up to about 43dB. That's still nowhere near as nightmarishly loud as the Clevo X7200 can get; that notebook has to cool roughly 330 watts of hardware in a chassis about the same size as the M18x's. Overall the M18x is definitely noticeable when being stressed, but not overwhelmingly so.


  • Alexvrb - Sunday, October 16, 2011 - link

    Meaker is right. You can overclock it 100% and it doesn't mean beans if it's throttling the heck out of it.
  • Meaker10 - Friday, October 14, 2011 - link

    You have the 6990M coming out behind the 6970M results....
  • JarredWalton - Friday, October 14, 2011 - link

    There are several factors at play. First, different drivers -- newer may not always be better, but without having both laptops and retesting, we can't say for sure. Second and more important by far is that the X7200 has a hex-core i7-990X. Even overclocked, the i7-2920XM can't always match it. Third, there's a difference in chipsets; the X7200 uses the X58 while the M18x uses the HM67. The X58 has tri-channel memory with two full x16 PCIe slots, where the mobile platforms go with dual-channel and two x8 PCIe slots.

    While individually, each of these seems minor, taken together it's not too surprising to see the X7200 win some of the gaming benchmarks. Also notice that in more GPU-limited tests (Metro 2033, Mafia II, and STALKER at our Ultra settings), the 6990M CF setup outpaces the 6970M CF by a fairly large margin. Most of our other titles, even at max settings, may not completely saturate the GPUs.
  • Meaker10 - Friday, October 14, 2011 - link

    While I agree for the most part, if we were CPU limited we would see a hard wall. CPU utilisation is close enough between AMD and NVIDIA that while there can sometimes be gaps, they are not large.

    Well looking at the highest setting Dirt 2 benchmarks we see:

    580M in the lead on the back of the 2920XM over the x7200 setup by 15%, a lead that suggests no GPU bottleneck.

    Now looking at the M18X the 6990M crossfire is getting 77% of the FPS of the 6970 setup. Usually crossfire is less limited by PCI-E lanes than SLI.

    Drivers can of course alter results, but if we project where the 6990M crossfire results should be, then they are underperforming by around 30%, not something you would expect from newer drivers.

    Have you checked the cards were not throttling during this run? Looks a bit suspect to me.
  • Meaker10 - Friday, October 14, 2011 - link

    Sorry I meant to say it indicates no CPU bottleneck.
  • JarredWalton - Friday, October 14, 2011 - link

    I'll ask Dustin to check on the clocks and thermals of the AMD 6990M, as well as the i7-2920XM -- both could potentially throttle. I'll ask him about AMD driver version as well. Also, while on the desktop the SLI and CF results are often similar in terms of CPU utilization, on notebooks all bets are off. Every time I've played with an SLI or CF laptop, I've always felt like performance was never quite where I'd expect for the given hardware.

    For instance, SLI and CF scaling on notebooks usually doesn't seem to do nearly as well as scaling on desktops. I'd have to go analyze some hard numbers, but that's been my impression. Another example: when you launch a game on an SLI or CF notebook, the display usually seems to flicker on and off for 10 seconds. Maybe that's been fixed now, but the last time I tested it I seriously thought, "WTF is going on!?"

    In theory, everything should be the same, but when you go mobile it rarely feels that way. Yet one more reason to recommend the M17x or ASUS G74SX over the M18x.
  • Meaker10 - Saturday, October 15, 2011 - link

    I have a 16F2 barebone myself (GT683R based).

    It would be interesting to face the x7200 and M18x against each other. Recording a baseline one card performance and looking at the mobile vs desktop chipset scaling with the same drivers and cards.
  • ik9000 - Saturday, October 15, 2011 - link

    Re: your comment that the premium for a single 580 isn't worth it over the 6990. While dual GPU set-ups don't offer Optimus, with a single card isn't this an option? Would Optimus give a single GTX 580 machine a better battery life than a single 6990?
  • iamlilysdad - Saturday, October 15, 2011 - link

    It appears that Dell now lists a 256gb + 750gb option for custom configuration for the M17X R3. Guess that's another plus in the column for the "little" brother.
  • Akv - Sunday, October 16, 2011 - link

    Another "gaming laptop". Yawn...

    Sorry to repeat that every six months, but I am part of the population who have all the necessary large gaming equipment at home (screens, cases, fans, mice, keyboards...) and who use their laptop when traveling.

    In that sense the prospect of gaming with a laptop seems to me appalling compared to what my desktops can offer, and the prospect of using a low-power, low-noise, light-weight laptop with an excellent screen and excellent speakers seems highly appropriate.
