AMD vs. Intel: Battery Life Investigated

by Jarred Walton on 8/5/2009 5:00 PM EST

80 Comments


  • wiak - Friday, August 07, 2009 - link

    Like the subject says, ATI or NVIDIA graphics are superior to anything Intel has. It's a fact; just see any benchmark ;) Reply
  • tygrus - Monday, August 10, 2009 - link

    Add in a low end nvidia GPU and the price goes up another $150+ and maybe 20% less battery life ?

    In future articles: consider performance per watt (you said performance will be in the final review); measure power load on battery/mains (minus battery) over time for different tasks/conditions; and test multiple configs to indicate scaling and normal trends.

    Isolate system components: use an external LCD to switch off the local screen (across varying laptops); swap the HDD; compare a USB DVD drive to the internal one; swap RAM and change its speed (very minor); swap the mini-card wireless if possible... Hours of fun for the whole family... Who's paying, you ask? ... Don't look at me :)
    Reply
  • alexruiz - Tuesday, August 11, 2009 - link

    Interesting article. I personally am at a loss regarding battery life in laptops. Let me tell you my experience:

    I bought an eMachines M6805 in Jan 2004. As you may remember, this machine was the darling of power laptop users on a budget (~$1600). The machine had a mobile Athlon 64 3000+ Clawhammer (130 nm) rated at 65 W (11 W at idle). It had a discrete Mobility Radeon 9600 and a desktop chipset, the VIA K8T800. Intel didn't have a CPU that could compete in raw performance with the mobile Clawhammer, as it only had the mobile P4 or the P-M Banias at the time of the mobile Athlon 64 launch, both of which the Clawhammer handled easily. It wasn't until the P-M Dothan that Intel matched the mobile Clawhammer in performance while achieving lower power consumption.

    Back to power consumption: the M6805 could go 2:30-2:45 running on battery. Some guys even managed to reach 3:10. You would think achieving that battery life with that kind of horsepower and those hungry parts would require an immense battery. Remember, the K8T800 was a desktop chipset, the Mobility Radeon 9600 was a powerful discrete video card, and the mobile Athlon was rated at 65 W!

    Well, the battery was only 6 cells rated at 4400 mAh... not big at all. I managed 3:05 after switching from XP to Win2K, as PowerNow! worked more efficiently. When I replaced the hard drive with a Travelstar 7K60, I only lost 12-15 minutes of battery life.


    Fast forward five and a half years. I finally sold my trusty M6805, mainly because at its size flying coach class isn't too practical. After a lot of time spent comparison shopping, its replacement was a Toshiba U405D-S2910. The Toshiba sounds years ahead on paper (and it really is years newer): Turion X2 RM-74, 65nm "Griffin", rated at 31 W, 780V chipset with integrated graphics, 13.3" screen... One of the very first things I did when I got it was check the battery capacity... the same 4400 mAh.

    If the M6805, with hungry components and a bulky frame, managed 2:40-3:00 on the same battery, the U405D should easily manage over 4 hours, I thought... Big disappointment. It never went beyond 2:40; in fact, 2:30 seems to be the average. Why much more efficient components in a much smaller chassis manage the same battery life is beyond me. :(

    However, for the $649 I spent, I would not have gotten such a well-rounded machine with an Intel CPU at that size. I do game on the laptop while on the road, and the game is not easy on the graphics... it still runs decently. The machine feels faster than many others costing twice as much... a clean install of Vista Ultimate AMD64 on a WD Scorpio Black works wonders... It is very slim, looks very elegant, and is very quiet. For the $80 premium of the Intel CPU I can buy another battery and double my time away from an outlet... oh, and I have the better graphics. But still, why is it not beating the M6805?
    Reply
  • JarredWalton - Wednesday, August 12, 2009 - link

    mAh is only part of the battery capacity equation; you also have to look at voltage. Multiply mAh by voltage to get mWh, which is the true measure of capacity. (Divide by 1000 for Wh.)
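To make the arithmetic concrete, here's a quick sketch (the 10.8 V and 11.1 V pack voltages are assumed typical 6-cell values, not measured from these laptops):

```python
# Capacity in watt-hours = (mAh * volts) / 1000.
# 10.8 V and 11.1 V are typical 6-cell pack voltages, assumed for illustration.
def watt_hours(mah, volts):
    return mah * volts / 1000

print(round(watt_hours(4400, 10.8), 2))  # 47.52 Wh
print(round(watt_hours(4400, 11.1), 2))  # 48.84 Wh
```

So two packs with identical mAh ratings can differ in real capacity purely due to voltage.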

    As for performance, there are plenty of options out there. See the follow-up article posting shortly....
    Reply
  • strikeback03 - Friday, August 07, 2009 - link

    And to many of us that really doesn't matter. What's your point? Reply
  • medi01 - Monday, August 10, 2009 - link

    And for many of us it still does matter, so what's your point?

    When someone compares an AMD laptop to an Intel laptop, people think about CPUs. But it is actually CPU + graphics, and the performance gap between the graphics chips is much wider in AMD's favor than the gap between the CPUs is in Intel's.
    Reply
  • JACKDRUID - Thursday, August 06, 2009 - link

    Intel offers much better battery life... However, I would NEVER ever buy an Intel laptop. Why? Because its IGP sucks! It's at least 2 to 3 times slower than the AMD counterpart. You are very likely to be running any game at 10 fps with an Intel IGP, while with AMD you might get 20-30 fps. That's the difference between playable and non-playable...

    if you insist on getting an Intel part, make sure you get one with dedicated graphics.
    Reply
  • fumigator - Friday, August 07, 2009 - link

    I agree.
    I serviced a Pentium Dual-Core T4000-series laptop recently. It came with the GMA 4500 IGP, and it could hardly handle Winamp visualisations! In some segments I got less than 10 to 15 frames per second! Come on, people.

    I understand the Intel laptop is perfect for browsing the web and email, but... are we all grandmas or what? Are we going to sacrifice 80 bucks just to get 30 minutes more battery life? Maybe I could buy a whole new battery with that amount. Even the Windows installation was slow on that thing, though I don't care too much as it's a one-time procedure. Still, a minimal bit of extra power for multimedia should always be welcome... Cheap Intel laptops are a no-go for me...

    On the other hand, if the laptop is AMD based, I would go with the ATI 3200 or up (780G). I dislike those X1000-series ATI IGPs, which also means they sport the old SB600 southbridge. I would avoid them.
    Reply
  • garydale - Thursday, August 06, 2009 - link

    One point that I think needs to be made is that the Intel notebook costs $80 more than the AMD. Given that we're talking about a $500 AMD unit, that's a pretty hefty price differential.

    On the performance side, the article mentions that the Intel notebook has the faster CPU but the AMD has the faster graphics. This makes it a toss-up in terms of performance, so what exactly is your extra money buying?

    IMHO a better comparison would be two similar notebooks at the same price point. This would allow us to see the power versus battery life trade-off which I think is more relevant to most people.
    Reply
  • blackshard - Thursday, August 06, 2009 - link

    Looking at the keyboards, the AMD notebook lacks the "Power Saver" logo. I wonder if it means something. BTW, QL parts are really the entry level and lack some power management features that the RM and ZM series have. A comparison between QL, RM, and ZM would be really nice, at least to see if the claims about advanced power management on RM/ZM are true. Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    I've already sent in a request to AMD, so hopefully they will respond. FWIW, CnQ also appears to be causing problems with the AMD single-threaded performance. For example, CINEBENCH 10 shows a greater than 100% improvement in performance when going to multithreaded over single threaded... unless you force affinity to one CPU core. Reply
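For anyone wanting to reproduce the affinity trick Jarred mentions, here's a minimal sketch (Linux API shown for brevity; on Windows the equivalent is Task Manager's "Set affinity" or `start /affinity 1`):

```python
import os

# Pin this process to core 0 so the OS scheduler can't bounce the
# benchmark thread onto a down-clocked idle core mid-run.
os.sched_setaffinity(0, {0})

# Confirm the affinity mask actually took effect.
print(os.sched_getaffinity(0))  # {0}
```

With the process locked to one core, Cool 'n Quiet can keep that core at its full P-state, which avoids the single-threaded penalty described above.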
  • balancedthinking - Thursday, August 06, 2009 - link

    That is right; there is still the old problem with the Windows scheduler bouncing threads to the downclocked, unused core.

    This has been solved with the new Phenom II and Vista/7: they no longer clock the cores independently and thus do not lose performance due to a jumping single thread.

    Clocking cores independently is a nice power-saving feature by AMD, but it was made completely useless by the Microsoft scheduler.
    Reply
  • IntelUser2000 - Friday, August 07, 2009 - link

    This is why I believe the Mac OS achieves better battery life. Windows isn't very good with idle power. Reply
  • zsdersw - Thursday, August 06, 2009 - link

    That's as close as it's possible to come. Matching prices isn't more important than matching specs. Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    The only way to get a $500 Intel laptop is generally to have other features cut as well. There are options with the Pentium T4200 for around $500, which should be about 5~10% slower than the T6500 used in this comparison, based on Anand's Bench results (http://www.anandtech.com/bench/default.aspx?p=67&a...). (The T4200 is 1MB L2 vs. 2MB L2 on the T6500.) Given that everything else was identical (HDD, LCD, chassis, battery), I felt that was the best we could do. Reply
  • MMartin - Friday, August 07, 2009 - link

    Would you mind sharing your setup?

    1. Power savings
    2. Tools used
    3. Etc.
    Reply
  • computergeek485 - Thursday, August 06, 2009 - link

    When showing all those graphs on top of each other, the proper way to set the scales is to make them all the same. Having different scales makes it take more time to convey the information to the reader. Considering the goal is to accurately and easily display the information, using the same scale would greatly improve that. Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    Our graphing engine doesn't provide that option. However, you can try this on for size:

    http://images.anandtech.com/graphs/batterylifeamdv...
    Reply
  • sublifer - Thursday, August 06, 2009 - link

    I wouldn't be surprised if it was because of the GPU and CPU process, but I was curious whether you tried swapping the batteries, just in case the battery in the AMD one was defective or had a bad "memory"? Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    See above post; the results stayed well within the margin of error (less than a 1% difference by swapping batteries). Reply
  • sublifer - Thursday, August 06, 2009 - link

    Don't know how I missed those posts... I thought I read through them all to make sure my concern wasn't already raised.

    Apologies
    Reply
  • mmatis - Thursday, August 06, 2009 - link

    You say:
    "For now, if you're looking for an inexpensive laptop (not a netbook), you need reasonable battery life, and you don't care about graphics performance we suggest saving up the extra $50-$100 for an Intel-based system."

    Why not take the "extra $50-$100" and buy a spare battery or a larger battery? THEN which one has the longer battery life?
    Reply
  • lyeoh - Friday, August 07, 2009 - link

    For some reason spare batteries cost a lot nowadays; at least, when someone I know was buying a laptop and wanted a spare battery, the spare cost more than USD 250.

    A retailer for a different laptop said its spare batteries were about USD140, but when we checked they didn't have any in stock (none in the entire country either), and we would have to order and wait for weeks.
    Reply
  • jamawass - Thursday, August 06, 2009 - link

    These low cost laptops aren't really designed for much portability so battery life is a moot point. I was shopping around for a sub $700 machine and decided on AMD, the reason: virtualization support, I plan on upgrading to Win 7 professional with xp mode. I couldn't find a single Intel cpu in that price range that supported virtualization. Reply
  • drmo - Thursday, August 06, 2009 - link

    I think many people would still want enough battery life to watch a DVD at an airport or on a plane for those infrequent trips. The "buy a bigger battery for the extra $50-100" point is very valid, though. When I go on trips I like to be able to do some light gaming as well, and having to pay an extra $100-200 for discrete graphics that kills battery life at other times (unless it can switch off) is not worth it.

    I'm curious as to why the AMD system sucked so badly at DVD playback. This seriously sounds like a driver issue. I mean, was the CPU at 100%? Shouldn't it be offloaded to the decoder? Or does the decoder suck more juice??? I really think the full review should look into this.

    This was a great teaser for the full review.
    Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    The DVD playback is likely caused by the AMD CPU not power saving as well as the Intel when under a light load (maybe 5-10% CPU usage). Also, Cool 'n Quiet does not do AMD any favors in single-threaded tasks. I'm not sure if there's more to the DVD playback issue than that, but the results are relatively consistent with the other "heavy surfing" test (which is actually about the same 5~10% CPU load). Reply
  • drmo - Thursday, August 06, 2009 - link

    "The DVD playback is likely caused by the AMD CPU not power saving as well as the Intel when under a light load (maybe 5-10% CPU usage). "

    I think you are right, but if this is a function of the architecture, then AMD should fix it; I mean, why have 100% power consumption if only 10% of CPU cycles are being used?
    I guess the number makes sense compared to the x264 test because the DVD-ROM is also using energy in the DVD test. I suppose having a file on the hard drive is better than playing off the DVD; another reason arguing for legal digital copies.
    Reply
  • monomer - Thursday, August 06, 2009 - link

    If this section is going in the full-blown review, would it be possible to add in some CPU usage charts to go along with the battery life charts? It would be interesting to see another data point to show how much the CPU (and indirectly the GPU) affects each of the battery life tests. Reply
  • medi01 - Thursday, August 06, 2009 - link

    So we have 2 laptops, and the one with MUCH FASTER GPU eats roughly 30% more power. So what? Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    And the other provides a MUCH FASTER CPU along with 25% more battery life. If you don't play modern 3D games at all (and you really shouldn't on any of the IGPs IMO), having a fast GPU is silly. I can't tell you how many people I know that have never run anything more taxing (GPU-wise) than Aero Glass on their computer. Reply
  • medi01 - Monday, August 10, 2009 - link

    GPU is a couple of hundred percent faster.
    CPU is a couple of dozen percent faster.

    Many people I know actually play games on their laptops.
    Reply
  • JarredWalton - Monday, August 10, 2009 - link

    It's about 100% faster on average... and tell me this: what's twice as fast as horribly slow? The fact is, even the ATI 780G HD 3200 IGP struggles mightily in most modern 3D titles. Far Cry 2, absolute minimum detail settings, gets 24FPS at 800x600 (or 16FPS at the native 1366x768). Assassin's Creed, minimum settings are the same (24FPS/15FPS at 800x600/1366x768). Need I mention that both of those games look like garbage at the minimum settings? Mass Effect: 11FPS at 800x600 (unplayable), and Riddick: Dark Athena is about the same (12.6FPS at 800x600).

    If you were to go out and purchase a laptop with a GeForce 9500M or Radeon HD 3650 level GPU, you'd be about three to four times as fast as the HD 3200. Then you can actually play games if you'd like, though you'll also have lower battery life. Considering about 90% of laptop users probably never do anything gaming related outside of Solitaire, I'd say they'd be far more interested in the $580 Intel system that is noticeably faster (30%) in many tasks and ALSO provides 30% more battery life.
    Reply
  • MikosNZ - Thursday, August 06, 2009 - link

    I hate to state the obvious but I see no mention of battery testing or the same battery being used for both laptops in these tests. Lithium ion batteries are notorious for large variances in electrical capacity due to both shelf life and batch. This could easily account for 20-30% PLUS variance. I have seen numbers easily around that when doing battery testing for commercial laptops. These tests are completely worthless unless the same battery was swapped between the two or the batteries used were separately tested to ensure equivalence. I would definitely not rely on gateway to QA their batteries to a level required for accurate platform testing. Reply
  • MikosNZ - Thursday, August 06, 2009 - link

    Actually on second read there does seem to be some implicit references to this. Was this actually done after all? Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    Nope, but if it will make you feel better I'll swap batteries between the two and rerun one of the tests to check results. I'd wager less than a 2% difference but I could be wrong.... Check back tomorrow. Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    Swapping the batteries, the results stay essentially the same in the idle battery test. With the original batteries, the Intel system has 20.25% more battery life; after the swap, it was 19.57% more.

    FWIW, both systems scored slightly lower after the battery swap - 230 vs. 242 on the AMD and 275 vs. 291 on the Intel system. All settings remained the same, but I started the tests right after the batteries hit 100% charge coming back up from 88% (they were sleeping and unplugged much of yesterday). If necessary, I can fully charge them and rerun the test, but a margin of error of around 2-3% is normal in battery tests when you don't short the discharge/charge cycle, and not fully discharging a battery will frequently result in less battery life on the next cycle.
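For reference, the quoted percentages follow directly from the raw minutes; a quick sanity check, nothing more:

```python
# Intel's battery-life advantage expressed as a percentage of the AMD result.
def advantage_pct(intel_minutes, amd_minutes):
    return (intel_minutes / amd_minutes - 1) * 100

print(round(advantage_pct(291, 242), 2))  # 20.25 (original batteries)
print(round(advantage_pct(275, 230), 2))  # 19.57 (after the swap)
```

The half-point shift between runs is well inside the stated 2-3% margin of error for battery testing.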
    Reply
  • lyeoh - Friday, August 07, 2009 - link

    Thanks for doing that.

    While the result was not surprising, it was good to eliminate that.

    Different batches of batteries have been known to perform differently (some even explosively ;) ).

    It might be the AMD laptop's graphics system that's the power hog.

    But whatever it is, it doesn't look good for AMD.

    FWIW, some Macbooks seem to have rather good battery life, I wonder if they do any special tricks that the windows laptops don't.
    Reply
  • JarredWalton - Friday, August 07, 2009 - link

    They say MacBooks do some power saving "tricks" that don't use the standard ACPI interface or something. They also have better control over the OS/hardware, and OS X looks like it manages to stay in low-power C-states much better than any Windows OS. Reply
  • Jeffk464 - Thursday, August 06, 2009 - link

    Is AMD coming up with anything to compete with Intel's new ultra-low-voltage CPUs? I'm thinking about going this route for my next laptop; coupled with an LED-backlit display, these systems are getting unreal battery life. Check out the Acer Timeline series; the only thing is they're kind of slow getting the Core 2 Duo out. Reply
  • Zagor Tenay - Wednesday, August 05, 2009 - link

    Battery life should not be a concern at all for most laptop users, because most people use their laptops at home. Their laptops are treated like a desktop, power cable attached at all times. (They say they prefer a laptop because it uses less space, there is no cable clutter, etc.) So, what is the point in paying a price premium for an Intel laptop because it has longer battery life? The answer is plain and simple: there is no reason. Reply
  • zsdersw - Thursday, August 06, 2009 - link

    "Battery life should not be a concern at all for most laptop users, because most people use their laptops at home"

    Apparently you've never heard of the business world.
    Reply
  • anactoraaron - Thursday, August 06, 2009 - link

    "So, what is the point in paying a higher price premium for an Intel laptop, because it has a longer battery life?"

    For me, I needed a laptop that I can take with me anywhere (out of state, etc.) and be able to run Photoshop CS3 and run it well. It is common knowledge that Photoshop favors Intel's Core architecture (both dual and quad), and from my own experience on mobile platforms Intel beats AMD by 100-300% depending on what you are doing. Would I pay more for a 300% improvement? Uh... yes.

    I also would like to say that the current Photoshop benchmark is nice, but it should be treated like any other benchmark, meaning real-life usage may differ. One of the things I do a lot in Photoshop is make contact sheets (used as proof sheets, flattened and saved as PDFs), which are VERY CPU intensive. Try crunching 300 images at 54 images/sheet, making 6 sheets, and time the results. This type of benchmark will run for up to 15 minutes on an older Core 2 system. My Q8400 does 1 sheet per minute, which I still think is fast. Just a suggestion.
    Reply
  • JarredWalton - Wednesday, August 05, 2009 - link

    Most people don't buy a laptop because it's smaller in my experience; they buy them because they're mobile. Anyway, even if you don't want to have better battery life, why pay the price premium for Intel? Well, 30% better performance is something useful I'd think... unless you plan to play lots of games with the IGP. Reply
  • medi01 - Thursday, August 06, 2009 - link

    Nope. At least not me. I have 2 laptops and 1 PC at home. All 3 stay at home all the time. Reply
  • strikeback03 - Thursday, August 06, 2009 - link

    Mobile doesn't necessarily mean it leaves the house, but you can more easily move a laptop from the office to the living room to the kitchen table to the hammock outside if needed. In this class of notebook (15.6") mobility might be less of a concern, but the platform differences should hold up through all segments, including ones where battery life is a larger factor in which unit to purchase. Reply
  • Jamahl - Wednesday, August 05, 2009 - link

    A few benchmarks of gaming fps would have been nice, if only to show that the Intel is awful and the AMD is capable. Reply
  • pjtomtai - Thursday, August 06, 2009 - link

    Who games on an IGP? There ARE laptops with discrete graphics for that purpose. 80% of users don't game, while 80% depend on a solid laptop with longer battery life for work/life on the move. Reply
  • xrodney - Thursday, August 06, 2009 - link

    Well, I have an expensive gaming machine, but I still play games on the notebook from time to time. My gaming machine isn't always available for gaming (rendering, etc.), so I use the notebook instead, or I'm not near it at all. And from my personal experience, many people around me are doing the same. Reply
  • JarredWalton - Thursday, August 06, 2009 - link

    I've run all the benchmarks; I'm just writing the main article at this point and it should be done for Friday. In terms of gaming, as I mentioned in this piece, the AMD solution is substantially faster - anywhere from 50% to 200% faster, with the average being around 125% faster. (The 50% comes from Empire: Total War, incidentally.) To be honest, I was actually surprised at the number of games the Intel IGP could manage to *load* - last time I looked at that on a GMA IGP there were only a few modern games that would try to load at all. LOL Reply
  • Jeffk464 - Thursday, August 06, 2009 - link

    pshh, games belong on desktops. Reply
  • snakeoil - Wednesday, August 05, 2009 - link

    so this is another 'sort of review'? come on, this always happens when they do an amd review.
    after the 'sort of review' they come with an update where they say that they committed a 'mistake' and the amd product has a problem, so they think the intel hardware wins.
    anyway.
    Reply
  • doncerdo - Wednesday, August 05, 2009 - link

    If you pit a C2D-architecture CPU against an Athlon-architecture chip, the results are more than obvious, and writing this article is a waste of time. It is the same as testing a P4 notebook against an Athlon notebook for power consumption; with no testing one can already guess the results. Wasn't the recent Turion lineup, with split power planes and a core redesign, a major reason for improved power consumption on the AMD platform? If so, why test previous-generation CPUs?

    Another reason not to like these previews: at AT, emotions get in the way of objective testing... we already had a retraction this morning with the 785G article. I really prefer full articles instead of these "beta" (best way I can describe it) writeups.
    Reply
  • Exar3342 - Thursday, August 06, 2009 - link

    AMD has ignored the laptop market for a long time and has never really engineered a mobile-only CPU with the same focus as Intel. I don't think anyone would dispute that AMD's mobile GPUs are better than Intel's, but when it comes down to standard laptop usage, Intel beats AMD hands down. Intel has better power management and a whole suite of ULV processors that kick the pants off AMD's offerings. AMD needs to get its head out of its a@@ and realize the mobile market is HUGE, and they are neglecting it. Reply
  • thurston - Wednesday, August 05, 2009 - link

    I agree, I don't care for the preview articles either, but it got me to read Anandtech tonight, so I suppose it works for them. Reply
  • coldpower27 - Wednesday, August 05, 2009 - link

    Actually, AMD uses Athlon 64 X2 branding for its mobile line... take a look here:

    http://products.amd.com/en-us/NotebookCPUDetail.as...

    http://products.amd.com/en-us/NotebookCPUDetail.as...
    Reply
  • doncerdo - Wednesday, August 05, 2009 - link

    I thought mentioning the different power planes would make you realize we are talking about the Turion Ultra: those CPUs with the Lion core, for example, that start with the RM designation and are part of the Puma platform. So my post stands, based on the fact that at least these CPUs were redesigned with mobility in mind and were not simply cherry-picked low-TDP parts. Reply
  • JarredWalton - Wednesday, August 05, 2009 - link

    They may have split power planes (didn't AMD already have some issues with that and CnQ on the desktop?), but they're still by and large the same K8 dual-core CPU that we've had for years. As for the "Lion" CPUs, you might want to recheck your information. AFAIK, the Athlon X2 QL-64 is a "Lion" processor, just like the Turion X2 RM and ZM lines. (Not that Wikipedia (http://en.wikipedia.org/wiki/List_of_AMD_Turion_mi...) hasn't been wrong in the past....) Reply
  • mczak - Wednesday, August 05, 2009 - link

    The Turion Ultra (with the Lion core) should have a bit better battery life (despite the same TDP) than the mobile Athlon X2 (also a Lion core), since the former seems to support (as already mentioned) more power states, deeper sleep, a lower minimum P-state, and an adjustable HT frequency. Well, maybe. It should also be slightly faster (because of twice the cache). I'd say all of that would probably only make a marginal difference, however.
    As for 45nm parts, Tigris/Caspian should appear later this year; Caspian just looks like a mobile Regor to me. The chipset is rather unexciting (RS880M), but I'd guess the CPUs (Caspian) could have a bit higher frequency at the same power draw as Lion (or alternatively the same frequency at lower power draw), which together with the improved IPC of K10 could be enough to make them competitive with lower-end Core 2 Duo offerings. Not sure why it's not released yet, actually; it really looks to me like AMD desperately needs this new platform, and all the ingredients already exist as desktop parts.
    Reply
  • JarredWalton - Wednesday, August 05, 2009 - link

    The real problem is that on mobile systems, AMD just doesn't appear to have any answer to Core 2. On desktops, there's Phenom II at least, but that's the size of Core i7 so it's no surprise they aren't cramming that into a laptop. AMD's CPU names are also confusing; Turion X2 is higher spec than Athlon X2 on mobile systems, but there's a ton of overlap and unfortunately this is the closest we could get. Still, the only sub-30W CPUs listed at AMD are the Athlon X2 QL-62 (25W) and the Athlon X2 TK-42 (20W).

    If you look at the complete list (http://products.amd.com/en-us/NotebookCPUResult.as...), all the other dual-core AMD CPUs are 35W TDP except:

    Turion X2 ZM-80 (32W)
    Turion X2 RM-70 (31W)
    Turion X2 TL-56 (31W/33W)
    Turion X2 TL-58 (31W)

    The most attractive based on specs would have to be the QL-62, considering the 25W rating vs. 31-35W, so despite the higher-performance designs in most instances I'm not sure the Turion line is universally better. If I can get a ZM-80, I'll try to install it in the NV52 and see what it does for power/battery results. Even if it can improve battery life, the T6500 is still about 30-35% faster, and the T6500 isn't even remotely high-end.
    Reply
  • K6III - Thursday, August 06, 2009 - link

    It seems that bringing the Athlon II to the mobile market makes a lot of sense. 45nm and smaller core than existing Athlon64 X2, many of the same performance enhancements as PhenomII, and potentially similar/lower power draw compared to Core2. Reply
  • blackshard - Wednesday, August 05, 2009 - link

    Hi Jarred, I sent you a mail with some findings related to 65nm AMD processor parts.

    BTW, reading the datasheets, there are no newer Athlons/Turions with TDP below 31W. QL parts are all 35W parts according to AMD document #43373 ("Power and Thermal Data Sheets for Notebooks"), and they are worse than the RM and ZM parts at power management (2 power states vs. 3 power states, and 2 power planes vs. 3 power planes on ZMs).

    AMD's site is often not updated or reports bad information.
    Reply
  • jonup - Wednesday, August 05, 2009 - link

    Jarred, what are the GPU power settings? I have an older TK-57 with the RS690 (X1200), and the GPU throttles down to 20MHz on battery without load. Are the results different under different GPU power settings?
    Thanks!
    Reply
  • JarredWalton - Wednesday, August 05, 2009 - link

    The GPU setting is "Maximum Battery Life" for battery power, but I did not get any explicit readout of clock speed using the utilities I tried. I'm not sure what the minimum clock speed is, but the max (default) speed appears to be 500MHz. Reply
  • jonup - Thursday, August 06, 2009 - link

    Thanks, buddy! I used GPU-Z to get the clock from my laptop (RS690M). However, this utility does not give me a reading for my desktop's RS780. It must be something with the 780G. This chip has been out for a while, and they should have had support for it by now. It has to be a hardware limitation. Reply
  • vol7ron - Wednesday, August 05, 2009 - link

    expected. curious if that's a result of the gpu. Reply
  • Sazar - Thursday, August 06, 2009 - link

    It's a combination I am sure.

    The lower TDP on the Intel processors is probably the biggest contributor to Intel's numbers, but the graphics likely play a part too.
    Reply
  • atlmann10 - Wednesday, August 05, 2009 - link

    Yes; I was wondering that myself when I saw the overhead keyboard comparisons and noticed the ATI emblem. One of the things that's really hit me (I am still running an overclocked X2) is the performance difference. It may not matter much until you hit the higher Core 2 and, of course, the i7 line, but it is still there and enough to notice. I think one of the things that is really going to hurt AMD is if they don't get their process smaller, and do so fast.

    This shows up much more readily on a mobile platform. Not to mention Intel may use energy efficiency as a big selling point on the corporate side of things (i.e., servers); AMD is going to lose that rather quickly if they don't catch up, and doubly so in the mobile arena. While AMD uses more power-efficient memory on the graphics side, the GPU die is still bigger than Intel's, so the Intel mobile product has a dual advantage: not only is the CPU considerably smaller and therefore more power efficient, it is also built on a process node half the size.

    So on a mobile platform, I am actually surprised they only lose by about a quarter on battery life. They really need to get their research operations in gear.
    Reply
  • cocoviper - Thursday, August 06, 2009 - link

    Yeah, I think the ATI GPU should be mentioned, because a true apples-to-apples comparison would involve something like a GeForce 9300 in the Intel box.

    That AMD system is definitely going to be more attractive if you plan on playing an occasional game. On the flipside if ALL you do is plan on staying on the desktop running office and web apps, then Intel is definitely the way to go.
    Reply
  • WeaselITB - Wednesday, August 05, 2009 - link

    I'd think so. I'm not able to find specific TDP numbers anywhere, but it sure seems to me that the Radeon HD 3200 will have higher draw than the GMA 4500MHD.

    Yes, yes, integrated chipset, blah blah blah, whatever; I think the point still stands.
    Reply
  • JarredWalton - Wednesday, August 05, 2009 - link

    It would be reasonable for the HD 3200 to use a bit more power than the GMA 4500... but then the AMD part is in a 55nm chipset vs. a 65nm chipset for the Intel part, which should help. The bigger issue is likely the 65nm AMD CPU vs. the 45nm Intel CPU... plus generally better power management on Intel's side as far as I can tell.

    AMD does a good job of turning down CPU speeds and voltages when the system is idle (5.25X multiplier and 0.950V at idle, during testing), and Intel does likewise (6x multiplier and 0.925V at idle). However, power draw at idle favors Intel and under load the difference between the platforms grows even larger, and I'm not even putting any explicit load on the GPU. It shouldn't require a lot of GPU power to run Windows Vista, even with Aero enabled.
    Reply
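    The idle settings Jarred quotes (multiplier and voltage drops) map onto the standard CMOS dynamic-power relation, P ≈ C·V²·f. A quick sketch of the relative savings; the load-state voltage and base clock below are made-up placeholders, not measured figures for either laptop:

    ```python
    # Dynamic CPU power scales roughly as P = C * V^2 * f (CMOS switching power).
    # Comparing an idle state against a load state at fixed capacitance C:

    def relative_dynamic_power(v, f_mhz, ref_v, ref_f_mhz):
        """Idle dynamic power as a fraction of the reference (load) power."""
        return (v / ref_v) ** 2 * (f_mhz / ref_f_mhz)

    # The 0.950 V / 5.25x idle point from the comment, assuming a hypothetical
    # 200 MHz reference clock and a 1.10 V / 2000 MHz load state (placeholders):
    frac = relative_dynamic_power(0.950, 5.25 * 200, 1.10, 2000)
    print(f"Idle dynamic power is roughly {frac:.0%} of load")  # ~39%
    ```

    Leakage power doesn't follow this scaling, which is part of why the measured idle gap between a 65nm and a 45nm part can be larger than the V²·f arithmetic alone suggests.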
  • ltcommanderdata - Wednesday, August 05, 2009 - link

    It's highly unlikely that a TSMC 55nm process has an advantage over Intel's 65nm process other than size.

    http://www.realworldtech.com/page.cfm?ArticleID=RW...

    In fact, in terms of performance, TSMC's originally planned 45nm process would have been required just to match Intel's 65nm process. Of course, TSMC has since skipped 45nm and gone directly to 40nm. Still, Intel's processes tend to be a generation ahead of the more generic processes at TSMC, with preliminary data on TSMC's 32nm process showing it behind Intel's 45nm process in terms of performance. Which of course means that Intel making Arrandale's IGP or Larrabee on 45nm won't be a performance limitation; rather, it's all up to design.
    Reply
  • Alexstarfire - Thursday, August 06, 2009 - link

    This isn't about performance, it's about power usage. I'm pretty sure that a 32nm process is going to generally use less power than a 45nm process regardless of performance or design. I could be wrong though. I don't exactly know all the ins and outs of microprocessors. Reply
  • ltcommanderdata - Thursday, August 06, 2009 - link

    I'm pretty sure in this case performance and power are closely correlated, because you can trade off between the two. Performance in a manufacturing process describes how much current a transistor can drive when it's on at a given voltage, along with how much current it leaks when it's off. Higher on-current (to do work) and lower off-current (leakage) are both better. In the case of the graph, it plots drive voltage over on-current, so a higher on-current results in a lower number. Hence, Intel's 45nm process can deliver a higher on-current at a given voltage than TSMC's 32nm process. This is important for high-performance processors. For low-power processors, it also means Intel can use a lower drive voltage to save power (like undervolting) while still maintaining more drive current than competing processes. Reply
  • imgod2u - Wednesday, August 12, 2009 - link

    This isn't necessarily true. Keep in mind the FET is inherently a non-linear device. There is a limit (especially for digital logic) where you simply can't lower the operating voltage anymore without drastically affecting slew rates to the point of breaking the device.

    This is why multi-Vt processes are available instead of just putting the slower paths on a lower VDD rail. The threshold voltage (and sometimes the gate dielectric thickness, tox) is offered at several levels to balance performance against leakage. It's then up to the designer to choose which Vt device to use in which gate.

    Now, it could be that Intel's 45nm (which uses both a high-k gate dielectric and gate replacement) is simply superior to TSMC's 32nm all around. I've not looked at the data. But higher performance usually comes at the expense of the low-power capabilities of the process itself.
    Reply
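    The multi-Vt tradeoff described above can be sketched with two textbook approximations: the alpha-power delay model and the exponential subthreshold-leakage dependence on Vt. All constants here (alpha, the subthreshold swing, the Vt values) are illustrative placeholders, not data for any real process:

    ```python
    # Alpha-power law: gate delay ~ Vdd / (Vdd - Vt)^alpha (alpha ~1.3 for short
    # channels).  Subthreshold leakage drops about one decade per "S" mV of
    # extra Vt (S ~70-100 mV/decade).  Both models and all numbers are toys.

    def gate_delay(vdd, vt, alpha=1.3):
        """Relative gate delay at supply vdd for threshold vt."""
        return vdd / (vdd - vt) ** alpha

    def off_leakage(vt, s_mv=90.0):
        """Relative off-state leakage current for threshold vt."""
        return 10 ** (-vt * 1000.0 / s_mv)

    for vt in (0.25, 0.35, 0.45):
        print(f"Vt={vt:.2f} V  delay={gate_delay(1.0, vt):.2f}  "
              f"leakage={off_leakage(vt):.1e}")
    ```

    This is the knob multi-Vt libraries expose: low-Vt cells on the critical paths for speed, high-Vt cells everywhere else to cut leakage.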
  • Pneumothorax - Thursday, August 06, 2009 - link

    Smaller process does not always equal lower power usage. Don't you remember Prescott (the first 90nm P4, aka "PressHot")? It used more power than the preceding generation due to leakage and design. Reply
  • ltcommanderdata - Thursday, August 06, 2009 - link

    Well of course process isn't the only factor, but I mentioned it since Jarred specifically mentioned that TSMC's 55nm process might help provide an advantage over Intel's 65nm process. In this case, it looks like Intel's 65nm process would have been competitive against TSMC's canceled 45nm process, so TSMC's 55nm process would be behind. Reply
  • DigitalFreak - Wednesday, August 05, 2009 - link

    Can't say that I'm at all surprised, for exactly the reasons you mentioned. Reply
  • Voldenuit - Saturday, August 08, 2009 - link

    Agreed. It's been conventional wisdom for a long time now that Intel laptops outlast AMD laptops, but it's always good to verify (and check up on) the claim.

    Next: how about OS contributions to battery life?

    I'd love to see battery life on the same notebook in WinXP vs. Vista vs. Win7. If AT could throw in OS X via Hackintosh, that would be even more interesting, since you guys have found that OS X beats Windows on battery life on Apple hardware; it would be interesting to ascertain whether that is due solely to the OS or to BIOS/firmware/other low-level stuff.

    I just got back from an intercontinental flight and was able to eke out close to 9 hours on my ThinkPad X300, using the 6-cell battery and the 3-cell Ultrabay battery (albeit with WiFi off; mainly reading and some minor document editing).
    Reply
  • JarredWalton - Saturday, August 08, 2009 - link

    The one time I looked into XP vs. Vista battery life, it actually wasn't substantially different on the same laptop (within a few percent, which is margin of error on battery tests). I may try to do a WinXP vs. Vista vs. Win7 comparison in the near term, but OSX is sort of difficult to do properly with a Hackintosh. I've heard Apple plays with the CPU voltages a bit more using some direct access to the hardware, but honestly I'm not sure Windows PCs ever fully enter deep C-states.

    Stay tuned... full review is still pending, and then I'll see about playing with other OSes.
    Reply
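    As an aside on verifying C-state behavior: on a modern Linux system (not the Windows setups discussed here, so purely an illustrative analogue) the kernel exposes per-state idle residency via sysfs, which makes it easy to see whether deep states are actually being reached:

    ```python
    import glob
    import os

    # Read per-state idle residency from Linux's cpuidle sysfs interface.
    # The paths are the standard kernel interface; whether deep states
    # (C3/C6/...) show up depends on the hardware and firmware.

    def cstate_residency(base="/sys/devices/system/cpu/cpu0/cpuidle"):
        """Return {state_name: microseconds spent in that state}."""
        result = {}
        for state_dir in sorted(glob.glob(os.path.join(base, "state*"))):
            with open(os.path.join(state_dir, "name")) as f:
                name = f.read().strip()
            with open(os.path.join(state_dir, "time")) as f:
                usecs = int(f.read().strip())
            result[name] = usecs
        return result

    if os.path.isdir("/sys/devices/system/cpu/cpu0/cpuidle"):
        for name, usecs in cstate_residency().items():
            print(f"{name:>8}: {usecs / 1e6:.1f} s")
    ```

    Long residency in the deepest listed state at idle is the behavior you want; a CPU stuck in C0/C1 will burn noticeably more battery doing nothing.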
  • Voldenuit - Tuesday, August 11, 2009 - link

    Neat. Looking forward to it.

    Course, I'm one of those users who "cheat" with undervolting and forcing C-states with RMClock (or NHC, depending on my fancy). ^_^.
    Reply
