
  • TouchPadKing - Saturday, March 30, 2013 - link

    I wonder how long the battery life on this and the Surface Pro would be if they'd used Intel's new 10-13 W TDP chips...
  • Flunk - Saturday, March 30, 2013 - link

    I don't think they'd be much better. Intel's "10-13" watt ratings are based on a new rating system that's more marketing than technology.
  • TouchPadKing - Saturday, March 30, 2013 - link

    Their SDP for the 10-13 W TDP processors is 7 W.
  • jeffkibuule - Saturday, March 30, 2013 - link

    That's not really the problem. When I was doing some crude battery life testing on the Surface Pro, with the system idle and drawing the least power reasonably possible (minimum brightness, idle CPU, but WiFi on), it lasted only 8 hours and 13 minutes. Compare that to an iPad 4 with a similar battery at 50% brightness, WiFi on, and typical web surfing/app use, which gets 10 hours, and you see that Ivy Bridge chews up too much power at idle, where ideally a system should spend most of its time after Turbo Boost kicks in to finish a task. From what I understand, Haswell's real power efficiencies come from far better regulation of power consumption, through a combination of on-chip voltage regulators as well as modifications to the firmware of non-Intel chips on the motherboard, to get Intel's claimed 20-fold reduction in idle power consumption (it's a prominent point in an Intel PowerPoint from IDF last year).

    With all of these changes, Intel's Haswell Ultrabook certification bumps the tested battery life requirement from 5 hours with SB/IVB to 9 hours with Haswell. The proof will be in the pudding, so to speak.
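Out of curiosity, the implied average platform power can be back-calculated from those runtimes, assuming round-number pack capacities (roughly 42 Wh for the Surface Pro and 43 Wh for the iPad 4; the capacities are my assumption, not figures from the tests above):

```python
# Rough average platform power: pack energy (Wh) divided by runtime (h).
# The capacities below are assumed round numbers, not measurements.
SURFACE_PRO_WH = 42.0  # assumed Surface Pro pack capacity
IPAD_4_WH = 43.0       # assumed iPad 4 pack capacity

surface_runtime_h = 8 + 13 / 60  # 8 h 13 min at idle
ipad_runtime_h = 10.0            # typical-use figure cited above

surface_avg_w = SURFACE_PRO_WH / surface_runtime_h
ipad_avg_w = IPAD_4_WH / ipad_runtime_h

print(f"Surface Pro idle draw: {surface_avg_w:.1f} W")  # ~5.1 W
print(f"iPad 4 typical draw:   {ipad_avg_w:.1f} W")     # ~4.3 W
```

The striking part is that the Surface Pro number is an idle floor, while the iPad number already includes active use.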
  • This Guy - Sunday, March 31, 2013 - link

    What's the price of eggs in China?

    Four hours improvement in idle time! I think I'm gonna have to upgrade the old AMD c-50 tablet soon :)
  • Nexing - Tuesday, April 02, 2013 - link

    Being April, I guess the news may already be out. The Haswell family's real power efficiency comes from Intel (finally) switching the chipset lithography from 90nm to 22nm... Adding on-board voltage regulation and deeper idle management gets them figures nearing 70% better battery life on mainly CPU-oriented tests.
  • UpSpin - Saturday, March 30, 2013 - link

    There won't be a significant difference at all. The main power consumer while gaming is NOT the CPU but the GPU.
  • TerdFerguson - Sunday, March 31, 2013 - link

    The battery life is on par with the rest of the mobile gaming experience - awful. Even when selecting parts with the same names (video card model numbers, processor families) or specifications (HD RPM speeds), "gaming" laptops are horribly overpriced and underpowered. Every single vendor seems to feel justified selling subpar parts at premium prices when said parts are destined for mobile platforms. It's no great shock, then, that the gaming experience falls short.
  • flyingpants1 - Sunday, March 31, 2013 - link

    Lenovo Y580 is available for as low as $800 and it rules.
  • designerfx - Monday, April 01, 2013 - link

    $800 for a subpar experience. The point is that nothing today delivers a great experience, and Terd is quite correct about that.
  • extide - Monday, April 01, 2013 - link

    That's not true; you can get a very good gaming experience on a Clevo P150EM laptop with a 680M.
  • flyingpants1 - Tuesday, April 02, 2013 - link

    Normally I'd agree, but I think the Y580 is an exception. It's based on a quad-core i7 with a GTX 660M, 6GB of RAM, and a 750GB drive. It will play BF3 or whatever you like, all day. And it's only $800. That's not really much more than your typical i5/HD Graphics/4GB/500GB machines, which kick around for $600. You're basically adding a dGPU that can handle 90% of the games out there for $100-200.

    I wouldn't call that subpar. Maybe par. Not amazing.. but not subpar, and definitely not "awful". For PC gamers who need a laptop anyway, for college or whatever, it's probably the best product on the market.

    I think the Y580 should have received a lot more attention. The Razer Blade, that $2500 laptop, had equivalent specs at 3x the cost, and AnandTech only published 18 articles about it.
  • Wolfpup - Tuesday, April 02, 2013 - link

    Ridiculous. Any midrange notebook components run today's games well. Any high end ones run today's games REALLY REALLY well.
  • Wolfpup - Tuesday, April 02, 2013 - link

    Ugh, this again? Gaming on notebooks is fantastic. It has ALWAYS been the case that mobile parts cost more and aren't as fast as desktop parts. But so what? They can still be high end, and mid- or high-end notebook components run games FANTASTICALLY. They aren't "underpowered".
  • rootheday - Sunday, March 31, 2013 - link

    Vivek - at GDC Intel announced new drivers, coming in the next week or so, that should improve performance/watt. Perhaps you could retest the ultrabook when they come out to see whether it results in better battery life OR better performance at the same battery life?
  • marc1000 - Monday, April 01, 2013 - link

    Hi Vivek. What do you think about turning on VSYNC on these portable devices? Since the GPU would have less "wasted" work to do (locking the max FPS to 60), it should have a reasonable impact on power consumption. Would it be significant or too small?
  • extide - Monday, April 01, 2013 - link

    I was thinking about this myself too. It would certainly help the ones with faster GPUs and, in a way, put them on a more level playing field, as they would all be doing the same amount of work, as opposed to saddling the faster GPUs with more.
  • Death666Angel - Monday, April 01, 2013 - link

    Two of the games tested had below-60 FPS performance, one had above-60 FPS performance, and four sat at about 60 FPS. If the game behaves well with V-Sync enabled (read: lag is not too high) and it is a high performer, then you can gain a few extra minutes by enabling it. Anything recent, though, and you won't even hit the 60 FPS necessary for V-Sync to help at all. Also, I have no idea about the Optimus capabilities of the Edge (I posted a comment about this in the review), but if it has Optimus, then in those high-performance cases disabling the dGPU and using the iGPU for easy games might yield more battery life.
  • marc1000 - Tuesday, April 02, 2013 - link

    Nice point about using the iGPU for light games. But V-Sync will drop the framerate even in games where it is already below 60 FPS, because it makes the GPU match the display refresh rate: if a frame is ready a fraction of a second after a refresh happens, it has to wait for the next refresh to be displayed, and the GPU does less work during that wait. Anyway, I don't know if it would make enough of a difference, but I think it's worth a shot to see.
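The power-saving mechanism being discussed amounts to a frame cap: render one frame per refresh interval, then idle through the leftover budget. A minimal sketch of the idea (a hypothetical illustration only; real V-Sync is handled in the driver and display pipeline, not with sleeps):

```python
import time

def frame_budget_s(refresh_hz=60):
    # Time available per frame at a given refresh rate (~16.7 ms at 60 Hz).
    return 1.0 / refresh_hz

def run_capped(render, frames, refresh_hz=60):
    # Software frame cap: render, then sleep away the rest of the budget
    # so the GPU idles instead of racing ahead of the display.
    budget = frame_budget_s(refresh_hz)
    for _ in range(frames):
        start = time.perf_counter()
        render()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```

With a fast GPU the idle slice dominates each frame, which is why capping saves power; with a slow GPU there is no leftover at all, matching the point above that V-Sync can't help a game already below 60 FPS.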
  • VivekGowri - Wednesday, April 03, 2013 - link

    I think the review should have covered that part (or I thought it did, at least), but yes, the Edge has Optimus, like pretty much everything else Intel+Nvidia over the last couple of years. I never really tried turning off the dGPU in games, though that would definitely help with older games where framerates wouldn't tank.
  • marc1000 - Thursday, April 04, 2013 - link

    Thanks for answering our comments, Vivek! Regards,
  • jasonelmore - Tuesday, April 02, 2013 - link

    I think you left out a key part of the future of mobile gaming: battery technology. We are constantly asking vendors to make more power-efficient chips and displays, but we always seem to forget about the battery manufacturers.

    They need to push the envelope and create new, more efficient batteries. Imagine if it were like chips, and we saw a 30% increase in performance at the same wattage every year. Year after year those gains would stack up; in three years' time you're looking at well over a 100% increase. I just want to see battery technology get better, yet so many companies seem to focus only on component power consumption. Naturally chips will use less power over time, but the gap needs to be closed from both ends.
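A steady 30% yearly gain actually compounds past a simple doubling by year three; a quick check of the arithmetic:

```python
# Compounding a hypothetical 30%-per-year efficiency gain.
rate = 0.30
for year in (1, 2, 3):
    factor = (1 + rate) ** year
    print(f"after {year} year(s): {factor:.2f}x")
# After 3 years: 1.3^3 = 2.197x, i.e. roughly a 120% improvement.
```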
  • Wolfpup - Tuesday, April 02, 2013 - link

    Good point. What happened to supercapacitors? What happened to those new lithium-ion batteries that supposedly charge in minutes and have orders of magnitude more recharge cycles? They get talked about and then... we're just back to lame old regular lithium.
  • protomech - Tuesday, April 02, 2013 - link

    Moore's Law isn't broadly applicable to anything we'd like, just because it happens to be associated with technology. Batteries have improved substantially, but the process has been gradual.

    10 years ago I bought a Powerbook G4, the first aluminum model. 46 Wh battery, weighed about a pound as I recall. Real battery life was around 4.5 hours, so average power consumption was around 10 watts.

    Now you can buy a Retina Macbook Pro 15", 85 Wh battery, probably also weighs about a pound. Real battery life is around 7 hours, so average power consumption is around 12 watts.

    So even with the huge improvements in power efficiency, Apple (as an example) has an average design power that's higher than it was 10 years ago... because the batteries have improved (and more of the chassis interior is dedicated to them) to allow it.

    If Apple integrated the iPad 4's SOC (> 2x as fast as PB G4 btw) and display into a notebook chassis (as Google is wont to do) and repurposed the rMBP battery, then you'd probably see 16-18 hours of battery life. Presumably they're relatively happy with the rMBP battery life, given other design constraints.
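Running the numbers quoted above (pack capacity divided by runtime) confirms the comparison:

```python
def avg_power_w(capacity_wh, runtime_h):
    # Average draw over a full discharge: energy / time.
    return capacity_wh / runtime_h

powerbook_g4 = avg_power_w(46, 4.5)  # ~10.2 W
rmbp_15 = avg_power_w(85, 7)         # ~12.1 W
print(f"PowerBook G4: {powerbook_g4:.1f} W")
print(f"Retina MBP 15: {rmbp_15:.1f} W")
```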
  • Tams80 - Thursday, April 04, 2013 - link

    I really wouldn't be surprised if manufacturers of battery technology were trying their hardest to make more efficient batteries.

    If anything, the development of components is the anomaly, not battery development. Moore's Law is likely the exception, not the rule, and therefore holding other technological development to its speed is rather unfair.

    Not that it isn't worth trying, of course; if no one tries, we see slower development!
  • Nexing - Tuesday, April 02, 2013 - link

    An important consideration is that lithium-ion batteries have a longer life when they are recharged before dropping below 20-30% charge. There's no memory effect with this type, as opposed to Ni-MH or Ni-Cd ones, which need to be depleted before recharging in order to attain a longer life.

    Few people seem to know this, and it is seldom mentioned by the specialized press, let alone the consumer channels.
    Since with Haswell we'll really start talking about "all-day" or "day's work" battery life, it seems appropriate to bring this usage practice to the table (and into the equations).
  • Nexing - Tuesday, April 02, 2013 - link

    Also, I understand these tests were run down to the depletion point, usually set at a critical level of 7-8%; this parameter might be changed to factor long-term battery health into this article's figures.
  • TrackSmart - Tuesday, April 02, 2013 - link

    Thank you for this clarifying post. I agree with the author's comment about the Edge being "hard to knock for battery life" when comparing against similar hardware. But this product does not exist in a vacuum. There are other platforms for mobile gaming (outside of the PC realm) that deliver an enjoyable experience while providing considerably longer battery life and better portability.

    While the Edge is a technological accomplishment, I really question where this device fits in the grand scheme of mobile gaming. If you want to play your favorite PC games on a trip, your airport time plus flight time had better be under 2.5 hours. That's not true portability.

    I'd rather see game makers make higher-quality games for low-power mobile platforms. I don't need bleeding edge graphics to enjoy a game. The graphical fidelity that today's smartphones can provide is "good enough" for enjoyment while on the go. The problem is a lack of quality games and quality controls (i.e. non-touchscreen controls). I don't think it will be long before both of those problems get solved.
  • VivekGowri - Wednesday, April 03, 2013 - link

    Well, it's not like you're going to be playing PC games on a competing Android or iOS platform. I'm not even going to offer the iPad or Nexus tablets as real alternatives here simply because the gaming experience sans any physical control absolutely pales in comparison. The Edge was never about competing with ARM-based tablets though, it was about taking real PC gaming into an ultraportable space. And let's be real, I'd rather play 2.5 hours on the Edge than 3.5 hours on the PS Vita.
  • TrackSmart - Wednesday, April 03, 2013 - link

    Vivek, we are in complete agreement on that last point. I, too, would take 2.5 hours on the Edge over any of the touch-screen enabled games on Android or iOS.

    My comment is not a critique of your article, but a critique of the niche nature of this product. I'm not sure that Razer's gamble here is the optimal path given current (and near future) hardware constraints. Razer has done a good job of meeting their goal of putting PC-class hardware into a portable system, but I think the way forward is to better match high quality games with hardware that fits into a truly portable power envelope.
  • damianrobertjones - Friday, April 05, 2013 - link

    It would be interesting to run the same tests on an HP 8470p with the extended BB09 and CC09 batteries. That machine can play media for over 17 hours, at which point I gave up testing.

