Original Link: http://www.anandtech.com/show/6351/the-new-razer-blade-thoroughly-reviewed
The New Razer Blade: Thoroughly Reviewed — by Vivek Gowri on October 3, 2012 5:40 PM EST
Introducing the new Razer Blade
Earlier this year, Razer handed me a Blade evaluation unit and told me to go wild. Considering the company had no previous experience with developing PC laptop hardware, I was skeptical about their ability to transition from a manufacturer of gaming peripherals to a manufacturer of gaming systems. Turns out my concerns were unfounded, as the Blade turned out to be a well-designed, high quality notebook. Granted, the dual-core Core i7 and NVIDIA GT 555M graphics chip were more suited for a high-end 14” system rather than a fully fledged 17” gaming notebook, while the $2799 asking price bordered on obscene, but the Blade was a polished piece of engineering that showed that Razer was capable of producing premium grade hardware.
Fast forward eight months and Razer dropped off the new Blade at my doorstep. (They’re a pretty trusting bunch.) The new Blade looks quite similar to the old Blade, but there are some key differences under the hood. Most obvious is the significant upgrade in computing prowess, with the CPU/GPU combination being kicked up to a quad-core Ivy Bridge chip and NVIDIA’s GTX 660M graphics, but also a redesigned cooling system and a much more stable software backend to the Switchblade LCD trackpad. It’s also gotten a price drop to $2499. Do the improvements make the Blade more competitive with the gaming notebook establishment?
I absolutely enjoyed my time with the original Blade. I used it as my primary portable for a lot longer than I expected to, due to the fact that it weighed the same as the average 15” notebook while looking great and being blazing fast in day to day usage. The combination of the 2.8GHz i7-2640M and the Marvell-based LiteOn 256GB SSD proved to be exceptionally responsive in the real world, resulting in one of the quickest boot times I’ve measured (15.8 seconds). Unfortunately, at $2799, we expected more. A dual-core i7 and a GT 555M simply did not cut it, not compared to less expensive gaming systems that offered quad-core processors and far more powerful graphics cards, notebooks like the ASUS G74SX, Alienware’s M17x, and the Clevo P170. Having specifications that essentially matched the M14x just didn’t cut it at a pricetag approaching $3000.
And it wasn’t just the internal hardware that gave us pause—one of the key selling points of the Blade, the Switchblade UI, was an interesting concept saddled with inherently unstable software. When it worked, Switchblade was fun, a novel idea that could wow your friends and be useful in very specific scenarios. But it needed more utility, and above all else, more robust drivers and software.
So with this updated Blade (referred to internally as the Blade R2), Razer went about fixing the issues that were brought up. It wasn’t just the major stuff though; Razer’s CEO Min-Liang Tan told me that they combed through each and every single review and looked at every concern mentioned. This went to the level of minutiae—the click sound of the trackpad buttons as well as the backlighting of the secondary functions in the Fn keys were apparently things that the design team had rethought simply because I pointed them out. After the new Blade was announced in Seattle, I had a lengthy discussion with Min about whether the trackpad buttons should have been matte or glossy plastic. It’s very rare that you see that kind of attention to detail, particularly at the chief executive level, so it’s nice to see how connected Razer is as a company and how serious they are about their PC business.
|Razer Blade (late 2012) Specifications|
|Processor||Intel Core i7-3632QM (4x2.2GHz + HTT, Turbo to 3.2GHz, 22nm, 6MB L3, 35W)|
|Integrated Graphics||Intel HD 4000 (16 EUs, up to 1200MHz)|
|Discrete Graphics||NVIDIA GeForce GTX 660M 2GB GDDR5 (Optimus) (384 CUDA cores, 875MHz/950MHz core/boost, 2.5GHz memory, 128-bit memory bus)|
|Display||17.3" LED matte 16:9 1080p (AUO B173HW01 V5)|
|Storage||500GB 7200RPM HDD (Hitachi HTS72505) + 64GB Lite-On LMT-64M3M caching SSD (Marvell 88SS9174 flash controller, NVELO DataPlex caching software)|
|Networking||Intel Centrino Advanced-N 6235 802.11a/b/g/n|
|Audio||Realtek ALC275 HD Audio, single combination mic/headphone jack|
|Battery||6-Cell, 60Wh (integrated)|
|Right Side||Kensington lock, AC adaptor port, 3 x USB 3.0|
|Operating System||Windows 7 Home Premium 64-bit SP1|
|Dimensions||16.81" x 10.90" x 0.88" (WxDxH) / 427mm x 277mm x 22.4mm|
|Extras||Ambient light sensor, ten dynamic LCD keys, 4.05" WVGA LCD touchpad (capacitive, multitouch)|
On the spec level, the new Blade stacks up roughly where we thought it would. Near the end of my original Blade review, I suggested that the next iteration would make a significant jump in performance: “The more efficient chips open up a lot of possibilities for Razer due to the thermal design; quad-core CPUs and GTX-caliber graphics wouldn't be out of the realm of imagination.” With the 35 watt quad-core parts introduced with Ivy Bridge, it was a given that we would see one here. As such, Razer went with the i7-3632QM, a new 35W quad-core clocked at 2.2GHz with a max turbo of 3.2GHz.
In addition, the graphics were bumped up to the GTX 660M, a Kepler-based 28nm GPU with a GK107 core, 384 CUDA cores clocked at 875MHz, and 2GB of GDDR5 vRAM. If you’ve been paying attention to NVIDIA’s increasingly convoluted mobile graphics lineup (I won’t blame you if you haven’t), that’s the same GPU as the GT 650M except clocked higher—the GDDR5 variant of the GT 650M is clocked at 735MHz, while the DDR3 version comes with an 850MHz core clock but significantly slower memory—the GDDR5 GT 650M ends up being a fair amount faster than the DDR3 one. The GT 640M and one SKU of GT 640M LE also use the GK107 core and have the same 384 CUDA cores and GDDR5/DDR3 variants, but with even lower clocks (in the 625 to 645MHz range for the GT 640M, and 500MHz for the LE). The other GT 640M LE SKU is a 40nm part that’s essentially rebranded from one of the GT 555M’s many variants. Confused yet? Yeah, that’s what I thought. Thanks NVIDIA, we love you.
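To put those clocks in perspective, a quick back-of-the-envelope comparison of theoretical shader throughput (cores x 2 ops/clock x core clock) is sketched below. The clock figures are the ones quoted above; this deliberately ignores memory bandwidth, which is exactly why the DDR3 GT 650M trails its GDDR5 sibling in practice despite the higher core clock:

```python
# Rough theoretical shader throughput for NVIDIA's GK107-based mobile parts,
# using the clocks cited in the text. This is a napkin estimate only: it
# ignores memory bandwidth, boost behavior, and driver differences.
parts = {
    "GTX 660M":        (384, 875e6),
    "GT 650M (GDDR5)": (384, 735e6),
    "GT 650M (DDR3)":  (384, 850e6),
    "GT 640M":         (384, 635e6),  # midpoint of the 625-645MHz range
    "GT 640M LE":      (384, 500e6),
}

for name, (cores, clock) in parts.items():
    gflops = cores * 2 * clock / 1e9  # 2 FLOPS per CUDA core per clock
    print(f"{name:18s} {gflops:6.1f} GFLOPS")

ratio = 875 / 735
print(f"GTX 660M vs GDDR5 GT 650M core clock: +{(ratio - 1) * 100:.0f}%")
```

That ~19% core clock advantage over the GDDR5 GT 650M lines up neatly with the 15-20% real-world gap that shows up later in the gaming tests.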
Basically, this is all to say that architecturally, there’s nothing serious separating the GTX 660M from the GT 650M found in the new MacBook Pros, Samsung Series 7, and the Alienware M14x. Interestingly, the Retina MacBook Pro has a GDDR5 GT 650M clocked at an aggressive 900MHz, which is actually higher than the base clock of the GTX 660M in the new Blade. So here we go again. This isn’t to fault Razer—they were stuck between a rock and a hard place with this one. NVIDIA’s lineup has shaken out such that the only Kepler-based GTX parts up until now have been the 660M and the 680M; the 680M’s 100W thermal envelope is almost as high as that of the entire Blade system, which ships with a 120W power adapter, while the 40nm Fermi parts (GTX 670M/675M at 75/100W respectively) obviously weren’t realistic either, leaving the 660M as the only viable option. Will we see the GTX 670MX/675MX make it into the Blade sooner rather than later? It's possible, but we'll believe it when we see it.
|Razer Blade (late 2012)||Razer Blade (early 2012)||Alienware M17x R4||Alienware M18x R2||ASUS G75VW||Clevo P170EM|
|CPU||Core i7-3632QM||Core i7-2640M||Core i7-3630QM||Core i7-3630QM||Core i7-3610QM||Core i7-3720QM|
|GPU||GTX 660M||GT 555M||GTX 660M/680M||GTX 680M SLI||GTX 670M||HD 7970M|
Right off the bat, let’s address the cost versus performance question that dogged the original Blade. A comparably configured M17x will run you about $1900, though at $2500 you could get the same M17x with a GTX 680M. GK104 is just on a higher performance plane than any other mobile GPU at the moment, so it’s worth thinking about. The ASUS G75VW can be had in GTX 660M and GTX 670M flavors and rings up at less than $1500 no matter what configuration you’re looking at. Obviously, the Blade and the M17x are more premium products than the ASUS RoG systems, but the Razer is still more expensive than its nearest competitors. It ends up as either a tradeoff between the power of the GTX 680M and the portability of the Blade, or just paying a decent premium for the Blade’s design and Switchblade UI. But now, at the very least, it’s justifiable. The first Blade would have been a difficult value proposition to make even with a 25% price cut.
The other big hardware news is that the 256GB SSD was dumped in favor of a 500GB 7200RPM hard drive paired with a Lite-On 64GB solid state mSATA cache drive. That drive is based on the same Marvell 88SS9174 flash controller as the old Blade's SSD and runs on NVELO’s DataPlex caching firmware. Quite frankly, I’m disappointed. I get that with the size of modern video games, 256GB can be on the tight side, but I’m not sure that dumping fully solid state storage for a cached solution is the best way to go. And even with a hard drive, it’d have made more sense for Razer to go for a drive in the 750GB-1TB range instead of a 500GB drive to make it really worth the switch. Another option, if Razer really wanted to let users have their cake and eat it too? Configure the mSATA drive as a separate storage drive and use either the LMT-128M3M or 256M3M (the 128GB and 256GB variants of the current 64GB cache drive). That lets you have a decent sized SSD for the OS and applications, along with a mechanical drive for games and data storage. If I had my run of the place, I think I’d have two SKUs—one with the 64GB cache paired with a 750GB hard drive, and another one with a 256GB SSD offered as a no-cost option. I’d like to see users be given the choice, basically.
Other details include the switch across the board to USB 3.0 ports (there are three of them, all highlighted in Razer green, with nary a USB 2.0 port in sight) and an updated dual-band Intel wireless card that supports Bluetooth 4.0 and WiFi Direct. The display is the same AUO 1080p panel as before, and it’s one of the few meaningful internal components that has been retained from the original Blade. The exterior, too, looks pretty similar, and that’s a good thing. We loved the original design and it has survived mostly intact, with a bit of additional weight due to the mechanical hard drive and some revised cooling details. The form factor is like nothing else, other than maybe the dearly departed 17” MacBook Pro, and in the world of gaming notebooks, it’s just on a different planet. Look at the chart above, an updated version of the one I had in the original Blade review: half the thickness of the M17x, three and a half pounds lighter than the G75VW. As for performance, we'll look at that in a moment.
Razer Blade (late 2012) - Design Changes
I was a huge fan of the Blade’s design, so I’m perfectly alright with it carrying over mostly untouched. It’s just better looking and better built than the vast majority of other 17” gaming notebooks out there. I’ve used the 17” Ultrabook term before to describe the Blade, and it still applies—it’s got the form factor and design detailing (and the price) that we’ve come to expect from Ultrabook class PCs.
The anodized aluminum unibody is as gorgeous as ever. I spent about a thousand words describing it last time around, and that page of my review perfectly sums up the exterior of the new Blade as well. This is one of the most striking notebook designs to hit the market in recent years. I’ve always loved the detailing on the Blade, from the uniformly green accents (including the USB 3.0 ports) to the two ridges on the back, which interestingly enough, were apparently inspired by the styling of Japanese samurai swords. It’s clean and elegant, but still makes a powerful visual statement. There are few systems out there with the awe factor of the Blade—it’s beautiful and menacing, all at once.
But with that said, there are some changes, predominantly at the bottom of the system. The venting has been changed considerably, with larger vents on the bottom—still the beautiful machined slots with polished aluminum edges and a lighter metal mesh, just with more surface area covered—as well as raised feet in the rear of the system.
This serves two purposes, the obvious one being improved heat dissipation from the bottom of the system due to the presence of significantly more air and airflow underneath. To aid in this, Razer has put in a secondary set of vents on the edge of the underside, next to the raised feet. The other is that adding a space there has allowed Razer to move the CPU heatpipe there, underneath the heatsink. Previously, the copper heatpipe went through the heatsink, impeding airflow, so the change brings about a much freer breathing cooling system.
Every bit of thermal headroom helps with a system as thin as the Blade, and the new thermal design has allowed Razer to add 10-12 watts to the system while still having it run cooler than before. Razer has also changed the fans it used, after complaints from us and others about the noise of the fan when it spooled up, so it’s quieter as well.
The raised feet also give the Blade an ergonomic tilt when set on a flat surface. Of course, it also adds thickness to Razer’s claimed 0.88” height figure, which appears to cover only the body. Including the feet, the Blade is probably closer to 1-1.05” thick.
Razer Blade (late 2012) - Thermal Design
Razer took a system that was already near the brink of its thermal envelope, tossed about 50% more compute power into it, added 12 watts to the power draw, and tried to still keep it within reasonable operating temperatures. So now we get to see if Razer’s engineering team managed to pull it off.
A quick refresher from last time: the Blade was hot. Damn hot. We saw internal temperatures of 95C on the CPU and 80C on the GPU under loaded conditions, numbers that we were simply not that comfortable with. But it wasn’t just at load; this was a system that got relatively toasty even at idle, where we saw temperatures in the 50-60C range. This resulted in a system that ran hot to the touch (though most of this heat was directed away from areas that are commonly touched like the keyboard and palmrests) and constantly had the fan running, even when bouncing around the internet or YouTube. Put simply: not great.
Thankfully, the redesigned cooling system has helped tremendously, particularly at low load. The system now idles in the 37-42C range, significantly lower than the 55ish it used to go for, and it’s very rare to see the fans spool up until you start gaming. I put it through my typical 100% system load, basically using Furmark 1080p and wPrime 1024M looping to peg both CPU and GPU load at 100% for a sustained period of time to see where temperatures settled. wPrime is multithreaded so with 8 threads it's loading all four cores equally. I saw CPU temperature settle in the 85C range, while GPU temperature maxed out at 90C. It’s still pretty hot, but even at a sustained hour-long clip, I never saw throttling—the GPU core was pegged at 950MHz—and the fan itself was much less intrusive than before.
To put this to the test in a real-world gaming situation, I fired up our DiRT 3 benchmark (it’s built into the game) and ran it fifty times in a row. I tested at our Enthusiast setting, which is 1080p, Ultra High quality, and 4xMSAA, and each run, including cut-scenes, totaled about 2 minutes and 20 seconds, give or take ten seconds—it’s not the exact same clip each time, as the AI typically behaves differently, which impacts the race results and elapsed times. That’s essentially two hours of gaming, with a fairly new game running at maximum settings. My performance over time graph ended up being as flat as Wyoming—almost no deviation in performance beyond random test-to-test variation. I ran a similar test on the MacBook Pro (except with Anand’s OS X Half-Life 2 benchmark) and by run 30, the downward trend was pretty clear. I ran that 40 times, but I went even longer here to see if I could establish any kind of pattern. All I got was a really consistent 30.75fps, give or take one. I was impressed, to say the least.
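The consistency check itself is simple enough to sketch. The fps values below are synthetic, generated around the 30.75fps average I recorded, since the point here is the methodology (a throttling system shows a downward trend over successive runs; a healthy one stays flat), not the exact logged numbers:

```python
import random
import statistics

# Hypothetical reconstruction of the fifty-run consistency check.
# Values are synthetic: centered on the 30.75fps average reported in
# the text, with +/-1fps of run-to-run noise.
random.seed(0)
runs = [round(30.75 + random.uniform(-1, 1), 2) for _ in range(50)]

mean = statistics.mean(runs)
spread = max(runs) - min(runs)
print(f"mean: {mean:.2f} fps, spread: {spread:.2f} fps")

# A thermally limited GPU degrades over time; compare halves to check.
first_half = statistics.mean(runs[:25])
second_half = statistics.mean(runs[25:])
print(f"first half: {first_half:.2f} fps, second half: {second_half:.2f} fps")
```

With real logged data, a second-half average meaningfully below the first-half average is the signature of throttling; on the Blade, the two halves were indistinguishable.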
A quick note—I was unequipped to test fan noise, but I can say that even under full load, the new fans are much quieter than the old fans were even under part loading. The new fans seem to be running at a lower RPM as well, which was no doubt helped by the larger venting; it’s a really big improvement from before.
Razer Blade (late 2012) - Switchblade UI
Easily the most significant issue I had with the original Blade’s user experience was the usability of the Switchblade UI system, and other than the overall system value quotient, it was the biggest reason I could point to for not recommending a Blade purchase. By “usability of the Switchblade UI”, I’m not even saying that it was hard to use, just that it never worked reliably enough for me to use on a day-to-day basis. Synapse, Razer’s cloud-driven device settings manager, would crash with regularity, and the entire system had issues coming out of standby mode.
The good thing here is that the Synaptics drivers were pretty solid, so you could still use the touchpad part of it even when Synapse and Switchblade were long gone. Of course, there were also the occasional times when Switchblade would nuke itself completely (to use the technical term for it)—the LCD buttons would turn white and the touch panel would become unresponsive, necessitating a full system reboot. Through my three months with the original Blade, I probably spent two-thirds of my time with the Switchblade part of things not working. I didn’t care most of the time as long as the multitouch trackpad was working, but it was definitely a significant problem that needed to be addressed going forward.
I wasn’t the only one who noticed this, as Razer took a beating in various online notebook owner forums. Thankfully, the company was very proactive about pushing new updates and new content to Switchblade. Based on my interactions with the new Blade and also the DeathStalker Ultimate keyboard over the last month, it’s definitely a night and day difference in terms of stability and usability.
This is a very, very good thing. So is the new SDK that was released by Razer, something that will become an asset as more Switchblade devices hit the market. There are a couple of new applications, including a game timer and a screenshot app. Razer is also working with a number of game developers to add support for a number of popular titles, including Counter-Strike, CS: Source, CS:GO, Team Fortress 2, Battlefield 3, and Star Wars: Knights of the Old Republic at present, with more games currently in validation. And call me shallow, but I find the new ability to set your own wallpaper for the LCD touchpad simply awesome.
But here’s my deal. Even with all of the improvements being made to Switchblade, it’s still a cool concept with only a very specific usage model. Even then, a lot of the experiences are subpar—the browser is still Mobile IE 7 running on an embedded Windows CE kernel, and it absolutely pales in comparison to basically any modern smartphone browsing experience. The Gmail, YouTube, Facebook, and Twitter applications are just the mobile IE 7 experiences, which makes them generally painful to use, but it's also worth noting that YouTube videos played back at a very low frame rate, 20 frames per second or less. Any iPhone or iPod touch (going all the way back to the ARM11-based originals from 2007) will top that. So will the Zune HD or any credible Android/Windows Phone/WebOS device made in the last three years.
It’s an obvious problem—it’s far easier and more convenient to just pull out a smartphone or an iPod to check the internet, email, or social networks. The gaming functions are nice enough, as is the screenshot application, and I suspect we’ll see Switchblade get more useful as developers come aboard and more game-specific applications are launched. Game developers can do some pretty innovative things with Switchblade; we were shown a demo of the Firefall application that let you use the touchscreen as either a virtual analog controller or an environment map, for example.
Switchblade clearly has the potential to enhance gaming experiences, but it’s just as clear that it needs a lot of work in other areas. I enjoy the idea of having a touchscreen input, along with the configurable shortcut keys at the top. The placement, too, is very convenient and is much more natural to use than a traditionally placed touchpad once you get used to it. But with such outdated software experiences powering the core web-based functions of Switchblade, it’s sadly going to remain a novel but not particularly functional sideshow outside of its uses in the gaming realm.
If you look solely at the keyboard and mouse, things look pretty solid. The touchscreen trackpad, like I said above, is great to use; Synaptics' multitouch drivers are seamless and offer a wide range of gesture support. Razer has improved the touchpad buttons to give much more positive feedback on clicks—there’s less of a cheap plastic sound when you actuate them. The keyboard is almost untouched, meaning fully anti-ghosted and individually backlit keys that have zero flex, though the key travel itself is a bit shallow. Razer has backlit the secondary functions (brightness, volume, etc), which has made using them in the dark significantly more convenient. I still take issue with the layout (seriously, there’s little to no reason for the Fn key to be on the right side of the spacebar) but you get used to it quickly enough as long as you aren’t switching from notebook to notebook.
Razer Blade (late 2012) - Performance
To gauge how the new Blade stacks up versus its predecessor and other gaming-class notebooks, we picked a decent selection of systems with the entire range of modern GPUs, from the GT 650M GDDR5 in Samsung's Series 7 to the pair of Alienware systems with GTX 680M and 680M SLI graphics configurations. There's a decent range of storage technologies on display here as well, with a number of different SSDs and hybrid storage solutions, along with one conventional 7200RPM hard drive.
|Notebook Configuration Overview|
|Notebook||CPU||GPU||Storage||Battery|
|Alienware M17x R4||i7-3720QM||GTX 680M||Hybrid (Intel SRT)||90Wh|
|Alienware M18x R2||i7-3820QM||GTX 680M SLI||2x SSD 830 (RAID)||97Wh|
|ASUS G74SX-A2||i7-2630QM||GTX 560M||Intel 320 SSD||90Wh|
|AVADirect Clevo P170EM||i7-3720QM||HD 7970M||Crucial M4 SSD||77Wh|
|Clevo W110ER||i7-3720QM||GT 650M||Hybrid (Momentus XT)||62Wh|
|iBUYPOWER MSI GT70||i7-3610QM||GTX 675M||7200RPM HDD||60Wh|
|Razer Blade||i7-2640M||GT 555M||Lite-On M2S SSD||60Wh|
|Samsung Series 7||i7-3615QM||GT 650M||Hybrid (ExpressCache)||77Wh|
|Razer Blade (late 2012)||i7-3632QM||GTX 660M||Hybrid (DataPlex)||60Wh|
The Blade is the first 35W quad-core system that I’ve personally dealt with, and it’s the first 35W quad to go through our labs since Dell’s XPS 15 a few months ago. The Blade uses a new OEM-specific i7-3632QM that is essentially the same as the i7-3612QM in the Dell, except with 100MHz faster clocks. Performance-wise, it falls in roughly where you’d expect a 2.2GHz Ivy Bridge quad: a bit faster than the SNB quads and the 2.1GHz i7-3612QM, but a bit slower than the various IVB quads that are clocked higher, starting with the 2.3GHz i7-3610QM and 3615QM. I'm going to skip straight to the CPU testing this time, as I find PCMark 7 to be far too heavily skewed by other factors (SSD, Quick Sync, etc.).
There’s not too much else to say about Ivy Bridge here; it’s just a really solid CPU that’s done well for Intel. The only complaint that really can be leveled against it is that even though the tick+ update focused heavily on a new on-die graphics architecture, Intel is still playing catch up to AMD’s far more advanced integrated graphics solutions. Intel’s HD 4000 is solid enough for anything not involving gaming though, and in these applications at least, it does its job well.
Traditionally, I try to avoid all Futuremark tests, but I decided to break my rule here with the PCMark 7 storage benchmark. I wanted to see how effective NVELO’s DataPlex caching solution was with respect to Intel’s Smart Response Technology as well as the SSD in the previous Blade. The PCMark Storage score is pretty impressive, beating the SRT-enabled Dell XPS 15 (which had a 32GB cache drive) and interestingly, the old Blade as well. Admittedly, the original Blade’s SSD wasn’t the fastest in the world, even though it used the same Marvell controller as the caching SSD in the new Blade (and the Intel SSD 510, amongst other notable SSDs). We saw better Storage benchmark scores out of faster SSDs in the ASUS N56VM, which was tested using a SandForce-based drive, and Alienware M18x R2 (two Samsung SSD 830s in RAID).
It was a pretty impressive showing, and I actually wasn’t put off by the hard drive too much in day to day usage. The one place where I missed the SSD? Shut down/restart and sleep/wake cycles. I timed a cold boot at 20.8 seconds, a solid 5 seconds off the pace of the old SSDified Blade, and it was definitely something you were reminded of every time the system was rebooted or woken from sleep. As much as the new caching solutions help, nothing can touch the outright responsiveness of a real SSD.
Razer Blade (late 2012) - Gaming
I gave a pretty thorough breakdown of NVIDIA’s midrange mobile GPU lineup on the first page, so I’ll just give you a quick summary here. The new Blade comes with the entry level GTX-class GPU, the GTX 660M. It has a GK107 core with 384 CUDA cores, 2GB of GDDR5 video memory, a core clock speed of 875MHz, and a memory clock of 2500MHz. Additionally, the GTX 660M can boost its clock speed up to 950MHz when gaming. This is essentially the same GPU as the GT 640M and GT 650M except with higher clock speeds, so there is some question as to whether or not it deserves the GTX label.
With that said, the 600M lineup has shaken out as such and this is really the only viable chip for Razer to have chosen, at least until the quiet release of the GK106 based GTX 670MX/675MX earlier this week. Since those parts likely weren't available for testing and validation in time, the only other option would have been the awe-inspiring GTX 680M. GK104 has a TDP of 100W, which is roughly the same as the power envelope of the entire Blade system. So, not really an option.
As for GK106, the GTX 670MX and 675MX chips just showed up as a pair of Kepler-based replacements for the Fermi-based GTX 670M and 675M, but we don’t have too much in the way of firm information on the new chips yet in terms of performance or power characteristics so speculation isn’t really worth it. Most likely those chips will be roughly half way between the GTX 660M and GTX 680M in terms of TDP, which may or may not fit into the Blade's thermal envelope. Since we don't have them, however, let’s focus on what we have in front of us.
The GTX 660M performs roughly 15-20% faster than the GDDR5 variant of the GT 650M, which is pretty much in line with the clock speed difference. It’s pretty solid, actually, topping the 30fps mark in our enthusiast gaming suite (albeit barely in several titles) in all of the games except Battlefield 3, and being comfortably playable in our entire mainstream suite, never dipping below 40fps. It’s a bit more powerful than the GTX 560M, but obviously gets blown out of the water by the GTX 680M.
Razer Blade (late 2012) - Display
Razer stuck with the AUO B173HW01 V.5 panel for the updated Blade. The 17.3” 1080p display is a good one, and we had no real issues with it last time around. This one was brighter and more accurate than our original Blade’s display panel but also had a slightly poorer contrast ratio. These metrics are all based on relative quality of different manufacturing batches of the same display, so it’s all up to the panel lottery gods. As before, color gamut and viewing angles are stellar (by TN standards, at least).
I asked Min about the possibility of seeing an IPS panel somewhere down the road when I met him at PAX this year. He seemed relatively unenthusiastic about it, citing concerns about display response time (I imagine that cost, too, was a factor). The AUO panel being used is high quality enough that it’s not a strict necessity for Razer to go IPS yet, especially considering that workstation-grade machines are some of the few 17” systems to feature IPS displays at present. But as we go forward and IPS panels become commonplace in larger displays (they’ve already started popping up in midrange 15.6” notebooks like the Sony VAIO SE and Vizio CT15), Razer will want to make the jump.
Razer Blade (late 2012) - Battery Life
As before, battery life is quite solid, due in no small part to the wonders of NVIDIA’s Optimus graphics switching technology. It's amazing that gaming systems like the Blade, M17x, and M18x are all capable of four or more hours of real world usage, when just a couple of years ago this class of system had the battery serve as a glorified UPS. It's good that AMD has caught on with Enduro, and the Clevo that Jarred tested with the latest Enduro drivers was almost as good as what we're used to seeing from NVIDIA based notebooks. [Ed: Not really, actually, but that's more the fault of Clevo and not a problem with Enduro.]
With the Blade, we saw just over four and a half hours in our internet benchmark test, which is usually a good indicator of real-world system usage, and exactly five and a half hours in our ideal-case scenario. We noted a few percent improvement across the board versus the original Blade, which is pretty good when you consider how much computing power was added in the refresh. Razer is using the same 60Wh Li-poly battery pack as before, so the efficiency numbers look pretty good—one of the best in the class and just ahead of the Samsung Series 7 when we normalize for capacity. The only one that beat it on our charts is the Clevo W110ER, an 11.6" notebook with an i7 quad and a DDR3 GT 650M. The significantly smaller (and dimmer) display means that it has lower power consumption. Given the sheer size of the Blade, it makes for a surprisingly good portable companion.
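Normalizing for capacity is just runtime divided by watt-hours. A minimal sketch, with the Blade's runtime taken from the figure above and the Series 7 runtime as an illustrative placeholder rather than a measured number from this review:

```python
# Battery efficiency normalized for pack capacity, in minutes per Wh.
# The Blade runtime is the "just over four and a half hours" internet
# result from the text; the Series 7 runtime is illustrative only.
systems = {
    "Razer Blade (late 2012)": (4.55 * 60, 60),  # (minutes, Wh)
    "Samsung Series 7":        (5.3 * 60, 77),   # hypothetical runtime
}

for name, (minutes, wh) in systems.items():
    print(f"{name}: {minutes / wh:.2f} min/Wh")
```

On this metric a bigger battery doesn't automatically win, which is the whole point of normalizing: the Blade's 60Wh pack stretches further per watt-hour than the Series 7's 77Wh pack under these assumed runtimes.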
Another power-related detail I completely glossed over last time around was the 120W power adapter—it’s significantly smaller than any other 120W adapter out there. It’s slimmer than most of the 90W AC adapters I’ve seen, too. It’s a 19V brick that can draw up to 6.32A, and it weighs roughly three-quarters of a pound. It’s pretty impressive when you compare it to the shoebox-sized adapters that come with some of the Clevo systems these days. I’ve heard estimates from within Razer that it costs about six times as much to manufacture as a typical 120W adapter, and is one of the many custom-designed and custom-made parts in the Blade.
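As a quick sanity check, the rated output quoted above works out to the nameplate figure almost exactly:

```python
# The adapter is rated 19V at up to 6.32A, per the figures in the text.
volts, amps = 19.0, 6.32
watts = volts * amps
print(f"{volts:.0f}V x {amps}A = {watts:.2f}W")  # right at the 120W rating
```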
Razer Blade (late 2012) - Final Thoughts
Though they are unwilling to release sales figures, Razer claims that the original Blade was an unqualified sales hit, with demand far outstripping supply in the first few weeks after it went on sale. Undoubtedly, they weren’t manufacturing units in any huge volumes, but it’s safe to say that Razer themselves weren’t expecting the Blade to sell as fast as it did. But even though the original Blade was a success, it wasn’t really a hard product to follow up. It had a fair list of flaws, but an equal number of relatively obvious solutions. Jarred and Anand both mentioned that it probably would have made more sense for Razer to wait for Ivy Bridge/Kepler to launch the Blade, and from an end product standpoint I think they were right—the new Blade is a significant upgrade in almost every way.
But with that said, I think the original Blade was a very important product for Razer to launch from a process standpoint. The ability to launch a product successfully and support it post-sale was an important aspect for Razer to deliver on, and the original Blade allowed them to gain that experience while developing a superior followup. Now, this of course meant that the early adopters might feel jilted, and to that end, Razer decided to give a $500 discount to original Blade owners that wanted to pull the trigger on a new one.
The new Blade is a much more well-rounded system than the original, with the computing power to match its looks and a far more robust hardware (thermal) and software platform to support it. They even gave it a price cut. Razer has really taken the feedback from the media and their own customers to heart and brought some significant improvements to the Blade. The only remaining fix I consider important is the browsing experience in Switchblade. Based on what I know of the Switchblade UI computing backend, I’m not sure how feasible that is without a complete overhaul of the software platform (like a switch away from Windows CE6), but hopefully I’m wrong. I'd also like to see an optional SSD-only configuration, and perhaps an IPS display panel as well. Beyond that, there's hardly anything I would change with the new Blade.
I suspect that the Blade will remain a relatively niche product, as most notebooks in this price range tend to be—it’s still an expensive system, there is no doubt about that. But there are a grand total of three notebooks that I could see myself paying more than $2000 for, and the Blade is easily one of them. The others? The Retina MacBook Pro (I actually already did) and the M17x R4 with the GTX 680M. Obviously the Apple is a different story entirely, but between the Blade and the M17x, it really comes down to what you’re looking for in a gaming system and how much GPU horsepower you’re willing to give up for the sake of style and portability. That the Blade is actually part of this conversation is a testament to how far the new one has come.
If you're in it for the highest gaming performance, the Blade is still not the system for you—Alienware will give you a GTX 680M for the same price as the Blade, while Clevo and MSI can give you that GTX 680M for significantly less money. No matter how much compute and gaming performance have improved, the Blade still isn't a system that will win an out-and-out numbers game with the boutique performance notebooks. The Blade is about more than that—it’s one of the most unique and interesting designs on the market, particularly in the world of mobile gaming systems. It's the best looking 17" notebook on the market, and it offers a portable gaming experience unlike any of its competitors. I used the term desirable to sum up the Blade last time around, and it's still probably the best way to describe the new one as well. But this one has more performance to back up the style, and that just makes it all the more compelling.