Final Words

After everything we've seen so far, what is there to say about the XPS 15? As a pure exercise in style, mimicking the look and feel of the XPS 13 was definitely the right way to go. The same Infinity Edge display allows Dell to squeeze a 15-inch notebook into the space that most 14-inch models take up. The aluminum outside feels great in the hand, while the contrasting dark carbon fiber weave on the keyboard deck makes the keys easier to read and does a better job of resisting fingerprints. The soft-touch coating makes typing on the XPS 15 very comfortable.


XPS 15 Compared to 15.6-inch Lenovo Y700

The keyboard is roughly (if not exactly) the same as the XPS 13 keyboard, which I liked. The 1.3 mm of key travel is a bit shallow for a laptop this large, but overall it is pretty good. There are better keyboards around, but it would not take long to get used to typing on the XPS 15. The trackpad is excellent, with plenty of room to work and nice smooth scrolling. The Microsoft Precision Touchpad drivers lack some of the customizability of other trackpad drivers, but the available gestures are enough for what I need.

Performance is very good, thanks to a quad-core Skylake processor and an NVIDIA GTX 960M graphics card. This isn't a dedicated gaming system, but the GPU can hold its own and even lets you play modern games as long as you are OK with turning the graphics down a bit. The CPU performance is strong, although as with the Lenovo Y700, the Skylake quad-core didn't bring a big jump in performance over Broadwell.

The display shipped on the review unit is the 3840x2160 UHD panel with support for the Adobe RGB color space, but the wider gamut can't make up for the disappointing out-of-the-box accuracy. Once calibrated, though, this display can hold its own with pretty much anything out there. Text is very crisp, and colors are very vibrant. It's a shame that there's not an easier way to use Adobe RGB, but with more devices starting to support this color space, perhaps Microsoft will improve how Windows handles different gamuts. We can hope.

The downside of driving over 8 million pixels, though, is the less-than-amazing battery life. With a large 84 Wh battery, I was hoping for more than 7.5 hours on our light test, but that wasn't the case. The efficiency is not fantastic, and it is also hindered by LED backlighting that supports a wider gamut, although we did run our testing in sRGB mode. Overall battery life isn't much different than the XPS 15 9530 that we tested a while back, despite the IGZO display and latest-generation processor. It's hard to get around driving light through that many pixels. The 1920x1080 IPS panel offered in the base model would certainly help here, though not having tested that model it's hard to say just how much.
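
As a rough sanity check, the measured run time implies a fairly high average system draw. Here's a back-of-the-envelope sketch using only the 84 Wh capacity and 7.5 hour result above; the 2 W savings for a hypothetical 1080p panel is an illustrative assumption, not a measurement:

```python
# Back-of-the-envelope battery math for the XPS 15 (UHD panel).
battery_wh = 84.0        # battery capacity
light_test_hours = 7.5   # measured result on our light workload

avg_draw_w = battery_wh / light_test_hours
print(f"Implied average system draw: {avg_draw_w:.1f} W")  # ~11.2 W

# Hypothetical: if the 1080p panel shaved ~2 W off the average draw
# (an assumed figure for illustration only):
print(f"Run time at {avg_draw_w - 2.0:.1f} W: {battery_wh / (avg_draw_w - 2.0):.1f} h")  # ~9.1 h
```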

Dell didn't load the XPS 15 down with too much extra software either, which is nice. The Dell PremierColor application is great, and adding improvements on top of the built-in Windows Snap Assist is a good move. Much of the time I'm not interested in extra software, but if an OEM can improve upon something that is built in, it's hard to argue against including it.

Overall, the XPS 15 is one of the sleekest 15-inch laptops on the market. If I were looking to purchase something of this size, the XPS 15 would be near the top of my list, thanks to the excellent build quality, great design, and compact size. It does get loud under load, but the combination of good qualities in the XPS 15 is hard to ignore.

Comments

  • nerd1 - Saturday, March 12, 2016

    It's SWITCHABLE, so it won't drain the battery when not in use. And it can play most games at 1080p. It's the same GPU as in the Alienware 13.

    An external GPU is a stupid idea in general. You need a huge separate case and a separate monitor, which is often as large and just as expensive as building a separate gaming desktop.
  • Valantar - Saturday, March 12, 2016

    No, no, and no.

    Even Optimus-enabled laptops draw more power than dGPU-less equivalents - the dGPU remains "on", but sleeping. Optimus isn't a hard power-off switch. Also, heat production is a major concern, which isn't really solvable in a thin-and-light laptop form factor. I'd rather have a ~45-60W quad core with Iris/Iris Pro than a 45W quad core with HD 520 + a 65W dGPU.

    Second, you don't need a huge case for an external GPU - that current models are huge is a product of every single one presented aiming for compatibility with even the most overpowered GPUs (like the Fury or 980 Ti). A case for dual-slot mITX-sized (17cm length) desktop GPUs with an integrated ~200W power supply wouldn't need to be much larger than, say, 10x20x15cm. That's roughly the size of an external dual-HDD case. As the market grows (well, technically, as the market comes into existence in this case), more options will appear, and not just huge ones.

    Also, there's the option of mobile GPUs in tiny cases. Most of these come in the form of MXM modules, which are 82x100mm. Add a case around that with a beefy blower cooler, and you'd have a box roughly the size of an external 2.5" HDD, but 2-3 times as thick. Easy to carry, easy to connect. Heck, given that USB 3.1/TB3 can carry 100W of power, you could even power this from the laptop, although I don't think that's a great idea for long term use. I'd rather double the width of the external box and add a beefy battery. Thin and light laptop with great battery life? Check. Gaming power? Check. Better battery life than the same laptop with a built-in dGPU? Check.

    And third, in case you aren't keeping up with recent announcements, you should read up on modern external GPU tech in regard to gaming on the laptop screen. At least with AMD's (/Intel's/Razer's) solution, this is not a problem. We'll see if Nvidia jumps on board or if they keep being jerks in terms of standards.
  • nerd1 - Saturday, March 12, 2016

    Have you EVER used an Optimus-enabled laptop? My XPS 15 typically draws around 10W at idle, which means the GPU should draw less than a couple of watts while idle. I can say that is negligible, considering the XPS 15 can be equipped with an 84 Wh battery. Just do the math.

    And Iris Pro is a joke; it's much weaker than a 940M, which can barely play old games at 768p. The 960M is roughly 3-4 times more powerful and can play most games at 1080p.

    Finally, the external GPU. An external GPU should have its own case and its own PSU, and should be connected to an external display unless you want to halve the bandwidth. Then why not just add a CPU and SSD to make a standalone gaming desktop? The Dell Graphics Amp costs $300, and the Razer one will cost much more (naturally). One can build a decent desktop (minus GPU) at that price.
  • Valantar - Tuesday, March 15, 2016

    If the GPU - that is supposed to be switched off! - uses "less than a couple of watts" out of 10W total power draw, that is <20% - definitely not negligible. Of course, with the review unit the 4K display adds to its subpar battery life, but I have no doubt that the dGPU adds to it as well.
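
    To put numbers on that, using your own 10W idle figure (the 2W dGPU share is your "couple of watts" taken at face value, not a measurement):

    ```python
    # Idle power share of a "sleeping" Optimus dGPU, per the figures above.
    idle_total_w = 10.0  # reported idle draw of the XPS 15
    dgpu_idle_w = 2.0    # assumed share for the sleeping dGPU
    battery_wh = 84.0

    print(f"dGPU share of idle draw: {dgpu_idle_w / idle_total_w:.0%}")  # 20%
    print(f"Idle run time with dGPU:    {battery_wh / idle_total_w:.1f} h")  # 8.4 h
    print(f"Idle run time without dGPU: {battery_wh / (idle_total_w - dgpu_idle_w):.1f} h")  # 10.5 h
    ```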

    And sure, Broadwell Iris Pro was only suitable for 720p gaming, as seen in AT's Broadwell desktop review. However, Skylake Iris Pro increases the EU count by 50% (from 48 to 72), in addition to reduced power and other architectural improvements. It's not out yet, but it will be soon, and it will eat 940Ms for breakfast. This would be _perfect_ for mobile use - able to run most games at an acceptable resolution and detail level if absolutely necessary, but with an eGPU for proper gaming.

    And lastly: "External GPU should have its own case, its own PSU and should be connected to external display unless you want to halve the bandwidth." What on earth are you talking about? Why SHOULD they do any of this? Of course it would need its own case and some form of power supply - but that case could easily be a very compact one, and the power could even come through TB3 if you wanted, although (as stated in my previous posts) I'd argue for dedicated power supplies and/or dedicated batteries for these. And why would using the integrated display halve the bandwidth? It wouldn't use DP alt mode for this; the signal would be transferred through PCIe - exactly the same as dGPUs in laptops do today. You'd never know the difference: except for some completely imperceptible added latency, this should perform exactly like your current laptop, given the same GPU.

    Your argument goes something like this:
    You: eGPU cases are too bulky and expensive, and you need to buy a monitor!
    Me: They don't have to be, and you don't have to!
    You: But that's how it SHOULD be!

    This makes no sense.

    Of course, there is a difference of opinion here. You want a middling dGPU in your laptop, and don't mind the added weight, heat and power draw. I do mind these, and don't want it - I'd much rather add an external GPU with more processing power. I'm not arguing to remove your option, only to add another, more flexible one. Is that really such a problem for you?
  • nerd1 - Wednesday, March 16, 2016

    a) You haven't provided ANY proof that an idle GPU with Optimus affects the battery life a lot. And by your own logic, a more powerful iGPU will increase the power consumption JUST AS WELL.

    b) Intel has been claiming great iGPU performance FOR YEARS. Yet the actual game framerates with the iGPU are much worse than the benchmark results. For example, BioShock Infinite at high settings and 720p shows ~35fps with Iris Pro 5200, ~56fps on a 940M with GDDR5 (the one in the Surface Book), and 115fps with a 960M.

    c) You need an external monitor, otherwise the bandwidth will be halved (as you need to send the video signal back to the laptop), hampering GPU performance. There already are external GPU benchmarks around. Please go check.

    d) No sane person will make/buy an external GPU system that only accepts very expensive mobile GPUs, which makes the whole point of an external GPU moot. An external GPU box should at least be able to take a top-end double-width GPU (a 980 Ti, for example), which requires a PSU in the 300W range.

    And you are arguing to remove the dGPU from the system, which comes almost for free (unlike with Apple), is way more powerful than whatever iGPU Intel has now, and does not affect the battery life too much.
  • Valantar - Thursday, March 17, 2016

    Wow, you really seem opposed to people actually getting to choose what they want.

    a) The proof of added power draw with idle Optimus GPUs is quite simple: as long as it's not entirely powered off, it will be consuming more power than if it wasn't there. That is indisputable. And sure, a larger iGPU would have higher power draw under load, and possibly at idle as well. But it still wouldn't require powering an entire separate card, separate power delivery circuitry, and other dGPU components. So of course higher performance iGPUs consume more power, but not nearly as much as a dGPU, and probably less than a less powerful iGPU plus a sleeping dGPU.

    b) Sure, Intel has been claiming great iGPU performance for years. Last year, they kinda-sorta delivered with Broadwell Iris Pro (48 EUs + 128MB eDRAM), but not really in terms of performance vs. power. This year, there are Iris 540/550 (48 EUs + 64MB eDRAM) units with better eDRAM utilization and lower power (BDW was 45-65W; SKL is 15W and upwards with Iris 540, 28W with Iris 550). And then there will soon be Iris Pro 580, with 72 EUs and 128MB eDRAM at the same frequencies, at 45W and upwards. Given GPU parallelism, it's reasonable to expect the Iris Pro 580 to perform ~40% better than the BDW Iris Pro given the 50% increase in EUs (in addition to other improvements). In your BioShock Infinite example that falls short of matching the 940M (a 40% increase from 35fps is still only 49fps).
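
    To spell out that estimate (the assumption that a 50% EU increase yields ~80% of linear scaling, i.e. ~40%, is mine, not a benchmark):

    ```python
    # Rough Iris Pro scaling estimate from the EU counts above.
    base_eus, new_eus = 48, 72
    base_fps = 35.0  # BioShock Infinite, 720p/high, from nerd1's Iris Pro numbers

    eu_increase = new_eus / base_eus - 1   # 50% more EUs
    assumed_scaling = 0.8                  # assume ~80% of linear scaling
    est_fps = base_fps * (1 + eu_increase * assumed_scaling)
    print(f"{eu_increase:.0%} more EUs -> ~{est_fps:.0f} fps, vs. 56 fps on the 940M")
    ```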

    But - and this is the big one - that's comparing it with a separate ~30W GPU (probably slightly higher in the Surface Book due to GDDR5's higher power draw than DDR3). So, in 45W, you could get a dual-core i7 with a low-end iGPU (Iris isn't needed for gaming in this scenario) and a GT 940M. Or you could get a quad-core i7 with higher base and turbo clocks and an Iris Pro iGPU. Comparing those two hypothetical systems (with identical power draw, and as such fitting in the same chassis), my money would be on the Iris Pro all the way. Sure, you could easily beat this with a 945M, 950M or 960M. But those are 40, 50 and 60W GPUs, respectively. And you'd really want a quad core to saturate those graphics cards, in which case you're stuck with Intel's 45W series. Which leaves you with the need for a drastically larger chassis and higher powered fans, as you'd need to remove at least 85W of heat. Sure, there is a market and a demand for those PCs. I'm just saying that there is a market for lower powered solutions as well, and if a 45W system can perform within 90+% of a 75W one, I'd go for the low powered one every time. You might not, but arguing that systems like these shouldn't exist just because you don't want them is just plain dumb.
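
    The power budgets, spelled out (TDP figures as cited above; the totals are simple sums, not measured system power):

    ```python
    # Rough TDP budgets for the hypothetical systems above.
    dual_i7_plus_940m = 15 + 30   # 15W dual-core i7 + ~30W GT 940M
    quad_i7_iris_pro = 45         # 45W quad-core i7, graphics included
    quad_i7_plus_945m = 45 + 40   # low end of the dGPU options
    quad_i7_plus_960m = 45 + 60   # high end of the dGPU options
    print(dual_i7_plus_940m, quad_i7_iris_pro)    # both ~45 W
    print(quad_i7_plus_945m, quad_i7_plus_960m)   # 85-105 W of heat to remove
    ```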

    Edit: ah, bugger, I just noticed that the Iris Pro you're citing (5200) isn't BDW but the HSW version - which had fewer EUs still (40, not 48), and is architecturally far inferior to more current solutions. Which only means that my estimates are really lowball, and could probably be increased by another 20% or so.

    c) You seem VERY sure that it would (specifically) halve bandwidth. What is this based on? DP alt mode over USB 3.1 using two lanes? If so: you realize that TB3 and USB 3.1 are not the same, right? Also, that supposes that the GPU transfers this data as a DP signal - which has 20% overhead, compared to the 1.54% overhead of PCIe 3.0. So it would seem far more logical for the eGPU to transfer the image back through PCIe, and have it converted to eDP at a later stage (or in the iGPU, as Optimus does: http://www.trustedreviews.com/opinions/nvidia-opti...). Also, given that AMD specifically mentioned gaming on laptop monitors with XConnect, don't you think this is an issue that their engineers have both discussed and researched quite thoroughly? Sure, there are external GPU benchmarks around - based either on unreleased (and thus unfinished) technology, or on DIY or proprietary solutions that are bound to be technically inferior to an industry standard. Am I saying performance will be unaffected? No, but I'm saying that you probably wouldn't notice.
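
    For reference, those overhead figures fall straight out of the line encodings (8b/10b for DisplayPort vs. 128b/130b for PCIe 3.0):

    ```python
    # Line-coding overhead behind the 20% vs. 1.54% figures above.
    dp_overhead = 1 - 8 / 10        # DisplayPort uses 8b/10b encoding
    pcie3_overhead = 1 - 128 / 130  # PCIe 3.0 uses 128b/130b encoding
    print(f"DisplayPort: {dp_overhead:.0%}, PCIe 3.0: {pcie3_overhead:.2%}")
    # -> DisplayPort: 20%, PCIe 3.0: 1.54%
    ```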

    d) Says you. Sorry, but why not? People buy laptops with these "very expensive mobile GPUs" already installed, right? And sure, the case and PSU would add a small cost (60-90W power bricks are VERY cheap for OEMs, though the TB3 controller, cooling solution and case would cost a bit - I'd say the total would be <$100; first-gen ones would probably be expensive, but then they would drop, quickly). Saying that ALL eGPU boxes should handle EVERY full-size GPU at max power is beyond dumb. Not even all computer cases fit those cards, yet they still sell. Should some? Sure, like the Razer Core. But that's the top-end solution. Not everyone needs - or wants - a case that could fit and power a 980 Ti. Saying that there shouldn't be lower-end solutions is such a baffling idea that I really don't know how to argue with it - except to say: why the hell not? It would be EASY for an OEM to build an eGPU case with support for cards up to, say, 150W (which encompasses the vast majority of desktop GPU sales) for $200.

    And giving you the GPU for free? Really? Let's look at the $1000 and $1200 versions of the XPS 15. For $200, you get the i5 over the i3, twice the HDD storage, and the 960M. The Recommended Customer Price for the two CPUs is $250 and $225, respectively (http://ark.intel.com/compare/88959,89063). So that's $25 of your $200. At Newegg, the cheapest 500GB 2.5" drive is $39.99, while the cheapest 1TB one is $49.99. And of course, these are retail products at retail prices with retail margins included; OEMs pay far less than this. But I'll be generous and give you the full $10 difference. So far, we've accounted for $35 of your $200 price hike, and there is one component left. In other words, you're paying $165 for that GPU. The only reason the i5 isn't in the $1000 model is that Dell wants product differentiation.
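
    Or, spelled out (retail component prices as cited; treating them as Dell's cost is deliberately generous):

    ```python
    # What the $200 step from the $1000 to the $1200 XPS 15 actually buys.
    step_up = 200.00
    cpu_delta = 250.00 - 225.00  # i5 vs. i3 Recommended Customer Price (Intel ARK)
    hdd_delta = 49.99 - 39.99    # cheapest 1TB vs. 500GB 2.5" drive at Newegg
    gpu_share = step_up - cpu_delta - hdd_delta
    print(f"Implied price of the 960M: ${gpu_share:.0f}")  # ~$165
    ```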

    I for one would gladly buy a $1300 XPS 15 with the i5 and the 960m moved to an external case - or, of course, even more to add a more powerful GPU. The point is the flexibility and upgradeability. Many people are very willing to pay for that, especially when it comes to laptop GPUs.
  • carrotseed - Wednesday, March 9, 2016

    The 15.6-inch variant would be perfect for those working with numbers if there were an option for a keyboard with a separate number pad.
  • Valantar - Saturday, March 12, 2016

    There's no room for a numpad due to the smaller chassis - you could fit one, but it would require shrinking keys and ruining the keyboard. No thanks.
  • carrotseed - Saturday, March 12, 2016

    You can keep the keyboard the same size and shrink the numpad a little. Do you see those big empty spaces on both sides of the keyboard on the XPS 15? I'm sure the small cost of having two keyboard options can be passed on to those who opt for the numpad variant.
