System Performance

The XPS 15 is offered with three processor choices. The base model uses the 35-Watt Intel Core i3-6100H, a dual-core Skylake chip running at 2.7 GHz with 3 MB of cache. While that should be fine for most tasks, the base model also lacks a discrete GPU, which means you are limited to the integrated HD 530 graphics. I would expect the bulk of Dell’s sales to be the Core i5 and i7 models, which also include the NVIDIA GeForce GTX 960M graphics card. The Core i5-6300HQ is a quad-core, 45-Watt part that runs between 2.3 and 3.2 GHz and has 6 MB of cache. The top option is the Core i7-6700HQ, which adds Hyper-Threading (something the Core i3 supports but the Core i5 lacks) and has a base frequency of 2.6 GHz with a turbo frequency of 3.5 GHz. The Core i7 is the model that was provided to us for this review.

On the memory side, this device includes two SODIMM slots, which is a nice bonus for upgradability. The base offering is 8 GB of DDR4-2133, it can be purchased with 16 GB, and if you want to add your own DIMMs you can fit 32 GB in this machine. For storage, hybrid hard drives sit at the low end of the range, but Dell ramps up to PCIe based SSDs on the higher priced models. The review unit has a Samsung PM951 drive, the TLC version that we’ve already seen in many notebooks this year. Read speeds are generally great, but write speeds will be among the slower of the PCIe SSDs thanks to the TLC NAND.

I’ve run the XPS 15 through our standard notebook suite, along with a couple of other tests as well. All laptops in the charts below are from our Notebook Bench, and if you’d like to compare the XPS 15 to any other device we’ve tested, please check it out here. The previous generation XPS 15 9530 in the charts has the Core i7-4702HQ processor, GT 750M GPU, 3200x1800 display, and a 91 Wh battery. The Lenovo Y700, which we just reviewed, has the same CPU and GPU as the XPS 15 9550. I also thought it would be interesting to include the XPS 13, although this is the Broadwell version we reviewed last year, with a Core i5-5200U CPU, 3200x1800 display, and 8 GB of memory.

PCMark

PCMark 8 - Home

PCMark 8 - Creative

PCMark 8 - Work

PCMark 7 (2013)

PCMark attempts to recreate the actual workloads that people run every day, and with version 8 there are several tests that each focus on a different set of tasks. Home includes web browsing, gaming, photo editing, and video chat. Creative has web browsing, photo editing, group video chat, transcoding, and some gaming, and Work has document editing, spreadsheets, and video chat. Pretty much all aspects of the device are tested, and even things like the display resolution can impact the score. In fact, the UHD resolution on the review unit impacts these scores quite a bit, with the XPS 15 often landing well below the Lenovo Y700, which has the same CPU and GPU but a 1920x1080 display.

Cinebench

Cinebench R15 - Single-Threaded Benchmark

Cinebench R15 - Multi-Threaded Benchmark

Cinebench R11.5 - Single-Threaded Benchmark

Cinebench R11.5 - Multi-Threaded Benchmark

Cinebench is a CPU-heavy workload which renders an image. It can use all available cores and benefits from higher frequencies as well. Just as with the Lenovo Y700, I found that Skylake isn’t a big jump in performance over Broadwell, or even over some of the later Haswell Core i7 parts. It is, however, a sizeable jump over the outgoing XPS 15 9530.

x264

x264 HD 5.x - Pass 1

x264 HD 5.x - Pass 2

This test does video transcoding, and much like Cinebench it is strongly influenced by CPU performance. More cores and higher frequencies are the name of the game here. Just like Cinebench, there isn’t much of an increase in performance with Skylake on these tests compared to Broadwell or later Haswell chips like the i7-4720HQ, which can turbo up to 3.6 GHz compared to 3.5 GHz on the i7-6700HQ. It is still a big jump over the i7-4702HQ found in the XPS 15 9530 though.
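
For readers who want to run a similar CPU-bound transcode on their own machine, the sketch below shows the general shape of a two-pass x264 encode driven from Python. The source file name, bitrate, and preset are illustrative assumptions, not the exact settings the x264 HD 5.x benchmark uses.

    # Rough two-pass x264 encode, similar in spirit to the x264 HD benchmark.
    # Assumptions: the x264 binary is on the PATH and input_1080p.y4m is a local clip.
    import subprocess

    SOURCE = "input_1080p.y4m"   # hypothetical 1080p test clip
    BITRATE = "4000"             # target bitrate in kbps, chosen for illustration

    for pass_num in ("1", "2"):
        subprocess.run(
            ["x264", "--preset", "slow", "--pass", pass_num,
             "--bitrate", BITRATE, "-o", "output.mkv", SOURCE],
            check=True,  # raise if the encoder exits with an error
        )

Both passes will happily load all eight threads of the i7-6700HQ, which is why core count and frequency dominate the results here.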

Web Tests

Mozilla Kraken 1.1

Google Octane 2.0

WebXPRT 2015

WebXPRT 2013

I’ve mentioned this a few times already, but it’s worth repeating. Since the launch of Windows 10, we’ve switched from using Google Chrome for web testing to Microsoft Edge. Edge’s performance is now quite a bit closer to Chrome’s, surpassing it in some tests and trailing in others, but both are capable browsers. As such, I’ve labeled the older laptops to let you know which browser was used at the time they were tested.

With a quad-core Skylake processor, which now supports Intel’s Speed Shift technology, the XPS 15 scores very well in our web results. The bursty nature of the web tests really plays into the hands of Speed Shift, letting the processor quickly ramp up to its maximum frequency to perform the task, which makes for a more responsive browsing experience.
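
Speed Shift hands P-state selection to the hardware rather than the operating system, which is what allows those quick frequency ramps. Purely as an illustration, the snippet below checks for the underlying hardware-managed P-state (HWP) feature on a Linux system using the intel_pstate driver's sysfs interface; this is an assumed setup for the sketch, since the review unit itself runs Windows 10.

    # Minimal sketch: report whether the CPU advertises hardware-managed P-states
    # (HWP, the feature Intel markets as Speed Shift) and what energy/performance
    # hint the OS has requested. Assumes Linux with the intel_pstate driver.
    from pathlib import Path

    flags = Path("/proc/cpuinfo").read_text().split()
    print("HWP (Speed Shift) flag present:", "hwp" in flags)

    epp = Path("/sys/devices/system/cpu/cpu0/cpufreq/energy_performance_preference")
    if epp.exists():
        # e.g. "balance_performance"; the hardware picks P-states around this hint
        print("cpu0 energy/performance preference:", epp.read_text().strip())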

Comments

  • nerd1 - Saturday, March 12, 2016

    It's SWITCHABLE. So it won't drain the battery when not in use. And it can play most games at 1080p. It's the same GPU as the Alienware 13.

    External GPU is a stupid idea in general. You need a huge separate case and a separate monitor, which is as large and often just as expensive as building a separate gaming desktop.
  • Valantar - Saturday, March 12, 2016

    No, no, and no.

    Even Optimus-enabled laptops draw more power than dGPU-less equivalents - the dGPU remains "on", but sleeping. Optimus isn't a hard power-off switch. Also, heat production is a major concern, which isn't really solvable in a thin-and-light laptop form factor. I'd rather have a ~45W-60W quad core with Iris/Iris Pro than a 45W quad core with HD 520 + a 65W dGPU.

    Second, you don't need a huge case for an external GPU - that current models are huge is a product of every single one presented aiming for compatibility with even the most overpowered GPUs (like the Fury or 980 Ti). A case for dual-slot, mITX-sized (17 cm length) desktop GPUs with an integrated ~200W power supply wouldn't need to be much larger than, say, 10x20x15 cm. That's roughly the size of an external dual HDD case. As the market grows (well, technically as the market comes into existence in this case), more options will appear, and not just huge ones.

    Also, there's the option of mobile GPUs in tiny cases. Most of these come in the form of MXM modules, which are 82x100mm. Add a case around that with a beefy blower cooler, and you'd have a box roughly the size of an external 2.5" HDD, but 2-3 times as thick. Easy to carry, easy to connect. Heck, given that USB 3.1/TB3 can carry 100W of power, you could even power this from the laptop, although I don't think that's a great idea for long term use. I'd rather double the width of the external box and add a beefy battery. Thin and light laptop with great battery life? Check. Gaming power? Check. Better battery life than the same laptop with a built-in dGPU? Check.

    And third, in case you aren't keeping up with recent announcements, you should read up on modern external GPU tech in regard to gaming on the laptop screen. At least with AMD's (/Intel's/Razer's) solution, this is not a problem. We'll see if Nvidia jumps on board or if they keep being jerks in terms of standards.
  • nerd1 - Saturday, March 12, 2016

    Have you EVER used an Optimus-enabled laptop? My XPS 15 typically draws around 10W during idle, which means the GPU should draw less than a couple of watts while idle. I can say that is negligible, considering the XPS 15 can be equipped with an 84 Wh battery. Just do the math.

    And Iris Pro is a joke; it's much weaker than the 940M, which can barely play old games at 768p. The 960M is roughly 3-4 times more powerful and can play most games at 1080p.

    Finally, the external GPU. An external GPU should have its own case and its own PSU, and should be connected to an external display unless you want to halve the bandwidth. Then why not just add a CPU and SSD to make a standalone gaming desktop? The Dell Graphics Amp costs $300, and the Razer one will cost much more (naturally). One can build a decent desktop (minus GPU) at that price.
  • Valantar - Tuesday, March 15, 2016

    If the GPU - that is supposed to be switched off! - uses "less than a couple of watts" out of 10W total power draw, that is <20% - definitely not negligible. Of course, with the review unit the 4K display adds to its subpar battery life, but I have no doubt that the dGPU adds to it as well.

    And sure, Broadwell Iris Pro was only suitable for 720p gaming, as seen in AT's Broadwell desktop review. However, Skylake Iris Pro increases EUs by 50% (from 48 to 72), in addition to reduced power and other architectural improvements. It's not out yet, but it will be soon, and it will eat 940Ms for breakfast. This would be _perfect_ for mobile use - able to run most games at an acceptable resolution and detail level if absolutely necessary, but with an eGPU for proper gaming.

    And lastly: "An external GPU should have its own case and its own PSU, and should be connected to an external display unless you want to halve the bandwidth." What on earth are you talking about? Why SHOULD they do any of this? Of course it would need its own case and some form of power supply - but that case could easily be a very compact one, and the power supply could be through TB3 if you wanted to, although (as stated in my previous posts) I'd argue for dedicated power supplies and/or dedicated batteries for these. And why would using the integrated display halve the bandwidth? It wouldn't use DP alt mode for this; the signal would be transferred through PCIe - exactly the same as dGPUs in laptops do today. You'd never know the difference: except for some completely imperceptible added latency, this should perform exactly like your current laptop, given the same GPU.

    Your argument goes something like this:
    You: eGPU cases are too bulky and expensive, and you need to buy a monitor!
    Me: They don't have to be, and you don't have to!
    You: But that's how it SHOULD be!

    This makes no sense.

    Of course, there is a difference of opinion here. You want a middling dGPU in your laptop, and don't mind the added weight, heat and power draw. I do mind these, and don't want it - I'd much rather add an external GPU with more processing power. I'm not arguing to remove your option, only to add another, more flexible one. Is that really such a problem for you?
  • nerd1 - Wednesday, March 16, 2016

    a) You haven't provided ANY proof that an idle GPU with Optimus affects the battery life a lot. And based on your logic, a more powerful iGPU will increase the power consumption JUST AS WELL.

    b) Intel has been claiming great iGPU performance FOR YEARS. Yet the actual game framerates with the iGPU are much worse than the benchmark results. For example, BioShock Infinite at high settings and 720p shows ~35 fps with the Iris Pro 5200, ~56 fps on the 940M with GDDR5 (the one in the Surface Book), and 115 fps with the 960M.

    c) You need an external monitor, otherwise the bandwidth will be halved (as you need to send the video signal back to the laptop), hampering the GPU performance. There already are external GPU benchmarks around. Please go check.

    d) No sane person will make/buy an external GPU system that only accepts very expensive mobile GPUs, which makes the whole point of an external GPU moot. An external GPU box should at least be able to hold a top double-width GPU (a 980 Ti for example), which requires a PSU of at least 300 Watts.

    And you are arguing to remove the dGPU from the system, which comes almost for free (unlike with Apple), is way more powerful than whatever iGPU Intel has now, and does not affect the battery life too much.
  • Valantar - Thursday, March 17, 2016

    Wow, you really seem opposed to people actually getting to choose what they want.

    a) The proof for added power draw with idle Optimus GPUs is quite simple: as long as it's not entirely powered off, it will be consuming more power than if it wasn't there. That is indisputable. And sure, a larger iGPU would have higher power draw under load, and possibly at idle as well. But it still wouldn't require powering an entire separate card, separate power delivery circuitry, and other dGPU components. So of course, higher performance iGPUs consume more power, but not nearly as much as a dGPU, and probably less than a less powerful iGPU + a sleeping dGPU.

    b) Sure, Intel has been claiming great iGPU performance for years. Last year, they kinda-sorta delivered with Broadwell Iris Pro (48 EUs + 128 MB eDRAM), but not really in terms of performance vs. power. This year, there are Iris 540/550 (48 EUs + 64 MB eDRAM) parts with better eDRAM utilization and lower power (BDW was 45-65W, SKL is 15W and upwards with Iris 540, 28W with Iris 550). And then there will soon be Iris Pro 580, with 72 EUs and 128 MB eDRAM at the same frequencies, at 45W and upwards. Given GPU parallelism, it's reasonable to expect the Iris Pro 580 to perform ~40% better than the BDW Iris Pro given the 50% increase in EUs (in addition to other improvements). In your BioShock Infinite example that falls short of matching the 940M (a 40% increase from 35 fps is still only 49 fps).

    But - and this is the big one - that's comparing it with a separate ~30W (probably slightly higher in the Surface Book due to GDDR5's higher power draw than DDR3) GPU. So, in 45W, you could get a dual-core i7 with a low-end iGPU (Iris not needed for gaming in this scenario) and a GT 940M. Or you could get a quad-core i7 with higher base and turbo clocks and an Iris Pro iGPU. Comparing those two hypothetical systems (with identical power draw, and as such fitting in the same chassis), my money would be on the Iris Pro all the way. Sure, you could easily beat this with a 945M, 950M or 960M. But those are 40, 50 and 60W GPUs, respectively. And you'd really want a quad core to saturate those graphics cards, in which case you're stuck with Intel's 45W series. Which leaves you with the need for a drastically larger chassis and higher powered fans to cool this off, as you'd need to remove at least 85W of heat. Sure, there is a market and a demand for those PCs. I'm just saying that there is a market for lower powered solutions as well, and if a 45W system can perform within 90+% of a 75W one, I'd go for the low powered one every time. You might not, but arguing that systems like these shouldn't exist just because you don't want them is just plain dumb.

    Edit: ah, bugger, I just noticed that the Iris Pro you're citing (5200) isn't BDW but the HSW version - which had fewer EUs still (40, not 48), and is architecturally far inferior to more current solutions. Which only means that my estimates are really lowball, and could probably be increased by another 20% or so.

    c) You seem VERY sure that it would (specifically) halve bandwidth. What is this based on? DP alt mode over USB 3.1 using two lanes? If so: you realize that TB3 and USB 3.1 are not the same, right? Also, that supposes that the GPU transfers this data as a DP signal - which has 20% overhead, compared to the 1.54% overhead of PCIe 3.0. So, it would seem far more logical for the eGPU to transfer the image back through PCIe, and have it converted to eDP at a later stage (or in the iGPU, as Optimus does: http://www.trustedreviews.com/opinions/nvidia-opti... ). Also, given that AMD specifically mentioned gaming on laptop monitors with XConnect, don't you think this is an issue that their engineers have both discussed and researched quite thoroughly? Sure, there are external GPU benchmarks around, based either on unreleased (and thus unfinished) technology, or on DIY or proprietary solutions that are bound to be technically inferior to an industry standard. Am I saying performance will be unaffected? No, but I'm saying that you probably wouldn't notice.

    d) Says you. Sorry, but why not? People buy laptops with these "very expensive mobile GPUs" already installed, right? And sure, the case and PSU would add a small cost (60-90W power bricks are VERY cheap for OEMs, but the TB3 controller, cooling solution and case would cost a bit. I'd say the total would be <$100. First gen ones would probably be expensive, but then they would drop. Quickly.). Saying that ALL eGPU boxes should handle EVERY full-size GPU at max power is beyond dumb. Not even all computer cases fit those cards, yet they still sell. Should some? Sure, like the Razer Core. But that's the top-end solution. Not everyone needs - or wants - a case that could fit and power a 980 Ti. Saying that there shouldn't be lower-end solutions is such a baffling idea that I really don't know how to argue with that - except to say: why the hell not? It would be EASY for an OEM to build an eGPU case with support for cards up to, say, 150W (which encompasses the vast majority of desktop GPU sales) for $200.

    And giving you the GPU for free? Really? Let's look at the $1000 and $1200 versions of the XPS 15. For $200, you get the i5 over the i3, twice the HDD storage, and the 960M. The Recommended Customer Price for the two CPUs is $250 and $225, respectively (http://ark.intel.com/compare/88959,89063). So that's $25 of your $200. At Newegg, the cheapest 500 GB 2.5" drive is $39.99, while the cheapest 1 TB one is $49.99. And of course, these are retail products at retail prices with retail margins included. OEMs pay far less than this. But I'll be generous and give you the full $10 difference. So far, we've accounted for $35 of your $200 price hike, and there is one component left. In other words, you're paying $165 for that GPU. The only reason the i5 isn't in the $1000 model is that Dell wants product differentiation.
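
    To make that back-of-envelope math explicit, here it is as a tiny Python sketch; the inputs are only the retail and ARK figures cited above, not Dell's actual component costs:

        # Implied price of the GTX 960M in the $1200 XPS 15, using the numbers above
        price_gap = 1200 - 1000        # difference between the two configurations
        cpu_delta = 250 - 225          # i5 vs. i3 Recommended Customer Price
        hdd_delta = 49.99 - 39.99      # 1 TB vs. 500 GB retail 2.5" drive
        gpu_implied = price_gap - cpu_delta - hdd_delta
        print(f"Implied GPU price: ${gpu_implied:.2f}")   # roughly $165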

    I for one would gladly buy a $1300 XPS 15 with the i5 and the 960M moved to an external case - or, of course, pay even more to add a more powerful GPU. The point is the flexibility and upgradeability. Many people are very willing to pay for that, especially when it comes to laptop GPUs.
  • carrotseed - Wednesday, March 9, 2016

    the 15.6in variant would be perfect for those working with numbers if there were an option for a keyboard with a separate number pad
  • Valantar - Saturday, March 12, 2016

    There's no room for a numpad due to the smaller chassis - you could fit one, but it would require shrinking keys and ruining the keyboard. No thanks.
  • carrotseed - Saturday, March 12, 2016

    you can keep the keyboard the same size and shrink the numpad a little. Do you see those big empty spaces on both sides of those keyboards on the XPS 15? I'm sure the small cost of having two options for the keyboard can be passed on to those who will opt for the numpad variant.
  • carrotseed - Saturday, March 12, 2016

    erratum: **...the keyboard on the XPS 15***
