The Acer Predator Triton 500 Laptop Review: Going Thin with GeForce RTX 2080
by Brett Howse on April 25, 2019 8:00 AM EST

System Performance
Acer offers just a single processor across the Predator Triton 500 lineup. Intel’s Core i7-8750H is a six-core processor with a 2.2 GHz base frequency and a 4.1 GHz boost. It is a Coffee Lake part and the lowest tier of the hex-core i7 models available, but with six cores and twelve threads it still offers a significant amount of performance in a 45-Watt envelope.
The base tier of this laptop ships with 16 GB of DDR4, and the review unit comes with the full 32 GB allotment. There are two SODIMM slots if RAM upgrades are something you are into. For storage, Acer offers either a single 512 GB NVMe SSD, or two 512 GB NVMe drives in RAID 0. I’m personally not a big fan of the RAID 0 approach, since a single larger drive would likely offer better real-world performance at a lower cost, but unfortunately it tends to be a thing in gaming laptops.
To test system performance, the Acer Predator Triton 500 was run through our laptop workloads. Graph comparisons are against other GTX 1070 and GTX 1080 laptops we’ve seen over the last couple of years, but if you’d like to compare the Triton 500 to any other laptop we’ve reviewed, please check out our online bench database.
PCMark
UL’s PCMark is a comprehensive system test, offering multiple workloads to stress various components. Because we’ve not had many gaming laptops in for testing since PCMark 10 was released, PCMark 8 is also included in these results; PCMark 8 Creative wasn’t included due to an error in one of its tests. The hex-core CPU doesn’t do a lot for PCMark, which focuses more on office tasks and the like, but the Predator Triton 500 still performs well.
Cinebench
Cinebench R20 was recently released, and we’ll be transitioning to it once we have more comparison data, but for this review R15 was used. The Core i7-8750H does well in the single-threaded test, and the extra cores provide a nice boost in the multi-threaded results. It can’t hang with the Core i9-8950HK in the GT75 Titan, but that device does have an 800 MHz frequency advantage.
x264
The x264 test converts a video using the CPU, and it likes both more cores and higher frequencies. The extra cores give the Triton 500 a speed boost over the quad-core models that used to ship in the 45-Watt range, but once again the Core i9 really stretches its legs here.
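For readers who want to get a feel for this kind of thread scaling on their own hardware, the short Python sketch below times a software H.264 encode at different thread counts. It is only an illustration, not the script behind our benchmark numbers, and it assumes ffmpeg with the libx264 encoder is installed and that a local file named input.mp4 exists.

# Minimal sketch of CPU-bound H.264 encode scaling with thread count.
# Assumes ffmpeg (with libx264) is on the PATH and input.mp4 exists; neither is part of our test suite.
import subprocess
import time

def encode(threads: int) -> float:
    """Run one libx264 encode with the given thread count and return elapsed seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", "libx264", "-preset", "medium",
         "-threads", str(threads),
         "-f", "null", "-"],            # null muxer: discard the output, we only care about encode time
        check=True,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return time.perf_counter() - start

if __name__ == "__main__":
    for threads in (1, 6, 12):          # one core, all physical cores, all logical threads on an i7-8750H
        print(f"{threads:2d} threads: {encode(threads):6.1f} s")

On a six-core, twelve-thread chip like the i7-8750H you would expect the 6- and 12-thread runs to finish far sooner than the single-threaded one, which is exactly the behavior the chart above reflects.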
Web Tests
Unlike most benchmarks, web benchmarks are influenced heavily by the underlying browser, and since browsers are updated all the time, performance can change over time as well. Normally it trends upward, but we’ve standardized on Microsoft Edge since Windows 10 launched, and Edge performance has taken a step backwards over the last couple of updates.
Performance is still good, but there does seem to be a regression in Edge on some of these tests. When we move to the Chromium-based Edge, we’ll likely take that opportunity to move to some new, more modern web tests.
CPU Conclusion
Acer’s choice to go with the Core i7-8750H is a good one. It lets them compete on price, and the hex-core CPU offers great performance. It can’t quite keep up with the Core i9-8950HK, but it still offers stout performance in the 45-Watt class.
Storage Performance
Acer couples two NVMe PCIe 3.0 x4 SSDs together in the highest model in its Triton 500 range, which is what we have to review. RAID 0 doesn’t really offer much of a benefit for most people on most tasks, although there’s little doubt it boosts storage benchmark results, which is likely why so many gaming laptops ship this way.
In the sequential tests, the RAID 0 pretty much maxes out the PCIe link for reads, although for writes there’s no benefit from the RAID. It likely doesn’t help much with the random results either, which is why the modest gains don’t really outweigh the extra risk of running RAID 0 and the added cost of having to purchase two drives. A single quality 1 TB drive would almost certainly outperform the 2 x 512 GB setup we have here.
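As a rough illustration of why that is, the Python sketch below uses assumed drive figures (not measurements from the Triton 500’s SSDs) to show how striping two drives can saturate the host link on large sequential reads while doing little for low-queue-depth random reads, which are bound by latency rather than bandwidth.

# Back-of-the-envelope RAID 0 scaling sketch. All drive numbers here are illustrative
# assumptions, not measurements from this laptop.
HOST_LINK_LIMIT_MBS = 3500        # roughly a PCIe 3.0 x4's worth of usable bandwidth
SINGLE_SEQ_READ_MBS = 2500        # assumed sequential read speed of one 512 GB drive
SINGLE_RAND_QD1_MBS = 55          # assumed 4K random read at queue depth 1 (latency-bound)

def raid0_seq_read_mbs(drives: int) -> int:
    """Large sequential reads stripe across the drives but cannot exceed the host link."""
    return min(drives * SINGLE_SEQ_READ_MBS, HOST_LINK_LIMIT_MBS)

def raid0_rand_qd1_mbs(drives: int) -> int:
    """Low-queue-depth random reads wait on one request at a time, so striping adds little."""
    return SINGLE_RAND_QD1_MBS

for drives in (1, 2):
    print(f"{drives} drive(s): sequential read ~{raid0_seq_read_mbs(drives)} MB/s, "
          f"4K random (QD1) ~{raid0_rand_qd1_mbs(drives)} MB/s")

Under those assumptions the second drive mostly buys you a bigger sequential number for the spec sheet, while the random performance that dominates everyday responsiveness barely moves.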
Comments
shabby - Thursday, April 25, 2019
Is it really a 2080 when the base clock is cut in half?

Daeros - Thursday, April 25, 2019
No, but that's Nvidia's game - they can honestly say it's the same chip, even though performance is a few steps down in the hierarchy. Just like that 8750H's TDP is nowhere near 45W - probably closer to 120W under load.

Opencg - Thursday, April 25, 2019
You can see in the CPU benchmarks that draw real power for a significant portion of time that it loses a good deal of performance. All in all, it's about where it should be for a laptop this thin. I would be surprised if it is really designed to handle more than 45W; personally, I would bet it can start to throttle on sustained 45W loads.

philehidiot - Thursday, April 25, 2019
I saw the 2080 and then the screen resolution. In reality, you'd probably want 4K + G-Sync for a shortish-lifespan machine or 1440p for one with a good few years on it. 1080p says the performance is compromised and they HAD to drop it down. You'd never run a desktop 2080 at 1080p. I bought a gaming laptop once when I had a real need for it, back in the P4 days. The thing had about 6 fans and chucked out 50C hot air. I required it at the time, but I'd never buy one now unless I absolutely needed it. That had 1050 lines, so 1080 isn't really a step up, it's a marketing ploy ("FULL HD!").

This GPU cannot be considered alongside a real 2080, and whilst I appreciate the screen size means resolutions greater than 1440p would be silly (and arguably even that, but you must remember you're usually closer to a laptop screen, and even a 6" mobile can benefit from the upgrade from 1080 to 1440), to me a gaming laptop is generally 17" anyway. If you go down this path you're rarely looking for real portability, but more because you (in my experience) live in two or three different places and want to take a full gaming PC with you with your suitcase and so on.
wintermute000 - Thursday, April 25, 2019
Exactly, a 2060 would have been perfect for 1080p 144Hz, and then maybe the cooling would have coped. Must be a marketing decision to shove the biggest number into the spec sheet...
PeachNCream - Friday, May 3, 2019
I would not mind in the slightest pushing 1080p resolutions with a 2080 GPU, but not with this particular laptop given the network adapter selection. It just isn't worth messing with Killer NICs at all when there are other options out there.

wintermute000 - Thursday, April 25, 2019
a.) The TDP as defined by Intel (i.e. at base clock) IS 45W.

b.) Power under boost is much higher for sure, but 120W is a total exaggeration. I can get it to run steady at 3.6 GHz (thermal throttling is a different question LOL) at around 60W with an undervolt.

c.) It would take a mean cooler and power delivery/VRMs on a laptop chassis to let it boost anywhere near its paper specs for long durations. I haven't looked at the built-like-a-tank laptops in depth, but none of the mainstream designs have managed it so far.
wintermute000 - Thursday, April 25, 2019
By "it" I mean the i7-8750H (in an XPS 9570, if that matters).

Retycint - Saturday, April 27, 2019
Intel's definition of TDP is very much meaningless because they can change the base clock to fit in the TDP envelope. The i7-8750H maintained the 45W TDP despite having 2 more cores than the 7700HQ, not because the former had a huge leap in efficiency, but rather because Intel dropped the base clock from 2.8 to 2.2 GHz.

In other words, Intel could theoretically claim that the 9750H has a 10W TDP at a base clock of 0.8 GHz, for instance. Which is why TDP numbers are bull.
jordanclock - Thursday, April 25, 2019
Welcome to Max Q! Where the models are made up and the clocks don't matter!