Wireless

Mythlogic offers two different network adapters for the Phobos 8716: the Intel Dual Band Wireless-AC 8260, or the Killer Wireless-AC 1535. The laptop also features two Killer E2400 Gigabit Ethernet adapters. If you go with the Killer solution, as in the review unit, you get access to the Killer software suite, as well as DoubleShot Pro, which routes some traffic over wireless while other traffic goes over Ethernet. Killer, as a company, focuses on maintaining gaming performance regardless of what other networking activity is going on, and their software can help with gaming latency. The 1535 also features MU-MIMO, which is not found on the Intel 8260, although it is supported in the newly announced 8265 model.

WiFi Performance - TCP

After some initial issues with the Killer card, updated drivers really helped. I was only averaging about 120 Mbps on our test download, but a new set of drivers from Killer brought the result up to 440 Mbps, which is much closer to what I was expecting, having tested other laptops with this same card before.

I also had a few network disconnections on the old driver, so if you do have this card, make sure you update the drivers from the Rivet Networks site, and not through Windows Update, which doesn't have the latest version.

The same test was performed over Ethernet, and it downloaded at over 900 Mbps.
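The download numbers above boil down to a simple measurement: time a bulk TCP transfer and divide the bytes moved by the elapsed time. A minimal sketch in Python (the loopback server here is a stand-in for whatever machine hosts the test file; the 32 MB payload size is illustrative):

```python
import socket
import threading
import time

PAYLOAD = b"\0" * (32 * 1024 * 1024)  # 32 MB stand-in for the test download

def serve(srv_sock, payload):
    """Accept one connection, send the payload, then close."""
    conn, _ = srv_sock.accept()
    conn.sendall(payload)
    conn.close()
    srv_sock.close()

# Set up a loopback "server" on an ephemeral port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=serve, args=(srv, PAYLOAD)).start()

# Time the download on the client side.
cli = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
received = 0
while True:
    chunk = cli.recv(65536)
    if not chunk:
        break
    received += len(chunk)
elapsed = time.perf_counter() - start
cli.close()

mbps = received * 8 / elapsed / 1e6
print(f"{received} bytes in {elapsed:.2f} s -> {mbps:.0f} Mbps")
```

Over loopback this saturates quickly; against a real server on the LAN, the same arithmetic yields the ~440 Mbps WiFi and ~900 Mbps Ethernet figures quoted above.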

Audio

The Clevo P870DM2 / Mythlogic Phobos 8716 comes with Sound Blaster X-Fi MB5 onboard audio, and for a look at this solution check out Creative’s page. The Creative software offers a bunch of gaming settings to play with, as well as different profiles to tune the audio to your liking. There are EAX settings, and even a Scout Mode where the system boosts the sound of enemy players so you can hear them from further away. I’m not sure if that’s cheating, but it’s interesting.

The speakers themselves are mounted beside the hinge and fire upwards, which is always a benefit. The sound quality is pretty decent for a notebook, and it is certainly loud enough, with a measured volume of 90 dB when playing back music with the SPL meter one inch over the trackpad.

The Clevo also has a full assortment of audio jacks on the left side of the notebook, much like a desktop. Instead of a single headset jack, there is a headphone jack, a microphone jack, a line-in jack, a line-out jack, and an S/PDIF output shared with the headphone jack. If you use the HDMI output, you can also get 7.1 audio. It would be nice to have these jacks color coded, though, to make it easier to determine which is which when trying to plug in headphones in a dim room. There were no issues once plugged in, though, and the headphones sounded great.

Thermals and Noise

All of this performance is not going to be very useful if the notebook can’t keep the thermals in check. Luckily the Clevo / Mythlogic laptop is a bit of a beast, tipping the scales at over 12 lbs, so there is plenty of room for fans and heatsinks. To test the thermal capabilities, Rise of the Tomb Raider was played on Very High settings for about an hour.

There are a few things to note in the data. First, the GPU temperature rises to the 90°C limit and never exceeds it. Second, the GPU core clock boosts up to about 1900 MHz until it becomes thermally limited and falls to around 1600 MHz. It isn’t throttling at this point, but settling into a steady state: it never falls below the base clock, and is instead likely hitting the TDP limit of the mobile variant of the GPU. Finally, Rise of the Tomb Raider can eat up huge amounts of video memory, so if you’re going to play this game on Very High, be sure your GPU has more than 4 GB.
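Traces like the temperature and clock data discussed above can be reproduced by polling the GPU once a second during a gaming session. A rough sketch, assuming NVIDIA's `nvidia-smi` tool is on the PATH (the query flags are standard, but the polling loop itself is my own illustration, not the tooling used for this review):

```python
import subprocess
import time

# Standard nvidia-smi query for GPU temperature (C) and SM clock (MHz).
QUERY = ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.sm",
         "--format=csv,noheader,nounits"]

def parse_sample(line):
    """Parse one 'temp, clock' CSV line into (deg C, MHz) integers."""
    temp, clock = (int(v) for v in line.split(","))
    return temp, clock

def log_gpu(seconds=3600, interval=1.0):
    """Poll the GPU once per interval for the duration of a play session."""
    samples = []
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        out = subprocess.check_output(QUERY, text=True)
        samples.append(parse_sample(out.strip()))
        time.sleep(interval)
    return samples

# Example of the CSV line this query emits, matching the steady state above:
print(parse_sample("90, 1607"))
```

Plotting the resulting samples over an hour makes the boost-then-settle behavior described above easy to see.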

On the noise side, the fans are on continuously, but when working on the desktop, they never get too loud unless you are pushing the laptop. At idle, the sound level is 39.5 dB(A) with the SPL meter an inch over the trackpad. Some gaming laptops can be screaming jet engines when under load, but this one is not, and even after an hour of gaming the SPL level was only 49.5 dB(A). You would still likely want to wear headphones when gaming, but the laptop doesn’t get hot and it never gets overly loud. This is with the fans at their default settings.
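For context on those SPL numbers, decibels are logarithmic: the 10 dB(A) rise from idle to load corresponds to roughly ten times the acoustic power, but only about twice the perceived loudness by the common psychoacoustic rule of thumb:

```python
# Measured SPL, 1 inch over the trackpad (from the figures above).
idle_dba, load_dba = 39.5, 49.5
delta = load_dba - idle_dba                # 10.0 dB(A) increase under load

power_ratio = 10 ** (delta / 10)           # ~10x acoustic power
loudness_ratio = 2 ** (delta / 10)         # ~2x perceived loudness (rule of thumb)

print(f"+{delta:.1f} dB(A): {power_ratio:.0f}x power, ~{loudness_ratio:.1f}x as loud")
```

So while the fans are clearly audible under load, the subjective jump from idle is a doubling, not the order-of-magnitude scream of some gaming laptops.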

Software

Most gaming laptops come with a bit of software for things like macros, and the Mythlogic Phobos 8716 has a full complement of utilities, but none of the extras you’d see on mainstream OEM laptops. It’s refreshing to see a clean experience, and it’s worth giving a shout out to the crew at Mythlogic for avoiding that trap. What they do include is all purposefully put there.

The basics are utilities like the Killer networking suite, GeForce Experience (which I am not a fan of having to log into now in order to do anything – thanks NVIDIA), and the Sound Blaster software for configuring the installed hardware. But Mythlogic also adds its own tools.

Mythlogic has a control center, which lets you quickly switch the device profile among quiet, power saving, performance, and entertainment modes, all of which set the display and fan settings for those modes. You can also set the volume and turn off the display. Fan speeds can be configured with built-in profiles, a custom profile, or simply set to maximum, which gives some more headroom for the internal components, but at 65 dB(A) it’s not very enjoyable for day-to-day use. The software works fine, although it’s not the prettiest utility around.

They also have their FlexKey software, which serves a few purposes. First, you can use it to customize the keyboard and back panel lighting. Second, it can track statistics on which keys you use, and you can start and stop recording as needed. Finally, you can use it to record and play back macros. There are no dedicated macro keys, so macros will have to be bound to other keys on the keyboard, or to the mouse.

The last bit of software is for those who want to overclock. Since the review unit has an i7-6700K, the CPU is unlocked, and the laptop also supports GPU overclocking. For the CPU, they have included the Intel Extreme Tuning Utility, which allows very fine-grained control of the CPU and memory. On the GPU side, Mythlogic has a tool to change the core and memory clocks, as well as adjust the fan speed as needed. The overall fan speeds can also be set to the Overclock mode in their control center.

Since I am not into overclocking, I shied away from digging into this and possibly breaking something. Besides, it is difficult to imagine needing more performance than this system already gives at stock. However, there’s a whole community devoted to extracting maximum performance, and Mythlogic has provided the tools to do so if you are into that.

Comments

  • BrokenCrayons - Thursday, October 27, 2016 - link

    Minor details...the MYTH Control Center shows an image of a different laptop. It struck me right away because of the pre-Pentium MMX Compaq Presario-esque style hinge design.

    As for Pascal, the performance is nice, but I continue to be disappointed by the cooling and power requirements. The number of heat pipes affixed to the GPU, the fact that it's still reaching thermal limits with such cooling, and the absurd PSU requirements for SLI make it pretty obvious the whole desktop-class GPU in a laptop isn't a consumer-friendly move on NV's part. Sure it cuts back on engineering, manufacturing, and part inventory costs and results in a leaner organization, but it's hardly nice to people who want a little more than iGPU performance, but aren't interested in running up to the other extreme end of the spectrum. It's interesting to see NV approach the cost-cutting measure of eliminating mobile GPU variants and turning it into a selling point. Kudos to them for keeping the wool up on that aspect at least.

    The Killer NIC is something I think is a poor decision. An Intel adapter would probably have been a better choice for the end user since the benefits of having one have yet to be proven AND the downsides of poor software support and no driver flexibility outweigh the dubious claims from Killer's manufacturer.
  • ImSpartacus - Thursday, October 27, 2016 - link

    Nvidia just named their mobile GPUs differently.

    Fundamentally, very little has changed.

    A couple generations ago, we had a 780mx that was based on an underclocked gk104. Nvidia could've branded it as the "laptop" 770 because it was effectively an underclocked 770, just like the laptop 1080 is an underclocked 1080.

    But the laptop variants are surely binned separately, and they are generally implemented on the MXM form factor. So there aren't any logistical improvements just from naming the laptop GPUs differently.
  • The_Assimilator - Thursday, October 27, 2016 - link

    "The number of heat pipes affixed to the GPU, the fact that it's still reaching thermal limits with such cooling, and the absurd PSU requirements for SLI make it pretty obvious the whole desktop-class GPU in a laptop isn't a consumer-friendly move on NV's part."

    nVIDIA is doing crazy things with perf/watt and all you can do is complain that it's not good enough? The fact that they can shoehorn not just one, but TWO of the highest-end consumer desktop GPUs you can buy into a bloody LAPTOP, is massively impressive and literally unthinkable until now. (I'd love to see AMD try to pull that off.) Volta is only going to be better.

    And it's not like you can't go for a lower-end discrete GPU if you want to consume less power, the article mentioned GTX 1070 and I'm sure the GTX 1060 and 1050 will eventually find their way into laptops. But this isn't just an ordinary laptop, it's 5.5kg of desktop replacement, and if you're in the market for one of these I very much doubt that you're looking at anything except the highest of the high-end.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    Please calm down. I realize I'm not in the target market for this particular computer or the GPU it uses. I'm also not displaying disappointment in order to cleverly hide some sort of fangirl obsession for AMD's graphics processors either. What I'm pointing out are two things:

    1.) The GPU is forced to back off from its highest speeds due to thermal limitations despite the ample (almost excessive) cooling solution.

    2.) While performance per watt is great, NV elected to put all the gains realized from moving to a newer, more efficient process into higher performance (in some ways increasing TDP between Maxwell/Kepler/etc. and Pascal in the same price brackets such as the 750 Ti @ 60W vs the 1050 Ti @ 75W) and my personal preference is that they would have backed off a bit from such an aggressive performance approach to slightly reduce power consumption in the same price/performance categories even if it cost in framerates.

    It's a different perspective than a lot of computer enthusiasts might take, but I much prefer gaining less performance while reaping the benefits of reduced heat and power requirements. I realize that my thoughts on the matter aren't shared so I have no delusion of pressing them on others since I'm fully aware I don't represent the majority of people.

    I guess in a lot of ways, the polarization of computer graphics into basically two distinct categories that consist of "iGPU - can't" and "dGPU - can" along with the associated power and heat issues that's brought to light has really spoiled the fun I used to find in it as a hobby. The middle ground has eroded away somewhat in recent years (or so it seems from my observations of industry trends) and when combined with excessive data mining across the board, I more or less want to just crawl in a hole and play board games after dumping my gaming computer off at the local thrift store's donation box. Too bad I'm screen addicted and can't escape just yet, but I'm working on it. :3
  • bji - Thursday, October 27, 2016 - link

    "Please calm down" is an insulting way to begin your response. Just saying.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    I acknowledge your reply as an expression of your opinion. ;)
  • The_Assimilator - Friday, October 28, 2016 - link

    Yeah, but my response wasn't exactly calm and measured either, so it's all fair.
  • BrokenCrayons - Friday, October 28, 2016 - link

    "...so it's all fair."

    It's also important to point out that I was a bit inflammatory in my opening post. It wasn't directed at anyone in particular, but was/is more an expression of frustration with what I think is the industry's unintentional marginalization of the lower- and mid-tiers of home computer performance. Still, being generally grumpy about something in a comments box is unavoidably going to draw a little ire from other people so, in essence, I started it and it's my fault to begin with.
  • bvoigt - Thursday, October 27, 2016 - link

    "my personal preference is that they would have backed off a bit from such an aggressive performance approach to slightly reduce power consumption in the same price/performance categories even if it cost in framerates."

    They did one better, they now give you same performance with reduced power consumption, and at a lower price (980 Ti -> 1070). Or if you prefer the combination of improved performance and slightly reduced power consumption, you can find that also, again at a reduced price (980 Ti -> 1080 or 980 -> 1070).

    Your only complaint seems to be that the price and category labelling (xx80) followed the power consumption. Which is true, but getting hung up on that is stupid because all the power&performance migration paths you wanted do exist, just with a different model number than you'd prefer.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    You know, I never thought about it like that. Good point! Here's to hoping there's a nice, performance boost realized from a hypothetical GT 1030 GPU lurking in the product stack someplace. Though I can't see them giving us a 128-bit GDDR5 memory bus and sticking to the ~25W TDP of the GT 730. We'll probably end up stuck with a 64-bit memory interface with this generation.
