#5 The Lenovo Y700 (Carrizo, FX-8800P + R9 385MX)

The Y700 pre-release unit we had access to didn't have a battery or a wireless module. But it did have a ‘neat’ trick compared to the other APUs in this test: it uses the 35W model of the AMD FX-8800P, which adds a bit more frequency in exchange for some additional power draw. Moving to 35W affords some benefits we’ll go into in a bit, although for some odd reason Lenovo didn’t take full advantage of them here.

Lenovo Y700 (Carrizo) Specifications
Size and Resolution: 15.6-inch, 1920x1080 IPS
Processor: AMD FX-8800P (35W)
    Dual module, 4 threads
    2.1 GHz base frequency, 3.4 GHz turbo frequency
Graphics: Integrated R7
    512 shader cores, 800 MHz maximum frequency, GCN 1.2
    AMD R9 385MX discrete GPU with 2 GB GDDR5
    512 shader cores, 900-1000 MHz core, 1200 MHz memory, GCN 1.2
    Dual Graphics not available in drivers
TDP: Chassis 15W, CPU 35W
Memory: 16 GB (2 x 8 GB) DDR3L-1600 C11, single channel ONLY
Storage: 256 GB SanDisk SSD
Battery: None in our model (otherwise 80 Wh, 4-cell Li-ion)
WiFi: None in our model (otherwise 802.11ac, M.2)
Optical Drive: Optional
Dimensions: 15.24 x 10.91 x 1.02 inches (38.7 x 27.7 x 2.60 cm)
Weight: 5.72 lbs (2.6 kg)
Webcam: 1280x720 with array microphones
Ports: Memory card reader, HDMI, 2 x USB 3.0 + 1 x USB 2.0, Ethernet
Operating System: Windows 10 Home

The Y700 here is paired with a discrete graphics card, AMD's Radeon R9 385MX, which offers 512 streaming processors. The FX-8800P processor also has R7 graphics with 512 SPs at 800 MHz, and in theory one might expect the two to work together automatically in dual graphics mode, but this design is not set up that way. The user is therefore paying for almost the same graphics design twice (though the discrete card has access to much faster memory), with one essentially disabled, only coming into play when the discrete card is shut off. One might postulate that the active idle power of the integrated graphics is lower than that of the discrete card, but it seems an expensive way to save a few hundred mW. Display output support could be another reason, but it still seems odd. The user can, however, manually choose whichever graphics solution they wish from the Catalyst menu.

Another element of the design worth questioning is the memory. Carrizo as a platform supports dual channel memory, but it shares a design structure with Carrizo-L (Puma+), which is single channel only. As a result, a number of OEMs have designed one motherboard for both platforms, which means all Carrizo laptops using that design are limited to single channel operation, reducing performance for the sake of some PCB design savings. This is an aspect we’ll get to later, but it means that the Y700 has access to 16GB of DDR3L-1600 CAS 11, albeit in single channel mode. The fact that it is DDR3L-1600, even though Carrizo supports DDR3-2133, is another reason why such a design can have performance issues.
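To put the memory configuration in perspective, here is a rough sketch of theoretical peak bandwidth for a standard 64-bit DDR channel. This is just illustrative arithmetic (real-world sustained bandwidth is lower and depends on timings and controller efficiency), not a benchmark:

```python
# Theoretical peak DDR bandwidth: transfers/s * 8 bytes per 64-bit
# transfer * number of channels, expressed in GB/s (decimal).
def ddr_bandwidth_gbps(mts, channels=1, bus_bytes=8):
    """Peak bandwidth in GB/s for DDR memory at `mts` megatransfers/s."""
    return mts * bus_bytes * channels / 1000

# The Y700 as shipped: DDR3L-1600, single channel.
shipped = ddr_bandwidth_gbps(1600, channels=1)    # 12.8 GB/s
# What Carrizo supports on paper: DDR3-2133, dual channel.
supported = ddr_bandwidth_gbps(2133, channels=2)  # 34.1 GB/s

print(shipped, supported)
```

On paper the shipped configuration leaves well over half of the platform's potential memory bandwidth on the table, which matters most for the bandwidth-hungry integrated GPU.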

For the other specifications, the Y700 gets a 1920x1080 IPS screen, a 256 GB SanDisk SSD, and some Wi-Fi in an M.2 form factor. I say ‘some’ Wi-Fi purely because our pre-production unit didn’t have any.

This low quality image of the insides shows the dual fan design for the 35W APU and discrete graphics, and we can confirm we didn’t see any throttling during our testing. The two memory modules, despite being part of a single channel design, sit on the right below the slim hard drive, which we replaced with the 256 GB SanDisk SSD. There is also an M.2 slot next to this, though I believe it is SATA only, supporting form factors up to 2280.

Next to the M.2 slot is the bass speaker. The Y700 has an extra vent at the bottom for better sound, rather than being muffled inside the chassis:

The keyboard we had in our model was a mixed English/Japanese variant, though the red backlight shone through the legends clearly.

Brett actually has the Skylake variant in for testing, so I'll let him mull over the design a bit more, but on the sides:

The left gets a charging point, a USB 3.0 port, a multi card reader and a headphone jack. On the right are two more USB ports, an HDMI port, an expanding Ethernet port and a Kensington Lock hole.

Y700 Specific Testing

In the case of the display, out of those we tested it actually comes out best in terms of color accuracy. While I don’t have a spectrophotometer to give exact numbers, the colorimeter graph does the business:

Here red and blue are pretty much dead-on accurate, but green strays too low. The panel gives a good 1032:1 contrast ratio, with a 0.216 nits black level and a 223 nits peak brightness. The peak isn’t very high, which might be a bit concerning in bright environments.
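For reference, the contrast ratio quoted above is simply the peak white luminance divided by the black level. A quick sanity check of the measured figures:

```python
# Contrast ratio = peak white luminance / black level (both in nits).
def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

# Measured values for the Y700 panel: 223 nits peak, 0.216 nits black.
print(round(contrast_ratio(223, 0.216)))  # ~1032:1, matching the quoted figure
```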

One of the downsides to these configurable TDP processors is that the ‘max TDP’ string doesn’t change. It is up to the OEM to make the firmware adjustments, and chances are they won’t open them up to regular users in case someone tries to put 35W through a chassis only designed to handle 15W. The way to tell is in the peak frequencies, and this one goes up to 3.4 GHz.

For the discrete GPU, we get 2 GB of dedicated memory and, thanks to the use of GDDR5, much greater bandwidth than relying on DDR3 alone. The ‘CrossFire available’ message means that GPU-Z recognizes that the CPU and GPU can both be put to work together, but for whatever reason the drivers did not allow it when we tested.
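To illustrate the GDDR5 advantage, here is a back-of-the-envelope estimate. The memory bus width of the R9 385MX is not stated above, so the 128-bit figure below is an assumption typical for this class of mobile GPU, and GDDR5's quad data rate signaling (4 bits per pin per clock) is the other input:

```python
# GDDR5 transfers 4 bits per pin per memory clock (quad data rate).
# bus_bits=128 is an assumed width for this class of part, not a
# confirmed specification for the R9 385MX.
def gddr5_bandwidth_gbps(mem_clock_mhz, bus_bits=128):
    """Theoretical peak GDDR5 bandwidth in GB/s."""
    return mem_clock_mhz * 4 * (bus_bits / 8) / 1000

# 1200 MHz memory clock as listed in the spec table.
print(gddr5_bandwidth_gbps(1200))
```

Under those assumptions the discrete card has on the order of 75+ GB/s of memory bandwidth, versus the 12.8 GB/s theoretical peak of the laptop's single-channel DDR3L-1600, which goes a long way toward explaining why the discrete GPU is worth having despite the near-identical shader count.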

Comments

  • basicmath - Tuesday, February 9, 2016 - link

    No, it really isn't: this laptop came from the factory with dual channel capability, but that capability was not utilised because that would have shown the platform in a much better light. He even states that he checked the chips in the G2 to confirm that it was single channel. Upgrading the RAM on a laptop is a simple process that any end user can perform. The only discernible difference between the APU in the G2 & G3 is the number of GPU cores, so why did he even bother testing the G3 without using dual channel configuration?
  • Intel999 - Sunday, February 7, 2016 - link

    @Ian

    I look forward to that R-series test as it will provide a sneak peek at how much DDR4 relieves the bottleneck on integrated graphics when Bristol Ridge comes out.

    That $70 Athlon X4 845 is intriguing as well.
  • AS118 - Saturday, February 6, 2016 - link

    Good article (although to be fair, I mostly skimmed it), and I agree with the conclusions. AMD should try harder to make sure their high-end products are paired with good components. Single-channel ram, bad screens, and slow hard drives with an A10 or mobile FX defeats the purpose of having those higher end APU's.

    Plus, people will get a bad impression of AMD if a lot of them have poor trackpads, etc. I wish they'd make their own "signature" brand of laptops, and find someone to help make them a thing.
  • TheinsanegamerN - Thursday, February 11, 2016 - link

    Both Clevo and MSI have treated AMD well before, I'm sure either would love to have exclusive rights to the high end AMD notebook.

    That being said, I doubt AMD has the intelligence to pull it off. They seem to be run by monkeys 90% of the time.
  • Cryio - Saturday, February 6, 2016 - link

    I'm really sorry for AMD. Kaveri and Carrizo on mobile, when configured with the proper RAM, cooling and when using the highest performing part ... would've provided awesome performance, compared to Haswell and Broadwell. But no one bothered.

    Bristol Ridge will basically be Carrizo with DDR4 support, and since it will be even better binned 28 nm CPUs, maybe we'll get even higher frequencies out of Excavator. As for GCN 2.0 GPUs ... it will be interesting to see.

    I love my Surface Pro 4, even given the disaster that is Skylake drivers and Windows 10 horrible efficiency compared to W8.1. But MAN. I would've loved a proper Carrizo based Surface Pro/Book.
  • Gadgety - Saturday, February 6, 2016 - link

    A confirmation, with in depth detail. Nice write up.
  • Khenglish - Sunday, February 7, 2016 - link

    I would have really liked to see some dual channel results, or at least pulling a memory stick from the Kaveri and Intel systems to get a fair comparison. AMD says Zen brings a 40% IPC improvement. It'd be great to have a baseline to see if that 40% improvement is enough. In the dual channel intel to single channel AMD comparisons it does not appear to be enough, but we don't know how big of a factor memory was.
  • Jon Irenicus - Sunday, February 7, 2016 - link

    I want to buy an amd part for my next notebook but as was mentioned in the article, oems only choose bargain basement platforms to put the amd inside. The elitebook is the one exception, along with the lenovo if you don't mind the bulk.

    But the elitebooks are super overpriced for what you get. They need to release an hp spectre version of a notebook with a zen apu, a dell xps notebook variant with an apu. Ideally, the models that include a discrete gpu should allow the apu to work in tandem.

    In 2017 dx12 will be in full effect with games, and having two gpus working together by default could give a lot of amd equipped systems a larger edge, especially if the oversized ipc deficits between excavator and intel parts is minimized with zen.

    The future really does rest on Zen, amd needs to laser focus on performance per watt and ipc, and equip the 2017 apus with polaris gpu parts or vega or whatever the first iterations will be called. That has to be the minimum. Put those in nice chassis with solid battery life and that is all they need.
  • Intel999 - Sunday, February 7, 2016 - link

    In DX12 dual graphics will be automatic. Even an Intel Igpu combined with a discrete Nvidia or AMD GPU will "merge" the graphic capabilities in the laptop.

    Theoretically, an AMD APU combined with an AMD GPU might have an advantage as all graphics would be from the same underlying graphic architecture. Time will tell if this bears out.
  • Ryan Smith - Monday, February 8, 2016 - link

    Note that it's only going to be as automatic as the game developer makes it, as devs will be responsible for implementing it. For the moment game devs are going to be the wildcard.
