#5 The Lenovo Y700 (Carrizo, FX-8800P + R9 385MX)

The Y700 pre-release unit we had access to didn't have a battery or a wireless module. But it did have a ‘neat’ trick compared to the other APUs in this test, in that it uses the 35W model of the AMD FX-8800P, which adds a bit more frequency in exchange for some additional power draw. Moving to 35W affords some benefits we’ll go into in a bit, although for some odd reason Lenovo didn’t take all of them here.

Lenovo Y700 (Carrizo) Specifications

Size and Resolution: 15.6-inch, 1920x1080 IPS
Processor: AMD FX-8800P (35W); dual module, four threads; 2.1 GHz base, 3.4 GHz turbo
Integrated Graphics: R7; 512 shader cores at up to 800 MHz; GCN 1.2
Discrete Graphics: AMD R9 385MX with 2 GB GDDR5; 512 shader cores; 900-1000 MHz core, 1200 MHz memory; GCN 1.2
Dual Graphics: Not available in drivers
TDP: Chassis: 15W; CPU: 35W
Memory: 16 GB (2 x 8 GB) DDR3L-1600 C11, single channel only
Storage: 256 GB SanDisk SSD
Battery: None in our model (80 Wh, 4-cell Li-ion otherwise)
WiFi: None in our model (802.11ac M.2 otherwise)
Optical Drive: Optional
Dimensions: 15.24 x 10.91 x 1.02 inches (38.7 x 27.7 x 2.60 cm)
Weight: 5.72 lbs (2.6 kg)
Webcam: 1280x720 with array microphones
Ports: Memory card reader, HDMI, 2 x USB 3.0, 1 x USB 2.0, Ethernet
Operating System: Windows 10 Home
Website: link

The Y700 here is paired with a discrete graphics card, AMD's Radeon R9 385MX, which offers 512 streaming processors. The FX-8800P also has integrated R7 graphics with 512 SPs at 800 MHz, and in theory one might expect the two to work together automatically in dual graphics mode – but this design is not set up that way. The user is therefore paying for almost the same graphics design twice (though the discrete card has access to much faster memory), with one essentially disabled, only coming into play when the discrete card is shut off. One might postulate that the active idle power of the integrated graphics is lower than that of the discrete card, but it seems an expensive way to save a few hundred milliwatts. Display output support could be another reason, but it still seems odd. The user can, however, manually invoke whichever graphics solution they wish from the Catalyst menu.

Another element of the design worth questioning is the memory. Carrizo as a platform supports dual channel memory, but it shares a design structure with Carrizo-L (Puma+), which is single channel only. As a result, a number of OEMs have designed one motherboard for both platforms, which means every Carrizo laptop on that design is limited to single channel operation, sacrificing performance for the sake of some PCB design. This is an aspect we’ll get to later, but it means that the Y700 has access to 16GB of DDR3L-1600 CAS 11, yet only in single channel mode. The fact that it is DDR3L-1600, even though Carrizo supports up to DDR3-2133, is another reason such a design can have performance issues.
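To put rough numbers on the bandwidth left on the table, here is a quick back-of-the-envelope sketch of peak theoretical DRAM bandwidth, assuming the standard 64-bit (8-byte) DDR3 channel width:

```python
# Peak theoretical DRAM bandwidth: transfer rate (MT/s) x channel width (bytes) x channels.
# These are best-case figures; real-world throughput is always lower.
def dram_bandwidth_gb_s(mt_per_s, channels, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000

single_1600 = dram_bandwidth_gb_s(1600, channels=1)  # Y700 as shipped: 12.8 GB/s
dual_1600   = dram_bandwidth_gb_s(1600, channels=2)  # what the platform supports: 25.6 GB/s
dual_2133   = dram_bandwidth_gb_s(2133, channels=2)  # Carrizo's rated maximum: ~34.1 GB/s
```

In other words, the single-channel DDR3L-1600 configuration leaves well over half of Carrizo's rated memory bandwidth unused, which matters most for the integrated GPU.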

For the other specifications, the Y700 gets a 1920x1080 IPS screen, a 256 GB SanDisk SSD and some Wi-Fi in an M.2 form factor. I say ‘some’ Wi-Fi purely because our pre-production unit didn’t have any.

This low quality image of the insides shows the dual fan design for the 35W APU and the discrete graphics, and we can confirm we didn’t see any throttling during our testing. The two memory modules, despite being part of a single channel design, sit on the right below the slim hard drive, which we replaced with the 256 GB SanDisk SSD. There is also an M.2 slot next to this, supporting form factors up to 2280, though I believe it is SATA only.

Next to the M.2 slot is the bass speaker. The Y700 has an extra vent at the bottom for better sound, rather than leaving it muffled inside the chassis:

The keyboard in our model was a mixed English/Japanese variant, though the red backlight shone through easily.

Brett actually has the Skylake variant in for testing, so I'll let him mull over the design a bit more, but on the sides:

The left side gets the charging point, a USB 3.0 port, a multi-card reader and a headphone jack. On the right are two more USB ports, an HDMI port, an expanding Ethernet port and a Kensington lock slot.

Y700 Specific Testing

In the case of the display, it actually comes out best in terms of color accuracy out of those we tested. While I don’t have a spectrophotometer to give you exact numbers, the colorimeter graph does the business:

Here red and blue are pretty much dead-on accurate, but green strays too low. The panel gives a good 1032:1 contrast ratio, with a 0.216-nit black level and a 223-nit peak brightness. The peak isn’t very high, which might be a bit concerning in bright environments.
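The contrast figure follows directly from the two luminance measurements; a quick sanity check:

```python
# Contrast ratio = peak white luminance / black level, both in nits (cd/m²).
peak_white = 223     # nits, measured at full brightness
black_level = 0.216  # nits, measured black level

contrast = peak_white / black_level
print(round(contrast))  # 1032, i.e. the 1032:1 ratio quoted above
```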

One of the downsides to these configurable-TDP processors is that the ‘max TDP’ string doesn’t change. It is up to the OEM to make the firmware adjustments, and chances are they won’t open it up to regular users in case someone puts 35W through a chassis only designed to handle 15W. The way to tell is in the peak frequency, and this one goes to 3.4 GHz.

For the discrete GPU, we get 2 GB of dedicated memory and, thanks to the use of GDDR5, much greater bandwidth than relying on DDR3 alone. The ‘CrossFire available’ message means that GPU-Z recognizes that the CPU and GPU can both be put to work together, but for whatever reason the drivers did not allow it when we tested.
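For a rough sense of how large that bandwidth advantage is, the sketch below assumes a 128-bit memory bus, which is typical for this class of mobile GPU but is not confirmed in the spec sheet:

```python
# GDDR5 transfers four data words per memory-clock cycle (quad data rate),
# so a 1200 MHz memory clock gives an effective 4800 MT/s.
def gddr5_bandwidth_gb_s(mem_clock_mhz, bus_bits=128):  # 128-bit bus is an assumption
    effective_mt_s = mem_clock_mhz * 4
    return effective_mt_s * (bus_bits // 8) / 1000

print(gddr5_bandwidth_gb_s(1200))  # 76.8 GB/s under these assumptions
```

Even allowing for a narrower bus, that is several times the 12.8 GB/s available to the integrated R7 over single-channel DDR3L-1600, which is why the discrete card pulls ahead despite having the same shader count.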

Comments

  • chris471 - Friday, February 5, 2016 - link

    What do you do with all those split hares? Are they any good barbecued?
    ("Octane splits hares between the Kaveri ...")
  • Ian Cutress - Friday, February 5, 2016 - link

    Lightly roasted for me :) Edited, thanks!
  • maglito - Friday, February 5, 2016 - link

    Were you able to test 18Gbps HDMI? The ability to drive an external display with 2160p 4:2:2 @ 60Hz? I guess the lack of a 10bit accelerated video decoder almost makes the point moot for future 2160p content though....

    Otherwise, fantastic article!
  • MonkeyPaw - Friday, February 5, 2016 - link

    Last time I shopped for a laptop (which was recently), I was considering an A10-based HP with a 1080p screen. The problem I saw was in reviews the battery life was really poor. It looked like HP put a small battery in it, making the thing only worthy as a DTR. I ended up going with a Lenovo with an i3. I guess part of the problem is that there are so many variants of laptops that finding a review of a specific model is impossible, and all you have to go on are things like Amazon or Best Buy user reviews, which can be extremely painful to read.
  • Lolimaster - Friday, February 5, 2016 - link

    HP sometimes releases near-"nice" AMD laptops but always cripples them with laughable battery capacities; the same models with Intel inside don't get the nerfs.
  • euskalzabe - Friday, February 5, 2016 - link

    That was a wonderful article that I thoroughly enjoyed reading to start my Friday. Long story short, you perfectly define my laptop buying rationale with "SSD, dual channel memory, 8 hours+ light battery, under 2kg, Full HD IPS panel".

    That's why I bought an i5 UX305. I wanted an AMD machine because I plain like the company and would like them to succeed to bring more competition to Intel, but I found NOTHING even close to the specs you mentioned. The UX305 fit the description perfectly and cost me $750. It was an immediate purchase for me. If AMD managed the OEM relationship to create such a machine, it would be an insta-buy for me. Also, Zenbooks with Zen APUs could be a great marketing strategy :)
  • Shadowmaster625 - Friday, February 5, 2016 - link

    AMD is a company that shoots itself in the foot at every opportunity. I am truly perplexed. Their cat cores shouldn't even exist. But not only do those crippled parts exist, they crippled their premium parts by combining the two platforms! WHY? How could they not see that every notebook would be single channel? They wasted the entirety of their ATI purchase, as you can see with the Rocket League results vs Intel. This is a disgrace.

    AMD needs to realize that it IS AMD who controls the user experience. Look at the MacBook Air. Look at the Surface Pro. Look at the PlayStation 4 and Xbox One. All of these platforms have a set minimum level of performance. Sure they might be more expensive than a $300 Atom clunker, but at least the user will not throw the thing out the window after pulling their hair out.

    AMD needs to put a floor under their products. 4 cores. 8GB of unified HBM. 512 cores GPU. This is the SoC that they need. Sure they can fuse off a core or whatever to harvest bad dies, but this is the minimum die they should be making. Ideally within 2 years they will move to a 4 core, 16GB HBM, and they will replace one stack of HBM with 128GB of HBF. They need to control the memory bandwidth of their SoC. Take away the ability of the OEMs to cripple performance. Use HBF to take away the ability for OEMs to cripple storage performance also. Do this, and every AMD system will be fast. And it will get design wins.
  • t.s - Friday, February 5, 2016 - link

    If you read the article: AMD didn't cripple their premium parts. It was the OEMs. If only the OEMs created mobos that have dual channel memory.
  • xthetenth - Friday, February 5, 2016 - link

    OEMs shaving pennies is as universal a phenomenon as gravity, and designs should be made as such.
  • t.s - Thursday, February 11, 2016 - link

    hence the title, "who controls user experience" :)
