Conclusion: Is Intel Smothering AMD in Sardine Oil?

Whenever a new processor family is reviewed, it is easy to get caught up in the metrics. More performance! Lower power consumption! Increased efficiency! Better clock-for-clock gains! An amazing price! A review conducted through any single lens can fall into the trap of focusing only on that specific metric. So which metrics matter more than others? That depends on who you are and what the product is for.

Tiger Lake is a mobile processor, featuring Intel's fastest cores and new integrated graphics built on an updated manufacturing process. It is aimed at the ultra-premium notebook market, carrying the best that Intel has to offer across a number of its engineering groups. Intel is actively working with its partners to build products that deliver the best performance in the segment just below the point where a discrete GPU becomes necessary.

As a road warrior, pairing the right performance with power efficiency is a must. In our benchmarks, thanks to the new process node technology and the updated voltage/frequency scaling, Tiger Lake not only offers better performance at the same power as Ice Lake, it also extends the range of performance beyond Ice Lake, assisted by the much higher 4.8 GHz turbo frequency. When Tiger Lake gets into retail systems, particularly at the 15 W level, it is going to be fun to see what sort of battery life improvements are observed during real-world workflows.

As an engineer, genuine clock-for-clock performance gains get me excited. Unfortunately, Tiger Lake doesn't deliver much on this front, and in some cases we see regressions due to the rearranged cache, depending on the workload. This metric ignores power - but power is the metric on which Tiger Lake wins. Intel hasn't really wanted to talk about raw clock-for-clock performance, and perhaps understandably so (from a pure end-user product point of view, at any rate).

Tiger Lake also brings security updates such as Control-Flow Enforcement Technology, which is a good thing; however, these are held behind the vPro versions, creating additional segmentation in the product stack on the basis of security features. I'm not sure I approve of this approach, which potentially leaves non-vPro parts less secure while upselling business customers on the benefit.

The new Tiger Lake still falls down against the competition when we start discussing raw throughput tests. Intel was keen to promote professional workflows with Tiger Lake, or gaming workflows such as streaming, particularly at 28 W rather than at 15 W. Despite this, we can easily see that the 15 W eight-core Renoir options blow past Tiger Lake in a like-for-like scenario in our rendering tests and scalable workloads. The only times Intel scores a win are when accelerator support (AVX-512, DP4a, DL Boost) comes into play. On top of that, Renoir laptops in the market are likely to sit in a cheaper price bracket than what Intel seems to be targeting.

If Intel can convince software developers to jump on board with its accelerators, then both customers and Intel's metrics will benefit. The holy grail may be oneAPI, which enables programmers to target different aspects of Intel's ecosystem with the same toolset. However, oneAPI is only just reaching v1.0, and building a software base around it will take a few years to get off the ground.
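For illustration, here is a minimal single-source sketch of what that "same toolset" idea looks like in oneAPI's DPC++/SYCL programming model. This example is not from the review, and it assumes a current SYCL 2020 toolchain such as Intel's dpcpp compiler; the point is simply that one kernel can be dispatched to a CPU, the Xe-LP iGPU, or a future discrete GPU without changing the code.

```cpp
// Minimal oneAPI (DPC++/SYCL) sketch: element-wise vector add.
// Assumes a SYCL 2020 implementation; device choice is left to the runtime.
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    constexpr size_t N = 1024;
    std::vector<float> a(N, 1.0f), b(N, 2.0f), c(N, 0.0f);

    // Default device selector: the runtime picks the "best" available device
    // (a GPU if present, otherwise the CPU) - the kernel code stays the same.
    sycl::queue q;
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    {
        sycl::buffer<float, 1> bufA(a.data(), sycl::range<1>(N));
        sycl::buffer<float, 1> bufB(b.data(), sycl::range<1>(N));
        sycl::buffer<float, 1> bufC(c.data(), sycl::range<1>(N));

        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only);
            // Simple element-wise add; ISA and work-group mapping details
            // are handled by the toolchain for whichever device was picked.
            h.parallel_for(sycl::range<1>(N), [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];
            });
        });
    }   // buffers go out of scope here, copying results back to the host vectors

    std::cout << "c[0] = " << c[0] << " (expected 3)\n";
    return 0;
}
```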

For end-user performance, Tiger Lake is going to offer a good improvement over Ice Lake, or the same performance at lower power. That's hard to ignore. If Intel's partners can fit 28 W versions of the silicon into the 15 W chassis they were using for Ice Lake, then it should make for a good product.

We didn't have much time to go into the performance of the new Xe-LP graphics, although it was clear that the 28 W mode gets a good performance lift over the 15 W mode, perhaps indicating that DG1 (the discrete graphics coming later) is worth looking out for. Against AMD's best 15 W mobile processor and its integrated graphics, our results at lower resolutions were perhaps skewed towards AMD, while the higher resolutions were mostly wins for Intel - it seemed to vary a lot depending on the game engine.

As a concept, Tiger Lake's marketing frustrates me. Not offering apples-to-apples data points, and claiming that TDP isn't worth defining as a singular point, demonstrates the lengths Intel believes it has to go to in order to redefine its market and obfuscate direct comparisons. There was a time when Intel felt the need to share as much as possible with us. It let us sculpt the story of where we envisaged the market going, with OEMs and customers on hand to add their own perspectives, and it let us as the press filter back comments, critiques, and suggestions. The new twist from Intel's client division, one that has been progressing along this quagmire of a path for a while, will only serve to confuse its passionate customer base, its enthusiasts, and perhaps even the financial analysts.

However, if we're just talking about the product, I'm in two minds about Tiger Lake. It doesn't deliver the raw clock-for-clock performance gains that I'd like, mostly because the CPU cores are almost the same design as Ice Lake's, but the expanded range of performance coupled with the energy efficiency improvements makes it a better product overall. I didn't believe the efficiency numbers at first, but successive tests showed good gains from Intel's manufacturing as well as from the silicon design and power flow management. Not only that, the new Xe-LP graphics seem exciting, and warrant a closer inspection.

Tiger Lake isn't basting AMD in sardine oil just yet, but it stands to compete well in a number of key markets.

Comments

  • blppt - Friday, September 18, 2020 - link

    Yeah, we can extrapolate such things if power consumption and heat dissipation are of no relevance to AMD. You're leaving out other factors that go into building a top line GPU.
  • AnarchoPrimitiv - Saturday, September 26, 2020 - link

    Power? It will certainly be better than Ampere, which is awful at efficiency... Are you forgetting that RDNA2 will be on an improved 7nm node, a better 7nm node than the one RDNA used?
  • Spunjji - Friday, September 18, 2020 - link

    Big Navi probably won't clock that high for TDP reasons, but the people who are buying that it's only going to have 2080Ti performance are in for a rude surprise. It should compete solidly with the 3080, and I'm betting at a lower TDP. We'll see.
  • blppt - Saturday, September 19, 2020 - link

    It's been AMD's modus operandi for a long time now. Introduce a new card, and either because of inferior tech (occasionally) or drivers (mostly), it usually ends up matching Nvidia's last-gen flagship, although also at a lower price.

    Considering the leaked benches we've already seen, Big Navi appears to be more of the same. Around 2080Ti performance, probably at a much lower price, though.
  • Spunjji - Saturday, September 19, 2020 - link

    @blppt - not sure if you're shilling or credulous, but there's no indication that those leaked benchmarks are "Big Navi". Based on the probable specs vs. the known performance of the 3080, it's extremely unlikely that it will significantly underperform the 3080. It's entirely possible that it will perform similarly at lower power levels. They're also specifically holding back the launch to work on software.

    In other words: assuming AMD will keep doing the same thing over and over when they already stopped doing that (see: RDNA, Zen 2, Renoir) is not a solid bet.

    But none of this is relevant here. It's amazing how far shills will go to poison the well in off-topic posts.
  • blppt - Sunday, September 20, 2020 - link

    Considering that the 2080ti itself doesn't "significantly underperform the 3080", Big Navi being in line with the 2080ti doesn't qualify it as getting pummeled by the 3080.
  • blppt - Sunday, September 20, 2020 - link

    Oh, and BTW, I am not a shill for Nvidia. I've owned many AMD cards and cpus over the years, and they have been this way for a while. I keep wishing they'll release a true high end card, but they always end up matching Nvidia's previous gen flagship.

    Witness the disappointing 5700XT in my machine at the moment. Due to AMD's lesser driver team, it is often less consistent in games than my now-ancient 1080ti. Even in its ideal situation, with well-optimized drivers in a game that favors AMD cards, it just barely outperforms that old 1080ti. Most of the time it's around 1080 performance.

    Actually, YOU are the shill for AMD if you keep denying this is the way they have been for a while.

    "In other words: assuming AMD will keep doing the same thing over and over when they already stopped doing that (see: RDNA, Zen 2, Renoir) is not a solid bet."

    Except - they STILL don't hit the top of the charts in games with their CPUs. Zen/Zen 2 is a massive improvement, and dominates Intel in anything highly multi-core optimized, but that almost never applies to games.

    So, going to a Zen comparison for what you think Big Navi will do is not a particularly good analogy.
  • Spunjji - Sunday, September 20, 2020 - link

    @blppt - "I'm not the shill, you're the shill, I totally own this product, let me whine about how disappointing it is though, even though performance characteristics were clear from the leaks and it still outperformed them. I bought it to replace a far more expensive card that it doesn't outperform". Okay buddy, sure. Whatever you say. 🙄

    I didn't say it would take the performance lead. Going for a Zen comparison is exactly what I meant and I stand by it. We will see, until benchmarks come out it's all just talk anyway - just some of it's more obvious nonsense than the rest...
  • blppt - Sunday, September 20, 2020 - link

    @Spunjji

    That was the dumbest counter argument I've ever heard.

    First off, I didn't buy it to 'replace' anything. The 1080ti is in one of my other boxes. Where did you get 'replace' from? The 5700XT was to complete an all-AMD rig consisting of a 3900X and an AMD video card.

    Secondly, the 1080ti is now almost 4 freaking years old. You bet your rear end I'd expect it to outperform a top end card from almost 4 years ago, when it is currently STILL the best gpu AMD offers.

    And finally, I have over 20 years experience with both AMD cpus and gpus in various builds of mine, so don't give me that "bought one AMD product and decided they stink" B.S.

    I've been on both sides of the aisle. Don't try and tell me I'm a shill for Nvidia. I've spent way too much time and money around AMD systems for that to be true.
  • AnarchoPrimitiv - Saturday, September 26, 2020 - link

    You're a liar. I'm so sick of Nvidia fans lying about owning AMD cards.
