Everyone knows that Intel doesn't make the fastest graphics solutions in the world right now. In fact, at best their GPUs are typically equivalent to the slowest GPUs that AMD and NVIDIA make; at worst, they're about half that level of performance. But they have one quality that makes them essentially ubiquitous: they're "free". Outside of Intel's enthusiast LGA2011 platform, all current Intel platforms use processors that include some form of graphics. From the lowly Celerons and Pentiums up to high-end Core i7 processors, if you buy any of Intel's consumer CPUs you're going to get some form of Intel graphics.
In the past few months, Intel launched their latest 5th Generation Core series of processors. Okay, never mind the counting of generations -- if we start with the 65nm Core Duo (Yonah) as the 1st Generation, we would currently be on the 8th (or even 9th) Generation with Broadwell. But with the 45nm Nehalem family as the 1st Generation (it was the first time we saw "Core i-whatever", after all), Sandy Bridge was 2nd Generation, Ivy Bridge was 3rd Generation, Haswell was 4th Generation, and now Broadwell is the 5th Generation. Interestingly, the biggest gains have mostly come in mobile products (the jump from 1st Gen Clarksfield to 2nd Gen Sandy Bridge was quite massive for notebooks!) as well as graphics performance. Each generation has trended toward higher performance than the previous release, sometimes improving graphics performance by roughly 100%.
You'd think with all the advances that Broadwell would be a reasonably fast gaming chip by now, and it might very well be just that... but not in the current Broadwell-Y and Broadwell-U products. The reason is really quite simple: TDP, or Thermal Design Power. For all the improvements in CPU efficiency over the years, Intel's GPUs haven't really kept pace. Throw more resources at the graphics solution and performance goes up, but so does power use. So in chips like the 4.5W Core M products and the 15W ULV (Ultra Low Voltage) U-series processors, regardless of core counts and clock speeds the current Intel GPU solutions are simply not fast enough for anything but very light workloads.
In theory, the architectural changes that have occurred with Broadwell should allow the current GT2 (24 Execution Units / EUs) and GT3 (48 EUs) to provide more than adequate performance for gaming. In reality, 15W TDP means that when gaming, GT2 Broadwell-U is no faster than GT2 Haswell-U, and in fact there are 15W GT3 (Iris Graphics 6000) Broadwell-U parts (chiefly used in Apple's latest MacBook Air laptops) that are hardly any faster than their GT2 siblings. There are some 28W GT3 parts (Iris Graphics 6100) as well, which would be very interesting to test as they might double the performance thanks to the increased TDP, but so far outside of the new Apple MacBook Pro Retina 13 we haven't been able to find anyone shipping the Core i3-5157U, Core i5-5257U, or Core i5-5287U -- but give it some time and that should change. The Core i7-5557U on the other hand does show up in one product, an Intel NUC5i7RYH (NUC = Next Unit of Computing).
But I digress; just how bad is graphics performance on 15W Broadwell-U with HD 5500? We ran a Core i5-5300U (in a Lenovo ThinkPad T450s) through our full suite of gaming benchmarks to find out. We are using the latest available Intel drivers at the time of writing, version 10.18.14.4156, which requires uninstalling the Lenovo-provided drivers first but otherwise there were no issues. Before we get to the results, there are two items to mention. First, at least one game we tested (Assassin's Creed: Unity) refused to run at all -- it crashed to desktop every time we tried to load a save, sometimes even before trying to load a save. Second, rather than beating a dead horse by running at higher quality settings, we confined testing to 1366x768 and Low settings. This should be a very low bar to clear, but as you'll see HD Graphics 5500 with a 15W TDP struggles at best with most recent releases, though older/lighter games are generally okay.
Not surprisingly, even at 1366x768, the Lenovo ThinkPad T450s fails to reach playable frame rates, coming in at less than 30FPS in roughly 75% of the games we tested. What's more, if the goal is to keep minimum frame rates above 30FPS, we're down to five games -- and many of these are known for being quite forgiving at low quality settings. Both GRID 2 and GRID Autosport pass muster, as does the aging Elder Scrolls: Skyrim. The Talos Principle and Tomb Raider are the final two games, though Tomb Raider at least starts to look quite ugly at Low settings. And this is running most of the games at their minimum quality settings -- not a good sign.
You might be wondering, "Okay, HD 5500 seems pretty weak, but how do you know the problem is the 15W TDP rather than just a poor showing in general for Intel?" It's a valid question, and the answer comes from logging clock speeds during several of the tests: both the CPU and GPU are clearly throttling to stay within the 15W TDP. And yes, this is technically "throttling" -- the i5-5300U, for example, is supposed to run its CPU cores at 2.3-2.9GHz under load, yet we routinely see it drop well below 2.0GHz while gaming, though you could argue the CPU simply doesn't need to run faster to feed the (slow) GPU. The GPU meanwhile runs at 750-850MHz most of the time rather than being pegged at its maximum 900MHz, which means that doubling the number of EUs (e.g. in the CPUs Apple uses for the MacBook Air) will likely have very little positive effect on performance.
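The throttling check itself is straightforward: sample the CPU and GPU clocks during a benchmark run and count how often they fall below the rated speeds. As a minimal sketch (assuming a hypothetical CSV log exported from a monitoring utility -- the column names and sample values here are illustrative, not our actual log), the analysis might look like:

```python
import csv
import io

# Rated clocks for the Core i5-5300U, per Intel's published specs
CPU_BASE_MHZ = 2300   # base clock; turbo tops out at 2900MHz
GPU_MAX_MHZ = 900     # HD 5500 maximum dynamic frequency

# Hypothetical log from a hardware monitoring tool:
# one row per sample, CPU and GPU clocks in MHz.
log = io.StringIO("""cpu_mhz,gpu_mhz
1900,800
1850,750
2100,850
1950,800
""")

samples = list(csv.DictReader(log))
cpu_throttled = sum(1 for s in samples if int(s["cpu_mhz"]) < CPU_BASE_MHZ)
gpu_throttled = sum(1 for s in samples if int(s["gpu_mhz"]) < GPU_MAX_MHZ)

print(f"CPU below base clock in {cpu_throttled}/{len(samples)} samples")
print(f"GPU below max clock in {gpu_throttled}/{len(samples)} samples")
```

When both counts sit near 100% of samples under a sustained gaming load, as they did in our logs, the TDP ceiling rather than the architecture is the limiting factor.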
An interesting counterpoint to the HD 5500 showing is to look at HD 4600 performance from a desktop Core i7-4770K. Granted, the TDP is over five times higher and clock speeds even on just the GPU side are substantially higher, but the graphics solution consists of 20 EUs instead of 24 and along with architectural improvements in theory the HD 5500 should come close to equaling its performance. We've benchmarked most of the same games on that part; the result? On average, the desktop HD 4600 is over 50% faster, and in some cases (F1 2014) it's more than twice as fast. Looking at only 18 games (as there are several we haven't yet tested on the HD 4600), over half break 30FPS average with the desktop chip. And it's almost certainly due to the higher TDP more than anything, as the desktop was also tested with older drivers.
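For readers curious how a claim like "over 50% faster on average" is derived from per-game numbers, the geometric mean of the per-game FPS ratios is the standard way to aggregate relative benchmark performance (a plain arithmetic mean lets one outlier game dominate). The FPS pairs below are purely illustrative placeholders, not our actual results:

```python
import math

# Illustrative per-game average FPS pairs (hypothetical numbers):
# (desktop HD 4600, ULV HD 5500)
fps_pairs = [(45.0, 28.0), (62.0, 30.0), (25.0, 11.0), (38.0, 24.0)]

# Geometric mean of per-game ratios: average in log space so each
# game contributes proportionally, regardless of absolute FPS.
ratios = [desktop / ulv for desktop, ulv in fps_pairs]
geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print(f"HD 4600 is {geomean:.2f}x the HD 5500 on average")
```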
What's surprising to me is just how many games don't come near playable levels of performance with the HD 5500/Core i5-5300U. More than half of the games can't even break 20FPS, and even at 800x600 Low settings they would remain unplayable. The good news is that only one game completely failed to run, so Intel has made some real strides with their drivers over the past few years -- I remember testing Sandy Bridge when it was first launched and at least a third of games I tried to play either wouldn't work or had rendering problems. There remains another problem that does warrant mention: scaling of 1366x768 content to fill the screen almost never worked. Instead, the game would only fill the middle portion of the screen, with large black bars around the edges. I've seen this before, so it's something Intel needs to go back and fix (again).
While we're using the Lenovo ThinkPad T450s as our test laptop for this article, I do want to make it clear that the gaming results here are by no means a slam on the laptop or Lenovo. In fact, for non-gaming purposes this is an excellent laptop -- very likely the best true business laptop you could currently buy. You can get eight or more hours of battery life with the default battery configuration, and basically double that if you purchase the extended battery. And for non-gaming use, the i5-5300U is quite peppy.
Wrapping things up, hopefully not too many people are surprised by the performance results from Intel's ULV processor when it comes to gaming. This has been a problem with Haswell-U as well as Ivy Bridge-U before it; 15W simply isn't sufficient headroom for running a modern Intel GPU and CPU at acceptable performance levels. The CPU side is fast enough, but the GPU half of the chip could easily use 15-20W on its own. It's worth noting that NVIDIA's entry-level mobile GPUs (e.g. GeForce 820M or GeForce 840M) seem to use around 25-30W while providing typically 3X the performance of Intel graphics solutions, all on a less efficient process node, so Intel has room for improvement. And before anyone points at the tablet and smartphone markets, remember that the SoCs in tablets and smartphones offer less than half the performance of the i5-5300U, and they also tend to run at much lower quality settings for Android games. The games are in many cases similar to what you would see in PC games from 2004; if you want to play games with graphics from 2004, Intel's HD 5500 performance would be quite decent in most cases. Sadly, looking at the calendar on the wall, we're now in 2015.
Here's to hoping that when the desktop and high performance Broadwell processors start shipping (which should be sooner rather than later), the increased TDP will allow Intel's 5th Generation of HD Graphics to finally stretch its legs. With 48 EUs finally showing up in the mainstream desktop processors (compared to the 20 EUs in chips like the i7-4790K), we could actually see very respectable performance. There's also Skylake waiting in the wings, which may end up being more of a boost to the GPU aspect than the CPU side of things. Skylake will move to a new GPU architecture, and it will also be available in a GT4 (72 EU) configuration, as well as a GT4e variant with embedded DRAM. The problem of course is that anyone serious about graphics and gaming will likely already have a GPU that's 5X faster than a GT4e Skylake configuration.