The holiday rush to get games on store shelves always creates problems. For one, games that are otherwise good can get lost in the shuffle, but perhaps even worse is that publishers often ship games in a state where they're not really complete. Sometimes the games are buggy and unstable, other times they're missing features or performance is worse than it should be -- and in the worst cases they completely fail to run. We encountered a bit of this with Lords of the Fallen, going so far as to retest all of the graphics cards with the updated (patched) version of the game. We could go back and retest most of the games released late last year at least one more time as patches and driver updates continue to come out. However, there are plenty of other items keeping us busy, so instead let's look at one game where we have both initial (near launch date) performance and current (as of the end of January) performance: Assassin's Creed Unity.
The Assassin's Creed series has a long-standing tradition of being a bit of a pig to run at maximum quality settings, though the last couple weren't quite so bad. With the next-generation consoles now taking over and sporting more RAM than ever before, however, we are starting to encounter problems with GPUs that "only" have 3GB of VRAM. In fact, at maximum quality and high resolutions, Assassin's Creed Unity will pretty much fail to run well on any of the current crop of cards with 4GB of VRAM or less -- including NVIDIA's GTX 970 and GTX 980! The reason is that the textures alone use over 3GB of VRAM at the Ultra quality setting, and at QHD (2560x1440), for instance, you have to store multiple buffers that are each roughly 29MB (though some will be half that, 14.5MB). All told, you're realistically looking at around 300-400MB of VRAM to run a game at QHD with 4xMSAA, compared to around half that amount at 1920x1080 (160-220MB, give or take). And if you want to do all that at 4K instead of QHD, it's a 125% increase in VRAM requirements -- so 675-900MB of VRAM just for buffers. Note that this is all before even storing textures and shadows!
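If you want to sanity check those buffer figures, here's a quick back-of-the-envelope sketch. The bytes-per-pixel values are our assumptions (8 bytes for a wide/HDR-style render target, 4 bytes for a standard 32-bit one), not numbers confirmed by Ubisoft:

```python
# Rough full-screen buffer sizes; 8 and 4 bytes per pixel are assumed
# formats, not figures confirmed by the game's developers.
def buffer_mb(width, height, bytes_per_pixel):
    """Size of one full-screen buffer in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

big = buffer_mb(2560, 1440, 8)    # ~28.1 MiB -- the "roughly 29MB" buffers
small = buffer_mb(2560, 1440, 4)  # ~14.1 MiB -- the "half that" buffers
print(f"QHD buffers: {big:.1f} MiB / {small:.1f} MiB")

# Going from QHD to 4K multiplies the pixel count (and thus buffer
# memory) by 2.25x -- i.e. the 125% increase mentioned above.
scale = (3840 * 2160) / (2560 * 1440)
print(f"4K vs QHD pixel count: {scale:.2f}x")
```

Multiply a handful of those buffers by 4xMSAA and you arrive in the same 300-400MB ballpark for QHD before a single texture is loaded.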
What all this means is that even if a developer says, "Well, most PC GPUs don't have more than 3GB of VRAM, and very few have more than 4GB, so let's try not to go over 2.5GB in textures," depending on your screen resolution you can still run out of VRAM. There are plenty of other factors at play of course, but basically you can kill performance on any GPU if you reach the point where you're trying to store more in VRAM than the card can hold. Your desktop might have 25.6GB/s of system memory bandwidth, but a high-end GPU has roughly nine to twelve times that for its VRAM -- the GTX 980 delivers 224GB/s while the R9 290/290X manage 320GB/s! Getting back to the game at hand, Assassin's Creed Unity is definitely pushing a lot of textures around, so as we'll see in a moment there are cases where performance falls off a cliff at higher quality settings.
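As a rough illustration of the gap, the ratios work out as follows; the 25.6GB/s system figure assumes a typical dual-channel DDR3-1600 desktop of the era:

```python
# Bandwidth gap between system RAM and VRAM. The system-memory number
# assumes dual-channel DDR3-1600 (~25.6 GB/s); GPU numbers are the
# official memory bandwidth specs cited in the text.
system_ram = 25.6      # GB/s, assumed dual-channel DDR3-1600
gtx_980 = 224.0        # GB/s
r9_290x = 320.0        # GB/s

print(f"GTX 980:  {gtx_980 / system_ram:.2f}x system RAM bandwidth")
print(f"R9 290X: {r9_290x / system_ram:.2f}x system RAM bandwidth")
# Anything that overflows VRAM has to be streamed across this much
# slower path (further constrained by PCIe), which is why frame rates
# collapse once a game outgrows a card's memory.
```

That gap is the whole story behind the "falls off a cliff" behavior: the GPU goes from hundreds of GB/s to tens of GB/s the moment it has to page assets out of system memory.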
Assassin's Creed Unity - Performance and Analysis
Current (Feb 2015) vs. Initial (Nov. 2014)
Let's get into the specifics of what Assassin's Creed Unity is doing that makes it such a demanding game to run. First, it's a large open world game set in a very detailed facsimile of Paris around the time of the French Revolution (1789-1799). Powered by the AnvilNext engine, the game has large crowds of people in quite a few areas, and all of the AI (Artificial Intelligence) routines to move those people around, plus animations, will chew up a decent amount of CPU power. On top of all that, add in all the textures, anti-aliasing, shadows, and a few GameWorks technologies from NVIDIA and you end up with a very complex simulation. I should also note that while tessellation was promised for the game, the patch to add it has yet to arrive, so if anything Assassin's Creed Unity will only become more difficult to run once full tessellation is in place.
Due to the memory requirements of the game, we've elected to run our 4K results at High quality -- Ultra is simply not playable on current consumer GPUs (though perhaps the 6GB and 8GB cards like the GeForce GTX Titan and the Radeon R9 290X 8GB would do okay). We've got two charts this time: the top one shows current performance using the latest updates at the time of writing (late January 2015), with AMD's Omega Catalyst drivers and NVIDIA's latest 347.25 WHQL drivers; the bottom chart shows performance after the 1.2 patch, with the 344.65 NVIDIA Game Ready driver for Assassin's Creed Unity and AMD's 14.11.2 Beta driver.
Most of the changes from the initial release are pretty minor so far; AMD performance improves a bit in some cases and a lot in others (particularly at 4K and 1080p High), while strangely NVIDIA GPUs tend to be slower at Ultra settings now -- though 4K and 1080p High show impressive gains on the 970 and 980 in particular (less so on the 770/780). As a whole, it looks as though patches and driver updates have benefited AMD the most with Assassin's Creed Unity, with one exception: NVIDIA's SLI sees a major jump in performance across the board. But improvements in performance aren't the same as actually being the fastest GPU for the game, so let's focus on the current state of performance for the remainder of the discussion.
The overall performance crown clearly belongs to NVIDIA, which is expected as Assassin's Creed Unity is part of NVIDIA's TWIMTBP program (not to mention the GameWorks collaboration). The SLI GTX 970 configuration places at the top of every single chart, often with a sizeable lead over a single GTX 980. A few SLI rendering issues remain, but when I've played the game lately I haven't noticed anything glaring. CrossFire, however, is a different story: three months after launch, as far as I'm aware there is still no official CrossFire support in Assassin's Creed Unity; not only that, but simply having CrossFire enabled results in a rather large drop in performance. So for dual-GPU configurations, NVIDIA easily wins this round.
Looking at the single GPU results, as usual the GTX 980 continues to be the fastest card on the block. At 4K with High settings, the R9 290X is nipping at its heels and they're basically tied, but both GPUs struggle at that setting so it's not a very meaningful comparison. When we drop to 2560x1440 Ultra, the top NVIDIA GPUs maintain their performance while minimum frame rates on the AMD GPUs take a hit. Interestingly, the Ultra setting is the only case where the CrossFire 290X configuration actually leads a single 290X, but it does so by delivering higher average frame rates with lower minimum frame rates, so in general it's a less desirable experience. Once we reach 1080p Ultra and lower settings, the 4GB NVIDIA GPUs establish themselves at the top followed by the 290X, with the other GPUs filling out the ranks.
That lengthy discussion on VRAM at the start is also there for a reason -- look at the results of the GTX 780 3GB and the GTX 770 2GB cards! It's not like you'd actually want to buy either card at this point, but both GPUs are basically only able to reach acceptable performance at 1920x1080 High, and minimum frame rates are still a concern. In fact, even at 1600x900 Medium the GTX 770 continues to struggle and all of the 3GB cards easily beat it. Also worth pointing out is that AMD GPUs all seem to hit a major CPU bottleneck of around 64 FPS, while for NVIDIA GPUs the bottleneck is around 95 FPS. Clearly AMD could still improve performance with further driver optimizations.
Finally, let's go over image quality, and here's where much of the above conversation starts to fall apart. While it's true that you can choose settings that will kill performance with Assassin's Creed Unity, the reality is that the difference between the various settings isn't actually all that great. In fact, you could make a very good argument that the post processing used to achieve anti-aliasing pretty much makes the game look like garbage -- all of the textures can look quite blurry, with the exception of those at the Low setting. There's also an issue of the game sometimes apparently loading lower quality textures even at higher quality settings -- or using the wrong mipmap at least.
Whatever the cause, other than the heavy aliasing the Low setting seems to look better than the Medium and High settings in many ways. Given the already heavy performance demands of Assassin's Creed Unity, you'll be better off tuning a few settings by hand rather than using the presets; we'd suggest starting with the High or Very High Texture Quality setting and then playing with the options for Bloom, Shadow Quality, and Anti-Aliasing. Or just use GeForce Experience and let it select optimal settings for you, as it generally does a decent job of balancing quality with performance.
Wrapping up, we want to touch on the potential to really hurt a franchise by rushing out a holiday release. We've actually enjoyed playing Assassin's Creed Unity over the past several months, in between benchmarking and other tasks, but we also happen to have a very beefy collection of hardware available. If you look at the reviews, however, most of those were posted back in November and December, when performance was quite a bit worse on certain configurations. The result is that, with an average score of 71%, we believe this is the lowest rated game in the Assassin's Creed series to date. You can make a strong case that releasing a buggy game in what is arguably a beta state was reflected in the reviews, which in turn can cost a lot of sales.
Even today the "true" Assassin's Creed Unity experience has yet to be delivered, as bugs and technical glitches remain, along with missing features like tessellation. While the holiday shopping season can account for 75% of annual sales (or in some cases even more), chasing that paycheck at all costs can seriously hurt in the long term. Let's hope Ubisoft learns their lesson, but given that Far Cry 4 was also a buggy release and Ubisoft has been in this business for a while, apparently they've decided it's better to ship in November and patch over the coming months than to miss the holiday season. That's unfortunate, but as long as people continue to buy buggy products the situation is unlikely to change. Perhaps the only good news is that the negative publicity has driven the price down to $30 instead of $60, and with another patch and some driver updates, Assassin's Creed Unity will definitely be worth picking up.