
Sniper Elite 3 Performance – Now with Mantle

This past week AMD had some interesting news with the release of an update to Sniper Elite 3 ($49.99). While it's not the first game to get support for AMD's Mantle API, it's the most recent one to receive such treatment and number four on the list of Mantle-enabled games. For the record, the first three Mantle-enabled games are Battlefield 4, Thief, and Plants vs. Zombies: Garden Warfare. That last game uses the same Frostbite engine as Battlefield 4, which is why it has Mantle support; going forward, quite a few upcoming games also use the Frostbite engine, so we'll start seeing even more games that use AMD's Mantle API.
But I'm probably getting ahead of myself for some of you. If you're wondering, "What exactly is Mantle?", well, the short summary is that it's a low-level API (Application Programming Interface) that allows software developers more direct control over the graphics hardware. Direct3D and OpenGL are the two most popular APIs for creating games and other applications that utilize graphics hardware, though there are other APIs as well (OpenCL, CUDA, and DirectCompute come to mind). OpenGL and Direct3D are similar in that they offer quite a bit of abstraction from the hardware, which makes them easier for programmers to use but also results in less than optimal performance.
A really talented programmer might extract several times more performance from computing hardware by coding directly for the hardware (this is how consoles like the Xbox 360 have managed to stay relevant for nearly a decade since their launch). The drawback is that programmers end up targeting a very specific set of hardware, and if you're not using that exact hardware then you may not even be able to run the program. It can also require far more time and energy to write and debug the code than using a higher-level API, which is one of the reasons for the existence of OpenGL and Direct3D. Mantle is sort of an in-between API: it's not quite programming directly for the hardware, but it does provide more access to hardware features than OpenGL or Direct3D. And incidentally, Direct3D 12 will also provide much in the way of low-level access to graphics hardware when it launches sometime in the next year.
Okay, that's the long introduction to Mantle, but the real question is: does it actually help in real games? Battlefield 4 and Thief were the first two Mantle-enabled games, and both have been a bit hit-and-miss -- driver updates and game patches have sometimes wreaked havoc with Mantle performance. That's the drawback with low-level APIs, and it's something of an ongoing concern. Mantle also tends to be most helpful in situations where the CPU is the bottleneck, as it reduces CPU overhead. What that means is that if you're running at maximum detail and the graphics subsystem is the bottleneck, Mantle may not do much for frame rates -- and in fact it could actually cause frame rates to drop. There's another big issue, and that's multiple GPUs. With the added control over the GPUs, Mantle could potentially create problems for CrossFire users, and at this point in time that looks to be the case with Sniper Elite 3 -- Mantle with CrossFire basically renders the second GPU mostly worthless.
Regarding CrossFire, Sniper Elite 3 developer Rebellion notes that this is something of a double-edged sword. It can create more work for the developers on the one hand, but it also creates the potential for much better use of the hardware. Systems with non-identical GPUs in particular could benefit from Mantle, as a developer could use one GPU for geometry and triangle setup while the second, faster GPU handles the main rendering. Offloading work from the CPU to the GPU via Mantle also creates the potential for better artificial intelligence (AI), as the freed-up CPU performance can be used for more complex AI routines. However you want to look at it, Mantle is certainly interesting, and with Direct3D 12 coming next year it could very well be a taste of things to come. Let's move on to the performance run-down.

Sniper Elite 3 with Mantle - Performance and Analysis

Sniper Elite 3 is actually a bit of an odd choice for the Mantle treatment, as truth be told it's not particularly demanding of most modern GPUs. Unless you turn on SSAA (Super-Sample Anti-Aliasing), which we'll use in our Ultra settings for the benchmarks, the game can easily hit 60+ FPS on a single GPU, never mind CrossFire configurations. It's not that the game looks bad -- at least not at higher quality settings -- but it just doesn't push the hardware like Crysis 3 or Metro: Last Light. Our charts this time have a few extra entries for the Mantle-enabled R9 280, R9 280X and R9 290X single GPU configurations; again, CrossFire with Mantle results in worse performance than a single GPU, so we've skipped those results for now (though if this gets fixed with a future update to Sniper Elite 3, we'll revisit the subject).


At our highest quality 2560x1440 Ultra settings (Ultra + 4xSSAA), Sniper Elite 3 can really pound the GPU; this is a bit extreme, as 2.25xSSAA is also an option and the game already features shader-based edge anti-aliasing, though that doesn't catch all the jaggies the way SSAA does. Anyway, if you want maximum quality settings, SLI (which we can't test yet as we don't have any SLI setups) and CrossFire are basically required. The scaling from CrossFire is also very good -- the R9 290X for example sees a 94% improvement in frame rates, while the R9 280 gets an 88% boost in performance from the second GPU. The gains at our 1920x1080 Ultra SSAA settings are even more impressive: 105% faster on the R9 280 and 117% faster on the R9 290X! Greater than 100% scaling? Apparently there are some bottlenecks in Sniper Elite 3 where CrossFire really helps. Of course, once we hit our 1920x1080 High settings, the frame rates start to get silly and scaling becomes far less important: a single R9 280 can hit 93 FPS, while adding a second GPU reaches the absurd level of 178 FPS, and the R9 290X is well into triple digits with a single GPU. Unless you want to play Sniper Elite 3 on multiple monitors or a 4K display (only $598 now!), there's not much need for multiple GPUs unless you absolutely insist on enabling SSAA.
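For those curious how those scaling percentages are calculated, here's a minimal Python sketch using hypothetical FPS values (the real numbers come from our charts; the function name is just for illustration):

def scaling_percent(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Percentage frame rate gain from adding a second GPU."""
    return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

# Hypothetical example: a card going from 48 FPS to 93 FPS with CrossFire
print(f"{scaling_percent(48, 93):.0f}% scaling")  # prints "94% scaling"

In other words, 100% scaling would mean the second GPU exactly doubles performance, so anything above that suggests a bottleneck elsewhere that CrossFire happens to alleviate.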
In terms of NVIDIA and AMD performance, Sniper Elite 3 is an AMD Gaming Evolved title (which helps explain the support for Mantle), so you'd expect AMD to have a bit of a lead. And indeed, the R9 290X does top the single GPU performance charts, and it's the only GPU we've tested that can break 30 FPS at our 2560x1440 Ultra settings. (Update: we've added GTX 980 results, and it claims the crown, though at a higher price than the R9 290X.) The new GTX 970 is nipping at the heels of the R9 290X, and it costs $50 to $100 less. Oh, did I mention before that the R9 290X is seeing some price cuts in response to the GTX 970 and GTX 980? Yeah, it's gone from $499 down to $399, so if you're more of an AMD fan it's still a beastly card, though power requirements are certainly higher than the GTX 980, never mind the GTX 970. Sniper Elite 3 is also interesting in that the new GTX 970 happens to outperform the old GTX 780 by a sometimes sizeable margin; at our Ultra 4xSSAA settings it's not much of a difference, but at 1080p High the 970 is a solid 20% faster.
As for Mantle, one of the main reasons we're even looking at Sniper Elite 3, it does offer some tangible benefits, but mostly at lower quality settings. 2560x1440 Ultra (4xSSAA) shows very little improvement and in some cases is even slower than running without Mantle. Disabling SSAA, however, does allow Mantle to improve performance on nearly all of our GPUs (though this isn't shown in our charts). The problem is that measuring frame rates with Mantle means depending on the FPS reported by the game, and there are some issues with doing that (see below) that caused us to switch to measuring FPS with FRAPS. Unfortunately, FRAPS doesn't work with Mantle, so our FRAPS-based average and minimum FPS are both more accurate and apparently higher than what the game measures internally. Ugh....
Anyway, while the above charts don't show these figures for the non-Mantle scores, at 1920x1080 Ultra we see small increases in performance on the R9 290X and R9 280X, but the R9 280 still loses a bit of performance. Turn off SSAA and at 1920x1080 High we're now looking at solid performance gains across all three AMD R9 GPUs -- R9 280 is 9% faster, R9 280X is 14% faster, and R9 290X is 8% faster. It's enough of a boost to put the R9 290X ahead of the GTX 970 in most cases, but on a price/performance basis the NVIDIA GPU is still the better choice right now.
What about lower end hardware, though? Well, the main system we have for testing entry level graphics hardware is our Core i7-4770K (which we overclock to 4.1GHz so it's more like the i7-4790K we just linked) with its HD 4600 graphics. If that's all the graphics hardware you have, you're going to be hurting; average frame rates at 1366x768 Low are at least above 30 FPS, but there are some scenes where the game drops into single digits, making for a very choppy experience. A moderate GPU like the R7 250X, however, can deliver a decent experience, allowing for 1080p High settings at playable frame rates.
I do want to mention that minimum frame rates reported in the Sniper Elite 3 benchmark tend to be a bit skewed. I considered leaving them out, then I did remove them, and finally decided to retest using FRAPS to log the "real" frame rates. Now the figures (which we're in the process of updating) use the average of the lowest 3% of frames to provide a better estimate of true minimum FPS.
Update: Doing 4K at maximum quality (including 4xSSAA) simply isn't going to happen, so we dropped the SSAA setting to 2.25x and that seemed to result in at least somewhat playable performance. A single GTX 980 can't even hit 30 FPS average, and neither can any of the other single GPUs, but R9 290X CrossFire does get up to 52 FPS average. Minimum frame rates are still well below 30 FPS, however, so jitter is a potential concern.

Sniper Elite 3 Image Quality

[Image quality comparison, scene 1: Ultra / High / Medium / Low]


[Image quality comparison, scene 2: Ultra / High / Medium / Low]


[SSAA comparison screenshots (1)-(3)]
There are a couple of things I want to discuss regarding image quality. First is the difference between the preset Low/Medium/High/Ultra levels, and second is how SSAA affects things. The Ultra preset is obviously the most accurate and best looking, and in Sniper Elite 3 it employs a variety of lighting effects and shadowing techniques along with high quality textures. Dropping down to High quality mostly results in a slight degradation of the shadow maps and ambient occlusion, but the performance impact is also pretty small. Medium drops the quality of the textures enough that the change starts to become noticeable, and shadows are now even lower resolution. Finally, the Low setting gets rid of dynamic shadows and uses pretty coarse textures, with the result being a game that looks rather flat -- I like to call this the "Xbox 360 mode" (or Intel HD 4600 mode). Along with dropping the texture quality, you can also see that the background gets blurred more at lower quality settings.
While jaggies aren't a huge problem for me in Sniper Elite 3, they are present when you're not using SSAA. What's interesting, however, is that enabling SSAA actually affects more than just aliasing. It looks like shadows and depth of field effects are applied using shaders that work at a fixed dimension, and with SSAA they effectively cover fewer of the final visible pixels. The result is somewhat unique in that 4xSSAA produces a noticeably sharper image than 0xSSAA, with 2.25xSSAA falling somewhere in between. The impact on performance is of course quite severe, so unless you have multiple GPUs you're probably going to need to run at 0xSSAA or 2.25xSSAA, but if you have the GPU power to spare, 4xSSAA is a nice option.

How to Benchmark Sniper Elite 3

Wrapping things up, Sniper Elite 3 has a built-in benchmark under the "Extras" menu option, so this part is going to be pretty easy. We have the usual five settings: 2560x1440 and 1920x1080 Ultra are run with 4xSSAA enabled in the "Advanced" menu, while everything else has SSAA disabled; 1920x1080 High is exactly that, along with 1600x900 Medium and 1366x768 Low. Select the appropriate settings, apply the resolution, then go to the main menu, choose "Extras", and start the benchmark. You can run it multiple times and there's slight variation between runs, but frame rates are mostly consistent. Here's a look at the settings, and you can see the change in image quality once more:
[Settings screenshots: Low, Medium, High, Ultra 0xSSAA, Ultra 4xSSAA]
There's one final item to note. The benchmark will only report the average and maximum frame rates, but the results file also lists the minimum frame rate. However, as I mentioned before, the minimum frame rate tends to be lower than normal (i.e. if just one frame right at the start is really low, it skews the result). I've elected to run the benchmark and log frametimes with FRAPS (which takes a lot more time and energy), and then use those numbers to calculate the correct average and minimum FPS values. My approach is similar to what I'm doing with some other games (Sleeping Dogs, Shadow of Mordor, etc.), where I use the average of the bottom 3% of frame rates, calculated using Excel. For Mantle scores, I've used scaling similar to the non-Mantle results.
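For anyone who wants to reproduce that calculation outside of Excel, here's a minimal Python sketch. It assumes a FRAPS frametimes log with a header row and one cumulative timestamp (in milliseconds) per frame; the file name is just a placeholder:

import csv

def load_frame_times(path: str) -> list[float]:
    """Per-frame render times in ms, derived from cumulative FRAPS timestamps."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(row[1]) for row in rows[1:]]        # skip the header row
    return [b - a for a, b in zip(stamps, stamps[1:])]  # deltas between frames

def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    """Average FPS, plus 'minimum' FPS as the average of the slowest 3% of frames."""
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)      # longest frame times first
    count = max(1, int(len(slowest) * 0.03))
    min_fps = sum(1000.0 / t for t in slowest[:count]) / count
    return avg_fps, min_fps

avg, minimum = fps_stats(load_frame_times("sniper_elite_3_frametimes.csv"))
print(f"Average: {avg:.1f} FPS, Minimum (bottom 3%): {minimum:.1f} FPS")

Averaging the slowest 3% of frames smooths out the one-off hitches that would otherwise dominate a raw minimum, which is exactly why the benchmark's own minimum figure tends to look worse than the game actually feels.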
