Xbox One May Still Get 1080p Titanfall In Post-Release Patch

Expectations are high for the imminent launch of Titanfall. The game is a major test for Microsoft's Xbox One -- it's one of the console's biggest exclusives, its beta was well-reviewed, and gamers have been waiting for a title that clarifies just what the Xbox One is capable of compared with the PlayStation 4. The game's beta ran at an odd resolution -- 1408x792 -- but according to an interview with Respawn lead engineer Richard Baker, an upgrade could still be in the cards.

Digital Foundry spoke to Baker, who said: "One of the big tricks is how much ESRAM we're going to use, so we're thinking of not using hardware MSAA and instead using FXAA to make it so we don't have to have this larger render target. We're going to experiment. The target is either 1080p non-anti-aliased or 900p with FXAA. We're trying to optimise... we don't want to give up anything for higher res."

This is a little odd, for a couple of reasons. It's not surprising that the Titanfall team would opt to dump multi-sampled antialiasing in exchange for a higher resolution; MSAA's impact on GPU performance is well known. What's more unusual is that the developer would identify FXAA as the difference between running at 900p and 1080p in the final product.
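To put rough numbers on Baker's "larger render target" comment, here's a back-of-the-envelope sketch -- our own estimate, not Respawn's figures. It assumes a standard 4-byte color buffer plus a 4-byte depth/stencil buffer, with hardware MSAA multiplying the footprint by its sample count, all measured against the Xbox One's 32MB of ESRAM (the 2x MSAA figure for the beta is an assumption here):

```python
# Back-of-the-envelope render target sizing -- a rough sketch, not Respawn's
# actual memory layout. Assumes a 4-byte RGBA8 color buffer and a 4-byte
# D24S8 depth/stencil buffer; hardware MSAA stores one color and depth sample
# per pixel per MSAA sample.
ESRAM_MB = 32  # the Xbox One's fast ESRAM pool

def render_target_mb(width, height, msaa_samples=1, color_bytes=4, depth_bytes=4):
    """Approximate footprint of the main color + depth targets, in MB."""
    return width * height * (color_bytes + depth_bytes) * msaa_samples / 2**20

configs = {
    "1080p, no MSAA (FXAA is a post-process)": (1920, 1080, 1),
    "1080p, 2x MSAA": (1920, 1080, 2),
    "1080p, 4x MSAA": (1920, 1080, 4),
    "900p, no MSAA": (1600, 900, 1),
    "792p (beta), assumed 2x MSAA": (1408, 792, 2),
}
for label, (w, h, samples) in configs.items():
    mb = render_target_mb(w, h, samples)
    print(f"{label:42s} {mb:5.1f} MB of {ESRAM_MB} MB ESRAM")
```

Even under these simplistic assumptions, 1080p with 4x MSAA overflows the ESRAM pool on the primary targets alone, while dropping hardware MSAA brings 1080p comfortably back under budget -- which is essentially the trade Baker describes.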

Here's the thing to understand about FXAA / MLAA -- it's not actually antialiasing at all. In conventional antialiasing, the GPU takes multiple sub-pixel samples of a point in space, then combines that data to create a smoother (albeit somewhat blurrier) line. This requires sub-pixel precision and is computationally expensive. FXAA, in contrast, works at pixel-level precision: it identifies discontinuities (jaggies) in the finished frame buffer and then redraws those areas for better visual quality.
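As a rough illustration of the idea -- a toy sketch, not NVIDIA's actual FXAA shader, which also estimates edge direction and blends along it -- a post-process like this only needs the finished frame's pixel data:

```python
# A heavily simplified FXAA-style pass, illustrative only. It runs on the
# finished frame: flag pixels whose luma contrast with their neighbors
# exceeds a threshold, then blend them toward those neighbors to soften
# the jaggy edge.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # standard luma weights

def fxaa_like(frame, threshold=0.1):
    """frame: 2D list of (r, g, b) tuples in [0, 1]. Returns a smoothed copy."""
    h, w = len(frame), len(frame[0])
    out = [list(row) for row in frame]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = luma(frame[y][x])
            neighbors = [frame[y - 1][x], frame[y + 1][x],
                         frame[y][x - 1], frame[y][x + 1]]
            lumas = [luma(p) for p in neighbors] + [center]
            # Only pixels sitting on a visible discontinuity get touched.
            if max(lumas) - min(lumas) > threshold:
                out[y][x] = tuple(
                    0.5 * frame[y][x][c] + 0.5 * sum(p[c] for p in neighbors) / 4
                    for c in range(3))
    return out
```

The key property is that nothing here touches geometry or the extra per-sample storage MSAA demands, which is why its cost tracks resolution rather than scene complexity.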

[Graph: antialiasing cost versus scene complexity, from Intel's original presentation on the technique]
The impact of FXAA / MLAA on performance is supposed to be modest -- that's the point. In the graph above (drawn from Intel's original presentation on the topic), the purple and red lines are the two to pay attention to. As scene complexity increases, the cost of the post-process AA pass stays essentially flat, just above the cost of rendering the scene with no AA at all.

The jump from 1600x900 to 1920x1080 isn't small -- that's a 44% increase in total pixel count -- and it's downright odd that the gap between FXAA and no antialiasing at all might force the Titanfall team to drop the resolution so drastically. But there's a potential answer to this question that dovetails with what we've heard from other sources: the Xbox One's ESRAM might be a little too small.
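For the record, the 44% figure is just pixel counting:

```python
# Checking the arithmetic -- nothing Titanfall-specific here.
pixels_900p = 1600 * 900     # 1,440,000 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
print(f"1080p draws {pixels_1080p / pixels_900p - 1:.0%} more pixels than 900p")  # -> 44%
```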

A smaller-than-ideal memory pool wouldn't completely cripple a game, but it could create exactly the kinds of problems we've seen with launch titles. When the fast memory isn't big enough for everything to fit comfortably, developers have to get creative about ensuring that data is loaded and evicted at precisely the right moment. Suddenly it's imperative that algorithms be optimized for size, that textures compress well, and that the game be very, very stingy about loading only necessary information into that space. Information that gets evicted needs to stay evicted; information that's critical to the game's operation needs to take up as little room as possible.
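As a toy illustration of that budgeting problem -- our own sketch, with hypothetical resource names and sizes, not anything from the Xbox One SDK or Respawn -- consider a simple priority-based placement scheme where whatever doesn't fit in the 32MB pool spills out to slower main memory:

```python
# A toy ESRAM budgeting sketch -- purely illustrative; resource names and
# sizes are made up. High-priority targets claim fast memory first, and
# anything that doesn't fit has to live in slower DDR3 instead.
ESRAM_BUDGET_MB = 32.0

def place_in_esram(resources, budget_mb=ESRAM_BUDGET_MB):
    """resources: list of (name, size_mb, priority), higher priority wins."""
    in_esram, in_ddr3, used = [], [], 0.0
    for name, size_mb, _priority in sorted(resources, key=lambda r: -r[2]):
        if used + size_mb <= budget_mb:
            in_esram.append(name)
            used += size_mb
        else:
            in_ddr3.append(name)  # spills to main memory; bandwidth suffers
    return in_esram, in_ddr3, used

# Hypothetical per-frame resources, sized very roughly.
frame_resources = [
    ("1080p color target", 7.9, 10),
    ("1080p depth/stencil", 7.9, 9),
    ("shadow map", 8.0, 7),
    ("extra g-buffer targets", 12.0, 5),
]
esram, ddr3, used = place_in_esram(frame_resources)
print(f"ESRAM ({used:.1f}/{ESRAM_BUDGET_MB:.0f} MB): {esram}")
print(f"Spilled to DDR3: {ddr3}")
```

Real engines go considerably further -- reusing the same region of fast memory for different targets at different stages of the frame, for instance -- but even this crude version shows how quickly 32MB disappears once shadow maps and extra buffers enter the picture.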

If this is true, it suggests that 1080p isn't impossible on the Xbox One -- but that we may not see many titles take advantage of it. If hitting that resolution requires significant optimization while 720p is readily available, the majority of game studios will likely settle for the 720p experience.