Ubisoft Claims CPU Specs, Not GPU Performance, Are the Limiting Factor in Assassin's Creed: Unity

A new interview with Assassin's Creed Unity senior producer Vincent Pontbriand has some gamers seeing red and others crying "Told you so!" after the developer revealed that the game's 900p resolution and 30 fps target on consoles are the result of weak CPU performance rather than limited GPU compute.

"Technically we're CPU-bound, he [Pontbriand] said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."



This has been read by many as a rather damning verdict on the capabilities of AMD's APUs. To some extent, that's justified -- the Jaguar CPU inside both the Sony PS4 and the Xbox One is a modest chip with a relatively low clock speed. Both consoles may offer eight cores on paper, but games can't access all of that headroom -- at least one core is reserved for the OS, and several more threads are occupied feeding the 3D rendering pipeline. Between the two, Ubisoft may have had only four or five cores for AI and other gameplay calculations -- scarcely more than last generation, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6GHz and 1.75GHz frequencies of their replacements.
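
To make that concrete, here is a rough back-of-the-envelope sketch in Python of why per-frame AI cost, rather than rendering, ends up dictating the frame cap. Every number in it -- the core count, the per-NPC cycle cost, the crowd size -- is an illustrative assumption, not a figure from Ubisoft.

    # Back-of-the-envelope: how per-frame AI cost caps the framerate.
    # All figures below are invented for illustration, not real Unity numbers.
    AI_CORES = 5              # cores plausibly free for gameplay/AI work
    CORE_CLOCK_HZ = 1.6e9     # PS4 Jaguar clock
    NPC_COUNT = 5000          # assumed active crowd size
    CYCLES_PER_NPC = 40_000   # assumed AI cost per NPC per frame (pathing, behavior)

    ai_cycles_per_frame = NPC_COUNT * CYCLES_PER_NPC
    cycles_per_second = AI_CORES * CORE_CLOCK_HZ
    ai_time_per_frame = ai_cycles_per_frame / cycles_per_second   # seconds

    print(f"AI time per frame: {ai_time_per_frame * 1000:.1f} ms")
    print(f"Fits a 30 fps budget (33.3 ms): {ai_time_per_frame < 1 / 30}")
    print(f"Fits a 60 fps budget (16.7 ms): {ai_time_per_frame < 1 / 60}")

With those made-up numbers, the AI alone eats 25 ms of a 33 ms frame -- comfortably inside 30 fps, hopelessly outside 60 fps -- which is exactly the shape of the problem Pontbriand describes.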

What kind of AI is Ubisoft trying to build?

I think it's worth noting that the concept of good AI is entirely game-dependent. We can separate it into two approximate camps -- games that attempt to model near-lifelike behavior for a very small number of characters, and games that attempt to create a large number of characters that aren't blitheringly, humiliatingly, obviously artificial and stupid.

What does this mean? It's genuinely not clear -- and sometimes, even small things can make an enormous difference to players. Take SimCity -- when Maxis launched the latest troubled incarnation of that franchise, it quickly became obvious that its Sims were terrifyingly moronic in ways that created problems for players. Instead of actually returning home, they simply walked into the first empty house they found. Rather than "owning" a car, they got into the first empty vehicle. This created massive traffic and housing snarls and caused job shortages -- the game treated characters as interchangeable parts that walked in and out of a machine every day, rather than as individuals with particular assignments and tasks.
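
The difference is easy to show in a toy sketch. This is hypothetical Python, not Maxis's code: one function hands an agent the first empty house it finds, the other sends it back to a house it persistently owns.

    # Toy agent "go home" logic -- hypothetical, not SimCity's actual implementation.
    class House:
        def __init__(self, house_id):
            self.house_id = house_id
            self.occupant = None

    def go_home_interchangeable(sim, houses):
        # SimCity-style: any empty house will do; the agent has no fixed address.
        for house in houses:
            if house.occupant is None:
                house.occupant = sim
                return house
        return None   # every house already taken -- a "homeless" agent tonight

    def go_home_assigned(sim, houses):
        # Persistent identity: the agent always returns to the house it owns.
        home = houses[sim["home_id"]]
        home.occupant = sim
        return home

    sims = [{"name": f"sim-{i}", "home_id": i} for i in range(3)]
    houses = [House(i) for i in range(3)]
    # Arriving in a different order, interchangeable sims end up in whichever
    # house happens to be free, not the one they "live" in.
    for sim in reversed(sims):
        go_home_interchangeable(sim, houses)
    print([(h.house_id, h.occupant["name"]) for h in houses])

The interchangeable version is cheaper -- there's no per-agent state to track or route toward -- which is presumably why it appealed, but it produces exactly the commuting and occupancy weirdness players noticed.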

At the same time, however, trying to make NPCs smarter can run into its own challenges. Skyrim's NPC merchants, guards and bandits are often mocked for ludicrously stupid behavior -- but in many cases, these oddities -- such as being able to put buckets on shopkeepers' heads and then steal at will -- are the result of sophisticated attempts to improve behavior by adding more flexibility. Shopkeepers aren't just invisibly aware of you if you steal something -- they can only know you stole it if they see you do it.
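
In sketch form, the bucket trick falls straight out of a perfectly reasonable rule. This is a simplified, hypothetical illustration rather than Bethesda's code: theft is only a crime if a witness can actually see it, and anything that blocks the witness's view -- a wall, or a bucket dropped over their head -- makes it invisible to the simulation.

    # Witness-based theft detection, heavily simplified and hypothetical.
    from dataclasses import dataclass

    @dataclass
    class NPC:
        name: str
        facing_thief: bool = True
        view_blocked: bool = False    # e.g. a bucket over the head

        def can_see_theft(self) -> bool:
            return self.facing_thief and not self.view_blocked

    def on_item_stolen(item, npcs):
        witnesses = [npc for npc in npcs if npc.can_see_theft()]
        if witnesses:
            print(f"{witnesses[0].name}: 'Stop, thief!' ({item})")
        else:
            print(f"No one saw the {item} go missing.")   # no witness, no crime

    shopkeeper = NPC("shopkeeper")
    on_item_stolen("silver goblet", [shopkeeper])   # caught in the act
    shopkeeper.view_blocked = True                  # the infamous bucket
    on_item_stolen("silver goblet", [shopkeeper])   # clean getaway

The naive alternative -- the shopkeeper "just knows" -- is simpler and exploit-proof, but it's also exactly the omniscient behavior the visibility check was added to get rid of.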


Let's face it -- some of these AI decisions really were just terrible


Similarly, we laugh at bandits who return to normal status after a few seconds -- even walking around a dead ally while muttering "Must've been my imagination" -- yet few people want to play a version of the game where being detected always leads to complete camp mobilization and being chased out of the area or killed.
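
The mechanic being mocked is, at heart, a suspicion timer. Here's a bare-bones sketch of that kind of alert state machine -- hypothetical values, not Bethesda's implementation. Let the timer decay and you get "must've been my imagination"; remove the decay and every detection escalates into the permanent, camp-wide manhunt most players wouldn't actually enjoy.

    # Minimal alertness state machine -- illustrative only, with invented numbers.
    IDLE, SUSPICIOUS, ALERT = "idle", "suspicious", "alert"

    class Bandit:
        SUSPICION_TIMEOUT = 5.0    # seconds before shrugging it off

        def __init__(self):
            self.state = IDLE
            self.timer = 0.0

        def hear_something(self):
            self.state = SUSPICIOUS
            self.timer = self.SUSPICION_TIMEOUT

        def see_player(self):
            self.state = ALERT     # combat, shouting for allies, etc.

        def update(self, dt):
            if self.state == SUSPICIOUS:
                self.timer -= dt
                if self.timer <= 0.0:
                    self.state = IDLE    # "Must've been my imagination."

    bandit = Bandit()
    bandit.hear_something()
    for _ in range(6):
        bandit.update(1.0)          # six quiet seconds pass
    print(bandit.state)             # back to "idle"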

Somewhere between those two extremes there's a balance -- but where that balance sits varies from game to game.

What seems increasingly obvious is that this will not be the console generation in which 1080p at 60 fps is a reliable baseline. Unlike some, I don't blame AMD's Jaguar for that: AMD ships variants of the core clocked at 2GHz and above, albeit at higher power consumption. Microsoft or Sony could have specced out a version running at 2-2.4GHz and boosted total CPU throughput by as much as 50%. They didn't. The programmable nature of the GCN architecture inside the Xbox One and PS4 is meant to compensate for the relatively lightweight CPU cores, but AI may simply be a workload it can't do much with -- GPU compute is built for throughput at the expense of latency, while game AI typically needs answers within a single frame.

So far, Sony and Microsoft's decision to minimize console costs and go with off-the-shelf designs that could reach profitability more quickly has paid off, big time -- but whether or not that continues to be the case long term is an open question.