God Of War PC Gameplay And Performance Review: Nailed It


God of War: Game Performance And Graphics Benchmarks

The chart below is straight from Sony, and it lays out the company's recommended hardware for solid performance at various settings. Glance over the chart if you want, but I'll tell you now: it's rather conservative. For example, speaking informally with others playing the game on more modest hardware than our test rig, a GeForce GTX 1060 6GB card is more than capable of 60 FPS at 1920×1080, even using my recommended settings from the previous page. Similarly, we never saw whole-CPU usage above 40% on our Ryzen 7 5800X, which implies that the game leans on just a couple of threads, like most DirectX 11 titles, and that you won't get much extra out of a many-core CPU.

HotHardware's God Of War Testing Environment

godofwar image11 sonychart

I tested God of War with a handful of graphics cards. All three of the graphics cards I used are quite powerful, and the Ryzen 7 5800X in the test rig is no slouch either. We'll go over the performance data in detail here in just a moment, but let me spoil it for you: none of these graphics cards struggle with this game until you hit 4K native resolution, and there's really no reason to play the game that way thanks to support for intelligent upscalers like DLSS. Here's the full system configuration for the test rig:

Hardware Used:
AMD Ryzen 7 5800X
(3.8 GHz - 4.7 GHz, 8-Core)

ASRock X570 Taichi (AMD X570 Chipset)
32GB G.SKILL DDR4-3800

AORUS NVMe Gen4 SSD
Integrated Audio
Integrated Network

EVGA GeForce RTX 2080 SUPER XC ULTRA
NVIDIA GeForce RTX 3070 Ti Founders Edition
AMD Radeon RX 6800 XT

Relevant Software:
Windows 10 Pro 21H1
AMD Radeon Software v22.1.1
NVIDIA GeForce Drivers v511.23

God of War doesn't have a canned benchmark, so to gather realistic performance data, I re-ran the same gameplay section at the start of Wildwood's Edge many times, mercilessly murdering the first few draugr in that area while capturing with CapFrameX, then restarting at the checkpoint and doing it all over again.

That gives us a good idea of gameplay performance, but given the not-quite-identical nature of our testing, there's some variability in the data. To make sure we were getting an accurate picture of the state of things, we averaged our results across five test runs at each setting.
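If you're curious how captures like that turn into the numbers on the charts, here's a minimal sketch of computing average FPS and 1% lows from frame-time data and averaging them across runs. It assumes CapFrameX-style CSV exports with a PresentMon-like "MsBetweenPresents" column and hypothetical file names, so treat it as illustrative rather than our exact tooling.

```python
# Minimal sketch: average FPS and 1% lows from frame-time captures,
# averaged over five runs per setting. Column and file names are assumptions.
import csv
import statistics

def run_metrics(csv_path):
    """Return (average FPS, 1% low FPS) for a single capture."""
    with open(csv_path, newline="") as f:
        frame_times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    # "1% low" here is the average FPS over the slowest 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)
    low_fps = 1000.0 / statistics.mean(slowest[: max(1, len(slowest) // 100)])
    return avg_fps, low_fps

results = [run_metrics(f"run{i}.csv") for i in range(1, 6)]  # five captures
print("Average FPS:", round(statistics.mean(r[0] for r in results), 1))
print("1% low FPS:", round(statistics.mean(r[1] for r in results), 1))
```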

godofwar image12 wildwood
The area where we performed our benchmark captures.

We've charted eight performance tests: three at 3840×2160 resolution, three at 2560×1440 resolution, and two at 1920×1080 resolution. At each resolution, we tested the game without upscaling using the "Original" preset that configures the renderer as it was on the PlayStation 4, and again on the game's "Ultra" preset with maxed-out visuals; for the two higher resolutions, we also tested "Ultra" with AMD's FSR enabled in "Quality" mode. We initially tested using FSR on all three cards, but there's really no reason to use FSR on the GeForces, so we re-tested with DLSS, and we're pitting the two upscaling techniques head-to-head in both performance and image quality.

God of War Benchmarks - 4K
Nordic Myth Murder, Caught in 4K

Starting straight from the top with 3840×2160, better known as 4K Ultra HD, we immediately see the expected pattern emerge: the RTX 2080 Super is the slowest, the RTX 3070 Ti offers consistently superior performance, and the Radeon RX 6800 XT is the fastest overall.

god of war 8mp chart dlss

Only the Radeon card is capable of maintaining a framerate over 60 FPS in combat at native 4K with ultra settings, although if you use my recommended settings I'd expect the GeForce RTX 3070 Ti to manage just fine as well. The older Turing card just doesn't have the memory bandwidth for such a high resolution with maxed-out settings, although with a solid G-SYNC display it would still be perfectly playable. Reducing the game's settings pushes all three cards' averages over 60 FPS, but the visual trade-off is pretty dire.

Instead, we'd recommend enabling your choice of resolution scaling, whether you use NVIDIA's DLSS, AMD's FSR, or the built-in temporal resolution scaler. We initially tested all three cards using AMD's FSR solution to ensure performance parity, but upon realizing DLSS has nearly identical performance we switched our GeForce results to DLSS data, as that's more likely to be how most players will enjoy the game. Using Ultra settings with either upscaler turned to "Quality" gives us a pre-upscaling resolution of 2560×1440, and despite losing over half the pixels, it still looks quite good.
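If you want to sanity-check that math, here's a quick sketch. The roughly 1.5x per-axis scale factor for "Quality" mode matches how both vendors describe it publicly, though the exact factor this port uses internally is our assumption.

```python
# Quick sketch: "Quality" upscaling renders at roughly 1/1.5 of the output
# resolution on each axis (an assumption based on vendor documentation).
def internal_resolution(out_w, out_h, per_axis_scale=1.5):
    return round(out_w / per_axis_scale), round(out_h / per_axis_scale)

w, h = internal_resolution(3840, 2160)
print(w, h)                                                    # 2560 1440
print(f"{(w * h) / (3840 * 2160):.0%} of native 4K's pixels")  # roughly 44%
```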

The main limiter on performance at 4K UHD resolution is the fixed-function graphics hardware on the GPU; it's a ton of work to shade and fill 8 megapixels. Enabling upscaling (and thus dropping the render resolution) drastically lightens that workload, and so we see the cards spread out a little, mostly stratified by fill-rate and memory bandwidth. The Radeon card is the standout here; playing in 4K with max settings and FSR enabled, it manages an average framerate of nearly 100 FPS. Neither of the other cards turns in a shabby performance, though; all three are more than enjoyable at these settings.

God of War Benchmarks - QHD
Monster-Slaying Mayhem at 2560×1440

god of war 4mp chart

Moving down to 2560×1440 output resolution, we see an interesting change. Quad HD resolution isn't really high enough to present a shading challenge to the cards in this comparison. As a result, lowering the render resolution by enabling upscaling gives a very modest performance uplift. This is the main reason we didn't bother to test DLSS or FSR at 1920×1080 resolution. Lowering the game's settings to reduce the workload on the GPUs' other parts has a much greater impact on performance, but it also makes a gorgeous game look considerably worse.

Notably, we see higher performance at 2560×1440 native compared to the results we just looked at using FSR (or DLSS) with a 4K output resolution. As a reminder, those two settings (2560×1440 native and 4K with FSR "Quality") have the same internal render resolution. Some of that is down to the extra work required by the smart upscaling, but we suspect that the game might also be scaling content LODs based on the output resolution.

That's not unusual in modern games, and in subjective testing we couldn't tell the difference, but it's our best explanation for why we see such improved performance playing in the native QHD resolution. All three of these cards are more than capable of maintaining 60 FPS in native QHD with ultra settings, so from this point on the data is only really relevant to PC gamers with high-refresh-rate monitors.

God of War Benchmarks - FHD
Being a Single Dad in 1080p

god of war 2mp chart

Even in 2022, most gamers are still playing at 1920×1080 "Full HD" resolution. That's true even if we're talking about gamers with powerful graphics cards like the ones in this comparison; looking at the Steam hardware survey, there simply aren't enough pre-Pascal and Polaris-class cards in use to account for the massive proportion of Full HD displays.

There's no need to employ resolution scaling at this setting, and even with the arguably-excessive Ultra preset, all three graphics cards put up performances that practically mandate a 120-Hz display. Using my recommended settings, I expect that all three cards will maintain a minimum frame rate over 100 FPS.

It would be easy to look at the data for the Original preset tests here and draw some concerning conclusions regarding the Radeon RX 6800 XT. Examining the saved sensor data in CapFrameX, we do see consistently higher CPU usage from the Radeon card, which is what we've come to expect from Radeons in DirectX 11 titles. It's possible that our Radeon card is running into a CPU limitation at these unrealistically low settings, but we think it's worth noting that the 1% low framerate is still over 120 FPS.

God of War Benchmarks - Smoothness
Frame Time Axe-nalysis

godofwar frametimes
You can click through the three images to observe hand-selected frame-time data for two runs per card. We intentionally picked the "worst" runs for each card at the chosen settings. Knowing that, look over the data and notice how smooth the presentation actually is. While the lines look fuzzy, if you pay attention to the Y-axis scale on the left, even the GeForce RTX 2080 SUPER is seeing a typical frame-to-frame variance of around 2 milliseconds.

Indeed, God of War on PC is an exceptionally smooth-performing game. Even when performance drops, such as when I was testing the RTX 2080 SUPER in 4K UHD native, it remains fairly consistent. There are the occasional very minor hitches here and there, but these are generally when loading new content or during rapid camera movement and are to be expected. Overall, the title is impressively fluid most of the time.
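For the curious, that frame-to-frame variance can be pulled straight out of a capture's frame times rather than eyeballed from the charts. This is a tiny sketch operating on the same kind of frame-time list loaded in the earlier snippet; a well-paced run keeps the median delta in the low single digits of milliseconds, which is exactly what we saw here.

```python
# Sketch: quantify frame pacing from a list of frame times in milliseconds
# (e.g. the frame_times_ms list from the earlier capture-parsing snippet).
import statistics

def frame_pacing(frame_times_ms):
    """Return (median, worst) frame-to-frame delta in milliseconds."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return statistics.median(deltas), max(deltas)
```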

DLSS Or FSR For God Of War? Both Are Good Options But One Is Superior

godofwar dock cutouts

A lot of arguments have been made both in favor of and against the idea of comparing NVIDIA's Deep Learning Super Sampling (DLSS) and AMD's FidelityFX Super Resolution (FSR). It's true that they are fundamentally different technologies. DLSS uses the tensor cores on RTX GPUs to do temporal image upscaling through AI inferencing. This requires fairly deep integration into the game engine, so it can't easily be implemented as a driver-level toggle.

By contrast, FSR is simpler, but that's also arguably its strength. When upscaling, FSR just performs a Lanczos upsample before applying FidelityFX Contrast Adaptive Sharpening. FSR doesn't know anything about the scene geometry, and it only operates on a single final-pass rendered frame at a time. However, thanks to that approach, it's also drastically easier to implement, and it can work on essentially any graphics card.
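For a flavor of what that means, here's the classic Lanczos-2 kernel that FSR 1.0's spatial upscaler (EASU) is derived from. The real shader uses an edge-adaptive approximation rather than the textbook function, so treat this as illustrative only.

```python
# Illustrative only: the textbook Lanczos-2 kernel, L(x) = sinc(x) * sinc(x/2),
# windowed to |x| < 2. FSR's EASU pass approximates something like this per pixel.
import math

def lanczos2(x):
    if x == 0.0:
        return 1.0
    if abs(x) >= 2.0:
        return 0.0
    px = math.pi * x
    return 2.0 * math.sin(px) * math.sin(px / 2.0) / (px * px)
```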

godofwar bridge cutouts


God of War is one of the few games to ship with support for both technologies, and thus it gives us a lovely chance to compare the two directly. There's not much of a comparison to be had, though, honestly. As I mentioned earlier, the performance of the two techniques is nearly identical, with perhaps a slight edge in FSR's favor. The visual output of the two methods is quite different, though. You can see for yourself in these screenshot cutouts.

DLSS' primary fault in past implementations has been noticeable temporal artifacting, and I was indeed able to pick it out in a couple of scenes with dark geometry scrolling past light backgrounds, but you really have to look for it; it's not something you'll ever notice during gameplay. DLSS in this game looks fantastic, and I recommend it without reservation. Some folks have commented that DLSS is not available to them despite owning a capable GeForce RTX graphics card. At the time of this writing, there is an experimental beta branch on Steam that you can opt into which resolves this issue.

Don't mistake me though; FSR doesn't look bad at all, and both methods are preferable to the smeary TAA upscale of the regular resolution scaler. DLSS is visibly crisper and sharper, though. To some degree that may be because the specific version of DLSS implemented by Jetpack Interactive in this port performs an aggressive sharpening pass after upscaling. I've seen some players express that they feel it's overly sharpened, giving an almost comic-book-like vibe to the image. I don't feel that way, but I understand how they could.

godofwar bandage cutouts

I also compared FSR on the GeForce RTX 3070 Ti with the output of FSR on the Radeon RX 6800 XT to make sure that it was truly an apples-to-apples comparison. You can see a couple of example images from that testing above, along with a comparison to DLSS and native 4K rendering.

Unfortunately, I wasn't able to match the angle exactly in some shots due to the forced shakycam in cutscenes, but you can still see that regardless of vendor, FSR looks a lot more like simple upscaling than it does AI-enhanced DLSS or native rendering. Again, FSR is not ugly; it's just softer and not as sharp. Some people may even prefer the look. Another curious consequence of the sharpening performed by both DLSS and FSR is that it undoes some of the distant blur applied by the depth-of-field effect. This is clearer in the full-resolution shots, which you can download here from our Google Drive repository.

If FSR is your only choice in God of War, we'd certainly recommend using it, especially if it lets you stay at recommended settings, or if you can't quite get a stable frame rate at your monitor's native resolution (say, when playing on a 4K UHD monitor with a Radeon RX 5700 XT or similar). If you do have the option of using DLSS, though, do so. The infinitesimal visual difference between native and DLSS is well worth the gains in performance.

God Of War On The PC Is Fantastic, And Other Key Take-Aways

God of War is between 20 and 30 hours long for a typical play-through, or around fifty hours if you intend to do and see everything. It has a lot of qualities to recommend it. It's a rare gem: a fully-offline game with no DLC to buy. For $50 you get the full package, some thirty-plus hours of single-father feels and Nordic myth mayhem. That in and of itself is almost enough to warrant a recommendation.

Graphically, while even this brand-new PC port is held back somewhat by the limitations of the old PlayStation 4, it also remains one of the very best-looking games from the previous console generation. With the extra coat of spit and polish that the PC can provide, this four-year-old game can still be visually stunning at times. At high resolution on an HDR display, this game is a visual treat and it's not that taxing on PC hardware either.

godofwar conclusion1 throw
Kratos flings the Leviathan Axe at a menacing troll.

Folks who enjoy getting engrossed in a game's setting, who want to become captivated by its narrative and deeply-invested in its characters, will absolutely love this game. The voice acting is a top-class affair, with main characters Kratos and Atreus especially being expertly performed. The pacing of the game is pretty good too, at least until the latter half, where it's somewhat under the player's control.

Technologically speaking, God of War is nearly flawless. Some players have encountered bugs, but they seem few and far between. Major performance and rendering errors are nowhere to be found, and we didn't encounter a single crash. The game is scalable to machines both above and below its target hardware, and it adapts seamlessly to myriad input devices. All in all, it is a shining example for other PC games to follow, ironically.

godofwar conclusion2 bridge
Atreus runs ahead. This is kind of a theme. (click to enlarge)

The weakest points of God of War are in its mechanics. The combat itself is engaging, but it's very much up to you to keep it that way, because it's all too easy to button-mash your way to victory, even on the harder difficulties. That's fun for a bit, but it gets old fast. Kratos and Atreus gain a lot of abilities over the course of the game, and if you bother to make use of them, combat is both easier and more entertaining. Meanwhile, as I said before, the RPG elements are unnecessary, but at least they don't get in the way too much.

As a result, I wouldn't flat-out recommend God of War to hardcore RPG aficionados and action-game adrenaline junkies—at least not at the $50 asking price. The sparse mechanics and story-heavy presentation, with its entirely-unskippable cutscenes, are likely to annoy serious gamers looking for a challenging hack-and-slash adventure, while the rudimentary RPG systems aren't going to be fulfilling to role-players looking to min/max a build.

godofwar conclusion3 atreus
A touching father-son moment.

However, those two groups make up a small minority. Technology nerds with beefy rigs who want to marvel at the game's graphical glory, people who want to revel in an epic story line, fans of similar AAA games like The Last of Us, and casual gamers—likely the majority of game-playing adults—should all snap this up immediately if they haven't played it.
hothardware editors choice
God Of War For PC
