Dragon's Dogma 2 PC Performance Is a Mess, But It Didn't Have to Be

Jacqueline Thomas
It's no secret that while Dragon's Dogma 2 is great, its performance has been an issue from the jump – particularly on PC. There's a reason for that: Capcom has explained that NPCs are individually simulated on your CPU, which makes the game incredibly dependent on your processor. Play for an hour or so and you'll see how much the game benefits from this approach, but unless you have a top-end CPU paired with the best RAM you can muster, you're going to have a rough time with the frame rate, especially in cities.


And even if you do have the best gaming PC money can buy, you're still going to see frame rate drops. It's a shame, especially because the issues could easily have been avoided – and it's not entirely Capcom's fault.

What's The Actual Problem?


Capcom's RE Engine is largely responsible for the bugbears you'll run into when playing the game. As we noted in our performance review, the RE Engine was primarily designed for linear games like Resident Evil, which it's named for. Making it worse is advanced pathing for every NPC, which is why even on my Core i9-14900K the frame rate drops to around 50 fps in towns, where dozens of citizens devour CPU time.
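To illustrate why simulating every citizen individually hammers the processor, here's a toy sketch (my own, not Capcom's code – the per-NPC work is a hypothetical stand-in for pathfinding and AI): the per-frame cost grows linearly with the crowd size, which is exactly why towns hurt most.

```python
import time

def update_npc(npc, work=20_000):
    # Hypothetical stand-in for per-NPC work: pathfinding, schedules,
    # combat AI. The arithmetic loop just burns CPU time the way real
    # simulation would.
    acc = npc["x"]
    for _ in range(work):
        acc = (acc * 31 + 7) % 1_000_003
    npc["x"] = acc

def simulate_frame(npcs):
    # Every NPC is updated on the CPU every frame, so frame time
    # scales linearly with how many citizens are nearby.
    for npc in npcs:
        update_npc(npc)

for count in (10, 100):
    npcs = [{"x": i} for i in range(count)]
    start = time.perf_counter()
    simulate_frame(npcs)
    ms = (time.perf_counter() - start) * 1000
    print(f"{count:4d} NPCs -> {ms:6.2f} ms per frame")
```

A faster GPU does nothing for this loop – only a faster CPU (or fewer NPCs) shortens it, which matches what players see in cities.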

Dragon's Dogma 2 being so reliant on your CPU and memory was somewhat unexpected. By and large, games – particularly action role-playing games – have come to depend more on the best graphics cards, as GPUs like the RTX 4080 Super have improved faster than CPUs have. Then, out of the blue, we get a game that needs a hefty CPU to run at its best, and we're stuck with poor performance.

Making things worse is Capcom's choice of upscaling technology: DLSS 2.0 and FidelityFX Super Resolution 3.0. Having both of these upscalers is better than having only one, or none at all, but neither of them really addresses the elephant in the room: high CPU usage. It's little wonder that modders have found a way to enable DLSS 3.0, which was apparently already in the game files anyway. Unfortunately, as things stand right now, DLSS 3.0 is the only upscaling tech that can mitigate the heavy CPU usage, because of the way it handles – or rather, replaces – the rendering pipeline.
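It's worth spelling out why ordinary upscaling can't help a CPU-bound game. Upscalers render internally at a reduced resolution and reconstruct the output, which cuts GPU pixel work – but the CPU still simulates every game tick. A quick sketch of the resolution math (the per-axis scale factors below are the commonly documented defaults for these quality modes; exact figures can vary by game and upscaler version):

```python
# Approximate per-axis render scales for common upscaler quality modes.
# These are widely published defaults, not values from Dragon's Dogma 2.
SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, mode):
    # The GPU shades far fewer pixels at the internal resolution,
    # then the upscaler reconstructs the full-size image.
    s = SCALES[mode]
    return round(width * s), round(height * s)

for mode in SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders {w}x{h}, upscaled to 3840x2160")
```

At Performance mode the GPU shades a quarter of the pixels – a huge GPU saving – yet the CPU's per-frame simulation cost is untouched, which is why neither DLSS 2.0 nor FSR upscaling moves the needle here.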

Upscaling Is So, So Important: An Essay


Every time I talk to AMD or Nvidia, there's one thing they keep telling me when I grill them about graphics card prices: Moore's Law is dead. For the uninitiated, Moore's Law is the observation that the number of transistors on a processor doubles roughly every two years at the same cost. Whether the GPU makers are using its demise as an excuse to charge more, or whether it's actually true, doesn't matter. The demand for shinier graphics is growing faster than the hardware that, well, makes the graphics happen.
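To put a number on that doubling claim, here's the back-of-the-envelope math (an idealized model of the law as stated, not a prediction about any real chip):

```python
def transistors(start_count, years, doubling_period=2):
    # Moore's Law as stated: the count doubles every `doubling_period`
    # years, so after `years` years we've doubled (years // period) times.
    return start_count * 2 ** (years // doubling_period)

# If the law held perfectly, a 1-billion-transistor chip would hit
# 32 billion transistors a decade later (five doublings).
print(transistors(1_000_000_000, 10))
```

Five doublings in ten years is a 32x jump – and when real silicon stops delivering anything close to that, the gap has to be filled by software tricks like upscaling.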

That's why upscaling is so important. DLSS launched forever ago at this point, alongside the RTX 2080, and while it didn't seem that important back then, it's grown to be one of the most important technologies in PC gaming. It's little wonder that after seeing how successful the tech was, AMD launched FSR, followed by Intel with XeSS. Unfortunately, FSR and XeSS both lag behind DLSS, and it largely comes down to the dedicated AI acceleration boosting Nvidia's upscaling technology.

In many ways, everyone else is finally catching on to how important AI acceleration is, which is why AI was the only thing people would talk about at CES this year. AMD and Intel both added Neural Processing Units to their new lines of laptop CPUs, and Team Red even quietly added AI accelerators to RDNA 3 graphics cards, like the Radeon RX 7800 XT. Unfortunately, AMD hasn't used those accelerators for gaming, even though it vaguely suggested AI upscaling would arrive sometime in 2024 – which hasn't happened yet. Instead, AMD's AI accelerators are used solely for non-gaming AI applications, like Stable Diffusion.


It's a huge missed opportunity, and the main reason FSR lags so far behind DLSS – a gap that widened even more when DLSS 3.0 arrived with its AI frame generation tech. And it's really a travesty that Dragon's Dogma 2 didn't ship with that technology enabled, given that it was in the game files all along.

Functionally, DLSS Frame Generation replaces the traditional render queue that has powered PC games for years. Rather than the CPU preparing every frame for the GPU to render, Nvidia uses Reflex to synchronize the two components, so the processor can hand the GPU the raw data, which the GPU then uses to execute the render queue from start to finish. In a CPU-limited game like Dragon's Dogma 2, DLSS 3.0 essentially lets the CPU focus on simulating the world, while everything else is handled locally on your graphics card.
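The core idea is easier to see with a toy sketch. Real frame generation analyzes motion vectors and optical flow on dedicated hardware to synthesize a convincing in-between frame; this simplified stand-in just blends two rendered frames, but it captures the part that matters for a CPU-bound game – the generated frame costs the CPU nothing, because no extra game tick is simulated:

```python
def interpolate_frame(prev_frame, next_frame, t=0.5):
    # Toy stand-in: blend two rendered frames pixel by pixel.
    # Real DLSS Frame Generation uses motion vectors and optical flow
    # on Tensor Cores, but the key property is the same: the inserted
    # frame is produced entirely on the GPU, with no CPU simulation.
    return [(1 - t) * p + t * n for p, n in zip(prev_frame, next_frame)]

rendered_a = [0.0, 0.2, 0.4]   # pixels of frame N (CPU simulated this tick)
rendered_b = [0.2, 0.4, 0.6]   # pixels of frame N+1 (next CPU tick)
generated = interpolate_frame(rendered_a, rendered_b)
print(generated)  # a frame halfway between the two rendered ones
```

So if the CPU can only simulate 50 ticks a second, frame generation can still present closer to 100 frames a second – which is why it's the one upscaling feature that actually helps in towns.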

What sucks is that AMD has frame generation tech of its own, and it's even supported in Dragon's Dogma 2 – but because it doesn't run on AMD's AI accelerators, as Nvidia's does on its Tensor Cores, it enters the pipeline at a later stage, which introduces latency. You can mitigate this by enabling AMD Radeon Anti-Lag in the Adrenalin app, but the process will still add more latency than DLSS.

Unfortunately, if Dragon's Dogma 2 proves anything, it's that AI acceleration is a must in AAA games going forward, and AMD needs to get its shit together to catch up to Nvidia. It's not really a matter of one graphics card being better than another at this point – PC gaming has kind of grown past that. Everything is riding on upscaling technology if we want to keep having games that look this good, while also having complex physics engines and NPC pathfinding.

It's either that or we stop demanding that games look photorealistic at all times. I don't know, pick one.


Jackie Thomas is the Hardware and Buying Guides Editor at IGN and the PC components queen. You can follow her @Jackiecobra
 