
Is Nvidia's PhysX causing AMD frame rate problems in Gears of War?

There's a rumor going around that blames AMD's low Gears of War performance on Nvidia's PhysX API. The truth is somewhat more complicated, and there's no clear answer as to what's harming AMD's performance at the moment.
By Joel Hruska

Ever since Gears of War Ultimate Edition came out last week, there's been a rumor floating around that one reason the game runs so poorly, with so much stuttering on AMD hardware, is that Nvidia's PhysX is actually running on the CPU. We were alerted to this possibility last Wednesday, so I installed the base game and consulted with Jason Evangelho over at Forbes, who had written the initial article on Gears of War's low performance, to check performance settings and the like.

Update (3/11/2016): I'm inserting a point of clarification here about PhysX and how it functions. Nvidia historically licensed PhysX in two distinct ways -- as a general software middleware solution for handling physics that was always intended to execute on the CPU (software PhysX), and as a GeForce-specific physics solution that added in-game visual effects and was intended to execute on Nvidia GPUs (hardware PhysX).

The problem with this distinction is that hardware PhysX can be executed on the CPU as well. This is a distinct third operating case, best referred to as "Hardware PhysX executing in software." Some websites have claimed that Gears of War uses this mode by default, therefore harming performance on AMD GPUs. Our results refute this claim.

Original story below:

I used the built-in Windows performance monitoring tool, Perfmon, to grab a screenshot of what CPU utilization looked like within Gears of War when benchmarking at 4K on an AMD Radeon Fury X GPU. I also checked the WindowsApps folder to examine the game's configuration files for PhysX. What I found -- and I wish I had screenshots of this -- was that every single game-related INI file contained the following: "bDisablePhysXHardwareSupport=True". Since I was testing on an AMD Radeon R9 Fury X, that's exactly what I wanted to see. I turned the system off and went back to working on other articles. (All tests below were run on a Haswell-E eight-core CPU.)
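For readers who want to check their own installation, one way to sweep a folder for that flag is a short script along these lines. This is only a sketch: the WindowsApps path is an assumption (Windows Store install locations vary and the folder is normally locked down), and only the flag name itself comes from the INI files we examined.

```python
# Minimal sketch: scan a folder tree for the PhysX hardware flag in INI files.
# The search path is an assumption -- Windows Store install locations vary,
# and the WindowsApps folder normally requires elevated permissions to read.
import os

SEARCH_ROOT = r"C:\Program Files\WindowsApps"   # assumed install location
FLAG = "bDisablePhysXHardwareSupport"

for root, _dirs, files in os.walk(SEARCH_ROOT):  # unreadable dirs are skipped
    for name in files:
        if not name.lower().endswith(".ini"):
            continue
        path = os.path.join(root, name)
        try:
            with open(path, errors="ignore") as f:
                for line in f:
                    if FLAG.lower() in line.lower():
                        print(f"{path}: {line.strip()}")
        except OSError:
            continue  # skip files we can't open
```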

Data from March 2. PhysX disabled according to INI.

Fast forward to today, when reports are still surfacing of the "bDisablePhysXHardwareSupport" variable being set to False rather than True. I fired the testbed up again, allowed the game to update, checked the same INI files, and found that the value had changed. On Wednesday, five files had defaulted that value to "True," meaning PhysX should've been disabled. On Sunday, the value had changed to "False," which implies it's now enabled.

Data from March 6. PhysX enabled according to INI.

If you compare the CPU graphs of False versus True, however, you'll note they're more or less the same. Allowing for some variation in when the benchmark run started, both show a pattern of high spikes and dips. The average CPU utilization for the disabled/True run was 13.63%; for the enabled/False run, it was 14.62%.
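For anyone who wants to reproduce this sort of comparison, the averages above came from Perfmon's processor utilization counter. A rough sketch of the calculation, assuming the counter log has been exported to CSV (the file name and counter column below are placeholders), might look like this:

```python
# Minimal sketch: average a "% Processor Time" counter from a Perfmon log
# exported to CSV (e.g. "relog capture.blg -f csv -o capture.csv").
# The file name and exact counter header are assumptions and depend on how
# the counter log was captured.
import csv

LOG_FILE = "capture.csv"           # assumed CSV export of the Perfmon log
COUNTER_KEY = "% Processor Time"   # substring of the _Total processor column

samples = []
with open(LOG_FILE, newline="") as f:
    reader = csv.DictReader(f)
    counter_col = next(c for c in reader.fieldnames if COUNTER_KEY in c)
    for row in reader:
        try:
            samples.append(float(row[counter_col]))
        except (TypeError, ValueError):
            pass  # Perfmon occasionally writes blank cells

if samples:
    print(f"Average CPU utilization: {sum(samples) / len(samples):.2f}%")
```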

What about Nvidia? I dropped in a GTX 980 Ti, installed Nvidia's latest drivers, and ran the same simple test. I allowed the benchmark to run twice, then grabbed the final CPU utilization result.

Data from March 6. PhysX enabled according to INI.

The average CPU utilization on this graph isn't much lower, at 11.77%, but the shape of the graph is distinctly different. The GTX 980 Ti's frame rate is roughly double that of the R9 Fury X (we benchmarked with ambient occlusion disabled, since that mode causes continual rendering errors on the AMD platform), but the CPU utilization doesn't keep spiking the way it does with the AMD cards.

Smoking gun or poorly optimized game?

It's true that the default INI setting for Gears of War appears to have changed between the original release and the latest update pushed via the Windows Store. But there's no evidence that this actually changed anything about how the game performs on AMD cards. Nvidia's own website acknowledges that Gears of War uses HBAO+, but says nothing about hardware PhysX. Given the age of this version of Unreal Engine 3, it's possible that this is a variable left over from when Ageia owned the PhysX API; Unreal Engine 3 was the first game engine to feature Ageia support for hardware physics.

Right now, the situation is reminiscent of Arkham Knight. Nvidia cards generally outperformed AMD cards in that title when it shipped, but the game itself was so horrendously optimized that the publisher pulled it from sale altogether. As of this writing, there's no evidence that hardware PhysX is active or related to this problem.

All we have is evidence that the CPU usage pattern for the AMD GPU is different from that of the Nvidia GPU. Since we already know the game isn't handling AMD GPUs properly, even with ambient occlusion disabled, we can't draw much information from that. Our ability to gather more detailed performance data is currently curtailed by Windows Store limitations. (None of the game's configuration files can be altered and saved -- at least not using any permission techniques I'm familiar with.)

If you're an AMD gamer, my advice is to steer clear of Gears of War Ultimate Edition for the time being. There's no evidence that hardware PhysX is causing this problem, but the game runs unacceptably on Radeon hardware.

Update (3/11/2016):

After we ran with this piece, we realized that while we can't edit the INI files of a Windows Store application, we can change how PhysX runs via the Nvidia Control Panel. Previously, the application was set to "Default," which means that if hardware PhysX was enabled, the game would execute that code on the GPU. We switched the setting to force PhysX to run on the CPU instead.

We retested the game in this mode and saw essentially identical results to our previous tests. The CPU utilization curve for GeForce cards remains somewhat different from the one we see on AMD GPUs, but it's consistent whether PhysX is forced to run on the GPU or the CPU.

If Gears of War actually used hardware PhysX, CPU utilization should have increased when we offloaded that task back onto the Haswell-E CPU. The fact that we see no difference should put to rest any claim that Gears of War is using PhysX to damage AMD's performance.

 
