Could you try using some kind of performance tweaking tool for your GPU and deliberately setting it to run slower? I’m curious whether your GPU load has anything meaningful to do with your framerate, since for me that generally doesn’t seem to be the case.
With my max frequency set below 25% and the power limit and voltage turned down, I still get a good amount over 60 FPS in the Plaza at max settings (except anti-aliasing, where I’m using FXAA instead of TAA), with workshop content on, in the most populated server at the moment (61 players on West 1). This is also on a 1440p monitor with the game in borderless windowed mode.
I know looking at the sky would improve the framerate, but I wanted to keep other players in view. Moving around the center fountain area it still stays above 60 FPS, hovering around 80 with some jumps into the low 100s.
What the “Tuning” section of AMD Software is set to on my card
In my case the limiting factor seems to be mostly the CPU. Changing the power plan in Windows, which affects the CPU clock speed, has a much bigger impact (60-100 FPS being brought down to at most 50) than lowering the GPU speed (which costs about 20-30 FPS).
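(If you want to flip between power plans without clicking through Windows settings each run, here’s a rough Python sketch of how I’d script it; it just shells out to Windows’ built-in powercfg tool, and the GUIDs below are the standard built-in scheme IDs, so double-check yours with `powercfg /list` first.)

```python
import subprocess

# Standard built-in Windows power scheme GUIDs (verify with `powercfg /list`).
PLANS = {
    "balanced":    "381b4222-f694-41f0-9685-ff5bb260df2e",
    "power_saver": "a1841308-3541-4fae-a1e8-6d44dc32874e",
}

def set_power_plan(name: str) -> None:
    """Switch the active Windows power plan via powercfg."""
    subprocess.run(["powercfg", "/setactive", PLANS[name]], check=True)

def current_plan() -> str:
    """Return powercfg's description of the currently active plan."""
    out = subprocess.run(["powercfg", "/getactivescheme"],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()

if __name__ == "__main__":
    set_power_plan("power_saver")   # throttle the CPU for the low-clock test
    print(current_plan())
    # ... run the in-game test, then switch back:
    set_power_plan("balanced")
    print(current_plan())
```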
FPS in a Plaza using tuned-down and default GPU settings with the Balanced/Power saver power plans (plus GPU usage)
This screenshot is with the “Default” tuning for my GPU and “Balanced” power plan,
and then with “Default” tuning on GPU and the “Power saver” power plan,
And now this is with the tuning settings in the details summary above and the “Balanced” power plan,
and then this one is with the same tuning settings but using the “Power saver” power plan,
And then this is what the numbers look like in the AMD software while the game’s running. For reference, the numbers seen in the other details summary were also captured while the game was running and focused.
Also, here’s a super neat and tidy visual example, crafted for your viewing pleasure, depicting the GPU usage in the various modes:
So if the GPU were being maxed out, that’s where I’d expect DLSS/FSR/XeSS to help. But unless I’m running the CPU at roughly 40% speed (“Power saver” makes mine run at 1.7 GHz while “Balanced” runs it at 4.2 GHz), the GPU isn’t getting anywhere near fully utilized. That might be different with a 3060 Ti, which is why I asked at the start of this reply; I’m curious. The only RTX card I had lying around (an RTX 3050) went to my sister, so I don’t have one to test with anymore.
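If you want to sanity-check the same thing on your end, here’s a rough sketch of how you could log CPU clock and load while the game is running (this assumes the psutil Python package, and the clock it reports is only a ballpark figure compared to what AMD Software or Task Manager shows):

```python
import psutil  # pip install psutil

# Log CPU clock and per-core load once a second while the game is running.
def log_cpu(samples: int = 30, interval: float = 1.0) -> None:
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        freq = psutil.cpu_freq()  # may be None on some systems
        clock = f"{freq.current:6.0f} MHz" if freq else "unknown"
        print(f"clock: {clock} | "
              f"avg load: {sum(per_core) / len(per_core):5.1f}% | "
              f"busiest core: {max(per_core):5.1f}%")

if __name__ == "__main__":
    log_cpu()
```

If the clock sits pinned low and a core or two hover near 100% while GPU usage stays well under 100%, that’s the CPU-bound situation I’m describing.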
Computer specs for reference:
CPU: AMD Ryzen 7 5800X3D
GPU1: AMD Radeon RX 6950 XT
GPU2: Intel Arc A770 Limited Edition (for AV1 encoding)
GPU3: Nvidia Quadro P4000 (for CUDA compute)
RAM: Corsair Vengeance LPX 128GB (4x32GB) @ 3200MHz
Motherboard: GIGABYTE X570S AORUS PRO AX
Boot Drive: Western Digital Blue 500GB SSD (NVMe)
Game Drive: Crucial P5 Plus 2TB SSD (NVMe)
Operating System: Windows 10 Pro Version 22H2 Build 19045.3803
GPUs 2 and 3 shouldn’t really have an impact on this, though two of my monitors are plugged into the Intel GPU, which does cause some fun stuff with the Desktop Window Manager. That said, the primary monitor is plugged into the RX 6950 XT, so that’s what the game is rendering on.