Stalker 2 has made me look back and realize that maybe it was a mistake for Epic Games’ Unreal Engine 5 to become the industry standard for the next decade.
I’m going to sound a little pissy here, but I think most of what’s happening is that console hardware was so limited for such a long time that PC gamers got used to being able to max out their settings and still get 300 FPS.
Now that consoles have caught up and cranking the settings actually lowers your FPS like it used to, people are shitting themselves.
If you don’t believe me then look at these benchmarks from 2013:
https://pcper.com/2013/02/nvidia-geforce-gtx-titan-performance-review-and-frame-rating-update/3/
https://www.pugetsystems.com/labs/articles/review-nvidia-geforce-gtx-titan-6gb-185/
Look at how spiky the frame time graph was for Battlefield 3. Look at how, even with triple SLI Titans, you couldn’t hit a consistent 60 FPS in maxed-out Hitman Absolution.
And yeah, I know high-end graphics cards are even more expensive now than the Titan was in 2013 (due to the ongoing parade of BS that’s been keeping GPU prices high), but the systems in those reviews were close to the highest-end hardware you could get back then. Even if you were a billionaire you weren’t going to be running Hitman much faster (you could put one more Titan in SLI, which had massively diminishing returns, and maybe overclock everything).
If you want to prioritize a high and consistent framerate over visual fidelity / the latest rendering tech / giant map sizes, that’s fine, but don’t act like everything was great until a bunch of idiots got together and built UE5.
EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.
The issue is not that the game’s performance requirements at reasonable graphics settings are absolutely destroying modern hardware. The issue is that once you set the game to low settings, it still performs like shit while looking worse than a 10-year-old game.
You can preload them if you want, but that leads to load screens. It’s a developer issue, not an Unreal one.
No matter what, you’ve got to compile the shaders, either on launch or when needed. The game should be caching the results of that step, though, so the next time it’s needed it can be skipped entirely.
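For reference, the explicit graphics APIs expose exactly that “compile once, keep the blob” step. Below is a minimal sketch assuming a Vulkan renderer; the cache file name and the surrounding engine code are made up, and error handling is skipped. The idea is to feed last session’s serialized VkPipelineCache back to the driver on launch, so any pipeline it has already compiled becomes a near-instant cache hit instead of a full recompile.

```cpp
// Minimal sketch of persisting a Vulkan pipeline cache to disk so pipeline/shader
// compilation only has to happen the first time a permutation is seen.
// Assumes `device` is a valid VkDevice; "pipeline_cache.bin" is a hypothetical path.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

VkPipelineCache LoadPipelineCache(VkDevice device) {
    // Read whatever we saved last run (empty on a first launch).
    std::vector<char> blob;
    if (FILE* f = std::fopen("pipeline_cache.bin", "rb")) {
        std::fseek(f, 0, SEEK_END);
        blob.resize(static_cast<size_t>(std::ftell(f)));
        std::fseek(f, 0, SEEK_SET);
        std::fread(blob.data(), 1, blob.size(), f);
        std::fclose(f);
    }

    VkPipelineCacheCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO;
    info.initialDataSize = blob.size();                       // 0 on first run: cold cache
    info.pInitialData = blob.empty() ? nullptr : blob.data();

    // The blob's header identifies the GPU and driver version, so stale data
    // from a different card or driver just means starting from a cold cache.
    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, nullptr, &cache);
    return cache;
}

void SavePipelineCache(VkDevice device, VkPipelineCache cache) {
    // Serialize everything the driver compiled this session and write it out,
    // so the next launch can reuse it instead of recompiling.
    size_t size = 0;
    vkGetPipelineCacheData(device, cache, &size, nullptr);
    std::vector<char> blob(size);
    vkGetPipelineCacheData(device, cache, &size, blob.data());

    if (FILE* f = std::fopen("pipeline_cache.bin", "wb")) {
        std::fwrite(blob.data(), 1, blob.size(), f);
        std::fclose(f);
    }
    // Destroy the cache with vkDestroyPipelineCache during renderer teardown.
}
```

D3D12 has an equivalent in ID3D12PipelineLibrary, and on top of either one the GPU drivers keep their own on-disk shader caches.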
GPUs do cache them.
That’s why compiling on launch or during loading screens works.
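The other half of that, hinted at above, is doing the compiles behind the loading screen rather than the first time an effect shows up in gameplay. This is a rough sketch of the scheduling only; PipelineKey, CompilePipeline, and EnumerateLevelPermutations are hypothetical stand-ins for whatever an engine actually provides, and a real engine would use its own job system rather than raw std::async.

```cpp
// Hypothetical sketch: pay the shader-compile cost behind the loading screen,
// on worker threads, so gameplay never stalls on a first-use compile.
#include <chrono>
#include <cstdint>
#include <future>
#include <thread>
#include <unordered_map>
#include <vector>

struct PipelineKey { std::uint64_t hash = 0; /* shader hashes, vertex format, render state... */ };
struct Pipeline    { /* handle to a driver-compiled pipeline */ };

std::unordered_map<std::uint64_t, Pipeline> g_pipelines;  // looked up at draw time

// Dummy stand-ins so the sketch compiles; a real engine supplies these.
static Pipeline CompilePipeline(PipelineKey key) {
    (void)key;
    std::this_thread::sleep_for(std::chrono::milliseconds(20));  // pretend the driver is working
    return Pipeline{};
}
static std::vector<PipelineKey> EnumerateLevelPermutations() {
    std::vector<PipelineKey> keys(256);                          // known from level/material data
    for (std::uint64_t i = 0; i < keys.size(); ++i) keys[i].hash = i;
    return keys;
}

// Called while the loading screen is visible: fan the compiles out and wait.
// The load takes a bit longer, but nothing compiles mid-gameplay afterwards.
void PrecompileForLevel() {
    const std::vector<PipelineKey> keys = EnumerateLevelPermutations();
    std::vector<std::future<Pipeline>> jobs;
    jobs.reserve(keys.size());
    for (const PipelineKey& key : keys)
        jobs.push_back(std::async(std::launch::async, CompilePipeline, key));
    for (std::size_t i = 0; i < keys.size(); ++i)
        g_pipelines.emplace(keys[i].hash, jobs[i].get());
}

int main() { PrecompileForLevel(); }  // usage: run once per level load
```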