Trade offer:
I receive: Your next paycheck
You receive: 8.3% faster graphics
At the expense of 18% more watts! Comparing the x090s in the article.
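Worth spelling out: those two numbers together mean performance per watt actually got *worse*, not better. A quick sketch, assuming the article's 8.3% and 18% figures apply to the same workload:

```python
# Back-of-the-envelope perf-per-watt check using the figures quoted
# above (+8.3% performance, +18% power). Illustrative only; real
# numbers will vary by workload.
perf_ratio = 1.083   # 5090 performance relative to the 4090
power_ratio = 1.18   # 5090 power draw relative to the 4090

perf_per_watt = perf_ratio / power_ratio
print(f"relative perf/watt: {perf_per_watt:.3f}")  # ~0.918, i.e. ~8% worse efficiency
```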
So, as a total non-expert: isn’t that kind of what you’d expect from overclocking the old one? A bit more wattage used for a bit more performance?
exactly
I receive: Your next paycheck
And all the power your house can provide without burning down*.
Isn’t it, like… ridiculously small?
I’m not up to date on hardware evolution, so don’t hesitate to educate me if I’m wrong :)
I’ve gotten the impression that the major gains in the 5xxx line are from DLSS frame generation going from 1:1 generated frames to 3:1, and it looks like this review was just comparing straight rendering.
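For anyone unfamiliar with what those ratios mean for displayed FPS, here’s a rough illustration (the 60 fps base rate is a made-up example, and this ignores the small render-rate hit frame generation itself adds):

```python
def effective_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Displayed FPS when N AI-generated frames are inserted per rendered frame."""
    return rendered_fps * (1 + generated_per_rendered)

base = 60  # hypothetical rendered frame rate, not a benchmark result
print(effective_fps(base, 1))  # 1:1 frame generation  -> 120 fps displayed
print(effective_fps(base, 3))  # 3:1 multi-frame gen   -> 240 fps displayed
```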
Which is the actual comparison. GPU vs GPU, not software vs software.
I think it’s a little underwhelming considering the prices they’re asking. Hopefully bigger increases come next generation with the 3nm manufacturing process.
Perfect time for AMD to come in with some good prices and compete before Intel steals their lunch on the lower-end cards.
Is that with or without multi-frame generation? I am assuming without?
For competitive or fast-action games you should have MFG off, but for games that aren’t so fast I could easily see MFG giving a clearly smoother presentation, and that is only available on the 50 series cards.
The fact that it’s only available on 50 series feels like an artificial restriction that is anti-consumer.
Plus I’d rather have a game that is actually optimized and looks good than one that takes 10x the compute and is full of AI artifacts.
Blaming GPU manufacturers for poorly optimized games is a bit like blaming forks for people being obese.
Yes, games should be optimized. That’s a given. But it’s not Nvidia’s fault for making better, more performant graphics cards with new features each generation. It’s the game developers’ fault for being lazy, not knowing how to use their game engine, and not optimizing their game. GPU makers could drop the most advanced card known to man and that would make no difference for developers. It’s still on them to optimize their game.
The artifacting is way down compared to before. AI isn’t going anywhere, and I only see it improving, with fewer artifacts, in the future. In some videos I have seen issues, but it’s really unfair, since to show them on YouTube they have to record at only 120 fps and slow the game to 50% speed, and even then they get slapped with YouTube compression. I don’t know if the artifacts will even be very visible or noticeable outside of some edge cases.
I am more curious to see if MFG can be used for games that have a forced framerate cap, or emulators.
I don’t blame them for poorly optimized games, but I do blame GPU manufacturers for all the little things they do to inflate their prices.
I mean, I agree, but what are you going to do about it? Even if they didn’t add features, it’d be “supply chain issues” or some other made-up excuse. Whatever it takes to get Jensen a shinier jacket. But I personally would rather have new features I may or may not use than no features for the same price.
Yarp, I’m not really arguing witcha; OP’s argument was different than mine anyhow. What will I do? Everything in my power (nothing). I’ll continue ta complain mightily about their unfair market control and price fixin’, simply cuz it feels good to vent.
There’s a good reason why you see Nvidia or AMD splash logos on game startup. Companies 100% get kickbacks to make demanding games that use the newest hardware.
For games that aren’t fast moving, you don’t need 240fps in the first place.
I played an MMO at 40 FPS for years. With a FreeSync screen that matches the frame rate instead of stuttering or tearing, it still feels fine.
From experience, you’d definitely notice and would have a hard time going back. My Windows settings had restricted my FPS to 30 in my MMO and I didn’t know; when I fixed it and got 60, the difference was huge. Took a week to get used to.
Got my wife a 144 Hz screen, and I can see the difference between my 60 and hers, but it’s not as huge.
Any time some weird thing results in the refresh rate being set incorrectly, it’s pretty noticeable to me. Though it might be more about the pixel response time being tuned for the max refresh rate, so the screen looks a bit darker at lower refresh rates (because the pixels get closer to returning to black before the next frame comes in).
Just speculating on the reason but I definitely notice it when it’s wrong.
No, you don’t need it, but it wouldn’t hurt to have a smoother picture. A smoother picture would of course give a perceived performance increase. So while the card might only be an 8% increase, it could feel like more, particularly in games that don’t require MFG to be turned off.
I mean, obviously the 5080 isn’t really worth the price if you already have a 4080, but still. No 50 series is worth the price if you already have a 40 series. Nobody in their right mind buys a new GPU every generation.
I wonder if MFG can be used for games with an engine-level hard cap on framerate, or with emulators, to give a smoother experience without having to modify the game files.