Not generating the exact same bits in GPU product and architecture comparisons

I mean it asks you during the first install "do you want to enable game optimization" with "yes" being the pre-selected option. I would think most people probably select yes.

If that’s true then that is certainly enabling it by default. Don’t remember seeing that during install, though it’s been a long time since I did a first-time install of the app.
 
I mean it asks you during the first install "do you want to enable game optimization" with "yes" being the pre-selected option. I would think most people probably select yes.
It doesn't "ask you during the first install", it gives you several options during installation where you can chose to enable the optimizations (which is a default position for the option in that dialog) or not.
We have no idea what "most people" are choosing in that step.
The app itself does not enable anything "by default", it is a user choice during the installation process.
 
It doesn't "ask you during the first install", it gives you several options during installation where you can chose to enable the optimizations (which is a default position for the option in that dialog) or not.
Maybe this is a language thing, but your sentence describing this is what I would call "asking you during the first install". It asks if you want to enable optimization, and the default is yes.

Beyond that, most games auto-enable DLSS and Reflex as part of quality presets, so it's kind of a moot point regardless.
 
Maybe this is a language thing, but your sentence describing this is what I would call "asking you during the first install". It asks if you want to enable optimization, and the default is yes.
A choice presented at installation always defaults to something; that doesn't mean that the app "enables something by default".
The latter would be the case if there were no choice during installation and the feature were silently enabled, with an option to disable it later. This is not the case here.
 
A choice presented at installation always defaults to something; that doesn't mean that the app "enables something by default".
The latter would be the case if there were no choice during installation and the feature were silently enabled, with an option to disable it later. This is not the case here.
Without the user specifically altering the default choices in the installer, it's enabled. That's literally what "by default" means, no matter how much you try to claim the moon is actually the sun.
 
Without the user specifically altering the default choices in the installer, it's enabled. That's literally what "by default" means, no matter how much you try to claim the moon is actually the sun.
No matter how much you want to claim that Nvapp is "turning/enabling things automatically", the simple truth is that it doesn't. Please pay attention to what YOU claim.
 
A choice presented at installation always defaults to something; that doesn't mean that the app "enables something by default".
The latter would be the case if there were no choice during installation and the feature were silently enabled, with an option to disable it later. This is not the case here.
Realistically speaking, how many people do you think choose not to enable optimization? This feels like a silly thing to debate, almost everyone is choosing 'yes' because it's what Nvidia suggests. What normal user is going to disable something called 'optimization'???
 
Has anyone shown where the 80% number comes from? Is it from the Nvidia app or driver telemetry? Unfortunately, I doubt Nvidia will be forthcoming with how that data was collected and analyzed.
It's in the slide; it's from the NVIDIA App.

No matter how much you want to claim that Nvapp is "turning/enabling things automatically", the simple truth is that it doesn't. Please pay attention to what YOU claim.
Yes, it missed "by default" in the text, because no one on earth would think it means it forces it on no matter what the user does. Stop grasping at straws that aren't there.
 
Realistically speaking, how many people do you think choose not to enable optimization? This feels like a silly thing to debate, almost everyone is choosing 'yes' because it's what Nvidia suggests. What normal user is going to disable something called 'optimization'???
I wouldn't call myself normal in any particular way but I feel software has trained me over time not to blindly affirm anything it asks of me.

Blame it on innocuous-sounding extra offers in your typical setup.exe, as well as the ever-growing list of checkboxes I now categorically unmark during a Windows upgrade or feature release. So no, when Nvidia asked me if it should do something automagically, I told it nope, I got it.
 
Realistically speaking, how many people do you think choose not to enable optimization? This feels like a silly thing to debate, almost everyone is choosing 'yes' because it's what Nvidia suggests. What normal user is going to disable something called 'optimization'???
We have no idea. The fact is the choice is presented during installation. If you just click through everything without reading, then it's not the app which is "enabling things automatically", it is you. And you're not "disabling" anything, since this choice is made during installation, meaning that nothing gets enabled prior to you making it.
 
I too am one of the folks who have been burned in the past by "auto-magic optimizations" and, in general, refuse to permit any of them when given the option. I will say, after using the NVIDIA app for a month or three and GeForce Experience before it, I haven't really ever used their first automatic guess at what I'd like. Rather, I usually end up ratcheting the quality slider all the way to the right and letting it try again, and I'm typically happy with the result.

I think part of the reason automatic optimizations aren't a great idea is because each customer has a different perspective on what they find important. I can be using the same hardware as my brother, and he'll want >120FPS and I'll prefer something between 60 and 100 with a lot more visual fidelity. Which one is most correct?
 
I too am one of the folks who have been burned in the past by "auto-magic optimizations" and, in general, refuse to permit any of them when given the option. I will say, after using the NVIDIA app for a month or three and GeForce Experience before it, I haven't really ever used their first automatic guess at what I'd like. Rather, I usually end up ratcheting the quality slider all the way to the right and letting it try again, and I'm typically happy with the result.

I think part of the reason automatic optimizations aren't a great idea is because each customer has a different perspective on what they find important. I can be using the same hardware as my brother, and he'll want >120FPS and I'll prefer something between 60 and 100 with a lot more visual fidelity. Which one is most correct?
Do most people really use the automatic stuff? I've never used it in my life. Not that I'd expect the average user to obsess over these things like I do.
 
Do most people really use the automatic stuff? I've never used it in my life. Not that I'd expect the average user to obsess over these things like I do.
Yeah, I find it chooses pretty sensible defaults. For older stuff it usually misses the mark but for anything somewhat modern I usually just choose the GeForce optimized settings.
 
Yeah, I find it chooses pretty sensible defaults. For older stuff it usually misses the mark but for anything somewhat modern I usually just choose the GeForce optimized settings.
I'm looking at its recommended settings for FF7 Rebirth and it's exactly what I chose manually. I'll never use it myself because I'm insane when it comes to this stuff, but I'll definitely recommend more reasonable gamers to try it.
 
I've pulled out this side discussion from the RX 9070 reviews thread in the A+P forum since I think it's an important one to have separately from the discussion of any particular GPU product or architecture. There have never been any hard and fast guarantees that every GPU would, given the same inputs, generate the same bit-exact outputs. Traditionally, analysis methods have tried to ensure that as much as possible, but these days it's almost impossible, and maybe even an already lost battle, especially at the higher level of a product analysis.

The posts I've pulled out discuss what control-panel-style systems might do automatically or under user control, but there's a lot more to it than that, especially if vendor-specific game tech or optimisations are in play.

What should the industry as a whole do about that, if anything? At the lower level of how GPUs work it's easier to avoid vendor-specific tech and ensure bit-accurate output is generated, but as soon as you start rendering anything it gets much harder.
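To make the lower-level case concrete, here's a minimal, hypothetical sketch (the file names and the PNG capture format are assumptions for illustration, not anything a vendor tool produces): given two captures of the same frame from two GPUs, checking for bit-exactness is trivial, and the interesting part only starts once they differ.

```python
# Hypothetical sketch: compare two framebuffer captures of the same frame.
# Bit-exactness is a yes/no answer; quantifying "how different" is where
# judgement enters. Assumes numpy and Pillow are installed and that the
# captures were saved losslessly (e.g. PNG).
import numpy as np
from PIL import Image

def compare_frames(path_a: str, path_b: str) -> None:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        print("Different resolutions - not even comparable bit-for-bit")
        return
    diff = np.abs(a - b)
    if not diff.any():
        print("Bit-exact match")
        return
    # Count pixels where any channel differs, then summarise with PSNR.
    mismatched = np.count_nonzero(diff.max(axis=-1))
    mse = float(np.mean(diff.astype(np.float64) ** 2))
    psnr = 10.0 * np.log10(255.0 ** 2 / mse)
    print(f"{mismatched} of {a.shape[0] * a.shape[1]} pixels differ, PSNR {psnr:.1f} dB")

compare_frames("vendor_a_frame.png", "vendor_b_frame.png")  # hypothetical captures
```

Once the diff is non-zero, even a single summary number like PSNR already involves a judgement call about which kinds of difference matter, which is where the subjectivity discussed below comes in.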
 
Yeah, I believe we had similar discussions on this forum during the anisotropic filtering "era." After that it became almost impossible to get bit-exact output from games.
Since it's very difficult to find a "reference frame", any quality comparison will always be subjective in some way.
 
Ideally any analysis will start off with exploring artifacts in native rendering with no TAA or upscaling. I still play quite a few older games where you can disable TAA, and even 4x DSR can’t get rid of some annoying artifacts. The upscaling debate is colored by a perception that native rendering is artifact-free, and that’s not the case at all. From there you can explore the benefits and flaws of TAA and then contrast that with modern upscalers.
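As a rough illustration of that workflow, here's a hypothetical sketch that scores a native-TAA frame and an upscaled frame against a supersampled capture used as the reference. The file names, the choice of SSIM as the metric, and the idea that a downscaled 4x DSR shot makes a fair "reference" are all assumptions for the sake of the example.

```python
# Hypothetical sketch: score candidate frames against a supersampled reference.
# Assumes numpy, Pillow and scikit-image are installed; file names are made up.
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

def load(path: str) -> np.ndarray:
    # Load as float RGB in [0, 1] so SSIM's data_range is well defined.
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0

reference = load("dsr_4x_downscaled.png")   # stand-in for an "artifact-free" frame
candidates = {"native_taa": "native_taa.png", "upscaled": "dlss_quality.png"}

for name, path in candidates.items():
    frame = load(path)
    score = structural_similarity(reference, frame, channel_axis=-1, data_range=1.0)
    print(f"{name}: SSIM vs reference = {score:.4f}")
```

Even then the reference itself is a choice, so scores like these rank candidates rather than settle the debate.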
 
The upscaling debate is colored by a perception that native rendering is artifact-free, and that’s not the case at all.
This is pretty much signature-worthy and is an incredibly concise way to describe the situation. At this point in the graphics timeline, I'm not even sure we could precisely define how to generate a "pure" frame from a modern graphics engine. What would the definition even entail? What does a completely non-artifacted scene look like? At some point, the combination of display device and viewing distance becomes the source of visible artifacting.

If I could give you more than one thumbs up, I would. Stupid forum software! ;)
 