Jawed
Legend
If you're gaming with Titan, then which games is this relevant to? Low levels of eyecandy, low framerate and low resolutions are not tech!
I checked the review benchmarks of a Titan: none of the games tested reached minimum frame rates of 120 fps at 1080p with Ultra settings. (In fact, most don't even reach 60 fps.)
I don't see a universal hardware solution here, do you?
Or are console games played on TVs irrelevant? And the games played on anything but a few models of NVidia GPU are irrelevant, too?
You aren't paying attention. There is no reason to go for broke on the first difficult frame when the effects step up a notch.
TVs regularly need new models whenever HDMI adds a new resolution + framerate mode.
There's no prediction required. For example, you render your explosion at 1/4 (or 1/16th, etc.) resolution when it starts and, if the frame-time impact is low, step the fraction up towards full resolution. Even if the explosion takes 12 frames before it reaches "max IQ" on a high-end system, the low-end system hasn't stuttered at any point in the explosion.
The developer can benchmark the GPU (in game) to identify the starting fraction for this effect (e.g. 1/4 for Titan, 1/32 for some APU).
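To make that concrete, here's a minimal C++ sketch of the feedback loop being described — not any engine's actual code; the constants, the 60 fps budget and the function names are all made up for illustration:

```cpp
#include <algorithm>
#include <cstdio>

// Start the effect at a coarse resolution fraction (picked by an in-game
// benchmark) and refine it only while the measured frame time leaves
// headroom; no prediction of future cost is needed.
constexpr int kCoarsest = 32;  // 1/32 resolution, e.g. for a slow APU
constexpr int kFinest   = 1;   // full resolution, e.g. for a Titan

int UpdateEffectDenominator(int denom, double lastFrameMs, double budgetMs)
{
    if (lastFrameMs < 0.8 * budgetMs)          // plenty of headroom:
        return std::max(kFinest, denom / 2);   //   step the quality up
    if (lastFrameMs > budgetMs)                // over budget:
        return std::min(kCoarsest, denom * 2); //   back off before we stutter
    return denom;                              // near budget: hold steady
}

int main()
{
    int denom = 4;                 // benchmark says "start this GPU at 1/4"
    const double budgetMs = 16.7;  // 60 fps frame budget
    const double measuredMs[] = {10.1, 11.3, 12.0, 13.5, 17.2, 15.9};

    for (double ms : measuredMs) {
        denom = UpdateEffectDenominator(denom, ms, budgetMs);
        std::printf("frame %.1f ms -> effect at 1/%d resolution\n", ms, denom);
    }
    return 0;
}
```

With those sample frame times the effect steps from 1/4 towards full resolution while there's headroom and backs off the moment a frame goes over budget, which is the "no stutter at any point in the explosion" behaviour described above.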
You'll always be leaving some performance on the table.
This is no different from how a 60Hz locked framerate always leaves performance (IQ) on the table, since there are large portions of any game that can render at more than 60 fps (because it's PC).
The point is, console developers are already using excess framerate to improve IQ, and reducing IQ to maintain framerate. They don't need hardware to do this. And, more fundamentally, this hardware makes absolutely no difference to console gamers, because they won't have it this gen. At the same time, these techniques will end up in PC games because of their console lineage (well, some PC games have been adapting IQ to maintain performance for more than 10 years).
You can argue the techniques are still crude, but so are low-res textures, square heads, blocky/noisy shadows, fade-in vegetation, etc.
That's stutter:
60, 60, 60, 25, 26, 27, 25, 25.5, 60, 60, 60 (fps)
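For what it's worth, converting that quoted sequence into per-frame times makes the complaint concrete: 60 fps is ~16.7 ms per frame and 25 fps is 40 ms, so the drops into and out of the slow stretch are ~23 ms hitches. A throwaway C++ illustration (the 5 ms threshold is an arbitrary choice):

```cpp
#include <cmath>
#include <cstdio>

// Convert the quoted frame rates into per-frame times and flag the big
// jumps: 60 fps is ~16.7 ms/frame, 25 fps is 40 ms/frame, so entering and
// leaving the slow stretch costs a ~23 ms hitch either way.
int main()
{
    const double fps[] = {60, 60, 60, 25, 26, 27, 25, 25.5, 60, 60, 60};
    double prevMs = 1000.0 / fps[0];

    for (double f : fps) {
        double ms = 1000.0 / f;
        const char* note = std::fabs(ms - prevMs) > 5.0 ? "  <-- hitch" : "";
        std::printf("%5.1f fps = %5.1f ms%s\n", f, ms, note);
        prevMs = ms;
    }
    return 0;
}
```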
I would have thought you would understand the pretty elementary sampling rate argument from silent_guy.
Damn this thread is full of stupidity.
Jawed, you win, I concede: game developers are lazy.
Or is this just for lazy PC developers (or those encumbered by shit in the API) and gamers with cheap cards that can't maintain 30 or 60 fps?
If you're gaming with Titan, then which games is this relevant to?
The vast majority that were released in the last 5 years, if you want to game at 4K. Or just max settings, 1080p, good AA and 3D. Or any other combination of image quality/resolution that prevents you from maintaining 60 fps 100% of the time.
Well, I'm bemused: people apparently want to spend money on getting one kind of stuttery, sub-60 fps framerate that's different from another kind of stuttery, sub-60 fps framerate they already have. Go ahead...
I don't know what people want, but I'd love a solution that makes horrible CRT legacy band-aids unnecessary and solves the problem where it can be handled best.