Upscaling Technology Has Become A Crutch

The point of DLSS and similar technologies is that they help create a high-end experience on low-range or mid-range hardware. I don't think most games are using it as a crutch. It's like upscaling on consoles: native 4K is not attainable in many games, so they use upscaling to compensate; they have no choice.

I would say it always applies to the high end.
I just went from a 27" 170 Hz IPS to a 42" 144 Hz OLED.
My 4090 is not sufficient to run "Cyberpunk 2077 - Phantom Liberty" with path tracing without DLSS,
or "Black Myth - Wukong".
(And I was running DLAA, not DLSS, before, but 4K is a performance killer, so now I have to run DLSS (Quality).)

I hope the 5090 will enable me to run DLAA again, because I can clearly see the difference between DLAA and DLSS.
 
It also provides anti-aliasing with a method that is superior to most temporal algorithms.

So I would say that it's a necessity at this time. TAA at 4K will very often look worse than DLSS upscaled from a lower resolution.

I've seen enough videos and screenshot comparisons where I think people often attributed the blurriness to a difference in anisotropic filtering or texture quality between the Xbox consoles and the PS5, when, thinking back, it's likely a result of TAA being slightly better on the PS4 Pro/PS5 due to devs incorporating the id buffer.

And you can see how large a jump NN-based anti-aliasing is. The difference is night and day when it works.

Upscaling is a crutch. But NN-based anti-aliasing is the future, and it is often bound to upscaling. I guess another way to look at it is: if you're going to eat the performance cost for superior AA, you may as well claw some of that performance back by reducing the render resolution and taking advantage of the upscaling.
Agreed...DLAA is far superior to TAA.
But 4K is a performance killer, so this generation of NVIDIA cards is unable to lift the burden in high-IQ games.
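
To put rough numbers on the "claw some performance back" point above: assuming the commonly quoted DLSS render-scale factors (about 0.667 per axis for Quality, 0.58 for Balanced, 0.5 for Performance; individual games can deviate), the pixel savings at a 4K output look roughly like this:

```python
# Sketch: shaded-pixel savings when upscaling to a 4K output, assuming
# the commonly quoted DLSS render-scale factors (not guaranteed per title).
OUTPUT_W, OUTPUT_H = 3840, 2160
SCALES = {"Native/DLAA": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

output_pixels = OUTPUT_W * OUTPUT_H
for mode, scale in SCALES.items():
    render_w, render_h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    share = (render_w * render_h) / output_pixels
    print(f"{mode:>12}: {render_w}x{render_h} (~{share:.0%} of native 4K's pixels)")
```

Quality mode shades only around 44% of the pixels of native 4K, which is where the headroom to pay for the heavier NN-based AA comes from.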
 
Yep. Back in the CRT days you just changed your resolution. Then LCDs came out with a native resolution, but it wasn't so bad until 4K showed up. 4K is just kind of ridiculous as a native target, especially when it's getting more and more difficult to produce hardware that can render at it.
I will need to try out lowering the resolution on my OLED...when I am done enjoying true black :love:
 
I will need to try out lowering the resolution on my OLED...when I am done enjoying true black :love:
I totally hated the image quality of the first LCDs. The image was smeared and looked pixelated, but everyone was going batshit insane about how great LCDs were.
Yes, they were great for their size and thickness. But a Sony WEGA was literally pissing on the LCDs.
 
I totally hated the image quality of the first LCDs. The image was smeared and looked pixelated, but everyone was going batshit insane about how great LCDs were.
Yes, they were great for their size and thickness. But a Sony WEGA was literally pissing on the LCDs.
I had Sony Trinitron CRTs back in the day, so I know what you mean.

This OLED is the first time I am not displeased with the image quality on a monitor since then.
 
What drove me nuts on my first LCD was that black wasn't black; compared to a CRT it was dark grey. After a few days I got used to it.
 
NN as in near native?

I'm assuming Neural Network given the context.

I also just realized we can't use Machine Learning Anti-Aliasing (MLAA), as that acronym is already taken for anti-aliasing (Morphological Anti-Aliasing). Just wondering aloud here, but I wonder if Nvidia would've named DLSS/DLAA as MLSS/MLAA had that not been the case.
 
What drove me nuts on my first LCD was that black wasn't black; compared to a CRT it was dark grey. After a few days I got used to it.
I have the problem that when I go to work now, the 2 x Lenovo P34W-20 (3440x1440) I use there look rather displeasing...no true black.
 
I would say it always applies to the high end.
I just went from a 27" 170 Hz IPS to a 42" 144 Hz OLED.
My 4090 is not sufficient to run "Cyberpunk 2077 - Phantom Liberty" with path tracing without DLSS,
or "Black Myth - Wukong".
(And I was running DLAA, not DLSS, before, but 4K is a performance killer, so now I have to run DLSS (Quality).)

I hope the 5090 will enable me to run DLAA again, because I can clearly see the difference between DLAA and DLSS.
The culprits forcing 4K down users' throats are the companies building TVs. When you see 4K or 8K on console boxes, you know they are trying to sell those TV models no matter what.

When the Xbox One was launched, I gotta admit I purchased a brand new 1080p TV (which still works today), which was a good buy. However, on modern devices, consoles and PCs alike, 4K is still a bit of a stretch. Many console games run at laughable resolutions.

Still, I can see the PS6 featuring an 8K logo on the box, without even mentioning 4K, when 8K becomes the new fad.
 
I will need to try out lowering the resolution on my OLED...when I am done enjoying true black :love:
OLEDs and LCDs are essentially identical in how they handle resolution. The only big difference is that OLEDs are self-emissive, and therefore each pixel can reach a perfect black (in near-perfect lighting conditions, of course). CRTs scaled better with resolution since they didn't have fixed pixel arrays; they would just change how many rows and columns were shot out by the electron gun.
 
CRTs scaled better with resolution since they didn't have fixed pixel arrays; they would just change how many rows and columns were shot out by the electron gun.

This is not entirely true, as a CRT still has lines. The size of these lines is more or less fixed by the calibrated brightness of the phosphors and the mask. A video card does line doubling in low-resolution modes (e.g. 320x240) on a high-resolution display; that's why they appear pixelated.
If you use a low resolution on a high-resolution CRT without line doubling (such as running 640x480 on a 1024x768 monitor), there will be gaps between the lines. Human eyes adapt to ignore these gaps, but they reduce brightness and they are not very nice. Some arcade emulators recreate this effect to simulate the "CRT look and feel".
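
For the curious, here is a toy sketch (my own, not taken from any particular emulator) of that kind of scanline effect: double the image vertically and dim every second row to imitate the dark gaps between a CRT's lines.

```python
import numpy as np

def fake_scanlines(frame: np.ndarray, gap_brightness: float = 0.35) -> np.ndarray:
    """Double the frame vertically and dim every second row to mimic the
    dark gaps between CRT scan lines. `frame` is an H x W x 3 uint8 array."""
    doubled = np.repeat(frame, 2, axis=0)  # each source line becomes two rows
    doubled[1::2] = (doubled[1::2] * gap_brightness).astype(np.uint8)  # darken the gap rows
    return doubled

# Usage: a flat grey 320x240 test frame becomes a 480-line frame with visible scanlines.
test_frame = np.full((240, 320, 3), 128, dtype=np.uint8)
print(fake_scanlines(test_frame).shape)  # (480, 320, 3)
```

Real emulator shaders are much fancier (bloom, mask simulation, curvature), but the basic "dark line between every scan line" trick is the same idea.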
 
The culprits forcing 4K down users' throats are the companies building TVs. When you see 4K or 8K on console boxes, you know they are trying to sell those TV models no matter what.

When the Xbox One was launched, I gotta admit I purchased a brand new 1080p TV (which still works today), which was a good buy. However, on modern devices, consoles and PCs alike, 4K is still a bit of a stretch. Many console games run at laughable resolutions.

Still, I can see the PS6 featuring an 8K logo on the box, without even mentioning 4K, when 8K becomes the new fad.
No one forced me to go 4K.
I made that choice.

Not only do I have a lot more screen real estate by going 4K, I also made the switch from IPS to OLED, and oh my golly.
The picture quality (pixel cleanness), no backlight (true black) and 12-bit 4:4:4 (yay for colors) are each their own reason to be happy about the upgrade.
I also went from 27" @ 1440p to 42" @ 2160p at 150 cm distance, so immersion got turned WAY up.

Even old games like "Rome - Total War Remastered" are a bliss on my new monitor.
But an RTX 4090 is not sufficient to run 4K in some newer games.
"Black Myth - Wukong", "Cyberpunk 2077 - Phantom Liberty" & "Alan Wake II" all require me to use DLSS/FG or reduce settings in a way that makes image quality suffer a lot more than DLSS/FG does.

4K is far from "a stretch"; DLSS/FG helps me in new games...and in older games 4K really ups the image quality.

(And I haven't even mentioned HDR (e.g. Dolby Vision) 120 FPS videos on an OLED in 4K yet.)
 
Just to add, this is my work screen real estate, and I could use more at times:
[Attachment: Test.png]
(2 x 3440x1440 + 1 x 1920x1080 for a total of 11,980,800 pixels, vs 3840x2160 for a total of 8,294,400 pixels at home.)
So 4K is not something being pushed on me, as I see it.
4K is the option that makes the most sense for me at home at the moment, while I need more pixels, albeit in a different form factor, at work.
 
4K for gaming is very different to 4K for work.

I think very much on modern supersized screens, 4K has value, but the debate is exactly the same as 1080p on smaller screens. What can normal people actually differentiate at what pixel density, and what about in motion? 4K on a treacle display would be less resolvable than 1080p on a very crisp low-latency display. But going forwards, we don't want resolution to be arbitrarily capped lower than is ideal. 4K displays make sense as you don't have to render at that resolution! Giving the option of higher framerates. If 1080p is good enough for you, render the game at 1080p and just display it on the 4K screen, either fancily upscaled or just doubling pixels for that authentic 1080p look.
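
Pixel doubling happens to work out exactly here, since 4K is precisely 2x 1080p in each dimension. A minimal sketch of that kind of integer (nearest-neighbour) scaling, assuming the frame lives in a NumPy buffer:

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour upscale: each source pixel becomes a factor x factor
    block, so 1920x1080 maps 1:4 onto a 3840x2160 panel with no filtering blur."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(integer_upscale(frame_1080p).shape)  # (2160, 3840, 3)
```

That exact 1:4 mapping is why 1080p content can look clean on a 4K panel instead of being smeared by bilinear scaling.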
 
4K for gaming is very different to 4K for work.

I think very much on modern supersized screens, 4K has value, but the debate is exactly the same as 1080p on smaller screens. What can normal people actually differentiate at what pixel density, and what about in motion? 4K on a treacle display would be less resolvable than 1080p on a very crisp low-latency display. But going forwards, we don't want resolution to be arbitrarily capped lower than is ideal. 4K displays make sense as you don't have to render at that resolution! Giving the option of higher framerates. If 1080p is good enough for you, render the game at 1080p and just display it on the 4K screen, either fancily upscaled or just doubling pixels for that authentic 1080p look.
I will say that going from 27" 1440p to 42" 2160p for gaming has been great for most games for me.
One downside, though, is that I have to use DLSS instead of DLAA in "Cyberpunk 2077", and I can see the slight image quality decrease.
It is, however, better than having to reduce settings.
I hope that a 5090 will enable me to use DLAA again.

I might also be "colored" by the superior clarity on an OLED compared to IPS.
One strange thing, though, is that if I choose 2160p @ 144 Hz, I can only use 10 bpc, but if I reduce to 2160p @ 120 Hz I can select 12 bpc.
But this is fine for me, as I do not need anything more than 120 FPS...and newer games cannot reach that level with path tracing on current hardware anyway :LOL:
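
That 144 Hz / 10 bpc vs 120 Hz / 12 bpc trade-off is presumably the link running out of bandwidth. A back-of-the-envelope check (my own rough numbers: uncompressed RGB/4:4:4, ignoring blanking overhead and DSC, and assuming an HDMI 2.1 link with roughly 42.6 Gbps usable for video):

```python
# Rough uncompressed bandwidth needed for a 3840x2160 4:4:4 signal.
# Blanking intervals and DSC are ignored, so real requirements sit a bit higher.
HDMI21_USABLE_GBPS = 42.6  # ~48 Gbps raw minus 16b/18b FRL encoding overhead

def video_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    return width * height * hz * bits_per_channel * 3 / 1e9  # 3 channels (RGB)

for hz, bpc in [(144, 12), (144, 10), (120, 12)]:
    need = video_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if need <= HDMI21_USABLE_GBPS else "does not fit"
    print(f"2160p @ {hz} Hz, {bpc} bpc: ~{need:.1f} Gbps -> {verdict}")
```

Under those assumptions, 2160p at 144 Hz with 12 bpc lands at roughly 43 Gbps and just tips over the limit, while dropping to either 10 bpc or 120 Hz brings it back under, which matches the options the driver exposes.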
 
4K for gaming is very different to 4K for work.

I think very much on modern supersized screens, 4K has value, but the debate is exactly the same as 1080p on smaller screens. What can normal people actually differentiate at what pixel density, and what about in motion? 4K on a treacle display would be less resolvable than 1080p on a very crisp low-latency display. But going forwards, we don't want resolution to be arbitrarily capped lower than is ideal. 4K displays make sense as you don't have to render at that resolution! Giving the option of higher framerates. If 1080p is good enough for you, render the game at 1080p and just display it on the 4K screen, either fancily upscaled or just doubling pixels for that authentic 1080p look.
Agreed. I've had a bit of everything: a decent 4K TV and a lacklustre 4K 28" monitor, plus a 1440p monitor, a 1080p TV and a 1080p monitor. I've also had quite a few 1366x768 displays back in the day as well.

When it comes to productivity, 4K wins hands down. You just get a larger viewport on a 4K screen. In addition, my 4K 50" can even be "natively" used as a 32:9 display (3840x1080).

If we are talking games, as long as you play at the native resolution (DLSS or not), the resolution differences aren't that noticeable.

My old and mediocre 4K 28" monitor looked very crisp when playing at 4K, as if games were over-sharpened. But it was a bit meh. You wouldn't go like "oh my god, this looks incredible, 4K!!!!!".

At the same display quality, a good 27" 1440p monitor can look as crisp as a 4K 42" screen imho.
 