Lossless Scaling. Frame Generation x20 😁 (2025 update) using AI and upscaling on ANY GPU!

this 5-month-old video explains it.

Optimized Photorealism That Puts Modern Graphics to Shame: NFS 2015


Oh, it's Wesley Crusher again.
[image: photocompare.jpg]
What are these? Game looks nothing like these. For one thing it takes place at night AFAIK, so there's no daytime scene.
 
got the image from here.


Speedhunters published some reference pictures that the development team of the upcoming Need for Speed tried to recreate in the game. The result looks pretty amazing!

I don't have the game (just wishlisted it though) so I can't say.
 
This post by @neckthrough in a different thread, explaining why FG is necessary to help your brain, should be taught at universities.

https://forum.beyond3d.com/threads/nvidia-blackwell-architecture-speculation.63097/post-2365191

-- the GPU should be generating as many frames as needed to match the refresh rate of any sample-and-hold monitor (unless the monitor supports BFI, see below). The problem is that sample-and-hold is fundamentally broken. Your sample-and-hold monitor is already generating frames if you're sending a 60 fps signal to a 240Hz display -- it's just repeating the old frame 4 times. It's the most brain-damaged form of frame-generation and you use it all the time.

BFI changes the equation because it kills the hold prematurely, allowing your brain to interpolate between frames (which somewhat emulates CRT behavior). You do lose brightness. I think in practice some mix between GPU-FG and BFI is going to be the ideal recipe.
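
To put rough numbers on the sample-and-hold point (a quick sketch of my own, not from the quoted post):

```python
def hold_stats(source_fps: float, refresh_hz: float) -> tuple[float, float]:
    """Return (refreshes showing each source frame, hold time per frame in ms)."""
    repeats = refresh_hz / source_fps    # 240 Hz / 60 fps = 4 refreshes per frame
    hold_ms = 1000.0 / source_fps        # each unique frame persists this long
    return repeats, hold_ms

for fps in (60, 120, 240):
    repeats, hold = hold_stats(fps, 240)
    print(f"{fps:>3} fps on a 240 Hz panel: shown {repeats:.0f}x, held {hold:.1f} ms")
```

At 60 fps the panel holds each frame for ~16.7 ms across 4 refreshes, which is exactly the "brain-damaged frame generation" described above.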
 
not specifically LS related, but I shared this in another post; for those who want to play at the max framerate of their monitor, I find this video VERY interesting.

It's a brief video on how to use G-Sync/Freesync.

For an optimal gaming experience, using a framerate limiter (be it from the AMD, Intel, or nVidia drivers) to cap the FPS at 3 frames below the maximum refresh rate of your display is ideal. This, combined with Vsync on (preferably in your GPU's control panel, disabled in-game) and Gsync on, results in the least input lag.


A screenshot from the video highlights the conclusions. In my case, the middle option works best. Additionally, the Intel Graphics Software app now includes a "Low Latency" setting.

[screenshot: VSDsC7o.png]


Also, you can find additional tips (like setting your mouse to a 1000Hz polling rate to avoid mouse micro-stutter, setting the Windows power plan to High performance, etc.) in this incredible article by the Blur Busters creators on G-Sync/Freesync tech.


 
playing one of my favourite games -I completed a few speedruns on it-, Resident Evil 2 Remake, at a PERFECT locked 54/162fps (which I had never managed before; even decreasing the resolution to 720p with FSR enabled didn't get past 123fps) with great image quality -no sacrifices- has left me awestruck.

I was looking at the perfect flow..., the smoothness of everything moving on the screen, and I couldn't muster a word. I was just staring like when I got my first graphics accelerator, the Monster 3D, and transitioned from 2D to 3D; that kind of amazement....

Can't wait to play and complete, again, this game at a locked 300fps or 360fps when my 360Hz monitor arrives.
 
btw, you can enable nVidia Reflex in Rivatuner, which further helps create flawless framerate pacing.

[screenshot: AZmmKTm.png]


That's how I managed to get Resident Evil 2 Remake playing so smoothly.

Also this..., as per the Blur Busters G-Sync/Freesync recommendations article (the bible of smoothness):

-> Global framerate limiter in the GPU's control panel set to 3fps less than your monitor's max refresh rate (i.e. 162fps for a 165Hz monitor)
-> VRR on
-> Vsync on (in the GPU's native control panel, but set Vsync to off in-game)
-> Framerate limiter like Rivatuner with nVidia Reflex enabled, set to an exact fraction of your capped max framerate for very precise, low input lag framepacing (i.e. a 54fps limit in Rivatuner for a 162fps cap on a 165Hz display, since 54 x 3 = 162)

Setting the limiter to 3fps below your monitor's actual max refresh rate just keeps the monitor always working within its VRR range, so you get the benefits of VRR all the time.
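
For anyone who wants to sanity-check the numbers, here's a minimal sketch of the recipe above (the -3fps headroom and the 54 x 3 = 162 example come from the post; the helper names are mine):

```python
def vrr_cap(max_refresh_hz: int, headroom: int = 3) -> int:
    """Cap the framerate slightly under max refresh to stay inside the VRR range."""
    return max_refresh_hz - headroom              # 165 Hz -> 162 fps

def base_framerates(cap: int, max_multiplier: int = 10) -> list[int]:
    """Base framerates that divide the cap evenly, giving clean FG multiples."""
    return [cap // m for m in range(2, max_multiplier + 1) if cap % m == 0]

cap = vrr_cap(165)
print(cap, base_framerates(cap))   # 162 [81, 54, 27, 18] -> e.g. 54fps x3 = 162
```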
 
the video below isn't about LS. It's interesting to see that he says he prefers to play with FG even when he gets only about 30 FPS base. The usual advice is to use FG only when you already get at least 50-60 FPS, but it's also true that even below 50fps, not using FG makes playing the game a PITA.

He only advises against it when playing competitive games or when you can already reach very high framerates without it.

He also mentions that the increase in artifacts at low base framerates -30 or so- or those extra 10ms of input lag are more than made up for by what you gain in visual clarity in motion.


The point of sharing this video is that when he runs the game at 30fps base, even with the best GPU ever and using FG, the targeting reticle can show a slight flaw here and there, which just goes to show how good LS is for a 7€ app, 'cos it shares some of those issues but its author managed to almost fix them.
 
some great tips in the video below. A HUGE one actually. That's the missing piece of the puzzle. How do you get CRAZY framerates, like 600fps, without having the screen go bananas and look like a smearing mess?

It's all related to latency.

Among other tricks and tips, let's say that you want to go REALLY high with the framerate and you have, say, a 540Hz monitor; if you use quite a high multiplier like FGx10 or FGx12, at the default Max Frame Latency value (3) you are usually going to see artifacts and mushroom-like visions. 😁

In that case, increase the Max Frame Latency to, say, 10 or so -it depends- and the game is going to run at crazy framerates without the artefacts. (This obviously increases latency, but your experience is going to be much, much better.)
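
As a back-of-envelope only (my own simplification, not a claim from the video): if you treat Max Frame Latency as a queue of pre-rendered frames, each queued frame can add roughly one base frame-time of delay in the worst case:

```python
def queue_latency_ms(base_fps: float, max_frame_latency: int) -> float:
    """Rough worst-case buffering delay added by a frame queue of the given depth."""
    return max_frame_latency * 1000.0 / base_fps

for mfl in (3, 10):
    print(f"MFL {mfl:>2} at 54fps base: ~{queue_latency_ms(54, mfl):.0f} ms of buffering")
```

So going from MFL 3 to 10 at a 54fps base would cost on the order of 130 ms extra in the worst case, which is the latency trade-off mentioned above.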

Video starts at the tips & tricks section, where you can actually see the huge difference that it makes:


Another good thing about this video is that it features a game called Witchfire, which looks really good and fun; I didn't expect that, nor did I know the game.
 
game called Witchfire
It had a year of early-access exclusivity on EGS and only made it to Steam late last year. It had some buzz when first announced, then when the EGS exclusivity happened the buzz/interest evaporated like a drop of water in the Sahara. I'm waiting for it to leave early access before picking it up.
 
Quoting myself from the DF thread because it's not DF related, but more topical here.
I would love to see some A/B blind tests where people play a game on the same hardware at different settings, both graphics settings and frame generation/reflex/AI upscaling, to see what they prefer, and if they could feel or see the negative effects of frame generation. People fixate on the negatives of frame generation, and there are some, for sure, but I really want to know if those same people could identify those negatives in practice. Also, and I think this gets a bit lost in the conversation partly because of how nVidia markets DLSS FG, the question really shouldn't be "real" 120fps vs 120fps using frame generation. It should be whatever you can achieve without frame generation vs what you can achieve with it. If you can hit 60fps native and 120/240 with 2x/4x frame gen, the question for the feature should be: what is better, having the feature on, or off? Frame generation needs its Windows Mojave moment, for science, and also to satisfy my own curiosity.
Youtuber Vex blind tested his friend.
It's interesting what his friend gets right and wrong. I also love how they keep talking about the price.
 
watched the video. Once you know what to look for you might notice a difference. There is a difference, but it's best to test it yourself.

When you're used to 60fps, it feels really smooth. But then you play at 240fps for a while and you're going to find 60fps choppy. It is choppy, in fact. A stable 60fps doesn't look that great even on the best 8K display with SSAA on.

On my 4K TV I lock all games to 60fps and I don't find them fluid. It's like when you free up your GPU and go from 4K to 1440p or 1080p on a native display: something seems to feel smoother, or maybe it's just that your GPU is freed up. Maybe that's a placebo effect of mine.

But the framerate difference, for instance, is pretty noticeable. Also, with DLSS4 FG, artefacts don't seem to show frequently.
 
had my first experience with Resident Evil 2 Remake on the Alienware AW2523HF 360Hz monitor. I locked the game at 59fps + LS FGx6, so at 354fps I stay within the VRR range (48Hz to 360Hz) of the monitor, and what can I say. I didn't have to touch anything in my RE2 Remake profile within LS.

The game worked so smoothly from the get-go. An ideal game to showcase LS. Not a single artefact, a flawless 354fps, playing with everything maxed out except RT (which I set to off) and no upscaling, native 1080p. Kinda awe-inspiring.
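
As a tiny sanity check of that math (the helper is mine, the numbers are from this post):

```python
def within_vrr(base_fps: int, fg_mult: int, vrr_min: int, vrr_max: int) -> bool:
    """Does base fps x FG multiplier land inside the monitor's VRR window?"""
    return vrr_min <= base_fps * fg_mult <= vrr_max

print(within_vrr(59, 6, 48, 360))   # 59 x 6 = 354 -> True, inside 48-360Hz
```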

[photo: vRPjQwV.jpeg]


p.s. photo taken with the phone at night, so not the best quality. Still, the contrast of the display is mediocre compared to what I'm accustomed to, but I wouldn't trade this small 24.5" display for anything I have.
 
now with 150% supersampling and 40fps base + LS FGx9. 🙂🙂

[screenshot: KpJbPlk.jpeg]


[screenshot: m8R6rrc.jpeg]


I've been really looking for artifacts, making super, super fast mouse movements. And maybe there are some.

But after disabling LS with Control + Alt + S and making the same super fast movements, there were artifacts too: her nose seemed to duplicate during those movements. The artifacts were worse -the nose seemed to float in the air, and there were two noses- at 59fps with no FG.

Maybe a TAA thing?

At 59fps base + LS FGx6 there were no "double nose" artifacts. 🤷‍♀️
 
I have no idea how you guys put up with these artifacts. I tried it even on map games (where latency basically doesn't matter) and it looked horrible.
 
which games did you play and at what framerate? That's important. Games are much more stable from 40fps on. 30fps is playable but artifacts might show. Some games are more prone to artifacts than others.

Resident Evil 2 Remake shows zero artifacts, Ninja Gaiden 2 Black too -this one with FGx9, 40fps base-.

Sometimes you can spot artifacts if you use a very fast mouse and start turning the camera left and right, but in that case you might find artifacts even at native framerate....
 
Like I said, these were map games, aka strategy games: Paradox titles and the Civ series. These aren't hard to run, so my framerate was fairly high, and it being a map game, latency doesn't really matter; it just doesn't look very good.
 
Just adding this:
And I know what some of you might say. "Meh, I don't care, I have Lossless Scaling which can do the same thing". Well, you know what? I've tried Lossless Scaling and it's NOWHERE CLOSE to the visual stability, performance, control responsiveness, and frame delivery of DLSS 4. If you've been impressed by Lossless Scaling, you'll be blown away by DLSS 4. Plain and simple.
 