Lossless Scaling. Frame Generation x20 (2025 update) using AI and upscaling on ANY GPU!

Interesting testing, keep it up!
it seems to be the case that LS's resolution scale ONLY affects the generated frames, not the original frame. It makes sense, since the original frame has to be rendered by the game's engine. I use no upscaling method in LS at all (if anything I'd use Integer Scaling, which I already do in the Intel panel).

I noticed this because decreasing the resolution scale via the game's own settings results in an obviously blurrier image in games that offer that option. But using the same resolution scale percentage in LS didn't make the image blurrier.

If you lock the framerate to 41fps and it's stable, the FPS indicator reads 41/130, 41/140, or similar, which means the base game is running like a charm...

But to achieve that ideal 41/164 is a whole different story, sometimes the GPU can't keep up with the quality of the generated frames, it seems.
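The arithmetic behind those indicator numbers is simple. Here's a quick sketch (the function names are just for illustration, not anything from LS itself): the left number is the base framerate the game renders, the right one is base times the FG multiplier, which you want at or just under the display's refresh rate.

```python
# Sketch: why a 41fps cap pairs with a 165Hz panel under FGx4.
# "41/164" in the LS indicator = base fps / generated output fps.

def fg_output(base_fps: float, multiplier: int) -> float:
    """Output framerate after frame generation."""
    return base_fps * multiplier

def base_cap_for(refresh_hz: float, multiplier: int) -> float:
    """Base framerate cap that would fill the panel exactly."""
    return refresh_hz / multiplier

print(fg_output(41, 4))       # 164 -> the "41/164" case
print(base_cap_for(165, 4))   # 41.25, so a 41fps cap leaves a little headroom
print(base_cap_for(165, 3))   # 55.0 -> a 55fps base under FGx3
```

Capping the base slightly below refresh/multiplier is what keeps the indicator "solid" instead of bouncing around.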

So I sometimes had to resort to decreasing LS's resolution scale, which made me unhappy because I thought it affected the game, but it doesn't. I didn't notice increased blurriness in games like FH5 and Bright Memory: Infinite. 🤷‍♀️

This is a curious way to save resources without the image looking like crap.
 
I wonder if the Resolution Scale option in Lossless Scaling only affects the FG-generated frames, because it certainly increases performance when you decrease it, yet the game still looks clean even as low as 25% Resolution Scale, as if the actual frames rendered by the game aren't upscaled at all, but native. 🤔
I think it's the resolution that the frame generation uses to generate the frame, not the generated frame.
 
what do you mean? Excuse me if I don't understand you. Aren't you saying the same thing as I am? The actual "native" frame from the game is decoupled from the generated frames in LS. I don't use any upscaling method in LS. I expected that at 25% resolution scale the game would be a mess, but it looks crisp.
 
The patch notes refer to the resolution scale as applying to the input resolution. They don't really specify, but to me that implies that instead of calculating the generated frames from native, they are calculating them from the lower-resolution input, while still outputting at the output resolution. In your case, since you are not using Lossless Scaling to scale resolution, the traditionally rendered frames are native, your output is native, and the generated frames are output at that resolution as well - which would explain the sharpness you described earlier - but they are generated using fewer samples from the traditionally rendered frames.
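To put rough numbers on why a lower Resolution Scale saves so much work, here's a back-of-the-envelope sketch. It assumes the percentage applies per axis, which the patch notes don't actually confirm; if it applied to total pixels the numbers would differ.

```python
# Sketch: pixel count the generation pass works with at a given Resolution
# Scale, ASSUMING the percentage applies per axis (not confirmed by LS docs).

def scaled_pixels(width: int, height: int, scale_pct: float) -> int:
    s = scale_pct / 100.0
    return int(width * s) * int(height * s)

native = scaled_pixels(2560, 1440, 100)   # 3,686,400 px
quarter = scaled_pixels(2560, 1440, 25)   # 640 x 360 = 230,400 px
print(native // quarter)                  # 16 -> roughly 1/16th of the work
```

Under that assumption, 25% scale means the generator samples a buffer with one sixteenth of the pixels, while the displayed game frames stay native.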
 
hmmm, that's an interesting take. I haven't noticed any artifacts from decreasing the Resolution Scale, which puzzles me. I wish the developers clarified that in an FAQ or something similar.

On a different note, regarding the Max Frame Latency setting: the recommendation used to be 1 frame for NVIDIA GPUs and 3 frames for AMD GPUs, but it now seems to be 3 for everything, including NVIDIA GPUs. (I was just using 1 all the time.)

It is on the dev's Discord.

 
when you enter their Discord for the first time you are referred to an FAQ that includes this video on configuring the app. The video is from a guy with a 480Hz monitor who gives good usage tips (why capping the framerate is a good idea for both the app and the GPU, and so on).

I like it when he talks about the phantom array effect.


The video covers the previous version of LS, but there are some good tips there anyway, and he mentions the added motion clarity, which is important. Image quality is much higher at, say, 240fps than at 60fps even at the same settings, which he demonstrates with images.

Btw, @GhostofWar he plays F-Zero using Retroarch in the video.
 
new update!

Fixing the LSFG 3 bottleneck on Intel iGPUs (HD, UHD, Xe) boosts performance by a massive 100-200%, with smaller but noticeable improvements for other vendor iGPUs like Vega and RDNA, as well as some dGPUs.
Added Danish language
 
new update tested. Performance has increased dramatically on dGPUs like Intel Arc.

In Bright Memory: Infinite I got a solid 41/164fps before (FGx4), but at the same settings with a 55fps base, a stable and solid 55/165fps was impossible to achieve; drops to 55/161 or 55/162 were common.

Now the fps indicator shows a super solid 55/165fps all the time.

What surprised me the most is that I tested Contra Anniversary Collection too, and in the previous version when I unlocked the framerate, the game capped at 213fps. With the new update it caps at 400fps! 😮

In those 2D games I set LS to FGx10 'cos there were artifacts from FGx11 onwards. Now I can get 2D games running at FGx17 without artifacts thanks to the performance increase. 😊
 
I think it's the resolution that the frame generation uses to generate the frame, not the generated frame.
it seems that the mystery regarding what Resolution Scale means has been solved. The author of Lossless Scaling has replied to the creator of the video below.

LosslessScaling UPDATE is so good that it makes no sense. From 48 to 144fps! (MFG without RTX 5090)


And he says the following in the highlighted comment:

Thank you for checking it out! I can see how the term "Resolution Scale" can be confusing for many, so I'm planning to rename it. It's not related to image scaling at all, it only affects the motion flow resolution.

He seems to be referring to optical flow, i.e. the estimated apparent motion between consecutive frames.
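As a toy illustration of "motion flow resolution" (this is NOT LSFG's actual algorithm, just a stand-in): motion between two native-resolution frames can be estimated from downscaled copies of them, and the recovered motion still applies at native scale. That's consistent with the frames themselves staying sharp while only the motion estimate gets coarser.

```python
import numpy as np

# Toy sketch: estimate motion between two NATIVE frames using only
# DOWNSCALED copies, the way a lower Resolution Scale would.

def downscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Box-average downscale by an integer factor."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def estimate_shift(a: np.ndarray, b: np.ndarray, max_shift: int) -> tuple:
    """Brute-force search for the (dy, dx) that best aligns b to a."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.mean((np.roll(b, (-dy, -dx), axis=(0, 1)) - a) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

rng = np.random.default_rng(0)
frame1 = rng.random((128, 128))
frame2 = np.roll(frame1, (8, 4), axis=(0, 1))   # true motion: 8 down, 4 right

factor = 4                                       # like a "25% resolution scale"
dy, dx = estimate_shift(downscale(frame1, factor), downscale(frame2, factor), 4)
print(dy * factor, dx * factor)                  # recovers (8, 4) at native scale
```

The estimate gets coarser as the downscale factor grows, which would presumably be where artifacts at very low Resolution Scale come from.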

One of the comments on the video says: "Lone developer is running circles around a 3 trillion dollar company". 🙂
 
Btw, @GhostofWar he plays F-Zero using Retroarch in the video.
I actually tried F-Zero when trying to get it going, so it's definitely a me problem by the looks of it. Instead of gutting my current RetroArch config I'm going to create a clean install and work my way forwards to see if I can figure out the problem.
 
you can use 2 GPUs (on laptops, desktops, etc.) with Lossless Scaling o_O: one to run the game and the other just for the FG frames.

You just have to select the best monitor you have as the output display and have your best GPU run the game (i.e. on a laptop the iGPU and dGPU can work together to offload the work).

Your second, less powerful GPU generates the extra frames. With that, the added delay is almost non-existent for competitive games and you get all the benefits of FG.

I found this out in the video below. The guy runs the actual game, Cyberpunk 2077, on an RTX 3080 with a 480Hz OLED display, and uses an RTX 3050 to generate the frames with LS.

 
How it works using 2 monitors: if you have a second GPU in your computer, you can configure Lossless Scaling to handle FG while your, say, RTX 4070 focuses entirely on rendering the game at full utilization. Here's how to set it up:

  1. Connect your main monitor to the secondary GPU and your secondary monitor to the RTX 4070.
  2. In Windows, set your main monitor (connected to the secondary GPU) as the secondary display and your secondary monitor (connected to the 4070) as the primary display. This ensures games default to rendering on the RTX 4070 and the correct monitor.
  3. In Lossless Scaling, set your preferred GPU to the secondary GPU and the output monitor to your main monitor (connected to the secondary GPU).
With this setup, you'll see the game displayed twice, but the performance boost is worth it. Lastly, make sure the RTX 4070 is set as the default graphics device in Windows.

Yes, it's a bit of a hassle LOL (dual-GPU setups always are) but the results make it worthwhile!
 
55/165fps stable in Quake II RTX, with path-tracing effects maxed out and 40% resolution. I preferred to play it safe, but I think I could get to 50% or 60% fixed resolution. At 1080p I could maybe run it at 100% at 360Hz.

[screenshots: in-game fps overlay]
 
On average, what percentage of GPU power does Lossless Scaling 3 use?
 
a couple of tips:

- in the options (bottom-left gear icon) set Lossless Scaling to run with admin rights (it will perform better)

- in RivaTuner's setup, set the "Enable framerate limiter" option to NVIDIA Reflex (by default it says async).

- if you have a monitor with mediocre HDR like mine, disable it. HDR adds a bit of input lag and VRAM footprint, and it isn't worth it unless you have a display with great HDR.

The good thing about DLSS 4 and LS is that monitors once considered esports-only, which often never reached their full potential even in esports because you needed a super machine to hit those framerates, have become universal. Those high framerates are going to benefit esports and single-player games equally, thanks to the motion clarity. :)
 

If we don't talk about how GAMING has just changed FOREVER, I'll explode​


is the title of the video below by Billy Cherokee, an articulate young Spanish guy. It's one of the best videos about LS.

He doesn't talk that much about LS itself, but he covers all the new technology in gaming and, of course, NVIDIA.

He talks about AA, what NVIDIA is doing, Reflex 2, and so on.

He explains how AA works now (different types of it, too) and why ultra-realistic games like Need for Speed 2015 are rarer to see.

He explains a lot of things we take for granted, but in a way that ANYONE can understand the concepts behind all kinds of AA, FG and the like.

He does that in a very casual but also Digital Foundry-like way. He managed to surprise me; I learnt a lot from that video. For anyone interested, use subtitles.


p.s. he built his mother a computer that cost him €300 using second-hand components and a 6700 XT, and he noticed that videos and games on his mother's computer ran as smoothly as on a €1000+ second-hand computer, which was mind-blowing for him.

He is a hardcore racing-sim guy with a 480Hz monitor who holds several records in racing games; he uses good wheels and so on.
 