Lossless Scaling. Machine learning Frame Generation and upscaling on ANY GPU!


This is something I mentioned in the FSR3 thread, but according to the Lossless Scaling author, they use machine learning and not FSR3 for this tech, so I decided to create a new thread.

With Lossless Scaling you can have Frame Generation in any game in your collection, just like when this app added FSR1 support for any game years ago, and it worked. I purchased it for 1€ or 2€ about 3 years ago to add FSR1 to ALL my games, and it did a good job upscaling games from 1080p internal resolution to my 1440p monitor at the time. Despite the flaws of FSR1 that was huge for me, 'cos I had a dying GTX 1080 and a puny GTX 1060 3GB which could barely run most games anymore (VRAM).

Nowadays it offers other upscaling solutions too, like their own machine-learning-based LS1, FSR1, nVidia's NIS, Anime4K..., and it works with video and desktop resolutions as well.

Lossless Scaling Guide | Frame Generation & Upscaling In ANY Game (6 min video).

So... I tried it by launching The Witcher 3, a game that is limited to 33-34fps with RT Ultra on my Ryzen 3700X CPU, and it was my first experience ever with Frame Generation.

I was skeptical at first 'cos I had used Lossless Scaling -and similar solutions like Magpie- to have FSR1 in all my games, and in some very specific games VRR kinda broke at times, but yes, it was indeed a universal FSR solution.

Gotta admit that when I launched The Witcher 3 and set it to 30fps max, and then enabled Frame Generation and saw the game running at a flawless 60fps I was awestruck. Totally worth it, and I didn't see any artifacts. Another game I want to try is Elden Ring, but at 120fps :mrgreen:

It works, and I didn't use the upscaling options of Lossless Scaling at all, since The Witcher 3 has native XeSS support. Can't wait to try it in quite a few games, like Shadow of the Tomb Raider (native XeSS support, plus it can lock to half my monitor's refresh rate, which is ideal for Lossless Scaling FG; I never got to play Shadow of the Tomb Raider at 164fps, the max is like 130fps on my CPU).
tested with Shadow of the Tomb Raider using Half Refresh Vsync. It doesn't work like with The Witcher 3, mostly because at 1440p I don't get a solid 82fps.

At 1080p I get between 105-130fps, so not enough for a solid 165fps on my monitor, but enough for an 82fps half-refresh lock.

At 1440p I get 77fps average, so not enough either for 82fps, half the refresh rate of my monitor.

Without an fps limiter like Rivatuner, Lossless Scaling decreases the framerate dramatically: at 1440p the fps go down from 77fps to about 60fps.

At 1080p, those 100-something fps also go down to near 60fps. I thought it happened 'cos of the extra processing Lossless Scaling might need, but no: whatever you do, without an fps limiter Lossless Scaling seems to try to run the game at 60fps. So 120Hz would be the best option for my monitor, given the numbers, or 82fps with a framerate limiter at 1080p.

Didn't have many issues with frame pacing though, because I enabled VRR support in the app settings.
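The half-refresh arithmetic above can be sketched as a quick sanity check. This is my own illustration, nothing from the app itself, and `fg_target_ok` is a made-up helper name; the numbers are the ones from my Shadow of the Tomb Raider tests:

```python
# Quick check: can a given average framerate feed frame generation up to a
# target refresh rate? FG doubles the base framerate, so the base must hold
# at half the target refresh rate.
def fg_target_ok(avg_fps: float, refresh_hz: float) -> bool:
    """True if avg_fps can sustain 2x frame generation up to refresh_hz."""
    return avg_fps >= refresh_hz / 2

# Numbers from the tests above:
print(fg_target_ok(105, 165))  # 1080p low end vs 165Hz -> True (needs 82.5)
print(fg_target_ok(77, 165))   # 1440p average vs 165Hz -> False
print(fg_target_ok(77, 120))   # 1440p average vs 120Hz -> True (needs 60)
```

Which matches the conclusion: 120Hz is the realistic target at 1440p, while 165Hz only works at 1080p with an 82fps cap.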
are there already TVs with an AI frame generator ?
For what purpose? For gaming? Frame generation shouldn't be done on the final frame. For video? To go from 24fps to 60fps for example? That would probably produce a weird result to watch.
TVs have had frame generation for years as 'motion smoothing', that feature that makes movies look like TV programmes! Dunno if any have moved on to ML solutions, but I thought the existing methods were fairly robust and there isn't going to be any noticeable difference for TV content?
for TV content there is going to be a difference; for gaming content there is too, but it's not as effective as with video. You still notice the 30fps, especially when rotating the camera.

Just like TVs added Freesync, and now that FSR3 is open source... they could add FG on TVs in the future.
so.... a colleague of mine has tried Lossless Scaling on an old humble laptop with an i5-6200U and a GeForce 920 (2GB VRAM), using ThrottleStop 9.6 so the CPU doesn't throttle, and he managed to run Captain Tsubasa: Rise of New Champions at 33fps. :)

He set both LS and the game's exe to run with admin rights.

It might not look like much but before using LS, the game ran at 10fps or less (he enabled one of the built-in upscalers too). So yeah, it works.
i tried frame creation with my TV at the time with the Matrix demo; yeah, it looked like 60fps, but there was of course control latency and sometimes it would break up.
Still, it was better than what I had expected.
BFI or Black Frame Insertion? Kinda works for me too, but when I rotate the camera there is no difference for me between 30fps and 30fps + BFI.

On a different note, new update of the app. Haven't tested it yet, but the stuttering in some games seems to be gone and the framerate is no longer limited.

Found this video in Portuguese explaining the new version of the app. To use it just go to Properties - Betas and select beta-beta.

after quite a few hours of testing, especially in Resident Evil 4 Remake: while it's neither a panacea nor perfect, and native 60fps/120fps/whatever is more fluid, I'd certainly recommend this utility.

Use it in admin mode, enable HDR if you play games with HDR, and along with HDR, leave the Draw FPS setting on so you know when it's working. I use it quite often to let the GPU rest and run games internally at 30fps; that way the GPU fans are dead silent and no heat comes from them.

On RE 4 Remake it works more like the TV's Black Frame Insertion (BFI): okay, but it doesn't always feel like 60fps. In games like The Witcher 3, though, it gives you a similar experience to 60fps for the most part, and you can go crazy with the raytracing settings if you are CPU limited (Ryzen 3700X) like me.

It's free FPS and it doesn't affect the image quality so I think it's an okay solution.
Lossless Scaling + Black Frame Insertion (BFI) from the TV + Intel's Smooth Sync (disabling Vsync in the game's menu), is working very well for me.

I set the games at 30fps, consuming little energy and processing power, and I'm cruising.
new version available! Just opt into the beta-beta branch in this app's Steam properties.

It's faster and uses DXGI to capture frames, improving performance. Also, the app can now be tied to the game's framerate so you don't need to lock the framerate, though locking is still recommended for the best experience.

2.6.0 released
- New DXGI frame capture API, that should fix most performance issues, also allows the LS to be tied to the game's framerate and simply double the number of frames when frame generation is used. For the best LSFG experience, it is still recommended to lock the game's framerate to half the LS refresh rate (which can be limited by the vsync option). However, you can now also lock the game at any framerate or not lock at all, while LS will do its best to ensure the correct framepacing.

The LS FPS counter should be twice the game's frame rate, this will indicate that it is working correctly.
If the game is locked at a framerate higher than half the monitor's refresh rate, LS will still render twice as many frames, and the extra frames will be discarded before being displayed on the monitor (not recommended, but possible). Also, in this case, do not use the vsync option, as it will limit the framerate of the LS to the refresh rate of the monitor.
If the GPU fails to generate frames in time, LS may report the correct FPS, but delayed frames will be discarded before being displayed (this can be verified in presentmon).

To use the DXGI API, Windows 10 version 2004 (20H1) or newer is required.

If you have the black screen issue when using DXGI capture on a system with multiple GPUs, make sure LS is running by default on the GPU connected to the monitor (that's the iGPU on laptops). Go to System - Display - Graphics and set LS to power saving mode. Then in LS, select the desired GPU.

- Full support for devices with non-native display orientation when using DXGI capture API.

Known issues:
When using the DXGI capture method, the LS window itself cannot be captured by any program. Radeon software is the only known way to record this for now.
The cursor renders at the game's frame rate, so if you scale a static window, you'll have 0 FPS and no cursor movement.

- The capture API that was the primary in LS 2.5 and earlier is called WGC. The previously mentioned limitations when using frame generation still apply. This API captures game frames at DWM framerate regardless of game framerate. Therefore, for correct framepacing, it is necessary to lock the game to half the framerate of the LS (which can be limited by the vsync option).

- The GDI capture API was formerly known in LS as the "Legacy Capture API" option. Can be used for very old games that newer APIs cannot capture.

- Minor UI redesign: "Windowed Mode" and "VRR Support" options moved to Legacy section and are no longer supported, but still work. "VRR Support" known to completely break the frame generation and is forced to off when LSFG and DXGI are used.

If the capture method is not supported or failed for some reason, automatic switching to the other one will not occur. The capture API should be explicitly chosen to avoid further confusion in reports.
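The framerate relationships in the changelog above (LS renders twice the game's framerate; frames beyond the monitor's refresh rate are discarded before display) can be sketched like this. My own illustration, not the app's code; `ls_output` is a made-up helper name:

```python
# Sketch of the LSFG framerate relationships from the 2.6.0 notes:
# the LS FPS counter reads double the game's framerate, and anything
# above the monitor's refresh rate is generated but discarded.
def ls_output(game_fps: float, refresh_hz: float) -> dict:
    rendered = game_fps * 2                 # what the LS FPS counter shows
    displayed = min(rendered, refresh_hz)   # capped by the monitor
    return {"rendered": rendered,
            "displayed": displayed,
            "discarded": rendered - displayed}

print(ls_output(82, 165))   # ideal half-refresh lock: nothing discarded
print(ls_output(100, 165))  # locked above half refresh: wasted frames
```

This is why locking to half the refresh rate is the recommended setup: anything above it just burns GPU time on frames that never reach the screen.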
my new tactic when I use my 165Hz monitor, not the TV: lock games at 82fps with Rivatuner at a global level, and use that cap to play the games my PC cannot run at the full refresh rate, together with Intel's Smooth Sync.

Most of those games easily run at 60fps, and 82fps is not that far from 60fps. Then I use FG to play at 165fps, and the truth is it works; it's like Black Frame Insertion but a little better. It's not like native 165fps, but it's decent.

For everything else, master graphics card, :mrgreen: and I play at 165fps everything that my computer can run with ease, which is not everything, but it's fine.
Microsoft are introducing some form of upscaling into DirectX

DirectSR is an API to sit in front of all the vendor-specific upscaling solutions such as DLSS, FSR, and XeSS.
Lossless Scaling adds 3X frame interpolation in a new update. It effectively triples the framerate by generating two intermediate frames. The creator recommends a minimum base framerate of 30fps for 1080p and 40fps for 1440p.
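As a rough illustration of what 3X interpolation means for frame timing (my own sketch of the timing, not the app's actual algorithm; `interpolated_timestamps` is a made-up name): for each pair of real frames, two generated frames land at 1/3 and 2/3 of the interval, tripling the output framerate.

```python
# Illustrative timing for 3X frame interpolation: each real-frame interval
# gets two generated frames at 1/3 and 2/3 of the gap.
def interpolated_timestamps(base_fps: float, n_real_frames: int):
    dt = 1000.0 / base_fps  # ms between real frames
    times = []
    for i in range(n_real_frames - 1):
        t0 = i * dt
        times += [(t0, "real"),
                  (t0 + dt / 3, "generated"),
                  (t0 + 2 * dt / 3, "generated")]
    times.append(((n_real_frames - 1) * dt, "real"))
    return times

# 30fps base -> a frame every ~11.1ms, i.e. an effective ~90fps
for t, kind in interpolated_timestamps(30, 3):
    print(f"{t:6.1f} ms  {kind}")
```

So a 30fps base becomes ~90fps output, which is why the recommended minimums matter: below them the real frames are too far apart for the generated ones to hide the gaps.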

tested the new version with Callisto Protocol -PC Game Pass-, using 30fps as the game's base framerate. I didn't expect it to work that well at all. No stuttering and the framerate is actually tripled. Colour me surprised.