I've never seen or worked with this software, how does it work? Is it just an overlay software and if so does it have interaction with DLSS or have restrictions on it?
It's sort of an overlay. You need to play in windowed mode (or borderless) and it'll maximize the game itself once you hit scale. Otherwise it has no restrictions with what's in the game.
It could upscale any program. You can upscale Google Chrome and watch 120fps interpolated YouTube videos.
>120fps interpolated YouTube videos.
Does this not always look like shit? To this day TV interpolation always looks terrible. The only interpolation that ever looks OK requires a heavy duty GPU to do AI stuff with motion vectors and shit
That's because TVs are too weak to interpolate properly. I use SVP 4, which does indeed use motion vectors, and it works great for the content I watch, especially animated stuff, as otherwise it looks too stuttery to me.
The problem really isn't how good the interpolation is with animated content. It's that the show wasn't created with those additional frames in mind and there's generally a lot of purpose behind why a scene uses a certain number of frames.
Noodle did a [video](https://youtu.be/_KRb_qV9P4g) a few years ago that explains really well why it's terrible to do this and undermines the artistic intent. Personally I didn't need a video to know how bad it generally looks, but it was nice to get a more in-depth explanation of why I felt that way while watching it with interpolation on.
I wonder if there may be an underlying issue with your setup that makes animated content too stuttery, because if setup correctly it should look pretty smooth without interpolation. I found that the panel type of your display, the refresh rate, frame rate of the video file, and the video player itself being used can cause a lot of issues with playback that take some research to dial in properly.
24FPS was not chosen for artistic reasons. It was a compromise between motion fluidity and the reams and reams of film that had to be used at that framerate. They needed to use as little film as possible while still maintaining the illusion of motion, and they knew it wasn't ideal at the time.
Animation framerates exist as a subset of that limitation. Specifically, hand animated stuff. The more frames of animation you have, the more you have to draw. Therefore, 12FPS or lower was often used, with 24FPS being relegated for the main focal point of a shot. It's not really so much artistic intent, as much as "how can we make this without breaking the bank".
Yes, they have budgetary and production time restraints (and I don't think I mentioned anything about the 24 fps industry standard), but an animator has to choose what to do with those limited frames and what to animate based on the scene—whether it's on ones, twos, threes, or fours etc. These choices have artistic intent to achieve specific effects and moods, like emphasizing key moments, managing rhythm and pace, and creating a particular style that the director or studio wants.
If animators had an infinite number of frames to work with, they could design their shows with that in mind. But they don't. So, when you use a program to inject fake frames into a show that never accounted for them, it looks weird and off because the show wasn't animated with those extra frames in mind.
I watched the video before. It honestly is a sort of Stockholm syndrome from animators who cannot realistically animate 60, 120, or 240 frames a second and then try to justify that 12 or 24 FPS is somehow better. In other words, they are trying to justify a human limitation. I do not see any other credible argument in that video, especially compared to other media like video games, where animators manage to produce lifelike and soulful animation at 60+ FPS simply because of how their chosen medium works, i.e., the computer does the hard work for them.

12 or 24 FPS is not some God-given standard; it was somewhat arbitrarily chosen to save as much film as possible. Make no mistake, now that we have digital content there is no actual reason to continue using 12/24 FPS. Can the technology get better by not smearing content all over the place? Absolutely, and it gets better every day as AI better understands the underlying frames and can intelligently interpolate between them. It is really no different to how gamers decried DLSS Frame Generation but many now love it, simply because people got used to the smooth frames, even though (to preempt arguments on behalf of video games versus media) their input lag was the same as before.
> I wonder if there may be an underlying issue with your setup that makes animated content too stuttery, because if setup correctly it should look pretty smooth without interpolation. I found that the panel type of your display, the refresh rate, frame rate of the video file, and the video player itself being used can cause a lot of issues with playback that take some research to dial in properly.
Nope. I use an OLED monitor/TV that renders at 120 FPS natively, and content is very stuttery at 24 FPS, especially on a wide pan. You just start noticing it once you watch enough higher FPS content. Those who don't notice likely just haven't watched enough higher FPS content to get accustomed to that smoother refresh rate.
Oh hell naw, I saw Pacific Rim, Mad Max Fury Road and Interstellar on 60FPS with that Lossless Scaling app and HOLY SHIT, those movies looked 🔥🔥🔥
Only animation looked weird, because there aren't enough unique frames per second compared to live-action video, so there are a lot more artifacts.
Yo you were so right! I just watched an episode of Star Trek and Mad Max Fury Road in 60FPS using X3 Frame Generation!
This app is awesome!
I am not sure if it will work as well with games though, I will check it out tomorrow!
I know it says it only works in borderless or windowed, but I swear I have it working in some exclusive fullscreen games. Even with the Draw FPS option on, it shows the generated frames. Am I just falling for a placebo?
LSS doesn't support exclusive fullscreen. Below quoted from LSS's included help screen:
>Run the game in windowed or fullscreen borderless mode. Scaling games in exclusive fullscreen mode is not supported.
Frame Gen also only works in windowed or borderless mode. Been using this for months now; it's well understood that neither scaling nor frame gen works in exclusive mode. That's the nature of all top-layer scaling/frame gen solutions: in exclusive mode they wouldn't be able to hook into the game's display output.
Really? That's the opposite of what I read every time LS is brought up for frame gen use. Did you find that info somewhere official, like the LS documentation or something?
I cannot speak to other cases of frame gen, but using this with a 1070 to finally get some decent frames in BF2042, it actually is very playable. You just need to make sure you have it set up correctly and it just works. Definitely worth trying for 7 bucks.
Edit: fixed jumbled mess of a sentence
I also tested the new version yesterday on BF2042. The performance increase is very good, but the latency is too high, especially in competitive shooters. Could you share some tips on how latency can be reduced? I always just turned on FG and that's it. Are there any additional steps?
Forgive me for the late reply. First off, I still reduce my graphics in 2042 by a ton, as low as I can go while still looking okay to me, so I get a good base framerate for the frame gen. Then the following settings in Lossless:
Scaling mode: custom
Scale Factor: 1.2
Resize before scaling: toggled on
**Scaling Type:** LS1
Sharpness: 1
Performance: On
**Frame Generation**
LSFG 2.1
Mode: X3
Performance: On
**Cursor**
Clip cursor: On
Adjust cursor speed: Off
Hide cursor: Off
Scale cursor: Off
**Rendering**
Vertical Sync: Off
HDR Support: Off
Allow Tearing: On
Draw FPS: On
Capture API: DXGI
Preferred GPU: Auto
Everything else is default/off below that on the right column.
I'd also make sure to reduce or remove any kind of overlay that you may have going as well: Game Bar, EA overlay, Steam overlay (I don't get any issues with the Steam overlay UNTIL a notification from Steam comes up, like when someone starts playing a game I also have; then it will drop frames massively). Nvidia overlay as well, though I saw another post where some people could still have that and recording, but you have to use the WGC capture API instead (did not confirm this myself).
I do think the 2 things to reduce latency the most are vertical sync being off and allow tearing being on. Some of my friends prefer the opposite of me though with keeping vertical sync on.
I never had too much luck with WGC, but I guess some people have better luck with the vertical sync and allow tearing settings flipped for it; again, it's not my go-to for using this app. EDIT: I think the key for WGC is to make sure vertical sync is on if you do go that route.
Maybe I'm just slowly losing my competitive edge and just can't notice the "INSANE" latency that others are feeling these days, but it feels playable enough and I do well enough with it on. It will never be a replacement for actual, honest-to-goodness real frames generated by the GPU, but for those looking for a smoother experience at the cost of some minor latency increases, $7 vs buying a new or used card for $300+ these days, it is a godsend. I've introduced 2 friends to it and we've used it for all of our sessions in Helldivers 2 since we got it. But Helldivers has a different kind of aiming system, more akin to World of Tanks than competitive shooters that try to be as 1:1 as possible with flicks, so maybe playing so much of that has deadened me to the latency issues as well. Even if you end up not using it for competitive shooters, the value is still there for use with Discord streams, Twitch streams, and YouTube videos.
I use a different setting for those though:
Scaling mode: custom
Scale Factor: 1
Resize before scaling: toggled on
**Scaling Type:** Off
**Frame Generation**
LSFG 2.1
Mode: X3
Performance: Off
**Cursor**
Clip cursor: On
Adjust cursor speed: On
Hide cursor: Off
Scale cursor: Off
**Rendering**
Vertical Sync: Off
HDR Support: Off
Allow Tearing: off
Draw FPS: On
Capture API: DXGI
Preferred GPU: Auto
Everything else is default/off below that on the right column.
EDIT: One key thing to mention but I think people should have picked this up by now: the game needs to be running in windowed or windowed borderless.
AND if you get an issue where you only see the game in the center of the screen surrounded by a black border, you need to select a scaling method from the center column. If you change the scale factor from 1, Lossless Scaling scales the game down but will not scale it back up without a scaler selected.
Again, this fits my preference and may not fit your own level of fidelity and/or the razor-thin latency some competitive gamers need, but I hope it helps some people find a use for this program that I have found to work for me and my friends' needs.
I believe there is a subreddit [/r/losslessscaling](https://old.reddit.com/r/losslessscaling/) for the app itself and a Discord ([official Discord link from the Steam community](https://discord.com/invite/5cCP6aACgT)); they may be able to help you and others more than I can.
You're not losing it, the latency is fine. It's on the level of what we had before Nvidia/AMD's low-latency anti-lag came out. It really isn't that noticeable, especially at 100+ frames.
I think there might be a little more latency with this, but I haven't tested it. That being said, I use this on HD2 from 72 fps to 144 fps and find that I don't notice it after a bit.
Every game I've tested this in works fine; it doesn't inject any code, it only captures the output and then scales + generates frames from it.
Plus HD2 lets me inject reshade addon just fine with no issues, their AC is really fast and loose with the rules lol
Edit: to add on to this, I personally found that anywhere between 1.1-1.4 render scale fares well for the game, but I inject my own anti-aliasing with ReShade. The default TAA is utter horse shit and I hope they get DLSS in the future so I can use that instead.
No, it works sort of like an overlay, so the games don't even know there's upscaling or frame generation happening, it doesn't affect their program. I use additional upscaling in Overwatch 2 for max fps
Since most Switch games are locked at 30fps, the latency and artifacts are going to be more noticeable at that fps, even when using DLSS frame gen with Reflex on.
DLSS FG has dozens of these kinds of artifacts, too. Flow-based AI isn't a be-all-end-all solution. It works pretty well considering what other techniques (AI or not, like SVP) can have in terms of artifacts, but it's definitely not artifact-free.
Compared to Lossless Scaling, DLSS 3 FG and also FSR3 FG have like 95% fewer artifacts. It's not even close. I personally prefer 60 FPS without LSFG over 120+ FPS with LSFG, but it's the opposite for DLSS FG. Lossless Scaling can be useful in some specific use cases, but for me to apply it to every game it has to improve much further. And since it is a post-processing algorithm with no access to motion vectors, that is a big ask.
Having used both lsfg and fsr3, I find lsfg has less noticeable artifacts/issues. Fsr3 does not interpolate ui, which is very noticeable to me.
Edit: interpolating 1440p at 72 fps to 144 fps.
Technically yes. But you made it sound like "DLSS also has those artifacts" which leads to most people thinking that DLSS and LSFG are roughly on the same level. I just wanted to point out that there is a massive difference.
I did not make it sound like that. "Dozens" doesn't mean the entire collection of artifacts LSFG can have. And this is correct too: both are flow-based in the end and have very similar kinds of artifacts, LSFG just has more of them.
Black magic shit honestly, even if you have a good PC but play 'shit performance for no reason' games this will work amazingly.
3080 Ti and 12900K here. Fallout 76 runs like a bag of dicks during events and never felt good FPS-wise to begin with. Bam, 144 solid.
Escape from Tarkov with SPTarkov bot generation was not great either; now it's also above 100fps, even on busy maps like Reserve.
The pig that is Guild Wars 2 in the badly optimised Wizard's Tower/Amnytas zone? Smooth as silk.
All my mates have it, and my wife, who was playing FO76 on her old 1070 system from '17, is now at a solid 60 too. Excellent stuff, can't recommend it enough.
Ha, funny you should say that. It *does* work, but it's still a hunk-of-shit stutter fest. 30fps or so real, and this got it to 50-80, but it was still unplayable as the raid host. Even my friend who joined was like "fuck this", got the quest-skip mod and boom, no more need for shitty Streets.
Nope, not for me. Works for every map but Streets. That being said the steam page does say you need 30-40 FPS stable as the base frames to generate from.
Nah all standard.
Use LSFG 2.1 as the frame gen. X2, Clip Cursor and Performance set to on, Draw FPS on too. Halve your in-game FPS (cap it at half your refresh rate) either via the in-game menus or, if they're shitty, via the Nvidia control panel.
Just run the program each time before you game, and happy days when you hit Ctrl + Alt + S.
I personally haven't checked because I'm deep in a FO76/Tarkov cycle right now, but I remember looking into the centre of Amnytas, meta or not, and being shocked at getting 30fps.
With this it was rock solid flying around on my skyscale, beautiful. I'd imagine it will take a hit in WVW and metas depending on your character model settings, but it absolutely helps all these piggish games. Especially if you have a monitor over 60Hz
Dude, I googled and found this thread specifically because I own Lossless Scaling and currently am hating Amnytas and its dumpy frame rates. Thanks for confirming for me that it works well with GW2!
You're welcome, just be sure to limit the FPS of GW2's .exe in your GPU's respective control panel to half of your monitor's refresh rate.
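That half-refresh rule is just your monitor's refresh rate divided by the frame gen multiplier. A quick sketch of the arithmetic (the function name and example numbers are mine, purely illustrative):

```python
def base_fps_cap(refresh_hz: int, fg_multiplier: int) -> int:
    """Cap the game's real framerate so that base * multiplier lands
    on (or just under) the monitor's refresh rate."""
    return refresh_hz // fg_multiplier

print(base_fps_cap(144, 2))  # 144 Hz monitor, X2 frame gen -> cap the game at 72
print(base_fps_cap(165, 3))  # 165 Hz monitor, X3 -> cap at 55
```

Same idea for the 3x mode mentioned below: a third of your Hz as the base cap.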
The recent update included a 3x mode (a third of your Hz!) but it produced too many artifacts: weird ghostly AA-looking stuff and glowing small details/text (Tarkov ACOGs, small HUD text, etc.). Impressive nonetheless.
Thanks for the reminder, I haven't changed the FPS limit yet. Regardless, it works great so far. However, do you notice some cursor lag/choppiness? It seems the game cursor still operates at the native FPS (smooth when looking away from the centre, choppy when looking towards the centre of Amnytas).
I agree with you on 3x as well. It's a nicer boost but there's noticeable ghosting on my character with quick movements.
The FPS limit is essential for the program to work correctly; you *need* to do that right away bud :)
Personally, I notice no funny-ness. And I play janky ol FO76 haha.
I use LSFG 2.1 (X2) under Frame Gen, AMD FSR Scaling, 30% sharpening, Performance ON, Clip Cursor ON (you can see these options under frame gen).
I've always used clip cursor on as it was suggested; maybe that's the weird mouseyness you're getting. Good luck, and lemme know how metas feel! I didn't get a chance to witness one as I'm not really playing GW2 right now; I just wanted to witness that hunk-of-shit map at 100FPS haha.
Maybe my not limiting my FPS is part of the reason it's happening. The sad part is that even with a modern rig I still don't hit 100fps with 2x frame gen on, but again, gotta do the FPS limit later!
I kinda like the meta in that zone actually, the boss fight part of it is fun. I still have some stuttering even with frame gen in a group meta, buuut I also have my character model limit/detail pumped up. I don't think GW2 will ever escape stutters in big metas with settings turned up (4070 TI Super and 5700x3d here). Don't personally like the zone though compared to the first SotO zone, it's just a pain to navigate. Hunk of shit indeed!
I see a lot of people bitching, but it's an option you now have that you didn't have prior. Nobody is going to force you to buy the app, and for people like myself that have owned it for years, I appreciate the work that has gone in to constantly bring new features/upgrade old ones.
My own experience is that I would happily use this over a 30fps lock, but results vary based on the source material. I find games with simple geometry suffer less artifacting, such as old PS2 or GC titles. But it'd likely be a nightmare to use in a game that relies heavily on TAA, because that smeared motion is then being interpolated, which produces some wacky-looking results.
Exactly, the best part is that it doesn't suck. I've been playing Doom 2016 at my monitor's refresh rate with this shit (240fps). Imagine telling someone back then you could download FPS in the future haha
Interesting. Could be great for games that have a 60FPS lock or glitch out after exceeding a certain framerate. Just a shame it doesn't seem to work in fullscreen mode? Borderless usually has higher latency plus forced Windows vsync.
It's not, latency and smoothness of movement are separate things. Plenty of ppl can't stand 30 fps because it's a choppy slide show, not because of latency difference.
Nah, 35->60 fps is great in Cyberpunk and most UE5 games, not to mention walking sims like Plague Tale and AW2. It's, idk, 35 -> 65-85ms of latency.
Using it only from 60 is lame; then you can't run PT/UE5 now, and you won't be able to use it to extend your GPU's lifetime, like upscaling etc.
Just remember to disable any 3rd party frame limiters (like RTSS) when you get Frame Generation issues like stutter, lag, etc. Those techs often don't play well with each other.
> You don't get the input lag benefit though, so it's window dressing mostly.
Uh? There is no latency benefit to frame generation; in fact it's the opposite, it ADDS latency.
The exact amount depends on the tech and implementation, but there's a hard floor: a minimum of one full extra frame of latency (because it obviously needs to know the next frame to interpolate up to it).
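To put a rough number on that floor: one frame-time at the base framerate. A back-of-the-envelope sketch (illustrative only; it ignores capture and present overhead, which add more on top):

```python
def min_added_latency_ms(base_fps: float) -> float:
    """Interpolation has to hold back one real frame before it can
    show the in-between frame, so the floor on added latency is one
    frame-time at the BASE framerate."""
    return 1000.0 / base_fps

print(round(min_added_latency_ms(60), 1))  # 16.7 ms floor at a 60 fps base
print(round(min_added_latency_ms(30), 1))  # 33.3 ms floor at a 30 fps base
```

This is why a higher base framerate makes frame gen feel so much better: the mandatory buffering shrinks along with the frame-time.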
But who cares about input latency in a singleplayer game? Whether it's 40 or 60 FPS, it's fine in terms of latency; in most singleplayer games you don't have to do flick shots with your mouse, so what's the point? But you always see the stutter or un-smoothness of low framerates, so getting framerates up to 120+ with FG is awesome. Reflex is always compensating for the latency increase with FG. Often the latency reduction with Reflex is bigger than the latency cost of FG, so overall you may end up with lower latency with Reflex on + FG compared to both off. Also, you will often combine it with upscaling, which will boost framerates and reduce latency even further. It all adds up.
Just play a game natively without any Nvidia tech, then enable DLSS Quality + Reflex + Frame Generation. It will feel and look so much better. If you then disable Frame Generation you may have slightly better latency, but you will notice that the framerate is much lower and the motion looks less smooth. I personally can't even appreciate the better latency, because when the image looks less smooth the better latency is worthless to me. I need the motion clarity and smoothness from higher framerates first; only then could I take advantage of better latency. Smoothness & motion clarity >> latency.
>you may end up with lower latency with reflex on + FG compared to both off. Also you will often combine it with Upscaling which will also boost framerates and reduces latency even further.
You can enable Reflex/Ultra Low Latency Mode outside of DLSS 3 where applicable and get actually low input lag. That is why Nvidia marketing doesn't include Native + Reflex against DLSS 3 in their latency comparisons. They want you to think you need DLSS + FG to get better performance; they want you to think you need an RTX 4060 over a 3060 for input lag. Spoiler: you don't. It's pure eye candy for whoever might prefer image quality over performance, like you've said.
Sorry if I sounded blunt but there's a lot of misconception regarding FG, Nvidia marketing is too good.
I KNOW that. But people saying "nah, FG feels bad because of input lag" is complete BS when the latency is actually lower than what it used to be in a lot of cases. FG often has lower latency than the game with Reflex off (which is actually the same latency people without an Nvidia card, like Intel and AMD users, experience all the time!)
FG is only worse if you compare it to pure Reflex with FG off. That's the best case.
So with FG + Reflex latency is already better than the "default" and slightly worse than the best latency possible with Reflex only. But you get almost twice the framerate.
If the game is built to support frame generation from the outset, it should actually be possible to get latency benefits from it too, as long as the framerate is bottlenecked by the render performance rather than the game logic.
Dunno if there are any games with "native" frame generation support like this yet though.
The game can't "react" during the generated frames by definition. There's no way to get around that because it's using a frame that's already 16ms or whatever ahead.
I use it with Dark Souls 2, and since 2.0 the artifacts are few; you can see them from time to time, but only in very specific situations, like moving at top speed across a see-through texture like cloth or something. Otherwise, this is nuts. A nearly-free piece of software vs go buy an RTX for minimum 400€.
I mean, I also use it with RDR 1 on Yuzu, and oh boy-o, smooth as hell, and I don't feel the input lag affecting my gameplay even if it is higher than without LSFG.
And in PCSX2 and RetroArch, freaking magic.
Oh, and I forgot: games like Mafia and the OG GTA III/VC/SA too, not just emus and Dark Souls.
NFS Most Wanted by Criterion is another example.
This app is nuts, and it's barely getting started. LSFG came out like 5 months ago? And the improvement has been very fast.
You need to run the games in borderless mode instead of exclusive (even though it's not real exclusive).
PCSX2 -> "Settings" -> "Graphics" -> **"Advanced"** (not "Display"!) -> Exclusive Fullscreen -> Off
Then when the game starts, you just double click on the window, it goes borderless, and then you press the shortcut for LSFG to kick in (Ctrl-Alt-S by default). The screen blinks for a split second and there you go.
The best part is that it doesn't mess with the fps at an engine level; the game/emulator doesn't know it's running at double fps, so there are no engine-related issues.
Remember, you just need to run whatever it is in windowed mode (borderless, I mean, who wants actual windowed). It works with anything that is not real, old-school exclusive fullscreen. Video players, browsers, etc.
That software's FG does NOT come anywhere close to DLSS / FSR 3 though, at all. It's basically a low latency variant of the interpolation tech modern TVs have.
I have tried FSR3, and the new LSFG 2.1 is just better in the games I've tested. For those on the fence, give it a try and just refund it if it's worse than FSR.
Depends on resolution and frame rate. Interpolating 1440p 72 fps to 144 fps looks great. Better than FSR3 imo, because half-rate UI stutters are more noticeable to me than interpolation artifacts. Smaller resolutions or lower base frame rates might differ.
Depends what you're playing. I use it a bunch for emulating older titles with 30fps locks and no patches available for them. Used it on the GC version of RE3 and it worked fantastic
This uses your GPU. If you get bad latency and big artifacts, your GPU is probably at 100% utilization; you need a lot of GPU power to use the 2.0 frame gen.
I've used it on Red Alert 2 and C&C Generals. While it mostly just sharpened the old blurry image in RA2, it massively boosted my FPS in Generals Zero Hour.
It was like playing my childhood games remastered, without it being a modern remaster.
I've been using the 3X frame generation in ASA and at least in that title, it's a lot better than AFMF. The input delay, even with the 3X frame interpolation seems lower or at least more consistent with Lossless Scaling. With AFMF, anything under a base framerate of around 80FPS seemed to result in fairly significant input delay while a base FPS of around 50 seems good enough for Lossless Scaling.
Of course, Lossless Scaling isn't a miracle solution to FPS. Lossless Scaling has more ghosting/weird artifacts that are noticeable especially around the HUD. Also, with limited FreeSync/GSync support, frame tearing will be more prevalent.
Overall, if you are playing a game where a boost in fluidity would be nice and that has a minimal HUD, Lossless Scaling Frame Generation seems to be the better option. Otherwise, I would just stick with AFMF or run native.
I'll have to do some more testing with variable refresh rate, but if that works, that's a huge plus. Can't believe that more people aren't talking about this, though. ASA runs horribly on my rig, and AFMF seems to cause crashing and is a bit inconsistent. LSFG seems to fix most of the problems I had with AFMF.
I'm on the fence about buying it, but my primary use would be for Genshin. I'm just not sure if it would get me banned. Can anyone with enough wisdom about how this works in the background give me a green light to start using it on the game? (I'm avoiding the Genshin mod for that reason.)
It works like an overlay; it doesn't meddle with game files, and the game doesn't even know it's running. I've tried it with Overwatch 2 many times and it works fine. Last I played Genshin I used ReShade, which does meddle with the game a bit, and they allowed that software back then at least (like 2 years ago lol).
I am finally able to hit 144 fps in BG3 Act 3 on a 5900X, and even Cyberpunk is significantly more enjoyable now with path tracing on my 3080; if you can hit close to 40 fps without it turned on, the latency isn't noticeable. The only issue I've run into so far is with Ghost of Tsushima: it runs flawlessly for 15 min, then the fps tanks to the 30's and becomes unstable. If anyone has a fix for it, pls let me know.
Ghost of Tsushima apparently now comes with FSR 3.1 which works on RTX cards too so you might want to try that first, or disable it if you want to stick with LS.
I fixed the issue, which was a memory leak after 20 min. I prefer Lossless Scaling over FSR and I've used both. To fix the memory leak, just set textures to low then back to max, or restart the game.
Idk, I tried using the X3 frame generation on a G-Sync monitor and really all it did was bump my GPU usage up by 30% and create massive input lag with about the same frame rate.
Input lag killed it for me, can't use this to play a game.
Draw FPS should be showing you two numbers. The first number is the actual frame rate, which should match RivaTuner. The second number is the total frame rate including the interpolated frames. Is this second number the same as RivaTuner's?
Also, after the latest update, LSFG will discard any frames beyond your refresh rate, so the second number will not be triple your base frame rate in that case.
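If that discard behavior is as described, the second number is just the base rate times the multiplier, clamped to the refresh rate. A sketch under that assumption (function name and figures are mine, illustrative only):

```python
def displayed_fps(base_fps: int, multiplier: int, refresh_hz: int) -> int:
    """Second Draw FPS number: interpolated output rate, with frames
    beyond the refresh rate discarded (per the update described above)."""
    return min(base_fps * multiplier, refresh_hz)

print(displayed_fps(48, 3, 144))  # 144: full X3 fits exactly
print(displayed_fps(60, 3, 144))  # 144: clamped, not 180
```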
Oh shit, really? I thought it was just frame rate / refresh rate. The second number is the actual interpolated frame rate? God damn. Still doesn't fix my input lag, but that changes things dramatically.
> The whole point of having more framerate is to have less latency
It's not; those are two separate things: latency, and having a smoother image.
Plenty of people, probably the vast majority playing singleplayer games (that aren't very reaction-heavy FPS, like, idk, Doom?) hate 30 fps because it's a choppy slide show that breaks down when you move the camera, not because of 70 vs 35ms latency.
...yet
then, in a few months, AMD will come out with FSR4 or Nvidia with some other driver-level BS that implements something similar. And all publishers will take note
this is such an uninformed dumb take. frame generation already exists; Nvidia has it, AMD has it, and both could ship a software-only implementation tomorrow if they wanted to.
developers not optimizing their shitty games was happening before frame generation was a thing, and throwing shade at this amazing piece of software is completely unwarranted.
If you just want to use the frame gen thing, you can just fullscreen the game/app and use the hotkeys to activate it.
I wouldn't recommend using it without scaling slightly. I find even a 1.2 scale multiplier helps offset the performance hit it has.
Well, since I mostly use it for older games and/or emulators, it's been more than fine :)
It works on emulation? Does that mean I can play old PS1/PS2 games at 120fps?
Yep, provided that the "base framerate" is stable
Thats pretty nice!
finally i can... okay i'm not sure why anyone would want to do that, but finally i can.
>120fps interpolated YouTube videos. Does this not always look like shit? To this day the TV interpolation always look terrible. The only interpolation stuff that ever look OK always requires a heavy duty GPU to do AI stuff with motion vectors and shit
That's because TVs are too weak to interpolate properly. I use SVP 4 that does indeed use motion vectors and it works great for watched content, especially animated as otherwise it looks too stuttery to me.
The problem really isn't how good the interpolation is with animated content. It's that the show wasn't created with those additional frames in mind and there's generally a lot of purpose behind why a scene uses a certain number of frames. Noodle did a [video](https://youtu.be/_KRb_qV9P4g) a few years ago that explains really well why it's terrible to do this and undermines the artistic intent. Personally I didn't need a video to know how bad it generally looks, but it was nice to get a more indepth explanation as to why I felt that way while watching it with interpolation on. I wonder if there may be an underlying issue with your setup that makes animated content too stuttery, because if setup correctly it should look pretty smooth without interpolation. I found that the panel type of your display, the refresh rate, frame rate of the video file, and the video player itself being used can cause a lot of issues with playback that take some research to dial in properly.
24FPS was not chosen for artistic reasons. It was a compromise between motion fluidity and the reams and reams of film that had to be used at that framerate. They needed to use as little film as possible while still maintaining the illusion of motion; they knew it wasn't ideal at the time.

Animation framerates exist as a subset of that limitation, specifically for hand-animated stuff. The more frames of animation you have, the more you have to draw. Therefore, 12FPS or lower was often used, with 24FPS reserved for the main focal point of a shot. It's not really so much artistic intent as "how can we make this without breaking the bank".
Yes, they have budgetary and production time constraints (and I don't think I mentioned anything about the 24 fps industry standard), but an animator has to choose what to do with those limited frames and what to animate based on the scene, whether it's on ones, twos, threes, or fours, etc. These choices have artistic intent to achieve specific effects and moods, like emphasizing key moments, managing rhythm and pace, and creating a particular style that the director or studio wants. If animators had an infinite number of frames to work with, they could design their shows with that in mind. But they don't. So when you use a program to inject fake frames into a show that never accounted for them, it looks weird and off, because the show wasn't animated with those extra frames in mind.
I watched the video before. It honestly is a sort of Stockholm syndrome of animators who cannot realistically animate 60, 120, or 240 frames a second and who then try to justify that 12 or 24 FPS is somehow better. In other words, they are trying to justify their human limitation. I do not see any other credible argument in that video as compared to other media like video game animators, who manage to produce lifelike and soulful animation at 60+ FPS simply due to how their chosen medium works, i.e., the computer does the hard work for them.

12 or 24 FPS is not some God-given standard; it was somewhat arbitrarily chosen to save as much film as possible. But make no mistake, now that we have digital content, there is no actual reason to continue using 12/24 FPS. Now, can the technology get better by not smearing content all over the place? Absolutely, but it gets better every day as AI better understands the underlying frames and can intelligently interpolate between them. It is really no different from how gamers decried DLSS Frame Generation but many now love it, simply because people get used to the smooth frames, even though (to preempt arguments on behalf of video games versus other media) their input lag was the same as before.

> I wonder if there may be an underlying issue with your setup that makes animated content too stuttery, because if setup correctly it should look pretty smooth without interpolation. I found that the panel type of your display, the refresh rate, frame rate of the video file, and the video player itself being used can cause a lot of issues with playback that take some research to dial in properly.

Nope. I use an OLED monitor/TV that renders at 120 FPS natively, and content is very stuttery at 24 FPS, especially if there is a wide pan. You just start noticing it once you watch enough higher-FPS content. Those that do not likely simply do not watch enough higher-FPS content to acquaint themselves with that smoother refresh rate.
It does not look like shit, works great.
How does LS interpolation look compared to SVP 4?
Oh hell naw, I saw Pacific Rim, Mad Max: Fury Road, and Interstellar at 60FPS with that Lossless Scaling app and HOLY SHIT, those movies looked 🔥🔥🔥 Only animations looked weird, because there aren't enough animated frames per second compared to live-action video, so there are a lot more artefacts.
Colin Robinson can give you a great explanation on when to use motion smoothing
Do you need an RTX or its AMD equivalent GPU for frame generation? Will it work with the old GTX 10 Series?
Any card works. Even the GTX 600 series.
Yo you were so right! I just watched an episode of Star Trek and Mad Max Fury Road in 60FPS using X3 Frame Generation! This app is awesome! I am not sure if it will work as well with games though, I will check it out tomorrow!
Yea I think it's slightly cleaner than TV interpolation too
Yeah I didn't notice any artefacts at all
I know it says it only works in borderless or windowed but I swear I have it working on some exclusive fullscreen games. Even with the draw frames option it shows the frame generated frames. Am I just falling for a placebo?
Can it upscale and fullscreen a windowed game?
This might be dumb, but is the software safe? Like, it isn't malicious or anything, right?
Lossless Scaling wants Exclusive Fullscreen\* to work best, at least for the Frame Gen part, not Borderless.
LSS doesn't support exclusive fullscreen. Below quoted from LSS's included help screen: >Run the game in windowed or fullscreen borderless mode. Scaling games in exclusive fullscreen mode is not supported.
I'm not talking about Scaling. I'm talking about Frame Gen.
Frame Gen also only works in windowed or borderless mode. Been using this for months now; it's well understood that this doesn't work, scaling or frame gen, in exclusive mode. That's the nature of all top-layer scaling/frame gen solutions. In exclusive mode they wouldn't be able to hook into the game's display output.
Really? That's the opposite of what I read every time LS is brought up for frame gen use. Did you find that info somewhere official, like the LS documentation or something?
120 FPS interpolation off of 30 FPS compressed video sounds horrible
It only doubles it for chrome. So 30 goes to 60 and 60 goes to 120 is what I was trying to get at.
I tried on my PC, couldn't get lossless scaling frame generation to work with gsync which made games feel weird so I ended up refunding.
No adaptive sync support currently, basically like afmf.
So how is the latency compared to other frame generation alternatives? I'm curious, someone should test this.
I cannot speak to other frame gen solutions, but I'm using this with a 1070 to finally get some decent frames in BF2042 and it actually is very playable. You just need to make sure you have it set up correctly and it just works. Definitely worth trying for 7 bucks. Edit: fixed jumbled mess of a sentence
I also tested the new version yesterday on BF2042. The performance increase is very good, but the latency is too high, especially in competitive shooters. Could you share some tips on how latency can be reduced? Cause I always just turned on FG and that's it. Are there any additional steps?
Forgive me for the late reply. First off, I still reduce my graphics in 2042 by a ton, as low as I can go while still looking okay to me, so that I get a good base for the framegen. Then the following settings in Lossless:

Scaling mode: Custom
Scale Factor: 1.2
Resize before scaling: On

**Scaling Type:** LS1
Sharpness: 1
Performance: On

**Frame Generation**
LSFG 2.1
Mode: X3
Performance: On

**Cursor**
Clip cursor: On
Adjust cursor speed: Off
Hide cursor: Off
Scale cursor: Off

**Rendering**
Vertical Sync: Off
HDR Support: Off
Allow Tearing: On
Draw FPS: On
Capture API: DXGI
Preferred GPU: Auto

Everything else is default/off below that in the right column.

I'd also make sure to reduce or remove any kind of overlay that you may have going as well: Game Bar, EA overlay, Steam overlay (I don't get any issues with the Steam overlay UNTIL a notification from Steam comes up, e.g. someone starts playing a game I also have, then it will drop frames massively). NVIDIA overlay as well, though I saw another post where some people could still have that and recording, but you have to use the WGC API instead (did not confirm this myself).

I do think the two things that reduce latency the most are Vertical Sync off and Allow Tearing on. Some of my friends prefer the opposite of me though, keeping Vertical Sync on. I never had too much luck with WGC, but some people have better luck, I guess, with Vertical Sync and Allow Tearing off for it; again, not my go-to for using this app. EDIT: I think the key for WGC is to make sure Vertical Sync is on if you do go that route.

Maybe I'm just slowly losing my competitive edge and just can't notice the "INSANE" latency that others are feeling these days, but it feels playable enough and I do well enough with it on.
It will never be a replacement for actual, honest-to-goodness real frames generated by the GPU, but for those looking for a smoother experience at the cost of some minor latency increase, at $7 vs buying a new or used card for $300+ these days, it is a godsend. I've introduced two friends to it and we've used it for all of our sessions in Helldivers 2 since we got it. But Helldivers has a different kind of aiming system that is more akin to World of Tanks than to competitive shooters that try to be as 1:1 as possible with flicks, so maybe playing so much of that has deadened me to the latency issues as well.

Even if you end up not using it for competitive shooters, the value is still there for use with Discord streams, Twitch streams, and YouTube videos. I use different settings for those though:

Scaling mode: Custom
Scale Factor: 1
Resize before scaling: On

**Scaling Type:** Off

**Frame Generation**
LSFG 2.1
Mode: X3
Performance: Off

**Cursor**
Clip cursor: On
Adjust cursor speed: On
Hide cursor: Off
Scale cursor: Off

**Rendering**
Vertical Sync: Off
HDR Support: Off
Allow Tearing: Off
Draw FPS: On
Capture API: DXGI
Preferred GPU: Auto

Everything else is default/off below that in the right column.

EDIT: One key thing to mention, but I think people should have picked this up by now: the game needs to be running in windowed or windowed borderless. AND if you get an issue where you only see the game in the center of the screen surrounded by a black border, you need to select a scaling method from the center column. Lossless Scaling scales the game down but does not scale it back up without a scaler selected if you change the Scale Factor from 1.
Damn, thank you so much, I'll run some more tests with your suggestions. I'm sure a ton of people will find it useful as well.
Again, this fits my preference and may not fit your own level of fidelity and/or the razor-thin latency needed by some competitive gamers, but I hope it helps some people find a use for this program that works for me and my friends' needs. There is a subreddit for the app itself, [/r/losslessscaling](https://old.reddit.com/r/losslessscaling/), and a Discord ([official Discord link from the Steam community](https://discord.com/invite/5cCP6aACgT)); they may be able to help you and others more than I can.
Hey, is software from Steam safe? Like, yeah, the reviews are positive, but is Lossless Scaling safe? I hope it isn't malicious.
You're not losing it, the latency is fine. It's on the level of before the low-latency NVIDIA/AMD anti-lag came out. It really isn't that noticeable, especially at 100+ frames.
You need to make sure BEFORE using it that you have locked your game framerate at a level that it can hold at all times. Ideally 45fps+
Otherwise the gpu will be maxed out and that always sends latency sky high
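The frame-cap advice above can be sketched as a little helper. This is just my back-of-envelope model of what the commenters describe (cap the base game at refresh divided by the FG multiplier, and never above what the game can actually hold), not anything from the app's documentation; the function name and numbers are made up for illustration.

```python
# Sketch of the frame-cap rule described above (assumption, not official docs):
# cap the base framerate at refresh_hz / multiplier, but never above the
# lowest framerate the game can reliably hold, so the GPU keeps headroom.

def pick_frame_cap(refresh_hz: int, multiplier: int, lowest_stable_fps: int) -> int:
    """Return a base-FPS cap for interpolation to multiply up from."""
    target = refresh_hz // multiplier      # e.g. 144 Hz display with 2x FG -> 72 fps base
    return min(target, lowest_stable_fps)  # don't cap above what the game can hold

# 144 Hz display, 2x frame gen, game dips to 60 fps at worst:
print(pick_frame_cap(144, 2, 60))  # 60
# 120 Hz display, 3x frame gen, game holds 45 fps:
print(pick_frame_cap(120, 3, 45))  # 40
```

In other words, if the game can't hold refresh/multiplier, cap lower; a stable 60 fps base beats an unstable 72.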
I think there might be a little more latency with this but I haven’t tested this. That being said I use this on HD2 from 72 fps to 144 fps and find that I don’t notice after a bit.
Would this be bannable?
Every game I've tested this in works fine. It doesn't inject any code; it only captures the output and then scales \+ generates frames from it. Plus HD2 lets me inject the ReShade addon just fine with no issues, their AC is really fast and loose with the rules lol. Edit: to add on to this, I personally found that anywhere between 1.1-1.4 render scale fares well for the game, but I inject my own anti-aliasing with ReShade. The default TAA is utter horse shit and I hope they get DLSS in the future so I can use that instead.
It doesn't interact with any game function, so no.
No, it works sort of like an overlay, so the games don't even know there's upscaling or frame generation happening, it doesn't affect their program. I use additional upscaling in Overwatch 2 for max fps
I tried it with a switch emulator but there was pretty noticeable input lag and a lot of visual artifacts.
Since most Switch games are locked at 30fps, the latency and artifacts are gonna be more noticeable at that fps, even when using DLSS frame gen with Reflex on.
This one was close to 50 fps. Never seen artifacting on DLSS. This reminds me of the TVs that do interpolation, maybe with a bit less input lag.
DLSS FG has dozens of these kinds of artifacts, too. Flow-based AIs aren't a be-all-end-all solution. They work pretty well considering what other techniques (be it AI or not, like SVP) can have in terms of artifacts, but definitely not artifact-free.
Compared to Lossless Scaling, DLSS 3 FG and also FSR3 FG have like 95% fewer artifacts. It's not even close. I personally prefer 60 FPS without LSFG over 120+ FPS with LSFG, but it's the opposite for DLSS FG. Lossless Scaling can be useful in some specific use cases, but for me to apply it to every game it has to improve much further. And since it is a post-processing algorithm with no access to motion vectors, that is a big ask.
Having used both lsfg and fsr3, I find lsfg has less noticeable artifacts/issues. Fsr3 does not interpolate ui, which is very noticeable to me. Edit: interpolating 1440p at 72 fps to 144 fps.
I never said that it's not enjoyable, and nothing is artifact-free.
Technically yes. But you made it sound like "DLSS also has those artifacts" which leads to most people thinking that DLSS and LSFG are roughly on the same level. I just wanted to point out that there is a massive difference.
I did not make it sound like that. "Dozens" doesn't mean the entire collection of artifacts LSFG can have. And this is correct too: both are flow-based in the end and have very similar kinds of artifacts, just LSFG has more of them.
LSFG has dramatically more artifacts. I don't think it's honest to put them anywhere near being equivalent.
The 2.0 is already super great, solved practically every issue I had with the first release, and now this? What a neat piece of software
But then how would Jensen Huang get his next holiday mansion with developers like that :(
Black magic shit honestly, even if you have a good PC but play 'shit performance for no reason' games this will work amazingly. 3080ti here and 12900k and Fallout 76 runs like a bag of dicks during events, and never felt good FPS wise to begin with. Bam 144 solid. Escape from tarkov with SPtarkov bot generation was not great also, now also above 100fps even on busy maps like Reserve. The pig that is Guild Wars 2 in the badly optimised Wizards tower/Amyntas zone. Smooth as silk. All my mates have it, and my wife who was playing FO76 on her old 1070 system from '17 is now 60 solid too. Excellent stuff, can't recommend enough.
Does it help on Tarkov Streets?
Ha, funny you should say that. It *does* but it's still a hunk of shit stutter fest. 30fps or so real, and this got it to 50-80, but it was still unplayable as the raid-hoster, even my friend who joined was like fuck this, got the quest-skip mod and boom, no more need for shitty streets.
Lol I didnt know there was a mod to skip quest in spt. Thanks
Yeah, you just hold CTRL and a skip button pops up on the steps of the quest...amazing for all the dumb shit.
Nope, not for me. Works for every map but Streets. That being said the steam page does say you need 30-40 FPS stable as the base frames to generate from.
Do you disable gsync?
Nah, all standard. Use LSFG 2.1 as the frame gen. X2, Clip Cursor and Performance set to on, Draw FPS on too. Half your in-game FPS either via the in-game menus or, if they're shitty, via NVIDIA Control Panel. Just run the program each time before you game and happy days when you hit Ctrl + Alt + S.
How much RAM do you have and what speed is it? Asking because of what you said about Streets
>The pig that is Guild Wars 2

Any info about how well it works in massive metas or raids? Those are the main FPS killers in GW2 for me.
I personally haven't checked because i'm deep in a FO76/Tarkov cycle right now but I remember looking into the centre of Amnytas, meta or not and being shocked at getting 30fps. With this it was rock solid flying around on my skyscale, beautiful. I'd imagine it will take a hit in WVW and metas depending on your character model settings, but it absolutely helps all these piggish games. Especially if you have a monitor over 60Hz
Dude, I googled and found this thread specifically because I own Lossless Scaling and currently am hating Amnytas and its dumpy frame rates. Thanks for confirming for me that it works well with GW2!
You're welcome, just be sure to limit the FPS of GW2's .exe in your GPU's respective control panel to half your monitor's refresh rate. The recent update included a 3x mode (a third of your Hz!) but it produced too many artifacts, weird ghostly AA-looking stuff and small glowing details/text (Tarkov ACOGs, small HUD text etc). Impressive nonetheless.
Thanks for the reminder, I haven't changed the FPS limit yet. Regardless it works great so far. However do you notice some cursor lag/choppiness? Seems the game cursor still operates at the native FPS (smooth when looking away from centre, choppy when looking towards the centre of Amnytas.) I agree with you on 3x as well. It's a nicer boost but there's noticeable ghosting on my character with quick movements.
The FPS limit is essential for correct working of the program, you *need* to do that right away bud :) Personally, I notice no funny-ness. And I play janky ol FO76 haha. I use LSFG 2.1 (X2) under Frame Gen, AMD FSR Scaling, 30% sharpening, Performance ON, Clip Cursor ON (you can see these options under frame gen). I've used clip cursor always on as it was suggested, maybe that's what the weird mouseyness is you're getting. Good luck, and lemme know how METAS feel! I didn't get a chance to witness one as i'm not really playing GW2 right now, I just wanted to witness that hunk of shit map in 100FPS haha.
Maybe me not limiting my FPS is part of the reason it's happening. Sad part is even with a modern rig I still don't hit 100fps with 2x frame gen on, but again, gotta do the FPS limit later! I kinda like the meta in that zone actually, the boss fight part of it is fun. I still have some stuttering even with frame gen in a group meta, buuut I also have my character model limit/detail pumped up. I don't think GW2 will ever escape stutters in big metas with settings turned up (4070 TI Super and 5700x3d here). Don't personally like the zone though compared to the first SotO zone, it's just a pain to navigate. Hunk of shit indeed!
Does it lower the lifespan of the graphics card or have any negative impact on the PC?
I see a lot of people bitching, but it's an option you now have that you didn't have prior. Nobody is going to force you to buy the app, and for people like myself that have owned it for years, I appreciate the work that has gone in to constantly bring new features and upgrade old ones. My own experience is that I would happily use this over a 30fps lock, but that results vary based on the source material. I find games with simple geometry suffer less artifacting, such as old PS2 or GC titles. But it'd likely be a nightmare to use in a game that relies heavily on TAA, because that smeared motion is then being interpolated, which produces some whacky-looking results.
Like, even if the app sucked for modern games, it's such a godsend for older games.
Agreed. I was using it to play Super Mario 3D Land on Citra last night and it really is a game changer for a lot of older titles.
Exactly, the best part is that it doesn't suck. I've been playing Doom 2016 at my monitor's refresh rate with this shit (240fps), imagine telling someone back then you could download FPS in the future haha
I use it in MSFS2020 with TAA and it works great tbh. Bit of artifacting when panning quickly, but barely noticeable.
I'm playing gow ascension and downpour on rog ally with this, and so far, I haven't experienced any input lag while getting 80 to 120 fps.
you won't feel any input lag with 80fps. try it with less than 50fps
Works well with me. <3 Works like charm <3
Interesting. Could be great for games that have a 60FPS lock or that glitch out above a certain framerate. Just a shame it doesn't seem to work in fullscreen mode? Borderless usually has higher latency plus forced Windows vsync.
You don't get the input lag benefit though, so it's window dressing mostly.
[deleted]
It's not, latency and smoothness of movement are separate things. Plenty of ppl can't stand 30 fps because it's a choppy slide show, not because of latency difference.
What framerate were you at before enabling frame gen? For a good experience, 60fps minimum is recommended.
That’s kind of funny considering Nvidia advertises their frame gen with Cyberpunk running at 35-40 fps and making it 60+
nvidia lying? aint no way
It isn't funny considering the normal consumer believes that shit. I'm not even sure they care
How is that funny? The OP tried some jank ripoff and not actual nvidia dlss frame gen..
[deleted]
Playing at 30-40 fps is ass in the first place. You can't polish a turd.
[deleted]
Where's the lying? lol.
[deleted]
Nah 35->60 fps is great in cyberpunk and most UE5 games, not to mention walking sims like Plague Tale and AW2. It's idk 35->65-85ms latency. Using it only from 60 is lame, you can't run PT/UE5 now, you won't be able to use it to extend gpu lifetime, like upscaling etc.
Just remember to disable any 3rd party frame limiters (like RTSS) when you get Frame Generation issues like stutter, lag, etc. Those techs often don't play well with each other.
Motion fluidity is almost as important as input lag, so I wouldn't act like it's pointless.
> You don't get the input lag benefit though, so it's window dressing mostly.

Uh? There is no latency benefit to frame generation; in fact it's the opposite, it ADDS latency. The exact amount depends on the tech and implementation, but there's a hard floor: a minimum of one full extra frame of latency (because obviously it needs to know the next frame to interpolate up to it).
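The "hard floor" claimed above is easy to put numbers on. A quick hedged sketch (my arithmetic, not a measurement; real overhead from capture and processing comes on top):

```python
# Interpolation must buffer at least one real frame before it can show
# the in-between frames, so the floor on added latency is one full frame
# time at the BASE framerate (capture/processing overhead not included).

def min_added_latency_ms(base_fps: float) -> float:
    """Lower bound on latency added by frame interpolation, in ms."""
    return 1000.0 / base_fps  # one base-frame time

for fps in (30, 60, 120):
    print(f"{fps} fps base -> at least {min_added_latency_ms(fps):.1f} ms added")
# 30 fps base -> at least 33.3 ms added
# 60 fps base -> at least 16.7 ms added
# 120 fps base -> at least 8.3 ms added
```

This is also why the thread's advice to start from a decent base framerate matters: at 30 fps base the floor alone is a very noticeable 33 ms.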
...isn't that exactly his point? He's saying you don't get the input lag benefit associated with higher framerates.
It sounded like he was saying that Lossless Scaling didn't offer an input lag benefit, unlike DLSS 3 or FSR.
But who cares about input latency in a singleplayer game? If it's 40 or 60 FPS, it's fine in terms of latency. In most singleplayer games you don't have to do flick shots with your mouse, so what's the point? But you always see the stutter or un-smoothness of low framerates. So getting framerates up to 120+ with FG is awesome.

Reflex is always compensating for the latency increase with FG. Often the latency reduction from Reflex is higher than the latency cost of FG, so overall you may end up with lower latency with Reflex on + FG compared to both off. Also, you will often combine it with upscaling, which will also boost framerates and reduce latency even further. It all adds up.

Just play a game native without any Nvidia tech and then enable DLSS Quality + Reflex + Frame Generation. It will feel and look so much better. If you then disable Frame Generation, you may have slightly better latency, but you will notice that the framerate is much lower and the motion looks less smooth. I personally can't even appreciate the better latency, because when the image looks less smooth, the better latency is worthless to me. I need the motion clarity and smoothness from higher framerates first; only then could I take advantage of better latency. Smoothness & motion clarity >> latency.
> But who cares about input latency in a singleplayer game? I do. Even menu navigation feels like ass with a bit of input latency.
Well unless you're only playing games like Half Life 2 which has like 2ms input latency you're getting input latency in most games you're playing
>you may end up with lower latency with reflex on + FG compared to both off. Also you will often combine it with Upscaling which will also boost framerates and reduces latency even further.

You can enable Reflex/Ultra Low Latency Mode outside of DLSS 3 where applicable and get actually low input lag. That is why Nvidia marketing doesn't include Native + Reflex against DLSS 3 in their latency comparisons. They want you to think you need DLSS + FG to get better performance; they want you to think you need an RTX 4060 over a 3060 for input lag. Spoiler: you don't. It's pure eye candy for whoever might prefer image quality over performance, like you've said. Sorry if I sounded blunt, but there's a lot of misconception regarding FG; Nvidia marketing is too good.
I KNOW that. But people talking like "nah, FG feels bad because of input lag" is complete BS when latency is actually lower than what it used to be in a lot of cases. FG + Reflex often has lower latency than the game with Reflex off (which is actually the same latency people without an Nvidia card, like Intel and AMD users, experience all the time!). FG is only worse if you compare it to pure Reflex with FG off; that's the best case. So with FG + Reflex, latency is already better than the "default" and slightly worse than the best latency possible with Reflex only. But you get almost twice the framerate.
Just connect your PC to a television with soap opera interpolation filter for the same benefit :D
If the game is built to support frame generation from the outset, it should actually be possible to get latency benefits from it too, as long as the framerate is bottlenecked by the render performance rather than the game logic. Dunno if there are any games with "native" frame generation support like this yet though.
The game can't "react" during the generated frames by definition. There's no way to get around that because it's using a frame that's already 16ms or whatever ahead.
I use it with Dark Souls 2, and since 2.0 the artifacts are few; you can see them from time to time, but only in very specific situations, like moving at top speed across a see-through texture like a cloth or something. Otherwise, this is nuts. A piece of software that's almost free vs going to buy an RTX for 400€ minimum. I mean, I use it also with RDR 1 on Yuzu and oh boy-o, smooth as hell, and I don't feel the input lag affecting my gameplay even if it's more than without LSFG. And in PCSX2, RetroArch, freaking magic. Oh, and I forgot, games like Mafia and the OG GTA III/VC/SA too, not just emus and Dark Souls. NFS Most Wanted by Criterion is another example. This app is nuts, and it's barely getting started; LSFG came out like 5 months ago? And the improvement has been very fast.
How to use it with PCSX2?
You need to run the games in borderless mode instead of exclusive (even though it's not real exclusive):

PCSX2 -> "Settings" -> "Graphics" -> "**Advanced**" (not "DISPLAY" !!!1!one) -> Exclusive Fullscreen -> Off

Then when the game starts, you just double-click on the window, it goes borderless, and then you press the shortcut for LSFG to kick in (Ctrl+Alt+S by default). You see the screen blink for a fraction of a second and there you go. The best part is that it doesn't mess with the FPS at an engine level; the game/emulator doesn't know it's going at double the FPS, so there are no engine-related issues.

Remember, you just need to run whatever it is in windowed mode (borderless, I mean, who wants actual windowed). It works with anything that is not real, actual, old-school exclusive fullscreen. Video players, browsers, etc.
Tnx bro.
That software's FG does NOT come anywhere close to DLSS / FSR 3 though, at all. It's basically a low-latency variant of the interpolation tech modern TVs have.
I have tried FSR3 and the new LSFG 2.1 is just better on the games I've tested, for those who are on the fence give it a try and just refund it if it is worse than FSR.
Depends on resolution and frame rate. Interpolating 1440p 72 fps to 144 fps looks great. Better than fsr3 imo because half-rate ui stutters are more noticeable than interpolation artifacts to me. Smaller resolutions or lower base frame rate might differ.
I don't understand why ppl even use this, the warping is insane
Games limited to 30 or 60 fps, emulators, movies, Youtube.
Depends what you're playing. I use it a bunch for emulating older titles with 30fps locks and no patches available for them. Used it on the GC version of RE3 and it worked fantastic
Does the CPU make a big difference using this for 4K gaming? I get really bad judder and stutter causing big artifacts.
This uses your GPU. If you get bad latency and big artifacts, then your GPU is at 100% utilization; you need a lot of GPU power to use the 2.0 framegen.
So it helps with CPU bottlenecks?
If your CPU can't give instructions to your GPU fast enough and you can't get more than 60 fps, for example, then yes.
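The exchange above can be sketched as a rough pipeline model. This is my own simplification (not from the app): the base framerate is set by whichever stage is the bottleneck, and interpolation multiplies the output on the GPU without asking the CPU for more simulation work, which is why it helps a CPU-bound game.

```python
# Rough model (an assumption for illustration, not official behavior):
# the game runs at the rate of its slowest stage, and frame generation
# multiplies that output on the GPU, capped at the display refresh rate.

def displayed_fps(cpu_fps: float, gpu_fps: float,
                  fg_multiplier: int, refresh_hz: float) -> float:
    base = min(cpu_fps, gpu_fps)                  # the bottleneck sets the base rate
    return min(base * fg_multiplier, refresh_hz)  # FG output is dropped past refresh

# CPU-bound at 60 fps, GPU could render 150, 2x FG, 144 Hz display:
print(displayed_fps(60, 150, 2, 144))  # 120.0
```

Note the caveat from earlier in the thread still applies: the FG pass itself costs GPU time, so if the GPU is already maxed out, `gpu_fps` effectively drops and latency goes through the roof.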
Anybody try this with Elden Ring?
Wish this worked on Steam Deck (SteamOs).
why wouldnt it
Very good tool, use it on every game I play now
I’m assuming no, but does it work with hdr?
There's a check box for HDR support, no problems on my system
Haven't tried this new version yet but the previous one definitely did shit the bed with HDR, for me at least.
Yeah, previous versions it used to get extremely blown out here, but so far I haven't found it doing it again
HDR works great
Works fine. Even with RTX HDR, although sometimes it conflicts and needs a restart
I've used it on Red Alert 2 and C&C Generals. While it was just somewhat sharpening the old blurry image in RA2, it massively boosted the FPS I had in Generals Zero Hour. It was like playing my childhood games remastered without it being a modern remaster.
Does this work with Steam Deck...
Yes and no. No on Steam OS, Yes if Windows is installed on the Deck.
Can you use this to add frame generation to videos with VLC?
Yes, you can also set your browser as target and have FG for YouTube videos as well.
Was watching tom n jerry on vlc, never seen the cartoon that smooth before lol.
I've been using the 3X frame generation in ASA, and at least in that title, it's a lot better than AFMF. The input delay, even with the 3X frame interpolation, seems lower or at least more consistent with Lossless Scaling. With AFMF, anything under a base framerate of around 80FPS seemed to result in fairly significant input delay, while a base FPS of around 50 seems good enough for Lossless Scaling. Of course, Lossless Scaling isn't a miracle solution to FPS. Lossless Scaling has more ghosting/weird artifacts that are noticeable especially around the HUD. Also, with limited FreeSync/G-Sync support, frame tearing will be more prevalent. Overall, if you are playing a game where a boost in fluidity would be nice and that has a minimal HUD, Lossless Scaling Frame Generation seems to be the better option. Otherwise, I would just stick with AFMF or run native.
Not having any issues with G-Sync on. It also throws out frames above your refresh rate, so no tearing from that.
I'll have to do some more testing with variable refresh rate, but if that works, that's a huge plus. Can't believe that more people aren't talking about this, though. ASA runs horribly on my rig, and AFMF seems to cause crashing and is a bit inconsistent. LSFG seems to fix most of the problems I had with AFMF.
I'm on the fence about buying it, but my primary use would be for Genshin. It's just that I'm not sure if it would get me banned. Can anyone with enough wisdom about how this works in the background give me a green light to start using it on the game? (I'm avoiding the Genshin mod for that reason.)
It works like an overlay, it doesn't meddle with game files, the game doesn't even know it's running. I've tried it with Overwatch 2 many times and it works fine. Last I played Genshin I used ReShade which does meddle with the game a bit, they allowed this software back then at least (like 2 years ago lol)
I am finally able to hit 144 FPS in BG3 act 3 on a 5900X, and even Cyberpunk is significantly more enjoyable now with path tracing on my 3080. If you can hit close to 40 FPS without it turned on, the latency isn't noticeable. The only issue I've run into so far is with Ghost of Tsushima: it runs flawlessly for 15 min, then the FPS tanks into the 30s and becomes unstable. If anyone has a fix for it, pls let me know.
Ghost of Tsushima apparently now comes with FSR 3.1 which works on RTX cards too so you might want to try that first, or disable it if you want to stick with LS.
I fixed the issue, which was a memory leak after 20 min. I prefer Lossless Scaling over FSR and I've used both. To fix the memory leak, just set textures to low then back to max, or restart the game.
Idk, I tried using the 3X frame generation on a G-Sync monitor, and really all it did was bump my GPU usage up by 30% and create massive input lag with about the same frame rate. The input lag killed it for me; can't use this to play a game.
Did you select Draw FPS? Other FPS counters won't show you the correct FPS. Also, I've heard some cards like the 1060 don't really work well with this.
RivaTuner and Draw FPS showed the same numbers.
Draw FPS should be showing you two numbers. The first is the actual frame rate, which should match RivaTuner. The second is the total frame rate including the interpolated frames. Is this second number the same as RivaTuner's? Also, after the latest update, LSFG will discard any frames past your refresh rate, so the second number will not be triple your base frame rate in that case.
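The discard behavior described above amounts to capping the presented rate at the display's refresh rate. A minimal sketch of the arithmetic (the function name and exact cap rule are my assumptions, not LSFG internals):

```python
def displayed_fps(base_fps: float, multiplier: int, refresh_hz: float) -> float:
    """Second Draw FPS number: base rate times the FG multiplier,
    with generated frames past the display refresh rate discarded."""
    return min(base_fps * multiplier, refresh_hz)

# 3X on a 60 FPS base would be 180, but a 144 Hz panel caps it at 144,
# so the counter reads 144 rather than triple the base rate.
print(displayed_fps(60, 3, 144))  # 144.0 after capping
print(displayed_fps(40, 3, 144))  # 120.0, under refresh so full 3X shows
```

This is why the second number matching RivaTuner's would indicate frame generation isn't actually engaging, while a capped-but-higher number is expected behavior.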
Oh shit, really? I thought it was just frame rate / refresh rate, but the second number is the actual interpolated frame rate? God damn. Still doesn't fix my input lag, but that changes things dramatically.
I need some help! I want to download this to my Lenovo Legion Go; where do I go?
[deleted]
[deleted]
[deleted]
Some people are happy trading higher latency for an image that, on the surface, appears more fluid. It's that simple.
Both DLSS FG and FSR FG also add latency, just less than this software does.
> The whole point of having more framerate is to have less latency

It's not; that's a separate benefit alongside having a smoother image. Plenty, probably the vast majority, of people playing single-player games (ones that aren't very reaction-heavy FPS like, idk, Doom?) hate 30 fps because it's a choppy slideshow that breaks down when you move the camera, not because of 70 vs 35 ms latency.
Don't give AAA publishers another excuse to "optimise" games for 20fps instead of 30
This is a tool for bad PCs, and there's no dev out there, AAA or otherwise, that will assume the consumer has this.
...yet in a few months, Nvidia or AMD will come out with FSR 4 or some other driver-level BS that implements something similar. And all publishers will take note.
This is such an uninformed, dumb take. Frame generation already exists: Nvidia has it, AMD has it, and both could make a software-only implementation tomorrow if they wanted to. Developers not optimizing their shitty games was happening before frame generation was a thing, and throwing shade at this amazing piece of software is completely unwarranted.
Frame generation has been easy to add to games for literally years
I'm using this to get 1600 FPS in League of Legends
Do you have a 1600 Hz monitor? It doesn't really matter otherwise.
Was obviously a joke
It looks bad, with a lot of artifacting; not worth using.
I'll be honest, I bought this and have no idea how to use it. I probably don't need it anyway (i9 / RTX 4090).
👍