Personal information? Where?
"Lead graphics engineer."

How many graphics leads are there in a studio with 3500 employees? More than a hundred, perhaps?
https://www.neogaf.com/threads/next...leaks-thread.1480978/page-1000#post-256994247
"No its running at 90fps locked. native 4k."
Why? It makes very little sense to me to have a game running locked at 90FPS on console. Why not use the overhead to improve assets? I know higher FPS makes sense in racing games to help with controller latency, and I guess this is a twitch-style shooter, but you don't see Call of Duty or other games trying to lock at 90FPS and sacrificing IQ in the process. If the game is locked at 90FPS, that means it's consistently running well over 100FPS, right? Again, why?
"Why? It makes very little sense to me to have a game running locked at 90FPS on console."

New TVs can now show 4K120, and locking the framerate to any point below that range will produce a nice flat frametime graph and consistent input lag. IMO that's the most important point for every game, no matter if it's locked at 30, 60, 90, 120 or any place between 60 and 90. Consistency leads to good game feel.
"...but this sounds like a first person shooter"

FPS games can greatly benefit from high framerate.
"How many graphics leads are there in a studio with 3500 employees? More than a hundred perhaps?

Assuming this is Ubisoft Montreal, of course."

I have hard doubts here. I have family in Ubisoft. It's locked down hard there.
Does "locked" mean won't go above 90FPS or always at 90FPS?
On PC, a game locked at 30FPS is a game whose engine won't go above 30FPS, so it's an upper limit.
Locking at 90 FPS as a maximum threshold seems reasonable enough to me. They prefer that the system waits for the next frame to start within the 11.1ms window, and it's how they achieve predictable frametimes.
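A minimal sketch of what that kind of frame limiter looks like (my own illustration, not anything from the leak or a real engine): after rendering each frame, the loop waits out the remainder of the 11.1 ms budget before starting the next one, so frametimes stay flat even when the GPU could run faster.

```python
import time

TARGET_FPS = 90
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~11.1 ms per frame at 90 FPS

def frame_limited_loop(render_frame, num_frames):
    """Run render_frame at most TARGET_FPS times per second.

    If a frame finishes early, sleep until the next 11.1 ms
    boundary so presentation stays evenly paced; if it runs
    long, the next frame starts immediately (a visible hitch).
    """
    next_deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(num_frames):
        render_frame()
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wait out the rest of the budget
        next_deadline += FRAME_BUDGET
```

Real engines do the waiting against the display's vblank rather than a wall-clock sleep, but the idea is the same: spend the spare headroom on pacing instead of extra frames.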
How many graphics leads are there in a studio with 3500 employees? More than a hundred perhaps?
Assuming this is Ubisoft Montreal, of course.
o'dium said:
I only work with/worked with other people in the industry. I have no access to dev kits myself, sadly. I do have info we chat about, nothing I will talk about here other than to say it lines up with what’s being said.
"Native seems a big red flag, almost all AAA have gone to reconstruction and similar, which I assume would still be used unless that would make scaling back to current gen just too difficult."

Agree, most AAA games have gone towards reconstruction of some form.
"New TVs can now show 4K120, and locking the framerate to any point below that range will produce a nice flat frametime graph and consistent input lag."

Ignoring the possibility of VRR, 90fps on a 120hz display would be akin to 45fps on a 60hz display (except sped up 2x), with constant frametime variance between 16.7ms and 8.3ms.
"Ignoring the possibility of VRR, 90fps on a 120hz display would be akin to 45fps on a 60hz display (except sped up 2x), with constant frametime variance between 16.7ms and 8.3ms."

If you have HDMI 2.1 to display those resolutions, you're pretty much guaranteed VRR. I would be flummoxed if someone implemented 2.1 without VRR.
You can only get smooth frametime graphs at integer divisors of the refresh rate - for 120hz that would be 120, 60, 40, 30, and a bunch of lower-but-useless framerates, because no one wants to go back to N64 performance levels.
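The arithmetic behind that is easy to check (a quick back-of-the-envelope sketch, my own, not from anyone's post): without VRR a new frame can only be swapped in on a refresh tick, so each frame is held for a whole number of 8.3 ms intervals, and only framerates that divide 120 evenly get the same hold count every frame.

```python
import math

REFRESH_HZ = 120  # fixed-refresh display, no VRR

def hold_pattern(fps, frames=12):
    """How many 120 Hz refresh intervals each rendered frame stays on screen.

    Frame i is ready at i/fps seconds and is shown on the next
    refresh tick, i.e. tick ceil(i * REFRESH_HZ / fps).
    """
    ticks = [math.ceil(i * REFRESH_HZ / fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]
```

Divisors of 120 (120, 60, 40, 30, 24) give a flat pattern, while 90fps produces the uneven 2-1-1 cadence of 16.7ms and 8.3ms holds described above.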
o'dium said:
Now for next gen? Well, the same thing keeps getting repeated over and over, that the SX isn’t as mature in dev kit but faster, the PS5 is more mature in dev kit but a tiny bit slower, and nothing matters because they are equal. That’s fine. That’s the best result. You WANT that. But some of you really are hurting at the fact SX may be that stupidly tiny little bit faster...? Hell I can’t say what I know, from my talks with my friends who have actual hardware in hand, all of different updates.
o'dium said:
Anybody with an ounce of common sense is talking about how A) they don’t want Lockhart because it’s a bag of bollocks, and B) the best thing is that both consoles are on par. Yes, one console will be a teeny tiny bit better, and no, that won’t have any perceivable difference.