Unreal Engine 5, [UE5 Developer Availability 2022-04-05]

I think there's value in this discussion to understand diminishing returns and the state of the industry, but it'd need to be handled by someone willing to invest in proper research. Back-and-forths with people 'remembering' are just going to produce subjective opinion. An important part of that evaluation would also be: "if high-end GPUs had better framerates, why weren't they tapped for better visuals?"
Was going to say the same thing. My cynical self thinks that people have formed opinions that are relatively immune to new information, and nostalgia is always going to color the comparisons. Regardless, we should at least take the meta-discussion back to the other thread: https://forum.beyond3d.com/threads/game-engine-convergence-and-the-problem-with-ue5.63291/

If there's desire to split off a thread about industry rendering engineer retention and hiring, we could do that as well, but I'm not really sure how far that conversation can go without real information either.
 
I still game on retro hardware, and frame rates were not as high as people remember them being in AAA games, even with the fastest GPUs.

Back in the DX10 era (which is what I'm currently focused on), games needed SLI or Crossfire to get at least 60fps at the high-end resolution of the period, which was 1440p (or 1600p at 16:10).

The DX11 era was no different, with dual-GPU cards like the HD5970, HD6990 and GTX590 being a 100% requirement for 60fps+ at the highest-end resolution, which was 1440p.

Arguably, performance from today's single GPUs (the RTX 4090 and 7900XTX) at the high-end resolution (4K) is in a much better place than the GTX285, GTX480, GTX580, HD4890, HD5870 and HD6970 were in the DX10 and DX11 era.
 

They didn't need multi-GPU back then. Multi-GPU (basically AFR, after both NV and AMD/ATi dropped experimenting with split-frame and checkerboard rendering) just made the games feel worse while arguably looking better (higher framerate combined with higher input->display latency).

I briefly had a Crossfire setup and sold the second card after using it for a couple of months, because the experience with AFR was just horrible. So, no, I and many others certainly didn't need multi-GPU AFR solutions in order to experience high-resolution gameplay at the time. As usual, resolution and framerate stay fixed for me; it's just a matter of fiddling with the other IQ settings to hit them.

Regards,
SB
 

The reason you think that is because you had Crossfire, which was exposed as having shit frame-time stutter compared to SLI (which was basically perfect).

SLI always had the reputation of being much smoother than Crossfire, but Crossfire had the reputation of better scaling.

It was exposed around the HD6000 series, which is why AMD ditched the Crossfire connector and moved GPU transfers over the PCIe bus, which massively helped.

The Radeon R9 200 series were the first AMD GPUs to ditch the Crossfire connector.

The cause was the lack of bandwidth the Crossfire connector had compared to Nvidia's SLI connector, which had tons of bandwidth.
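
Some back-of-the-envelope numbers make the point (a rough sketch; the bridge and bus figures below are commonly cited ballparks I'm assuming, not official specs):

```python
# Rough bandwidth needed to ship every finished frame between GPUs
# for AFR compositing, vs. what the links could plausibly move.
# Link figures are assumed ballparks, not official specs.

def frame_traffic_gbps(width, height, fps, bytes_per_pixel=4):
    """GB/s required to move each completed frame across the link."""
    return width * height * bytes_per_pixel * fps / 1e9

need_1440p60 = frame_traffic_gbps(2560, 1440, 60)  # ~0.88 GB/s
need_4k60    = frame_traffic_gbps(3840, 2160, 60)  # ~2.0 GB/s

CF_BRIDGE_GBPS = 0.9  # assumed ballpark for the old Crossfire bridge
PCIE2_X16_GBPS = 8.0  # PCIe 2.0 x16, one direction

print(f"1440p60 frame traffic: {need_1440p60:.2f} GB/s "
      f"(CF bridge ~{CF_BRIDGE_GBPS} GB/s)")
print(f"4K60 frame traffic:    {need_4k60:.2f} GB/s "
      f"(PCIe 2.0 x16 ~{PCIE2_X16_GBPS} GB/s)")
```

On those numbers the old bridge was already saturated around 1440p60, while the PCIe bus had headroom to spare, which fits AMD moving the transfers onto the bus.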

And if you wanted 60fps at the high-end resolution with max settings, you absolutely needed SLI/Crossfire back then.
 

Most of my friends had NV setups, so it's not like I didn't also game extensively on NV SLI setups ... and the experience was just as bad.

Bandwidth had nothing to do with it. It was that you had 2 frames of rendering in flight for every "real" frame of rendering, while input was still based on the "real" frame and not the "alternate" frame. This meant that in fast action games where high framerates were needed, there was a distinct and very noticeable disconnect between the input->display feedback loop and what you were visually seeing on the screen. People who were less sensitive to this could either tolerate it or perhaps not even notice it, but anyone who had been gaming at high framerates because of the noticeable improvement in the input->display feedback loop immediately noticed the jankiness of both AMD/ATi's and NV's AFR solutions. Hence, you almost never saw SLI setups used in e-sports, where that input->feedback loop was critically important. SLI (AFR) just fucks with that in really, really bad ways.
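
To put toy numbers on that feedback loop (a simple model I'm assuming here, not a measurement): with 2-way AFR each GPU still takes a full frame time to render, the cards are just staggered, so the fps counter doubles while the age of the input behind each displayed frame doesn't improve at all:

```python
# Toy model: input->display latency for a single GPU vs. 2-way AFR.
# Assumption: each GPU needs render_ms to draw a frame; AFR staggers
# two GPUs by half that, doubling output rate but not reducing how
# old the sampled input is by the time its frame hits the screen.

def single_gpu(render_ms):
    fps = 1000.0 / render_ms
    input_age_ms = render_ms  # frame shows input sampled at frame start
    return fps, input_age_ms

def afr_two_gpus(render_ms):
    fps = 2 * 1000.0 / render_ms  # two staggered GPUs, double the output
    input_age_ms = render_ms      # each frame still took render_ms to make
    return fps, input_age_ms

for label, (fps, age) in [
    ("single GPU, 16.7 ms/frame", single_gpu(16.7)),
    ("2-way AFR,  16.7 ms/frame", afr_two_gpus(16.7)),
    ("single GPU,  8.3 ms/frame", single_gpu(8.3)),
]:
    print(f"{label}: {fps:5.1f} fps shown, input ~{age:.1f} ms old")
```

Same number on the fps counter as a genuinely fast single card, double the input age. That's the disconnect.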

The last and really only good multi-GPU setup was the Voodoo and Voodoo 2 SLI, because they actually did "Scan-Line Interleave", hence the SLI moniker. It's absolutely insulting that NV chose to name their AFR solution SLI, as if to imply that it was in any way even remotely comparable (it wasn't; it was massively worse).

Regards,
SB
 

There were loads of articles around 2012 (ish) on the subject, and Nvidia were nowhere near as bad when it came to the micro stuttering.

So I'm going to have to disagree with you on the comment that SLI was just as bad.

Articles like the one this picture is from show how much better SLI was than Crossfire in terms of smoothness.

[Attachment: frame-time graph comparing SLI and Crossfire frame pacing]
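
For anyone who hasn't read those articles: they plot frame-to-frame gaps rather than averages, because microstutter hides behind a healthy fps counter. A trivial illustration with made-up numbers:

```python
# Why frame-time plots matter: identical average fps, very different
# pacing. The gap values below are made up for illustration.

even_pacing  = [16.7] * 8       # uniform ~60 fps delivery (ms gaps)
microstutter = [8.0, 25.4] * 4  # same ~60 fps average, bunched pairs

for name, gaps in [("even pacing", even_pacing),
                   ("microstutter", microstutter)]:
    avg = sum(gaps) / len(gaps)
    print(f"{name}: avg {avg:.1f} ms (~{1000.0/avg:.0f} fps), "
          f"worst gap {max(gaps):.1f} ms")
```

Both read as ~60 fps in a benchmark, but the second one is what bad AFR pacing felt like.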
 

Again, it has nothing to do with microstuttering. It's that the input->visual feedback loop was based on the actual real frame, while what was being displayed was the real frame plus the alternate GPU's frame. So you would need 120 FPS with SLI for the controls to be as responsive as a 60 FPS non-SLI setup. Poor fools using SLI to get 60 FPS in online play would get slaughtered by anyone good not using SLI, as they were unknowingly hobbling themselves with a 30 FPS input->feedback loop while someone with a single card at 60 FPS was operating with a 60 FPS input->feedback loop.

But the problem goes deeper than that, because if you are doing frame-perfect responses and inputs (as is the case with good FPS, RTS and esports players), SLI screws everything up: only 1 out of every 2 frames will actually display your most immediate control input. That makes it impossible to do frame-accurate aiming and shooting ... immensely important in competitive FPS play.

And I always felt bad for the poor suckers that got quad-SLI. :D

That graph you posted completely misses that. Yes, AFR was great for benchmarks, but for most people it sucked in actual games. Hence why AFR never took off, and most people who tried it never went back to it. It was so bad, with such a bad reputation, that both NV and AMD abandoned it.

Regards,
SB
 
A significant increase in the number of things that are physicalized in the game world is one of many paths to offering players new experiences. We can also go far in improving how the player interacts with the world.
I agree if you mean something like the latest Zelda game, which seems rich in emergent options and spurs player creativity. Though it also looks like just merging Garry's Mod into some generic ARPG. It seems hard to come up with such options for more serious / realistic games, but that's kind of what I want.

If you mean interactions such as: I can bet money on a horse race in GTA, or I can fuck a bear in BG, or I can change my hairstyle ingame... that does not count. That's not something the player can come up with and do on his own. It's an offer of abstract minigames or multiple-choice content, tangential to the core gameplay loop or even conflicting with / interrupting it. It costs a lot of money to implement, feels bolted on, and because scaling up and up and up is expected, the progress is not even noticed.

The keyword to me is emergent options. If the world simulation is rich and robust, the player can come up with multiple options to tackle a problem. If it works, he feels smart and rewarded.
It's also good if the player can exhaust and exploit the game without breaking it, which is hard. The more options the player has, the less control the developer has.
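
A toy sketch of what I mean (all object names and rules below are made up, just to illustrate generic property tags interacting instead of scripted one-off content):

```python
# Toy "chemistry" system: objects carry property tags and a handful of
# generic rules interact, so players can combine things in ways nobody
# scripted explicitly. All names and rules are made up for illustration.

RULES = [
    # (acting property, receiving property, effect on the receiver)
    ("burning",  "flammable", "catches fire"),
    ("burning",  "frozen",    "melts"),
    ("wet",      "burning",   "is extinguished"),
    ("electric", "wet",       "is electrified"),
]

def interact(actor_props, target_name, target_props):
    for acting, receiving, effect in RULES:
        if acting in actor_props and receiving in target_props:
            print(f"{target_name} {effect}")

# The player improvises a plan nobody authored step by step:
interact({"burning"},  "wooden bridge", {"flammable"})  # torch the bridge
interact({"wet"},      "bridge fire",   {"burning"})    # rain puts it out
interact({"electric"}, "flooded room",  {"wet"})        # shock the water
```

A few generic rules, many combinations; the developer doesn't have to foresee every plan the player finds.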

But I agree new mechanics are not the only thing we can do to show new things.
Having worked on robotic ragdolls years ago, I can say this surely enables some new mechanics, since agility no longer depends on static animation data. We might do our Arnie and Stallone action heroes much better than before.
But I don't know yet. What I do know is that such characters can generate an impression of being truly alive, and that's quite interesting on its own.
Maybe this video can show a bit of this impression:
The upper body does nothing, just the legs. But there you can see some subtle details convincing me that this is really happening; it's not just playback of some content.
So now I wonder how it feels to shoot dead NPCs which feel alive to me. This might be much more dramatic than in current games.

To me, physics simulation is really the sleeping princess of gaming. Its current use is too passive, often just decorative visual effect. We need more motors! :D
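
To make the 'motors' point concrete, the usual building block is a PD controller per joint: torque proportional to pose error, damped by joint velocity, applied on top of the ragdoll physics. A minimal sketch with made-up gains and inertia:

```python
# Minimal PD joint motor: drive a ragdoll joint toward a target angle
# with torque = kp * angle_error - kd * angular_velocity.
# Gains, inertia and target are made-up illustration values.

kp, kd  = 60.0, 8.0   # stiffness and damping gains (assumed)
inertia = 1.0         # joint inertia (assumed)
dt      = 1.0 / 60.0  # 60 Hz physics tick

angle, velocity = 0.0, 0.0
target = 1.0          # radians, e.g. "lift the leg"

for step in range(120):  # simulate 2 seconds
    torque = kp * (target - angle) - kd * velocity
    velocity += (torque / inertia) * dt  # semi-implicit Euler step
    angle += velocity * dt

print(f"after 2s: angle = {angle:.3f} rad (target {target})")
```

Put one of those on every joint and the character can be shoved, stumble, and still try to reach its pose; that's where the 'alive' impression comes from.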
 
Poor fools using SLI to get 60 FPS in online play would get slaughtered by anyone good not using SLI, as they were unknowingly hobbling themselves with a 30 FPS input->feedback loop while someone with a single card at 60 FPS was operating with a 60 FPS input->feedback loop.
I never got slaughtered, and I always ran SLI (and for a time, Crossfire).
And I always felt bad for the poor suckers that got quad-SLI. :D
I didn't, it meant they had more money than me 😂
It was so bad, with such a bad reputation, that both NV and AMD abandoned it.
SLI and Crossfire rigs were insanely popular, so I'm not sure where you got that view from?

DX12 killed off Multi-GPU more than anything else.

I would welcome it back, as with high-refresh monitors and ray tracing it would be really useful to have.
 
Don't Nod's climbing game, Jusant, is coming out Oct 31st (day one on Game Pass). I played the demo on my Steam Deck and it looked nice even with potato settings. The setting & art are lovely. The climbing mechanics can be a little fiddly, but pretty interesting.

 
According to a report on DSOG, this game runs smoothly even though it uses Nanite and Lumen, while Remnant 2 and Immortals do not, so the devs deserve to die, he says.

Now we could speculate that the climbing game has only a few dynamic objects and little action going on. But I guess that's only one factor of many. Maybe people overuse Scripting and Node Graphs over C++ these days. :D
 
Indie studios, less than a year of dev. I would be surprised if they didn't leverage some Blueprint systems.
 
🧠 Gamers can't cherry-pick screenshots of low-res textures if you use flat color for everything

(game looks great!)
 
Yeah, I really enjoyed the demo of Jusant - will definitely pick up the final game. It's a unique mix of a textureless "flat" art style with sharp shadows and areas of high geometric detail / scatter-object density; I haven't seen anything quite like it before. Very curious what the other environments look like.
 
SLI and Crossfire rigs were insanely popular, so I'm not sure where you got that view from?

DX12 killed off Multi-GPU more than anything else.

I would welcome it back, as with high-refresh monitors and ray tracing it would be really useful to have.
Multi-GPU configurations went obsolete even on D3D11 to make room for TAA. Ever notice how your favourite upscaler (DLSS2) also doesn't offer multi-GPU support in available games?
 
Multi-GPU configurations went obsolete even on D3D11 to make room for TAA.
No, they didn't.

They were insanely popular, and cards like the HD5970, HD6990 and GTX590 were very sought-after GPUs.
Ever notice how your favourite upscaler (DLSS2) also doesn't offer multi-GPU support in available games?
You're going to have to clarify this irrelevant comment, as multi-GPU died well before DLSS arrived.
 