Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

30 fps is the 480p of frame rate. It's the absolute bare minimum; once you go below that you start having serious problems. 30fps games require a ton of post-processing motion blur to hide judder when moving the camera quickly. So you have a nice 4K presentation and then blur the whole thing every time the camera moves. At that point you don't need to worry about the MPRT of your display, since you've intentionally ruined the entire image anyway. Motion blur can look quite nice at high frame rates because it can be very subtle and still hide judder at 120fps. In-game motion blur or not, 30fps games just do not look good in motion. They look great when the camera is still or moving very slowly; the faster the camera moves, the more the image breaks down. It's just how your eyes respond to sample-and-hold displays. You can't even use blur reduction effectively at 30fps. You need to get above 100fps before blur reduction on displays with BFI or strobing becomes effective, and those techniques have their own issues.

A lot of gamers today have never gamed on a CRT, or have never owned a genuinely fast display with g2g times that can handle 120Hz or higher. Their view of 60fps is from a VA panel or an older TV with slow g2g that blurs everything with ghosting anyway. People are also used to gaming on displays that might have 100ms of display lag, so with one frame of input delay in the game engine the difference works out to roughly 133ms at 30fps vs 117ms at 60fps. That might be hard to tell apart. But then someone with a monitor that has <10ms of display lag at 120Hz+ switches over to a 30fps game on a 60Hz signal, and they've gone from maybe 18ms of total lag to 50ms, and the change feels really, really bad.
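A quick sketch of that arithmetic (display lag plus one frame of game latency only; it ignores scanout, controller polling and deeper render pipelines, so treat the outputs as illustrative):

```python
# Rough input-lag arithmetic: display lag plus one frame of game latency.
# Ignores scanout, controller polling and deeper render pipelines, so the
# results are illustrative rather than measured.

def total_lag_ms(display_lag_ms: float, fps: float, frames_of_delay: int = 1) -> float:
    frame_time_ms = 1000.0 / fps
    return display_lag_ms + frames_of_delay * frame_time_ms

for display_lag, fps in [(100, 30), (100, 60), (10, 120), (10, 30)]:
    print(f"{display_lag}ms display @ {fps}fps -> ~{total_lag_ms(display_lag, fps):.0f}ms total")
```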

I have my console hooked up to a gaming monitor with really fast transition times, and judder is very apparent even at 60fps. Even 60fps looks rough without motion blur; 30fps is just terrible. I guess it's a case of people not knowing what they're missing. Ignorance is bliss. Enjoy 30fps.

Edit: Just going to leave this here https://www.testufo.com/
Try playing with the speed. Even at 120 pixels per second, 60fps is noticeably sharper than 30fps. At 240 pixels per second, 60fps is much better. And that's a fairly slow panning speed.
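To put numbers on those panning speeds, here is the simple sample-and-hold arithmetic for how far the image jumps (or smears, while your eye tracks it) per displayed frame; it ignores g2g transitions and strobing:

```python
# Per-frame step of a panning image on a sample-and-hold display. The step
# size is what reads as judder or smear while the eye tracks the motion.
# Simple arithmetic only; g2g transitions and strobing are ignored.

def step_per_frame_px(pan_speed_px_per_s: float, fps: float) -> float:
    return pan_speed_px_per_s / fps

for speed in (120, 240, 960):
    for fps in (30, 60, 120):
        print(f"{speed} px/s @ {fps}fps -> {step_per_frame_px(speed, fps):.1f} px per frame")
```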
 
Sadly none.

Only research thus far linked to games has been the Star Wars one and tests done for Trials back in the day (which they got to run at 60fps by other means).
Might have had some issues that didn’t allow it to progress further.
 
Just give people the choice. For fighting and driving games, sure, 60 fps default. But for other stuff that's traditionally been 30 fps just allow a choice.

For crossgen that would just set up faulty expectations for later. Something like Nioh has 60fps and 30fps options baked into it because Team Ninja by default had 60fps in mind when designing the game; the 30fps cap was just an arbitrary setting to push resolution higher, and it was clearly a far lower priority than 60fps for the team (as is customary for Team Ninja).

This type of game design by default means that you're designing the game for a higher FPS and hamstringing what devs would do with a 'true' 30fps design target for their games. It's not realistic unless you're accounting for it during development, and many devs won't, because you're still having to account for that handicap of 60fps.

There will be lots of 60fps games this gen, and for the first time on console (outside of VR), higher.

But it's not gonna change how console devs inherently have to work around a closed system to maximize design goals.

I also don't agree with the concept that "30fps is the FPS version of 480p". It's a silly comparison, because resolution standards always rise with GPU power and TV technology, leaving tons of headroom there, while CPU constraints don't: the CPU jump is always comparatively smaller, and the CPU is more inherent to how games are actually designed than the GPU.
 
@Inuhanyou The comparison is this: 960x480 is about as low as you can go for a 3D game. 30 fps is as low as you can go for a 3D game.

I think the comparison is pretty much spot on. It’s spatial resolution vs temporal resolution. If you’re at 4K30 you basically have amazing spatial resolution that’s destroyed by low temporal resolution. Conversely, you could do 480p240 and have terrible spatial resolution that’s temporally stable. 240Hz displays are rare and expensive, but 120Hz is becoming common in televisions. I’m not sure why spatial resolution should be prioritized so highly over temporal, especially since almost all games now allow you to control the camera.
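As a crude comparison of what those tradeoffs ask of the GPU, here's the raw pixel throughput of a few resolution/frame-rate combinations (naive fill arithmetic only: shading cost isn't linear in pixel count, and CPU cost scales with frame rate):

```python
# Naive pixels-per-second throughput for a few resolution / frame-rate combos.
# Fill arithmetic only; shading cost is not linear in pixel count and CPU cost
# scales with frame rate, so treat this purely as a rough comparison.

modes = {
    "4K30":     (3840, 2160, 30),
    "1440p60":  (2560, 1440, 60),
    "1080p120": (1920, 1080, 120),
    "480p240":  (854, 480, 240),
}

for name, (w, h, fps) in modes.items():
    print(f"{name:>8}: {w * h * fps / 1e6:6.1f} Mpix/s")
```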
 
While I'm not a 60 fps absolutist, I do think that too much emphasis is being placed on 4K over 60 fps. I would rather see 2K-60 than 4K-30 for most titles. That's my preference. Devs should have the flexibility to do what they want though.
 
As game engines move towards data driven and job based implementations, with GPUs also able to generate their own work, I think the pattern of 30 fps games we've seen this generation will have less excuse to exist.

It's a misconception that 60 fps must take twice the CPU power of 30 fps. That only appears to be true at the moment because most games are currently limited by a single thread, which makes doubling that thread's performance necessary no matter how many cores are available. The actual percentage of total CPU time dedicated to pulling all kinds of disparate data together for making draw calls should be dropping, not increasing.

If managing the rendering is only actually taking up 20% of your total CPU time at 30 fps in future, and you can spread that load effectively across many threads, then you can double the frame rate with a small and far more easily managed additional allocation of CPU time.
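As a rough sketch of that budget argument (all numbers hypothetical; it assumes simulation runs at a fixed tick rate decoupled from rendering, and that render submission parallelises cleanly across worker threads):

```python
# Toy model of the budget argument above. Assumes only render submission
# scales with frame rate, the simulation tick stays fixed, and the submission
# work spreads across worker threads. The 20% share is hypothetical.

RENDER_SHARE_AT_30 = 0.20  # render submission as a fraction of CPU time at 30fps

def relative_cpu_cost(fps: float, base_fps: float = 30.0,
                      render_share: float = RENDER_SHARE_AT_30) -> float:
    """Total CPU cost relative to running at base_fps."""
    sim = 1.0 - render_share                  # fixed-tick simulation work
    render = render_share * (fps / base_fps)  # submission scales with fps
    return sim + render

print(f"60fps:  {relative_cpu_cost(60):.2f}x the CPU of 30fps")   # ~1.20x, not 2x
print(f"120fps: {relative_cpu_cost(120):.2f}x the CPU of 30fps")  # ~1.60x
```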

It's going to be grotesquely interesting to watch something like a 3700X push games above 60 fps on PC, while console versions have no option but to run at 30 fps despite their more efficient APIs and filesystems, and their hardware SSD compression blocks.
 
*Chef's kiss towards this post*

I'm just imagining Doom Eternal running on a console with HDR at 120Hz on an OLED screen. Beautiful.
 
The Jaguars had the advantage of being balanced with the GPU in sharing bandwidth... only on the Pro, and even more on the One X, are the Jaguars really capping the consoles. I believe Zen 2 will mostly remain underutilized this gen: (1) because the same games need to run on PS4/One, and (2) because the CPUs steal bandwidth from the GPU (worst on PS5, of course).
 
What do you mean? When the little Jaguar uses bandwidth on the PS4, the memory bandwidth tanks disproportionately for the GPU. Doesn't seem balanced to me.
 
This is most probably inherent to every APU / SoC in existence that has a GPU and a CPU competing to access the same memory pool, not just the PS4.

The only exceptions I can remember are the Vita, where the GPU had exclusive access to the 128MB of Wide I/O while the CPU cores worked with the dual-channel LPDDR2, and the iPad 3's A5X, where the GPU had access to all 4 LPDDR2 channels whereas the CPU could only access 2 of them.
Funnily enough, both chips came out at about the same time, and both used 4 Cortex A9 cores and a PowerVR SGX543 MP4. I wonder if this was a recommendation from PowerVR themselves at the time.


EDIT: I just remembered the Broadcom SoCs that went into Nokia N8 and 808. I think the first had 32MB exclusive for the GPU and the second had 128MB.
 

Right, I only reference that because of the well-known, published charts of PS4 bandwidth that we've seen multiple times in discussions here.

I just don't see where the concept of the Jaguar being "balanced" comes from.
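To illustrate what "disproportionate" can look like on a unified-memory APU, here's a toy contention model; the peak bandwidth and penalty factor are placeholders, not the figures from the published PS4 charts:

```python
# Toy contention model for a unified-memory APU: every GB/s the CPU pulls
# costs the GPU more than 1 GB/s of effective bandwidth, because the memory
# controller loses efficiency arbitrating between the two clients.
# Peak bandwidth and penalty factor are placeholders, not measured values.

PEAK_BW_GBPS = 176.0  # placeholder unified-memory peak
PENALTY = 2.5         # placeholder GB/s of GPU bandwidth lost per CPU GB/s used

def effective_gpu_bw(cpu_bw_gbps: float) -> float:
    return max(0.0, PEAK_BW_GBPS - PENALTY * cpu_bw_gbps)

for cpu_bw in (0, 10, 20, 30):
    print(f"CPU using {cpu_bw:>2} GB/s -> GPU sees ~{effective_gpu_bw(cpu_bw):.0f} GB/s")
```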
 
UE5 Lumen is offering a very solid global illumination solution, and I am satisfied.
Proper ray-traced GI is impossible in this generation.

I really hope that we will see good GI in native 4K games without temporal artifacts ruining the image quality.

After this first demo I am quite positive that the next gen will be good enough, and not just an in-between generation caught in the middle of the ray tracing revolution.
 