Next Generation Hardware Speculation with a Technical Spin [2018]

75 and 144 Hz are PC-centric refresh rates. Do 75 Hz monitors even exist anymore? TVs are either 60 or 120 Hz, with various optional interpolation methods for faking anything higher.

You'll only get multiples of 30 fps on console, and for good reason. If ultra-high refresh rates are your thing, just stick with PC.
75 Hz monitors are still a thing. They are pretty common among 1440p and ultrawide models, as well as many lower-end FreeSync displays. The Xbox One S and X already support higher refresh rates and VRR, so I don't have to stick with PC.
 
What proportion of consoles are attached to monitors? As LB says, it's a silly discussion. 1) Consoles are designed for TVs. 2) They can't even hit a frickin' 60 fps! Anything higher than that is gravy, but the refresh rate has zero bearing on this discussion. With a suitable HDMI port and support for VRR, consoles will support whatever framerates the devs and TVs allow.

Point of fact: the consoles will not be designed for 4K or higher framerates. They'll just be more power for the devs to choose what to do with, like every console generation ever. So let's stop talking about framerates and resolutions in this thread! (Unless you have something pertinent, like the required eDRAM capacity for a given resolution and G-buffer size when advocating an esoteric design.)
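
To illustrate the kind of sizing math that is on-topic here, a rough sketch in Python; the G-buffer layout below (four RGBA8 render targets plus a 32-bit depth/stencil buffer, no MSAA) is a made-up example, not any shipping engine's:

# Hypothetical G-buffer sizing: how much on-chip memory a deferred
# renderer would need to hold its render targets at a given resolution.
width, height = 1920, 1080
color_targets = 4                          # assumed: 4x RGBA8 render targets
bytes_per_pixel = color_targets * 4 + 4    # plus a 32-bit depth/stencil buffer
gbuffer_bytes = width * height * bytes_per_pixel
print(f"{gbuffer_bytes / 2**20:.1f} MiB")  # ~39.6 MiB: already larger than the
                                           # 360's 10 MiB eDRAM or the Xbox One's
                                           # 32 MiB ESRAM, hence tiling tricks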
 
75 Hz monitors are still a thing. They are pretty common among 1440p and ultrawide models, as well as many lower-end FreeSync displays. The Xbox One S and X already support higher refresh rates and VRR, so I don't have to stick with PC.
VRR is great, but you're still limited to 60 fps max on console, so yeah, it's PC or nothing. You do game on PC, right?
 
Well, if VRR becomes a thing, then refresh rates like 24, 40, 48, 50, 72, and 75 Hz all become possible on monitors with VRR support, no?

If there is sufficient bandwidth, yes. What resolution and colourspace are you talking about? If the hope is to push 10/12-bit HDR 4K at super-high framerates, possibly with some uncompressed audio, you'll want to see whether the HDMI connection can even carry that much data.
 
I don't see framerates being limited to 60 fps, not when next-gen consoles will continue to support VRR (variable refresh rate) and offer VR (virtual reality) experiences. You want to target 90 fps for VR.

Pretty much what I've been thinking. With PSVR games already somewhere between 60 fps reprojected and 120 fps native, I don't see why higher framerates won't proliferate.

If there is sufficient bandwidth, yes. What resolution and colourspace are you talking about? If the hope is to push 10/12-bit HDR 4K at super-high framerates, possibly with some uncompressed audio, you'll want to see whether the HDMI connection can even carry that much data.

I can't quite remember, but I think HDMI 2.1 has more than enough bandwidth. Enough for 8K60, IIRC.
 
8K 60 Hz and 4K 120 Hz max. That requires full HDMI 2.1 hardware (including a certified cable) end-to-end. Some folks were talking about 144 Hz, which you won't get at 4K.
Correct, but only if you assume full 4:4:4 chroma with 12-bit color. 4:4:4 isn't required for consoles, and we're only at 10-bit color with current HDR. The link is limited to 48 Gbps total.

[Attached image: HDMI version/bandwidth table]
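
To put some numbers behind that table, a back-of-the-envelope sketch in Python. It assumes CTA-861-style blanking for 4K120 (4400x2250 total pixels) and HDMI 2.1's 16b/18b FRL line coding; treat the outputs as approximations rather than spec figures:

def hdmi_bitrate_gbps(h_total, v_total, refresh_hz, bits_per_channel, chroma):
    # Effective channel count for common chroma subsampling modes.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    raw = h_total * v_total * refresh_hz * channels * bits_per_channel
    return raw * 18 / 16 / 1e9  # assumed 16b/18b FRL line-code overhead

for bits in (8, 10, 12):
    rate = hdmi_bitrate_gbps(4400, 2250, 120, bits, "4:4:4")
    verdict = "fits" if rate <= 48.0 else "exceeds"
    print(f"4K120 4:4:4 {bits}-bit: {rate:.1f} Gbps ({verdict} 48 Gbps)")
# 8-bit ~32.1 and 10-bit ~40.1 Gbps fit; 12-bit ~48.1 Gbps just exceeds the
# 48 Gbps cap, matching the "only with 12-bit 4:4:4" caveat above.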
 
Pretty sure that when Xbox decided to support 120 Hz, they meant it for lower resolutions. My TV does 4K60, but when I ask it to do 120 Hz, it can only manage 1080p.

Any 4K60 title on the 1X should be able to do 1080/120, assuming the CPU is not a limiter!

This bodes well for further support next gen for those who desire it, on top of FreeSync and automatic DSR. It's entirely up to developers to decide how they want to support their titles.
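
For what it's worth, the raw pixel-throughput arithmetic backs that up (GPU side only; CPU cost scales with frame rate rather than resolution, hence the caveat above):

# Shaded-pixel throughput comparison; ignores everything that doesn't
# scale with resolution (geometry, CPU-side simulation, draw submission).
px_4k60     = 3840 * 2160 * 60    # ~498 Mpix/s
px_1080p120 = 1920 * 1080 * 120   # ~249 Mpix/s
print(px_1080p120 / px_4k60)      # 0.5 -> half the per-second pixel work,
                                  # but twice the per-second CPU frames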
 
Since we're on the topic of framerates, I do wonder if motion interpolation might be worth including in the next consoles?

Many PSVR games render at 60 fps, which is then reprojected to 120 fps. Many TVs contain motion interpolation hardware, but it adds lag, and the same is true of TV-side upscalers. A hardware upscaler introduces less lag when it sits at the end of the console's output path rather than in the display.

So, rather than including motion interpolation hardware in an external box, solely for VR use, why not stick it in the console itself for anyone to use however they see fit?
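
As a strawman for what such a block would do, a deliberately naive sketch: pure linear blending between rendered frames. Real motion interpolation (and PSVR's reprojection) is motion-compensated, warping pixels along motion vectors rather than cross-fading, so this only shows where the step would sit in the pipeline:

import numpy as np

def midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    # Synthesize a frame halfway between two rendered frames by averaging;
    # widen to uint16 first to avoid overflow in 8-bit math.
    mid = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mid.astype(np.uint8)

# 60 fps in -> 120 fps out: for rendered frames A, B, C, ...
# emit A, midframe(A, B), B, midframe(B, C), C, ...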
 
Is there any reason for this? Or is it just that no one currently uses it in the console space?
4:4:4 is only required for sharp text rendering in PC environments; you can get away with less for other content. UHD Blu-ray is 4:2:0, for example, I believe.

Here’s a good discussion on the topic: https://www.avsforum.com/forum/166-...ally-important-pc-gaming.html#/topics/2322113

My takeaway is that I want 60 fps first, 10-bit or greater second (not just for HDR: it also helps with banding), 4K native third, and only then mess with chroma if you can hit all those targets. That being said, I'm fine with checkerboarding or other temporal injection techniques. Generally I've been pretty happy with devs deciding what looks best and is balanced performance- and visuals-wise. I just hope they don't lean on VRR in lieu of 60 fps.
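
For the curious, a minimal sketch of what 4:2:0 does to the chroma planes, and why single-pixel color edges (i.e. PC text) suffer while film and game content mostly doesn't:

import numpy as np

def subsample_420(chroma: np.ndarray) -> np.ndarray:
    # Average each 2x2 block of a chroma (Cb or Cr) plane down to one
    # sample; luma (Y) stays at full resolution, so only color detail
    # is lost. Result: a quarter of the original chroma data.
    h, w = chroma.shape
    cropped = chroma[: h // 2 * 2, : w // 2 * 2]
    return cropped.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))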
 
I was wondering if you wanted 120 fps so badly on console because you didn't have a PC or something.

And you can do a lot worse than the Sega Saturn, though!
No, I have a PC and a 144 Hz monitor. It's one of those things I assumed wasn't a big deal until I got one, and it was worlds better, even in games I play with a controller. I'm not PC-only, though. I have a One X, PS4 Pro, 360, PS3, and most of the older consoles. The Saturn is my favorite, though, even if it's objectively a pretty poor piece of hardware with a mostly lackluster library.
 
OK, from ResetEra:

https://www.resetera.com/threads/ps...ion-post-e3-2018.49214/page-184#post-13759967

Basically, a new "insider" is claiming that the PS5 (edit: thanks milk! - argh!) was delayed because it did not originally include backwards compatibility. There is not, to my knowledge, any confirmation that he actually is an insider. Assuming this is true, two more questions:

If backwards compatibility was going to be broken despite the APU again pairing an AMD x86 CPU with an AMD GPU, what, if anything, can be read into the fact that the basic design breaks backwards compatibility? Does this tell us anything about Navi?

I believe AMD announced a short time ago that their first 7 nm product had "taped out". Is this not almost certainly a cell-phone component rather than a console APU? And what I believe are called "beta devkits" can't go out until the APU is actually being produced in some fashion, however low the volume/yield. Correct?
 