Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

Who took which path, sorry?

Sony took the path of insufficient bandwidth, or Microsoft took the path of customisations?
The path of Sony choosing not to add more memory bandwidth, and instead making up the shortfall through checkerboarding, reconstruction and rapid packed math.
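For anyone unfamiliar with rapid packed math: the idea is simply that two 16-bit operations issue in the slot of one 32-bit operation. A minimal sketch of the pattern, written in CUDA purely for illustration (console shaders would express this in their own shading languages; the kernel and names here are hypothetical):

```cuda
// Hypothetical example: scaling a color buffer stored as packed FP16 pairs.
// Each __hmul2 multiplies both 16-bit halves in one instruction, so a
// packed-math GPU gets roughly twice its FP32 rate on FP16 work.
#include <cuda_fp16.h>

__global__ void scale_fp16_pairs(const __half2* in, __half2* out,
                                 __half2 gain, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = __hmul2(in[i], gain);  // two FP16 multiplies per instruction
    }
}
```

The catch is that FP16 only helps where its precision suffices (color math, filters), which is why it tends to show up in reconstruction passes rather than everywhere.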
 
Btw, I finished counting: 200 out of 454 titles are below 1800p. That's about 44%, nearly half.

If I may chime in:

[image: resolution chart from the NeoGAF thread linked below]


https://www.neogaf.com/threads/what-games-run-on-native-4k-on-xbox-one-x.1461879/
 
Who took which path, sorry?

Sony took the path of insufficient bandwidth, or Microsoft took the path of customisations?
Sony took the path of $399 in 2016. There is zero indication anything better was possible at the time for that price. Some customizations were required, and they helped. To decide whether it was the right move, we have to look at sales, not pixel counts.

What is this whole tangent about?
 
Heh, they edited the interview; it now says "dedicated hardware" instead of "dedicated cores".

Sounds more like what we expect, doesn't it?
 
What surprised me about the X1X profiling is that I would have expected AMD to do that kind of extensive profiling themselves, to see why and where their designs underperform. It's as if AMD just designs top-down, without extensive evaluation of what real code needs.

Different markets for different products. The XBO-X arguably has one task and one task only: play games well.

AMD GPUs have to be robust enough to cope with many, MANY tasks outside of games. It is entirely possible that customizations MS requested for the XBO-X GPU might have negative performance implications for things outside of gaming. Similar to how an HDD or SSD with gaming-specific performance tuning in its firmware can and will perform non-optimally in, say, a server environment. And likewise, a drive that performs optimally in a data-server environment will often not even hit middle-of-the-pack performance in everyday PC use.

Regards,
SB
 
So, what should be considered fully bespoke or custom wares?
I don't think it needs clarifying as a usable term, even if it could be. Like 'more powerful', clarification can be provided for the particular topic with better-specified terms. If arguing the likelihood of one company using proprietary hardware for RT versus another, one should talk about the types of modifications each has integrated in the past, without needing to label them partial or full customisations.

AFAICS, going by all their consoles, both are equally positioned to integrate proprietary or advanced RT tech (AMD R&D not due for release in the PC space until a year or two after the console launch) over and above the off-the-shelf base GPU architecture. Heck, the XB1 has ESRAM, which no off-the-shelf AMD GPU uses, so I'm not sure what Globby was getting at.
 
Because they're running at the same framerate.
You're interested in defending the 4Pro; I'm not interested in downing it.
Nonsense. I was interested in getting proof of your claims.
I now got it, so I admit I was wrong.



Looking back at this thread, it seems to be because devs prefer to use the ID buffer and FP16 calculations for reconstruction techniques at significantly lower resolutions, instead of cranking up the resolution like they do on the XBoneX. Most probably, the ICE team already included in the SDK some high-level tools to apply CB and other methods.
While DF usually does say the game looks better on the 1X, in almost all face-offs they claim the difference isn't clear to the untrained eye, so the Pro's method seems valid considering the fairly large difference in available RAM for the devs (framebuffer, shadowmaps, textures, etc.).
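To make the ID-buffer point concrete: the buffer stores a per-pixel object/primitive ID, which a reconstruction filter can use to avoid blending across silhouettes. A deliberately simplified, hypothetical CUDA sketch of the checkerboard fill step (real console implementations live in shader code and also reproject the previous frame; this assumes the ID buffer was laid down at full output resolution while color was only shaded at checkerboard positions):

```cuda
#include <cstdint>

__device__ float3 f3_add(float3 a, float3 b) { return make_float3(a.x + b.x, a.y + b.y, a.z + b.z); }
__device__ float3 f3_scale(float3 a, float s) { return make_float3(a.x * s, a.y * s, a.z * s); }

// color[] holds valid shading only at this frame's checkerboard positions;
// id[] holds a full-resolution object ID per pixel (the assumption above).
__global__ void cb_fill(const float3* color, const uint32_t* id,
                        float3* out, int w, int h, int frame_parity)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int i = y * w + x;

    if (((x + y) & 1) == frame_parity) { out[i] = color[i]; return; }  // shaded this frame

    // Gap pixel: its left/right neighbours were shaded this frame. Average
    // only the ones whose ID matches, so object edges stay sharp.
    float3 acc = make_float3(0.f, 0.f, 0.f);
    int cnt = 0;
    if (x > 0 && id[i - 1] == id[i])     { acc = f3_add(acc, color[i - 1]); ++cnt; }
    if (x + 1 < w && id[i + 1] == id[i]) { acc = f3_add(acc, color[i + 1]); ++cnt; }
    out[i] = cnt ? f3_scale(acc, 1.f / cnt)
                 : (x > 0 ? color[i - 1] : color[i + 1]);  // ID mismatch: just copy
}
```

The ID test is what lets this approach hold up at edges, where naive interpolation would shimmer.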


My guess is that for the typical 50-65" 4K TV there might be little discernible difference between the Pro's reconstructed 4K and the 1X's full resolution, but for people using >65" TVs at close-ish range the difference might be large.
 
Nonsense. I was interested in getting proof of your claims.
I now got it, so I admit I was wrong.
Fair enough, I was projecting.

While DF usually does say the game looks better on the 1X, in almost all face-offs they claim the difference isn't clear to the untrained eye, so the Pro's method seems valid considering the fairly large difference in available RAM for the devs (framebuffer, shadowmaps, textures, etc.).
My guess is that for the typical 50-65" 4K TV there might be little discernible difference between the Pro's reconstructed 4K and the 1X's full resolution, but for people using >65" TVs at close-ish range the difference might be large.
At a static camera position, reconstruction is going to get you pretty much native quality. In motion it's going to be noticeable, but it's masked by:
a) motion resolution being a thing on TVs (and most people not realizing how bad their TV sets actually are)
b) lower framerates/refresh rates
c) added motion blur
d) viewing distance from the TV
e) lack of HDR calibration, sub-optimal lighting for seeing small details, glare, etc.
f) just someone having poor vision in general

edit: but most importantly, visual acuity is really about resolving fine detail. From far away that means seeing things you couldn't normally see (impacted by draw distance); up close it really comes down to being able to make out the fine textures of something. And unless content creators are designing textures with that level of fidelity in mind, the benefit is going to be smaller. Thus I totally understand the argument of not seeing the difference.
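To put rough numbers on the viewing-distance point, assuming the standard ~1 arcminute of 20/20 acuity and a 65-inch 16:9 UHD panel (back-of-envelope, not a vision-science claim):

```latex
% Pixel pitch of a 65'' 16:9 UHD panel, and the distance beyond which one
% pixel subtends less than one arcminute (roughly 20/20 acuity).
\[
p = \frac{65'' \cdot \frac{16}{\sqrt{16^2 + 9^2}}}{3840} \approx 0.37\ \text{mm},
\qquad
d = \frac{p}{\tan(1')} \approx \frac{0.37\ \text{mm}}{2.91 \times 10^{-4}} \approx 1.3\ \text{m}
\]
```

So from beyond roughly 1.3 m, individual native-4K pixels on a 65" set already sit below the acuity limit, which is consistent with the "can't tell from the couch" experience.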

It's really hard to see the difference at UHD resolution unless you have a TV of high enough grade to show the artefacts, paired with an optimal viewing setup. The only other way is to do a side-by-side comparison, which no one ever does.
 
At a static camera position, reconstruction is going to get you pretty much native quality. In motion it's going to be noticeable, but it's masked by:
a) motion resolution being a thing on TVs (and most people not realizing how bad their TV sets actually are)
b) lower framerates/refresh rates
c) added motion blur
d) viewing distance from the TV
e) lack of HDR calibration, sub-optimal lighting for seeing small details, glare, etc.
f) just someone having poor vision in general

It's really hard to see the difference at UHD resolution unless you have a TV of high enough grade to show the artefacts, with a viewing setup that lets you enjoy that quality.


Which is why reconstruction will most probably be a huge part of next-gen.
 
Nonsense. I was interested in getting proof of your claims.
I now got it, so I admit I was wrong.



Looking back at this thread, it seems to be because devs prefer to use the ID buffer and FP16 calculations for reconstruction techniques at significantly lower resolutions, instead of cranking up the resolution like they do on the XBoneX. Most probably, the ICE team already included in the SDK some high-level tools to apply CB and other methods.
While DF usually does say the game looks better on the 1X, in almost all face-offs they claim the difference isn't clear to the untrained eye, so the Pro's method seems valid considering the fairly large difference in available RAM for the devs (framebuffer, shadowmaps, textures, etc.).


My guess is that for the typical 50-65" 4K TV there might be little discernible difference between the Pro's reconstructed 4K and the 1X's full resolution, but for people using >65" TVs at close-ish range the difference might be large.

Hell, most developers aren't even using the ID buffer and FP16 on the PS4-P. It's mostly just Sony, and occasionally some other developer will do something with them.

Regards,
SB
 
Hell, most developers aren't even using the ID buffer and FP16 on the PS4-P. It's mostly just Sony, and occasionally some other developer will do something with them.

Regards,
SB
Most developers probably aren't, but it seems most developers of 3rd-party AAA games are.
DICE's own presentation states how the ID buffer is used for their in-house reconstruction techniques, and how using FP16 on the Pro makes the reconstruction run 30% faster.
 
What went so wrong with the Pro anyway?
Why did so few games hit the 1800p "target"?

Who said something went wrong with the Pro?
The "target" was never a specific rendering resolution, it was an image output that is adequate for 4K TVs.
And for that, the Pro was quite successful.


If you think native 2160p will be the target for the next generation consoles along with ray tracing and higher framerates, you'll be really disappointed.
 
Which is why reconstruction will most probably be a huge part of next-gen.

Indeed
Digital Foundry: With image reconstruction techniques becoming more prevalent, what is your opinion of native resolution rendering on systems with set hardware like in consoles?

The Coalition: We love running a game at native resolution and the simplicity and purity of that - but we feel like this will become less and less common, especially if we begin to see 8K games next generation. Temporal reconstruction techniques offer cheap performance returns allowing for higher quality visual systems with very hard to perceive visual quality loss - especially at higher resolutions. Put another way: pixel counting was a very useful thing a few years ago with games like Gears of War 4, as it would give you a clear indication of image clarity. With games that use reconstruction now I think pixel counts are an interesting thing to make note of but isn't necessarily that indicative of your final output clarity unless you are seeing some extremely lower-than-native resolutions.
https://www.eurogamer.net/articles/digitalfoundry-2019-gears-5-tech-interview
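For reference, here is what "temporal reconstruction" means mechanically: keep last frame's output, reproject it with per-pixel motion vectors, and blend it with this frame's (often lower-resolution) samples. A hypothetical, heavily simplified CUDA sketch of that accumulation step (production TAA/upsamplers add neighbourhood clamping, jittered sampling, filtered history fetches and more; all names here are made up):

```cuda
__global__ void temporal_accumulate(const float3* current,  // this frame's shaded samples
                                    const float3* history,  // last frame's accumulated output
                                    const float2* motion,   // per-pixel motion vectors, in pixels
                                    float3* out, int w, int h, float alpha)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int i = y * w + x;

    // Follow the motion vector back to where this pixel was last frame
    // (nearest-neighbour fetch; real code filters and validates the sample).
    int px = min(max(x - (int)roundf(motion[i].x), 0), w - 1);
    int py = min(max(y - (int)roundf(motion[i].y), 0), h - 1);
    float3 prev = history[py * w + px];

    // Exponential blend: lower alpha keeps more history (stabler, but softer).
    float3 cur = current[i];
    out[i] = make_float3(alpha * cur.x + (1.f - alpha) * prev.x,
                         alpha * cur.y + (1.f - alpha) * prev.y,
                         alpha * cur.z + (1.f - alpha) * prev.z);
}
```

This is exactly why pixel counting gets murky: each output pixel is a weighted mix of several frames' samples, not one frame's render.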
 
Since this is supposed to be the hardware prediction thread, I will make a very bold hardware prediction.

PS5 will have an ID buffer and FP16 RPM.

Anyone disagree?
 