Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
with Ray Tracing, sorting triangles goes away, and I suspect culling becomes more straightforward as well
Currently, the more visually captivating the game, the fewer onscreen actors there are; culling and sorting become a larger issue. With RT I wonder if it can push the upper limits here and allow some very complex scenes with layers and layers of NPCs etc in a dense scene, since RT by nature will always return the closest triangle.
There is no triangle culling. You use the same space traversal no matter the scene, no matter where objects are. You'll always return the closest triangle and have to cast secondary rays to trace beyond a triangle, increasing recursion, which is where RT can become insanely complex. Tracing through a cloud of particles will kill raytracing, but then that should force alternatives like computing a volumetric effect instead of splatting zillions of particles.
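To make the recursion cost concrete, here's a minimal, hypothetical tracer sketch (toy classes and names invented for illustration, not any real RT API): the traversal always returns the nearest hit, and seeing past a surface means spawning a secondary ray, so each partially transparent layer adds a recursion level.

```python
# Hypothetical toy tracer: the closest hit always wins, and tracing past
# it requires a secondary ray (recursion). All names are invented.

class Slab:
    """Toy object: an axis-aligned plane at z = z0, for rays along +z."""
    def __init__(self, z0, opacity, emission):
        self.z0, self.opacity, self.emission = z0, opacity, emission

    def intersect(self, origin, direction):
        if direction[2] <= 0:
            return None
        t = (self.z0 - origin[2]) / direction[2]
        return t if t > 0 else None

def closest_hit(scene, origin, direction):
    """Return (t, obj) for the nearest intersection, or None."""
    best = None
    for obj in scene:
        t = obj.intersect(origin, direction)
        if t is not None and (best is None or t < best[0]):
            best = (t, obj)
    return best

def trace(scene, origin, direction, depth=0, max_depth=4):
    """Each transparent surface costs another recursion level; a cloud
    of particles would exhaust max_depth almost immediately."""
    if depth >= max_depth:
        return 0.0
    hit = closest_hit(scene, origin, direction)
    if hit is None:
        return 0.0  # ray escaped the scene
    t, obj = hit
    color = obj.opacity * obj.emission
    if obj.opacity < 1.0:
        # Continue just past the hit point with a secondary ray.
        hit_point = [o + (t + 1e-4) * d for o, d in zip(origin, direction)]
        color += (1.0 - obj.opacity) * trace(scene, hit_point, direction,
                                             depth + 1, max_depth)
    return color
```

Note how the cost is driven entirely by how many layers a ray has to punch through, not by any sort order of the geometry.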

But 7TF sounds low, no matter how we talk about it.

If they push RT so heavily, could the low TF bottleneck shading?
I don't think so. We have 1.8 TF for PS4 that's having to draw a load of triangles. How much is 'left over' for shading pixels? With shading decoupled from defining geometry, you're looking at maybe 4x as much shader power, which is adequate for 4x the complexity at 1080p or PS4 shader quality at 4K. Coupled with beautiful RT lighting and adaptive shading LOD, that'd look very good I imagine.
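The 4x figure can be sanity-checked with rough arithmetic (a sketch only; the 7 TF number is the thread's rumour, not a confirmed spec):

```python
# Back-of-envelope check: 4K has exactly 4x the pixels of 1080p, so a
# rumoured 7 TF GPU gives roughly PS4-level per-pixel shading at 4K.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_4k    = 3840 * 2160   # 8,294,400 pixels

ps4_tf  = 1.8   # PS4 figure used in the post
next_tf = 7.0   # rumoured next-gen figure, not a confirmed spec

flops_per_px_ps4  = ps4_tf * 1e12 / px_1080p   # PS4 budget at 1080p
flops_per_px_next = next_tf * 1e12 / px_4k     # rumoured budget at 4K
ratio = flops_per_px_next / flops_per_px_ps4
print(round(ratio, 2))  # ≈ 0.97: nearly PS4's per-pixel budget, at 4K
```

So 7 TF at 4K is almost exactly the same FLOPs per pixel as 1.8 TF at 1080p, before counting any efficiency gains from not rasterizing geometry.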

While I agree with you on that, I have to mention that most users here failed to see the NEON signs that hardware RT was coming to consoles and everywhere else. I consider this a massive oversight that shouldn't have happened here; this is after all the best tech forum in the world, filled to the brim with veterans, but sadly most got lost on the train of thought of "open uncertainties".
Huh? What exactly was the discussion supposed to be?

"Could compute be enough for raytracing without needing dedicated RT hardware? What if it was a choice between 7TFs with RT and 15 TFs without?"
"Hush your mouth! Of course the consoles are going to have dedicated RT acceleration and there's no point discussing any other possibility!"

:unsure:
 
I don't think so. We have 1.8 TF for PS4 that's having to draw a load of triangles. How much is 'left over' for shading pixels? With shading decoupled from defining geometry, you're looking at maybe 4x as much shader power, which is adequate for 4x the complexity at 1080p or PS4 shader quality at 4K. Coupled with beautiful RT lighting and adaptive shading LOD, that'd look very good I imagine.
Arguable, because now you have to shade ray hit points in addition to what we see on screen directly. So, assuming 4 rays per pixel, tracing would increase shading 4x, not decrease it. Plus, no more cache-efficient coherent shading, because hit points are scattered across materials and textures.
For GI we can use a unified Lambert material for the hit points (Exodus even used one solid color per object), and for AO / shadows there is no shading at all, but for reflections a unified material would be less accurate than BFV or what we see with UE4.
Considering this, I would still assume increased shading costs of at least 2-3x, if console RT were powerful enough to do full raytraced lighting.
However, maybe they decided GI > reflections.

Also, with deferred rendering, shading was already decoupled from triangles, and shader cores are underutilized with rasterization, especially for shadow maps. So additional RT power can help reduce load on ROPs, if anything, but not so much on shading / compute cores. But ROPs have increased for Navi, so... 7TF still looks low to me in comparison to the 6TF Xbox One X.
 
Now that both are officially hardware RT, do we still have an angle to speculate one will have the RT circuitry closer to the gpu and the other closer to the cpu?
 
The cost of new nodes is always prohibitive at first; even NVIDIA abstained from 7nm until it's cheaper and the yields are better. A console is much more susceptible to these factors.

GPUs and consoles also have very different price concerns. For a GPU design, especially lower-volume ones, the setup costs are far more relevant than for a console, where they are just noise over millions of guaranteed sales. There the cost of the SOC itself might matter long term, though not so much short term.
 
Arguable, because now you have to shade ray hit points in addition to what we see on screen directly. So, assuming 4 rays per pixel, tracing would increase shading 4x, not decrease it.
You rarely have to trace all rays to the most accurate level, only when tracing perfect reflections and refractions. That's where shader LOD and approximations can save a large amount.

Plus, no more cache-efficient coherent shading, because hit points are scattered across materials and textures.
That's true, but for secondary rays just tracing light and diffuse colour, you can use broad approximations. You don't need to resolve fine displacement maps and SSS from an object that's either just reflecting some secondary light or producing a fuzzy reflection/refraction. So a classic wet-down street view won't need to fully shade all the reflections in the glass and on the street, and can just use simpler shaders. Shader LOD will no doubt have tiers for ray iteration as well as distance, say.
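That tiering idea could look something like this as a sketch (tier names and thresholds are invented for illustration):

```python
# Hypothetical shader LOD selection by ray depth and hit distance.
# Tier names and the 'far' threshold are made up for illustration.

TIERS = ["full_brdf",    # primary rays: displacement, SSS, full BRDF
         "simple_brdf",  # first bounce: base colour + roughness only
         "flat_albedo"]  # deeper bounces: single diffuse colour

def shader_tier(ray_depth, hit_distance, far=50.0):
    # Deeper rays and distant hits both push toward the cheap tiers.
    tier = ray_depth
    if hit_distance > far:
        tier += 1
    return TIERS[min(tier, len(TIERS) - 1)]
```

Primary rays get the full material, a first-bounce reflection drops to a cheap BRDF, and anything deeper collapses to a flat colour, much like the Exodus approach mentioned above.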
 
Personally, I find the possibility quite exciting. If it is only 7TFs and a considerable amount of RT hardware, it means Sony for one are going whole hog, and that might mean a true generational advance in games: not just in rendering, which would be prettier, but also in development, if devs don't need to keep refining hacks. Though cross-platform development will be a challenge.

Same here, I'm not one of those who are TF hungry. There's no reason to be; other advancements in tech are (much) more important imo. And expecting a high-end GPU in a 500 dollar console is expecting way too much. Yes, I want a 14TF Turing (or even Volta) in there, but it's not going to happen in 2020.


DF thinks 'it might not match up to what we have seen in Control on pc' about Ray Tracing.

But 7TF sounds low, no matter how we talk about it.

If they are 7TF of RDNA2 flops, it is not that low. PS4 was kinda meh in 2013; I doubt it will be worse this time around.
 
they now have the option to wait on latest tech?

That's kinda hard I think; Zen 3 will be out and possibly AMD's next GPU will be more advanced too. Not easy to keep up with the latest tech in a world where it's evolving all the time. I guess the PS5's specs are already finalized.

Then we have to consider the possibility that AMD is dedicating more hardware to RT than NVIDIA, a notion that I find hard to believe.
A more convincing possibility would be Sony customizing the GPU to handle more RT.

Compared to Turing maybe, but we don't know what nvidia's new GPU will do with RT. Hopefully even better than Turing's RT.

I certainly think that having no ray tracing was an option.

That can be said for every gen of hardware then, as it will always improve over the years to come.

Different in what way exactly? What you said leads pretty much to the same conclusion: many veterans here DISMISSED the idea completely based on whatever arbitrary factor they deemed reasonable: silicon cost, budget, performance, supposed superiority of compute RT, supposed superiority of rasterization, NVIDIA scamming people with professional tech, etc. Yet here we are, 12 months later, and all of those factors are gone in an instant.

The very fact that you thought cost or budget alone would prevent RT inclusion is a testament to how wrong, old and utterly flawed that line of reasoning was.

I consider what happened a test that many failed to pass, and there should be lessons learned from it. Like any other historical paradigm shift in the industry: people need to be truly open to the possibility of sudden shifts, and not stuck in the old ways of thinking or reasoning.

Dunno about this forum, but on neogaf people were against RT just because it was nvidia/pc; now that the PS5 is going to have it, it's all cool though. Not visiting those places anymore now :)
 
thought RT hardware in $400-$500 consoles was a bit too good to be true.

Actually it would have shocked me if they didn't have RT in some form. Of course they can fit in RT by sacrificing some TF or something else. I'd prefer a sub-10TF console with RT over a 10+ TF one without it. If I want the latest high-end tech, I'll get a PC.


https://www.eurogamer.net/articles/...DdtODlRvp6gVFN4K2noy_isrTgdCVAXa0syr8ym4TCG6k

How much space are games going to take in the future? Do these consoles come with 1TB or larger SSD's?
 
DF thinks 'it might not match up to what we have seen in Control on pc' about Ray Tracing.
Surprisingly a lot of modern cards can't run Crysis either ;)

I think a big part of the hardware story is also seeing the evolution of the software. Like Crysis: a lot of techniques hadn't been invented back then, so they made an attempt and brute-forced it. They expected hardware would go a certain route, but instead it changed, and "can it run Crysis" became a meme.

If any of this sounds familiar then ;) ... I'd say we give it time.
 
If they are 7TF RDNA2 flops it is not that low. PS4 was kinda meh in 2013, doubt it will be worse this time around.
PS4 compute perf was very impressive compared to NV GPUs from the same time, so PS4 was much faster than most gaming PCs for what I'm doing - it almost can only get worse :)
But we will see. Personally, I expected next gen on par with FuryX all along, and 14 TF fp16 could beat this easily. I agree it does not look bad.
Just 7TF at 600GB/s is really an interesting ratio...
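For a feel of that ratio, here's the pure arithmetic on the rumoured numbers (which are unconfirmed), compared against PS4's known figures:

```python
# Bytes of bandwidth available per FLOP: the rumoured 7 TF / 600 GB/s
# pairing versus PS4's 1.84 TF / 176 GB/s. Rumoured numbers, not specs.
def bytes_per_flop(bw_gbs, tflops):
    return bw_gbs * 1e9 / (tflops * 1e12)

ps4_ratio      = bytes_per_flop(176, 1.84)  # ≈ 0.096 B/FLOP
rumoured_ratio = bytes_per_flop(600, 7.0)   # ≈ 0.086 B/FLOP
```

So per FLOP the rumoured machine would actually have slightly less bandwidth than PS4, which is what makes 600 GB/s against only 7 TF an unusual-looking pairing.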
 
So an RT-capable GPU is confirmed, with, it seems, "just" 7 TF? OK, it's what I thought before... they don't want to create such a huge gap with PS4/Pro that it puts them out... at the same time they give developers ways to deliver the "next gen experience" and not a mere doubling of FPS (which will mostly be the main advantage, especially in first titles)... Where does the "7 TF" news come from?

Ok, now I can talk.

One of the two consoles will consist of:
- single soc: on basic 7nm process at TSMC, not 7nm+ or 6nm
- cpu: 8 Zen2 cores @ 2.2GHz with SMT enabled; they will call them "custom", but it will just be normal plumbing work to adapt the cache to the soc
- gpu: based on navi, hybrid between rdna and rdna2, @ about 7TF, frequency not finalized, hardware RT (no info if proprietary or amd's technology)
- memory: 24GB GDDR6 on a 384bit bus, bw not finalized, but expected at around 800GB/s
- ssd: 2TB soldered
- usb: 3.2 Gen. 2, type C, 1.5A
- size: comparable to the current Xbox One X
- new very low power mode with less than 2 seconds from power button push to dashboard
- controller: current gen's evolutionary iteration, rechargeable with off the shelf cables
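As a sanity check on the quoted memory line, GDDR6 bandwidth follows directly from bus width and per-pin data rate; the 14 and 16 Gb/s bins below are assumptions, since the post says bandwidth is not finalized:

```python
# GDDR6 bandwidth = bus width (bits) / 8 * per-pin data rate (Gb/s).
# The per-pin rates are assumed bins, not anything from the rumour.
def gddr6_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(gddr6_bandwidth_gbs(384, 16))  # 768.0 GB/s on a 384-bit bus
print(gddr6_bandwidth_gbs(384, 14))  # 672.0 GB/s with slower chips
```

So "around 800GB/s" on a 384-bit bus would need 16 Gb/s chips at minimum (768 GB/s), which is at the top end of GDDR6 bins; the claim is plausible but aggressive.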
Well, I've been one of the few saying PS5 will be on 7nm... that gives the opportunity to release earlier & cheaper... maybe even in the first months of 2020 (despite what's in the article)... So someone thinks the PS5 Pro will be released on 7nm+.

Thanks fehu....
 
Because, for a long time, I've had the same feeling: PS5 releasing early 2020 on 7nm... RDNA1 (or a hybrid) is better compatible with GCN, so (1) BC, (2) lower costs, (3) earlier release, (4) no need to create a monster that puts PS4/Pro out... they chose this way... 7 TF is probably a bit low... let's say 8.4 TF ;) will be the real figure.
 
Why are people buying into what fehu posted?
I'm wondering that myself.
The 2.2GHz CPU clock is especially glaring. Zen 2 at 3.2GHz is massively efficient. Why would Sony want to lose almost a third of potential CPU performance just to save what, 8 watts?
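For a rough sense of what a clock drop buys, dynamic power scales roughly with frequency times voltage squared; the voltage values below are invented illustrative numbers, not Zen 2 measurements:

```python
# Dynamic CMOS power ~ C * V^2 * f. Relative estimate only; the
# voltages are made-up illustrative values, not real Zen 2 figures.
def relative_power(freq_ghz, volts, base_freq=3.2, base_volts=1.0):
    return (freq_ghz * volts**2) / (base_freq * base_volts**2)

# Dropping 3.2 GHz -> 2.2 GHz, with an assumed voltage drop of 1.0 V
# -> 0.85 V, would cut dynamic CPU power roughly in half:
print(round(relative_power(2.2, 0.85), 2))  # ≈ 0.5
```

So whether the clock drop is worth it depends entirely on what the 3.2 GHz power actually is: halving 16 W saves little, halving 40 W is a real thermal budget.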


 
Because it is NOT needed...
This PS5 needs to not leave the PS4 behind... so a 3.2GHz CPU is way too much; better to save 8 watts (only 8 watts????)

Also because of bandwidth... too strong a CPU cuts into GPU bandwidth... the system needs balance. Please answer: why does the One X still have Jaguars?
 
maybe even in first months of 2020 (despite what's in the article)....
You're obviously entitled to your opinion, but it's an official statement. So I doubt they would say fall 2020 only to then say "psych, Q1 actually".

Because it is NOT needed...
This PS5 needs to not leave the PS4 behind...
The gap in CPU processing is big enough that it already leaves current gen behind, even at lower frequency. Lowering it for those reasons would just artificially limit it in the future.
Reasons could be to improve CPU yields, or they are power and heat constrained, so they will design their cooling and power circuitry around those limits.
Easy to forget that it's not just the CPU on the soc. Even desktop Zen CPUs aren't running way above 3.2GHz base.
So lower clocks seem not only reasonable, but likely sub-3GHz if that's what we get.
 
PS4 compute perf was very impressive compared to NV GPUs from the same time, so PS4 was much faster than most gaming PCs for what I'm doing - it almost can only get worse :)
But we will see. Personally, I expected next gen on par with FuryX all along, and 14 TF fp16 could beat this easily. I agree it does not look bad.
Just 7TF at 600GB/s is really an interesting ratio...

Compared to NV GPUs in some areas maybe, but compared to what was available, the PS4 was kinda low end. The 7970 GHz had been out for a year by PS4's launch. NV GPUs weren't that bad either, mostly outperforming AMD in most games. This time around we're getting a real CPU, and a more advanced GPU for the time (I think). But comparing "to what's available in the PC market" is pointless anyway, as much faster hardware will be there by that time or shortly after.
 