Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

That's exactly what's happening :) You think PS5 could be RDNA 1, or some hybrid still?

Actually, some are claiming Xbox is not really RDNA 2, but RDNA 1.9. Basically a custom RDNA 1 with RDNA 2 features, including Ray Tracing. But since A
How different could they be, though? Seriously. By the most obvious parsing of the quote, since it is part of a larger statement on AMD's support for ray tracing, AMD is providing the solution for both. Why would they come up with two different solutions to the same problem?

Just a theory to answer your question. A what if!

Here's AMD's roadmap!

[Image: amd_ray_tracing.jpg (AMD ray tracing roadmap)]


As you can see, RDNA 2 was always referred to as not having full Ray Tracing capabilities, or at least not the power for it (selected lighting effects). That would only happen by mixing RDNA 2 with the cloud... something Microsoft can do easily!

Now what if Sony, not having the same infrastructure, used an ASIC, a bit Nvidia style, to complement RDNA 2, skipping the cloud?

A secret sauce.

As I said, just a theory...
 
Now what if Sony, not having the same infrastructure, used an ASIC, a bit Nvidia style, to complement RDNA 2,

Wouldn't that require obscene amounts of bandwidth to ship the ray tracing data between GPU and ASIC, especially at 120 Hz?
 
Wouldn't that require obscene amounts of bandwidth to ship the ray tracing data between GPU and ASIC, especially at 120 Hz?
Not the real answer, but:
I think if I wanted to solve this problem I would only transmit the deltas.
So you have a copy of the BVH locally, and you have a copy of the BVH in the cloud. You transmit the vector of change, so both BVH trees are aligned.
Now, the update part is complete, time for intersection:

You have the cloud work on the non-coherent rays; that part is more latency forgiving.
Locally you work only on coherent rays, so the important stuff.
I guess you could send back all the intersected triangles, but this number seems large.
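To make the delta idea concrete, here is a minimal sketch assuming a hypothetical flat array of BVH nodes; the node layout, the byte sizes and the diff_bvh/apply_deltas helpers are illustrative assumptions of mine, not any real engine's or console's format.

```python
# Sketch of the "transmit only the deltas" idea: both sides keep a copy of the
# same BVH, and only nodes whose bounds changed since the last frame are sent.
# The node layout, sizes and helper names are illustrative assumptions only.
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class BVHNode:
    index: int
    bounds_min: Vec3
    bounds_max: Vec3

def diff_bvh(prev: List[BVHNode], curr: List[BVHNode]) -> List[BVHNode]:
    """Collect only the nodes whose bounds moved since the previous frame."""
    return [c for p, c in zip(prev, curr)
            if p.bounds_min != c.bounds_min or p.bounds_max != c.bounds_max]

def apply_deltas(remote: List[BVHNode], deltas: List[BVHNode]) -> None:
    """Patch the remote copy so both trees stay aligned."""
    for node in deltas:
        remote[node.index] = node

# Back-of-envelope: 10,000 changed nodes * ~28 bytes each, 120 times a second,
# is roughly 10,000 * 28 * 120 = ~34 MB/s upstream -- modest next to shipping
# every intersected triangle back down each frame.
```

Keeping the upstream to bounds-only deltas stays small; as noted above, it's the results coming back for the non-coherent rays that look like the expensive direction.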
 
To counterpoint that, Microsoft has also been doing RT R&D since that time and has patents too, so why would they opt for AMD's solution instead? Maybe to make it easier for developers and to be able to support it on the PC?
MS's RT was crap and failed and they needed AMD's to work, whereas Sony's is super awesome and better than everyone else's....obvs.
 
Wouldn't that require obscene amounts of bandwidth to ship the ray tracing data between GPU and ASIC, especially at 120 Hz?
Does it though? An expansive outdoor scene is largely indifferent to the effects of your locality, especially with regard to lighting. The success or failure will be in the ability/quality of blending your local sphere (and others) with the greater scene.
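Purely as an illustration of what that blending could look like, here is a minimal sketch that weights a locally traced near-field result against a cloud-computed far-field result by hit distance; the radius, the fade width and every name here are my assumptions, not anything announced.

```python
# Illustrative blend of a local (near-field) and a remote (far-field) lighting
# result by distance from the camera. Radius, fade width and names are assumed.
def smoothstep(edge0: float, edge1: float, x: float) -> float:
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def blend_lighting(local_rgb, remote_rgb, hit_distance,
                   local_radius=50.0, fade=20.0):
    """Favor the low-latency local result nearby, the remote result far away."""
    w_remote = smoothstep(local_radius, local_radius + fade, hit_distance)
    return tuple(l * (1.0 - w_remote) + r * w_remote
                 for l, r in zip(local_rgb, remote_rgb))

# A hit 60 m out falls inside the fade band, so the two results are mixed 50/50.
print(blend_lighting((1.0, 0.9, 0.8), (0.6, 0.6, 0.7), hit_distance=60.0))
```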
 
Wouldn't that require obscene amounts of bandwidth to ship the ray tracing data between GPU and ASIC, especially at 120 Hz?

Nvidia, with RT, takes a performance hit of 40% staying at max at 1440p 60 Hz... Why are you assuming these consoles' RT can go to 4K 120 Hz?

And will 120 Hz be used besides VR?
How big is the market for 4K 120 Hz TVs? Single digits at most; 0% would not be far off!
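To put rough numbers on that skepticism, reading the quoted 40% as a 40% frame-rate drop with RT enabled (illustrative arithmetic, not a benchmark):

```python
# Reading "a 40% performance hit" as the frame rate dropping 40% with RT enabled.
base_fps = 60
rt_fps = base_fps * (1 - 0.40)             # 36 fps with RT on

base_frame_ms = 1000 / base_fps            # ~16.7 ms per frame without RT
rt_frame_ms = 1000 / rt_fps                # ~27.8 ms per frame with RT
extra_rt_ms = rt_frame_ms - base_frame_ms  # ~11.1 ms added by RT alone

budget_120hz_ms = 1000 / 120               # ~8.3 ms total per frame at 120 Hz
print(f"RT alone adds ~{extra_rt_ms:.1f} ms; the whole 120 Hz budget is {budget_120hz_ms:.1f} ms")
```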
 
MS's RT was crap and failed and they needed AMD's to work, whereas Sony's is super awesome and better than everyone else's....obvs.
That might be taking it too far.
The main difference between Sony and MS in this case is that MS needs to cater to more than one platform while Sony needs to cater to only one. What advantages that comes with is beyond me.
 
Nvidia, with RT, takes a performance hit of 40% staying at max at 1440p 60 Hz... Why are you assuming these consoles' RT can go to 4K 120 Hz?

And will 120 Hz be used besides VR?
How big is the market for 4K 120 Hz TVs? Single digits at most; 0% would not be far off!
Unfortunately, Nvidia is only a single data point in an entirely new rendering method. We've never had an RT performance shootout, so I'm not sure those numbers even apply.
 
Until we get a full in-depth analysis, I'm open to both PS and Xbox having custom AMD RT hardware.
That wouldn't stop AMD saying it's Radeon/RDNA etc.
The level of customization, as always, is what matters, if any at all.
It's not all or nothing.
 
The biggest takeaway for me is that RT and VRS are not the only defining points for RDNA 2.0.

As writers have cited earlier, RDNA 1.0 was a hybrid of RDNA and GCN technology. That is what the architecture is.
RDNA 2.0 is a full departure from GCN.
This is the reason why I was riding the whole 1.9 thing so badly.
It's either a complete departure or it's not.
Blocks customized or not, the critical aspect is the step away from GCN.

If there is any reason to believe that PS5 could hit 2 GHz, it's only because it's RDNA 2.0 and we don't know or understand how high that architecture can clock.
We know very little of what RDNA 2.0 is capable of.
 
RDNA 2.0 is a full departure from GCN.
This is the reason why I was riding the whole 1.9 thing so badly.
I'm not sure if it is a full departure or not; people will probably debate it. But it definitely has to be a big improvement, if only on the power/heat/frequency side.

Like you I'm eager to hear about RDNA 2 as a whole.
 
How is that a counterpoint? MS came up with DXR without AMD's solution. Everything points to AMD being late to the party here, and it wouldn't be surprising for no one to be following AMD's lead, as they are not leading anything. Why are they using AMD's solution? Cost would be my most likely reason. That may be the determining reason for Sony too. We'll see, maybe as soon as today.

I actually don't agree with the idea that AMD is late. RTX was largely derided and became a meme. Real-time RT is in its most primitive early days. No one is late. It's just getting started. I'd also fully expect that AMD was aware of DXR from the very beginning, especially now that we know AMD will be supplying the RT hardware for Series X, which uses the DXR API.
 