Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]


[Attached image: gpuroadmap.png]
 
Sony and Microsoft are using the Navi architecture (7nm), along with some specific customizations to the CUs for handling RT acceleration in their respective architectures. These systems will be mid-range PC-equivalent products with great SSD performance. Anyone expecting RTX 2080/Ti high-end performance within the console space is fooling themselves. All of a sudden, wattage/TDP aren't a thing... o_O

I can't wait until silly season is over.
 
Sony and Microsoft are using the Navi architecture (7nm), along with some specific customizations to the CUs for handling RT acceleration in their respective architectures. These systems will be mid-range PC-equivalent products with great SSD performance. Anyone expecting RTX 2080/Ti high-end performance within the console space is fooling themselves. All of a sudden, wattage/TDP aren't a thing... o_O

I can't wait until silly season is over.

It’s interesting that people are asserting there’s a thermal/power limit without articulating why. There isn’t some magical threshold. The X1X cools ~160W quite well. Why can’t things just scale linearly?
 
It’s interesting that people are asserting there’s a thermal/power limit without articulating why.

The primary reason for TDP management is cost.
Everything costs more as power needs go upward, from the cooling to the board components that need to feed the APU.

For consumer boxes kept in tight spaces, a high TDP also puts the unit at a higher risk of failure, especially if it's stored somewhere with poor air circulation.

Last but not least: the living-room goal is a quiet device, so that you hear the game and not the device humming away.

TDP can be mitigated by going wide but slow, but then the whole pipeline needs to be beefed up across the board. Bottlenecks unfortunately aren't necessarily hit the same way, and how differently each engine utilizes the hardware makes it a challenge to fix bottlenecks for all engines. It's much easier to increase the clock speed.
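The wide-but-slow trade-off can be sketched numerically. Dynamic power scales roughly with active units × frequency × voltage squared (P ∝ N·f·V²), and higher clocks generally require higher voltage, so a wider, slower GPU can hit the same FP32 throughput for noticeably less power. All CU counts, clocks, and voltages below are illustrative assumptions, not leaked specs:

```python
# Rough sketch of the wide-vs-fast trade-off (illustrative numbers only).

def fp32_tflops(cus: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: 64 shaders/CU * 2 FLOPs/cycle (FMA)."""
    return cus * 64 * 2 * clock_ghz / 1000

def relative_dynamic_power(cus: int, clock_ghz: float, volts: float) -> float:
    """Dynamic power scales roughly as units * f * V^2 (arbitrary units)."""
    return cus * clock_ghz * volts ** 2

# Two hypothetical configs with near-identical compute:
narrow = fp32_tflops(36, 2.0)   # 36 CUs at 2.0 GHz -> ~9.2 TF
wide   = fp32_tflops(52, 1.4)   # 52 CUs at 1.4 GHz -> ~9.3 TF

# Assume the 2.0 GHz part needs ~1.20 V while 1.4 GHz runs at ~0.95 V:
p_narrow = relative_dynamic_power(36, 2.0, 1.20)
p_wide   = relative_dynamic_power(52, 1.4, 0.95)

print(f"narrow: {narrow:.2f} TF, dynamic power ~{p_narrow:.0f} units")
print(f"wide:   {wide:.2f} TF, dynamic power ~{p_wide:.0f} units")
# Under these assumptions the wide config delivers the same TF for roughly
# a third less dynamic power, at the cost of more silicon area (and the
# bottleneck caveats above).
```

The catch is exactly the one in the post: the extra CUs are wasted silicon if the rest of the pipeline and the engine can't keep them fed.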
 
Sony and Microsoft are using the Navi architecture (7nm), along with some specific customizations to the CUs for handling RT acceleration in their respective architectures. These systems will be mid-range PC-equivalent products with great SSD performance. Anyone expecting RTX 2080/Ti high-end performance within the console space is fooling themselves. All of a sudden, wattage/TDP aren't a thing... o_O

I can't wait until silly season is over.

AMD stated months ago in their slideshow that "RDNA2 is used in next gen consoles" among other things, so unless they lie, that is what we get.

But anyway, expecting 2080 performance can mean at least two things:

A) literally the same teraflops/RT performance in a low-wattage console (unlikely)

B) similarly performing/looking games from less raw performance, thanks to the unified hardware (at least on PS5, with one SKU)

Aka optimism about better optimization.

I would bet my money on B): wait 3-5 years into the gen and the PS5 will output graphics similar to a 2080 now/then. If we're lucky.
 
AMD stated months ago in their slideshow that "RDNA2 is used in next gen consoles" among other things, so unless they lie, that is what we get.

They also stated 7nm (not 7nm+) for next-gen consoles, didn't they? Most likely the console GPUs are hybrid RDNA/RDNA2 designs, given the timeframe etc.

Aka optimism about better optimization.

I would bet my money on B): wait 3-5 years into the gen and the PS5 will output graphics similar to a 2080 now/then. If we're lucky.

The highest-end chip AMD has now is the 5700 XT, and that level of raw power most likely won't be in consoles. Even if it is, it's quite far away from a 2080. I don't think it's really realistic, especially today, when optimizations have gotten better across the board.
 
They also stated 7nm (not 7nm+) for next-gen consoles, didn't they? Most likely the console GPUs are hybrid RDNA/RDNA2 designs, given the timeframe etc.



The highest-end chip AMD has now is the 5700 XT, and that level of raw power most likely won't be in consoles. Even if it is, it's quite far away from a 2080. I don't think it's really realistic, especially today, when optimizations have gotten better across the board.
Quite far away? Are you confusing 2080 with 2080ti?
 
Consoles won't come close to a 2080 in raw performance, no.
Because you don't want them to?
Look at Battlefield and Forza Horizon 4...
i9 9900K
R5 3600X @ 4.2
I actually think it will clearly perform equally or better, and there are games on PC that already show that potential.
With optimizations, the difference of about 10 fps on most of the games shown will be closed; that's a given.
 
Because you don't want them to?
Look at Battlefield and Forza Horizon 4...
i9 9900K
R5 3600X @ 4.2
I actually think it will clearly perform equally or better, and there are games on PC that already show that potential.
With optimizations, the difference of about 10 fps on most of the games shown will be closed; that's a given.

I think you don't understand what people mean by "RAW" performance.
 
I think you don't understand what people mean by "RAW" performance.
Please explain, because this RAW performance seems to have quite a variable meaning lately.
The Radeon VII's raw performance should be far higher than the 2080's in teraflop terms (13.44 TF vs ~10 TF), yet it performs about the same.
How do you calculate RAW performance? Please give me a rational explanation.
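For what it's worth, the teraflop figures being traded here come from a simple formula: shader count × 2 FLOPs per clock (fused multiply-add) × clock speed. A quick sketch using the public boost-clock specs of both cards (3840 shaders at ~1.75 GHz for the Radeon VII, 2944 CUDA cores at ~1.71 GHz for the RTX 2080) shows why the VII's "raw" number is higher even though real-world performance is similar:

```python
# Theoretical peak FP32 throughput: shaders * 2 FLOPs/clock (FMA) * clock.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000

radeon_vii = fp32_tflops(3840, 1.75)  # ~13.44 TF
rtx_2080   = fp32_tflops(2944, 1.71)  # ~10.07 TF

print(f"Radeon VII: {radeon_vii:.2f} TF")
print(f"RTX 2080:   {rtx_2080:.2f} TF")
# The ~33% "raw" gap doesn't show up in games: TFLOPS is a peak-ALU
# figure that ignores architecture, memory bandwidth, and utilization.
```

Which is exactly why "raw performance" comparisons across architectures are slippery.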
 
There isn’t some magical threshold.

Never stated such. And honestly, I didn't think I had to explain why these pipe dreams of 11-13 TF GPUs aren't happening within the console space anytime soon, let alone the ridiculous CPU clocks being reported. But go ahead, have your dreams; it's the same old nonsense every new generation.

At best, ~10.5TF GPUs and 2.8GHz CPUs
 
Changing arguments... Great.

Not really. If you consider that AMD's current highest-end dGPU is the 5700 XT, you can't expect that level of performance in a console; the PS4 didn't get AMD's highest-end GPU at launch either, it got a year-old mid-range part.
It's the 3TF all over again.

But go ahead, have your dreams, it’s the same old nonsense every new generation.

At best, ~10.5TF GPUs and 2.8GHz CPUs

10.5 TF if we're lucky; the 5700 XT is how many TF?

Also, what about that 'trusted journalist' that called for 'unimpressive specs'? are we only going to see the 14TF leaks? :p
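To put a number on that question: AMD's published specs for the RX 5700 XT are 2560 stream processors at a 1905 MHz boost clock, which is where its quoted 9.75 TF peak figure comes from. Plugging those specs into the same shaders × 2 × clock formula:

```python
# Peak FP32 for the RX 5700 XT from its public specs.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

xt_5700 = fp32_tflops(2560, 1.905)  # ~9.75 TF, matching AMD's quoted peak

print(f"RX 5700 XT: {xt_5700:.2f} TF peak")
# Against the RTX 2080's ~10 TF peak the raw-TFLOPS gap is only a few
# percent -- though peak figures across vendors aren't directly comparable.
```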
 