Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

I still can't see how they'll hit acceptable TDP targets. These RDNA2 CUs must be even bigger than Navi's with RT included, and we know an RX 5700 XT pulls ~250W at 2 GHz. They must either be doing something miraculous with RDNA2 (extremely low core voltage) or going with a large die (expensive).
4 Huge variables:

1. Process:
7nm or 7nm+: roughly 15% better power efficiency from the node alone.

2. Frequency:
We know the 7nm 5700 XT's power draw rises by ~40W just going from 1.8 GHz to 1.9 GHz (from Era's graph).
Picking a sensible frequency would reduce power significantly.

3. Architecture:
We know Nvidia has been more power-efficient than AMD for several years now. The RTX 2070's GPU only uses about 120W (175W TBP), which suggests Nvidia could get under 80W on 7nm.

I expect a ~20% power reduction from RDNA2 over RDNA1 (but still behind Nvidia).

4. Cooling:
I expect the next-gen consoles to have some fairly clever cooling systems, so they can target a 150-160W GPU and a ~250W console (rough arithmetic sketch below).
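
Putting rough numbers on 1-4, a minimal sketch: the cubic clock/voltage scaling, the 15% node gain and the 20% architecture gain are all assumptions taken from the guesses above, and the baseline is the ~250W @ 2 GHz 5700 XT figure quoted earlier in the thread, not a measurement.

```python
# Back-of-the-envelope GPU power estimate for a next-gen console GPU.
# Every number here is an assumption pulled from the post above.

BASELINE_POWER_W = 250.0   # RX 5700 XT figure quoted above: ~250W at ~2.0 GHz
BASELINE_CLOCK_GHZ = 2.0

NODE_GAIN = 0.15           # assumed 7nm -> 7nm+ power-efficiency gain (~15%)
ARCH_GAIN = 0.20           # assumed RDNA1 -> RDNA2 power reduction (~20%)

def estimated_gpu_power(target_clock_ghz, cu_scale=1.0):
    """Scale power roughly with clock^3 (voltage tends to rise with frequency),
    then apply the assumed node and architecture gains and a CU-count scale."""
    freq_factor = (target_clock_ghz / BASELINE_CLOCK_GHZ) ** 3
    power = BASELINE_POWER_W * freq_factor * cu_scale
    return power * (1.0 - NODE_GAIN) * (1.0 - ARCH_GAIN)

for clock in (1.7, 1.8, 1.9, 2.0):
    print(f"{clock:.1f} GHz, 40 CU-class part: ~{estimated_gpu_power(clock):.0f} W")
```

Under those assumptions a 40 CU part at 1.9-2.0 GHz lands somewhere around 145-170W, which is at least in the neighbourhood of the 150-160W GPU budget above.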
 
Finally! A console generation with top-tier hardware, and devs that will not have to reinvent their toolchains to take advantage of new architectures.

The start of next gen will be full of great games, I can feel it.
 
Just as long as Nintendo doesn't pull a "Hold my beer" when it's their turn, things should be good.
 
Personally I know it to be a fact. Either way it's about video games so it's not very important. I would take any kind of wager though if people want to!
 
Kidding aside, your terms are, if I understand you correctly, that you are willing to bet that the final PS5 will have more than 36 CUs and no more than 40, and will be clocked near 2.0 GHz.

OK, I can bet on that, just for the shits and gigs. I've been meaning to donate to B3D next year anyway. We just gotta set a definite range for that clock frequency. Does 1.75 GHz to 2.25 GHz seem fair to you?
 
Retail PS5: 36 or 40 CUs. Range: 1.8 to 2.2 GHz.
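
For reference, here's what that spread works out to in FLOPS terms (standard CUs × 64 ALUs × 2 ops/clock arithmetic; the CU counts and clocks are just the bet terms above):

```python
# TFLOPS spread implied by the bet: 36 or 40 CUs at 1.8-2.2 GHz.
# Formula: CUs * 64 shader ALUs * 2 ops/clock (FMA) * clock in GHz / 1000 = TF

def teraflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

for cus in (36, 40):
    for clock in (1.8, 2.0, 2.2):
        print(f"{cus} CUs @ {clock:.1f} GHz -> {teraflops(cus, clock):.2f} TF")
```

That covers everything from ~8.3 TF (36 CUs @ 1.8 GHz) up to ~11.3 TF (40 CUs @ 2.2 GHz).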
 
I wonder if the PS5 will use its own version of the Hovis Method* to reduce binning losses. If not, it's one area where Scarlett has a potential cost advantage. If they bin off chips to Lockhart, it gives MS further ways to reduce chip cost across the whole family. That might be how they can afford more CUs than the PS5, if that turns out to be true.

* As a Brit, this can't help but sound like an ineffective form of contraception involving a popular sliced bread brand.
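
On the serious half of that: as I understand it, the Hovis method is per-console voltage calibration rather than one worst-case voltage for every unit. A toy sketch of why that saves power on average; the leakage spread and voltage numbers below are entirely made up for illustration and have nothing to do with how MS actually characterises chips:

```python
# Toy illustration: per-console voltage tuning vs. one worst-case voltage.
# All numbers are hypothetical and only illustrate the averaging effect.
import random

random.seed(0)
NOMINAL_V = 1.00  # hypothetical nominal voltage
# Per-chip offset of the voltage each die actually needs at the target clock.
chips = [random.gauss(0.0, 0.02) for _ in range(10_000)]

# Fixed setting: every console runs at the voltage the worst acceptable chip needs.
worst_case_v = NOMINAL_V + max(chips)
# Hovis-style: each console runs at its own chip's required voltage.
tuned_v = [NOMINAL_V + c for c in chips]

def rel_power(v):
    # Dynamic power scales roughly with V^2 at a fixed clock.
    return (v / NOMINAL_V) ** 2

fixed_avg = rel_power(worst_case_v)
tuned_avg = sum(rel_power(v) for v in tuned_v) / len(tuned_v)
print(f"Average power, fixed worst-case voltage: {fixed_avg:.3f}x nominal")
print(f"Average power, per-console tuned voltage: {tuned_avg:.3f}x nominal")
```

The point being that with a single worst-case setting every console pays for the leakiest acceptable die, while per-unit tuning only pays for the silicon actually in the box.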
 
If it ain't a 2080 beater I will be disappointed.
20 TF at least.
AMD's current top part, the 5700 XT, will be nothing compared to the PS5 GPU, which will beat a 2080 easily. At least 15 TF of GCN-comparable power.
Expect a 3080 or much higher. Sony is going all in this time.
Yeah, no one saw that coming. Ultra-high-end parts in a small box, at a max $499 price. Going to get two.
[Image: Star-Trek-Picard.jpg]


MS is targeting 12 TF. Who knows what Sony is targeting. They have 40 CUs in their devkit at ~2.0 GHz.

Clearly they are targeting around 10.24 TF.
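(For the arithmetic: 40 CUs × 64 ALUs × 2 ops/clock × 2.0 GHz = 10.24 TF, which is where that figure comes from; 36 CUs at the same clock would be about 9.2 TF.)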
 
Is the rumor of a 12.9 TF PS5 from early in the year confirmed to be horse shit? Imagine if that turns out to be real; the kind of hype it would generate could light up a whole continent.
 
But do we have any proof that PS5 will not be literally based on an overclocked RTX 2080?
Maybe the RTX 2085.

It's a possibility, with the Xbox then going with a poor Navi variant. It really makes sense, though, if the dev kits are based on Nvidia hardware whilst the final specs will be AMD.
 