General Next Generation Rumors and Discussions [Post GDC 2020]

MS may have chosen to respin the chips to get better yields. They may have spent more money on fixing any issues that cropped up and hoped to recoup it with better yields and higher performance.

The variable route doesn't sound like an ideal situation to me. You can never have more than the max clock speed, but you can now have less than it. Just sounds like a recipe for an uneven experience for some users.

Except the point was to make all PS5s act the same, unless I’m mistaken?
 
What do you mean by 'respin'? There's a layout of transistors defined by the RDNA and Zen architectures. These can't really be arranged any differently, I don't think. Whatever MS has for its CUs and CCXs, Sony has too, same as on PC. The in-between and custom bits will be different, but those aren't likely the parts creating issues with power draw.

PS5 ensures exactly the opposite. Everyone will have the same clock speeds at the same point in the game. If PS5 varied clock speeds based on temperature, then yes, some PS5s would be faster than others. But Cerny explicitly mentioned that's not what they're doing. Explicitly! He used exactly the example of not wanting different performance in different environments.
Isn’t a respin also hella expensive? If so, that would imply the initial run was terrible, I would have thought.
 
Generally, yes, but it can save money if the respin results in a substantially better final chip. I've lost track of how many times since 2018 Sony has allegedly respun their PS5 SoC.
 
The problem I see with the yield thing, as already mentioned, is 1) why would Sony have lower yields than MS on the same architecture and fab? 2) If due to parametric yields, what can they do to improve those over time?

If the architecture cannot deal well with the power requirements, how can that be improved on the current node? To my ill-informed mind, you'd be stuck with that problem until the next node shrink. Although I guess GPU refreshes point to some changes allowing higher clocks? But until I see a good explanation of how the yields could have been poor but are now good, I find it difficult to believe there were ever significant yield problems. It was conceivable, based on this idea of Sony pushing the chip's limits, right up until the 10M-unit production ramp was announced; that's a very contradictory data point, and a solidly reliable one rather than a vague suggestion.

Design rules aren't static; they mature as the node matures. The design rules are used to ensure functionality and maximize yields. As more and more designs move to 7nm, the more knowledge TSMC gains. That growing knowledge is conveyed to chip designers through the design rules.
 
Considering a respin takes anywhere from 2 to 6 months, and Sony had devkits out early and started mass production on time, those respins would have already been scheduled as part of the design process. They have a lot of custom additions which were not part of RDNA or Zen.

From Cadence:
"Some companies plan a certain number of chip respins. They use shuttles to tape out the chip before they actually expect to ship a product. This enables them to do post-silicon verification, create development platforms for software bring-up, and to find out where they need to concentrate further efforts. The fact that these respins are planned means they are not treated as failures."
 
How do you have a variable clock speed and then have everything always the same across 100M devices?
The frequency is based not on power draw but on the activity level of the silicon. Since the code is going to run the same on every PS5, the activity level is the same, which means the frequency will be the same.

So as the activity level goes up and more parts of the GPU are used at once, the power draw will naturally increase. All systems will shift additional power over from the CPU as needed, and if the GPU can't draw any more power, then its frequency will throttle down.
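
To make that concrete, here's a toy model (my own sketch with made-up coefficients, not Sony's actual algorithm): clocks are a pure function of the workload's activity, capped at the published maximums, so every PS5 running the same code at the same point in a game lands on the same clocks. Temperature and chip-to-chip variation never enter the equation.

Code:
# Toy model of deterministic, activity-based clocking (hypothetical numbers).
GPU_MAX_GHZ = 2.23    # PS5 GPU cap (public figure)
CPU_MAX_GHZ = 3.5     # PS5 CPU cap (public figure)
POWER_BUDGET = 0.9    # normalized SoC power budget (made up for illustration)

def estimated_power(cpu_act, gpu_act, cpu_ghz, gpu_ghz):
    """Hypothetical power estimate: scales with activity and roughly with
    the cube of frequency (frequency times voltage squared)."""
    return 0.4 * cpu_act * (cpu_ghz / CPU_MAX_GHZ) ** 3 \
         + 0.6 * gpu_act * (gpu_ghz / GPU_MAX_GHZ) ** 3

def clocks_for(cpu_act, gpu_act):
    """Pick the highest clocks that fit the budget, given activity in 0.0-1.0.
    Deterministic: the same inputs give the same clocks on every console."""
    cpu_ghz, gpu_ghz = CPU_MAX_GHZ, GPU_MAX_GHZ
    while estimated_power(cpu_act, gpu_act, cpu_ghz, gpu_ghz) > POWER_BUDGET:
        cpu_ghz *= 0.98   # shave a couple of percent until it fits
        gpu_ghz *= 0.98
    return cpu_ghz, gpu_ghz

(The real thing can shift budget between CPU and GPU rather than scaling both together; this just illustrates the "workload in, clocks out" idea.)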
 
And yet they will never increase past their max clock speed, as far as I can find out. So in reality you're just robbing Peter to pay Paul.

I guess like with so much in life we just have to wait and see how it shakes out
 
In fact, PS5's variable frequency can use the power budget more efficiently.

Just think: if some XSX game doesn’t need a lot of CPU power, why not use the spare power to push the GPU to 13 TF?
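
For scale, a back-of-envelope check using the public XSX figures (52 CUs at a fixed 1.825 GHz); whether the silicon could actually sustain a higher clock is a separate question this ignores:

Code:
# FP32 TFLOPS = CUs * 64 lanes * 2 ops per clock * clock in GHz / 1000
cus, ghz = 52, 1.825
print(cus * 64 * 2 * ghz / 1000)   # ~12.15 TF at the stock XSX clock
print(13000 / (cus * 64 * 2))      # ~1.95 GHz needed for 13 TF on 52 CUs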
 
Except what happens when the CPU does need the power? I don't program games, but wouldn't requirements change from frame to frame, meaning a PS5 developer would never really know how much GPU power they will have? And wouldn't there be frames where both the CPU and GPU need their full power to render a scene? If they both do that and both clock lower, wouldn't frame times and FPS drop?
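
To put a number on that concern, here's what the toy model sketched earlier in the thread gives (again, made-up coefficients, not Sony's algorithm):

Code:
# A GPU-heavy frame with a lighter CPU load fits the budget at full clocks...
print(clocks_for(cpu_act=0.5, gpu_act=1.0))   # -> (3.5, 2.23)

# ...while a frame that fully lights up both has to shave a few percent off
# both clocks. Crucially, every PS5 shaves the same amount, because only the
# workload is an input, not temperature or silicon quality.
print(clocks_for(cpu_act=1.0, gpu_act=1.0))   # -> (~3.36, ~2.14)

So yes, a worst-case frame can cost a few percent of clock, but it's deterministic, and a developer sees exactly the same behaviour on every unit.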
 
It's surprising to see Sony pivot from April, when they were expecting to ship far fewer consoles than the PS4 did at launch, to now, where they're looking at around 25% more. Official pre-orders have not begun yet, so what is driving this confidence of high demand in the middle of the coronavirus-sponsored 2020? :???:


I keep bringing this up, but since Covid, consoles in particular, and pretty much everything electronic, have sold like hotcakes. I haven't seen an actual console on the shelves in months.

Remember, Uncle Sam is giving out massively juiced unemployment ($600 per week above normal) and stimulus too; lots of people with lots of funny money and nothing to do. All it means is a lot of shortages, especially of electronics for being housebound, like tablets, laptops, and consoles. Try to get a Switch in the USA in the last 5 months.

PS5 and XSX alike will sell everything they can make the first holiday. They would have done that Covid or not (so did the PS4 and the $499 underpowered Xbox One in 2013), but if anything, Covid may make the shortages they would have had anyway even worse.

I guess there's always a chance the economy will truly go south soon, but I doubt it, and if so, Sam will just juice our pockets even more.
 

Now I'm pretty sure this video has been discussed elsewhere, but a popular YouTuber tried to build a "console killer" PC for a similar price, as he has in the past. His conclusion? This time you can't come close. He built a PC with a 3700X (he admitted this is a little faster than the XSX CPU), 16GB of low-end DDR4, the cheapest B450M motherboard he could find, a 1TB NVMe SSD, and the best AMD GPU available, the 5700 XT, and it came to... $1,080. And that's with a significantly less powerful GPU. For Nvidia it would have to be a 2080 Ti, he concluded (I agree, although some here claim a 2080; either way it's insanely expensive).

He mentioned, for example, that PSU prices have nearly doubled (something I noticed being in the market), and for the 650-watt PSU for the build he had to pay $70. He blamed it on Covid's impact on manufacturing. I also think the stimulus plays a part.

The point is, it made me very much conclude we may be better off bracing for $599, where before I was sure of $499. MS and Sony won't be able to completely escape things like doubling PSU prices either. Could be wrong.
 
With DLSS 2.0 being successful as of right now, I’m fully invested in Ampere. RDNA 2 could be 15-20 TF and still not have the power to render the equivalent of a DLSS 2 upscale natively.

Yes, DLSS/tensor turned out to be the biggest game changer despite much criticism in the beginning. I'm passing my current system (2080 Ti/3900X) to my sister, and there's no reason to go Intel; Zen 2 is the older gen, so something fast on Zen 3 (8 cores perhaps), and spend more on an Optane drive together with either Ampere or RDNA 2, depending on what prices do.
Either way, you can't go wrong really; a 3700 Zen 2 CPU and anything 2070 Super or higher will outperform the lowest common denominator console (PS5), which most cross-platform games will target, I assume.

But going with RTX 2000 GPUs from 2018... or even Zen 2? I would get something from 2020 or later. No idea when DDR5 comes, but I don't think it's that far off. Anyone with an RTX 2060/2070, a decent 8-core CPU, and an NVMe drive, I wouldn't advise upgrading at all now. Maybe halfway into the next gen, if there's a need for more speed.
 