Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

Another thing: the PS4 BOM leaks estimated about the same figure as the PS5 BOM leaks.
As far as I'm aware there are no leaks. We have estimates, and in this case a website reporting a BOM estimate they heard. Leaks would be official documents.

I'm also struggling to see why Bloomberg etc. is being talked about in this thread. Is there any particular reason it's not in the Baseless Rumours thread? I don't want to have to spend time cleaning the thread up - I'm looking to get my first ever console game released!
 
The github leak only proves what the target specs are; the final chip will be what ends up in the retail units. It is a testbed for a reason. The leak does imply 9.2 TF, but they could alter the design somewhat, though not from 36 CUs to 52, or from 2 GHz to 1.5. I meant slight adjustments.
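For reference, the 9.2 TF figure follows directly from the leaked numbers: each GCN/RDNA CU has 64 FP32 lanes doing 2 FLOPs per clock via FMA. A quick sanity check in Python (the 36 CU and 2 GHz inputs are the leaked test values, not confirmed retail specs):

```python
# FP32 throughput: CUs * 64 lanes * 2 FLOPs/clock (FMA) * clock (GHz) = GFLOPS
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(tflops(36, 2.0))  # 9.216 TF -- the ~9.2 TF implied by the github leak
print(tflops(36, 1.8))  # 8.294 TF -- the same 36 CU chip at a slightly lower clock
```

A "slight adjustment" to clocks only moves the number by a few hundred GFLOPS either way; it would take a change in CU count to move it dramatically.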



Rather the NAND/DRAM prices that went up are to blame, if those articles are to be believed.

I meant that designs that appear overoptimistic in nature have a higher chance of eventually going over budget, regardless of what ultimately ends up being the straw that breaks the camel's back.
 
Sweclockers did a short re-test of the old CPUs last year.
https://www.sweclockers.com/test/28...re-i5-750-tar-nehalem-till-massorna/5#content

Many other sites always test at Ultra; Sweclockers ran most of their tests at low settings. The i5 750 and i7 920 should definitely still provide a better experience than the consoles.
It's hard to find good tests for Battlefield though, the tests in single player are very misleading for CPU performance.
I hate when CPU game tests are run at low settings, because many graphical options have an impact on CPU performance. If they want to remove the GPU from the equation, I think they should use the highest preset but a super low resolution, like 540p or something. Draw distance and foliage density are examples of settings that can stress CPUs, especially those with lower IPC or lower core counts, depending on how those features are implemented by the game.
 
There are other analysts, like ZHugeEx, who have indicated that the PS5 BOM estimate is close to what they expect as well. So it's not like that article is entirely baseless.
 
I hate when CPU game tests are run at low settings, because many graphical options have an impact on CPU performance. If they want to remove the GPU from the equation, I think they should use the highest preset but a super low resolution, like 540p or something. Draw distance and foliage density are examples of settings that can stress CPUs, especially those with lower IPC or lower core counts, depending on how those features are implemented by the game.

But in the context of the discussion earlier in this thread about how Jaguar compares to Core 2 Quad and the like, I do think low or medium corresponds more to the base console versions. I'm damn sure that Ultra in modern games isn't close to a fair comparison.
Sweclockers' settings differed depending on the game, but this was supposed to just be a quick re-visit and I guess they simply didn't find testing at higher settings worth the time.

When testing new CPUs though, most sites do test at lower resolutions for exactly the reasons you mentioned.
 
But in the context of the discussion earlier in this thread about how Jaguar compares to Core 2 Quad and the like, I do think low or medium corresponds more to the base console versions. I'm damn sure that Ultra in modern games isn't close to a fair comparison.
Sweclockers' settings differed depending on the game, but this was supposed to just be a quick re-visit and I guess they simply didn't find testing at higher settings worth the time.

When testing new CPUs though, most sites do test at lower resolutions for exactly the reasons you mentioned.
Most sites I've seen test at 720p with lower settings. I'm fine with them turning off vendor-specific stuff like HairWorks, but I really think CPU testing should be performed with higher settings, because lower settings are optimized to remove bottlenecks, and it would be more useful to find out where those bottlenecks are.

It's really hard to have any concrete comparison between the console CPUs and a Core 2 Quad, because AMD never released a standalone 8-core Jaguar. I guess you could compare a Core 2 Duo to a quad-core Jaguar as a reference, since both would have exactly half the core count, but when it comes down to it, the fact that those 8-core Jaguars are in a closed box allows developers to optimize for that thread count and IPC more effectively. I'm also not as down on those CPUs as many people here are. The fact that many games employ a resolution scaler, or run sub-1080p more often than not, means the systems are GPU-bound most of the time anyway.
 
Yes. One could as readily argue that Sony adding more CUs generates more heat. I think the more accepted reasoning, though, is that clocking faster generates more heat than going wider, so for a given number of TFs, the GPU producing those TFs with fewer CUs will run hotter.

It'd be something of an oversight, though, to design a system where the cooling adds so much to the cost of the box that you'd be better off going wider, but then not actually go wider. If our analysis here of the heat curves for RDNA GPUs shows that anything above 1.5 GHz (or whatever the sweet spot is) gets very hot, Sony would have seen the same. That leaves the odd choice of narrow and high clocks to explain, and one wonders whether it's to do with BC or not. Or is the thing not even clocked at 2 GHz anyway? Without that confirmed, it's all speculation upon speculation.

I guess another possibility is the final RDNA 1.5/2 design is running hotter than expected, and the increase in cooling costs, though not as significant as implied, is still a loss leader Sony doesn't want to take on.

Can someone recap the predicted heat energy at certain clock speeds? My internet search hath failed.
 
Once again, over and over, some of you guys find a new baseless source and act as if it's really real. You should ask yourselves why Richard Leadbetter didn't report on the Bloomberg "revelation" at all, despite the internet going crazy with the echo chamber. We've been through this a million times. He will not repeat dramatic clickbait articles unless he gets his own corroboration from his own sources, and he has the decency of carefully labelling speculation, official facts, and rumors as they respectively deserve to be labelled. This remains a baseless rumor until we have corroborated evidence from a reputable publication.

If Richard talks about it, we will definitely have something to work with, but so far not a single journalist anywhere has a corroborating source; they simply repeat the Bloomberg speculation without any sort of editorial oversight. This is a red flag that warrants skepticism, as it always does.
 
Sony apparently having to invest more in the cooling system to prevent overheating suggests higher clocks.

See, this is your problem. You said this report "confirmed" high clocks, but that's not the same as "suggests". It's just as likely they are investing more in cooling to address the common complaints about noise this gen. If an alternative theory is just as, if not more, likely, you can't pretend the report "confirms" your pet theory.
 
See, this is your problem. You said this report "confirmed" high clocks, but that's not the same as "suggests". It's just as likely they are investing more in cooling to address the common complaints about noise this gen. If an alternative theory is just as, if not more, likely, you can't pretend the report "confirms" your pet theory.

Oh, I think it all depends on how you read it. As if noise complaints weren't there for all the other PS generations, btw?
 
Yes. One could as readily argue that Sony adding more CUs generates more heat. I think the more accepted reasoning, though, is that clocking faster generates more heat than going wider, so for a given number of TFs, the GPU producing those TFs with fewer CUs will run hotter.

It'd be something of an oversight, though, to design a system where the cooling adds so much to the cost of the box that you'd be better off going wider, but then not actually go wider. If our analysis here of the heat curves for RDNA GPUs shows that anything above 1.5 GHz (or whatever the sweet spot is) gets very hot, Sony would have seen the same. That leaves the odd choice of narrow and high clocks to explain, and one wonders whether it's to do with BC or not. Or is the thing not even clocked at 2 GHz anyway? Without that confirmed, it's all speculation upon speculation.

I guess another possibility is the final RDNA 1.5/2 design is running hotter than expected, and the increase in cooling costs, though not as significant as implied, is still a loss leader Sony doesn't want to take on.

Can someone recap the predicted heat energy at certain clock speeds? My internet search hath failed.
Not just that, but thermal density goes up. Power grows according to P = f*C*V^2. You're linearly increasing f, and voltage has to grow to enable those clocks. So you're growing power super-linearly in a fixed area, versus growing power linearly in an expanding area.
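To put rough numbers on that: assume, purely for illustration, that voltage has to scale linearly with frequency in this range, so P = C·V²·f grows roughly with the cube of clock speed per unit area. Then compare a narrow/fast part against a wide/slow one at the same ~9.2 TF; the 48 CU configuration here is hypothetical, chosen only because 48 CUs at 1.5 GHz gives the same throughput as 36 CUs at 2.0 GHz:

```python
# Dynamic power: P = C * V^2 * f, with the illustrative assumption that
# voltage scales linearly with frequency around the efficiency sweet spot.
def relative_power(cus: int, clock_ghz: float, base_clock: float = 1.5) -> float:
    v = clock_ghz / base_clock      # assumed voltage scaling (illustrative)
    return cus * v * v * clock_ghz  # arbitrary units, whole-chip total

narrow = relative_power(36, 2.0)    # 36 CUs @ 2.0 GHz -> 9.2 TF
wide = relative_power(48, 1.5)      # 48 CUs @ 1.5 GHz -> 9.2 TF
print(narrow, wide)                 # 128.0 vs 72.0: ~78% more total power
print(narrow / 36, wide / 48)       # ~3.56 vs 1.5: ~2.4x the power per CU
```

Under those admittedly crude assumptions, the narrow chip draws far more power for the same TFs, and it dissipates it from a smaller die area, which is the thermal density problem in a nutshell.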
 
There are other analysts, like ZHugeEx, who have indicated that the PS5 BOM estimate is close to what they expect as well. So it's not like that article is entirely baseless.
Takashi Mochizuki at Bloomberg is more credible than the likes of ZHugeEx, as Mochizuki, who was at the WSJ before, is most likely talking firsthand with Sony employees for that article, unlike those analysts doing armchair speculation or talking with secondary sources. In his Japanese tweet, Mochizuki says the sentiment shared by all the "people with knowledge of the matter" in the article was: "We would like to watch how Microsoft acts. We don't want to stimulate Microsoft." Who other than Sony employees wouldn't want to stimulate MS?
One interesting nugget he put in the following tweet is that the Sony IR Day in May, which didn't exist in the PS4 launch year, is currently making people at Sony nervous, as analysts may be able to guess the price of the PS5 from how much info Sony discloses at that event.
The Sony earnings forecast at the end of April last year showed no increase in marketing costs for the term ending at the end of this March, so some analysts predicted no PS5 marketing launch would happen before April 2020.

At lower clocks probably not.
Are you sure? You might compare the heat dissipated by the RTX 2080 and the RTX 2070 Super.
 
The Sony earnings forecast at the end of April last year showed no increase in marketing costs for the term ending at the end of this March, so some analysts predicted no PS5 marketing launch would happen before April 2020.

They don't really need to do any PS5 reveal before then either; I think only people on gaming forums badly want it to happen now :)

Are you sure? You might compare the heat dissipated by the RTX 2080 and the RTX 2070 Super.

No, I'm not sure, that's why I said "probably". I was thinking along the lines of Sony adding some CUs and lowering that 2,000 MHz clock while targeting the same performance numbers. It just seemed logical to me that going with a lower clock but an increase in CU count would offer the same performance with perhaps lower heat output.
 