Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

It's not the same thing though. You are not pushing a chip that much just for BC...but pushing a 36CU chip for the new generation is at least somewhat believable. I'm not even sure that would work for BC given the instability it would introduce since it's over double the frequency of what the games originally ran on. Cerny even said tinkering with different frequencies introduced problems for BC on Pro.
I wonder what kind of problems he ran into, and why there would be such problems. With the exact same CU setup, a faster clock should literally just run things faster, shouldn't it?
 
I wonder what kind of problems he ran into, and why there would be such problems. With the exact same CU setup, a faster clock should literally just run things faster, shouldn't it?
Some rare race conditions (if the code is somehow, coincidentally or not, tied to the 1.6GHz frequency). But the rare games released before the Pro that had problems have since been patched by the devs. I think I only heard about one case, and it was eventually patched.

But normally developers don't tie their code logic to the CPU frequency anymore; it's really bad design. It can still happen by accident, though.
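To make the "tied to the 1.6GHz frequency" failure mode concrete, here's a minimal sketch in C (the constant and the function names are purely illustrative, not from any real console SDK):

#include <stdint.h>
#include <time.h>

#define ASSUMED_CPU_HZ 1600000000ull  /* silently assumes a 1.6GHz CPU */

/* Fragile: converts a raw cycle delta into seconds using a hard-coded
 * frequency. Run the same code on a CPU clocked ~2x higher and every
 * "second" computed here elapses in half the real time, so anything
 * paced off it (physics steps, timeouts, spin-waits) runs fast, and
 * latent race conditions that never fired at 1.6GHz can start firing. */
static double seconds_from_cycles_bad(uint64_t cycle_delta)
{
    return (double)cycle_delta / (double)ASSUMED_CPU_HZ;
}

/* Robust: measure elapsed wall-clock time instead of assuming a rate
 * (timestamps taken with e.g. clock_gettime(CLOCK_MONOTONIC, ...)). */
static double seconds_elapsed_good(const struct timespec *start,
                                   const struct timespec *end)
{
    return (double)(end->tv_sec - start->tv_sec)
         + (double)(end->tv_nsec - start->tv_nsec) / 1e9;
}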
 
Out of curiosity, when the hardware rumors about the PS4 were supposedly leaked, apart from the memory, how close were they? How much was the discrepancy?
I think there was a small discrepancy from what we got then, so I suppose there will be one now as well.
Expect your 9.2TF console
 
Right, that's reserved for the adjective "Sony's".
Despite the needless jab, the latest example in consoles actually happened the other way around.

In the PS4 Pro, Sony used a more advanced version of GCN thanks to the presence of RPM. This made the PS4 Pro's CUs around 11% larger than the ones found in the XBoneX (despite the latter being clocked ~29% higher, so it might have used more transistors optimized for performance instead of density).
Microsoft's console OTOH used a larger number of higher-clocked CUs, coupled with a higher-clocked CPU and more memory channels, ultimately getting higher performance and being able to display better quality graphics and/or higher framerates.
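For reference, the raw shader throughput works out as CUs × 64 ALUs × 2 ops/clock × clock: the PS4 Pro is 36 × 64 × 2 × 0.911GHz ≈ 4.2 TF FP32 (and double that in FP16 thanks to RPM), while the XBoneX is 40 × 64 × 2 × 1.172GHz ≈ 6.0 TF FP32; the 1.172GHz vs 0.911GHz clocks are where the ~29% figure above comes from.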

So yes, Microsoft's RT approach may be more advanced (i.e. more flexible and/or have better performance-per-watt and/or better performance-per-area), but in the end it could provide identical results.


They can't do this because of how the CUs are now grouped in pairs into one WGP. They need to deactivate the whole WGP, one on each side of the butterfly design, so 4 CUs in total.
Is there any known reason why you couldn't disable half of a WGP?
Or why you couldn't disable just one WGP despite the butterfly design?
Do we even know if the PS5 is using a butterfly design?
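For reference on the arithmetic being debated here: assuming a Navi 10-like layout (40 CUs grouped as 20 WGPs across two shader engines, which is itself unconfirmed for the PS5), disabling one WGP per half removes 2 × 2 = 4 CUs and leaves 36 active, whereas disabling a single WGP would leave 38 and half a WGP would leave 39, which is what the questions above are getting at.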


Out of curiosity, when the hardware rumors about the PS4 were supposedly leaked, apart from the memory, how close were they? How much was the discrepancy?
I think there was a small discrepancy from what we got then, so I suppose there will be one now as well.
There were no discrepancies because what we saw were official presentation slides dated some 5 months before release.

You could compare to the X360 leaks that appeared 1 year before IIRC, and those showed 3.5GHz CPU cores (that ended up at 3.2GHz) and 500MHz GPU (that ended up 550MHz).

Regardless, the github leak referred to results taken around a year and a half before release, meaning they were testing chips that were fabbed probably closer to 2 years before release.
The timings are completely different.

Expect your 9.2TF console
Nah.
 
Out of curiosity, when the hardware rumors about the PS4 were supposedly leaked, apart from the memory, how close were they? How much was the discrepancy?
I think there was a small discrepancy from what we got then, so I suppose there will be one now as well.
Expect your 9.2TF console
Well, our own Proelite posted relatively correct specs back in June 2012.

https://www.neogaf.com/threads/latest-reliable-orbis-and-durango-specs.478941/#post-38997667

Disclaimer: Everything isn't final, but the general range of these systems is more or less set in stone.

Durango:
AMD CPU, 6-8 cores
Ram 3-4 GB
AMD GPU
Total processing power 1-1.2 teraflops

Note: I've heard of 8GB ram from reliable people on this forum, but a certain, well respected, insider on B3D is certain that 3-4 and 1 teraflops is near final. His information is the most up to date that I've got.

Unreal 4 needing 1 TF is pretty indicative.

It seems that IGN's "6x more than 360" article isn't that far off, but it's more like 4x 360.

Orbis:
AMD CPU 4 cores x-86
AMD GPU 1150 SPU, 1.8 teraflops, 800 mhz
2GB GDDR5 (unlikely this will be bumped to 4GB)
 
Well, our own Proelite posted relatively correct specs back in June 2012.
Half of the CPU cores for the PS4, 1/4th of the memory as "set in stone" in June 2012?

That info is about as old, relative to the PS4's release, as the github leak is relative to gen9's release, yet it seems it all changed just a little bit from the supposedly "set in stone" specs.
Oh, that or the information was just plain wrong.
 
Well, our own Proelite posted relatively correct specs back in June 2012.
Pretty broad definition for 'relatively correct' there. ;)

In terms of GPU TFs, he was right (well, potentially some 30% low-balled on Durango with the bottom 1TF versus release 1.3 TF, but we can give them 1.2 TF and the last-minute upclock). The rest is wide of the mark.
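(For the numbers: the released Xbox One works out to 12 CUs × 64 × 2 × 0.853GHz ≈ 1.31 TF, and at the originally planned 800MHz that's ≈ 1.23 TF, hence the last-minute upclock caveat.)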
 
Pretty broad definition for 'relatively correct' there. ;)

In terms of GPU TFs, he was right. The rest is wide of the mark.

Half of the CPU cores for the PS4, 1/4th of the memory as "set in stone" in June 2012?

That info is about as old, relative to the PS4's release, as the github leak is relative to gen9's release, yet it seems it all changed just a little bit from the supposedly "set in stone" specs.
Oh, that or the information was just plain wrong.

I mean, wide of the mark, perhaps, but all we care about is what he was right about - TF :)

Nobody here is saying system BW is locked, or that the RAM amount is (well, XSX can go up to 20GB anyway). What has pretty much the entire time been the point of discussion is GPU TFs. Mind you, from Proelite we didn't get any leak with testing data; what we got was an educated guess, so not entirely the same as the Github one.
 
Well, our own Proelite posted relatively correct specs back in June 2012.

https://www.neogaf.com/threads/latest-reliable-orbis-and-durango-specs.478941/#post-38997667
VGleaks beat him by 10 days with the same 1.84 TF figure. So yeah, pretty much everybody knew the specs at that point (June 8th 2012).
http://webcache.googleusercontent.c...n-deep-first-specs/+&cd=1&hl=en&ct=clnk&gl=au
Not saying he's wrong this time, but I wouldn't draw conclusions based on all that.
 
Last gen prediction thread...

Just been dipping into this thread. Loads of parallels! Different cast, same performance. But it's interesting to see claims like "Charlie says the die is massive, like 500 mm²!" and comparing that to what happened. I bet a proper analysis of that 1000-post thread would find the B3D consensus and dismissal of rumours was more right than wrong.
 
VGleaks beat him by 10 days with the same 1.84 TF figure. So yeah, pretty much everybody knew the specs at that point (June 8th 2012).
http://webcache.googleusercontent.c...n-deep-first-specs/+&cd=1&hl=en&ct=clnk&gl=au
Not saying he's wrong this time, but I wouldn't draw conclusions based on all that.
Yea, couldn't find VgLeaks but was sure some other info existed before his post.

One interesting bit...

Our source claims that the final specs should be 10x more powerful than the PS3; note that this info is 10 months old.

CPU

  • 4 core (2 core pairs) 3.2 GHz AMD x86 (Steamroller)
  • aggregate, 10x PS3 PPU performance
  • 512 KByte L2 per core pair
  • 64 bit pointers
GPU

  • AMD R10x series GPU @ 800 MHz (Tahiti)
  • aggregate 10x RSX performance, 1.84 TFlops
  • DX "11.5"
  • cheap branching
  • 18 compute units, focus on fine grained compute & tessellation
MEMORY:

  • 2 GByte UMA, pushing for 4 GByte
  • 192 GByte/ sec non-coherent access for e.g. GPU
  • 12 GByte/ sec coherent access for CPU
  • >50% longer memory latency for CPU/ GPU compared to PC!!!
  • DXT is lowest bpp texture compression format
MEDIA:

  • 50 GByte BD drive
  • PCAV, between 3.3 and 8x BD, most likely clamped to 6x
  • automatic background caching to HDD
  • HDD SKU with at least 380 GByte
  • downloadable games
EXTRA HARDWARE:

  • video encoder/ decoder
  • audio processing unit, ~200 concurrent MP3 streams
  • HW zlib decompressor

So this info was actually 10 months old in June 2012, meaning the exact GPU specs were known well before the middle of 2012, and while the 4-core Steamroller didn't make it into the PS4, the switch to Jaguar was arguably an actual downgrade.
Not saying changes are impossible, but in the semiconductor industry, info of that age cannot be dismissed as outdated.
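One sanity check on that GPU entry: 18 CUs × 64 ALUs = 1152 stream processors (the "1150 SPU" figure, give or take rounding), and 1152 × 2 ops × 0.8GHz ≈ 1.84 TFLOPS, so the SPU count, clock and TF number in the leak are internally consistent rather than three independent guesses.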

Last gen prediction thread...

Just been dipping into this thread. Loads of parallels! Different cast, same performance. But it's interesting to see claims like "Charlie says the die is massive, like 500 mm²!" and comparing that to what happened. I bet a proper analysis of that 1000-post thread would find the B3D consensus and dismissal of rumours was more right than wrong.

Seems like quite a few people were very optimistic back then as well :)
 
Last gen prediction thread...

Just been dipping into this thread. Loads of parallels! Different cast, same performance. But it's interesting to see claims like "Charlie says the die is massive, like 500 mm²!" and comparing that to what happened. I bet a proper analysis of that 1000-post thread would find the B3D consensus and dismissal of rumours was more right than wrong.
Is this a hypothetical, or did someone actually claim that, and if so, was it that Charlie? The one that said we’re getting PS5 in 2018?
 
Pretty broad definition for 'relatively correct' there. ;)

In terms of GPU TFs, he was right (well, potentially some 30% low-balled on Durango with the bottom 1TF versus release 1.3 TF, but we can give them 1.2 TF and the last-minute upclock). The rest is wide of the mark.

It sounds like he loosely described the PC devkit specs of that time.
Back in June 2012 the PS4 devkit was probably using a HD7870 with custom firmware for disabling 2 CUs and using 800MHz clocks (hence the 2GB GDDR5 "set in stone"), and Durango was perhaps using a 1000MHz HD7770.
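(Purely as an arithmetic check on that guess: an HD7870 has 20 CUs, so disabling 2 leaves 18 × 64 × 2 × 0.8GHz ≈ 1.84 TF, exactly the Orbis target, while a stock 1000MHz HD7770 is 10 × 64 × 2 × 1.0GHz = 1.28 TF, in the same ballpark as the early Durango figures.)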
For the CPU, I'm guessing Microsoft and Sony used different approaches to emulate the 2*Jaguar modules: Sony perhaps used a 3.2GHz 2-module/4-core Bulldozer, while Microsoft used a 3- or 4-module part at lower clocks to get the devs to use more parallel code from an earlier stage.




I mean, wide of the mark, perhaps, but all we care about is what he was right about - TF :)
It is?

Nobody here is saying system BW is locked, or that the RAM amount is (well, XSX can go up to 20GB anyway). What has pretty much the entire time been the point of discussion is GPU TFs.
Or it could be that because in 2012 we already had Southern Islands covering the full range of performance and price targets, the dev kits of that time could use graphics cards on the GCN architecture that matched the consoles' performance estimates. In 2019 the only RDNA GPU they have is Navi 10, and it has no RT at all.
It's possible that some PC devkits from 2018-2019 used Geforce RTX cards, and the expected performance estimates could be coming from entirely different GPUs.
 
It's reported in June 2012, but the info is 10 months old as per their source.

So the 4-core 3.2GHz Steamroller was replaced by an 8-core 1.6GHz Jaguar, and the 4GB the devs were pushing for ended up as 8GB.

The GPU was a 1:1 match with info from possibly August 2011!
 
It's reported in June 2012, but the info is 10 months old as per their source.

So the 4-core 3.2GHz Steamroller was replaced by an 8-core 1.6GHz Jaguar, and the 4GB the devs were pushing for ended up as 8GB.

The GPU was a 1:1 match with info from possibly August 2011!
Possibly.
 