Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Anyhow, those boost clocks are a no-go for console gaming on so many levels...
Can we be certain of that? Consoles provide a space where, if possible, you could elect to use fewer cores at higher frequencies for your game. Is it impossible that we'd see 1.6 GHz for 8 cores / 16 threads, and 3.2 GHz for four cores?
 
Those guys went bankrupt.

I know ZhugeEX mentioned that in a tweet, but I couldn't find any other media source backing it up. From what was last reported back in May, the team at Subor working on the Z+ was dissolved; however, the gaming console may still be possible under a new team (direction).

https://www.eurogamer.net/articles/2019-05-15-subor-z-games-console-team-has-been-disbanded
Sources speaking to Chinese hardware news site IT Home revealed that the team working on the Z+ was dissolved on May 10th, with the entire Shanghai office dedicated to the project shutting down. The Z+ website has also been taken offline, suggesting that the console won't be coming to the market any time soon.

However, this may not be the last attempt that Zhongshan Subor makes to enter the Chinese games console market according to a statement by the company's CEO, Wu Song: "While the Shanghai office has been closed, the project is still ongoing and we will have a new announcement to make regarding its progress in the next few months."

Can we be certain of that? Consoles provide a space where, if possible, you could elect to use fewer cores at higher frequencies for your game. Is it impossible that we'd see 1.6 GHz for 8 cores / 16 threads, and 3.2 GHz for four cores?

If it's disabling cores for higher frequencies, then I could see it being a possibility.

If it's bouncing clocks (i.e., throttling during high-temps) across all cores during a particular game, then not so much. Console gaming is all about having a consistent framerate/performance (stability) and uniform experience within that space.
 
I know ZhugeEX mentioned that in a tweet, but I couldn't find any other media source backing it up. From what was last reported back in May, the team at Subor working on the Z+ was dissolved; however, the gaming console may still be possible under a new team (direction).

https://www.eurogamer.net/articles/2019-05-15-subor-z-games-console-team-has-been-disbanded
Hmm... curious. Thanks for the link.

Those guys went bankrupt.
I guess the weird part to me is it being able to run Windows 10 while paired with a 500GB HDD; or at least, I'm trying to imagine the boot-loading aspect in the context of MS/Sony, which I have no idea about.
 
Hmm, honestly this all sounds very familiar.


APU with 8-core Zen, 8MB of L3 (half that of Zen 2), Navi 10 Lite, and 16GB of GDDR6 RAM on a 256-bit bus.

Kinda hard to imagine this being anything other than a console APU, as that amount of GDDR6 would be an automatic no-no for a PC/laptop APU.

A 1.6 GHz base clock kind of screams PC laptop. That's a massive boost clock up to 3.2 GHz, which would trigger when gaming is detected.

Regards,
SB
 
A 1.6 GHz base clock kind of screams PC laptop. That's a massive boost clock up to 3.2 GHz, which would trigger when gaming is detected.

Regards,
SB

And then throttle down to 900 MHz after 20 seconds?

Intel allows manufacturers to do something like that via cTDP. Dunno how it is with AMD.
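
Purely as a toy illustration of that boost-then-throttle pattern (all the numbers here are made up for the example, not vendor specs), it would behave something like:

def clock_at(t_seconds):
    # Illustrative boost-then-throttle curve under a fixed power budget.
    BOOST_GHZ, SUSTAINED_GHZ, BOOST_BUDGET_S = 3.2, 0.9, 20.0
    return BOOST_GHZ if t_seconds < BOOST_BUDGET_S else SUSTAINED_GHZ

print(clock_at(5.0))   # 3.2 -> boost budget still available
print(clock_at(25.0))  # 0.9 -> budget exhausted, sustained clock only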
 
A 1.6 GHz base clock kind of screams PC laptop. That's a massive boost clock up to 3.2 GHz, which would trigger when gaming is detected.

Regards,
SB
Except if this is a case of Sony's well-known backwards-compatibility technique, where clocks match the prior system's clocks for easier emulation, like they are doing on the Pro. Then 1.6 GHz clocks would sound oddly familiar.

“To give an example, the GPU of the prior version of the system might run at a GPU clock of 500 MHz, and the current system might run at a GPU clock [156] of 750 MHz. The system would run with [156] set to 750 MHz when an application is loaded that is designed only for the current system. In this example, the cycle counter [CC] would correspond to the 750 MHz frequency (i.e., it is a true cycle counter). When a legacy application (i.e., an application designed for the prior version of the system) is loaded, the system [100] may run at a frequency slightly higher than the operating frequency of the prior system (e.g., with [156] set to 505 MHz). In this backward compatible mode, the GPU spoof clock [135] would be configured to run at 500 MHz, and the cycle counter CC would be derived from the spoof clock, thus providing the expected value to the legacy application.”
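
Purely as an illustrative sketch of the patent's spoof-clock idea (the frequencies are the patent's own example numbers; the code itself is hypothetical, not Sony's):

# Hypothetical sketch of the patent's spoof clock, not Sony's actual code.
# The silicon runs slightly faster than the legacy console, but the cycle
# counter exposed to a legacy app is derived from the "spoof" frequency.
REAL_HZ = 505_000_000    # actual clock in backward-compatible mode
SPOOF_HZ = 500_000_000   # clock the legacy application expects

def spoofed_cycle_counter(real_cycles):
    # Rescale elapsed real cycles into the legacy clock domain.
    return real_cycles * SPOOF_HZ // REAL_HZ

print(spoofed_cycle_counter(REAL_HZ))  # one real second reads back as 500,000,000 cycles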

Other than that, I don't know of a single laptop that features 16GB of GDDR6 memory (excluding Quadro workstations, which actually do have 16GB of GDDR6, but there it's not system memory and it's not an APU).
 
Heh...
  • 16 Samsung K4ZAF325BM-HC18 in clamshell configuration
Judging by the SC write stats, these would be downclocked 18 Gbps chips with 528 GB/s of bandwidth total.
Those are 2GB chips, which would be logical for a dev-kit.

Judging by the SC write stats, these would be downclocked 18 Gbps chips with 528 GB/s of bandwidth total.
Not necessarily. The single-chip test actually extrapolates to a total bandwidth of 529.6 GB/s, but that would be expected, as these are benchmarks and not theoretical figures. Using DDR4 as a reference in other tests, there is often an ~8% difference between their benchmarked bandwidth and the theoretical maximum. Here the gap is about 8.76%, from 529.6 up to the theoretical maximum of 576.

From memory (SDK docs), there was a 10% difference between theoretical and tested max bandwidth for the PS4's GDDR5 (160 vs. 176).
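
To check the math, a quick back-of-the-envelope sketch (assuming a 256-bit bus, i.e. 16 x32 chips; the 16.5 Gbps value is my inference for the downclock, not a confirmed spec):

def peak_bw_gbs(gbps_per_pin, bus_bits=256):
    # Peak bandwidth = per-pin data rate x bus width, converted to GB/s.
    return gbps_per_pin * bus_bits / 8

print(peak_bw_gbs(18.0))   # 576.0 GB/s theoretical at full 18 Gbps
print(peak_bw_gbs(16.5))   # 528.0 GB/s, matching the downclocked figure above
print(529.6 / 576.0)       # ~0.92, i.e. the ~8% bench-vs-theoretical gap
print(peak_bw_gbs(5.5))    # 176.0 GB/s, the PS4 GDDR5 reference point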
 
If I read the Flute results correctly, then this thing has 16 1GB chips of (most probably) GDDR6 clocked at 18 Gbps. :runaway:

What am I seeing wrong? That benchmark is measuring 62.3 GB/s.
That's about what you'd get with 128-bit LPDDR4X at 4266 MT/s, with a theoretical maximum of 68 GB/s.

Plus it seems to have a 10 CU GPU.

This is something I'd expect from a 15-25W Raven Ridge / Picasso successor, not a next-gen console.
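
For reference, the LPDDR4X figure works out like this (4266 taken as MT/s on a 128-bit bus):

# 4266 MT/s * 128 bits / 8 bits-per-byte, in GB/s
print(4266e6 * 128 / 8 / 1e9)   # ~68.3 GB/s theoretical, vs. the 62.3 GB/s measured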
 
What am I seeing wrong? That benchmark is measuring 62.3 GB/s.
That's about what you'd get with 128-bit LPDDR4X at 4266 MT/s, with a theoretical maximum of 68 GB/s.

Plus it seems to have a 10 CU GPU.

This is something I'd expect from a 15-25W Raven Ridge / Picasso successor, not a next-gen console.
I am reading the single-core test, which, from what I could gather, is the bandwidth of one chip (but I could be wrong). And the high latency should indicate that this is GDDR6. The 16 '1024' entries should tell us there are 16 chips involved.

Where do you get that 10 CU GPU ?
 
I am reading the single-core test, which, from what I could gather, is the bandwidth of one chip (but I could be wrong). And the high latency should indicate that this is GDDR6. The 16 '1024' entries should tell us there are 16 chips involved.

Where do you get that 10 CU GPU ?
Sorry, I meant 12:

100-000000004-15_32/12/18_13F9

I thought the multi-core test would give out the total bandwidth.
But maybe it doesn't...?
 
Except if this is a case of Sony's well-known backwards-compatibility technique, where clocks match the prior system's clocks for easier emulation, like they are doing on the Pro. Then 1.6 GHz clocks would sound oddly familiar.

So you are saying that the PS5 is only 2x more powerful than the PS4? :p That's what you're implying if you think the 1.6 GHz is there to match BC performance with the PS4. (Using the PS4 as the reference, since not all PS4 titles operate correctly at full PS4 Pro speeds.)

Then again, this is the baseless next gen rumors thread so anything goes. :)

Regards,
SB
 
Sorry, I meant 12:

100-000000004-15_32/12/18_13F9

I thought the multi-core test would give out the total bandwidth.
But maybe it doesn't...?
No, it doesn't, from what I could gather. Here is a mobile AMD device with 4 CPU cores (probably a tablet) and 4GB of unknown RAM (probably LPDDR); latency is comparable to DDR4.

https://www.userbenchmark.com/UserRun/14294257

Notice that even though the RAM is unknown (because it's probably LPDDR), the number of chips is given the same way (2 chips here). The start of the serial number is not hidden, and we can see the GPU as well as the two storage devices (which are much faster than the incredibly shitty HDD used in the Flute test).

And in the Gonzalo serial number there was a 10 instead of the 12 here. We don't know what it is: either Navi 10 (12), or a 1000 MHz (1200 MHz) GPU base clock, since the boost clock comes just after (18).
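
A speculative parse of that identifier, following the thread's guesses (none of these field meanings are confirmed by AMD):

# Speculative decode of the OPN-style string; every field meaning is a guess.
opn = "100-000000004-15_32/12/18_13F9"
part, clocks, device_id = opn.split("_")
f1, f2, f3 = clocks.split("/")
print(f"CPU boost? {int(f1) / 10:.1f} GHz")                # 32 -> 3.2 GHz
print(f"Navi 12, or {int(f2) * 100} MHz GPU base clock?")  # the ambiguous field
print(f"GPU boost? {int(f3) * 100} MHz")                   # 18 -> 1800 MHz
print(f"PCI device ID? 0x{device_id}")                     # 13F9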
 
So you are saying that the PS5 is only 2x more powerful than the PS4? :p That's what you're implying if you think the 1.6 GHz is there to match BC performance with the PS4. (Using the PS4 as the reference, since not all PS4 titles operate correctly at full PS4 Pro speeds.)

Then again, this is the baseless next gen rumors thread so anything goes. :)

Regards,
SB

The PS4 Pro has a base clock of 1.6 GHz too, and the turbo clock was 2.1 GHz, according to the same codename database. This is how it works on consoles. Watch the DF video about Gonzalo; this is not a PC... The turbo clock is the normal clock...

EDIT: And clocks will probably be only one part of PS4 backward compatibility...
 
Ok then it's probably 16 GDDR6 chips.
I didn't know what to look for in that "userbenchmark". I'd never heard of it before.
 
The PS4 Pro has a base clock of 1.6 GHz too, and the turbo clock was 2.1 GHz, according to the same codename database. This is how it works on consoles. Watch the DF video about Gonzalo; this is not a PC... The turbo clock is the normal clock...

EDIT: And clocks will probably be only one part of PS4 backward compatibility...

Yes, 1.6 GHz using the same or very similar CPU & GPU IP and performance characteristics.

A 1.6 GHz Ryzen isn't going to have even remotely the same CPU characteristics as a 1.6 GHz Jaguar on a per core basis. So it's not as simple as just matching GHz or even core counts.
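
A toy numerical example of that point (the 2x per-clock figure is a made-up placeholder, not a measured IPC ratio):

# Equal clocks do not mean equal performance across different CPU IP.
jaguar_perf = 1.6 * 1.0   # 1.6 GHz, Jaguar per-clock throughput normalized to 1.0
ryzen_perf = 1.6 * 2.0    # 1.6 GHz, assuming ~2x Jaguar's per-clock throughput
print(ryzen_perf / jaguar_perf)   # ~2x the work done at the exact same clock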

It's more than likely a laptop APU, but people can dream right? :)

Regards,
SB
 
Yes, 1.6 GHz using the same or very similar CPU & GPU IP and performance characteristics.

A 1.6 GHz Ryzen isn't going to have even remotely the same CPU characteristics as a 1.6 GHz Jaguar on a per core basis. So it's not as simple as just matching GHz or even core counts.

It's more than likely a laptop APU, but people can dream right? :)

Regards,
SB
It's not very likely that it's a laptop APU, for the single fact that it uses the fastest GDDR6 chips Samsung can provide. 16GB of GDDR6 as system memory? That would be suicidal and actually bad for laptops (latency, cost, TDP; all around pretty bad).
 