PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

If so, and that sounds reasonable, I'd like to hear Sony's rationale for 1.6 GHz instead of 1.8. How much cost or volume would the higher clock have added to the console?
 
It may be, as 3dilletante has speculated before, that ~1.6 GHz is where the "knee" of the curve is. Once you get past that point, power consumption and thus heat can start to increase quite rapidly, even with just a small change in clock.

Not for ALL chips, but for enough of them that it might impact the usable yield once you take into account the thermal limits of the cooling you've implemented for the console. So for some of them, perhaps a 100 MHz increase in clock only results in a 2-3 watt increase in power consumption and heat production. But for others, perhaps that increase results in far larger power consumption and heat production, such that either noise becomes higher than desired or they would have had to redesign the cooling system to accommodate that X percentage of chips.

Especially when you consider that variation from chip to chip means that "knee" sits at a different point for each one. For one chip the knee might be around 1.5 GHz, so at 1.6 GHz you're already hitting the absolute limit of the cooling solution with regards to the noise it produces. A chip with its "knee" at 1.6 GHz might have room to clock a bit higher. And a chip with the knee at 1.7 GHz could safely operate well above 1.6 GHz.

Hence, the clock they choose is going to be the one that allows the most chips to be used within whatever cooling solution and noise cap they have chosen.

This was something you could observe in early Radeon 7970 overclocking. Some cards could easily do 200 MHz overclocks at stock voltage with a minimal increase in heat. Some cards couldn't even manage a 50 MHz overclock with a voltage increase, with heat ramping up so much that the fan had to run twice as fast to try to maintain safe operating temperatures. Meanwhile, the vast majority of cards could do a 100-150 MHz overclock at stock voltage with just a slight increase in heat production.
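
To put rough numbers on that intuition, here's a toy sketch assuming dynamic power scales as C·f·V² and that the required voltage starts climbing steeply past a hypothetical knee; none of these constants are real Jaguar figures, they only illustrate the shape of the curve:

```python
# Toy model of the power/clock "knee" described above.
# Assumption (not actual Jaguar data): dynamic power ~ C * f * V^2, and
# above some frequency the voltage needed rises steeply with the clock.

def required_voltage(freq_ghz, knee_ghz=1.6, v_base=1.0, slope=0.4):
    """Hypothetical voltage/frequency curve: nearly flat below the knee,
    rising quickly above it."""
    if freq_ghz <= knee_ghz:
        return v_base + 0.05 * freq_ghz
    return v_base + 0.05 * knee_ghz + slope * (freq_ghz - knee_ghz)

def relative_power(freq_ghz, c=10.0):
    """Relative dynamic power, P ~ C * f * V^2 (arbitrary units)."""
    v = required_voltage(freq_ghz)
    return c * freq_ghz * v * v

for f in (1.5, 1.6, 1.7, 1.8, 2.0):
    print(f"{f:.1f} GHz -> {relative_power(f):5.1f} (relative power)")
# Past 1.6 GHz each 100 MHz step costs noticeably more than the step before it,
# and chip-to-chip variation shifts where that knee sits.
```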

Regards,
SB
 
If the PS4 'near final' drivers were dropping performance by a good 10%, I would also expect Sony to up the clocks. As it stands now, I've heard that final consoles are being produced, even packaged, as we speak. Apparently the OS is still being developed and will require a day-one update, but this update will also be included on all launch retail games; the deadline is somewhere in October.
Skype is not 100% certain to launch, or even to appear at all. But it would be great regardless, because it would mean cross-platform chat with friends.
 
Does anyone have info on the low-level memory controllers for the GDDR5 used in the PS4?

My question is motivated by features like "write data masking", which saves bandwidth on read-modify-write workloads, error correction, and dual-port simulation with dual memory page access (read/write parallelism?). Does anyone know if the PS4 OS is leveraging such features for performance? Maybe in the future?

On latency, a Synopsys article links the 8n-prefetch as a good match for a 64-byte CPU cache line, so on the granularity metric GDDR5 has no problems. As for timings, there is data on the GDDR5 Wikipedia talk page like:

GDDR5 timings as provided by a Hynix datasheet: CAS = 10.6 ns, tRCD = 12 ns, tRP = 12 ns, tRAS = 28 ns, tRC = 40 ns
DDR3 timings for Corsair 2133 MHz 11-11-11-28: CAS = 10.3 ns, tRCD = 10.3 ns, tRP = 10.3 ns, tRAS = 26.2 ns, tRC = 36.5 ns

From http://www.cse.psu.edu/~juz138/files/islped209-zhao.pdf (off-chip GDDR5, 2 GB, bandwidth = 320 GB/s, memory clock = 2.5 GHz): tRAS = 22 ns, tCL = 8 ns, tRP = 8 ns, tRC = 30 ns, tRCD = 8 ns, tRRD = 5 ns
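
As a sanity check on where those nanosecond figures come from, here's a minimal cycles-to-ns conversion. The command-clock assumptions (half the data rate for DDR3, a quarter of the per-pin data rate for GDDR5) are the usual conventions, and the exact speed grade behind the Hynix numbers isn't stated above, so 6.0 Gbps is an assumption:

```python
# Convert DRAM timings from command-clock cycles to nanoseconds, using the
# DDR3-2133 11-11-11-28 part quoted above as the test case.

def cycles_to_ns(cycles, command_clock_mhz):
    return cycles * 1000.0 / command_clock_mhz

ddr3_ck = 2133 / 2  # DDR3 command clock is half the data rate: ~1066.5 MHz
for name, cyc in (("CAS", 11), ("tRCD", 11), ("tRP", 11), ("tRAS", 28)):
    print(f"DDR3-2133 {name}: {cycles_to_ns(cyc, ddr3_ck):.1f} ns")
# ~10.3, 10.3, 10.3, 26.3 ns -- in line with the figures quoted above.

# Going the other way: assuming the Hynix datasheet is a 6.0 Gbps part,
# CK is data rate / 4 = 1.5 GHz, so CAS = 10.6 ns is roughly 16 CK cycles.
gddr5_ck_mhz = 6000 / 4
print(f"GDDR5 CAS cycles: {10.6 * gddr5_ck_mhz / 1000:.1f}")
```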

http://www.synopsys.com/Company/Publications/SynopsysInsight/Pages/Art5-ddr4-IssQ2-13.aspx

http://blogs.utexas.edu/jdm4372/2010/10/07/opteron-memory-latency-vs-cpu-and-dram-frequency/

http://blogs.utexas.edu/jdm4372/2011/03/10/memory-latency-components/
 
If the PS4 'near final' drivers were dropping performance by a good 10%, I would also expect Sony to up the clocks.
Where have you seen/heard that? Remember, 'in your dreams' doesn't count as a valid source.

We've heard about a recent driver for the other team dropping performance, but not Sony's, as far as I remember.
 
Where have you seen/heard that? Remember, 'in your dreams' doesn't count as a valid source.

We've heard about a recent driver for the other team dropping performance, but not Sony's, as far as I remember.

That's what I was saying: there is some thermal headroom in the design. After all, they didn't design the PS4 to have the fans running at 6000 rpm 100% of the time.
Now if there were a performance deficit, I am pretty sure they could manage a 5-10 watt power increase, no?
 
That's what I was saying: there is some thermal headroom in the design. After all, they didn't design the PS4 to have the fans running at 6000 rpm 100% of the time.
Now if there were a performance deficit, I am pretty sure they could manage a 5-10 watt power increase, no?

But all the evidence points to the opposite. Sony has quite the performance lead and, if anything, according to that Edge article and other dev comments, their drivers and software are in much better condition than the opposition's.
 
Correct me if I'm wrong here, but if they bump up the CPU, don't they have to up the GPU speed as well, just like Xbox One?

I was under the impression that for GPGPU operations it was most effective when the GPU clock was half that of the CPU.

So if they were to bump the CPU up to 1.8 GHz (+200 MHz), that would entail a 100 MHz increase on the GPU, which is more power hungry/hotter than Xbox One's GPU, and in a much smaller case.

A 2 GHz CPU clock would mandate a 1 GHz GPU, and so on.
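
For what it's worth, the arithmetic there is just the assumed 2:1 relationship applied to a few CPU clocks; the ratio itself is an impression from the post above, not a confirmed hardware constraint:

```python
# Implied GPU clock if it really must track the CPU clock at a 2:1 ratio
# (an assumption from the post above, not a confirmed constraint).

def implied_gpu_mhz(cpu_mhz, ratio=2.0):
    return cpu_mhz / ratio

for cpu_mhz in (1600, 1800, 2000):
    print(f"CPU {cpu_mhz} MHz -> GPU {implied_gpu_mhz(cpu_mhz):.0f} MHz")
# 1600 -> 800, 1800 -> 900, 2000 -> 1000
```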
 
If so, and that sounds reasonable, I'd like to hear Sony's rationale for 1.6 GHz instead of 1.8. How much cost or volume would the higher clock have added to the console?

I thought that the chip ran at its most efficient - power vs. heat wise - at 1.6 GHz. It can easily run at 2.0 GHz, but it will be much hotter in that case. There was a graph posted some time ago (quite some time ago, over 6 months I think?)
 
http://www.dualshockers.com/2013/09...specification-of-ps4-and-its-exclusive-games/
The second and more mysterious [conference] will be held on September the 19th at 10:30 AM local time, and will be titled “The World Created by Playstation 4”....

Sony Computer Entertainment Japan and Asia finally provided a brief blurb on what the conference will entail, and it’s quite interesting:

“Playstation 4” is the next generation computer entertainment system which will be released by Sony Computer Entertainment towards the end of this year, 2013. Detailed specification of this latest system and its exclusive software titles and the new gaming experience that can be realized with the “Playstation 4” platform will be introduced together with the latest information.
That might be simply a recap of everything for the Japanese audience, or it might be more interesting.
 
I thought that the chip ran at its most efficient - power vs. heat wise - at 1.6 GHz. It can easily run at 2.0 GHz, but it will be much hotter in that case. There was a graph posted some time ago (quite some time ago, over 6 months I think?)
It does, I accept that. But why not run it a little less efficiently and a little more powerfully? For a plugged-in box, efficiency isn't as important as it is for a laptop using an AMD APU.
 
It does, I accept that. But why not run it a little less efficiently and a little more powerfully? For a plugged-in box, efficiency isn't as important as it is for a laptop using an AMD APU.

Orbis may not have as much headroom.

We have factors like the undisclosed TDP, the large GPU, the higher-power memory bus, the smaller enclosure, and unknown manufacturing choices.

There's going to be a TDP ceiling, which the GPU and GDDR5 bus will go some way towards filling before considering what the CPU will do.
The last bit of info we have seen leaked is a downgrade of uncore bandwidth and the reduction in GDDR5 speed.

At the very least, that choice doesn't support the idea that they had a lot of headroom.
It may also be the case that Sony pulled the trigger after a certain number of respins instead of waiting for additional manufacturing improvements. Maybe the chip's physical improvements weren't as solid or consistent, or Sony had higher yield targets.


Perhaps we'll see a tablet version of the PS4?

I think we're at least a good 100W away from seeing that happening.
 
I thought the reason for the downgrade in BW was really just that they were pretty much forced to go with 5.5 Gbps modules from Hynix if they wanted 4 Gb capacity per chip. It's just what's available now.

I'm not sure we can state whether or not TDP was a contributing factor until we have the system in our hands to do an analysis on it.
 
Perhaps we'll see a tablet version of the PS4?
Joking aside... I never understood why Sony didn't go with integrating cost-reduced versions of their consoles into their smart TV line.
A decently big TV should be capable of dealing with 150 W of heat, and the "smart TVs" could just use the console hardware for their TV and media duties (there's at least a rather large overlap).
I don't see why there wouldn't be a market for a TV with the PS3's PSN titles as "apps".
 

Orbis may not have as much headroom.

We have factors like the undisclosed TDP, the large GPU, the higher-power memory bus, the smaller enclosure, and unknown manufacturing choices.

There's going to be a TDP ceiling, which the GPU and GDDR5 bus will go some way towards filling before considering what the CPU will do.
The last bit of info we have seen leaked is a downgrade of uncore bandwidth and the reduction in GDDR5 speed.

At the very least, that choice doesn't support the idea that they had a lot of headroom.
It may also be the case that Sony pulled the trigger after a certain number of respins instead of waiting for additional manufacturing improvements. Maybe the chip's physical improvements weren't as solid or consistent, or Sony had higher yield targets.

Also, the power supply is internal again, unlike the Xbox One's.

I think we're at least a good 100W away from seeing that happening.

Well, the PS4 uses about that much in total, so that's maybe overstating it slightly. The iPad 3 is at around 40 W, I think. The screen uses a lot of that power, so a screen with far better power efficiency than that particular IPS (which exist) could leave slightly over 30 W for the chipset. The device would use solid-state memory, which uses less power than the PS4's internal HDD and its Blu-ray drive, etc. So I would actually expect it to be possible before the end of its life cycle?
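
Laying that back-of-envelope out explicitly, and using only the rough figures from the post (the panel and storage numbers below are assumptions picked to land near the ~30 W chipset budget mentioned):

```python
# Rough tablet power budget sketch based on the figures in the post above.
# All numbers are ballpark assumptions, not measurements.

tablet_total_w    = 40  # iPad 3 ballpark quoted in the post
efficient_panel_w = 8   # assumed: a panel notably more efficient than that IPS
storage_misc_w    = 2   # assumed: flash storage + misc instead of HDD + Blu-ray

chipset_budget_w = tablet_total_w - efficient_panel_w - storage_misc_w
print(f"Chipset budget: ~{chipset_budget_w} W")  # ~30 W, versus roughly 100+ W for the console APU today
```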
 
I thought the reason for the downgrade in BW was really just that they were pretty much forced to go with 5.5 Gbps modules from Hynix if they wanted 4 Gb capacity per chip. It's just what's available now.

I'm not sure we can state whether or not TDP was a contributing factor until we have the system in our hands to do an analysis on it.

The Vgleaks article about Orbis over time had the downgrade happening before the capacity went up.
What's potentially interesting is that no speed grade above 5.0 Gbps had the option for 1.35 V until the introduction of the 5.5 Gbps chips.
Maybe there's a corner case where 5.5 Gbps at 1.35 V somehow yields better than 6.0 Gbps at 1.5 V at 4 Gb density, but the latter speed and voltage at 2 Gb density would have been more established.
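
As a very crude illustration of why 5.5 Gbps at 1.35 V could be attractive, assume the I/O switching power scales roughly with data rate × V² (this ignores termination, leakage and everything else a real PHY cares about):

```python
# Relative I/O switching power for the two speed/voltage options above,
# under the crude assumption P ~ data_rate * V^2.

def relative_io_power(gbps, volts):
    return gbps * volts ** 2

p_55 = relative_io_power(5.5, 1.35)
p_60 = relative_io_power(6.0, 1.50)
print(f"5.5 Gbps @ 1.35 V: {p_55:.2f}")
print(f"6.0 Gbps @ 1.50 V: {p_60:.2f}")
print(f"ratio: {p_55 / p_60:.2f}")  # ~0.74, i.e. roughly a quarter less in this toy model
```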
 
Joking aside... I never understood why Sony didn't go with integrating cost-reduced versions of their consoles into their smart TV line.
A decently big TV should be capable of dealing with 150 W of heat, and the "smart TVs" could just use the console hardware for their TV and media duties (there's at least a rather large overlap).
I don't see why there wouldn't be a market for a TV with the PS3's PSN titles as "apps".

IIRC, they did have a line of Sony TVs with a built-in PS2 a long time ago, though it never sold much.
 