PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Why would he? It's a part of the system that's irrelevant to gamers, and irrelevant to devs other than via the API they use to interact with it.

If it's a 400MHz ARM or a 600MHz ARM, will that make _anything_ in the system different for the purpose of making games? No. I would be very surprised if the developer documentation said anything about this chip other than that it exists, and how you interact with it.

The only reason the forum warriors are interested in the ARM chip is that they believe it will house the full OS and allow Sony to allocate all of the main resources to the game. If it did something like this, it would have been called out in the reveal or the interview. The guy talked about the number of compute queues, for goodness' sake.

It obviously won't allow Sony to allocate all resources to the game. However, by knowing the specs we might get some idea of just how many resources might be allocated to the game. Also, it would be interesting, for those who like casual games, if the chip could run some low-powered games like Angry Birds. For example, you could have the main game playing and a lightweight game in the background. Also, I would like to know if any extra processing power has been added to the GPU. There are tons of rumors about bespoke processing units and extra ALUs. All it would take is a sentence or two to put these issues to rest.
 
That was a horrible interview, imo. There were many questions that were not asked!
Most were answered by their silence. More ALUs on the GPU is a no; otherwise the FLOPs rating would be higher. GPU on the system chip with the ARM is also a no, otherwise it'd have been mentioned (at least nothing substantial; there may be some display hardware of sorts, though I seriously doubt it). There's nothing outstanding that needs to be known to understand the system and Sony's objectives.
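The arithmetic behind that inference, assuming the rumoured 18-CU GCN layout at 800 MHz (Sony hasn't broken the announced figure down officially):

```python
# Why "more ALUs" would show up in the FLOPs rating.
# Assumes the rumoured 18-CU GCN layout at 800 MHz; not an official breakdown.
cus, lanes, flops_per_lane, clock_hz = 18, 64, 2, 800e6  # 2 = multiply-add
tflops = cus * lanes * flops_per_lane * clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")  # -> 1.84, matching the announced figure
```

Any extra ALUs beyond that layout would have pushed the rating above 1.84.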
 
Does that mean the audio, video and zlib units are on separate chips, not inside the SoC?
That kind of confirms the south bridge doesn't have its own memory; it can use the GDDR5 (maybe they throttle it down really low?).

Yup! It addresses the memory use for the custom chip. He didn't talk about how the Blu-ray data is copied to the HDD though (e.g., whether and how the custom CPU is involved).

We will be able to find out if the dedicated units are on-board via a teardown. He flagged the audio decoder's performance (rumor says ~200 MP3 streams, 'bout half of 1 SPU). It may answer a bunch of questions, including, say, whether the video decode/encode can be addressed by the custom CPU during low(er)-power mode.
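As a sanity check on that "half of 1 SPU" comparison (the per-stream decode cost below is my assumption, nothing official):

```python
# Cross-check "~200 MP3 streams = about half an SPU".
# The per-stream decode cost is an assumption for illustration only.
spu_clock_hz = 3.2e9                # Cell SPU clock
half_spu_cycles = spu_clock_hz / 2  # cycles/second available on half an SPU

mp3_cycles_per_stream = 8e6         # assume ~8 MCycles/s per real-time MP3 stream
streams = half_spu_cycles / mp3_cycles_per_stream
print(f"half an SPU ~ {streams:.0f} real-time MP3 streams")  # -> ~200
```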

More interesting is his confirmation that a SPURS-like framework is being baked into hardware/firmware. The techniques he mentioned sound a lot like the "coupled" SPU + RSX jobs on the PS3.
 
there may be some display hardware of sorts, though I seriously doubt it.
Yeah, doesn't seem likely; how would you get the image out of the south bridge anyway, with display interface hardware very likely integrated into the APU? ...You'd have to wake that up too just to display anything, so why even bother with putting graphics anywhere else?
 
It obviously won't allow Sony to allocate all resources to the game. However, by knowing the specs we might get some idea of just how many resources might be allocated to the game. Also, it would be interesting, for those who like casual games, if the chip could run some low-powered games like Angry Birds. For example, you could have the main game playing and a lightweight game in the background. Also, I would like to know if any extra processing power has been added to the GPU. There are tons of rumors about bespoke processing units and extra ALUs. All it would take is a sentence or two to put these issues to rest.
So, it's 8 years after the release of the 360 and 7 years after the PS3. Can you tell me how much memory is reserved for the PS3 or 360? Not second- or third-hand reports from a dev, but a statement by the console maker to the public, or even to a gaming site. What clock speed does the XMA decoder on the 360 run at? Not even developers know this.

Something as simple as the memory reservation, and neither console maker has ever publicly spoken about it; what makes you think they would speak about similar stuff _before_ the release of the console? You're asking for specs on parts of the console that most likely won't even be accessible to developers. There's no reason for Sony to ever reveal the full specs for it. If you had asked those questions of Cerny, you would almost certainly have gotten the reply "We aren't talking about that". For all you know, Gamasutra did ask some of those questions.
 
The interview wasn't that great; he seemed fairly reticent.

Overall it seems to confirm there's no 14+4 setup.

Interesting on the unified vs eDRAM comments. The only problem is I suspect the PS4 is going to be fairly costly. He basically said yeah, eDRAM is cheaper but unified is better, and we went with the latter. It's interesting he was considering still using GDDR5 with eDRAM, rather than DDR3 with eDRAM. So the only thing saved would have been bus width? But then it also would have made 8GB impossible, right?
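Rough numbers on that trade-off (the clocks, bus widths, and chip sizes below are assumptions based on the rumours, nothing confirmed):

```python
# Back-of-the-envelope bandwidth and capacity arithmetic for the
# unified-GDDR5 vs DDR3+eDRAM question. All figures assumed/rumoured.

def bandwidth_gbs(bus_bits, gt_per_s):
    """Peak bandwidth in GB/s = bytes per transfer * transfers per second."""
    return bus_bits / 8 * gt_per_s

print(f"256-bit GDDR5 @ 5.5 GT/s: {bandwidth_gbs(256, 5.5):.0f} GB/s")    # ~176
print(f"256-bit DDR3-2133:        {bandwidth_gbs(256, 2.133):.0f} GB/s")  # ~68; eDRAM would cover the gap

# Capacity: GDDR5 parts are x32, so a 256-bit bus takes 8 chips,
# or 16 in clamshell mode. With 2 Gbit (256 MB) chips that caps at 4 GB;
# 8 GB needs 4 Gbit parts on the same bus.
for gbit in (2, 4):
    print(f"16 x {gbit} Gbit chips = {16 * gbit * 1024 // 8 // 1024} GB")
```

So a narrower bus saves pins and board cost, but with the GDDR5 densities of the time it also constrains how much memory you can hang off it.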

I think having a Western designer saved the PS4...
 
So, it's 8 years after the release of the 360 and 7 years after the PS3. Can you tell me how much memory is reserved for the PS3 or 360? Not second- or third-hand reports from a dev, but a statement by the console maker to the public, or even to a gaming site. What clock speed does the XMA decoder on the 360 run at? Not even developers know this.

Something as simple as the memory reservation, and neither console maker has ever publicly spoken about it; what makes you think they would speak about similar stuff _before_ the release of the console? You're asking for specs on parts of the console that most likely won't even be accessible to developers. There's no reason for Sony to ever reveal the full specs for it. If you had asked those questions of Cerny, you would almost certainly have gotten the reply "We aren't talking about that". For all you know, Gamasutra did ask some of those questions.

I have read articles on websites where the reserved memory of the PS3 was specified. I do not remember the numbers offhand.

My question to you is this...

Why would it be so damaging for Sony to reveal the specs of their system?

I'm sure all the devs will know it...

I'm sure Microsoft will find out...

Why leave the consumer in the dark? If their competitor will know -- either from devs telling them or hardware teardowns -- what is so horrible about letting the public know?
 
The interview wasn't that great; he seemed fairly reticent.

Overall it seems to confirm there's no 14+4 setup.

Interesting on the unified vs eDRAM comments. The only problem is I suspect the PS4 is going to be fairly costly. He basically said yeah, eDRAM is cheaper but unified is better, and we went with the latter. It's interesting he was considering still using GDDR5 with eDRAM, rather than DDR3 with eDRAM. So the only thing saved would have been bus width? But then it also would have made 8GB impossible, right?

I think having a Western designer saved the PS4...

I think it killed the PS4 for hardcore gamers who want photorealism. But for all the casual gamers, you are probably right.
 
I have read articles on websites where the reserved memory of the PS3 was specified. I do not remember the numbers offhand.

My question to you is this...

Why would it be so damaging for Sony to reveal the specs of their system?

I'm sure all the devs will know it...

I'm sure Microsoft will find out...

Why leave the consumer in the dark? If their competitor will know -- either from devs telling them or hardware teardowns -- what is so horrible about letting the public know?
Don't ask me. I'm just pointing out that 7 years into a console lifecycle, there are still unknown things (to end-users, and to developers in some cases) about both current consoles. The memory reservation numbers are covered by NDA, so the numbers you've seen are hearsay, rumors, or extrapolations from the few public docs out there. I'm afraid that if you believe Sony is ever going to announce the specs of the ARM chip, or detailed specs of their audio processor, or how much the memory reservations are, you're probably going to be disappointed. Some of that we'll glean from leaked docs and hearsay (like memory reservations), and some of it we'll never fully know (like the clock speed of the XMA decoder). I'd be inclined to put the ARM chip in the latter category.
 
There are no details of the ARM chip in the developer docs, because developers have no direct access to it. The same is true of a number of other functional blocks.
 
I have read articles on websites where the reserved memory of the PS3 was specified. I do not remember the numbers offhand.

My question to you is this...

Why would it be so damaging for Sony to reveal the specs of their system?

I'm sure all the devs will know it...

I'm sure Microsoft will find out...

Why leave the consumer in the dark? If their competitor will know -- either from devs telling them or hardware teardowns -- what is so horrible about letting the public know?

Because it doesn't matter, and in the end no one really cares? I don't think any of us would have gotten better answers than Gamasutra in "ten minutes with Cerny"; just my humble opinion.
 
The things they refused to talk about still haven't been leaked yet.
Details about the OS, the south bridge, and the security are obviously on a need-to-know basis.

I can't argue that, as a gamer, I would need to know any of it when even devs don't know, but heck would I love to know :smile:
 
Any dev on this board with experience of a US architecture will be well capable of telling us exactly how much idle time the GPU experiences (Sebbbi? Joker?) that could be spent on compute. Until they interject, I'm inclined to believe that a GPU's ALUs are rarely idle between jobs, and any compute will be taking away from the graphics task that could be done instead. It'll be a compromise use of resources as ever.
There are often times when ALUs are idle. Sometimes it's while rendering shadow maps which often don't have heavy vertex shaders and have no pixel shaders. Other times you might be limited by blending for translucency, etc. rather than shaders. If you have ALU heavy compute work it could be very efficient to run it in parallel with certain graphics workloads.

The proof of this situation is FurMark. Before the days of power containment, it lit up the shader core more than a normal graphics workload, showing that games often leave units idle for periods of time.
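To put toy numbers on it (the pass times and ALU fractions below are invented for illustration, not profiling data from any real title):

```python
# Toy frame-timeline model: how much ALU throughput sits idle inside
# "GPU-busy" graphics passes. All durations and utilisation fractions
# are invented for illustration; real workloads vary wildly.

# (pass, GPU-busy milliseconds, fraction of peak ALU throughput used)
passes = [
    ("shadow maps (depth-only)", 3.0, 0.10),  # rasteriser/Z-rate bound
    ("g-buffer / opaque",        6.0, 0.70),
    ("translucency",             2.0, 0.30),  # blend-rate bound
    ("post-processing",          3.0, 0.85),
]

busy_ms = sum(ms for _, ms, _ in passes)
alu_ms  = sum(ms * frac for _, ms, frac in passes)

print(f"GPU busy:            {busy_ms:.1f} ms")
print(f"ALU-equivalent work: {alu_ms:.1f} ms")
print(f"idle ALU headroom:   {busy_ms - alu_ms:.1f} ms")  # what parallel compute can soak up
```

The headroom only comes for free if the compute job isn't fighting the graphics pass for the same bottleneck (bandwidth, caches).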

Compute should indeed help with advanced graphics optimisation (although then is it really compute, or graphics work? Is it not time to give up on the distinctions entirely?).
The distinction between compute and graphics is that graphics feeds the rasterizer while compute feeds a data structure in memory. Until the rasterizer is accessible from the shader core, there will continue to be a distinction at the hardware level. The distinction from a software viewpoint will likely remain until there is no API.
 
There are often times when ALUs are idle. Sometimes it's while rendering shadow maps which often don't have heavy vertex shaders and have no pixel shaders. Other times you might be limited by blending for translucency, etc. rather than shaders. If you have ALU heavy compute work it could be very efficient to run it in parallel with certain graphics workloads.

The proof of this situation is FurMark. Before the days of power containment, it lit up the shader core more than a normal graphics workload, showing that games often leave units idle for periods of time.

Yeah, and FurMark makes your GPU a furnace too. Seriously, you can run it in a window and watch your temps skyrocket. Hence one reason why AMD would refer to it as a "power virus" and the like.

These optimization things sound good, but there is that caveat: nothing's free.
 
The software details will be hard to figure out; in fact, they will likely change over time. The low-level APIs may be simplistic at the beginning, or even missing or delaying some advanced features.

The hardware details are easier to guess after teardown sites and other technical sites take a close hard look at the system.
 
My question to you is this...

What does Sony GAIN from giving out that info? If they gain nothing, don't bother. You're only looking at the specs situation from your own curiosity. You need to just accept it's not going to happen and stop wishing for it or arguing in favour of full reveals. Compare every other CE device and there aren't full specs (what processor and clock speed does your camera or PVR use, for example?). It doesn't happen. Specs are only given out where they are believed to offer a marketing advantage.

Just give it up already!
 
There are often times when ALUs are idle. Sometimes it's while rendering shadow maps which often don't have heavy vertex shaders and have no pixel shaders.
Why doesn't that consume all the ALUs for a very short time then? I'd have thought, in my naivety, that vertex work saturates the ALUs for a brief moment to churn through those jobs, and then the ALUs sit on the pixel work for much of the scene, churning through both pixels and then post-processing. In my understanding of US GPUs, there's not going to be a moment when a few ALU pipes are active and the rest are idle, or when the whole set of ALUs is idle waiting for something to do.
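Sketching it with toy numbers (everything below is assumed for illustration, not official figures), the answer seems to be that a depth-only pass is paced by fixed-function rate, not by how fast the ALUs could chew through it:

```python
# Why ALUs can't just "burn through" a shadow pass in a brief burst:
# the pass is paced by rasteriser/depth fill rate, so it occupies wall-clock
# time while demanding almost no shader math. All figures are assumptions.

clock_hz       = 800e6                   # assumed GPU clock
z_pixels_clock = 64                      # assumed depth-only fill rate (Z often runs fast)
peak_flops     = 18 * 64 * 2 * clock_hz  # rumoured 18 CUs -> ~1.84 TFLOPS

pixels = 4 * 2048 * 2048 * 4             # four 2k shadow cascades, ~4x overdraw (assumed)
pass_s = pixels / (z_pixels_clock * clock_hz)

# Say the vertex work keeps the ALUs ~5% busy during the pass:
unused_gflop = peak_flops * 0.95 * pass_s / 1e9

print(f"depth pass: {pass_s*1e3:.2f} ms of GPU time")
print(f"~{unused_gflop:.1f} GFLOP of ALU capacity idle in that window")
```

However fast the ALUs are, the rasteriser can only feed pixels at its fixed rate, so the shader core marks time for the whole pass.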
 
I remember a while back someone mentioned 32 ALUs was perhaps overkill and would never be used to their fullest. With this information, does it now make sense?

A bit of extra resource so compute can always have some ALU resources to play with?
 
That was not a great interview. Only devs and first-party teams will have more detailed info. Gamers and average consumers don't need to know it, so I don't think it matters.

The PS4 has got a big thumbs-up from devs, so that is a great start, unlike the PS3.
 