PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Three years is going to be a long time on all next-gen console fronts.
 
@Rangers - can you reply to this, genuine question... thanks

Huh? I'm not sure where I supposedly said that (maybe I meant the PS4 has 40% more compute now after the XBO clock bump? That's the only thing related to ~40% I can think of...)

PS4 and XBO have the same CPU...
 
Huh? I'm not sure where I supposedly said that (maybe I meant the PS4 has 40% more compute now after the XBO clock bump? That's the only thing related to ~40% I can think of...)

PS4 and XBO have the same CPU...

My bad, I reread your post; you were referring to the GPU.... :oops:
 
I could see this gen's cycle being slightly shorter, actually, because the hardware is relatively easier to get into and max out, and cost reduction may go faster this time too. Which is to say it could go back to the more normal 5-year cycle. However, Sony tends to envision a total life-cycle of about 10 years for each of their generations regardless of when the next one starts, so he may just as well be referring to that.
 
The distinction isn't as stark these days. The CUs aren't fully featured cores, but they have a mix of features.
My description was vague, since it was meant for someone not familiar with CPU/GPU intricacies... but you raised the interesting question of "what makes a core?"

Probably. What makes me not consider them a core is:
They don't really service interrupts or have much in the way of exception handling and context switching, and they rely heavily on a separate hardware unit to initialize their kernels...
...and the fact that you cannot index them directly - unlike an SPU, say (an embedded RISC processor would still be a full processor even if you cut off the outside access, since that access is in principle there, at least).

An embedded RISC processor with no access to the outside would still support interrupts and all the other features that are part of a processor. CUs do not (at least, not yet).
 
My description was vague, since it was meant for someone not familiar with CPU/GPU intricacies... but you raised the interesting question of "what makes a core?"
My lowest-level definition is that a "core" is a physically distinct path that an instruction stream's associated signals traverse between stages that reside in a generic Von Neumann or related architecture's memory.
This does still require at least one ALU, and at least one port to memory, but exclusivity for either is not absolutely necessary--although it gets iffy once this gets reduced to near-absurdity.

At a bare minimum, there would need to be a physically independent pipeline control unit, a mechanism for generating the program counter, and some physically independent context storage necessary for the prior two elements to work through an instruction stream.

So, it's a physically separate pipeline control, and at least enough storage for a program counter.
The gray area, since I didn't require ALU exclusivity as an absolute, is if the necessary ALU for generating the PC were somehow shared between two cores. However, that would require an arrangement where an ALU is subject to two schedulers or pipeline control units that are still somehow physically distinct, and I haven't thought up an arrangement where that's workable.

Everything beyond that is increasingly negotiable in terms of sharing or necessity.
Decode? Could be shared, and for some defunct VLIW CPUs, not necessary.
FP? Could be shared, and CPUs predate that functionality.
Cache? Same.

A Von Neumann architecture, in its simplest theoretical implementation, makes no provision for interrupts. The preponderance of all cores in the world do service interrupts, but at least the possibility of a simple system that utilizes no interrupt signals is allowed for, as odd as that might be.
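
To make the bare-minimum definition concrete, here is a toy sketch in Python; the two-instruction ISA and every name in it are made up for illustration and don't correspond to any real architecture. It has exactly the pieces listed above: a program counter, a physically independent fetch/decode/execute loop standing in for pipeline control, one ALU path, one memory port, and, per the last paragraph, no provision for interrupts.

```python
# Toy model of the bare-minimum "core": code and data share one
# Von Neumann memory, and the hypothetical ISA has two instructions
# plus a halt.
MEMORY = [
    ("LOAD_IMM", 5),   # acc = 5
    ("ADD_MEM", 6),    # acc += MEMORY[6]
    ("HALT", 0),
    0, 0, 0, 37,       # data region, living in the same memory as code
]

def run(memory):
    pc = 0    # program counter: the minimal per-core context storage
    acc = 0   # single accumulator: state for the one required ALU path
    while True:                    # pipeline control loop
        op, operand = memory[pc]   # fetch, via the one required memory port
        pc += 1                    # PC generation
        if op == "LOAD_IMM":       # decode + execute
            acc = operand
        elif op == "ADD_MEM":
            acc += memory[operand]
        elif op == "HALT":
            return acc
        # Note: no interrupt check anywhere in the loop, matching the
        # simplest theoretical Von Neumann implementation described above.

print(run(MEMORY))  # -> 42
```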

...and the fact that you cannot index them directly - unlike an SPU, say (an embedded RISC processor would still be a full processor even if you cut off the outside access, since that access is in principle there, at least).
The choice to hide direct indexing of the hardware is more of an API or platform issue. At least from the POV of the GPU's internal firmware, it can make the distinction.
The PS4 in particular may be allowing lower-level access for compute, although what that entails is not disclosed.

In the other direction, the increasing virtualization of execution resources and hardware-level sleight of hand make it increasingly unlikely that the cores software believes exist, or believes it is using, are what it actually gets.

An embedded RISC processor with no access to the outside would still support interrupts and all the other features that are part of a processor. CUs do not (at least, not yet).
Not yet, but the architecture has been drilling down the list, and is now missing only a few features out of the large list of attributes ascribed to cores.
I think it's close enough to consider them, for the most part, primitive cores, with some haziness when it comes to the scheduling hardware that kicks in at the boundaries of kernel execution--which would include scheduling new contexts or someday potentially switching them.
 
http://realgamernewz.com/15886/play...ained-flexible-if-devs-working-closely-w-sony
The RAM of the PS4 will be flexible to some extent, in regard to how much is used for the operating system and how much is used for the games themselves. While 5.5 GB of RAM out of the 8 GB in total is set aside for the game, there is an extra buffer of around 512 MB which any developer can apply for, so that 6 GB can be used for the game instead.

Any developer choosing this path would have to work closely with Sony, as the buffer is likely meant to prevent things from overflowing. However, gamers can rejoice that the PS4 will have this flexible RAM option, meaning it's likely that Sony 1st party titles and devs working closely with Sony will excel at optimal usage of the PS4's hardware capabilities.
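
For what it's worth, the split the article describes works out like this as back-of-envelope arithmetic (the figures are the article's; the snippet and its names are just illustrative):

```python
GIB = 1024 ** 3  # binary GiB; the article's figures are nominal "GB"

TOTAL       = 8.0 * GIB   # total PS4 RAM, per the article
GAME_BASE   = 5.5 * GIB   # guaranteed game allocation
FLEX_BUFFER = 0.5 * GIB   # the ~512 MB a developer can apply to Sony for

print((TOTAL - GAME_BASE) / GIB)                # 2.5 -> OS share by default
print((TOTAL - GAME_BASE - FLEX_BUFFER) / GIB)  # 2.0 -> OS share if buffer granted
print((GAME_BASE + FLEX_BUFFER) / GIB)          # 6.0 -> maximum for the game
```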
 
I don't think they are shipping the dev kit UI to retail. BTW what exactly does managing your own apps even mean?

Specifically closing apps instead of the system doing so for you may imply some apps can run in the background (or the user is doing memory management on their own). Or maybe it's just a devkit thing.
 
It's the same on iOS, Android and Vita, FYI: the option to actively close an app. Except I think on Vita you can only explicitly close apps, with a maximum of 6 open at once. And I have heard Sony imply that Vita's OS may be the foundation for the PS4's OS as well.
 
Closing background apps on a console, what's next?
Maybe the reason you only get 5.5 GB for games is that a virus killer will be mandatory and there will be a healthy market for system-restore and clean-up utils.
 
http://www.examiner.com/article/geo...ch-ram-you-can-actually-touch?cid=db_articles

Not much above 5 or 6, you start to run out of the amount of actual usable RAM because you can't touch all of it per frame. The actual amount of compute power means that if you're running at say 60 frames per second, there is only so much RAM you can actually touch and do something with per particular frame.

Makes sense, as discussed here before.
I don't particularly mind if the PS4 only gets 5~6 GB of RAM for the game.

176 GB/s ÷ 30 fps ≈ 5.87 GB of theoretical maximum usable bandwidth per frame, and we know it's hard to even come close to this number.

Bandwidth will still be the real limiter.
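
Here's that arithmetic in runnable form, for both 30 and 60 fps (176 GB/s is the PS4's quoted peak GDDR5 bandwidth; the rest of the snippet is illustrative):

```python
# Upper bound on memory touched per frame: peak bandwidth / frame rate.
# Real-world efficiency is well below peak, so actual numbers are lower.
PEAK_BW_GB_S = 176.0  # PS4 GDDR5 peak bandwidth, GB/s

for fps in (30, 60):
    per_frame = PEAK_BW_GB_S / fps
    print(f"{fps} fps: at most {per_frame:.2f} GB touchable per frame")
# 30 fps: at most 5.87 GB touchable per frame
# 60 fps: at most 2.93 GB touchable per frame
```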
 
I've never understood that logic personally.
That implies that every bit of data needs to be accessed every frame, or even on average every frame.
 
I've never understood that logic personally.
That implies that every bit of data needs to be accessed every frame, or even on average every frame.

The way I understand it, yes you do need to re-access pretty much everything every frame, even if the data is carried over from the previous frame. Some stuff you won't have to re-write, but if it's needed to render a frame, you will have to read it at some point.

Of course stuff like textures that don't appear in this particular frame won't be accessed.
 
The way I understand it, yes you do need to re-access pretty much everything every frame, even if the data is carried over from the previous frame. Some stuff you won't have to re-write, but if it's needed to render a frame, you will have to read it at some point.
There's a lot more in memory than just graphics, and not even every texture in memory would be displayed all the time, every frame.
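
A quick hypothetical model of that objection: if only some fraction of resident data is actually read in a given frame, the bandwidth ceiling on "useful" RAM grows by the inverse of that fraction. The fractions below are made up, purely to show the sensitivity:

```python
PEAK_BW_GB_S = 176.0  # PS4 GDDR5 peak bandwidth, GB/s

def useful_resident_gb(fps: int, touched_fraction: float) -> float:
    # RAM that can still be "useful" when only touched_fraction of it
    # is read in any one frame (hypothetical model, assumes peak bandwidth).
    return (PEAK_BW_GB_S / fps) / touched_fraction

for frac in (1.0, 0.5, 0.25):
    print(f"30 fps, {frac:.0%} touched per frame: "
          f"{useful_resident_gb(30, frac):.1f} GB useful")
# 100% -> 5.9 GB, 50% -> 11.7 GB, 25% -> 23.5 GB
```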
 

http://www.examiner.com/article/geo...ch-ram-you-can-actually-touch?cid=db_articles
Makes sense, as discussed here before.
I don't particularly mind if the PS4 only gets 5~6 GB of RAM for the game.

176 GB/s ÷ 30 fps ≈ 5.87 GB of theoretical maximum usable bandwidth per frame, and we know it's hard to even come close to this number.

Bandwidth will still be the real limiter.
So 5.5-6 GB is available for games?
 