PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Regardless, 8 ACEs vs 2 ACEs, you are still operating with 18 CUs. That isn't changing. 64 vs 16 threads is really about the number of compute threads available. It's not like 64 threads can do more work than 1 thread, if that 1 thread manages to fully saturate.
 

Yes, I understand this too, but it seems there is more opportunity to use extra threads on the PS4 or Kaveri architecture, when the graphics or compute threads are not using all the available ALUs.

I never said that it means you can use four times more threads within the same power envelope; that would be ridiculous. I just said that it is more efficient on the PS4 or Kaveri architecture, and that you can run more threads if the limit of your workload is cache thrashing. I don't like oversimplification.

I know that working with fixed hardware and a console API is an advantage, but the ROPs, TMUs and bandwidth are not more efficient on a console.

Edit: And for multiplatform titles the limit will be 16 threads...
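As a toy illustration of the saturation point being argued above (the one-work-unit-per-CU-per-tick model and all the numbers are invented for the example, not real GCN behaviour): extra compute queues only buy anything when the graphics workload leaves CUs idle.

```cpp
// Toy model only (invented numbers, not real hardware behaviour): a GPU with N
// CUs can retire at most N units of work per tick, however many queues feed it.
#include <algorithm>
#include <cstdio>

struct Gpu { int cus; };

// Work submitted this tick from the graphics queue plus all async compute queues.
int retired_per_tick(const Gpu& gpu, int graphics_work, int async_work) {
    int graphics_done = std::min(graphics_work, gpu.cus);
    int idle_cus      = gpu.cus - graphics_done;     // only idle CUs can take async work
    int async_done    = std::min(async_work, idle_cus);
    return graphics_done + async_done;
}

int main() {
    Gpu ps4_like{18};  // 18 CUs, as discussed above
    // Graphics already saturates every CU: 64 queued compute items add nothing.
    std::printf("%d\n", retired_per_tick(ps4_like, 18, 64));  // 18
    // Graphics leaves ALUs idle: async compute fills the gap, up to the same cap.
    std::printf("%d\n", retired_per_tick(ps4_like, 10, 64));  // 18
}
```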
 
I agree with Shifty.
Woah, there! I didn't make any conclusion. I just observed that 8 ACEs is a choice and, as yet, unproven. It might be a golden choice, might be overkill. The fact AMD have rolled it into subsequent architectures suggests they think it might be more useful than not. Not sure what the overhead versus 2 ACEs is, though. If it's utterly trivial to implement, there's no reason not to add it even if in 5 years everyone drops back to 2 ACEs because these saturate the ALUs. The future's unknown and it's for the devs to work out how to use async compute. Once they've sussed it, graphics IHVs will build more optimal designs.

Edit: And for multiplatform titles the limit will be 16 threads...
Not necessarily. I'd even say likely not. Threading is hardware independent - you don't know how many cores your target hardware has in cross-platform titles. That goes for GPU as well as CPU workloads. Threading now revolves around jobs and a scheduler managing work across the available resources.
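A minimal sketch of that job-based approach (the JobQueue here is invented for illustration, not any particular engine's API): the worker count is taken from whatever hardware the title happens to be running on, so the same code scales across core counts.

```cpp
// Hypothetical job system: N workers pulling jobs from a shared queue, where N
// is discovered at runtime rather than hard-coded per platform.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobQueue {
public:
    void push(std::function<void()> job) {
        std::lock_guard<std::mutex> lock(mutex_);
        jobs_.push(std::move(job));
    }
    bool pop(std::function<void()>& job) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (jobs_.empty()) return false;
        job = std::move(jobs_.front());
        jobs_.pop();
        return true;
    }
private:
    std::mutex mutex_;
    std::queue<std::function<void()>> jobs_;
};

int main() {
    JobQueue queue;
    for (int i = 0; i < 64; ++i)
        queue.push([i] { std::printf("job %d\n", i); });

    // Size the worker pool to the actual machine, not to any one console.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back([&queue] {
            std::function<void()> job;
            while (queue.pop(job)) job();   // drain until the queue is empty
        });
    for (auto& t : pool) t.join();
}
```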
 
:oops::oops::oops::oops:
lol whoops, haha... sorry! Your post articulated perfectly what I could not, which was: provide additional threads in case, for whatever reason, a developer wanted the flexibility of doing it that way, future-proofing essentially.
 
May I ask how the "deep color" function works on the PS4? I can see "12 bit" on my TV when I use deep-color mode, but how does the PS4 create 4 more bits for each primary color, and which processor creates the additional bit information? What's the major difference in deep-color mode? Thank you!
 

Pretty sure it doesn't have to "create" more bits, it just allows color reproduction to be more accurate: instead of snapping to one of the colors in the 0-255 (or really 16-235) range, it uses a range of 0-4095 per channel.
 
0-255 (or 16-235) requires 8 bits per R, G and B channel. 0-4095 would require 12 bits per R, G and B channel. There are three deep colour standards of 30, 36 and 48 bits, comprising 10, 12 and 16 bits per channel respectively.

So the question remains: how is this increased colour depth produced? Surely the games need to be using a wider colour space than 16.7 million colours (24-bit).
 
The colors are probably calculated internally as floats, and then simply rounded to the nearest value that fits whatever color space the device is outputting in.
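A minimal sketch of that float-then-round idea (an assumed pipeline for illustration, not the PS4's actual output path): the same internal float value just lands on a finer grid when the output has more bits.

```cpp
// Colours kept as floats internally, then quantised to the output bit depth.
#include <cmath>
#include <cstdint>
#include <cstdio>

// Quantise a linear [0,1] float channel to an integer with 'bits' of precision.
uint16_t quantise(float channel, int bits) {
    const float max_code = static_cast<float>((1u << bits) - 1);  // 255, 1023, 4095...
    const float clamped  = std::fmin(std::fmax(channel, 0.0f), 1.0f);
    return static_cast<uint16_t>(std::lround(clamped * max_code));
}

int main() {
    float c = 0.3572f;                              // some internally computed channel value
    std::printf("8-bit : %d\n", quantise(c, 8));    // 91
    std::printf("12-bit: %d\n", quantise(c, 12));   // 1463
}
```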
 
With the backbuffers as FP formats, they can convert to whatever output colour space they want. But even if the back buffer is simply 8-bit RGB colour, they can just upscale to deep colour. I doubt anyone could see the difference anyway in most game situations. The use of lower-bit assets is going to introduce more artefacts than the 24-bit colour space.
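For the "just upscale 8-bit to deep colour" case, the usual trick is bit replication, which maps 0 to 0 and 255 to 4095 exactly; this is a generic technique, not anything confirmed about the PS4's display pipeline.

```cpp
// Expand an 8-bit channel to 12 bits by bit replication: no new information is
// created, the existing codes are just spread evenly across the wider range.
#include <cstdint>
#include <cstdio>

uint16_t expand_8_to_12(uint8_t c) {
    // Shift up by 4 and refill the low bits with the channel's own top bits.
    return static_cast<uint16_t>((c << 4) | (c >> 4));
}

int main() {
    std::printf("%d %d %d\n", expand_8_to_12(0), expand_8_to_12(128), expand_8_to_12(255));
    // 0 2056 4095
}
```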
 
I wonder whether the new HDR TVs will really be all that people are hyping them up to be, and how this might affect whatever the consoles output to them.
 
Games would have to be written to support HDR out. Should be quite easy, just changing the final tone-mapping, but it's something that'll likely not see much effort applied. Supporting a tiny niche of displays is never a high priority.
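A hedged sketch of what "just changing the final tone-mapping" could look like (Reinhard and the made-up display peak are placeholders, not what any shipping game actually does): the SDR path squashes highlights together, the HDR path keeps the headroom.

```cpp
// Sketch only: two alternative output curves applied to the same scene values.
#include <algorithm>
#include <cstdio>

// Scene-referred luminance in arbitrary linear units.
float tonemap_sdr(float x)             { return x / (1.0f + x); }            // squash everything into [0,1)
float tonemap_hdr(float x, float peak) { return std::min(x / peak, 1.0f); }  // keep headroom up to the display's peak

int main() {
    float a = 8.0f, b = 16.0f;  // two bright highlights, one twice the other
    // SDR: both end up crammed near the top of the range, highlight detail is lost.
    std::printf("SDR: %.3f vs %.3f\n", tonemap_sdr(a), tonemap_sdr(b));                 // 0.889 vs 0.941
    // HDR (display peak taken as 20 units): the 2x difference survives.
    std::printf("HDR: %.3f vs %.3f\n", tonemap_hdr(a, 20.0f), tonemap_hdr(b, 20.0f));   // 0.400 vs 0.800
}
```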
 
But that's my doubt exactly. Is it just a matter of shifting the colour gamut, or is it something more complicated? Also, I just can't see how a non-OLED screen can actually be HDR. On an LCD, all the gains from the supposed increase in brightness will be lost to even worse black levels than they have now.
Skeptical, to say the least.
 
Local dimming (an LED array backlight) works on LCDs, to some extent. That was actually the first solution for HDR displays. Overall though, I don't see the point of Deep Colour for display - it has value in content authoring. It's more of a marketing gimmick to try and differentiate, and worth little to the consumer.
 
With the backbuffers as FP formats, they can convert to whatever output colour space they want. But even if the back buffer is simply 8-bit RGB colour, they can just upscale to deep colour. I doubt anyone could see the difference anyway in most game situations. The use of lower-bit assets is going to introduce more artefacts than the 24-bit colour space.
At least those lower-bit assets can be displayed more accurately with the higher bit depth. Lower-bit assets displayed in a lower-bit colour space can cause a lot more noticeable banding.
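A quick illustration of the banding point (the gradient range and sample count are arbitrary): a smooth dark gradient only gets a couple of dozen distinct codes at 8 bits, versus several hundred at 12 bits.

```cpp
// Count how many distinct output codes a smooth gradient gets at a given bit depth.
#include <cmath>
#include <cstdio>
#include <set>

int distinct_codes(int bits, float lo, float hi, int samples) {
    std::set<long> codes;
    const float max_code = static_cast<float>((1 << bits) - 1);
    for (int i = 0; i < samples; ++i) {
        float v = lo + (hi - lo) * i / (samples - 1);
        codes.insert(std::lround(v * max_code));
    }
    return static_cast<int>(codes.size());
}

int main() {
    // A dark 0.0 - 0.1 gradient spread across a wide strip of pixels.
    std::printf("8-bit : %d steps\n", distinct_codes(8, 0.0f, 0.1f, 1000));   // ~27 visible bands
    std::printf("12-bit: %d steps\n", distinct_codes(12, 0.0f, 0.1f, 1000));  // ~411, far smoother
}
```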
 
Local dimming (an LED array backlight) works on LCDs, to some extent. That was actually the first solution for HDR displays. Overall though, I don't see the point of Deep Colour for display - it has value in content authoring. It's more of a marketing gimmick to try and differentiate, and worth little to the consumer.
Most modern LCD panels use PWM for lighting/dimming, and it wouldn't be too difficult to exploit PWM at a sufficiently high frequency to fairly effectively mimic the effect of HDR. We stumbled upon this while trying to fix a faulty backlight on our custom 120" monitors*, which was weirdly strobing, and we figured we could rig a separate clock for the backlight.

*now being patented as a widebeam anti-epileptic phase cannon :yep2: When we get into a war with epileptic aliens, we'll have the ultimate weapon.
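A rough sketch of the basic idea (the zone layout and the minimum-duty floor are invented for illustration, not how any real TV firmware works): drive each backlight zone's PWM duty cycle from the brightest pixel that zone has to show in the frame.

```cpp
// Per-zone duty cycle from per-zone peak brightness.
#include <algorithm>
#include <array>
#include <cstdio>

constexpr int kZones = 8;  // assumed 1D strip of independently driven zones

// peak[i] = brightest linear pixel value (0..1) zone i must display this frame.
std::array<float, kZones> duty_cycles(const std::array<float, kZones>& peak) {
    std::array<float, kZones> duty{};
    for (int i = 0; i < kZones; ++i)
        duty[i] = std::clamp(peak[i], 0.05f, 1.0f);  // small floor so dark zones aren't switched fully off
    return duty;
}

int main() {
    std::array<float, kZones> peaks{0.02f, 0.1f, 0.9f, 1.0f, 0.3f, 0.05f, 0.0f, 0.6f};
    for (float d : duty_cycles(peaks))
        std::printf("%.2f ", d);  // longer "on" time for the zones holding bright content
    std::printf("\n");
}
```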
 
Sorry, pulse-width modulation. Backlights are almost never at a constant brightness; the backlight is often mildly pulsing, but to a degree that is rarely perceptible. Even less so where the backlight is regionalised to improve contrast and black levels.

Of course, when you introduce an external clock... :runaway:
 
Ah, ok. But last time I checked, local-dimming LED sets are still very expensive and still have the usual problem of too few LEDs, which causes obvious issues.
 
Local dimming (an LED array backlight) works on LCDs, to some extent. That was actually the first solution for HDR displays. ....

And still is... LED arrays probably just have better resolution these days.

I read some info about it from CES 2015, asked some questions, and got this cryptic answer:

"The TV isn't going to be at 1000nits all the time, it will just peak in certain parts of the image, highlighting the brighter elements. In order to achieve this, what some manufacturers are doing is channelling more power to the brighter parts of the image and less to the darker parts, which should help with the blacks as well as make the TV more energy efficient."
 