PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Status
Not open for further replies.
The medium prevents the practice of completely slamming songs to the max when being mixed, i.e. the Loudness Wars. So as a by-product of this, vinyl versions can actually sound better.

True. The RIAA playback filter boosts the bass end of the spectrum by roughly 20 dB (and attenuates the high-frequency end), which ensures good old-fashioned rock and roll + dance music actually has a beat when you turn up the volume.
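To put a number on that, the standard RIAA playback curve can be evaluated directly from its three published time constants (3180 µs, 318 µs, 75 µs). This is a minimal sketch, not a phono-stage design; the `riaa_playback_gain_db` helper is just a name chosen here:

```python
import math

# Standard RIAA playback time constants (seconds)
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_gain_db(f):
    """Magnitude of the RIAA playback (de-emphasis) EQ at f Hz, in dB."""
    w = 2 * math.pi * f
    mag = math.sqrt(1 + (w * T2) ** 2) / (
        math.sqrt(1 + (w * T1) ** 2) * math.sqrt(1 + (w * T3) ** 2)
    )
    return 20 * math.log10(mag)

# Gain relative to the 1 kHz reference: bass up ~+19 dB, treble down ~-20 dB
ref = riaa_playback_gain_db(1000)
print(round(riaa_playback_gain_db(20) - ref, 1))     # ≈ +19.3
print(round(riaa_playback_gain_db(20000) - ref, 1))  # ≈ -19.6
```

So "about 20 dB" is right to within a fraction of a decibel at the band edges.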

That has more to do with production/mastering than the physical merits of the medium though.

Whenever you have the choice of an original CD and a digitally remastered one, always get the original. The digitally remastered one has just been through a multi-band volume compressor.
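The effect of that kind of mastering is easy to demonstrate with a toy single-band sketch (a real remaster uses multi-band compression; this just applies gain plus a hard limiter, and all names here are made up for the example):

```python
import math

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# A toy signal: a quiet verse followed by a loud chorus.
signal = [0.2 * math.sin(2 * math.pi * i / 64) for i in range(256)] + \
         [0.9 * math.sin(2 * math.pi * i / 64) for i in range(256)]

def loudness_war_master(xs, gain=3.0, ceiling=0.99):
    # Crank the gain, then hard-clip at the ceiling: the average level
    # goes up, but the peaks (the dynamics) get flattened.
    return [max(-ceiling, min(ceiling, x * gain)) for x in xs]

mastered = loudness_war_master(signal)

# Crest factor (peak / RMS) is a rough proxy for dynamic range.
crest_before = max(abs(x) for x in signal) / rms(signal)
crest_after = max(abs(x) for x in mastered) / rms(mastered)
print(round(crest_before, 2), round(crest_after, 2))
```

The "mastered" version is louder on average but has a lower crest factor, which is exactly the trade the Loudness Wars made.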

Cheers
 
I can deal with the possibility that Sony may reserve a core for the OS. But if they are disabling a cpu core -- of a tiny low power cpu -- for yields that is pathetic. I could understand if the CPU was even midrange, but it is a wimp already.
I thought an 8-core CPU for the PS4 was official? Or did Sony count a disabled core in the official press release? That makes no sense.
 
I can deal with the possibility that Sony may reserve a core for the OS. But if they are disabling a cpu core -- of a tiny low power cpu -- for yields that is pathetic. I could understand if the CPU was even midrange, but it is a wimp already.

Overreacting much? All we've seen in the presentations so far is that the whole layout was designed for 8 cores. There are so far zero indications that the system will have fewer than that. Digital Foundry was pretty clear in specifying that from the Killzone devkit shots, we cannot assume anything other than that the MINIMUM number of cores available for GAMES is 6. Guerilla may be using the 7th for networking and audio for all we know, and simply not be showing that in their own, custom performance tools (which they created because they were working on Killzone before Sony had performance tools ready). Who knows, the performance tool itself may run on a separate core.
 
Some people need to relax, they wouldn't have got on stage in February and announced the PS4 to the world as an 8 core CPU if they weren't sure they could do it.

The PS3 was going to have an extra SPU, but it was announced prior to its reveal that one would be disabled to account for yield issues.

If any Jaguar CPU has a faulty core, it will be diagnosed and rejected in the factory before going inside the box.

...an announcement now by Sony that suddenly their new baby has just lost one of its cores would be a PR disaster for them; they would be torn a new one all over the web.

It's an 8-core CPU that will release as an 8-core CPU... end of story.

Maybe Guerilla had their own profiler running on core 7, with the 8th reserved.
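For what it's worth, dedicating cores to subsystems like a profiler is easy to sketch on a desktop OS. The PS4 SDK's actual APIs aren't public, so this is only a Linux analogy using `os.sched_setaffinity`:

```python
import os

# Linux-only analogy (the PS4 SDK is not public): pin a process to a
# subset of cores, the way an engine might hand a profiler, audio, or
# networking subsystem its own dedicated core.
available = os.sched_getaffinity(0)   # cores this process may run on
print("may run on cores:", sorted(available))

# Reserve a single core (the lowest-numbered one available) for this
# process; the remaining cores are left for other work.
one_core = {min(available)}
os.sched_setaffinity(0, one_core)
print("now pinned to:", sorted(os.sched_getaffinity(0)))
```

A console OS reserving cores for itself amounts to the same thing done system-wide: games simply never get scheduled on the reserved cores.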
 
As they made the demo prior to the audio chip and other background chips being available, it's very likely that arwin is correct. Although why we are answering questions raised by babcat is anyone's guess; he's basically trolling this thread with his outrage at every step.
 
Indeed, don't respond to babcat's performance whinging (not that there'll be any more). The power he wants will come from a PC, not a console (at normal consumer pricing). That 5% of GPU time spent on GPGPU is good console efficiency if it's as much as that, I say as one of those most vocal in thinking there was little GPU idle time to be exploited. Although we are still missing some decent estimates of how many spare FLOPS there are in a typical game on GCN, which would help build a better picture of how the APU+ACE design benefits the system.
 
Indeed, don't respond to babcat's performance whinging (not that there'll be any more). The power he wants will come from a PC, not a console (at normal consumer pricing). That 5% of GPU time spent on GPGPU is good console efficiency if it's as much as that, I say as one of those most vocal in thinking there was little GPU idle time to be exploited. Although we are still missing some decent estimates of how many spare FLOPS there are in a typical game on GCN, which would help build a better picture of how the APU+ACE design benefits the system.

I can vouch for that with the warning that I got for speaking of this when no one else here understood what I was trying to tell y'all.
 
I can vouch for that with the warning that I got for speaking of this when no one else here understood what I was trying to tell y'all.

That's because, if you are given the benefit of the doubt that you really did know what you were talking about, you were expressing your ideas poorly. Poorly enough that it caused many to conclude you had no idea what you were talking about. The fact that many of your rebuttals were largely composed of copy/pastes of the statements of others, which in some cases completely failed to address the issues being raised, was a big part of this.
 
I can vouch for that with the warning that I got for speaking of this when no one else here understood what I was trying to tell y'all.
Because of your choice of words (or lack of them) and repetitious quoting without explaining what those quotes were supposed to be supporting, and illogical arguments with vague hand-waving numbers and ideas...
 
Because of your choice of words (or lack of them) and repetitious quoting without explaining what those quotes were supposed to be supporting, and illogical arguments with vague hand-waving numbers and ideas...

I was replying to different people, so I used the quotes over again to make it clear that they were talking about running compute & graphics on the GPU.

I even posted a link explaining that it could be compute running at times when the graphics code is waiting.


http://forum.beyond3d.com/showpost.php?p=1723276&postcount=1068

I think I found an explanation: the compute code will run at times when the graphics code is waiting.


http://cs.brown.edu/courses/cs168/f12/handouts/async.pdf
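The idea in that handout can be illustrated with a toy event-loop sketch in plain `asyncio` (nothing PS4-specific; the task names are invented for the example): while one task is waiting, the scheduler runs the other in the gap.

```python
import asyncio

# Toy analogy of async compute: while the "graphics" task awaits a
# simulated fence/vsync, the event loop runs "compute" work in the gap.
log = []

async def graphics_frame():
    log.append("graphics: submit draw calls")
    await asyncio.sleep(0.05)          # stand-in for waiting on the GPU
    log.append("graphics: frame done")

async def compute_job():
    log.append("compute: runs while graphics waits")

async def main():
    await asyncio.gather(graphics_frame(), compute_job())

asyncio.run(main())
print(log)
```

The compute entry lands between the two graphics entries: the wait didn't go idle. The real GPU case is the counter-argument being made below, though, because GPU ALUs, unlike this event loop, can also be saturated with no gaps to fill.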
 
so you can use the full 1.84 TFLOPS for graphics & still run physics & other compute tasks on the GPGPU as long as the tasks are not blocking one another.

It came down to this line. It indicated a fundamental misunderstanding of what that 1.84 TFLOPS figure, as a maximum, meant. In any given second, if any of the compute resources were used for anything but computing graphics, and in fact even if any of them were idle for even a single clock cycle during that second, then you wouldn't be using the full 1.84 TFLOPS for graphics.
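The arithmetic behind that budget point is worth spelling out, using the publicly stated PS4 GPU figures (18 GCN CUs, 64 ALUs each, 2 FLOPs per FMA, 800 MHz); the 5% split is just the figure discussed above, not a measurement:

```python
# Peak throughput from the stated specs:
# 18 CUs x 64 ALUs x 2 FLOPs (fused multiply-add) x 800 MHz
peak = 18 * 64 * 2 * 800e6
print(peak)  # 1.8432e12 -> the familiar "1.84 TFLOPS"

# The peak is a hard budget: every cycle an ALU spends on compute
# (or sits idle) is a cycle it does not spend on graphics.
gpgpu_share = 0.05                      # the ~5% GPGPU figure above
graphics_flops = peak * (1 - gpgpu_share)
print(graphics_flops)                   # what's left for graphics
```

So "full 1.84 TFLOPS for graphics plus compute on top" is arithmetically impossible; compute can only reclaim cycles graphics would otherwise have wasted.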
 
Temp ban. When he comes back, as long as he doesn't post OT "I want more power!" statements in every console thread, he can carry on.
 
What are the chances that Sony will use 3.5" HDD in PS4?

As far as I can see, 2.5" drives are maxed out at 1 TB. That doesn't seem like much these days.
 
What are the chances that Sony will use 3.5" HDD in PS4?

As far as I can see, 2.5" drives are maxed out at 1 TB. That doesn't seem like much these days.

I see a pretty large shift in the industry at large to 2.5" for a wide variety of applications (on the enterprise drive side, at least). Not to mention the physical differences, which will likely be quite important for a console, its airflow and its design. I suspect the chances of a 3.5" HDD are very low. I'm just hoping for SATA3 support. GG's current impression of the drive is that "it's very fast", but they also note that can change for the final retail release.
 
as long as he doesn't post OT "I want more power!" statements in every console thread, he can carry on.
I'm not even sure what he was upset about. AFAIK there hasn't been a word spoken about any disabled cores in the PS4, and he goes on a ranting spree that there'll be only 6 active... I'm like, "whuh?" :)
 
Digital Foundry was pretty clear in specifying that from the Killzone devkit shots, we cannot assume anything other than that the MINIMUM number of cores available for GAMES is 6. Guerilla may be using the 7th for networking and audio

Audio and networking code is not part of gaming code anymore?
 
Overreacting much? All we've seen in the presentations so far is that the whole layout was designed for 8 cores. There are so far zero indications that the system will have fewer than that. Digital Foundry was pretty clear in specifying that from the Killzone devkit shots, we cannot assume anything other than that the MINIMUM number of cores available for GAMES is 6. Guerilla may be using the 7th for networking and audio for all we know, and simply not be showing that in their own, custom performance tools (which they created because they were working on Killzone before Sony had performance tools ready). Who knows, the performance tool itself may run on a separate core.

Audio is covered in those 6 worker threads according to the PP. As previously mentioned in a different interview, they hadn't shifted their audio code to the DSP at the time of the demo, and that seems to be reflected in their postmortem.
 
Which makes me much more curious about the performance gains that will come simply from transitioning from devkit APUs to final retail PS4 APUs. Cerny talked up a host of customizations on that front, with final retail kits going completely unutilized by launch games.
 