PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Could the CD thing be laser related, i.e. the laser in the BD drive doesn't have to be able to read CDs? Could you save money by doing that?
 
Could the CD thing be laser related, i.e. the laser in the BD drive doesn't have to be able to read CDs? Could you save money by doing that?
They said the drive can read CDs, but the system cannot play back Audio CDs.
(So it's because of software? Maybe it can still rip?)
 
Could the CD thing be laser related, i.e. the laser in the BD drive doesn't have to be able to read CDs? Could you save money by doing that?

I wonder. The PS3 diode appears to have 3 individual diodes (one for each wavelength) grouped together in a "Common Cathode Can" container (according to that link). Dropping the infra-red diode could potentially save money, I suppose, but the problem is we know it reads CDs, therefore we know it already has the right diode for that. Doesn't make much sense with the limited info we have.
 
I wonder. The PS3 diode appears to have 3 individual diodes (one for each wavelength) grouped together in a "Common Cathode Can" container (according to that link). Dropping the infra-red diode could potentially save money, I suppose, but the problem is we know it reads CDs, therefore we know it already has the right diode for that. Doesn't make much sense with the limited info we have.
Not just the diode; you also significantly simplify the lens assembly. If you dropped the red-laser DVD diode too, you could make a much slimmer, much cheaper drive.
 
Not just the diode; you also significantly simplify the lens assembly. If you dropped the red-laser DVD diode too, you could make a much slimmer, much cheaper drive.

Very interesting. Simpler tends to mean longer lasting (to some extent) as well. Any A-to-B examples you're aware of? I'm curious to see any visual, assembly and/or cost differences between a drive with all three diodes and a drive with just the violet one.
 
Actually, I'm not sure what the licensing agreements are or if that would even affect playing back CDDA. Non-Redbook-compliant/licensed player?
A trademark is just a trademark, i.e. a logo on the casing or elsewhere (instruction booklet, whatever).
 
Some interesting info from the Wii U GamePad reverse engineering. Hardware-wise, it appears to use what is "almost standard" 802.11n:

"It is also using almost standard 802.11n, which made things easy to experiment on a PC."

As for the communication protocols used:

Custom communication protocols mean we can discount previous theories that Nintendo employed the use of Broadcom's streaming video Miracast technology to get the Wii U GamePad working, although there are similarities.

"Video is compressed using h.264 (baseline profile, so no B frames)," Bourdon shares. "Audio is usually uncompressed, but we've found mentions of compressed audio formats in the firmware... We found mentions of [Miracast] when we started working on the GamePad, but it turned out to be false. There is no Miracast anywhere in that GamePad. Audio, video and input streaming is done over custom protocols."

Since Off TV play is generally reported as being very responsive, I thought this was encouraging news for the potential of Remote Play with the PSV and PS4. The hardware used is much the same (802.11n + dedicated video encode/decode hardware), which means what's left is the software stack. I think this helps clear up what was mentioned as "Gaikai tech" for PS4 Remote Play during the Feb PS Meeting. I now suspect it's entirely on the software side (which would make sense).

Of course, some of the remaining questions revolve around how (or if) they plan to optimize Remote Play for access across a WAN. Some of the bandwidth spikes with the Wii U's Off TV play can't be accommodated by typical home broadband upload speeds:

Baseline profile h.264 rules out many of the more advanced compression techniques employed by the codec, but Nintendo makes up for it via sheer, raw bandwidth. A sample capture from the Wii U WiFi stream offers up 33MB of data captured across 87 seconds - this gives us an average of around 3mbps. This is fairly lavish for an 858x480 stream at 60 frames per second, but the video captured here is only displaying the Wii U's front-end menus. Pierre Bourdon tells us that the Wii U uses variable bitrate, meaning that bandwidth scales up according to the complexity of the image it has to encode.

"This measurement does not include audio. Here is a graph of frame size over time in these 33MB," he says.

Despite the 3mbps average, we're seeing spikes of anything between 25-40mbps, and a massive variation in bandwidth that can only be down to variable bitrate h.264 video encoding. The more complex the image, the more information is required to maintain image quality - something the Wii U seems more than capable of successfully transmitting over its 802.11n wireless link.

"I haven't checked but I think the spikes are just the Wii U sending a large I-frame (full picture/key frame)," Bourdon explains. "If you average the bandwidth over something like 10 frames these spikes mostly disappear. In normal operation mode the Wii U sends one I-frame and then only P frames, unless the application requests to send an I-frame or a frame was not received properly by the GamePad (because of packet loss)."

The answer to this most likely lies in the engineering knowledge they gained from Gaikai as well. I'm curious to see how their approach differs, however.
 
Killzone discussion here.
As everything got moved over there, one could mention in this hardware thread that the timing information from the profiler shown in the Killzone presentation confirmed beyond doubt that the CPU cores in the current devkits run at 1.6 GHz. There may be a bit of movement for the final version going on sale at the end of the year, depending on the yields and power consumption AMD/Sony experience (and they have to decide about that pretty soon if it's not yet fixed), but one probably shouldn't expect 2+ GHz or anything like that.
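For anyone wondering how a profiler pins the clock down: it's just cycles divided by wall-clock time. A minimal sketch with hypothetical numbers (not values taken from the Killzone slides):

```python
# Minimal sketch of inferring a CPU clock from profiler output.
# The cycle count and elapsed time are hypothetical, not figures
# from the Killzone presentation.
cycles_reported = 2_400_000     # cycles attributed to a profiled job
elapsed_seconds = 0.0015        # 1.5 ms of wall-clock time for that job

clock_ghz = cycles_reported / elapsed_seconds / 1e9
print(f"Implied core clock: {clock_ghz:.2f} GHz")   # -> 1.60 GHz
```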
 
Clocks tend to go down when chips are put in their final resting places on the motherboards. It's not likely this time, as 1.6 GHz is well within Jaguar's thermal envelope, but pushing it higher is definitely not going to happen.
 
Some interesting info from the Wii U GamePad reverse engineering. Hardware-wise, it appears to use what is "almost standard" 802.11n:

Since Off TV play is generally reported as being very responsive, I thought this was encouraging news for the potential of Remote Play with the PSV and PS4. The hardware used is much the same (802.11n + dedicated video encode/decode hardware).

Back when I had a Wii U, I still remember the compression being a bit sloppy, since it was pretty noticeable compared to my TV (or maybe I just noticed how unsaturated the tablet display was). I guess the focus was on being responsive instead of high quality.
 
I can deal with the possibility that Sony may reserve a core for the OS. But if they are disabling a CPU core -- of a tiny low-power CPU -- for yields, that is pathetic. I could understand if the CPU was even midrange, but it is a wimp already.
 
Physical defects generally don't care how awesome a core is. The number of cores raises the chance that at least one is going to be hit, but without more data we don't know what Sony has decided for those chips.
 
Clocks tend to go down when chips are put in their final resting places on the motherboards. It's not likely this time, as 1.6 GHz is well within Jaguar's thermal envelope, but pushing it higher is definitely not going to happen.


Could be that they were using the older devkits with the 8-core Bulldozer clocked at 1.6 GHz.
 
Physical defects generally don't care how awesome a core is. The number of cores raises the chance that at least one is going to be hit, but without more data we don't know what Sony has decided for those chips.

Then Sony should have used a different core that would not be prone to defects. The idea of this already weak CPU being made weaker is just very sad.
 
That's not how large-scale semiconductor manufacturing works.

Random defects in the wafer, or flaws in one of the exposure or processing steps, are going to happen. A design can only be made so redundant, particularly the logic of a CPU. A more redundant CPU design is simply bigger without being better, and after a point it isn't even more resistant to defects.
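As a back-of-the-envelope illustration of why core salvaging helps (a toy Poisson defect model with made-up numbers, not anything based on actual PS4 yield data): the more cores you put on a die, the higher the chance that at least one of them catches a defect, so selling parts with one core disabled turns a lot of otherwise "bad" dies into usable ones.

```python
import math

# Toy Poisson yield model. Defect density and core area are assumed
# values for illustration only, not real PS4/Jaguar figures.
defect_density = 0.25        # defects per cm^2 (assumed)
core_area_cm2 = 0.031        # ~3.1 mm^2 per small core (assumed)
num_cores = 8

p_core_ok = math.exp(-defect_density * core_area_cm2)   # one core defect-free
p_all_8_ok = p_core_ok ** num_cores
# A die with exactly one defective core can still be salvaged by disabling it.
p_at_least_7_ok = p_all_8_ok + num_cores * (1 - p_core_ok) * p_core_ok ** (num_cores - 1)

print(f"All 8 cores clean:      {p_all_8_ok:.1%}")
print(f"At least 7 cores clean: {p_at_least_7_ok:.1%}")
```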
 
I can deal with the possibility that Sony may reserve a core for the OS. But if they are disabling a CPU core -- of a tiny low-power CPU -- for yields, that is pathetic. I could understand if the CPU was even midrange, but it is a wimp already.

In the event that 2 cores are not available for games, the simple solution is to buy a nice PC. Any perceived wimpiness in the CPU is then fully addressable by you, and you don't have to deal with Sony's wimpy solution. Problem solved. :yep2:
 