PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Status
Not open for further replies.
Ah ok. But last time I checked, local-dimming LED backlights are still very expensive and still have the usual issue of having too few LEDs, causing obvious artefacts.
I don't know what a locally dimming LED is. I don't make a habit of disassembling my (or work's) displays, but from the units I've seen in the past few years, modern panels tend to use inexpensive arrays of LEDs where light is localised by a) a relatively low-power drive (each LED emits far less light) and b) the design of the LCD substrates, which helps prevent light bleed from neighbouring LEDs in the array.

The number and type of LEDs depend on your TV. Budget sets will definitely have fewer LEDs of lower quality, i.e. not pure white but somewhat blue in light terms, with the LCD calibrated to compensate for the colour shift. Decent sets will have more.

Cheap LCD monitors and TV sets are basically an abomination. 3.. 2.. 1.. a like from BRiT incoming :yep2:
 
I don't know what a locally dimming LED is.
Local dimming LED TVs. TVs with LED arrays of somewhat significant numbers and reasonably localised output, illuminating a subset of pixels. Each LED is set to an average brightness of the pixels it's illuminating, so darker for dark areas and brighter for lighter areas. But because they aren't focussed to each individual pixel, you get brightness halos around high-intensity spots. OLED is thus ideal, unless someone can invent a truly opaque LC or substitute.

Or basically, image luminance is rendered on a separate LED array as a starting light source.
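A toy sketch of the per-zone averaging described above (the function name, zone layout, and the plain-average choice are my illustration; real sets use more elaborate blending between neighbouring zones and clamp the result):

```python
def backlight_zones(frame, zones_x, zones_y):
    """Compute one backlight level per LED zone from pixel luminance.

    frame: 2D list of luminance values in [0, 1], dimensions divisible
    by the zone counts. Each zone's LED is driven at the average of the
    pixels it illuminates, as described above; bright spots therefore
    raise the whole zone, which is where the halo comes from.
    """
    h, w = len(frame), len(frame[0])
    zh, zw = h // zones_y, w // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            pixels = [frame[y][x]
                      for y in range(zy * zh, (zy + 1) * zh)
                      for x in range(zx * zw, (zx + 1) * zw)]
            row.append(sum(pixels) / len(pixels))
        levels.append(row)
    return levels
```

For example, a frame that is black on the left half and white on the right, split into two zones, gives levels `[[0.0, 1.0]]`: the dark zone's LED dims fully while the bright zone's LED runs at full power.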
 
They shoved all those chips into the co-processor package? It will be nice to see whether SSDs work faster now that the SATA-to-USB 3.0 bridge is hopefully gone.
 
Why? So that they can create a "level playing field" for all hard drives?

For the sake of a consistent platform, console makers have gone to far greater lengths: Microsoft hobbled its unified Xbox 360 chip so that the on-die GPU and CPU behaved as if they were still physically separate parts.
Keeping hard drive performance at a certain level means devs can validate to that spec without worrying that their testing and performance budgeting falls down on millions of older models.
 
The Last of Us was tweaked in the last days before gold master because one dev tested it on a very old devkit where levels often failed to load (though it happened once even on my late 3000-series Slim model).

The PS3 had a minimum spec too, and it still happened.

Also, read speed differs between the beginning and the end of the drive (the outer tracks read faster than the inner ones).
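A rough way to see that effect yourself, sketched in Python (the function name and sizes are placeholders of my own; timing a raw device rather than a file usually needs elevated privileges, and OS caching will skew small runs):

```python
import time

def read_speed_mb_s(path, offset_bytes, chunk=1 << 20, total=8 << 20):
    """Measure sequential read speed (MB/s) starting at offset_bytes.

    On a spinning disk, an offset near the start of the drive (outer
    tracks) should read noticeably faster than one near the end (inner
    tracks); on an SSD the two numbers come out about the same.
    """
    with open(path, "rb") as f:
        f.seek(offset_bytes)
        start = time.perf_counter()
        remaining = total
        while remaining > 0:
            data = f.read(min(chunk, remaining))
            if not data:  # hit end of file/device early
                break
            remaining -= len(data)
        elapsed = time.perf_counter() - start
    return (total - remaining) / (1 << 20) / elapsed
```

Comparing `read_speed_mb_s("/dev/sdX", 0)` against the same call near the drive's last sector would show the outer/inner gap on a mechanical drive.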
 
I think I read a long time ago that the PS3 version of Rage hit a snag with a specific PS3 revision, so even when platform holders try there can be corner cases that get missed.
That no form of hardware equivalence is 100% complete without actually equivalent hardware doesn't mean it's a good idea to tempt the fate bear.
 
Uncharted 3 faced disaster one or two months before launch when ND discovered the game was not streaming assets correctly on fat PS3 models. :D
 
CUH-1200 motherboard (Japanese):
http://pocketnews.cocolog-nifty.com/pkns/2015/06/ps4cuh-1200-4af.html
8 × 1 GB GDDR5
APU: CXD90037G (CUH-1000: CXD90026G, CUH-1100: CXD90026AG); still 28 nm, maybe 28HPP
The USB 3.0/SATA bridge LSI, Ethernet controller, and USB 3.0 controller are all gone, replaced by the co-processor.

The result is considerably lower power consumption and noise at both idle and load, according to one report:
http://wccftech.com/ps4-model-cuh1200-performance-power-differences/

 
Hynix recently added 8 Gb GDDR5 to their databook, planned for Q1 2016. This should provide nice competition against Samsung. Micron is now shipping 20 nm 8 Gb parts too; Hynix is really late to the game.

What I find interesting is that even the lowest speed bin is fast enough for the PS4 (at 1.5 V). That makes three factors that would lower Sony's cost: the cheaper speed bin, the memory die shrink, and halving the number of chips.
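As a back-of-the-envelope check: the launch PS4's well-known spec is a 256-bit bus at 5.5 Gbps per pin, i.e. 176 GB/s, and eight 32-bit 8 Gb chips keep the same 256-bit bus. The 6 Gbps lowest-bin figure below is my assumption for illustration:

```python
# Sanity check: does a low 8 Gb GDDR5 speed bin still meet the PS4's
# memory bandwidth? Eight 32-bit devices preserve the 256-bit bus.
chips = 8
bus_width_bits = chips * 32      # 256-bit bus
ps4_pin_speed_gbps = 5.5         # launch PS4 GDDR5 data rate per pin
lowest_bin_gbps = 6.0            # assumed lowest 8 Gb speed bin

def bandwidth_gb_s(pin_speed_gbps, bus_bits):
    """Aggregate bandwidth: per-pin data rate times bus width, in bytes."""
    return pin_speed_gbps * bus_bits / 8

required = bandwidth_gb_s(ps4_pin_speed_gbps, bus_width_bits)   # 176.0 GB/s
available = bandwidth_gb_s(lowest_bin_gbps, bus_width_bits)     # 192.0 GB/s
assert available >= required
```

So even the slowest assumed bin clears the launch spec with headroom, which is why Sony can take the cheapest parts.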
 