PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Why would SoC shrinks be less painful based on whether there's going to be 8Gbit GDDR5? Those operate in different manufacturing realms.

The high-density GDDR5 could make compact form-factor SoCs with decent bandwidth and good-enough capacity possible, but the economics are uncertain for SoCs beyond the PS4.

They're not related. Sorry if I made them seem so. My take on SoC shrinks depends on who the designer is and who the fab house is.
 
Full screen particle effects: as in screens in which you can only see particles.

edit: And maybe god-rays on top of that, with god-rays for the god-rays.
 
Yeah, with that route they couldn't have gone to 8GB. They would have had to go like the X1, with DDR3 and a 256-bit bus.
Either way, after reading Oles Shishkovtsov's interviews on DF, I wonder whether a 192-bit bus with 16/24 ROPs and 6GB would actually have been "wiser" in the long run.
The guy thinks that 8GB was pushing it a bit (MSFT reserved several GB for the OS).
Then there is the matter of shrinking the chip; I'm eager to find out the size of the Orbis SoC. I don't expect it to be "big", but it could set up a hurdle for price reductions.
Then there is the matter of hardware utilization. The PS4 (like parts such as Pitcairn) doesn't have enough bandwidth to make the most of the silicon invested in the ROPs in all circumstances (though I guess there are cases where 32 ROPs are better than 24 or 16).
I guess Sony wanted that advantage in raw numbers (whether it translates into real performance, and the difference it makes as far as the masses are concerned, is another matter) and wanted to land some PR wins early, though I still think they could end up paying a high price for that.

In light of the Durango specs and MSFT's relative silence on the matter (even if the system were to be better, they don't have anything easily "marketable" or "PR friendly" to push to the press/media), I think Sony should have done a blend of upgrade/downgrade.

I know nobody will agree, especially Sony fans, but Sony could have:
upped the RAM to 6GB: 4GB for games, 1GB for the OS and services, and 1GB "for later"
gone with a 192-bit bus, related to that
disabled 1 CPU core and reserved one for the OS
disabled 2 SIMDs and tried to save a little power by clocking the GPU @750MHz

All that starting from a chip with 8 cores, 18 SIMDs, 32 ROPs :runaway:
Sounds crazy, but:
it could have pushed good yields into great/awesome territory
they would have saved a few watts on the GPU
you save 25% on RAM costs and lower power consumption (fewer memory controllers, fewer memory chips)
you have a 192-bit bus that is easier to fit down the road (after a shrink)
They could have undercut the Xbone without cutting their new EyeToy.
Last but not least, from a PR POV they would still have been in a position to claim they have the biggest one; TFLOPS and simple metrics still rule opinion.

Now when I read that "rumor" about 8Gb GDDR5 chips, I think Sony could have been really aggressive on price.
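The arithmetic behind the trade-off is easy to check. A quick sketch, assuming 5.5 Gbps GDDR5 (the data rate commonly cited for the PS4) and 4Gb chips:

```python
# Bandwidth and chip-count arithmetic for 256-bit/8GB vs the proposed
# 192-bit/6GB setup. The 5.5 Gbps GDDR5 data rate is an assumption
# (the figure commonly cited for the PS4), not a confirmed spec.

def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps=5.5):
    """Peak bandwidth in GB/s for a bus at a given per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

def chip_count(capacity_gigabytes, chip_density_gigabits):
    """How many DRAM chips a given total capacity requires."""
    return int(capacity_gigabytes * 8 / chip_density_gigabits)

print(gddr5_bandwidth_gbs(256))  # 176.0 GB/s (shipped PS4)
print(gddr5_bandwidth_gbs(192))  # 132.0 GB/s (the 192-bit proposal)
print(chip_count(8, 4))          # 16 chips of 4Gb for 8GB
print(chip_count(6, 4))          # 12 chips for 6GB
```

Dropping from 16 chips to 12 is where the 25% RAM-cost saving in the list above comes from.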
 
Either way after reading the interviews of Oles Shishkovstov on DF [...] I know nobody will agree, especially Sony fans, but Sony could have [...]

No no nooooooooo
 
It would've been the second coming of the PS2 with that memory setup. But even the PS2 was more extreme in its bandwidth-to-available-RAM ratio, I believe.
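That ratio is easy to put numbers on. A sketch using the commonly quoted figures (PS2 GS eDRAM: 4MB at 48 GB/s; PS4: 8GB at 176 GB/s):

```python
# Bandwidth per MB of memory: PS2's tiny-but-fast eDRAM vs PS4's big GDDR5 pool.
# Figures are the commonly quoted ones, taken as given for this comparison.

def bw_per_mb(bandwidth_gbs, capacity_mb):
    """GB/s of bandwidth available per MB of capacity."""
    return bandwidth_gbs / capacity_mb

ps2_ratio = bw_per_mb(48, 4)          # 12.0 GB/s per MB
ps4_ratio = bw_per_mb(176, 8 * 1024)  # ~0.0215 GB/s per MB
print(ps2_ratio / ps4_ratio)          # PS2's ratio is over 500x higher
```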
 
No no nooooooooo

Why? why?? why????

The first law of alchemy teaches us that to obtain something, you have to give something of equal value in exchange. Otherwise you have to get your hands on a Philosopher's Stone.
But I guess you and I both know that stone doesn't exist.


:LOL: just kidding
 
I'd start to wonder about the cost and time to implement 1TB/s of eDRAM...
Let's say a 3.2 GHz bus. You'd need a ~2750-bit bus. A 2560-bit bus would have to operate at ~3.4 GHz.

I'd love to have seen Sony's prototypes!
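The bus-width numbers work out if "1TB/s" is read as 1 TiB/s (2^40 bytes/s); a quick sketch:

```python
# Required bus width / clock for a 1 TiB/s (2**40 B/s) on-die memory,
# reproducing the ~2750-bit figure at 3.2 GHz.

TARGET_BPS = 2**40  # bytes per second

def bus_bits_needed(clock_hz, target_bps=TARGET_BPS):
    """Bus width in bits needed to reach the target bandwidth at a given clock."""
    return target_bps / clock_hz * 8

def clock_ghz_needed(bus_bits, target_bps=TARGET_BPS):
    """Clock in GHz needed to reach the target bandwidth with a given bus width."""
    return target_bps / (bus_bits / 8) / 1e9

print(round(bus_bits_needed(3.2e9)))     # 2749 bits at 3.2 GHz (~2750)
print(round(clock_ghz_needed(2560), 2))  # 3.44 GHz for a 2560-bit bus
```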

What kind of crazy effects would we have seen with 1TB/s of bandwidth...
Possibly not much beyond, say, 256 GB/s. Smarter ways of doing things circumvent the value of massive BW, e.g. don't draw millions of transparent particles, and instead calculate a single volumetric effect and draw that.
 
The entire memory subsystem would need to be revamped. There's nothing internal to the GPU or uncore that can absorb that kind of bandwidth.

If on-die, it would probably need to be banked and physically distributed like the L2.
 
Let's say a 3.2 GHz bus. You'd need a ~2750 bit bus. [...]
It isn't that crazy. The L1 <=> L2 connections are in total much wider (512 bits for each L1 cache and for each of the twelve L2 tiles in Tahiti, respectively), and they even run through a quite massive crossbar in between. Tahiti's L2 can deliver up to ~800 GB/s (6144 bits wide if you accumulate the tiles), iirc. Already the L2 of the RV770 had a bandwidth of up to 512 bytes/cycle, or 4096 bits in total (RV870, aka Cypress, and also Cayman didn't scale that number).

Assuming we are talking about on-die memory structures, of course, or at least some wide-interface DRAM on an interposer (some HBM derivative).
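The aggregate widths in that post can be sanity-checked with quick arithmetic (the 1 GHz clock below is an assumption for round numbers; real Tahiti clocks are slightly lower):

```python
# Aggregate L2 width and bandwidth for Tahiti, from the per-tile figures
# in the post (512 bits per tile, 12 tiles). The 1 GHz clock is an
# assumption for illustration only.

TILES = 12           # one L2 tile per 32-bit memory channel (384-bit bus)
BITS_PER_TILE = 512  # bits per tile per cycle, as stated in the post

total_bits = TILES * BITS_PER_TILE
bytes_per_cycle = total_bits // 8

print(total_bits)       # 6144 bits aggregate, matching the post
print(bytes_per_cycle)  # 768 bytes/cycle -> 768 GB/s at an assumed 1 GHz,
                        # in the ballpark of the ~800 GB/s quoted
```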
 
A quick question to those in the know: does the PS4 have double the GPU L2 of the XB1? Based on what I know of GCN, this *might* be true, given the double ROPs, but I'm not sure.
 
The L2 cache size is tied to the memory interface channels, not the number of ROPs. That's not to say a 128-bit interface must have half the cache of a 256-bit interface. I don't know if the cache sizes have been released/leaked.
 
Either way after reading the interviews of Oles Shishkovstov on DF [...] it could have pushed good yields into great great/awesome territories [...]

Are there yield problems for the Liverpool SoC? Last time I heard, the problem was with Microsoft's SoC!
 
Last time we heard anything, it could really well have been FUD ;)

Anyway, we know nothing about the yields of Liverpool, and that is not the issue.
I do not know what kind of yields AMD gets with, for example, Pitcairn or Tahiti; they are selling salvaged parts, but that may not have much to do with yields (more about creating market segmentation, though I have doubts about the proper marketing term).

My POV was foremost about price and price reductions. We haven't seen the size of Liverpool; I would think it is not that big (somewhere around 300mm^2). If shrinking goes really well, they may have trouble fitting a 256-bit bus as soon as the 20/22nm process.
Then there are the other points about the amount of memory, etc.

I still feel like Sony missed an opportunity to lower its costs. They could have shipped with 4GB of RAM (16 2Gb memory chips) if those 4Gb memory chips had been late for some reason. Even though they should have volume, that is expensive. At the time, the rumors about Durango were pretty precise, and I expect Sony quite possibly had better intel than we had. Imo, once they knew they could have more than 4GB of RAM, they should have taken that opportunity to lower their costs and enable more aggressive price reductions down the road.
Ultimately they could still have claimed, as I said, to have the "bigger one", not to mention that the PR disaster MSFT went through would have further helped them (though they could not know or bet on that).
Ultimately, without DF and the like having their hands on finished games, what is the end result of E3? People can't really tell the difference between Durango and Orbis. I don't dispute Sony's PR wins, but I would think that once the system hits the streets, price may actually be their biggest "win", not the extra FLOPS, 2GB of RAM, etc.
There is more: they could have avoided cutting the EyeToy, and in the long run I think putting themselves in a better position to sustain that price advantage would have done them more good than pleasing a few geeks.
 
Last time we heard anything it could really well have been fud ;) [...] I still fell like Sony missed an opportunity to lower its costs. [...]

With all due respect mate, who cares about Sony trying to reduce its costs? Unless you are a Sony shareholder it shouldn't matter to you; as gaming fans and tech fans, surely we want the best tech at a reasonable price point. Sony's margins are of no consequence to us. Would you really want a weaker console so Sony could cream more money off the top? :/

Besides, the console has already been announced and we pretty much got more than we hoped for. Amen to that :)
 
The L2 cache size is tied to the memory interface channels not the number of ROPs. [...]
That's almost right. Each RBE apparently contains 16kB of color cache + 4kB of Z cache. That scales with the number of ROPs, so the PS4 probably has 128kB of color cache while the XB1 has just 64kB. But this doesn't have much of an effect unless one resorts to some quite fine tiling, as sebbi tested lately.
Anyway, the number of L2 tiles is tied to the size of the memory controller (one tile per 32-bit channel), but the size of each tile is flexible. Current GCN GPUs use either 64kB per tile (Tahiti/Pitcairn) or 128kB per tile (CapeVerde/Bonaire; don't know about Mars). That's why CapeVerde and Bonaire have the same amount of L2 as Pitcairn (512kB).

CV/Bonaire: 4x128kB = 512kB
Pitcairn: 8x64kB = 512kB
Tahiti: 12x64kB = 768kB

For XB1/PS4 the bets are open, but I would think both have 512kB of L2 (XB1 has the eSRAM, so I doubt they would go to 1MB; for the PS4 I doubt Sony changed too much, to keep the risk down), but in principle 1MB [8x128kB] would also be possible within the parameters of existing GPUs.
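The tiles-times-tile-size rule reproduces that list directly; a sketch (the XB1/PS4 lines are the post's guesses, not confirmed specs):

```python
# Total GPU L2 from (number of 32-bit memory channels) x (kB per tile),
# using the per-GPU figures stated in the post above.

def l2_total_kb(channels_32bit, tile_kb):
    """Total L2 in kB, one tile per 32-bit memory channel."""
    return channels_32bit * tile_kb

print(l2_total_kb(4, 128))  # CapeVerde/Bonaire: 512 kB (128-bit bus)
print(l2_total_kb(8, 64))   # Pitcairn: 512 kB (256-bit bus)
print(l2_total_kb(12, 64))  # Tahiti: 768 kB (384-bit bus)
print(l2_total_kb(8, 64))   # XB1/PS4 guess from the post: 512 kB
print(l2_total_kb(8, 128))  # the 1MB upper bound also mentioned: 1024 kB
```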
 