PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Probably. But wouldn't that get much hotter? Anyway, roughly speaking the system seems balanced enough, just looking at how many GB worth of bandwidth you can use in a single frame at that speed. Wasn't that about 5.6GB per frame at 30fps?
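As a quick back-of-envelope check of that per-frame budget (the 176 GB/s figure comes up later in this thread; the rest is simple division, and the quoted 5.6GB would correspond to a slightly lower effective bandwidth of ~168 GB/s):

```python
# Per-frame bandwidth budget = peak bandwidth / frame rate.
PEAK_BW_GB_S = 176.0  # 256-bit bus at 5.5 Gbps per pin

for fps in (30, 60):
    print(f"{fps} fps: {PEAK_BW_GB_S / fps:.2f} GB of traffic per frame")

# 30 fps: 5.87 GB of traffic per frame
# 60 fps: 2.93 GB of traffic per frame
```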
 
GDDR doesn't run very hot; the RAM chips themselves don't need a heatsink even at 6Gbps, as shown by the GeForce Titan, and I doubt the I/O interfaces in the GPU draw any more power than the interfaces in the RAM dies... Die space is probably a bigger concern, and you've got to find room to fit all the extra I/O pins, especially when trying to shrink the chip.

I think we should count ourselves lucky we actually got 256 bits in a consumer device. I remember a time when that wide a bus was seen as something quite extraordinary, almost mythically impossible in fact. :) (The ATI Radeon 9700 Pro made the dream a reality, of course.)
 
To be fair, BW could be replaced with compute power as long as algorithms advanced suitably, but I'm not too convinced. I was asking for new fog techniques all generation, but we still ended up with alpha-blended particles and crawling framerates.
 
The memory controller and bus take a lot of die space. I think it would be more cost-effective to pick 7Gbps chips to get up to 224 GB/s on a 256-bit bus. In fact they will most probably choose 7Gbps chips but underclock/undervolt them for yield and reliability. I still think they could settle at 6Gbps for 192 GB/s, but even 176 GB/s is really nice with that amount of memory in a console.
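For reference, a minimal sketch of where those numbers come from (peak GDDR5 bandwidth is just bus width times per-pin data rate):

```python
# Peak GDDR5 bandwidth (GB/s) = bus width in bytes * per-pin rate (Gbps).
def peak_bw_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

for rate in (5.5, 6.0, 7.0):
    print(f"256-bit @ {rate} Gbps -> {peak_bw_gb_s(256, rate):.0f} GB/s")
# 256-bit @ 5.5 Gbps -> 176 GB/s
# 256-bit @ 6.0 Gbps -> 192 GB/s
# 256-bit @ 7.0 Gbps -> 224 GB/s
```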

1 million PS4s = 16 million high-end GDDR5 chips. Good luck with that.

It's probably a better situation than the Blu-ray diode when the PS3 launched, but still challenging for production.

Isn't there an error in your calculations?

As I see it, the PS4 has 8GB of GDDR5 memory, so 1 million PS4s sold = 8 million GDDR5 chips.

KSterson

Disregard my previous message.
I forgot that the chips are 512MB each.
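The corrected arithmetic, spelled out (assuming 4Gbit, i.e. 512MB, GDDR5 dies as noted above):

```python
# Chips per console = total memory / density per chip.
TOTAL_MB = 8 * 1024          # 8 GB of GDDR5
CHIP_MB = 512                # one 4 Gbit die

chips_per_console = TOTAL_MB // CHIP_MB
print(chips_per_console)                     # 16
print(f"{chips_per_console * 1_000_000:,}")  # 16,000,000 per million consoles
```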

A 384-bit bus would have meant a 7950 derivative. Even with more CUs/TMUs/ROPs, the difference from Pitcairn is really thin in benchmarks. Not really worth it, I think, since it's about 150mm² bigger.

A custom Pitcairn is the rational choice for a console this gen; it's really balanced while keeping the 32 ROPs and the right amount of CUs/TMUs, and you can't get better performance at about 200mm². An 870MHz clock and 6Gbps memory would have been great for around 2 TFLOPS and 200 GB/s, but who knows..
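As a sanity check on those figures (the 18 CU / 800 MHz line is the announced PS4 configuration, added here for comparison):

```python
# GCN peak FP32 rate: CUs * 64 lanes * 2 FLOPs per clock (FMA) * clock.
def gcn_tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(f"{gcn_tflops(20, 870):.2f} TFLOPS")  # 2.23: full Pitcairn at 870 MHz
print(f"{gcn_tflops(18, 800):.2f} TFLOPS")  # 1.84: PS4's 18 CUs at 800 MHz
```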
 
In my view, a basic peer-to-peer RemotePlay arrangement should only be used for LAN gaming.
In that case, the Vita does not need a copy of the RemotePlayed PS4 game.

To switch the game host across WAN between 2 PS4s on-the-fly, they can do either of the following (a rough sketch follows the list):

* Mandate that both parties own the game: then they only need to share action and state data, like co-op.
or
* Have both parties stream from the Gaikai servers, so that the host doesn't have to stream the 1080p game presentation upstream (to another PS4). In this case, neither party technically needs a local copy of the game; it just happens that the host already bought a copy but got stuck. This approach is only applicable if Gaikai's instant demo platform works as advertised.
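A minimal sketch of that decision logic, purely illustrative (all names are mine; nothing here is Sony's actual design):

```python
# Hypothetical model of the two WAN host-migration options above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Player:
    owns_game: bool         # has a local licensed copy
    gaikai_available: bool  # can stream the title from Gaikai servers

def wan_handoff_mode(old_host: Player, new_host: Player) -> Optional[str]:
    # Option 1: both parties own the game, so only action/state data
    # needs to cross the WAN, much like co-op netplay.
    if old_host.owns_game and new_host.owns_game:
        return "peer-state-sync"
    # Option 2: both parties stream from Gaikai, so neither needs a
    # local copy and the host never uploads 1080p video upstream.
    if old_host.gaikai_available and new_host.gaikai_available:
        return "gaikai-server-side"
    return None  # no viable WAN hand-off
```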



As you can see, I don't believe in peer-to-peer RemotePlay across WAN.


It will work as long as latency isn't too wild.

It should be no different from essentially hosting a Gaikai server on your PS4, and since there's HW encode/decode embedded, it should be fine.
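A rough latency-budget sketch of why local streaming should be fine (every figure here is my own guess, not a measured number):

```python
# Back-of-envelope Remote Play latency budget (illustrative guesses).
BUDGET_MS = {
    "capture + HW encode": 10,  # dedicated encoder, low-latency mode
    "network hop (LAN)":    5,
    "decode + display":    10,
}
total_ms = sum(BUDGET_MS.values())
print(f"~{total_ms} ms end-to-end")  # ~25 ms: under one 30 fps frame (33 ms)
```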

There seems to be a concern about local ISP bandwidth issues and the like.
While I understand that streaming 720p+ is going to take about 5+ Mbps, it's entirely possible that the PS4 doesn't encode at that resolution and goes for something in the realm of 544p (the PS Vita's resolution, as I recall) or 480p. That would lower the requirement to under 3 Mbps.
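Scaling the 5 Mbps 720p figure by pixel count gives roughly that (a crude model that assumes bits-per-pixel stays constant, which real encoders only approximate):

```python
# Crude bitrate estimate: scale a known bitrate by the pixel-count ratio.
def scaled_mbps(base_mbps: float, base_px: int, target_px: int) -> float:
    return base_mbps * target_px / base_px

px_720p = 1280 * 720
for name, px in [("544p", 960 * 544), ("480p", 854 * 480)]:
    print(f"{name}: ~{scaled_mbps(5.0, px_720p, px):.1f} Mbps")
# 544p: ~2.8 Mbps  (960x544 is the Vita's native resolution)
# 480p: ~2.2 Mbps
```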


Anyway, from my experience, if a user wants to stream and has infrastructure limitations, it really isn't up to the manufacturer or service provider to fix the user's own problems. It's up to the user and his ISP. Or don't stream, period.

This is like asking GE to provide a refrigerator that still works 24/7 when you have power outage issues.
It's just not their responsibility.

Trying to work around network infrastructure limitations (latency, bandwidth) will most likely result in overly complex systems and make the experience much more unpleasant.
 
I have a question: if the PS4 reserves 2GB for the OS and system (as an example), will the GPU be able to take advantage of the remaining 6GB, or will there be a point where it's wasted?
 
This is like asking GE to provide a refrigerator that still works 24/7 when you have power outage issues. It's just not their responsibility.

Trying to work around network infrastructure limitations (latency, bandwidth) will most likely result in overly complex systems and make the experience much more unpleasant.

This is why some service providers (e.g. IP-based video conferencing services) partner with ISPs to make sure their applications deliver acceptable, or even the best, experiences.

I remember reading that Apple makes sure the mobile operators' networks work well with their devices (and applications) before allowing them to sell iPhones.

A fridge is just a standalone device.

I'm not saying Sony is going there, but there are organizations that pay attention to quality of service.
If the experience is poor, then they may not be able to market/promote the feature as a WAN service anyway. To be fair, there are also business challenges to deal with for streaming full games from Gaikai servers.
 
New features like these also put pressure on ISPs to improve their service or risk losing customers to competitors better equipped to provide a good experience. If your DSL upstream sucks maybe your PS4 will convince you to switch to cable. Or if your cable is unreliable, maybe you seek out a fiber provider.
 
I have a question: if the PS4 reserves 2GB for the OS and system (as an example), will the GPU be able to take advantage of the remaining 6GB, or will there be a point where it's wasted?
If your program uses only a small amount of memory, the GPU should have ~6GB available to it.

IMHO, it's a lot more interesting to think about what games will be able to do with >4GB of memory dedicated to the CPU..
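To make the split concrete, a hypothetical budget (the 2GB OS reserve is the questioner's example, not a confirmed figure; the per-title splits are invented):

```python
# Hypothetical unified-memory budgets; all figures are illustrative.
TOTAL_GB = 8.0
OS_RESERVE_GB = 2.0
game_gb = TOTAL_GB - OS_RESERVE_GB  # 6.0 GB left for the game

# With unified memory each title can slide the divider as it likes:
gpu_heavy = {"gpu_gb": 4.5, "cpu_gb": 1.5}  # streaming-texture showcase
cpu_heavy = {"gpu_gb": 2.0, "cpu_gb": 4.0}  # big simulation / AI worlds
assert sum(gpu_heavy.values()) == sum(cpu_heavy.values()) == game_gb
```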
 
In that kind of architecture, the actual memory consumption of the CPU code will be very low (it's not a Cell; no audio on the CPU, etc.). Of course not all the available memory will act as VRAM, but most of it will. That's one of the advantages of unified memory: you can balance it as you like. It's a lot like the Xbox1 in fact.
There will always be something smart to do with lots of fast memory in a closed box; I can think of a lot of tricks.

ps: My previous message has been merged with the wrong guy ;).
 
New features like these also put pressure on ISPs to improve their service or risk losing customers to competitors better equipped to provide a good experience. If your DSL upstream sucks maybe your PS4 will convince you to switch to cable. Or if your cable is unreliable, maybe you seek out a fiber provider.

Fiber would be nice, if it wasn't limited to a very small percentage of citizens in the US.

As well, depending on how crowded your branch of the cable trunk is, you could have either great speed and reliable connections, or greatly varying speeds (depending on the time of day) and horrible reliability with frequent outages. And at least for Comcast, if you start constantly uploading a lot of data (hosting remote play at 4-5 Mbit/s upload for 720p), you could likely start getting warnings if you're in a high-traffic neighborhood.

DSL, on the other hand, is generally more stable, at least in my area and for many of my guildmates in past and present MMOs, but it has lower overall download speeds and vastly lower upload speeds. As well, many DSL providers are moving to FTTC (Fiber to the Curb, at which point copper carries it the last bit of distance into the house), but upload is still generally limited to 896 Kbit/s, although some go as high as 2 Mbit/s.
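Putting those upload caps against the streaming estimates from earlier in the thread (figures are the thread's rough numbers, nothing measured):

```python
# Can a given uplink host Remote Play? (All figures are rough estimates.)
UPLINK_KBPS = {"DSL (common cap)": 896, "DSL (high tier)": 2000,
               "Cable (typical)": 5000}
STREAM_KBPS = {"720p": 5000, "540p": 2800}

for link, up in UPLINK_KBPS.items():
    ok = [res for res, need in STREAM_KBPS.items() if up >= need]
    print(f"{link}: supports {ok if ok else 'nothing'}")
# DSL (common cap): supports nothing
# DSL (high tier):  supports nothing
# Cable (typical):  supports ['720p', '540p']
```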

For our cable members, we generally expect them to disconnect occasionally during raids or suffer from high pings (making them not worth having in the raid), and hence always have a backup ready to take over. We generally don't have to worry about our DSL members.

Regards,
SB
 
New features like these also put pressure on ISPs to improve their service or risk losing customers to competitors better equipped to provide a good experience. If your DSL upstream sucks maybe your PS4 will convince you to switch to cable. Or if your cable is unreliable, maybe you seek out a fiber provider.

That's true. It depends on the average-case performance and the appeal. If the starting experience is too far off, few users and businesses will bother to consider it.
 
What's more, developers are predicting the death of pure single-player experiences within several years.

I can't see this happening, but if it does it will be followed by my exit, and I'm guessing many, many other gamers' exits, from the games market very shortly after.
 
I can't see this happening, but if it does it will be followed by my exit, and I'm guessing many, many other gamers' exits, from the games market very shortly after.

The only sucky thing about this is that big publishers like EA might not even miss us when they really are farming the casual market to unspeakable degrees...

Microtransactions, I'm slowly realising, are here to stay, for example... I only wonder where I'm going to have to draw the line. I'm wary of them for obvious reasons, though I can appreciate how they can keep a game, studio, or publisher supporting a series. But there definitely is a line that gets crossed when microtransactions feel abusive and the game design starts pushing players to cough up money... No need to look further than how "unlocks" work in some phone games, with their gems and dual in-game currency systems...

I'm only supporting whoever makes games the way I'm used to playing them, too. As soon as that spout is turned off, I'm outta here. I'll happily play Red Alert 1 in my spare time... Still fun today, and it will still be fun tomorrow.
 
Has Sony said anything about the WiFi hardware in the PS4? I'll be disappointed if it's not dual-band 2.4 and 5 GHz with the new 802.11ac.
 
I hope it isn't; the last thing I need is people polluting the UNII-1 band and chewing whopping great big chunks of bandwidth while they're at it. I'd have to move my stuff over to UNII-2, and I doubt I could find a wireless USB dongle my TV would support.

Unless you're running some of the more extreme MIMO configurations, ac isn't doing much except allowing people to chew more bandwidth.
 
I see no reason for more than 802.11n at this point. Why include costly modern WiFi features that none of your customers will be able to use for lack of a suitable router? A more recent spec can be rolled out later if needed, but n is perfectly up to the task of playing games and streaming internet content. Heck, 802.11g is! The only obvious use for faster wireless is streaming video from the console to other devices. Streaming uploads to the internet will be capped at the internet upstream BW, which is typically low (~1 Mbps).
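The point in rough numbers (real-world throughput figures are my ballpark assumptions, not spec-sheet maximums):

```python
# Even modest WiFi real-world throughput covers what console use needs.
WIFI_REAL_MBPS = {"802.11g": 20, "802.11n": 100, "802.11ac": 300}
NEEDS_MBPS = {"online gaming": 1, "1080p video stream": 8,
              "typical internet upstream": 1}

worst_need = max(NEEDS_MBPS.values())
for spec, bw in WIFI_REAL_MBPS.items():
    verdict = "plenty" if bw >= worst_need else "short"
    print(f"{spec}: ~{bw} Mbps usable, {verdict} for the uses above")
```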
 