PS4 Pro Speculation (PS4K NEO Kaio-Ken-Kutaragi-Kaz Neo-san)

I see this happening at friends and relatives as well. In the store they go: "you've got to have a sound bar". So they get one. No multiple speakers, which is a shame, because I have yet to hear a sound bar that sounds 'good'. I am no paedophile but I can enjoy a good game or movie blasting through my Yamaha receiver and Bose 301 speakers. With the sound bar, even if you turn the volume up it still sounds a bit cheap, even when the sound bar costs 350 euro (which would get you pretty good speakers instead).
 

I sure hope not.
 
Lol, sorry, looks like autocorrect messed up :D
I meant to say I am no audiophile. Unless we are talking about crystal-clear recordings of swimming pools or children's playgrounds; in that case I want the best, most lifelike audio performance possible.
(that was obviously a joke).

Btw I don't like listening through headphones. But between those and a sound bar, I would wear the headphones.
 
I've got the Sonos soundbar. It sounds great if you add the $600-something Sub to it, so it's not a cheap solution. If you don't, it's kinda meh. I still chose to spend the money (plus a set of Play:1 rears) because I don't really have the room for discrete front speakers, and I have enough cables snaking across my floor as it is without adding speaker wires to the mess. Plus, Sonos integrates great with Spotify.

Not really the right thread for this discussion tho. ;)
 
I'll believe it when I see it... "Ultra-high definition" graphics would pretty much need a dGPU, wouldn't it? 4K has ~3x the pixels of 1080p IIRC, so ~5.5 TF of shading and 3x the rasterization resources would need a ridiculous APU, especially where memory is concerned.

A split system where the iGPU and dGPU share the burden (with the iGPU running compute tasks for latency reasons) would be more realistically capable of handling 4K at gaming-friendly framerates. Not that I'm convinced any "PS4K" would necessarily run at 4K... Maybe it'd settle for, say, 1440p, and then upscale to 4K; see the quick sums after this post. Then you could probably manage with just a single-chip APU. Hook some 7 GHz GDDR5 up to it, add a proportional amount of shader arrays and ROPs... Maybe some increased internal efficiency as well from latest-generation GCN tech.

Yeah, let's dream on, shall we!
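A quick back-of-envelope for that 1440p idea, naively assuming required shader throughput scales linearly with pixel count (it doesn't exactly, but it's a useful ballpark; the 1.84 TFLOPS baseline is the launch PS4's GPU):

```python
# Naive pixel-count scaling from the launch PS4 (1.84 TFLOPS, 1080p-class targets)
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels

ratio = px_1440p / px_1080p            # ~1.78x the pixels of 1080p
print(f"{ratio:.2f}x pixels")
print(f"~{1.84 * ratio:.2f} TFLOPS")   # ~3.27 TF -- plausibly single-APU territory
```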

'UHD' covers two resolutions: UHD 4K (3840×2160), which has 4 times as many pixels as 1080p, and UHD 8K (7680×4320). Make it rain, Sony! :runaway:
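Since the thread already tripped over this once, the ratios are worth checking (trivial Python):

```python
px_1080p = 1920 * 1080   #  2,073,600 pixels
px_uhd4k = 3840 * 2160   #  8,294,400 pixels
px_uhd8k = 7680 * 4320   # 33,177,600 pixels

print(px_uhd4k / px_1080p)   # 4.0  -> UHD 4K is exactly 4x 1080p (not ~3x)
print(px_uhd8k / px_1080p)   # 16.0 -> UHD 8K is 16x 1080p
```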

Don't the new processes + designs offer 2.5x performance per watt, a figure that increased from an original 2x? And doesn't async compute have the potential to increase relative performance by up to 45%, with the original PS4 already having enhanced async compute capabilities? And let's not forget lower settings and console optimizations. Even GTA V can run at 50-60 fps at 4K on a 970 with lower settings.

But the memory would be an issue. I imagine they'd need either HBM2 or GDDR5X to feed it. If they could get one of these at a low price, the reduced power consumption compared to GDDR5 would also increase the power budget available for the APU, and thus potential performance.
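Taking those claims at face value, here's the back-of-envelope they imply. The 2.5x and 45% numbers are the rumoured figures above, not confirmed specs, and the 45% is a best case that real workloads won't always hit:

```python
ps4_tflops    = 1.84   # launch PS4 GPU, FP32 (known spec)
perf_per_watt = 2.5    # claimed 14nm FinFET gain, assuming an unchanged power budget
async_gain    = 0.45   # claimed best-case uplift from async compute

raw = ps4_tflops * perf_per_watt      # ~4.6 TFLOPS of raw throughput
effective = raw * (1 + async_gain)    # ~6.67 TFLOPS "effective", best case
print(f"raw ~{raw:.2f} TF, best-case effective ~{effective:.2f} TF")
```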
 
Yes, 2.5x performance per watt. But they cannot maintain as high a wattage with 14nm FinFET-based chips. Sure, we are getting a massive decrease in power consumption, but are we getting a massive improvement in performance/$? From what I've heard, the price per wafer is double that of 28nm, and the number of transistors they can fit on a wafer is only about double, even though you'd logically expect ~4x more.

So that puts a dent in the idea of Sony getting +45% more performance from a 14nm $120 APU (assuming it has the improvements to its memory system needed to feed a 45% more powerful GPU portion of the APU).
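If both rumours hold (2x wafer cost, only 2x transistors per wafer), the cost-per-transistor arithmetic comes out flat; illustrative numbers only:

```python
wafer_cost_ratio = 2.0   # rumoured: a 14nm wafer costs ~2x a 28nm wafer
transistor_gain  = 2.0   # rumoured: only ~2x transistors per wafer, not the ideal ~4x

cost_per_transistor = wafer_cost_ratio / transistor_gain
print(cost_per_transistor)   # 1.0 -> transistors no cheaper than on 28nm
```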
 

Those figures must be wrong, no? Or are you suggesting that the new AMD and Nvidia cards are not going to offer better perf/$? What about the phone refreshes on these processes, are they also not going to offer improved perf/$? Most sites seem to expect significantly improved performance without a significant increase in price.
 
I hope I'm wrong. I've been holding back on a new video card, waiting to see what these chips offer in terms of performance. But I have read those two tidbits: that they can't hit the same power and clock speeds will be lower, and that wafer costs are double while the transistor count on Samsung's 14nm FinFET process is only double that of TSMC 28nm.
There have been so many people excited about the performance-per-watt increase, but I worry it's marketing spin from AMD, since the official info about the new cards is lower wattage and there are rumors these cards run at lower clock speeds. I've also heard this 28nm-to-14nm shrink is one where there isn't much of a decrease in $/mm², which sounds plausible since TSMC's 20nm was skipped because it didn't offer enough of a decrease in $/mm² or enough of a performance increase for AMD or Nvidia.

[Image: AMD 14nm FinFET slide]


Hmm, Apple's semiconductor costs seem to have gone up ~30% since migrating from 28nm (iPhone 5s) to Samsung's 14nm FinFET process, according to this Merrill Lynch-sourced graph.

Apple iPhone 5s SoC, A7, 28nm: 102.7 mm²
Apple iPhone 6s SoC, A9, 14nm: 71 mm²
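Taking the graph's ~30% cost increase and those die sizes at face value, the implied cost per mm² roughly doubled. Rough arithmetic, assuming the quoted figures are accurate:

```python
a7_area_mm2 = 102.7   # A7, 28nm (as quoted above)
a9_area_mm2 = 71.0    # A9, Samsung 14nm (as quoted above)
cost_ratio  = 1.30    # ~30% higher SoC cost per the Merrill Lynch graph

cost_per_mm2 = cost_ratio / (a9_area_mm2 / a7_area_mm2)
print(f"~{cost_per_mm2:.2f}x")   # ~1.88x -> cost per mm^2 nearly doubled
```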


[Image: iPhone 6s component cost breakdown]

[Image: A7 and A9 die shots]
 
What seemed like crazy talk a few months ago now seems like a logical explanation.

It's clear that we will not be getting a 7.36 TFLOPS GPU in this PS4.5 to render PS4 games at 4K. So there has to be some offloading & acceleration happening somewhere.
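For reference, the 7.36 TFLOPS figure is just the launch PS4's GPU scaled linearly with the 4x jump in pixel count, a naive upper bound rather than a measured requirement:

```python
ps4_tflops  = 1.84                            # launch PS4 GPU, FP32
pixel_ratio = (3840 * 2160) / (1920 * 1080)   # 4.0

print(ps4_tflops * pixel_ratio)               # 7.36 TFLOPS for naive native 4K
```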
 


IIRC, I read that GF 14nm is more power-optimised than TSMC 16nm. There's also some tradeoff between power efficiency and transistor density on these nodes.

I assume those AMD slides are for GF, what with them being 14nm and all. But hey, at least with 14nm the cost per transistor is dropping again! A... bit.

P.S. As for the rumours you've heard, lower clocks could be because they're going wider, and lower wattage would presumably be a good thing if it's to hit PCI-E and laptop power targets. It might not all be bad!
 
Bah! If it's not 7.154TFlop minimum, it will tera-flop.
 