portable PS4 with the new ryzen APU.... shudder.. even if they did that, i cant imagine how loud it will sound
My litmus test for mobile is NSW. And that has some of the best perf/watt there is, and if you look at where it is graphically today, it's a long way from what the PS4 is capable of.
Was the PS4 GPU/CPU architecture THAT inefficient?
I'm gobsmacked that a PS4 APU (18CU GPU and 8c Jaguar) at 14nm consumes 60-80W and a RR APU (10CU GPU and 4core Zen) consumes 9-15W....
That's a HUGE difference in power consumption for a CPU that's probably twice as fast and a GPU half the size (which is based on Vega—which according to the internet is the anti-christ of power inefficiency).
We have a 14/16nm PS4 already, the PS4 Slim. And that console needs ~80-90W for the whole system. So 60-80W for the APU is not that far off.
Yeah... the 60-80W range for the PS4 SoC seems fishy to me.
I dunno, I'm just throwing ideas out there.
HDD cache?
Was the PS4 GPU/CPU architecture THAT inefficient?
We have a 14/16nm PS4 already, the PS4 Slim. And that console needs ~80-90W for the whole system. So 60-80W for the APU is not that far away.
Btw, the Ryzen we speak of has a power window of 9-25W. At 9W it will be downclocked to something you don't wanna see in a console. Even at the full 25W the GPU shouldn't be that good, simply because of the clock rates. It may boost here and there, but it is no efficiency wonder.
You also forget the wider memory interface, the GDDR5 RAM and so on.
Then you would need a display (because it should be mobile), so your APU can't even use the whole 9W for itself.
As I wrote before, a portable PS4 will not be realizable, not even at 7nm, without making drastic changes to the architecture. And then you lose the compatibility.
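The downclocking point can be made concrete with a rough first-order CMOS dynamic-power model: P scales roughly with f·V², and along a DVFS curve voltage roughly tracks frequency, giving P ~ f³. This is a sketch under those assumptions; none of it is measured Raven Ridge data.

```python
# Rough CMOS dynamic-power model: P ~ C * f * V^2, and along the DVFS
# curve voltage roughly tracks frequency, so P ~ f^3 to first order.
# The TDP figures are Raven Ridge's configurable range; everything
# else here is an illustrative simplification, not measured data.

def frequency_scale(p_target: float, p_base: float) -> float:
    """Frequency multiplier needed to fit p_base silicon into p_target."""
    return (p_target / p_base) ** (1 / 3)

base_tdp = 25.0   # W, upper configurable TDP
low_tdp = 9.0     # W, lower configurable TDP

scale = frequency_scale(low_tdp, base_tdp)
print(f"clock multiplier at {low_tdp:.0f} W: {scale:.2f}")
# prints "clock multiplier at 9 W: 0.71"
```

Under this toy model a ~2.8x power cut only costs ~29% of clock speed, but in practice static leakage and fixed uncore power eat into the budget, so real clocks at 9W drop much further, which is the "something you don't wanna see in a console" above.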
You are comparing the peak power of the whole machine (~70W), comprising:
- APU + fan + 8 GB of GDDR5 + HDD + Blu-ray drive + ARM + 256MB DDR3 + the rest of the components on the board + what is lost to the internal power supply's inefficiency, versus an optimistic estimated consumption of only one APU.
86W with the disc spinning is the max power consumption seen in any one game; generally the max is ~63W. For reference, the PS4 consumes 47W on the front end with online on. The max for the APU alone should be way lower than ~70W.
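The breakdown above can be turned into back-of-envelope arithmetic: take wall power, subtract PSU loss, then subtract the non-APU components. Every per-component figure and the PSU efficiency here are illustrative assumptions, not measurements.

```python
# Back-of-envelope estimate of PS4 APU-only draw from wall power.
# The 86W/63W wall figures come from the post above; every other
# number here is an illustrative assumption, not a measured value.

wall_peak = 86.0          # W, max observed with disc spinning
typical_game = 63.0       # W, more common in-game max

psu_efficiency = 0.85     # assumed internal PSU efficiency
overhead_watts = {
    "GDDR5 (8 GB)": 10.0,            # assumption
    "HDD": 4.0,                      # assumption
    "Blu-ray drive": 5.0,            # assumption, disc spinning
    "fan + board + ARM/DDR3": 5.0,   # assumption
}

def apu_estimate(wall_watts: float) -> float:
    """Estimate APU draw: DC power after the PSU, minus everything else."""
    dc_power = wall_watts * psu_efficiency
    return dc_power - sum(overhead_watts.values())

print(f"APU at 86 W wall: ~{apu_estimate(wall_peak):.0f} W")
print(f"APU at 63 W wall: ~{apu_estimate(typical_game):.0f} W")
```

With these assumed overheads the APU alone lands around 30-50W rather than 60-80W, which is the point being made: whole-machine wall power is a poor proxy for APU power.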
HDD cache is volatile yet small - the worst of both worlds! There's a significant difference in implementation and results between an HDD with a 512 MB cache, a 32 GB pool of 'slow' RAM, and an SSD. I'm not even sure a 512 MB cache on the HDD would be particularly useful, as the data access patterns probably don't fit it well. Common sense suggests that if it sped things up, manufacturers would already be offering HDDs with large caches and notable performance gains.
One additional purpose for the DRAM on many SSDs is to hold the translation table between logical and physical blocks, which can be dynamically remapped for purposes such as wear-leveling or garbage collection. Cheaper SSDs have sometimes cut out the DRAM, although performance suffered.
I'd expect the DRAM on mechanical HDDs to be used to optimistically prefetch blocks from the same track as the current request, and to buffer writes to see if multiple writes hit the same track - cutting down on seeks.
You're absolutely right that beyond these buffer requirements, adding more DRAM would just add cost, not performance.
Edit/Addendum: The DRAM on SSDs is used for the same purposes: aggregating writes and prefetching reads.
Cheers
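The write-buffering idea described above is easy to sketch: hold queued writes in DRAM, group them by track, and service each track with a single seek. The track geometry and helper names here are made up for illustration.

```python
# Toy sketch of HDD write coalescing: buffer pending writes in DRAM,
# group them by track, and flush each track's batch with one seek.
# SECTORS_PER_TRACK and the coalesce() helper are hypothetical; real
# drive firmware knows the actual (zoned) geometry.
from collections import defaultdict

SECTORS_PER_TRACK = 1024  # hypothetical fixed geometry

def coalesce(writes):
    """Group pending (lba, data) writes by track, sorted within each track."""
    by_track = defaultdict(list)
    for lba, data in writes:
        by_track[lba // SECTORS_PER_TRACK].append((lba, data))
    return {track: sorted(batch) for track, batch in by_track.items()}

pending = [(5000, b"a"), (10, b"b"), (5001, b"c"), (20, b"d")]
batches = coalesce(pending)
print(len(batches), "seeks instead of", len(pending))
# prints "2 seeks instead of 4"
```

Four scattered writes collapse into two track-sized batches, which is exactly the seek reduction the DRAM buffer buys.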
RAM is relatively cheap; the scaling continually shows a downward trend in price per GB. HDD prices have somewhat stagnated, and the hard part is getting faster access to your information.
I think Scorpio really pushed the bar hard. Well, in a broader sense, it's iterative consoles that did.
It's going to be fairly difficult to make a substantial leap anytime soon, especially in the area of costly RAM, where people are getting carried away.
I don't expect 24 GB of GDDR6 to be affordable for a long time. The $750 GTX 1080 Ti, king of graphics cards, only has 11 GB. Additionally, the graphics card race seems to have slowed. I remember when my brother bought a GTX 780 when it was near the top; it got left behind quickly. He bought a 1080 Ti a few months ago, and unlike that 780 it's still on top for the foreseeable future.
People are throwing around 24-32 GB of GDDR RAM; that's gonna be super costly. 16 GB seems like a more realistic short-term goal.
At these specs the next gen will have to be $500 minimum. Which is probably OK I guess.
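The 16 GB vs 24-32 GB argument is really just bill-of-materials arithmetic: memory cost scales linearly with capacity against a roughly fixed console budget. The dollar-per-GB figure below is purely hypothetical, not a real GDDR6 quote.

```python
# Back-of-envelope memory cost against a console BOM budget.
# assumed_price is a hypothetical placeholder, NOT a real GDDR6 quote;
# the $500 budget echoes the price point mentioned above.

def memory_cost(capacity_gb: int, usd_per_gb: float) -> float:
    """Linear BOM cost of the memory pool alone."""
    return capacity_gb * usd_per_gb

assumed_price = 8.0  # USD/GB, purely illustrative
console_bom = 500.0  # USD, rough total budget

for cap in (16, 24, 32):
    cost = memory_cost(cap, assumed_price)
    print(f"{cap} GB -> ${cost:.0f} of a ~${console_bom:.0f} BOM")
```

Whatever the real per-GB price, each extra 8 GB adds the same linear chunk to the BOM, which is why 16 GB looks like the realistic short-term target and 24-32 GB looks super costly.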