AMD did shrink K10 to 32nm for Llano. They had yield issues on the CPU side and clock speeds were disappointing, rendering them uncompetitive.
These days, die shrinks aren't necessarily a magic wand that makes everything massively faster and cooler. Going from 32nm to 28nm at GF, AMD made design choices that prioritised density increases (suiting the GPU) but hurt frequency at the top of the CPU spectrum.
The top-end desktop Jaguar hits 2.05 GHz, and that comes with a TDP bump for the APU from 15W to 25W. The top-end Puma is 2.2 GHz, and that's desktop-only with a 25W TDP for the APU. Sony aren't doing badly with 2.1 GHz across two Jaguar modules; they're likely at the point where either power would start to sky-rocket or yields would drop off. Jaguar simply isn't designed for high clocks, and Sony have a sizeable GPU in there needing power too.
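To see why power would "sky-rocket": CMOS dynamic power scales roughly with C·V²·f, and clocking a design past its sweet spot usually requires raising voltage, so power grows much faster than frequency. A minimal sketch, with illustrative voltage assumptions (not Jaguar's actual V/f curve):

```python
# Back-of-envelope CMOS dynamic power: P is roughly proportional to C * V^2 * f.
# The voltage points below are illustrative assumptions, NOT Jaguar's real V/f curve.
def relative_power(f_ghz, volts, f_base=1.6, v_base=1.0):
    """Dynamic power relative to a 1.6 GHz / 1.0 V baseline (capacitance held fixed)."""
    return (f_ghz / f_base) * (volts / v_base) ** 2

for f, v in [(1.6, 1.00), (2.1, 1.10), (2.4, 1.25)]:
    print(f"{f:.2f} GHz @ {v:.2f} V -> {relative_power(f, v):.2f}x baseline power")
```

Under those assumed voltages, a ~31% clock bump already costs ~59% more CPU power, and +50% clocks costs ~2.3x, which is why clocks stall well before the silicon's theoretical ceiling.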
I look at Intel's past gains from die shrinks, from the PIII into the revised Core architecture and the P4, and I know about AMD's past die shrinks: the "poor yielding Athlon X2s on 65nm", which even had "magical bugs" preventing overclocking like the 90nm CPUs...
That was blatantly an obsolesced architecture sacrificed in favor of pushing out the first Phenoms, which shipped with bugs... something that could have been avoided if focus had been placed on properly maturing the 65nm process to surpass 90nm, selling tons of chips, then shipping flawless quad-cores, then die shrinking, etc.
Llano smells a lot like the usual AMD shenanigans that cost them.
Who cares if overclockers were gonna snap them up... they could have mimicked an extended "tick-tock" cadence instead of superficially copying Intel (which is impossible, because Intel maxed out each architecture's performance before replacing it). Then again, AMD couldn't charge $999.95 for CPUs... some of the decision-making just hasn't been based on making strong transitions, and the same problem magically reappears.
28nm Zambezi octa-core shrinks would have helped keep channels filled instead of drying up, leaving Intel to collect most of the pie.
Jaguar can safely go to 2.4 GHz.
Along with die shrinks there are usually additional process enhancements.
Jaguar+ can safely go to 4.2 GHz.
Doubts are very justified at this point.
Remember that Sony needs to incentivise people to buy the new platform, while at the same time maintaining the original PS4 userbase.
If a game is 'too much better' on PS4K and the PS4 version runs like crap, people might buy into the new system, but I would expect the backlash against Sony to be similar to what MS experienced at the beginning of this generation.
If a game runs great on PS4, without much of a difference from the Neo version, then what's the point of buying the bloody thing?
They will need to strike the right balance between these two outcomes, and I think it will be a tough job.
PS1 coexisted with PS2 for a ridiculous amount of time... we could argue that Sony and AMD stand to make profits off 14nm-shrunk PS4 Slims coexisting with the PS4Kaio Ken.
Remember that besides the "enhanced old games", there are still devs who would take on the challenge of making PS4Kaio Ken exclusives, which could be standard 1080p but with much higher image quality thanks to the combined gains of all the parts.
Uncharted 4 is already said to fill the 50GB Blu-ray with the single-player campaign alone, leaving the online multiplayer as a DLC component, which is a smart move, as past Uncharted games didn't have huge online communities compared to the "usual suspects" and rivals.
Again, a SATA-III 6Gb/s hardware controller chip built in, regardless (again) of diminishing returns on non-SSD drives, could prove an additional testing platform for early adopters who may already have 512GB and 1TB SSDs.
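As a rough illustration of why the returns are diminishing for mechanical drives (the sustained-throughput figures below are typical ballpark assumptions, not measurements):

```python
# SATA uses 8b/10b encoding, so usable bandwidth is ~80% of the line rate.
def usable_mb_s(line_rate_gb_s):
    return line_rate_gb_s * 1e9 * 0.8 / 8 / 1e6

links = {"SATA II (3 Gb/s)": usable_mb_s(3), "SATA III (6 Gb/s)": usable_mb_s(6)}
# Ballpark sustained-throughput assumptions for typical drives:
drives = {"2.5in 5400rpm HDD": 100, "SATA SSD": 520}

for drive, speed in drives.items():
    for link, cap in links.items():
        note = "link-limited" if speed > cap else "drive-limited"
        print(f"{drive} on {link}: {min(speed, cap):.0f} MB/s ({note})")
```

The HDD can't even saturate SATA II, so the faster link only pays off once you drop an SSD in.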
The doubt is casual adoption... and hardcores are unreliable consumers... but what if there are special exclusive titles?
Eventually the hardcore graphics whore is not gonna resist.
But Nvidia is less efficient.
Lol, I like that, but I wanted to say that we should remember the GeForce GTX 980, which back then had a lower TDP and better thermals than high-end GPUs typically did.
When Mark Cerny called it a "super-charged PC architecture", it was evident that the PS4 is essentially a PC with fewer models and a closed architecture, just like a Mac. With x86, unlike Cell, they don't have to think about long-term ventures like depreciating semiconductor factories by themselves. What they have to think about is how to maximize profit and reach at the global level: some countries need cheaper, affordable models; other countries need pricier, shinier models; etc. Variations in HDD size could justify the prices of more expensive models, so why not try more substantial customization, with the additional bonus of UHD BD & HDR support?
Maybe it was designed to be flexible, like a hidden deck of cards to play.
PCs rely on brute force...not consoles.
Rumors said that Polaris 10 will use ~110W, but that was with 8GB of GDDR5 and higher clocks than PS4K will use. With the added CPU module and other console components, I think PS4K will have a similar TDP to the launch PS4: ~150W.
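For what it's worth, here's the kind of back-of-envelope budget behind that ~150W figure; every component number is a rumor-level assumption, not a spec:

```python
# Hypothetical PS4K power budget; all figures are assumptions for illustration only.
budget_w = {
    "Polaris 10 GPU (downclocked from the ~110W desktop rumor)": 95,
    "8-core Jaguar module (~2.1 GHz)": 25,
    "GDDR5, HDD, BD drive, I/O, misc": 20,
    "PSU conversion losses (~10%)": 14,
}
for part, watts in budget_w.items():
    print(f"{part}: {watts} W")
print(f"Estimated wall draw: ~{sum(budget_w.values())} W")
```

That sums to roughly 150W at the wall, which is in the same ballpark as what the launch PS4 draws in games.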
AMD already presented their engineering PC with "unnamed midrange" parts consuming less than 90W in OVERALL power draw in that video, with identical PC components (according to AMD) except the GPU: a 28nm GeForce versus 14nm FinFET Polaris.
IIRC they only showed Battlefront fps matching, while being careful not to talk too much about benchmarks.
PS4Kaio Ken is a full CPU and GPU die shrink, an architecture revision, a Zenkai clock power-up.
Even accounting for a full power transformation, the CPU/GPU are more than likely to consume either less than or nearly the same as the current figures.
My estimate would be a conservative 40W less overall than the current PS4.