PS4 Pro Speculation (PS4K NEO Kaio-Ken-Kutaragi-Kaz Neo-san)

Status
Not open for further replies.
If that's true (smaller than the original PS4), then it has to be a smaller node. It also probably means that it's not pushing the new node very much in terms of die size so a 2X GPU would be within reason.
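The die-size reasoning can be sketched with rough arithmetic. All numbers here are assumptions for illustration: ~348 mm² is the commonly reported 28nm PS4 APU area, the 2x density gain is an idealized full-node shrink, and the half-GPU die split is a guess.

```python
# Back-of-the-envelope die-area estimate (illustrative, not actual Sony/AMD figures).
ps4_area = 348.0      # approx. 28nm PS4 APU die area in mm^2 (widely reported)
scaling = 2.0         # assumed idealized 28nm -> 14nm density gain
gpu_fraction = 0.5    # assumption: roughly half the die is GPU

gpu = ps4_area * gpu_fraction
rest = ps4_area - gpu
# Double the GPU portion, keep the rest, then shrink everything to the new node:
new_area = (2 * gpu + rest) / scaling
print(round(new_area))  # -> 261, i.e. still smaller than the original ~348 mm^2
```

So even a doubled GPU would plausibly fit in a smaller die on the new node, which is consistent with a physically smaller console.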
 
Sounds like it will be announced at the "PlayStation Experience" event they usually have late in the year.

500GB is interesting. But I guess if it keeps costs down...

That $499 price though...not sure how that plays.
 
If that's true (smaller than the original PS4), then it has to be a smaller node. It also probably means that it's not pushing the new node very much in terms of die size so a 2X GPU would be within reason.

So it sounds like they are keeping the original PS4 as-is and selling a smaller, more powerful version for more money. Interesting.
 
Based on his previous posts, a $499 price should mean an improved CPU as well. But how do you significantly improve the CPU while maintaining strict hardware BC compliance? A very big overclock? I don't see any other possibilities. Two CPUs? Four quad-core Jaguar modules? Is that even technically possible?

500GB is more like good news. It's easier to upgrade the HDD than the APU... well, except when you're too afraid of losing your P.T. copy in the process. :yep2:
 
will it run crysis ?
;)

Based on his previous posts, a $499 price should mean an improved CPU as well. But how do you significantly improve the CPU while maintaining strict hardware BC compliance?
They could move to Puma [the successor of Jaguar], which would run at low clocks by default [to match the PS4 performance level as closely as possible], with an option for devs to enable higher clocks. Variable clocks are nothing new, especially on modern x86 [my 4GHz desktop processor runs at 800MHz most of the time]. I presume this would not offer a truly significant boost to the CPU, but even 20-40% would be good.

Of course, if they go with something other than Jaguar/Puma, then we could be getting something faster.
 
A very big overclock? I don't see any other possibilities. Two CPUs?
Unless all software is written in a scalable threading manner that doesn't make any assumptions about hardware implementation, core count changes strike me as unrealistic. Is something like 2.4 GHz possible? A 50% CPU increase would be significant, compatible, and not require any changes anywhere. Alternatively 8 cores that are just better. ;)
 
Unless all software is written in a scalable threading manner that doesn't make any assumptions about hardware implementation, core count changes strike me as unrealistic. Is something like 2.4 GHz possible? A 50% CPU increase would be significant, compatible, and not require any changes anywhere. Alternatively 8 cores that are just better. ;)

It could be as simple as taking the existing PS4 architecture down to 14nm with more GPU compute units, and clocking everything higher.
 
I played through this entire game on PS3, at 720p... despite owning a beefy gaming rig at the time (played it already on PC).

I enjoyed it on PS3 as much as I did on my PC... and then I went to bed that night with a splitting headache and blurred vision, after spending so long looking at the blurry, shoddy IQ.
 
I'm not sure that's ever been the claim though has it?

That's what ToTTenTranz is claiming.

I read the article. I also read Carmack's tweets about how a single GPU in VR has to deal with a lot of overhead and latency when switching between the two different viewpoints of the same frame, which a dual-GPU system avoids (all it has to do is "wait" for both frames to finish, which is usually not long if the GPUs are identical). And I read presentation slides about AMD's LiquidVR and nVidia's GameworksVR.

I have also tested SteamVR myself: a single 290X got a mediocre score, and two 290X cards with multi-GPU enabled got a perfect score.

They all point to dual GPUs being excellent for VR, where you can achieve at least twice the complexity per frame at the same latency.
Then there's all the VR devkits being shipped with dual-GPU solutions.


After all of this, how someone could still think dual-GPU is somehow worse for VR is beyond my comprehension.

But none of that means that dual GPU is inherently better than single GPU, and it certainly isn't better if you take cost per unit of performance into account (dual GPU doubles up on bus, power, interface, and inefficiency).

Dual GPU is a good fit for VR, but not inherently better than single GPU. And if you should try and use your dual GPU for a single display then things end up somewhere between "somewhat worse" and "everything has gone to shit".
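The per-frame timing argument above can be sketched numerically. All numbers here are made up for illustration: the point is only that a single GPU pays for both eyes sequentially plus a viewpoint-switch cost, while two GPUs render one eye each in parallel and only pay a small synchronization wait.

```python
# Illustrative stereo-frame timing model (hypothetical numbers, not measurements).
eye_render_ms = 8.0        # assumed time to render one eye's viewpoint
switch_overhead_ms = 1.0   # assumed cost of swapping viewpoints mid-frame
sync_wait_ms = 0.5         # assumed wait for the slower of two identical GPUs

single_gpu = 2 * eye_render_ms + switch_overhead_ms  # renders both eyes in sequence
dual_gpu = eye_render_ms + sync_wait_ms              # one eye per GPU, in parallel

print(single_gpu, dual_gpu)  # -> 17.0 8.5 (ms per stereo frame)
```

Under this toy model the dual-GPU setup roughly halves frame time, which is the "twice the complexity at the same latency" claim; it says nothing about cost per unit of performance, which is the counter-argument above.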
 
$500 is a mistake.

How?

The core PS4 will more than likely be $299 when the PS4K is introduced. This product is more for those diehard gamers willing to spend more on a premium product. Why should the core or casual gamer even care about the pricing of a premium product that doesn't affect their ability to buy the entry-level model?
 
I could imagine a situation where this is happening from both Sony and Microsoft because both discovered it would be cheaper to commission new 14nm APU tech from AMD than to have their current architectures shrunk...
This makes no sense. Porting to a new node is relatively easy. It just takes money and time and far less of both than creating a new architecture.
 
This makes no sense. Porting to a new node is relatively easy. It just takes money and time and far less of both than creating a new architecture.
The time and money for the new architecture on a new node has already been invested though (presuming a PC technology derivative like the PS4).
 
Damn. A console smaller than the PS4 with 3.6+ TFLOPS would be enticing...
I wonder if the smaller size could point towards it using HBM2 memory. That would cut down the PCB size appreciably, and help explain the supposed $499 price.
It's probably still unlikely for cost reasons, as I presume Sony will want the hardware to at least break even.
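For what it's worth, the FLOPS math behind a figure like 3.6 TFLOPS follows directly from GCN's layout (64 shader lanes per CU, 2 FLOPs per lane per clock via FMA). The CU counts and clocks below are assumptions for illustration, not confirmed specs:

```python
def gcn_tflops(cus: int, clock_ghz: float) -> float:
    # GCN: 64 shader lanes per CU, 2 FLOPs per lane per clock (fused multiply-add)
    return cus * 64 * 2 * clock_ghz / 1000.0

print(gcn_tflops(18, 0.8))  # ~1.84 TFLOPS: stock PS4 (18 CUs at 800 MHz)
print(gcn_tflops(36, 0.8))  # ~3.69 TFLOPS: a hypothetical doubled GPU at the same clock
```

So a straight doubling of CUs at the stock clock already lands right around the rumored 3.6+ TFLOPS figure, even before any clock bump.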
 
Would a developer that's got access to a development kit really be in the know about how much it'll cost? I would have thought that only Sony would have that information.
 
Would a developer that's got access to a development kit really be in the know about how much it'll cost? I would have thought that only Sony would have that information.

I can't see how a developer would have any kind of accurate idea of what a final product will cost. Devkits are often quite different to production hardware, and even if a dev had access to final hardware for testing, how would somebody know what Sony is paying to license IP, manufacture, and distribute, or what their profit margin is?
 
Agreed. I can't help but think it calls that particular leaker into question.

The other possibility is that he works for Sony. Though again, to have that knowledge, he wouldn't be an average employee, he'd need to be deep in product development.

Makes it seem quite unbelievable to me.
 