PS4 Pro Speculation (PS4K NEO Kaio-Ken-Kutaragi-Kaz Neo-san)

Developers have to know its spec; publishers/retailers have to know its estimated price to do their own sales estimates and planning for schedules and shelf space.
 
You'd have thought, but it doesn't work that way. Devs/pubs work in the dark, and then get happy/sad when prices are announced. PS3 and OVR were priced way higher than devs were expecting and the impact on their potential customer base was a genuine concern.
 
Developers have to know its spec; publishers/retailers have to know its estimated price to do their own sales estimates and planning for schedules and shelf space.
Retail won't know the price before investors. Investors won't know the price until Sony announce it. If retail knew about the price before it was announced it would leak.
 
And if you should try and use your dual GPU for a single display then things end up somewhere between "somewhat worse" and "everything has gone to shit".

As someone with a dual-GPU setup, I can confirm this to be awfully untrue.
Do you happen to own a dual gpu system? Did you ever, at least for the past 2 years?

But none of that means that dual GPU is inherently better than single GPU, and it certainly isn't better if you take into account cost per performance unit (dual GPU double up on bus, power, interface, inefficiency).
I don't get the dual gpu hate here. A pair of 290X is cheaper and gets consistently better results than a 980ti in VR and in games where crossfire is supported (which are all AAA games with demanding graphics.. which is where you'll need the second gpu anyway).
That's actual cost-performance, not some aficionado's theories about bus and interface.
 
Xbox 360 One II
I wonder if the smaller size could point towards it using HBM2 memory. That would cut down the PCB size appreciably, and help explain the supposed $499 price.
Probably still unlikely for cost reasons, as I presume Sony will want to keep the hardware at least break-even.

Hmmm still think HBM2 would be too expensive.

The only thing that got me confused, though, is the whole extra $100 for a better CPU. That seems like a lot relative to the overall cost just for the CPU. It has to be CPU + something else.
 
Xbox 360 One II


Hmmm still think HBM2 would be too expensive.

The only thing that got me confused, though, is the whole extra $100 for a better CPU. That seems like a lot relative to the overall cost just for the CPU. It has to be CPU + something else.

Not if the beefed-up CPU has an impact on PCB layout, cooling solution design, packaging size and thus shipping and handling costs. We currently have no idea what "better CPU" means; it could mean the same 8-core Jaguar but overclocked, or an entirely new architecture like 8 Zen cores clocked at something like 3.2 GHz.
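
Just to put rough numbers on how far apart those two scenarios could land, here's a quick back-of-envelope sketch in Python. Everything in it is my own guesswork, not anything from the leaks; the overclock figure and the Zen per-clock factor in particular are assumptions.

Code:
# Crude comparison of the two "better CPU" scenarios mentioned above.
# All clocks and the per-clock factor are illustrative guesses, not leaked specs.

def relative_cpu_throughput(cores, clock_ghz, per_clock_factor):
    # Very rough proxy: cores x clock x assumed per-clock throughput.
    return cores * clock_ghz * per_clock_factor

ps4_jaguar       = relative_cpu_throughput(8, 1.6, 1.0)  # stock PS4 baseline
jaguar_overclock = relative_cpu_throughput(8, 2.1, 1.0)  # hypothetical Jaguar OC
zen_8_core       = relative_cpu_throughput(8, 3.2, 1.5)  # hypothetical Zen, ~1.5x per-clock guess

print("Jaguar OC vs stock PS4:     %.2fx" % (jaguar_overclock / ps4_jaguar))
print("Zen @ 3.2 GHz vs stock PS4: %.2fx" % (zen_8_core / ps4_jaguar))

Whether either gap actually costs anything like $100 once the PCB, cooling and packaging knock-on effects are factored in is anyone's guess.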
 
As someone with a dual-GPU setup, I can confirm this to be awfully untrue.
Do you happen to own a dual gpu system? Did you ever, at least for the past 2 years?


I don't get the dual gpu hate here. A pair of 290X is cheaper and gets consistently better results than a 980ti in VR and in games where crossfire is supported (which are all AAA games with demanding graphics.. which is where you'll need the second gpu anyway).
That's actual cost-performance, not some aficionado's theories about bus and interface.

You're using retail prices in your comparison when I feel you should be using manufacturer pricing. They artificially increase the price on the higher-end GPUs because the market will pay it, so you don't get a realistic look at how much it costs Nvidia/AMD to deliver the dual GPU core vs a single beefier GPU core. That is the real cost-performance we should be looking for. Don't forget to factor in the increase in circuit board complexity and cooling complexity with dual GPU setups. In the end, that will be the deciding factor for what Sony/MS selects.
 
You'd have thought, but it doesn't work that way. Devs/pubs work in the dark, and then get happy/sad when prices are announced. PS3 and OVR were priced way higher than devs were expecting and the impact on their potential customer base was a genuine concern.

Retail won't know the price before investors. Investors won't know the price until Sony announce it. If retail knew about the price before it was announced it would leak.
I'd say they won't know the exact price beforehand, but a ballpark figure would be much appreciated, for example that it won't go higher than $499. As for Oculus Rift, it has had almost no shelf space for a while.
 
I still don't think that happens. Only maybe the senior execs in a retail chain will be informed, when negotiating purchase orders. Devs and publishers are definitely in the dark. It's part of the reason we get game 'demos' that aren't at all realistic because the devs have no idea what the final hardware specs are before reveal and completely overshoot the mark.
 
You're using retail prices in your comparison when I feel you should be using manufacturer pricing. They artificially increase the price on the higher-end GPUs because the market will pay it, so you don't get a realistic look at how much it costs Nvidia/AMD to deliver the dual GPU core vs a single beefier GPU core. That is the real cost-performance we should be looking for. Don't forget to factor in the increase in circuit board complexity and cooling complexity with dual GPU setups. In the end, that will be the deciding factor for what Sony/MS selects.

Well the discussion (wrongly) went towards dual GPUs for VR in general, so I did end up presenting the customer's PoV, not the manufacturer's.

Regardless, dual GPU (in this case, dual APU?) setups might not be bad at all for PS4K.

CrossfireX nowadays can get >90% scalability with 2 GPUs, and since XDMA the frame pacing issues are mostly gone. Get it to a console's highly optimized environment and perhaps you can get some steady 95% scalability.

And while this is admittedly worse than a larger monolithic APU for the reasons you stated, there's also the factor of "number of different custom chips being manufactured".
With a dual APU for PS4K, Sony could sell a cheaper PS4 Slim with one APU and the PS4K with two APUs. Making a larger contract for one chip will probably result in a better deal than two smaller contracts for two chips. Not to mention that R&D, yield optimization, transition for smaller nodes, etc. would be required for only one chip.
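
To put rough numbers on what that kind of scaling would mean in raw throughput, a quick sketch (the 95% figure is just my optimistic guess from above, and 1.84 TF is the stock PS4 GPU; none of this is a leaked spec):

Code:
# Effective GPU throughput for one vs. two PS4-class APUs.
# 1.84 TF is the stock PS4 GPU; the 95% scaling figure is the guess from above.

PS4_GPU_TFLOPS = 1.84
SECOND_APU_SCALING = 0.95  # assumed CrossfireX-style scaling in a closed, optimized box

single_apu = PS4_GPU_TFLOPS
dual_apu = PS4_GPU_TFLOPS * (1 + SECOND_APU_SCALING)

print("Single APU: %.2f TF" % single_apu)             # 1.84 TF
print("Dual APU (95%% scaling): %.2f TF" % dual_apu)  # ~3.59 TF

That lands close to the rumoured ~2X target without needing a brand new monolithic die.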
 
I imagine PS4K could be a more controlled, modest launch. The PS4 is selling nicely right now, and Sony needs full cooperation from and education of retailers to market a new, pricier SKU the way they want, alongside PSVR, etc. etc. So they may try new things, sharing more info with more people.

But I think that NeoGAF guy is not a retailer; retailers don't have to know its spec.
 
I still don't think that happens. Only maybe the senior execs in a retail chain will be informed, when negotiating purchase orders.
Right, and this happens after the announcement. In the case of PS4, Sony announced it in February 2013 but you couldn't place actual orders for day 1 release until July. I'm sure some places were accepting pre-orders earlier on the basis of first come, first served but nobody in retail knew if they were getting stock for sure, and how much, until months later. Amazon are always beautifully transparent about this. You can register interest for a product for alerts about availability but that's it.

The only time retail needs to know pricing before an announcement is for stealth launches, i.e. they announce at launch the product is available immediately.
 
I imagine PS4K could be a more controlled, modest launch. The PS4 is selling nicely right now, and Sony needs full cooperation from and education of retailers to market a new, pricier SKU the way they want, alongside PSVR, etc. etc. So they may try new things, sharing more info with more people.

But I think that NeoGAF guy is not a retailer; retailers don't have to know its spec.

Yeah, Sony could just be expecting this to sell gradually over time, slowly displacing the PS4.

PSVR - and, for that matter, VR as a whole - actually seems more risky. $400 to play what mostly seems like tech demos so far...

This could be the next "3D television" if they are not careful. They should release the tech when it's actually ready and the proper use cases have been worked out.
 
As someone with a dual-GPU setup, I can confirm this to be awfully untrue.
Do you happen to own a dual gpu system? Did you ever, at least for the past 2 years?

There are still all kinds of problems with multi-gpu on PC, from games that don't support it, UWA not supporting it, frame pacing (which can still be an issue), and rendering tech that doesn't play well with two separate cards with two separate memory pools. But we aren't here to talk about the PC.

John Carmack's own article - which you linked to but misunderstood wrt multi-gpu vs single for VR - even talks about some of the issues with multi-gpu in single display games.

So I guess it's my turn to play the "are you saying that JC is wrong?" card.

I don't get the dual gpu hate here. A pair of 290X is cheaper and gets consistently better results than a 980ti in VR and in games where crossfire is supported (which are all AAA games with demanding graphics.. which is where you'll need the second gpu anyway).
That's actual cost-performance, not some aficionado's theories about bus and interface.

Retail price for PC gamers buying GPUs is utterly irrelevant here. Sony won't be buying parts from Newegg.

There are additional costs and inefficiencies in going the multi-gpu route, which is why the market for single-board multi-gpu is almost non-existent, and only starts where single-gpu performance tops out. This is relevant to the discussion. Retail prices aren't.

Edit: if Sony were to go the multi-chip route there would have to be additional cost factors that they are taking into account. For example, economy of scale by using one chip for a popular low-end system and two for a niche high-end system (which seems unlikely but is possible). For a single model, with a chip made only for that system, a 4 TF console using two APUs would appear to be inferior in every way to a single chip.
 
I still don't think that happens. Only maybe the senior execs in a retail chain will be informed, when negotiating purchase orders. Devs and publishers are definitely in the dark. It's part of the reason we get game 'demos' that aren't at all realistic because the devs have no idea what the final hardware specs are before reveal and completely overshoot the mark.

Sony won't tell any outside parties the price.

But publishers have to get some idea or else they may not commit to developing for the new system, unless the marginal cost of developing for a roughly 2X PS4 system is so minimal that they'd do it without worrying too much about whether the new system would sell well. Presumably the PS4 games they had in the pipeline could be modified for minimal cost to support the new system or they could plan for a patch.
 
But publishers have to get some idea or else they may not commit to developing for the new system...
That's the thing though. Pubs end up committing to consoles with no idea what they'll cost, on faith alone. So for PS3, the assumption was, "It's the next PlayStation. It'll sell gangbusters. We need to be in on that." Then the price is announced and the pubs and devs are just as boggle-eyed as everyone else. Everyone who committed to developing for OVR did so on the belief that it'd sell for ~$300. The actual price wasn't known until finally announced, to the same shock among developers as consumers.

Sony only need say, "we've new dev kits for a PS4.5. Who wants one?" and some will take it up as a speculative venture. Or no-one will, but that doesn't matter as it's not really designed for new content yet. Once there's an install base, devs may target it. Or not. But a price doesn't need to be given to anyone when the brand is strong enough.
 
...

CrossfireX nowadays can get >90% scalability with 2 GPUs, and since XDMA the frame pacing issues are mostly gone. Get it to a console's highly optimized environment and perhaps you can get some steady 95% scalability.

...
Yes but the benchmarks are done using powerful desktop CPUs. Not sure the scaling is going to be similar using one laptop CPU.

Not if the beefed-up CPU has an impact on PCB layout, cooling solution design, packaging size and thus shipping and handling costs. We currently have no idea what "better CPU" means; it could mean the same 8-core Jaguar but overclocked, or an entirely new architecture like 8 Zen cores clocked at something like 3.2 GHz.

The problem is that Sony is forced to make the PS4K 100% BC with the PS4. It's the only way they won't alienate the current PS4 player base. I even think that Sony will impose cross compatibility on developers: all PS4K games (even VR games) will have to be playable on PS4, without conditions. It's the only way the whole thing can work in the long term (otherwise people won't be enticed to buy a PS5 the way they bought a PS4).
 
There are a number of reasons I consider it *plausible* that PS4K, as described in the various leaks, is powered by a multi-chip solution. Here they are in no particular order.

  1. Time frame - This seems to have come together pretty fast. Taking the existing chip that they had already been (and would already be) shrinking for the new process tech, even without the PS4K existing, and modifying it to accommodate working with another GPU/APU seems like it would be an easier job than designing a whole new chip.
  2. Performance target - 2X performance. 2X the same chip with overclocks all around to make up for the imperfect scaling of multi-gpu.
  3. Backwards Compatibility - Non-PS4K-aware games just work - the system sets the clocks to PS4 spec and doesn't expose the second chip. Other games can be patched to varying degrees according to developer commitment, anywhere from "I can run and benefit from PS4K clocks without breaking" to "We're going to throw some work at the 2nd GPU/APU since it's there" (see the sketch after the PCPer quote below).
  4. VR - As has been stated, VR workloads map relatively well to multiple GPUs.
  5. Economies of scale - There's bound to be a cost benefit to using the same chip in multiple designs and even more of one from using only that chip (2 APU scenario). There may even be chips that are unable to run at PS4K clocks that run fine at PS4 clocks. Being able to bin chips in this way may allow for a more aggressive clockspeed to be set for PS4K. Also, cheaper R&D going forward for enabling further shrinks.
  6. Radeon Technology Group are *really* into multi-gpu - As the pace of process improvements has slowed down, RTG have determined that the way to continue to improve graphics performance at the pace necessary to enable the fully immersive experiences they want to deliver is to make multi-gpu viable. A partnership with Sony to get multi-gpu into the PS4K would greatly incentivize developers to embrace this model. See the below bullet points from PCPer's interview with Raja Koduri @ AMD's Capsaicin event.
PCPer said:
  • Raja claimed that one of the reasons to launch the dual-Fiji card as the Radeon Pro Duo for developers rather than pure Radeon, aimed at gamers, was to “get past CrossFire.” He believes we are at an inflection point with APIs. Where previously you would abstract two GPUs to appear as a single to the game engine, with DX12 and Vulkan the problem is more complex than that as we have seen in testing with early titles like Ashes of the Singularity.

    But with the dual-Fiji product mostly developed and prepared, AMD was able to find a market between the enthusiast and the creator to target, and thus the Radeon Pro branding was born.

    Raja further expands on it, telling me that in order to make multi-GPU useful and productive for the next generation of APIs, getting multi-GPU hardware solutions in the hands of developers is crucial. He admitted that CrossFire in the past has had performance scaling concerns and compatibility issues, and that getting multi-GPU correct from the ground floor here is crucial.

  • With changes in Moore’s Law and the realities of process technology and processor construction, multi-GPU is going to be more important for the entire product stack, not just the extreme enthusiast crowd. Why? Because realities are dictating that GPU vendors build smaller, more power efficient GPUs, and to scale performance overall, multi-GPU solutions need to be efficient and plentiful. The “economics of the smaller die” are much better for AMD (and we assume NVIDIA) and by 2017-2019, this is the reality and will be how graphics performance will scale.

    Getting the software ecosystem going now is going to be crucial to ease into that standard.
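
Going back to point 3 in my list above, here's a very rough sketch of how the system software could gate the extra hardware per title. The mode names, clocks and the "PS4K aware" flag are all made up for illustration; this is just the shape of the idea, not anything from the leaks.

Code:
# Hypothetical sketch of point 3: unpatched games see a stock PS4,
# patched games opt in to higher clocks and the second GPU/APU.
# The PS4K numbers and the patch flag are invented for illustration.

from dataclasses import dataclass

@dataclass
class HardwareMode:
    gpu_clock_mhz: int
    cpu_clock_mhz: int
    second_apu_visible: bool

PS4_BASE  = HardwareMode(gpu_clock_mhz=800,  cpu_clock_mhz=1600, second_apu_visible=False)
PS4K_FULL = HardwareMode(gpu_clock_mhz=1000, cpu_clock_mhz=2100, second_apu_visible=True)

def select_mode(title_is_ps4k_aware: bool) -> HardwareMode:
    # Unpatched titles get exactly the PS4 spec; patched ones get the extras.
    return PS4K_FULL if title_is_ps4k_aware else PS4_BASE

print(select_mode(False))  # legacy title: PS4 clocks, second APU hidden
print(select_mode(True))   # patched title: higher clocks plus the second APU

Unpatched games would never see anything but a stock PS4, which is what would keep BC trivial.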
 