Retail won't know the price before investors. Investors won't know the price until Sony announce it. If retail knew the price before it was announced, it would leak.

Developers have to know its specs, and publishers/retailers have to know its estimated price to do their own sales estimates and planning for schedules and shelf space.
And if you should try and use your dual GPU for a single display then things end up somewhere between "somewhat worse" and "everything has gone to shit".
I don't get the dual GPU hate here. A pair of 290Xs is cheaper and gets consistently better results than a 980 Ti in VR and in games where CrossFire is supported (which is all AAA games with demanding graphics, which is where you'll need the second GPU anyway).

But none of that means that dual GPU is inherently better than single GPU, and it certainly isn't better if you take cost per unit of performance into account (dual GPU doubles up on bus, power, interface, and inefficiency).
I wonder if the smaller size could point towards it using HBM2 memory. That would cut down the pcb size appreciably, and help explain the supposed $499 price.
Probably still unlikely for cost reasons, as I presume Sony will want the hardware to at least break even.
Xbox 360 One II
Hmmm, still think HBM2 would be too expensive.
The only thing that's got me confused is the whole extra $100 for a better CPU. That seems like a lot relative to the overall cost just for the CPU. It has to be CPU + something else.
As someone with a dual-GPU setup, I can confirm this to be awfully untrue.
Do you happen to own a dual GPU system? Did you ever, at least in the past 2 years?
I don't get the dual GPU hate here. A pair of 290Xs is cheaper and gets consistently better results than a 980 Ti in VR and in games where CrossFire is supported (which is all AAA games with demanding graphics, which is where you'll need the second GPU anyway).
That's actual cost-performance, not some aficionado's theories about bus and interface.
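To make that cost-performance claim concrete, here's a hedged back-of-envelope comparison. The prices, frame rates, and scaling factor below are illustrative assumptions for the sake of arithmetic, not quoted benchmarks:

```python
# Hypothetical dollars-per-fps comparison. All prices and frame rates below
# are illustrative assumptions, not real benchmark figures.
def dollars_per_fps(price_usd: float, avg_fps: float) -> float:
    """Cost per average frame-per-second delivered."""
    return price_usd / avg_fps

# Assumed: two 290X cards at $300 each, ~90% CrossFire scaling on a 45 fps base.
pair_290x = dollars_per_fps(2 * 300.0, 45.0 * 1.9)
# Assumed: one 980 Ti at $650 averaging 60 fps in the same title.
single_980ti = dollars_per_fps(650.0, 60.0)

print(f"2x 290X:  ${pair_290x:.2f} per fps")
print(f"1x 980Ti: ${single_980ti:.2f} per fps")
```

Under these assumed numbers the dual-card setup comes out cheaper per frame, which is the shape of the argument being made; plug in real street prices and benchmarks to check it for any given title.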
You'd have thought, but it doesn't work that way. Devs/pubs work in the dark, and then get happy/sad when prices are announced. PS3 and OVR were priced way higher than devs were expecting and the impact on their potential customer base was a genuine concern.
I'd say they won't know the exact price beforehand, but if they're given a ballpark price they'd much appreciate it; for example, that it won't go higher than $499. As for the Oculus Rift, it had almost no shelf space for a while.

Retail won't know the price before investors. Investors won't know the price until Sony announce it. If retail knew the price before it was announced, it would leak.
You're using retail prices in your comparison when I feel you should be using manufacturer pricing. They artificially increase the price on the higher-end GPUs because the market will pay it, so you don't get a realistic look at how much it costs Nvidia/AMD to deliver a dual GPU core versus a single beefier GPU core. That is the real cost-performance we should be looking at. Don't forget to factor in the increase in circuit board complexity and cooling complexity with a dual GPU setup. In the end, that will be the deciding factor in what Sony/MS selects.
Right, and this happens after the announcement. In the case of the PS4, Sony announced it in February 2013 but you couldn't place actual orders for day 1 release until July. I'm sure some places were accepting pre-orders earlier on a first come, first served basis, but nobody in retail knew for sure whether they were getting stock, or how much, until months later. Amazon are always beautifully transparent about this: you can register interest in a product for alerts about availability, but that's it.

I still don't think that happens. Only maybe the senior execs in a retail chain will be informed, when negotiating purchase orders.
I imagine the PS4K can have a more controlled, modest launch. The PS4 is selling nicely right now, and Sony needs full cooperation from and education of retailers to market a new, pricier SKU the way they want alongside PSVR, etc. So they may try new things and share more info with more people.
But I think that NeoGAF guy is not a retailer, and retailers don't have to know its specs.
There seems to be a conflict between multiple leakers; apparently a launch this year is unlikely according to some sources, so I bet on an unveiling at a PlayStation Meeting in Feb 2017, with a release soon thereafter.

The only time retail needs to know pricing before an announcement is for stealth launches, i.e. when they announce at launch that the product is available immediately.
I still don't think that happens. Only maybe the senior execs in a retail chain will be informed, when negotiating purchase orders. Devs and publishers are definitely in the dark. It's part of the reason we get game 'demos' that aren't at all realistic because the devs have no idea what the final hardware specs are before reveal and completely overshoot the mark.
That's the thing though. Pubs end up committing to consoles with no idea what they'll cost, on faith alone. So for the PS3, the assumption was, "It's the next PlayStation. It'll sell gangbusters. We need to be in on that." Then the price is announced and the pubs and devs are just as boggle-eyed as everyone else. Everyone who committed to developing for OVR did so on the belief that it'd sell for ~$300. The actual price wasn't known until finally announced, to the same shock among developers as consumers.

But publishers have to get some idea, or else they may not commit to developing for the new system...
Yes, but those benchmarks are done using powerful desktop CPUs. Not sure the scaling is going to be similar with a laptop-class CPU...
CrossFireX nowadays can get >90% scaling with 2 GPUs, and since XDMA the frame pacing issues are mostly gone. In a console's highly optimized environment, perhaps you could get a steady 95% scaling.
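As a rough illustration of what those scaling percentages mean for frame rates (the base frame rate here is an assumption purely for the sake of arithmetic):

```python
# Effective throughput of a 2-GPU setup at a given scaling efficiency.
# `scaling` is the fraction of a full extra GPU the second card contributes.
def dual_gpu_fps(single_fps: float, scaling: float) -> float:
    return single_fps * (1.0 + scaling)

base = 45.0  # hypothetical single-GPU frame rate, not a benchmark result
print(dual_gpu_fps(base, 0.90))  # 90% scaling: roughly 85.5 fps
print(dual_gpu_fps(base, 0.95))  # 95% scaling: roughly 87.8 fps
```

The gap between 90% and 95% scaling is small in absolute frames; the bigger win of high scaling is that it keeps the second card from being dead weight at all.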
...
Not if the beefed-up CPU has an impact on PCB layout, cooling solution design, packaging size, and thus shipping and handling costs. We currently have no idea what "better CPU" means; it could mean the same 8-core Jaguar but overclocked, or an entirely new architecture like 8 Zen cores clocked at something like 3.2 GHz.
PCPer said:
- Raja claimed that one of the reasons to launch the dual-Fiji card as the Radeon Pro Duo for developers, rather than as a pure Radeon aimed at gamers, was to “get past CrossFire.” He believes we are at an inflection point with APIs. Where previously you would abstract two GPUs to appear as a single GPU to the game engine, with DX12 and Vulkan the problem is more complex than that, as we have seen in testing with early titles like Ashes of the Singularity.
But with the dual-Fiji product mostly developed and prepared, AMD was able to find a market between the enthusiast and the creator to target, and thus the Radeon Pro branding was born.
Raja further expands on it, telling me that in order to make multi-GPU useful and productive for the next generation of APIs, getting multi-GPU hardware solutions in the hands of developers is crucial. He admitted that CrossFire in the past has had performance scaling concerns and compatibility issues, and that getting multi-GPU correct from the ground floor here is crucial.
- With changes in Moore’s Law and the realities of process technology and processor construction, multi-GPU is going to be more important for the entire product stack, not just the extreme enthusiast crowd. Why? Because realities are dictating that GPU vendors build smaller, more power efficient GPUs, and to scale performance overall, multi-GPU solutions need to be efficient and plentiful. The “economics of the smaller die” are much better for AMD (and we assume NVIDIA) and by 2017-2019, this is the reality and will be how graphics performance will scale.
Getting the software ecosystem going now is going to be crucial to ease into that standard.
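The "economics of the smaller die" point comes down to yield: at a fixed defect density, the fraction of usable dies falls roughly exponentially with die area. A minimal sketch using the classic Poisson yield model; the defect density and die sizes below are illustrative assumptions, not figures from the article:

```python
import math

# Classic Poisson yield model: Y = exp(-A * D0), where A is die area in cm^2
# and D0 is defect density. Both numbers below are illustrative assumptions.
def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    return math.exp(-die_area_cm2 * defects_per_cm2)

d0 = 0.2  # assumed defects per cm^2 on a young process node
big_die = poisson_yield(6.0, d0)    # one hypothetical ~600 mm^2 GPU
small_die = poisson_yield(3.0, d0)  # one hypothetical ~300 mm^2 GPU

# Note: a working *pair* of small dies has the same yield as one big die
# (exp(-0.6)**2 == exp(-1.2)), but each failed small die wastes half as
# much wafer area, and good singles can still be sold as a lower SKU.
print(f"big die yield:   {big_die:.1%}")
print(f"small die yield: {small_die:.1%}")
```

Under these assumed numbers the small die yields well over half while the big die yields under a third, which is the pressure Raja is describing: build small efficient GPUs and scale performance by putting several of them to work.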