There's no way they're suddenly going to be like "oh, now we're going to put Zen in there" with 6 months to go. Maybe if it was discrete CPU and GPU, but it's an APU. These things take years to plan out.
Unless you're speculating about things other than Zen. I think a GPU up-flop is a distinct possibility, though hardly one to count on.
They still have to wait for Vega11, node improvements, etc. to get to 6 TF; I don't think they're gonna go any higher. Achieving that target in 2017 shouldn't be any easier than achieving 4.2 TF in 2016, quite the contrary.

Could they ship it sooner than expected? Like before summer? Like announcing it at E3 and then saying "available tomorrow!"?
Another way of examining it is looking at CPU-bottlenecked moments in multiplatform games that show a disparity greater than the 10% clockspeed difference (1.75 vs 1.6 GHz). Digital Foundry has done 50 or more comparisons of games, and of the ones I've seen, in those moments, such as the Fallout 4 first Deathclaw encounter, the Xbox had a 3-frame advantage over PS4, 29-30 vs 26-27, identical to the 10% clockspeed difference.

DDR3 is roughly the same latency as GDDR5, however. When I cross-referenced the SDK for the ESRAM latency, the rough latencies for memory came out close. More came out of the delay getting out of the cache and through the uncore.
What usually happens is that GDDR5 is tied to a GPU memory subsystem that heavily emphasizes bandwidth at the expense of taking more time to coalesce and re-order accesses. DDR3 is usually tied to a device like a CPU that cannot tolerate that kind of latency, and so the memory subsystem fires off accesses sooner.
The internal arrays of GDDR5 and DDR3 are physically similar, if not the same. The difference is more in the choice of how many arrays the interface reads at once, and in the signalling. The interface and distance traveled don't add that much, so the dominant factors are the DRAM arrays (not different) or the memory controller and hierarchy (independent of the memory device).
edit:
For reference, here's where I tried to derive some of the values:
https://forum.beyond3d.com/threads/...are-investigation.53537/page-407#post-1816112
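To put rough numbers on the point that the arrays themselves aren't where the difference comes from: converting CAS latency from cycles to nanoseconds, using illustrative round-number timings (assumed, not the actual console parts' datasheet values), both technologies land in the same ballpark:

```cpp
#include <cstdio>

// Back-of-the-envelope CAS latency comparison (illustrative numbers only,
// not the actual console memory timings).
// Absolute latency in ns = CAS cycles / command clock (MHz) * 1000.
int main() {
    // DDR3-2133-ish part: command clock ~1066 MHz, CAS ~14 cycles (assumed).
    double ddr3_ns  = 14.0 / 1066.0 * 1000.0;   // ~13 ns

    // GDDR5 at ~5.5 Gbps: command clock ~1375 MHz, CAS ~19 cycles (assumed).
    double gddr5_ns = 19.0 / 1375.0 * 1000.0;   // ~14 ns

    std::printf("DDR3  CAS latency: ~%.1f ns\n", ddr3_ns);
    std::printf("GDDR5 CAS latency: ~%.1f ns\n", gddr5_ns);
    // Both land in the low teens of nanoseconds; the differences show up in
    // the memory controller / uncore on top of this, not in the arrays.
    return 0;
}
```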
RAM latency is not an issue on a console; what matters is optimizing to avoid cache misses.
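Quick sanity check on the Fallout 4 numbers quoted further up (taking the midpoints of the quoted ranges): 1.75 GHz / 1.6 GHz ≈ 1.09, a roughly 9% clock advantage, while 29.5 fps / 26.5 fps ≈ 1.11, a roughly 11% framerate advantage, so in that CPU-bound scene the framerate gap tracks the clockspeed gap fairly closely.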
Well, I stand to be corrected here, but one of the major hurdles for consoles is the thermal envelope. If the technology just isn't there to keep the TDP low, it's hard to imagine that once you've hit that thermal envelope you'd try to jam more power in there; something needs to give, and that may very well be the number of operations the processing unit(s) can do.

Personally I hope for more than just a CPU upclock, but information to corroborate that is slim to none.
I was mostly quoting because the original tweet seemed to be about the whitepaper DF dissected; if his information was different and things have changed, we can't really infer anything from the DF contents with any confidence.
Looking at his subsequent tweets, they don't disagree with the DF report; did the specs shift, or has Microsoft just become more bullish about its device? Less talk of technical measures to help, and more sales talk: "native 4K", "pretty pixels", etc.
Is there something else between Jaguar and Zen for the console market? Or, if it's not Zen, then it's Jaguar, 100%?
There is Puma? I think it's been referred to as a bug-fixed Jaguar here.
Not sure how well that bodes with MS talking about half-rate animation (which is already used in some games); that can be very noticeable.
Perhaps if CPU is really that critical. Scorpio coming in at FL 12_2 would be huge. I would consider that a generation gap despite the modest power increase.

Frame interpolation should (should) make everything move more smoothly.
I could see a situation where you finish rendering a frame, and as you begin the next - to be delivered in 33 ms - you use compute to start work on an intermediate frame based on colour, depth and motion vectors from the last two or three frames. You deliver the interpolated frame after 16.7 ms, while continuing work on the next 'proper' frame, which lands 16.7 ms after that.
Given post processing and the usual heavy use of motion blur, this might make interpolated frames practically indistinguishable and give the appearance of full 60 fps.
Though input lag would still be that of a 30 fps game ... :S
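Something like this pacing, roughly; everything below is a stand-in to illustrate the scheme described above, not a real engine or GPU API:

```cpp
#include <cstdio>
#include <deque>

// Pacing sketch: real frames come every ~33.3 ms, and a compute-generated
// intermediate frame (built from the colour, depth and motion vectors of the
// last couple of real frames) is shown half way between them.

struct Frame { int id; };   // stand-in for colour + depth + motion vectors

Frame render_real_frame(int id) { return Frame{id}; }   // ~33 ms of GPU work

Frame make_intermediate_frame(const std::deque<Frame>& history) {
    // Async-compute style reprojection from the previous real frames'
    // motion vectors (hand-waved here).
    return Frame{history.back().id * 100 + 50};
}

void present(const Frame& f, double t_ms) {
    std::printf("present frame %4d at t = %5.1f ms\n", f.id, t_ms);
}

int main() {
    const double real_dt = 33.3;            // 30 fps cadence for 'proper' frames
    std::deque<Frame> history{render_real_frame(0)};
    present(history.back(), 0.0);

    for (int i = 1; i <= 3; ++i) {
        double t = i * real_dt;
        // While the GPU works on the next real frame, compute builds an
        // in-between frame from past frames; it goes out at the half point.
        present(make_intermediate_frame(history), t - real_dt * 0.5);

        Frame real = render_real_frame(i);  // the next 'proper' frame
        present(real, t);
        history.push_back(real);
        if (history.size() > 3) history.pop_front();
    }
    // Input is still sampled once per real frame, so controller latency stays
    // at the 30 fps level even though presentation looks like 60 fps.
    return 0;
}
```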
On the discussion on whether Scorpio is a new generation machine or not, Thomas Mahler from Moon Studios says that Scorpio is a brand new generation machine.
DF journalists are such little rascals. They admit the papers are 6 months old. Things have changed. See also how they describe Scorpio now compared to how they described it a month ago.

Digital Foundry got a hold of a whitepaper sent to developers about future Xbox One/Scorpio/PC development.
Some tidbits:
No ESRAM in Scorpio.
Reiterates that developers will *not* be able to develop Scorpio exclusives. Any title developed for Scorpio must also run on OG Xbox One and they go on to suggest how that can be achieved while also maximizing Scorpio.
Suggestions that developers find ways to run their CPU stuff at half rate (30 Hz) if they want to target 60 Hz, which *may* indicate the lack of a significant CPU upgrade in Scorpio (the usual shape of that pattern is sketched after this list).
Kinda need a good CPU for VR framerates, but it could also mean VR is kinda DOA.
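For the half-rate CPU suggestion, the usual shape of it is a fixed 30 Hz simulation step with the renderer blending between the last two sim states each 60 Hz frame. A rough sketch; names and numbers are mine, not from the whitepaper:

```cpp
#include <cstdio>

// 'Run CPU/game logic at 30 Hz while presenting at 60 Hz': a fixed 33.3 ms
// simulation step, with the renderer interpolating between the last two
// simulation states every 16.7 ms frame.

struct GameState { double player_x; };

GameState step_simulation(const GameState& s, double dt) {
    return GameState{s.player_x + 1.0 * dt};   // trivial stand-in for game logic
}

GameState lerp(const GameState& a, const GameState& b, double t) {
    return GameState{a.player_x + (b.player_x - a.player_x) * t};
}

int main() {
    const double sim_dt    = 1.0 / 30.0;   // CPU/game logic at 30 Hz
    const double render_dt = 1.0 / 60.0;   // presentation at 60 Hz

    GameState prev{0.0}, curr = step_simulation(prev, sim_dt);
    double accumulator = 0.0;

    for (int frame = 0; frame < 8; ++frame) {   // pretend vsync'd render loop
        accumulator += render_dt;
        while (accumulator >= sim_dt) {         // advance sim when a step is due
            prev = curr;
            curr = step_simulation(curr, sim_dt);
            accumulator -= sim_dt;
        }
        double alpha = accumulator / sim_dt;    // how far between sim states we are
        GameState visual = lerp(prev, curr, alpha);
        std::printf("render frame %d: player_x = %.3f (alpha %.2f)\n",
                    frame, visual.player_x, alpha);
    }
    return 0;
}
```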
FL 12_2? Getting lost now; is that a DirectX feature level and an optional tier?

It's a feature level; I'm not sure 'optional' is the right terminology. I would say the current standard is FL12_0 for features (XBO, PS4, 4Pro, GCN1/GCN2/GCN3, Nvidia < Maxwell), but that's effectively just DX11.2+ in terms of feature set.
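For what it's worth, a feature level is just a bundle of required hardware capabilities you can query from the runtime. On PC a minimal D3D12 check looks roughly like this (sketch, error handling omitted; a hypothetical 12_2 would just be another entry in the list):

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device at the lowest level we care about on the default adapter...
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // ...then ask the runtime which of these levels the hardware supports.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels        = sizeof(levels) / sizeof(levels[0]);
    info.pFeatureLevelsRequested = levels;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &info, sizeof(info));

    // e.g. 0xc100 == D3D_FEATURE_LEVEL_12_1
    std::printf("Max supported feature level: 0x%x\n",
                (unsigned)info.MaxSupportedFeatureLevel);
    return 0;
}
```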