Thread splitting brought to you by cutting-edge undersampling, circa 2001.
Time circuits activate: 88GFLOPs time waaaarp
We've never gotten an official Wii U spec (nor, to my knowledge, did the Wii ever get a definitive spec sheet from Nintendo either), but 160 SPs makes the most sense based on what is known from die shots and the like. I'm not the person to discuss this with, as I can't differentiate between groups of transistors on a die shot and say "this is X, that is Y", but those arguing for 320 seem less convincing to me, as their arguments tend to devolve too quickly into statements of faith: 'but Nintendo wouldn't...', 'title X looks awesome and way better than title Y on 360/PS3', 'look at X' (I actually really want this to launch so I can 'look at X' and not CG or badly compressed YouTube footage), etc., etc.
X really does look like a great game. And a pretty great-looking one as well.
On the Wii U eShop you can find the highest-quality versions of the trailers available. While they aren't lossless, they're much, much cleaner than even the best YouTube upload, to the extent that you can see a lot of detail that gets compressed away on YouTube.
It's tough to know what to believe, honestly. There are people here and at GAF who seem pretty convinced that it's 160 SPUs based on looking at the die photo and comparing the density to other GPU die photos, but then you have Jim Morrison from Chipworks saying you shouldn't try to compare the chip to any off-the-shelf parts. Most of the blocks have yet to even be identified; I don't remember anyone conclusively identifying the ROPs or even the TMUs as an absolute. I was pretty confident in the 176 GFLOP theory as presented, but consider the counterargument: if a more modern 176 GFLOP GPU can outclass the older 240 GFLOP Xenos, then shouldn't the very modern 1.2 TFLOP GPU in the Xbox One absolutely murder the Xenos? It should be even more efficient than the Wii U's GPU, and six times the performance. Yet it's been acted as if a 352 GFLOP GPU should be able to do what the 1.2 TFLOP GPU powering the Xbox One can't.
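For reference, the arithmetic behind all these figures is just ALU lanes x 2 FLOPs (multiply-add) x clock. A minimal sketch, assuming the commonly cited unit counts and the ~550 MHz Wii U clock reported by the hacking scene (none of these inputs are official specs):

# Peak FP32 throughput: lanes x 2 FLOPs (multiply-add) x clock.
# All inputs are commonly cited figures, not official specs; the
# ~550 MHz Wii U clock in particular comes from the hacking scene.
def gflops(lanes, clock_ghz):
    return lanes * 2 * clock_ghz

print(gflops(160, 0.550))  # 176.0   Wii U under the 160 SP theory
print(gflops(320, 0.550))  # 352.0   Wii U under the 320 SP theory
print(gflops(240, 0.500))  # 240.0   Xenos: 48 units x 5 lanes at 500 MHz
print(gflops(768, 0.800))  # 1228.8  Xbox One's "1.2 TFLOP" (~1310 at the final 853 MHz)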
Not in this thread. In this thread, we don't care whether Wii U is a business success or not; it's solely about what the hardware is. Whatever power level Wii U needed to be a viable next-gen port target, it fell well short of it, and that's what matters.
Well, Wii U is deficient in almost every 360 port it runs that I know of. So by your same argument, if it's really 352 GFLOPs, shouldn't Wii U be "murdering" the 360 in straight ports?
If you ran straight 360/PS3 ports on XBO, it would murder them. Instead it's running "next-gen" versions, and sometimes struggling, but that's not the same thing.
Beyond that, you're just into diminishing returns to some extent. Nothing on either PS4 or XBO could be argued to "murder" the prior gen. Yet, at least.
After that, I feel it's safe to assume that the assistance from Cell and Xenon would be significant enough to quickly close the gap. Espresso has no real answer for what the VMX units on Xenon can do, and certainly not for what the SPEs on Cell can do.
I still think observing the X1 has some relevance. Do I think a 352 GFLOP GPU coupled with a CPU that has terrible SIMD performance could struggle to run code designed for a 240 GFLOP GPU coupled with a CPU with very good SIMD performance? Yes, I do. The difference is that the X1 has enough overhead. Even if we assume the Wii U is 352 GFLOPs, that still makes the X1 roughly four times stronger in shader performance, not to mention it has double the ROPs and TMUs, running at a much higher clock speed. After reading through tons of posts here yesterday, it seems one of the main reasons for the 160 SPU theory is poorly performing ports, but based on developer comments it seems more likely that the CPU is the culprit, coupled with immature dev kits and a lack of business incentive to really dig deep into the hardware.
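To put rough numbers on the fixed-function gap: fillrate scales linearly with unit count and clock. A sketch, with the caveat that the Wii U's 8 ROPs / 16 TMUs are themselves an unconfirmed die-shot reading, while Xbox One's 16 ROPs / 48 TMUs at 853 MHz are public:

# Pixel and texel fillrate: units x clock, in Gpix/s and Gtex/s.
# Wii U counts (8 ROPs / 16 TMUs) are an assumed die-shot reading,
# not a confirmed spec.
def gpix(rops, clock_ghz):
    return rops * clock_ghz

def gtex(tmus, clock_ghz):
    return tmus * clock_ghz

print(gpix(8, 0.550), gtex(16, 0.550))   # 4.4 Gpix/s, 8.8 Gtex/s   (Wii U, assumed)
print(gpix(16, 0.853), gtex(48, 0.853))  # ~13.6 Gpix/s, ~40.9 Gtex/s (Xbox One)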
My main concern was how the 160 SPU theory came to be.
Outside of third-party ports not performing up to par, no concrete evidence was ever discovered to say for sure what it is.
The idea that a GPU with 320 SPUs should murder the 360/PS3 is pretty presumptuous.
It just seems like, if that were true, then why isn't the X1 easily doing COD: Ghosts in 1080p? I think it's safe to say the X1 build was given more attention than the Wii U build, so let's not pretend it's because it's a launch game for the X1.
It primarily comes from the number of physical register banks. It's been pointed out so many times in this thread, even accompanied by annotated die shots of various VLIW5 GPUs across a range of process nodes from different fabs.
http://www.neogaf.com/forum/showpost.php?p=89466641&postcount=553
bgassassin said: The SDK says there are 192 threads (which I checked on after being informed about that post) and 32 ALUs. I gave EC a summarized version, if you must know. The 32 ALUs and VLIW5 were part of the summary.
32 x 5 = 160
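Spelled out, assuming the SDK's 32 ALUs are VLIW5 units and taking the ~550 MHz clock from the hacking scene:

# 32 VLIW5 units x 5 lanes = 160 SPs; at ~550 MHz that lands on
# exactly the 176 GFLOP figure debated upthread. The clock is the
# hacking-scene number, not an official spec.
alus, lanes, clock_ghz = 32, 5, 0.550
sps = alus * lanes
print(sps, sps * 2 * clock_ghz)  # 160 176.0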
.....I wish devs could talk more so we could better understand where the problems come from.
Yup, Nintendo NDAs must come with a 1,000-year serfdom clause or some such. It seems we knew everything about XB1 and PS4 months before launch, but we're still guessing on Wii U (even if I regard the guesses as 99% sure). I guess when you stand aside from the graphics wars, not addressing it at all is a better stance than 'justifying' your hardware choices. Would still love an in-depth DF-style interview with the hardware architects at Nintendo, though.