bgassassin
Regular
Yes. Saying they will have the CPU and GPU towards the back of the console, like with Wii, means the motherboard will look just like Wii's.
And if the suggested much more powerful CPU/GPU in the Wii U happens to require more substantial heatsinks, that location might not be available. How many power traces and caps does it require? How big are the packages? If you want to put a V8 in your Volkswagen, you might need to think about taking out the back seat.
They can handle it, but if you have to code to fit your data [in local cache memory] as if it were an SPE just to get near theoretical peak, and the hardware takes more space and consumes more energy for less performance... what's the point?
Latency is probably worse compared to XDR, though given its predictability you could probably code around it with a similar style (but if latency is part of the reason for using XDR even in Cell, it may affect how close one can come to peak even with similar approaches... someone with expertise in the area can probably clarify the issue).
True, but if we take, say, 64 SPEs, which is near the limit of what some say can be handled with simple CPU bus designs, we get 1.64 TFLOPS for 64 W. At 28nm the power would be even less. So sure, the i7 will get you 0.3 TFLOPS at double the energy consumption, but why settle for that?
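For anyone wanting to check the 1.64 TFLOPS figure above, a quick sketch. It assumes each SPE peaks at 25.6 GFLOPS single precision (the commonly quoted Cell number: 4-wide SIMD fused multiply-add at 3.2 GHz) and roughly 1 W per SPE; both are rough public figures, not confirmed specs.

```python
# Back-of-envelope check of the 64-SPE claim.
# Assumed: 4 SIMD lanes * 2 FLOPs per FMA * 3.2 GHz = 25.6 GFLOPS per SPE.
SPE_GFLOPS = 4 * 2 * 3.2      # 25.6 GFLOPS, the usual quoted Cell SPE peak
WATTS_PER_SPE = 1             # rough assumption

spes = 64
total_tflops = spes * SPE_GFLOPS / 1000
total_watts = spes * WATTS_PER_SPE

print(f"{total_tflops:.2f} TFLOPS for {total_watts} W")  # 1.64 TFLOPS for 64 W
```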
200 watts is what I'm expecting, which was the power budget for PS3/Xbox 360.
Running a 2048-ALU Tahiti GPU at 600 MHz should result in a <160 W budget, leaving plenty for a CPU.
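A rough peak-FLOPS estimate for that downclocked GPU, assuming 2 FLOPs per ALU per clock (fused multiply-add), which is how AMD quotes peak numbers; the 200 W and 160 W figures are the budget assumptions from the posts above, not measured values.

```python
# Downclocked Tahiti-class GPU: peak FLOPS and remaining power budget.
alus = 2048
clock_ghz = 0.6                       # 600 MHz
peak_tflops = alus * 2 * clock_ghz / 1000   # 2 FLOPs/ALU/clock (FMA)
print(f"{peak_tflops:.2f} TFLOPS peak")     # 2.46 TFLOPS peak

# Headroom check against the assumed 200 W console budget:
console_budget_w = 200
gpu_budget_w = 160
print(f"{console_budget_w - gpu_budget_w} W left for the CPU and the rest")  # 40 W
```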
It's available. There's an important piece of info missing in this discussion you should see.
http://www.neogaf.com/forum/showpost.php?p=33584406&postcount=15709
The part that's spoiler'd. Of course, since then I've been leaning more towards 32nm.
Why are you jumping to the conclusion that they are going with a smaller node on day one? As far as I can tell, you saw a piece of hardware. That means absolutely nothing to me, because all console manufacturers eventually die-shrink if it makes sense to do so, but that doesn't tell us when they'll do it. There are other ways to get around heat production, such as lower target specs. Was this the devkit that devs are using for the final release? Too many unknowns for so absolute a conclusion about a piece of technology you're only making wild guesses about.
You seriously think they did all that with the intention of it staying like that? How can you consider that a possible option? That's not an unknown; it's common sense from console development that they wouldn't keep a 55nm GPU in there when they don't have to. Nintendo is the type to go for the smallest stable process possible. At the same time, a 45nm CPU and a 32nm GPU would still allow them to do a future shrink if necessary.
Does it have to be 45 or 28nm? 32nm exists at some foundries too; TSMC isn't the only thing out there.
I'm sorry, but I do have to point this out because I see this all the time...
You are forgetting in your power budget:
Memory
Bluray
HDD
WiFi
Bluetooth
System Fan
And no doubt many other little bits and pieces that add up.
Take the current PS3 Slim. It consumes ~80 watts at the wall, yet I believe the Cell uses <20 W and the RSX <25 W. (Please correct me if I'm wrong.)
Even allowing for PSU losses, a fair chunk of the power budget is taken up by everything else...
Or they were simply waiting to see what the thermal characteristics of the custom part would be. They obviously didn't have it for E3, so they went with something they were hoping to target. They'll have plenty of time to nail down specs that make sense for the final unit.
Or they were waiting to see if the next process node would be available in time. As it is, 28nm is going to be costly for 2012, and up until late last year 28nm was in a world of uncertainty; only now is it barely viable for use in other parts of the industry (the mobile sector).
Why bother targeting 45nm for the CPU if they're going to go with the next gen node for the GPU?
There are alternate explanations that don't make choosing 28nm a foregone conclusion.
How is 28nm even stable at this point? It is new. It is far from being a mature process. What makes you think 32nm is that much more widely available for their needs?
Guys, don't worry: even if we get a custom 1200-ALU PS360, it will be 10 times the power. Of course not every metric will be 10x, but that will be made up for with more advanced parts.
According to Develop Online, Nintendo is having some problems delivering on its targets:
“I've heard [a project designer] complain it's underpowered compared to what Nintendo announced, resulting in people having to de-scale their plans,” the source added.
IGN might not be totally bullshitting about XboxNext power being 6x the 360 (Kotaku says 6-8x), but they are definitely overstating Wii U by quite monstrous amounts, leading to salty tears at E3.
I think those who plan to buy a Wii U will be rather happy.