Predict: The Next Generation Console Tech

Yes. Saying they will have the CPU and GPU towards the back of the console like with Wii means the motherboard will look just like Wii. :rolleyes:
 
And if the suggested much more powerful CPU/GPU found in the Wii U happens to require more substantial heatsinks, that location might not be available. How many power traces and caps does it require? How big are the packages? If you want to put a V8 in your Volkswagen, you might need to think about taking out the back seat.
 
And if the suggested much more powerful CPU/GPU found in the Wii U happens to require more substantial heatsinks, that location might not be available. How many power traces and caps does it require? How big are the packages? If you want to put a V8 in your Volkswagen, you might need to think about taking out the back seat.

It's available. There's an important piece of info missing in this discussion you should see.

http://www.neogaf.com/forum/showpost.php?p=33584406&postcount=15709

The part that's spoiler'd. Of course since then I'm leaning more towards 32nm now. :p
 
They can handle it, but if you have to code to fit your data [in local cache memory] as if it were an SPE for it to reach near-theoretical performance, and the hardware is going to take more space and consume more energy for less performance... what's the point?

What type of performance? Single-precision floating point only. The point is that the more generalised cores can be used for a lot more, and with a lot less hand-crafting of code, to achieve similar performance.
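
To make the hand-crafting point concrete, here is a minimal C sketch (purely illustrative; the sizes and the kernel are assumptions, not anything from the thread) of the software-managed tiling an SPE-style 256 KB local store forces on you, where a general-purpose core's cache would do the same staging automatically:

```c
/* Illustrative sketch: stream a large array through a buffer sized like an
 * SPE local store. On Cell you would DMA each tile in and out by hand; on a
 * general-purpose core the cache does this for you. */
#include <stddef.h>
#include <string.h>

#define LOCAL_STORE_BYTES (256 * 1024)                      /* SPE-sized store */
#define TILE_ELEMS (LOCAL_STORE_BYTES / 2 / sizeof(float))  /* half for data   */

static float local_store[TILE_ELEMS];   /* stand-in for the local store */

void process_tiled(float *data, size_t n)
{
    for (size_t base = 0; base < n; base += TILE_ELEMS) {
        size_t len = (n - base < TILE_ELEMS) ? n - base : TILE_ELEMS;
        memcpy(local_store, data + base, len * sizeof(float)); /* "DMA in"  */
        for (size_t i = 0; i < len; i++)
            local_store[i] = local_store[i] * 2.0f + 1.0f;     /* compute   */
        memcpy(data + base, local_store, len * sizeof(float)); /* "DMA out" */
    }
}

int main(void)
{
    static float data[100000];
    process_tiled(data, sizeof data / sizeof data[0]);
    return 0;
}
```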

Latency is probably worse compared to XDR, though given predictability, with a similar coding style you could probably get around it (but if latency is part of the reason for XDR's use even in Cell, then it may affect how close one can come to peak even with similar approaches... someone with expertise in the area can probably clarify the issue).

I see no reason to believe that the latency of the XDR used in the PS3 would be lower than that of DDR3-1600. The following link gives an indication of GDDR3 latency compared with XDR and suggests they are similar:

http://forum.beyond3d.com/showthread.php?t=20930&highlight=latency+ps3

GDDR3 is based on DDR2 so I'd assume similar latencies (although I could be way off there), and the following link shows DDR3 at sub-1600MHz speeds offering lower measured latency than fast DDR2:

http://www.neoseeker.com/Articles/Hardware/Reviews/ddr2_vs_ddr3/8.html

This is hardly conclusive but it's more than enough to open up the question of which has lower latency.
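
Since the linked numbers are inconclusive, a pointer-chasing microbenchmark is the usual way to measure this directly on whatever hardware you have. A rough C sketch (buffer size, iteration count, and the use of clock() are arbitrary choices for illustration):

```c
/* Rough pointer-chasing latency sketch: each load depends on the previous
 * one, so total time / iterations approximates round-trip memory latency
 * once the chain is much larger than the caches. Illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (8u * 1024 * 1024)   /* 8M nodes * 8 bytes = 64 MB, past any cache */
#define ITERS 10000000L

int main(void)
{
    size_t *next = malloc(N * sizeof *next);
    size_t *perm = malloc(N * sizeof *perm);
    if (!next || !perm) return 1;

    /* Shuffle a permutation and link it into one big cycle, so the
     * hardware prefetcher can't guess the next address. */
    for (size_t i = 0; i < N; i++) perm[i] = i;
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % (i + 1);
        size_t t = perm[i]; perm[i] = perm[j]; perm[j] = t;
    }
    for (size_t i = 0; i < N; i++) next[perm[i]] = perm[(i + 1) % N];

    size_t p = perm[0];
    clock_t t0 = clock();
    for (long i = 0; i < ITERS; i++) p = next[p];   /* dependent load chain */
    clock_t t1 = clock();

    printf("~%.1f ns per dependent load (sink=%zu)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC * 1e9 / ITERS, p);
    free(next); free(perm);
    return 0;
}
```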

But regardless of that, your link talked of bandwidth, not latency, as a driving factor for hitting near your theoretical maximum, and the 3960X, as already established, has over double the bandwidth of the Cell as used in PS3. I'm sure there's far more to the picture than that, but you see my point?
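
For reference, the arithmetic behind that bandwidth comparison, assuming quad-channel DDR3-1600 on the 3960X and the PS3's 25.6 GB/s XDR interface (both taken as peak figures), works out to roughly 2x:

```c
/* Peak-bandwidth arithmetic behind the comparison above (assumed figures). */
#include <stdio.h>

int main(void)
{
    /* i7-3960X: quad-channel DDR3-1600, 8 bytes per channel per transfer */
    double i7_gbps   = 4 * 1600e6 * 8 / 1e9;   /* = 51.2 GB/s */
    /* PS3 Cell: XDR at 3.2 GT/s effective on a 64-bit total interface    */
    double cell_gbps = 3.2e9 * 8 / 1e9;        /* = 25.6 GB/s */
    printf("i7-3960X: %.1f GB/s, Cell/XDR: %.1f GB/s, ratio %.1fx\n",
           i7_gbps, cell_gbps, i7_gbps / cell_gbps);
    return 0;
}
```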

True, but if we take, say, 64 SPEs, which are near the limits of what some say can be handled with simple CPU bus designs, we get 1.64 TFLOPS for 64W. At 28nm the power would be even less. So sure, the i7 will get you 0.3 TFLOPS at double the energy consumption, but why settle for that?
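
For anyone checking that figure, 1.64 TFLOPS falls out of the usual PS3-era SPE numbers, assuming 8 single-precision FLOPs per cycle (a 4-wide FMA) at 3.2GHz:

```c
/* Where the 1.64 TFLOPS figure comes from (assumed PS3-era SPE numbers). */
#include <stdio.h>

int main(void)
{
    double per_spe = 8 * 3.2e9 / 1e9;        /* 25.6 GFLOPS per SPE */
    double total   = 64 * per_spe / 1000.0;  /* 64 SPEs, in TFLOPS  */
    printf("64 SPEs: %.2f TFLOPS (%.1f GFLOPS each)\n", total, per_spe);
    return 0;
}
```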

As I've mentioned previously, GFLOPS is not the only way to measure performance, especially not in a CPU. If that's all you're targeting, you may as well forget about putting a CPU in there altogether; just drop a GPU in its place and off you go.
 
200 watts is what I'm expecting, which was the power budget for PS3/XB360.

Running a 2048-ALU Tahiti GPU at 600MHz will result in a <160 watt budget, leaving plenty for a CPU.
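
As a back-of-envelope sanity check on that (all baseline figures assumed: a 7970-class Tahiti at roughly 250W board power, 925MHz, and ~1.17V, with dynamic power scaling about linearly in frequency and quadratically in voltage):

```c
/* Back-of-envelope check on the "<160 W" claim. All figures are assumed,
 * not measured: ~250 W board power at 925 MHz and ~1.17 V stock. */
#include <stdio.h>

int main(void)
{
    double p0 = 250.0, f0 = 925e6, v0 = 1.17;  /* assumed stock figures      */
    double f1 = 600e6, v1 = 1.00;              /* downclocked + undervolted  */
    double freq_only = p0 * (f1 / f0);                   /* linear in f      */
    double with_volt = freq_only * (v1 / v0) * (v1 / v0);/* quadratic in V   */
    printf("600 MHz, same voltage: ~%.0f W\n", freq_only);
    printf("600 MHz, undervolted:  ~%.0f W\n", with_volt);
    return 0;
}
```

Frequency scaling alone lands right around the claimed budget, and any undervolting at the lower clock pulls it well below; either way, the result is only as good as the assumed baseline.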

I'm sorry, but I do have to point this out because I see this all the time...

You are forgetting in your power budget:

Memory
Bluray
HDD
WiFi
Bluetooth
System Fan
And no doubt many other little bits and pieces that add up.

Take the current PS3 Slim. It consumes ~80 watts at the wall, yet I believe the Cell uses <20 watts and the RSX <25 watts. (Please correct me if wrong.)

Even allowing for PSU losses, a fair chunk of the power budget is taken up by everything else...
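
To put illustrative numbers on that (every figure below is a rough guess for the sake of the example, not a measurement), a simple tally with an assumed 85% PSU efficiency lands right around the ~80 watt wall figure:

```c
/* Illustrative power tally: component draws add up, and PSU losses mean
 * the wall figure is higher still. Every number here is a rough guess. */
#include <stdio.h>

int main(void)
{
    struct { const char *part; double watts; } budget[] = {
        { "Cell",      20.0 }, { "RSX",      25.0 },
        { "Memory",     8.0 }, { "Blu-ray",   5.0 },
        { "HDD",        3.0 }, { "WiFi/BT",   2.0 },
        { "Fan/misc",   5.0 },
    };
    double dc = 0.0, psu_eff = 0.85;   /* assumed PSU efficiency */
    for (int i = 0; i < (int)(sizeof budget / sizeof budget[0]); i++)
        dc += budget[i].watts;
    printf("DC total ~%.0f W -> ~%.0f W at the wall\n", dc, dc / psu_eff);
    return 0;
}
```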
 
It's available. There's an important piece of info missing in this discussion you should see.

http://www.neogaf.com/forum/showpost.php?p=33584406&postcount=15709

The part that's spoiler'd. Of course since then I'm leaning more towards 32nm now. :p

Why are you jumping to the conclusion that they are going with a smaller node on day one? As far as I can tell, you saw a piece of hardware. That means absolutely nothing to me, because all console manufacturers eventually die-shrink if it makes sense to do so, but that doesn't tell us when they'll do it. There are other ways to get around heat production, such as lower target specs. Was this the dev kit that devs are using for the final release? Too many unknowns for too absolute a conclusion about a piece of technology for which you are only making wild hopes.
 
Why are you jumping to the conclusion that they are going with a smaller node on day one? As far as I can tell, you saw a piece of hardware. That means absolutely nothing to me, because all console manufacturers eventually die-shrink if it makes sense to do so, but that doesn't tell us when they'll do it. There are other ways to get around heat production, such as lower target specs. Was this the dev kit that devs are using for the final release? Too many unknowns for too absolute a conclusion about a piece of technology for which you are only making wild hopes.

LOL @ wild hopes. There aren't that many unknowns, well at least from what I know.

Anyway, I've felt since well before seeing the dev kit that they were targeting a smaller process. Why put a hot 55nm chip in the kit when you have to underclock it just to run stable? You seriously think they did all that with the intention of it staying like that? How can you consider that as a possible option? That's not an unknown; it's common sense from console development that they wouldn't keep a 55nm GPU in there when they don't have to. Nintendo is the type that would go for the smallest stable process possible. At the same time, a 45nm CPU and a 32nm GPU would still allow them to do a future shrink if necessary.
 
Why are you jumping to the conclusion that they are going with a smaller node on day one? As far as I can tell, you saw a piece of hardware. That means absolutely nothing to me, because all console manufacturers eventually die-shrink if it makes sense to do so, but that doesn't tell us when they'll do it. There are other ways to get around heat production, such as lower target specs. Was this the dev kit that devs are using for the final release? Too many unknowns for too absolute a conclusion about a piece of technology for which you are only making wild hopes.

It makes sense but really only from the perspective that they want the system to be small and quiet.
 
You seriously think they did all that with the intention of it staying like that? How can you consider that as a possible option? That's not an unknown; it's common sense from console development that they wouldn't keep a 55nm GPU in there when they don't have to. Nintendo is the type that would go for the smallest stable process possible. At the same time, a 45nm CPU and a 32nm GPU would still allow them to do a future shrink if necessary.

Or they were simply waiting to see what the thermal characteristics of the custom part would be. They obviously didn't have it for E3, so they went with something they were hoping to target. They'll have plenty of time to nail down specs that make sense for the final unit.

Or they were waiting to see if the next process node would be available in time. As it is, 28nm is going to be costly for 2012, and up until late last year 28nm was in a world of uncertainty; only now is it barely becoming usable in other parts of the industry (the mobile sector).

Why bother targeting 45nm for the CPU if they're going to go with the next gen node for the GPU?

There are alternate explanations that don't make choosing 28nm a foregone conclusion.

How is 28nm even stable at this point? It is new. It is far from being a mature process. What makes you think 32nm is that much more widely available for their needs?
 
Does it have to be 45 or 28nm? 32nm exists at some foundries too; TSMC isn't the only thing out there.
 
Does it have to be 45 or 28nm? 32nm exists at some foundries too; TSMC isn't the only thing out there.

Considering AMD is designing the GPU, I'd be a little surprised if they designed for 32nm. I would expect 32nm to be more IBM's forte.
 
Well AMD have some experience on Global Foundries' 32nm process with Llano and now Trinity. And don't GloFo use IBM's 32nm process tech? There's probably a highly improbable hypothesis hiding in that somewhere ...
 
I'm sorry, but I do have to point this out because I see this all the time...

You are forgetting in your power budget:

Memory
Bluray
HDD
WiFi
Bluetooth
System Fan
And no doubt many other little bits and pieces that add up.

Take the current PS3 Slim. It consumes ~80 watts at the wall, yet I believe the Cell uses <20 watts and the RSX <25 watts. (Please correct me if wrong.)

Even allowing for PSU losses, a fair chunk of the power budget is taken up by everything else...

Not forgetting anything.

This was discussed pages ago.
 
Guys, don't worry: even if we get a custom 1200-ALU ps360, it will be 10 times the power. Of course not every metric will be 10x, but that will be made up for with more advanced parts.

If we get a 2000+ ALU part it will be VLIW5, with a conventional CPU, and I do think that is doable for 200-220W.
32 ROPs and 64 TMUs, based off a modified 6870 with some salvaged GCN parts, at 700MHz... it's gonna rock.
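
For what it's worth, the FLOPS arithmetic behind those ALU counts, assuming the usual 2 FLOPs (one MAD) per ALU per clock and taking Xenos at ~240 GFLOPS as the 360 baseline:

```c
/* FLOPS arithmetic behind the ALU counts above (assumes 2 FLOPs per ALU
 * per clock, i.e. one MAD, and Xenos at ~240 GFLOPS for comparison). */
#include <stdio.h>

static double gflops(int alus, double mhz) { return alus * 2 * mhz / 1000.0; }

int main(void)
{
    double xenos = 240.0;  /* Xbox 360 GPU, single precision */
    printf("1200 ALUs @ 700 MHz: %.0f GFLOPS (%.1fx Xenos)\n",
           gflops(1200, 700), gflops(1200, 700) / xenos);
    printf("2048 ALUs @ 700 MHz: %.0f GFLOPS (%.1fx Xenos)\n",
           gflops(2048, 700), gflops(2048, 700) / xenos);
    return 0;
}
```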
 
Or they were simply waiting to see what the thermal characteristics of the custom part would be. They obviously didn't have it for E3, so they went with something they were hoping to target. They'll have plenty of time to nail down specs that make sense for the final unit.

Or they were waiting to see if the next process node would be available in time. As it is, 28nm is going to be costly for 2012, and up until late last year 28nm was in a world of uncertainty; only now is it barely becoming usable in other parts of the industry (the mobile sector).

Why bother targeting 45nm for the CPU if they're going to go with the next gen node for the GPU?

There are alternate explanations that don't make choosing 28nm a foregone conclusion.

How is 28nm even stable at this point? It is new. It is far from being a mature process. What makes you think 32nm is that much more widely available for their needs?

Sure was a lot of talk about 28nm when I said 32nm. Other than that, no need to really discuss further. :smile:

Guys, don't worry: even if we get a custom 1200-ALU ps360, it will be 10 times the power. Of course not every metric will be 10x, but that will be made up for with more advanced parts.

This is similar to my view, but might seem disappointing to others.
 
I hope software gets better if hardware is taking a back seat this generation. That's the impression I'm getting, and I'm tired of just more shaders, higher resolutions, and more frames per second.

I want to see something exotic, like a ray tracing and rasterization hybrid engine.
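
For the curious, such a hybrid is usually sketched as rasterizing primary visibility and ray tracing only the secondary effects. A hypothetical C skeleton of the frame loop (structure and names invented for illustration; the heavy stages are stubbed):

```c
/* Hypothetical hybrid frame loop: rasterize primary visibility into a
 * G-buffer, then ray trace only the effects rasterization handles poorly.
 * Not any shipping engine; stages are stubbed for brevity. */
#include <stdbool.h>

typedef struct { float pos[3], normal[3], albedo[3]; bool hit; } GBufferTexel;
typedef struct { float origin[3], dir[3]; } Ray;

/* Stage 1: classic rasterization fills the G-buffer (stubbed here). */
void rasterize_gbuffer(GBufferTexel *gbuf, int w, int h)
{ (void)gbuf; (void)w; (void)h; }

/* Stage 2: ray traced shadow query against the scene (stubbed here). */
bool trace_shadow_ray(const Ray *r) { (void)r; return false; }

void render_frame(GBufferTexel *gbuf, int w, int h)
{
    rasterize_gbuffer(gbuf, w, h);               /* cheap primary visibility */
    for (int i = 0; i < w * h; i++) {
        if (!gbuf[i].hit) continue;
        Ray shadow = { { gbuf[i].pos[0], gbuf[i].pos[1], gbuf[i].pos[2] },
                       { 0.0f, 1.0f, 0.0f } };   /* toward an assumed light  */
        bool lit = !trace_shadow_ray(&shadow);   /* rays only where needed   */
        (void)lit;                               /* shading would go here    */
    }
}
```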
 
Or they were simply waiting to see what the thermal characteristics of the custom part would be. They obviously didn't have it for E3, so they went with something they were hoping to target. They'll have plenty of time to nail down specs that make sense for the final unit.

According to develop-online, Nintendo has some problems delivering on targets:

“I've heard [a project designer] complain it's underpowered compared to what Nintendo announced, resulting in people having to de-scale their plans,” the source added.

IGN might not be totally bullshitting about XboxNext power being 6x the 360 (Kotaku says 6-8x), but they are definitely overstating Wii U by quite monstrous amounts, leading to salty tears at E3.
 
According to develop-online, Nintendo has some problems delivering on targets:

“I've heard [a project designer] complain it's underpowered compared to what Nintendo announced, resulting in people having to de-scale their plans,” the source added.

IGN might not be totally bullshitting about XboxNext power being 6x the 360 (Kotaku says 6-8x), but they are definitely overstating Wii U by quite monstrous amounts, leading to salty tears at E3.

Whose? Yours? Since you are taking that out of context after all.

I think those who plan to buy a Wii U will be rather happy.
 
Or they were simply waiting to see what the thermal characteristics of the custom part would be. They obviously didn't have it for E3, so they went with something they were hoping to target. They'll have plenty of time to nail down specs that make sense for the final unit.

Or they were waiting to see if the next process node would be available in time. As it is, 28nm is going to be costly for 2012, and up until late last year 28nm was in a world of uncertainty; only now is it barely becoming usable in other parts of the industry (the mobile sector).

Why bother targeting 45nm for the CPU if they're going to go with the next gen node for the GPU?

There are alternate explanations that don't make choosing 28nm a foregone conclusion.

How is 28nm even stable at this point? It is new. It is far from being a mature process. What makes you think 32nm is that much more widely available for their needs?

Seems that both you and bsassassin are basically in agreement on why Nintendo would have the dev kits designed the way they are. In either case, rumor has it that the final dev kit is out and is apparently "different" and stronger in some ways compared to what Nintendo was targeting. So either Nintendo's custom parts did meet the targeted thermal reduction, or Nintendo resized the casing to get the specs desired.
 
I think those who plan to buy a Wii U will be rather happy.

Agreed.

I just don't know who that is.

Wii gamers? Is it still Wii-mote centric? Did they improve the motion interface to be as accurate as Move, or is it still the same issue-plagued Wii-mote?

Core gamers? I'm pretty sure most of them will be waiting for PS4/XB720 based on the specs known.

Casuals/moms/kids? Not sure if they'll stick around or if they're headed to Kinect/Facebook/tablets/phones.


I think that by coming in light on hardware they limit the available demographic interested in their wares. They will still be competing with the PS3 (w/Move) and XB360 (w/Kinect) for the duration of the Wii U's cycle.
 