Predict: The Next Generation Console Tech

But my logic tells me that they can't put a 2+ TFLOPS GPU in alpha devkits and then swap it for a 1.5 TFLOPS GPU in the final devkit. Am I wrong?

Of course they can.
Alpha kits are there primarily for OS development and are an indicator of what a dev can expect, and something to experiment with for new techniques.
Devs will have a clear description of what to expect from the final hardware, and how it differs from the alpha kits. Some will do a better job of interpreting that than others.

Aside from that, even if you put a 4 TFLOP GPU in there, it would be trivial for MS or Sony to cripple it by disabling shaders or reducing clock rates in the BIOS they ship on the cards.

FWIW if MS and Sony are using AMD CPUs and AMD GPUs and I was developing a launch title, I'd probably write an abstraction layer and develop on Windows with the target specs in mind. The tools are better and you're not fighting alpha OSes continually. I'd probably still run on the devkits frequently, but I'd prefer to do my day-to-day work in a non-alpha environment.
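
(For what it's worth, a minimal sketch of the kind of abstraction layer I mean — purely illustrative C++, every name hypothetical; real backends would wrap D3D11 on the Windows side and the console SDK on the devkit side.)

[code]
// Purely illustrative: a thin rendering abstraction so day-to-day work can
// happen on Windows while the same game code runs against a devkit backend.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <memory>

struct BufferHandle { uint32_t id; };

class RenderDevice {
public:
    virtual ~RenderDevice() = default;
    virtual BufferHandle CreateVertexBuffer(const void* data, size_t bytes) = 0;
    virtual void Draw(BufferHandle vb, uint32_t vertexCount) = 0;
    virtual void Present() = 0;
};

// Stand-in backend so the sketch is self-contained; a hypothetical
// WindowsDevice or DevkitDevice would implement the same interface
// over the real APIs.
class NullDevice : public RenderDevice {
public:
    BufferHandle CreateVertexBuffer(const void*, size_t bytes) override {
        std::printf("created %zu-byte vertex buffer\n", bytes);
        return BufferHandle{ nextId_++ };
    }
    void Draw(BufferHandle vb, uint32_t vertexCount) override {
        std::printf("draw buffer %u with %u verts\n", vb.id, vertexCount);
    }
    void Present() override { std::printf("present\n"); }
private:
    uint32_t nextId_ = 1;
};

int main() {
    std::unique_ptr<RenderDevice> device = std::make_unique<NullDevice>();
    BufferHandle vb = device->CreateVertexBuffer(nullptr, 3 * 32);
    device->Draw(vb, 3);
    device->Present();
}
[/code]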
 

Then, even if the GPU model inside the dev kits (at least the one pictured) can reach 2+ TFLOPS, it could be a "downgraded" version?
 
For a GPU a console doesn't really have a different workload per se.
PC GPUs do tend to be backwards-looking, insofar as if you look at which paths through the silicon are fast, it's always the key titles from the last 12 months; titles like Rage or Crysis actually drive (or stagnate) development, but usually after they've shipped.
On a console, games will use the resources as they are presented, so there is scope to be more forwards-looking; it's hard to say if they actually will be.

The only interesting thing about the choice of card in the devkit is that it's not GCN. MS has to write a shader compiler, and it's easier to write one than two (one for the alpha kits and one for the final silicon).
Having said that, it's possible they couldn't source enough GCN-based cards when the dev kits shipped to developers. I've also heard a rumor of a late GPU change on the MS side, but I have no way to confirm it, so you can't even infer much from that.

Is it standard practice for MS to be "late" with console hardware, or am I wrong? I remember reading that the hardware engineers on the 360 were practically running around like mad to secure good deals with hardware companies and were running out of time to get everything started and in place. They made the deal with ATI in late summer 2003 AFAIK, so to change the GPU this late... I dunno, sounds like another crazy launch is ahead of them.
 
As an aside, I could imagine a scenario where one console had a clear 2x advantage in performance, and still couldn't differentiate itself significantly from the competition.

60fps vs 30fps in all games wouldn't draw any notice from the public?

Or how about 1280x720 vs 1920x1080 in all games?

Nobody would notice?



Talk like the above just encourages and emboldens this race to the bottom between Sony and MS.

"Well, if we don't need 2TFLOP and can get by with 1TF, let's go with 1!"
- competitor#1

"Hmm, if they're going with 1TF, we can get away with 500GF"
- competitor#2


Of course, there is a baseline which discourages a race all the way back to ps3/xb360 spec...

iPhone



I honestly can't believe that we are seriously debating whether or not a 2x increase over a competitor (at a time when exclusive development is near non-existent as a differentiator) would even be noticed, and/or the plausibility of a 120 mm² GPU die with a 25 mm² CPU being one of these competitors.

It's a joke, and I'm anxious to see that joke not be on the games industry.

However, I will laugh at the inevitable fail of whichever corporation decides to invest so little in their silicon budget under whatever guise they want to label it. Bottom line, such a pathetic effort in silicon investment isn't deserving of discussion (notice the flock of WiiU crickets?), much less purchase.
 
It is just as challenging to engineer something to be cool/quiet/small/reliable as it is to engineer a "beast".

There is a saying that goes something like this, I believe:
"Cheap, small, powerful - pick 2"
 
and somebody on here had a signature that said the GameCube had all 3 :p
 
Drop to 720p and there it is: Unreal Engine 4 on a Radeon 7770-level GPU.
It's that simple. The GPU could be slightly more powerful too, or have side memory.

Instead of rendering 1080p you could remove shader noise (i.e. have some shaders take four samples rather than just one), have 16x AF all over the place, then use the latest FXAA variant (if we're too cheap for MSAA; otherwise use that if you can), and 720p instead of 600p. This gives a leg up on current consoles; then it's about detail, texture resolution and content, content, content.
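
(To put rough numbers on that trade-off — simple arithmetic only, and the "600p" resolution below is just a stand-in since sub-HD resolutions varied:)

[code]
// Back-of-the-envelope pixel budget: rendering at 720p instead of 1080p frees
// roughly 2.25x per-pixel shading budget to spend on extra samples, AF and
// post-process AA. Illustrative arithmetic only; "600p" is a rough stand-in.
#include <cstdio>

int main() {
    const double px1080 = 1920.0 * 1080.0;  // 2,073,600 pixels
    const double px720  = 1280.0 * 720.0;   //   921,600 pixels
    const double px600  = 1024.0 * 600.0;   //   614,400 pixels (sub-HD stand-in)

    std::printf("1080p vs 720p budget ratio: %.2fx\n", px1080 / px720);  // ~2.25x
    std::printf("720p  vs 600p budget ratio: %.2fx\n", px720 / px600);   // ~1.50x
}
[/code]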
 
It is just as challenging to engineer something to be cool/quiet/small/reliable as it is to engineer a "beast".

There is a saying that goes something like this, I believe:
"Cheap, small, powerful - pick 2"
Is there any way to pick "even more powerful" and big? Because I couldn't care less if my console is 20-30% bigger than the "perfect size", volume-wise. As long as it has more grunt, I'm good. I don't really care about wattage either; 100W, 200W or 250W makes no difference to me ;)
 
Is it standard practice for MS to be "late" with console hardware, or am I wrong? I remember reading that the hardware engineers on the 360 were practically running around like mad to secure good deals with hardware companies and were running out of time to get everything started and in place. They made the deal with ATI in late summer 2003 AFAIK, so to change the GPU this late... I dunno, sounds like another crazy launch is ahead of them.

FWIW I don't believe this is true at all.
I remember having pretty much complete specs including things like memory latencies and instruction timings from the first time MS showed 360 to devs. The only thing that changed late was the memory size.

I heard the GPU change rumor back early in the year, the rumor I heard wasn't a radical change in direction either, just that they had decided to lock down the GPU feature set later so they could take advantage of some later developments. But it was 3rd hand and I have no way to verify it.
 
Is there any way to pick "even more powerful" and big? Because I couldn't care less if my console is 20-30% bigger than the "perfect size", volume-wise. As long as it has more grunt, I'm good. I don't really care about wattage either; 100W, 200W or 250W makes no difference to me ;)

Although it may not matter to the likes of us (believe me, I'm right there with you - that's why an i7/crossfired Pitcairns/etc 800W beast of a PC sits in my home theatre rack)

[Image: gaming HTPC]


Thing is - it matters to the mass market. What we want doesn't represent the mass market. And that's where this discussion sorta forks off into the other thread.
 
FWIW I don't believe this is true at all.
I remember having pretty much complete specs including things like memory latencies and instruction timings from the first time MS showed 360 to devs. The only thing that changed late was the memory size.

I heard the GPU change rumor back early in the year, the rumor I heard wasn't a radical change in direction either, just that they had decided to lock down the GPU feature set later so they could take advantage of some later developments. But it was 3rd hand and I have no way to verify it.
I was thinking of summer 2003, equivalent to last year's summer if the next gen comes out in late 2013. Nick Baker and the engineering team knew what they wanted, but didn't have the deal in place yet. ATI started working on the chip at that time, but they signed the deal in late summer (August). Obviously you worked there, so you know what the actual case was.

You think AMD didn't already have the GCN architecture firmly in place by mid-2011 for MS to go with it? It would be weird if they went with VLIW4/5 at that point...

Although it may not matter to the likes of us (believe me, I'm right there with you - that's why an i7/crossfired Pitcairns/etc 800W beast of a PC sits in my home theatre rack)



Thing is - it matters to the mass market. What we want doesn't represent the mass market. And that's where this discussion sorta forks off into the other thread.
I'm not sure that the size of the box is that big of a deal in Western countries; obviously I'm talking about something like the 360 ±20-30%, not something like a mini tower.

And the wattage thing... I have a lot of friends who are also into tech and games, and I've never heard of anyone ever looking at the box to see if the power draw is lower than their "standard". I could be wrong, of course; that's just my experience.
 
60fps vs 30fps in all games wouldn't draw any notice from the public?

Or how about 1280x720 vs 1920x1080 in all games?

Nobody would notice?

Getting off topic, but I didn't mean no-one would notice.
I'm thinking about the general populace: certainly a lot of people wouldn't "notice" 720p vs 1080p. 30 vs 60, perhaps, given the number of FPSes we see these days, but even there I'm inclined to think a lot of people would just skim over it.
Anyway, those are more likely limited by ROPs/bandwidth than shader ops.
I was thinking more in terms of 2x polygon counts, or shader flops.
At current geometry levels the former doesn't buy you much; it doesn't even increase the number of segments in a silhouette dramatically.
The latter I'd heavily gate on how practical it is to be ALU-limited in the shader, and I suspect 2x is more about subtleties than it is about different techniques.

As I said earlier I don't expect that sort of disparity because I expect both players to be aiming for similar power envelopes.
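
(A rough back-of-the-envelope on the silhouette point, under the assumption that segments along a silhouette scale roughly with the square root of triangle count for a uniform tessellation — illustrative numbers only:)

[code]
// Rough illustration of why 2x polygons doesn't buy much silhouette detail:
// for a roughly uniform tessellation, the number of segments along an edge
// scales about with sqrt(triangle count). Assumption-laden, illustrative only.
#include <cmath>
#include <cstdio>

int main() {
    const double baseTriangles = 100000.0;  // arbitrary baseline mesh
    const double doubledTriangles = 2.0 * baseTriangles;

    const double ratio = std::sqrt(doubledTriangles) / std::sqrt(baseTriangles);
    std::printf("2x triangles -> %.2fx silhouette segments\n", ratio);  // ~1.41x
}
[/code]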
 
VLIW4 to me looked strangely short-lived. Could it have been developed primarily for a console and used as a stopgap for the PC until GCN?
What is best suited for the typical console workload, VLIW4 or GCN/GCN SI?

Yes, it's short-lived. It was meant to live on a cancelled 32nm process if I'm not mistaken; the Radeon 6970 as it came out was sort of a backup plan. In the end it will be most significant in the Trinity APU, where it's a stop-gap until GCN 2.0 (or GCN 1.1 if you're cynical).

VLIW4 is a tweak on the VLIW5 architecture; it remains largely similar. We might joke that AMD's biggest job was updating its compiler.
A 5D vector was good for basic rendering operations you do over and over again, but with shaders getting longer, more instructions would readily fit in 4D, so switching to VLIW4 allowed better utilization.

GCN is a whole lot better for GPGPU and damn good for gaming, whatever cruft is added from the complexity doesn't matter much when we see the result.
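
(A toy way to see the utilization argument — made-up instruction mix, not real compiler statistics: if a typical long shader only offers around 3-4 independent scalar ops per bundle, a 4-wide bundle wastes fewer slots than a 5-wide one.)

[code]
// Toy utilization model for VLIW bundles: if the compiler can typically only
// pack 'k' independent scalar ops per bundle, a w-wide bundle achieves
// min(k, w) / w utilization. Made-up numbers, purely illustrative.
#include <algorithm>
#include <cstdio>

double utilization(double independentOpsPerBundle, double bundleWidth) {
    return std::min(independentOpsPerBundle, bundleWidth) / bundleWidth;
}

int main() {
    const double avgIndependentOps = 3.5;  // assumed average for longer shaders

    std::printf("VLIW5 slot utilization: %.1f%%\n",
                100.0 * utilization(avgIndependentOps, 5.0));  // 70.0%
    std::printf("VLIW4 slot utilization: %.1f%%\n",
                100.0 * utilization(avgIndependentOps, 4.0));  // 87.5%
}
[/code]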
 
Is there any way to pick "even more powerful" and big? Because I couldn't care less if my console is 20-30% bigger than the "perfect size", volume-wise. As long as it has more grunt, I'm good. I don't really care about wattage either; 100W, 200W or 250W makes no difference to me ;)
Bigger box + hotter cores == more material, more cost, more cooling, louder, higher price
I'm not sure that the size of the box is that big of a deal in Western countries; obviously I'm talking about something like the 360 ±20-30%, not something like a mini tower.

And the wattage thing... I have a lot of friends who are also into tech and games, and I've never heard of anyone ever looking at the box to see if the power draw is lower than their "standard". I could be wrong, of course; that's just my experience.
You may not, but there are laws in place in a number of countries now that limit the maximum wattage for such things as game consoles and set top boxes. Your next console _will_ be designed to abide by these laws, since you always design to the lowest common denominator. If those countries have a significant market, and the console maker wants to sell to that market, then _all_ their consoles will conform to the power envelope those countries require.
 
Bigger box + hotter cores == more material, more cost, more cooling, louder, higher price
That's why I said "bigger". Not necessarily a mid tower, something similar to the PS360 in size. I understand the costs go up and that some people may not like it, but that was only my point of view. I don't mind a bigger box.

You may not, but there are laws in place in a number of countries now that limit the maximum wattage for such things as game consoles and set top boxes. Your next console _will_ be designed to abide by these laws, since you always design to the lowest common denominator. If those countries have a significant market, and the console maker wants to sell to that market, then _all_ their consoles will conform to the power envelope those countries require.
I didn't know there was a law that limits the wattage of a console. I knew there was a limit when the console is on standby, but not on the entire console.
 
I didn't know there was a law that limits the wattage of a console. I knew there was a limit when the console is on standby, but not on the entire console.

AFAIK as of right now there isn't one in either the EU or the US; however, there has been a lot of attention recently on the amount of power wasted by game consoles, and I would be surprised if both territories weren't at least getting ready to propose legislation.
It's also possible the manufacturers might try to voluntarily adopt a standard in order to avoid legislation.

I'd be interested if someone knows of legislation that exists in another territory.
 
Here are a couple of charts for people to have a gander at:

[Chart: maximum power consumption]


CPUs were loaded with Prime95.

[Chart: power consumption under load]


Also remember that a 7800 GTX 512MB only consumes around 80-85W at full load, and as can be seen, the 7770 that I recommended is right in 7800 GTX territory.
 
AFAIK as of right now there isn't one in either the EU or the US; however, there has been a lot of attention recently on the amount of power wasted by game consoles, and I would be surprised if both territories weren't at least getting ready to propose legislation.
It's also possible the manufacturers might try to voluntarily adopt a standard in order to avoid legislation.

I'd be interested if someone knows of legislation that exists in another territory.
Well, if they go with ~200W and then the EU or US enacts a law that limits them to, say, 180W, there is nothing they can legally do about it. I'm sure manufacturers are not worried about upcoming laws; they are worried about losing money on a more powerful and bigger box. I understand that, and I would hate to see the console business crumble because of the huge losses that occur at the beginning of every generation.

Granted, this time MS has it much easier. The 360 will still be selling next year, probably in the region of ~10 million, and they have LIVE subscription money, so they will definitely be able to cover losses when they release the new console. Compare that to last gen, when they had big losses per system sold, OG Xbox losses, and then a few years of RROD. I would feel very comfortable in their position compared to the competitors come next year.
 
No, you have a picture of an alpha devkit.
Even if the part were selected to give an indication of performance, it could be underclocked, the BIOS could be updated to disable shaders, etc. etc.
The same goes for whatever is in the other alpha kits.

I think that's being overly cautious; the choice of GPUs seems quite reasonable for a next-gen console, it's not like they're GTX 680s or anything.

It seems quite reasonable to assume the 720 will have a GPU with similar capabilities.
And history would seem to bear out that alpha kits are, if anything, less capable than final hardware.
 

While I agree that it's a reasonable spec for a NG console, I wouldn't draw any conclusions from it.

The processors in the 360 alpha kits were MUCH faster (excluding FP code) than the final 360 processors, which bit a lot of teams in the ass even though it was obvious if you gave it any thought.

The 360 and the Xbox before it shipped with state-of-the-art GPUs: the Xbox near the transition to programmable shaders and the 360 at the transition to unified shaders, so not surprisingly the alpha kits had weaker GPUs. Again, I wouldn't call it a trend.

Note that neither Sony nor Nintendo has traditionally shipped alpha devkits of this type, but rather devkits based on early revisions of real silicon.
 