Kutaragi talks CELL technology @ TGS2002

If Nintendo chooses to build another console with the same power as the PS3, they won't have any problems finding fab partners. Even if they don't go with IBM for their CPU design, there are other CPU manufacturers out there. All the fab companies will have sub-.01u process tech in a year or two anyway. Once that's available, all Nintendo needs to do is contract a CPU manufacturer to build a custom multi-core CPU. It doesn't have to be a single-core off-the-shelf part either. They can work with NEC, Hitachi, or Fujitsu AFAIC. All three of those companies make supercomputers, and those contain very high-performance single-core CPUs. Their experience in making these high-performance CPUs can EASILY be used by Nintendo to build a multi-core solution on sub-.01u fab tech. NEC and Hitachi have had 8 GFLOPS single-core CPUs for years, btw, and they're not even clocked very high. As a matter of fact, NEC has the highest-performing CPU in the world, clock for clock, in terms of GFLOPS. They also have embedded memory tech that could be used for each core in a multi-core CPU.

Bottom line is the tech will be there for Nintendo to use if they decide to use it. To think that IBM and SONY are the only companies out there that have the tech and fab abilities is very naive.
 
IMO any competent CPU company with extensive CPU experience can design a multi-core solution from the ground up using .01u fab tech.
 
V3 said:
but you can't predict player actions / movements. That was what I was getting at.

You don't actually predict player actions; in a real-time system, it practically does it by reflex :)

You can't predict the player's input, but you can predict the player's avatar's actions and movements, based on previous input and such. That's where statistics and probability should be put to good use.

But generally, for something like cloth simulation you don't really need that kind of thing. That kind of prediction is probably more important for online games and such.
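
Just to illustrate that point: a cloth step is fully determined by the current state and the input, so there is nothing to guess. A minimal sketch of a Verlet-style cloth update (the structure and constants here are illustrative, not any particular engine's code):

```cpp
#include <vector>

// One cloth particle: current and previous position (Verlet integration).
struct Particle {
    float x, y, z;     // current position
    float px, py, pz;  // previous position
};

// One deterministic simulation step: the new state follows entirely
// from the current and previous state plus gravity. No prediction of
// player behaviour is involved anywhere.
void verletStep(std::vector<Particle>& cloth, float dt) {
    const float g = -9.81f;  // gravity along y
    for (Particle& p : cloth) {
        float nx = 2.0f * p.x - p.px;
        float ny = 2.0f * p.y - p.py + g * dt * dt;
        float nz = 2.0f * p.z - p.pz;
        p.px = p.x; p.py = p.y; p.pz = p.z;
        p.x = nx; p.y = ny; p.z = nz;
    }
    // A real cloth sim would also relax the distance constraints
    // between neighbouring particles after this loop.
}
```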

Well, I am studying technical physics, so I am enjoying this argument quite a lot. Actually if you're really interested I could bring some maths into it. But back to the topic...

The problem with statistics / probability is that in order to use them you must ensure random behaviour. Players don't decide randomly. So even if player A has reacted in situation X with action L, it doesn't mean that player A will react with action L the next time the same situation comes up. What you could do is watch player A's actions over a long time, or watch a sample of N players over a much shorter time; then you could say, for example, that 55% attacked. But you can't actually predict what the next action IS.

Probability gives a general statement over a significant number of events. But you can't predict the outcome of the next event.

So let's get back to our game:
Player A is again in situation X. Our statistics show he's likely to take action L. That's why our game code precalculates the event, in order to save calculation time when he's attacking. But now player A takes action M, which is extremely unlikely and therefore not precalculated. So the game has to do it instantly. But we were precalculating and keeping statistics to save resources in order to use them for something else. That'd lead to non-fluid gameplay...
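
To make the resource trade-off concrete, here is a minimal sketch of the precalculation scheme being discussed, assuming a toy three-action table; all the names and numbers are illustrative, not a real engine:

```cpp
#include <array>

enum Action { ACTION_L, ACTION_M, ACTION_N, NUM_ACTIONS };

// Illustrative stubs: in a real game these would run the expensive
// physics / animation work for the given action.
void precompute(Action) { /* done during idle time, result cached */ }
void computeNow(Action) { /* full cost, lands on the critical path */ }

// Observed frequencies for player A in situation X (made-up numbers:
// action L seen 55 times, M 5 times, N 40 times).
std::array<int, NUM_ACTIONS> counts = {55, 5, 40};

Action mostLikely() {
    Action best = ACTION_L;
    for (int a = 1; a < NUM_ACTIONS; ++a)
        if (counts[a] > counts[best]) best = static_cast<Action>(a);
    return best;
}

// Guess the likely action and precompute it during spare time; when the
// guess is wrong (the unlikely action M), the full calculation lands on
// the critical path -- exactly the frame-time spike described above.
void handleSituationX(Action actual) {
    Action guess = mostLikely();
    precompute(guess);
    if (actual != guess)
        computeNow(actual);
    ++counts[actual];  // keep the statistics up to date
}
```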
 
All that stuff about supercomputers is really pointless. You will never see a game console which could in any way challenge a contemporary supercomputer. Sony is playing the hype game as they did with the EE, which at its release was certainly not more powerful than other mainstream uniprocessors, as was stated here. The K7 @ 1GHz and P3 @ 1GHz were released in March 2000, as was the PS2. Both of these offer roughly 2.5x-3x the usable real-world performance of the EE. And in my opinion it will be exactly the same upon the release of the PS3. It will be an impressive system (for its price at release), but will be around 2x slower than what is currently available on x86/x86-64 at that time.
 
PiNkY said:
All that stuff about supercomputers is really pointless. You will never see a game console which could in any way challenge a contemporary supercomputer. Sony is playing the hype game as they did with the EE, which at its release was certainly not more powerful than other mainstream uniprocessors, as was stated here. The K7 @ 1GHz and P3 @ 1GHz were released in March 2000, as was the PS2. Both of these offer roughly 2.5x-3x the usable real-world performance of the EE. And in my opinion it will be exactly the same upon the release of the PS3. It will be an impressive system (for its price at release), but will be around 2x slower than what is currently available on x86/x86-64 at that time.

If that's true, how come the PS2 has more geometry throughput than my Duron 1Ghz?

3x faster my arse.
 
If that's true, how come the PS2 has more geometry throughput than my Duron 1Ghz?

The PS2 is a fixed system, so any game you buy is specifically coded for that target hardware, which is never true for open platforms (the Xbox in a way *performs* much better than your average PIII 733 + 64MB + GF3, for example, even though from a hardware POV it doesn't differ too much from your home PC). Also, for vectorizable operations that 2.5x-3x figure would be more in the 1.5x to 2x range (if SSE / 3DNow! is utilised). But normal code is not just vectorized additions and multiplications, and there the EE and the P3/K7 do not differ too much in clock-normalized performance. Also, both of them (especially the K7) have a lot more and faster cache (especially the P3 @ 1GHz), which IMO is the EE's weak point.
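
For a rough idea of what "vectorizable" means in that 1.5x-2x figure, compare a scalar loop with the same multiply-add written with SSE intrinsics. This is only an illustrative sketch, not code from any of the benchmarks discussed; it assumes n is a multiple of 4 and 16-byte-aligned pointers:

```cpp
#include <xmmintrin.h>  // SSE intrinsics

// Scalar version: one float multiply-add per iteration.
void scaleAddScalar(float* out, const float* a, const float* b,
                    float k, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] * k + b[i];
}

// SSE version: four float multiply-adds per iteration. Code that can be
// rewritten this way is what narrows the gap to the EE's vector units.
void scaleAddSSE(float* out, const float* a, const float* b,
                 float k, int n) {
    __m128 vk = _mm_set1_ps(k);
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_load_ps(a + i);  // load 4 floats at once
        __m128 vb = _mm_load_ps(b + i);
        _mm_store_ps(out + i, _mm_add_ps(_mm_mul_ps(va, vk), vb));
    }
}
```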
 
Dude I know this. But he said "real world performance". In-game geometry = real world performance.
 
Best comparison I can come up with:

fp: POV-Ray (really the best-case PS2 scenario: image rendering, highly vectorized, etc...)

Time      PPS     System         CPU                      OS / Compiler                           Version   Date
00:00:24  616.67  PSX SDK Board  Emotion Engine 294 MHz   Linux 2.2.2, Sony/Toshiba PSX compiler  POV 3.01  Feb-2001
00:00:20  740.00  Custom         AMD Duron 700 @ 928 MHz  Mandrake Linux 8.1, gcc 3.03            POV 3.1   Feb-2002

int: SPECint2000

System                           CPUs  Base  Peak
SGI 2200 (300MHz R12K)           2     254   264
Intel OR840 (1GHz Pentium III)   1     438   442

I did not find any integer submissions for the EE, so I referred to an SGI workstation with two R12K MIPS cores. Each of those is an EE on steroids (32KB data + 32KB instruction L1 cache, !!8MB!! L2 cache...).

If you weight those 1/1, you come up with the P3/K7 (@1GHz) at roughly 1.7x the EE in a *very* PS2-friendly benchmark environment.
 
Again, we're meant to be talking REAL WORLD. Who renders POV-Ray images, or runs benchmarks, on their PS2?

Show me a game that runs at 60fps with the geometry detail of MGS2, GT3, or any number of PS2 games on a 1Ghz PC.
 
I also find the real-world comment completely strange. I have a 1GHz P3 machine with a GF2 video card, and the games on it look absolutely simplistic and slow down to the pits of hell compared to the best-looking PS2 games. That is not to mention upcoming PS2 games that look even more staggering. My PC looks like a dated piece of junk compared to the PS2 right now.
 
I agree, "real world" was not a fitting term if you just consider games. That was not what I wanted to point out; it was in regard to postings like...

the EE kills the uniprocessor approach.

If Sony used a 'uniprocessor' instead of the 'distributed' method of dual VUs, they'd never have attained the sheer power that they did in the EE for that timeframe. At that time, the nearest Intel processor was like a P2 at 300-400MHz.

which imply the EE seriously raised the bar for mainstream CPUs at its release. It sure did for consoles, though.
 
Show me a game that runs at 60fps with the geometry detail of MGS2, GT3, or any number of PS2 games on a 1Ghz PC.

GTA3 runs a hell of a lot faster, and looks better, on my GHz (T-Bird) PC than it does on the PS2. Not that I agree with the general line of argument, but you can certainly find examples if you look for them.
 
Show me a game that runs at 60fps with the geometry detail of MGS2, GT3, or any number of PS2 games on a 1Ghz PC.

Just some numbers from Dave's recent Gainward review for a PIII/733 (@640x480):

Max Payne:
PIII 733MHz 72.1

UT2003:
PIII 733MHz 73.7

RTCW:
PIII 733MHz 41.6

SS2:
PIII 733MHz 47.4
 
Gainward what? GF4?

GTA3 ran and looked like absolute ass on my 1GHz Tualatin P3 and GF2. Then again, it was a clunky RenderWare port. The PC GTA3 motion blur made me blind :(

zurich
 
Well, I am studying technical physics, so I am enjoying this argument quite a lot. Actually if you're really interested I could bring some maths into it. But back to the topic...

Ohh, so you are :) I'm not familiar with the branch of physics that technical physics is; is it applied physics of some sort?

The problem with statistics / probability is that in order to use them you must ensure random behaviour. Players don't decide randomly. So even if player A has reacted in situation X with action L, it doesn't mean that player A will react with action L the next time the same situation comes up. What you could do is watch player A's actions over a long time, or watch a sample of N players over a much shorter time; then you could say, for example, that 55% attacked. But you can't actually predict what the next action IS.

If you have some probability of the event occurring, and you pick one event out of the possibilities, you are basically predicting. Is it accurate? Well, that depends on the probability and the actual outcome :) It's called predicting after all; you can be right or wrong, but with statistics on your side, you hope you make the right decision.
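
As a toy illustration of that point, suppose the player attacks 55% of the time in situation X. Always guessing the most frequent action is a prediction that is right about 55% of the time: better than chance, never certain. (The numbers and setup here are made up for the sketch.)

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::bernoulli_distribution attacks(0.55);  // player attacks 55% of the time

    int hits = 0;
    const int trials = 100000;
    for (int i = 0; i < trials; ++i)
        if (attacks(rng))  // our fixed guess "attack" was correct
            ++hits;

    // Prints a hit rate near 55%.
    std::printf("prediction hit rate: %.1f%%\n", 100.0 * hits / trials);
}
```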

So let's get back to our game:
Player A is again in situation X. Our statistics show he's likely to take action L. That's why our game code precalculates the event, in order to save calculation time when he's attacking. But now player A takes action M, which is extremely unlikely and therefore not precalculated. So the game has to do it instantly. But we were precalculating and keeping statistics to save resources in order to use them for something else. That'd lead to non-fluid gameplay...

Hmm, we are talking about something over the network here, aren't we?

Anyway, if action M is very unlikely, that means this situation is also highly unlikely, so it should be alright. And player A will learn that what he did caused some non-fluid gameplay. And if it is deemed necessary, we should punish player A for taking action M in situation X, like by killing his avatar :)

Besides, weren't we talking about cloth simulation just a post or so ago? Simulating is quite different from predicting something.
 
If it were possible to use a remote distributed computing fabric for such computations at all, it would make more sense to keep all the computations there and simply make the console a dumb console.
 
BenSkywalker said:
Show me a game that runs at 60fps with the geometry detail of MGS2, GT3, or any number of PS2 games on a 1Ghz PC.

GTA3 runs a hell of a lot faster, and looks better, on my GHz (T-Bird) PC than it does on the PS2. Not that I agree with the general line of argument, but you can certainly find examples if you look for them.

Funny, I was going to mention that. GTA3 runs like an absolute dog on my 1Ghz Duron.
 