Ken Kutaragi's keynote speech @ ISSCC 2006

I recently had the opportunity to see part of a movie at 60fps (I think it was Spider-Man 2). And what a difference! At first I didn't like it because it reminded me of videotaped/live TV, which also runs at 60fps (although in fields), but after a couple of seconds of adjustment I could see how much more alive and "fresh" everything seemed.
60fps is very important, but there are a few games where it was a wise choice to go with 30 instead and have better effects; SotC is one such game.
30fps in a racing game, though, is completely unacceptable to me, no matter how much eye candy or motion blur it buys.
 
[Image: kutaragi_after.jpg]


Some coverage at Tech-On (Japanese):
http://techon.nikkeibp.co.jp/article/NEWS/20060207/113091/
http://techon.nikkeibp.co.jp/article/NEWS/20060207/113092/

According to Kutaragi, ISSCC is the conference he has visited most often since the 1980s, though lately he hadn't been able to attend as he'd been too busy. Giving the keynote speech there is apparently something for him to muse upon. Instead of the tutorial-like speeches often seen at ISSCC, this time he tried to focus on proposing a vision of the future. He suggested that presenting his vision at ISSCC may lead to collaboration with other disciplines, such as cybernetics/cyborg research in medical science.

Ken Kutaragi
http://techon.nikkeibp.co.jp/article/NEWS/20060207/113092/kutara700situmon.jpg
Next Computing
http://techon.nikkeibp.co.jp/article/NEWS/20060207/113091/kutaragi_Computing.jpg
Keywords for Future Computing
http://techon.nikkeibp.co.jp/article/NEWS/20060207/113091/kutaragi_keyword.jpg
Formula 1
[Image: kutaragi_Formula.jpg]

Final Word
[Image: kutaragi_tool.jpg]
 
Titanio said:
Non-local computation could be useful for perhaps something like physics in an online game - where you otherwise have each client doing the same calculations. Instead you could pass "events" or the like around the network, let a big "supercomputer" server do the common physics calculations, for example, and propagate the results around the network. IBM's server-side physics demo for Cell is a concrete example, I guess. It'd relieve the clients of processing to allow them to focus on other things, and the "supercomputer server" could handle a lot more to boot - you'd no longer be limited by what each client could handle on their own.
As I said, the only application for which it really makes sense to have non-local computing is games ... it's the only consumer application there is with an insatiable demand for computing power. For anything other than games it doesn't make much sense, and Cell is a nasty architecture to develop for.
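
A minimal sketch of the server-side physics idea quoted above, purely as an illustration (all the names here are made up for the example): one central simulation authority runs the common physics step once, and only the results are propagated to the clients, which are then free to spend their cycles elsewhere.

Code:
# Hypothetical sketch of "non-local" physics: the server owns the shared
# simulation, and clients only receive the results.
from dataclasses import dataclass, field

@dataclass
class Body:
    x: float   # position
    vx: float  # velocity

class Client:
    def receive(self, positions):
        # A real client would render these and spend its budget on other work.
        self.latest = positions

@dataclass
class PhysicsServer:
    bodies: list = field(default_factory=list)
    clients: list = field(default_factory=list)

    def step(self, dt):
        # The expensive, common calculation is done once, server-side...
        for b in self.bodies:
            b.x += b.vx * dt
        # ...and only the resulting state is propagated around the network.
        for c in self.clients:
            c.receive([b.x for b in self.bodies])

server = PhysicsServer(bodies=[Body(0.0, 1.0)], clients=[Client()])
server.step(1 / 60)                    # one 60 Hz simulation tick
print(server.clients[0].latest)        # [0.0166...]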
 
Phil said:
Anyone who has played two games side by side, one running at a constant 30 fps, the other at 60 fps, will notice that the difference is night and day. It's not that console gamers don't care - it's that the average console buyer is less tech-oriented than PC gamers and probably only notices differences in feel but can't put it down to framerate.

Yet when I play PC games I don't set the visual quality and resolution so that they're always running at >60fps with vsync...

Take the FEAR option that helps you judge system settings. With a GeForce 6 Ultra I had to drop below my LCD's native resolution, and settled for a low of 30-something fps and an average of about 50 or 60 (can't remember). On a console that would mean targeting 30fps.

I'd hate to see a situation where developers were forced to stick to 60fps, regardless of what they were trying to achieve in the game, because someone picked an arbitrary frame rate number as being "correct". Developers should be able to select a target frame rate based on what they want to achieve. And with real-time temporal blurring now becoming a real possibility, the situation is even less black and white than it was before.
 
Though I would hate to see 60fps options sacrificed for eye candy just because it makes for better promo shots in magazines. Low framerates make for bad games.

If the game is CPU bound, that might be awkward. If it's GPU limited to <60 fps, there should be the option for simpler shaders and such to elevate the framerate. And certainly, if I were in charge I'd mandate a smooth 30 fps minimum for games.
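
Just to put numbers on the trade-off being argued here, a quick back-of-the-envelope frame-budget calculation (nothing engine-specific, just arithmetic):

Code:
# Frame-time budgets behind the 30 vs 60 fps argument.
for fps in (30, 60):
    print(f"{fps} fps -> {1000.0 / fps:.2f} ms per frame")
# 30 fps -> 33.33 ms per frame
# 60 fps -> 16.67 ms per frame

Targeting 60fps leaves half the time per frame for shaders, physics and everything else, which is exactly the eye-candy trade-off being weighed above.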
 
Given the nature of the PC platform, it's unlikely you could prevent >=60fps games given the options normally available. Likewise, it's hard to see how you could prevent games that dipped below 30fps (all you could do is fiddle around with the minimum recommended specs on the back of the box and hope later driver revisions didn't negatively affect performance, I guess...).

I've certainly played games where the frame rate dropped below 30fps in some parts with the settings I'd chosen (Doom 3 on a 9600 Pro!), and it hasn't stopped me from really enjoying them. I think most PC gamers spend much of their gaming lives well below 60fps.

On consoles, I think frame rates are selected pretty naturally by what consumers choose. A stable frame rate is valuable of course, but given the success of games like Halo and Grand Theft Auto, it's clear that detail vs. frame rate is a balancing act and not something that should be forced on anyone.
 
Curious:

Kutaragi predicted the Cell processor connected to a broadband network could eventually dominate the market. "If you follow the course of computer history, you see how the quest for constant semiconductor scaling and the equal quest for ever higher performance have led to where in 2006 a single chip such as the Cell is one-sixth the size of the original two chips brought out in 1998," said Kutaragi.

Is he saying that Cell currently is 1/6 the size of the original PS2 chipset? (although that includes both EE + GS, it seems). Wouldn't that imply the PS3 chipset (Cell + RSX) is likely still smaller than the original PS2 chipset? I wonder how costs compare for PS2 at launch vs PS3 at launch...
 
Not sure if the chips were on a larger node when they were introduced in 1998 or what the deal was, but this is the evolution of the chip sizes from the PS2's launch forward:

[Image: SONY1306_PG_6.gif]
 
xbdestroya said:
Not sure if the chips were on a larger node when they were introduced in 1998 or what the deal was, but this is the evolution of the chip sizes from the PS2's launch forward:

I can't see the picture (work blocks it), but I can only imagine they didn't change the node between development and launch -- seems like too much extra work to do that.

If what he says is right, then Cell + RSX might actually be cheaper for Sony to make than EE + GS was originally (if Cell is 1/6th the size of EE + GS, then Cell + RSX should probably be at most 1/2 the size?). I didn't realize that EE + GS was actually really quite large at launch... and if that's the case, wouldn't it be logical to assume that PS3 might not actually be much different in price compared to the launch PS2 (assuming that BR eats up any money saved from Cell + RSX not taking up as much space as the PS2's chipset originally did at launch)?

We've constantly been assaulted by analysts saying PS3 will cost Sony 3 unborn children for each unit manufactured. But it seems the PS3 will probably cost about as much as the launch PS2 to manufacture (and will likely scale like PS2 did -- BR should drop in cost really fast, and the Cell/RSX shrinks to 65nm will be a rather nice savings). Not sure though; maybe he didn't mean what that quote said, but that's what it sounds like.
 
Well, if you can't see the picture you're missing a lot of context I'm afraid, but I'll lay it out for you.

Basically the EE at PS2's launch was 240mm^2, and the GS was 279mm^2. So I'm not sure where he got that 1/6 figure, to tell you the truth, unless a) the '98 evaluation dies were on a larger node or b) Cell is down to 65nm now.

Anyway, just pretending Kutaragi slipped and said 1/6th when he meant 1/3rd, I agree that in terms of the costs of the major chips, Cell and RSX should cost Sony a fair bit less than EE and GS did in the PS2 at launch. I've been saying this on and off for some time now actually, and I'd also point to the fact that these chips will enjoy production on 300mm wafers (will RSX at the beginning? - not sure) and the benefits of 'redundancy' on-die, whereas the EE and GS did not - a defect on the die and the whole thing would get tossed.
 
My (probably bad) math goes like this: 1st-gen EE + GS = 519mm^2, divide that by 86mm^2

and that = 1/6th?

Well, that's the math Ken is using, anyway!
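
Spelling that division out (the 86mm^2 figure is presumably the latest combined/shrunk EE+GS die from the chart posted above):

Code:
# Back-of-the-envelope check of the "one-sixth" figure using the numbers
# quoted in this thread: 240 + 279 mm^2 for the launch EE and GS, and
# 86 mm^2 presumably for the latest shrunk die from the chart above.
ee_launch, gs_launch = 240.0, 279.0
latest_die = 86.0
ratio = (ee_launch + gs_launch) / latest_die
print(f"{ee_launch + gs_launch:.0f} / {latest_die:.0f} = {ratio:.1f}x smaller")
# -> 519 / 86 = 6.0x smaller, i.e. roughly "one-sixth"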
 
mrdarko said:
My (probably bad) math goes like this: 1st-gen EE + GS = 519mm^2, divide that by 86mm^2

and that = 1/6th?

Well, that's the math Ken is using, anyway!


That's definitely true - but wasn't he supposedly referring to Cell? Who knows, maybe he was referring to the EE+GS and EETimes misunderstood what was being discussed.
 
xbdestroya said:
That's definitely true - but wasn't he supposedly referring to Cell? Who knows, maybe he was referring to the EE+GS and EETimes misunderstood what was being discussed.

Even though I read that quote, I assumed he was talking about EE+GS :oops:

Out of interest, what would the size of Cell be at 65nm?
 
mrdarko said:
Even though I read that quote, I assumed he was talking about EE+GS :oops:

Out of interest, what would the size of Cell be at 65nm?

Well, just assuming a clean 'halving' from 90nm to 65nm, we'll just say it would be roughly ~110mm^2. Of course I don't think that's exactly what it will be, but it should be in that range. Which would make even the Cell on 65nm roughly 1/5 of the original EE and GS, rather than 1/6. I don't know, let's just assume he was talking about the EE+GS rather than Cell, because honestly that's what makes the most sense at this point. And to boot, that would be in line with the subject matter of his speech after all.
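
Spelling out that 'clean halving' guess with ideal area scaling (area goes roughly with the square of the feature size; the ~220mm^2 starting point is just what the ~110mm^2 figure above implies for the 90nm Cell):

Code:
# Ideal 90nm -> 65nm shrink of Cell, using the ~220 mm^2 starting point
# implied by the "roughly ~110mm^2" halving above.
cell_90nm = 220.0                      # mm^2, assumed 90nm die size
scale = (65.0 / 90.0) ** 2             # ideal area scaling factor, ~0.52
cell_65nm = cell_90nm * scale
ee_gs_launch = 519.0                   # mm^2, launch EE + GS from earlier
print(f"ideal 65nm Cell ~ {cell_65nm:.0f} mm^2, "
      f"about 1/{ee_gs_launch / cell_65nm:.1f} of the launch EE+GS")
# -> ideal 65nm Cell ~ 115 mm^2, about 1/4.5 of the launch EE+GS

Real shrinks rarely hit the ideal factor, so 'roughly 1/5' sounds about right.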
 
xbdestroya said:
Well, just assuming a clean 'halving' from 90nm to 65nm, we'll just say it would be roughly ~110mm^2. Of course I don't think that's exactly what it will be, but it should be in that range. Which would make even the Cell on 65nm roughly 1/5 of the original EE and GS, rather than 1/6. I don't know, let's just assume he was talking about the EE+GS rather than Cell, because honestly that's what makes the most sense at this point. And to boot, that would be in line with the subject matter of his speech after all.

Yep, I agree.
 
"In the pursuit of reality through pixel-based technology, graphics on computer entertainment systems have reached the same level of quality as that of the latest movies," said Kutaragi, president and CEO, Sony Computer Entertainment (Tokyo).

He said that with a straight face? Wow. I thought the Japanese cared about their honor...
 
Ok, I think that comment of his would go best ignored rather than discussed, because we all know where the latter gets us. ;)
 
Laa-Yosh said:
He said that with a straight face? Wow. I thought the Japanese cared about their honor...

It's not the first time a videogames exec has said this ;)

But here, at least, it was merely a prop for his main argument. His main point is effectively that rendering quality is getting so high that how things behave and move is incredibly important now. At least, that seemed to be the context from the article.
 
Laa-Yosh said:
He said that with a straight face? Wow. I thought the Japanese cared about their honor...

Well he's a busy guy - maybe he's not been to the cinema in a while. A long while.
 