Ken Kutaragi's keynote speech @ ISSCC 2006

oooohhhh, scandal..

Ken K talked about chips at a chip conference, whatever next??

beer at the pub? shops at the shopping centre?? cars in a carpark? What is the world coming to??
 
...And a more level-headed article
http://www.eetimes.com/news/latest/showArticle.jhtml;?articleID=179100739
Kutaragi predicted the Cell processor connected to a broadband network could eventually dominate the market. "If you follow the course of computer history, you see how the quest for constant semiconductor scaling and the equal quest for ever higher performance have led to where in 2006 a single chip such as the Cell is one-sixth the size of the original two chips brought out in 1998," said Kutaragi.

"Improving system response changes the relationship between computers and humans," said Kutaragi, citing the example of real-time computing in Formula 1 racing cars packed with sensors and processors that are monitored by a supercomputer miles from the racetrack.

Real-time network computing would require the upgrading of network bottlenecks like servers and switches. Instead Kutaragi proposed forming a Cell-based supercomputer-like server as an application server accessed by many client systems. By incorporating Cell processors, real-time responsiveness could be secured by exchanging only a small number of objects.

Kutaragi noted that the "location-free" concept enables users to access their home terminal via the Internet. "By expanding this concept, it will also become possible to access computer entertainment systems in the home, from mobile clients, in real-time," he said. Thus, mobile clients become a remote display and interface for PC entertainment systems.

"Computer entertainment systems and applications are now leading this trend," said Kutaragi.
 
dskneo said:
that was a "video" frame rate in the future, not gaming frame rate... but I'm sure you know that :)
I remember reading somewhere that Sony's SXRD display can do a 120Hz refresh rate in the near future. So the future Ken Ken sees is pretty sweet, indeed. :)
 
The concept in and of itself isn't really helped by Cell though ... the rather arbitrary 128kb local storage makes developing applications in the form of "apulets" for network computing a bit of a pain. Plus, computation for anything except games is cheap - what's the point of doing it non-locally in the first place?
 
Since it's indirectly related to the 'vision' Kutaragi expressed in his keynote, here's that 'grid' network patent again Sony got cleared based on Cell computing.

Patent

Anyway, yes, that EDN article on the keynote is completely whack. That guy deserves to have wasted his day if he got up at 4am and went to ISSCC expecting PS3 information.
 
Shifty Geezer said:
They ought to mandate 60fps constant framerate. I don't want extra eye-candy at the expense of framerate. At the very least ensure every game has a fixed 60fps mode that loses some of the detail to keep the framerate up so we, the gamers, can decide which we prefer.

I could care less about 60fps, give me more eye candy any day. CoD2 is 60fps and it still stutters if you spin, and the GFX weren't great; Halo wasn't 60fps and it was the best FPS I've ever played, so why is framerate important?

As long as they can hit a solid 30fps, that's fine. I think PC gamers really need to realize that most console gamers don't care about 60fps vs 30fps. It's either good or it's bad, and a solid 30fps is just fine.
 
scooby_dooby said:
I could care less about 60fps, give me more eye candy any day.

No thanks. The impeccable smoothness and consistently silky speed of movement that 60fps brings to games is just as important to detail and immersiveness for me as any other form of "eye candy". It's literally twice as many frames being fed to your retina per second, after all. Either way you're sacrificing one element of detail for the other, and personally I'd much prefer the game world move and animate at a much more realistic clip than "look" a little more realistic at the expense of that. And it has a much bigger impact on the actual gameplay than any extra shading, polys or shadows (or whatever).

Again, just my preference on the matter. I concede many gamers prefer a game to look its best before it moves, animates and responds its best. But please don't assert that it's "just fine" for every gamer.
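The smoothness-vs-eye-candy trade-off above boils down to simple frame-time arithmetic. A quick sketch - the 1280 px/s pan speed is just an illustrative number, not taken from any game:

```python
# Frame time and per-frame motion step at 30 vs 60 fps.
def frame_time_ms(fps):
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

def pixels_per_frame(pan_speed_px_per_s, fps):
    """How far the image jumps between consecutive frames during a pan."""
    return pan_speed_px_per_s / fps

for fps in (30, 60):
    print(f"{fps} fps: {frame_time_ms(fps):.1f} ms/frame, "
          f"{pixels_per_frame(1280, fps):.1f} px jump at 1280 px/s")
```

Halving the frame time halves the size of every motion step, which is exactly the "silky movement" being argued for here.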

scooby_dooby said:
CoD2 is 60fps and it still stutters if you spin

I haven't played CoD2 on 360 yet, but if it stutters as you scroll horizontally then it obviously isn't a consistent 60fps.
 
one said:

I thought this was at least a relevant part of that article, as far as PS3 is concerned:

"In the pursuit of reality through pixel-based technology, graphics on computer entertainment systems have reached the same level of quality as that of the latest movies," said Kutaragi, president and CEO, Sony Computer Entertainment (Tokyo). Kutaragi touted the Cell processor as the means for generating high-quality natural motion for moving objects, just as "pixilation" was used to improve computer graphics.

Natural motion requires vast amounts of physics simulation. Without it, overall graphics would differ little from existing PC entertainment systems. The multicore Cell processor—developed by Sony, IBM and Toshiba—has achieved single precision floating-point calculation performance equivalent to a 200-plus-Gflop supercomputer.

Whilst I'd disagree about graphics reaching the quality of the latest movies, the overall point rings true. It seems SCE is pushing physics and motion in a big way with PS3 (as we've seen already in the "free" AGEIA/Havok licences with dev kits etc.), and that they perhaps see it as a unique strength for them.
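For what it's worth, the "200-plus-Gflop" figure in the article is easy to sanity-check. A back-of-envelope calculation, assuming all eight SPEs at 3.2 GHz each issue a 4-wide single-precision fused multiply-add per cycle (8 flops), and ignoring the PPE:

```python
# Peak single-precision throughput of Cell's SPE array (back of envelope).
clock_hz = 3.2e9        # SPE clock
num_spes = 8            # full chip; PS3 ships with one SPE disabled
flops_per_cycle = 8     # 4-wide SIMD FMA: 4 multiplies + 4 adds

peak_gflops = clock_hz * num_spes * flops_per_cycle / 1e9
print(f"Peak SPE throughput: {peak_gflops:.1f} Gflops")  # 204.8
```

That lands at 204.8 Gflops, which matches the "200-plus" wording in the keynote coverage.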

MfA said:
The concept in and of itself isn't really helped by Cell though ... the rather arbitrary 128kb local storage makes developing applications in the form of "apulets" for network computing a bit of a pain.

It's 256KB, whether that helps or not ;)

Non-local computation could be useful for perhaps something like physics in an online game, where you otherwise have each client doing the same calculations. Instead you could pass "events" or the like around the network, let a big "supercomputer" server do the common physics calculations, and propagate the results around the network. IBM's server-side physics demo for Cell is a concrete example, I guess. It'd relieve the clients of processing to let them focus on other things, and the "supercomputer" server could handle a lot more to boot - you'd no longer be limited by what each client could handle on its own.
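A toy sketch of that "pass events in, broadcast results out" idea. Everything here (class and method names, the 2D state) is made up purely for illustration - it's not a real Cell, PS3, or IBM API:

```python
# Clients send input events; one authoritative server integrates the
# shared world once and broadcasts the resulting state to everyone.
class PhysicsServer:
    def __init__(self):
        self.positions = {}    # object id -> (x, y)
        self.velocities = {}   # object id -> (vx, vy)

    def apply_event(self, obj_id, impulse):
        """Fold a client's input event into the object's velocity."""
        vx, vy = self.velocities.get(obj_id, (0.0, 0.0))
        self.velocities[obj_id] = (vx + impulse[0], vy + impulse[1])

    def step(self, dt):
        """One shared simulation step; clients never run this themselves."""
        for obj_id, (vx, vy) in self.velocities.items():
            x, y = self.positions.get(obj_id, (0.0, 0.0))
            self.positions[obj_id] = (x + vx * dt, y + vy * dt)
        return dict(self.positions)  # snapshot broadcast to all clients

server = PhysicsServer()
server.apply_event("ball", (2.0, 0.0))  # a client's input event arrives
snapshot = server.step(dt=1.0 / 60)     # authoritative result, sent back out
print(snapshot)
```

The point of the pattern is that only small events and snapshots cross the network, while the heavy integration happens once, centrally, instead of on every client.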
 
Titanio said:
(as we've seen already in the "free" AGEIA/Havok licences with dev kits etc.)
Aren't these only evaluation copies? I can't remember what Sony are providing for free and which are included just for people to nose at. Or was it Havok comes as standard, and UE3.0 was an evaluation copy?
 
Shifty Geezer said:
Aren't these only evaluation copies? I can't remember what Sony are providing for free and which are included just for people to nose at. Or was it Havok comes as standard, and UE3.0 was an evaluation copy?

I always thought UE3 was the evaluation copy, but Havok and AGEIA were both fully licensed for each kit? Maybe I'm wrong on that, perhaps a dev can clarify.

Reading the original release, for Havok SCEI obtained "sub-licensing rights" and talked about including it in every kit, with SCEI also providing their own support for it. The AGEIA agreement is similar, but additionally allows SCEI to engage with AGEIA in optimisation of their SDK for Cell.

Perhaps devs still have to pay a fee if they choose to use it, though, I'm not sure :? Either way, though, there's an obvious focus on physics middleware on the part of SCEI, and a more intimate involvement on their part than for other types of middleware.
 
Titanio said:
I always thought UE3 was the evaluation copy, but Havok and AGEIA were both fully licensed for each kit? Maybe I'm wrong on that, perhaps a dev can clarify.

Reading the original release, for Havok SCEI obtained "sub-licensing rights" and talked about including it in every kit, with SCEI also providing their own support for it. The AGEIA agreement is similar, but additionally allows SCEI to engage with AGEIA in optimisation of their SDK for Cell.

Perhaps devs still have to pay a fee if they choose to use it, though, I'm not sure :? Either way, though, there's an obvious focus on physics middleware on the part of SCEI, and a more intimate involvement on their part than for other types of middleware.
I'm quite sure I remember DeanoC (or maybe it was someone else) saying they're all evaluation copies; if they want the full programs, they have to pay.
 
london-boy said:
I'm quite sure I remember DeanoC (or maybe it was someone else) saying they're all evaluation copies; if they want the full programs, they have to pay.

If that's true, Deano's word is good enough for me. Cheers.
 
Would you accept, in movies, a framerate that feels the way 30fps does in games?
Now, we all know 30fps in movies is not the same as 30fps in games, so no need to argue that again.

A movie with jerkier movement than you've gotten used to (the jerkiness not being the cinematographer's intended artistic effect), but with some extra "eye candy"... you'd rather take that?

The fluidity of movement and animation is part of that "eye candy", and imo a much bigger part than some extra layer of shine. Games and movies are not just still shots, you know ;)
 
Anyone who has played two games side by side, one running at a constant 30fps and the other at 60fps, will notice that the difference is night and day. It's not that console gamers don't care - it's that the average console buyer is less tech-orientated than PC gamers and probably only notices a difference in feel without being able to put it down to framerate.
 
rabidrabbit said:
Would you accept, in movies, a framerate that feels the way 30fps does in games?
Now, we all know 30fps in movies is not the same as 30fps in games, so no need to argue that again.

A movie with jerkier movement than you've gotten used to (the jerkiness not being the cinematographer's intended artistic effect), but with some extra "eye candy"... you'd rather take that?

The fluidity of movement and animation is part of that "eye candy", and imo a much bigger part than some extra layer of shine. Games and movies are not just still shots, you know ;)


Movies already suffer from visible framerate-related stutter, best seen in the loooong panoramic movements in the LOTR trilogy. They do run at 24fps after all, and it really shows sometimes.
So I guess your argument doesn't really hold up, because we do put up with that already in movies. It's just that some people are more sensitive to it than others. Personally I always see it in movies, and although I don't make a problem out of it (there is nothing I can do to avoid it, so there's no point in complaining), I always wish movies were shot and displayed at 60fps.

Having said that, I wish everything were at 60fps; it would really make the experience so much better, especially in movies.
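To put rough numbers on why exactly those long pans judder, here's a quick illustration - the 10-second pan across a 1920-pixel frame is a made-up example, not measured from any film:

```python
# Per-frame image displacement during a slow full-frame pan.
pan_width_px = 1920   # hypothetical pan distance
pan_seconds = 10.0    # hypothetical pan duration

for fps in (24, 60):
    step_px = pan_width_px / (pan_seconds * fps)
    print(f"{fps} fps: image jumps {step_px:.1f} px between frames")
```

At 24fps every frame jumps 8 pixels; at 60fps it would be 3.2. Slow pans are the worst case because the eye tracks the motion smoothly while the image advances in discrete steps.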
 
Well, as I said, 30 (24) fps in movies and 30fps in games is a bit different.

I really don't know the technical details of that, but I am sure you agree that movies generally feel much, much smoother in motion than 30fps games.

So what I meant with my failed analogy was that, comparatively, I'm sure one would not accept jerkier movement in films the way they seem to accept jerkier movement in games.

Right... you do agree? ..... You DO AGREE!!!!

Edit: If you like, I could try a car analogy instead, if that would be more successful in getting my point into your head.
- Would you rather have roads that are bumpier and shinier, where you could not drive fast and comfortably, instead of even roads where you can drive at full speed? -
 
rabidrabbit said:
Well, as I said, 30 (24) fps in movies and 30fps in games is a bit different.

I really don't know the technical details of that, but I am sure you agree that movies generally feel much, much smoother in motion than 30fps games.

So what I meant with my failed analogy was that, comparatively, I'm sure one would not accept jerkier movement in films the way they seem to accept jerkier movement in games.

Right... you do agree? ..... You DO AGREE!!!!

I think people accept the jerky pans in movies because that's just the way they're made and there's nothing anyone is going to do about it for a long time. Plus, I'm not sure how many people actually notice the effect enough to get bothered by it.

In games there is a choice over what framerate to use, and gamers generally know more about framerates than normal people going to the movies.

24fps in movies is different from 30fps in games, fair enough, but on fast, long movements they look equally awful.
 
london-boy said:
I'm quite sure I remember DeanoC (or maybe it was someone else) saying they're all evaluation copies; if they want the full programs, they have to pay.

I remember them saying that UE3 was an evaluation copy while the other two were not (I mean DeanoC and Faf on these very forums, too). Hopefully I remember it correctly.
 
I would also think that things like motion blur help movies out, since blurred motion is less harsh and easier on the eye than discrete steps.
 