Ken Kutaragi Interview by Hiroshige Goto (PC Watch)

I love how every few years someone tries to reinvent WebTV and/or TiVo and everyone gets giddy. Or they're gonna revive the workstation market that's been dead or dying for 10 years. I just don't see a purpose for these machines outside of gaming. Judging from the past, anything more than gaming has turned into hype. People have already clustered PS2s and Xbox 1s, and it was nothing more than a "just to do it" type of thing. It's always been cheaper and more efficient to use off-the-shelf PC hardware.
 
Hey, look at the bright side then! At least one day in the future, that may no longer be true. I don't think anyone can honestly say that x86 will be "the answer" forever.
 
Here's a translation of the third part of the interview. Might as well have it here too.


The reason for not adopting a Cell-based graphics chip

Regarding the PS3 architecture, the surprise is that graphics were expected, as of last year, to be handled by a Cell-based architecture as well. Why did you not make a Cell-based GPU?

Kutaragi: The seven SPEs (Synergistic Processor Elements) of Cell can be used for graphics. In fact, several of the E3 demos, made back when there was no graphics processor yet, did all of their graphics rendering on Cell alone. But that kind of use is wasteful. There are more important things for Cell to be doing.

There was also a plan to put in two Cells and make one of them the center of graphics, but we dropped it because, as computing devices, Cell and shaders differ in weight and function. We wanted an architecture in which shaders specialized for graphics can do everything thoroughly. That said, some things, displacement mapping for example, can also be done on the SPEs.

Real-time 3D graphics so far has merely looked 3D; graphics that really exist in a 3D space are something different. Even so, at current resolutions that was good enough. The majority of the games coming out for Xbox 360 are still that kind of 3D at the present point in time.

But I want to make 3D in which deformation and the like are reflected faithfully in the 3D space. For that, there is the idea that the CPU and GPU should share data as much as possible, and that is why we took the current architecture. Ideally, the floating-point units of the GPU and Cell would be made entirely identical, from precision down to rounding errors. This time they have come close, almost the same. Therefore the results of computation can be used bidirectionally between the two.
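
(To make the rounding point concrete: a minimal C sketch, my own illustration rather than anything from the interview, of how two units that round at different precisions drift apart when the same arithmetic is shared between them.)

```c
#include <stdio.h>

/* Illustration only: accumulate the same series at two precisions.
   If a CPU and a GPU round differently, results shared between them
   diverge in exactly this way. */
int main(void)
{
    float  sum_f = 0.0f;   /* stand-in for a unit rounding to 32 bits */
    double sum_d = 0.0;    /* stand-in for a unit rounding differently */

    for (int i = 1; i <= 1000000; i++) {
        sum_f += 1.0f / (float)i;
        sum_d += 1.0  / (double)i;
    }

    /* The two "processors" no longer agree on the shared value. */
    printf("32-bit rounding: %.7f\n", sum_f);
    printf("64-bit rounding: %.7f\n", (float)sum_d);
    printf("difference:      %.7f\n", (float)sum_d - sum_f);
    return 0;
}
```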

- eDRAM was dropped because of full HDTV

We had expected eDRAM (DRAM integrated on the chip as graphics memory), but hearing about the two HDTV screens, the reason for not using eDRAM becomes understandable.

Kutaragi: Originally the GPU was to have no graphics memory of its own; through Redwood (the high-speed interface that connects Cell and RSX) it would also reach YDRAM (the code name of XDR DRAM). Memory would have been unified in YDRAM.

But if you go that way, there is a problem: when the shaders perform the fixed-form processing of graphics, reaching out to distant memory wastes bandwidth and cycle time. There is no need for Cell to go out of its way to spend memory bandwidth on fixed-form tasks of that kind; the shaders, on the other hand, compute in tremendous quantity, so memory has to be right there. In particular, with full HDTV at 2k×1k (1,920×1,080 dots) progressive, once you want to handle two screens or more, massive VRAM becomes necessary.

When that happens, eDRAM is unreasonable. Using eDRAM was fine in the PS2 era, but now it would not be enough for even two screens. Suppose we had put enough eDRAM to support HDTV into the die, 200 or 300 square millimeters of it. The logic area that can be loaded onto the chip would immediately shrink, and the number of shaders would decrease. Rather than that, it is better to use the die fully for logic and place a large quantity of shaders.
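
(A rough check of that figure, my own arithmetic rather than anything from the interview, assuming an eDRAM density of about 8 mm² per MB at 90 nm, which is hypothetical but in the neighborhood of eDRAM parts of the period:)

```c
#include <stdio.h>

int main(void)
{
    /* Two full-HD screens, each keeping a back buffer and a Z-buffer
       on-chip (the front buffer is scanned out of external memory),
       all at 4 bytes per pixel. */
    const long width = 1920, height = 1080;
    const long bytes_per_pixel = 4;
    const int  screens = 2;
    const int  surfaces_per_screen = 2;    /* back buffer + Z-buffer */

    long bytes = width * height * bytes_per_pixel
               * screens * surfaces_per_screen;
    double mb = bytes / (1024.0 * 1024.0);

    /* Assumed density: ~8 mm^2 per MB of 90 nm eDRAM. */
    const double mm2_per_mb = 8.0;

    printf("on-chip memory needed: %.1f MB\n", mb);                /* ~31.6 */
    printf("eDRAM area at 90 nm:   ~%.0f mm^2\n", mb * mm2_per_mb); /* ~253 */
    return 0;
}
```

That lands right in Kutaragi's 200-300 mm² range before a single shader has been placed.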

- A vision of the ideal processor, shared with NVIDIA

Why did you team up with NVIDIA, of all the GPU vendors, in the first place?

Kutaragi: Until now we did the graphics for computer entertainment ourselves, together with Toshiba, carrying it through to the finish, including even the process technology. This time we teamed up with NVIDIA in order to pursue the computer itself.

If you pursue PC graphics to the finish, Intel does the processor, and it is NVIDIA above all that does the programmable shader. NVIDIA pursues function and performance as a processor, and its developers, including David Kirk (NVIDIA's Chief Scientist), are alumni of various computer companies such as SGI. They have the character of not worrying about things like chip size, never giving up on what they want to do, and pursuing it. Occasionally they overdo it, but their culture has been similar to mine.

NVIDIA's approach and my approach agree on the point that, in the end, both pursue the fully programmable processor. I have had good opportunities to talk with Jensen (Jen-Hsun Huang, NVIDIA's president and CEO) and David, and whenever we do, the talk turns to what the ideal processor would be. The ideal, naturally, is a processor that surpasses the present PC, indeed all present processors.

They are heading steadily in that direction, and in that sense they share our vision. We share road maps as well. Moreover, they have taken influence from our architecture too. We knew each other's temperament, we believed we wanted to do the same thing, and so we teamed up with NVIDIA.

Another element is that displays are moving to fixed-pixel systems (LCDs and the like). Once displays become fixed-pixel, there comes a point where TV and PC all fuse together. Therefore we want to support everything perfectly.

So we provide downward compatibility with PS as well, and from dirty legacy graphics up to the latest shaders, we want to support it all. As for resolution, we want to output decisively above WSXGA. For that, rather than building it all from scratch ourselves, it is faster to buy everything from NVIDIA.

Microsoft, with the Xbox 360, took a Unified Shader architecture for its GPU from ATI, one that is advanced in terms of programmability.

Kutaragi: ATI's architecture, in which vertex shaders and pixel shaders are made equal and shared, looks good at first glance, but I think it is difficult. For example, where do you put the results of vertex processing, and how do you stream them back into the shaders again for pixel processing? The moment anything clogs anywhere, the whole thing stalls. The reality is quite different from what gets drawn in the diagrams. If you think about realistic performance, I think NVIDIA's approach is superior.
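
(The stall argument can be made concrete with a toy simulation, entirely my own illustration and not a description of ATI's or NVIDIA's actual design: two pipeline stages coupled through a fixed-size buffer, where slow draining downstream backs up and stalls the front end.)

```c
#include <stdio.h>

/* Toy model: the vertex front end feeds post-transform vertices into a
   fixed-size buffer; the pixel side drains it. In a pixel-heavy phase
   the buffer fills, and the front end must stall. */
#define BUF_CAP 8

int main(void)
{
    int buffered = 0;   /* post-transform vertices waiting for pixel work */
    int stalls   = 0;

    for (int cycle = 0; cycle < 32; cycle++) {
        /* Pixel-heavy phase: only one vertex consumed every 4 cycles. */
        if (cycle % 4 == 0 && buffered > 0)
            buffered--;

        /* The front end wants to issue two vertices per cycle. */
        if (buffered + 2 <= BUF_CAP)
            buffered += 2;
        else
            stalls++;   /* buffer full: everything upstream waits */
    }

    printf("vertex-issue stalls in 32 cycles: %d\n", stalls);
    return 0;
}
```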

- Compatibility is maintained with a combination of hardware and software

Is compatibility with past PlayStations realized in hardware?

Kutaragi: We take it with a combination of hardware and software. We did try out how far it could go with software alone, but what matters is driving through to perfectly complete compatibility.

People who develop software do unexpected things that you simply cannot imagine. For example, code that is not logical as a program, but happened to work by accident. It runs, but it runs for a completely different reason than the one the author believes. Even in our own tests, code turns up that makes you ask, "what on earth is this code?"

We cannot refuse to take compatibility for code like that either. It is a little painful, but precisely because there is no logic in it, taking compatibility with software alone is hard, and there are times when hardware becomes necessary. That said, this time (with PS3) there is power to spare, so in certain places we can handle in software what used to require hardware.


When the CPU-side code is emulated in software, what happens with the endianness of the CPU?

Kutaragi: That is exactly why Cell is bi-endian. It can go either way.
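
(For context, my own sketch rather than anything from the interview: when the host CPU of an emulator uses a different byte order from the guest CPU it emulates, every multi-byte load and store has to be byte-swapped in software, which is precisely the overhead a bi-endian CPU avoids by running in the guest's byte order.)

```c
#include <stdint.h>
#include <stdio.h>

/* Byte-swap a 32-bit word: what a software emulator must do on every
   32-bit load/store when host and guest endianness differ. A bi-endian
   CPU can instead simply run in the guest's byte order. */
static uint32_t bswap32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0x0000ff00u)
         | ((v << 8) & 0x00ff0000u) | (v << 24);
}

int main(void)
{
    uint32_t guest_word = 0x12345678u;   /* value as the guest stored it */
    printf("guest order: 0x%08x\n", guest_word);
    printf("host order:  0x%08x\n", bswap32(guest_word));
    return 0;
}
```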

The Xbox 360 takes compatibility almost entirely in software. Since they do not produce the chips in-house, they had no other choice, but how do you see it?

Kutaragi: As for Xbox, when the new generation arrives in November of this year, the current Xbox becomes the old generation. At that point, Xbox will be killing Xbox by its own hand. The only way to rescue it is to take 100% compatibility from day one. But it seems they (Microsoft) cannot commit to that; technically, it is painful.


<Watch Impress Comments below>

- What SCEI and NVIDIA have in common

From Kutaragi's words, one senses that what connects SCEI and NVIDIA is not simply a device-development transaction, but agreement in corporate direction and culture. Both companies love original ideas, are risk-takers, and pursue function and performance to the limit regardless of cost. Not always, but that tendency is strong. Both companies also agree, at present, in their conception of pursuing the processor itself.

Among GPU vendors, NVIDIA's drive toward programmability is especially strong. Strictly speaking, ATI Technologies and 3Dlabs also lean strongly toward programmability, but NVIDIA has been the most aggressive about raising general-purpose capability. Because of that, NVIDIA lets the die size of its GPUs (the area of the semiconductor itself) grow, and does not shrink from soaring production costs either. It appears SCEI judged that this disposition of NVIDIA's suited it as a partner.

At present, the GPU has raced toward becoming a programmable processor specialized for stream processing (processing masses of data with small program fragments). By raising the general-purpose capability of the programmable shaders that are its computational cores, it is trying to take on general-purpose processing beyond graphics as well. The basic idea of Cell, on the other hand, is a general-purpose processor that carries sub-processors optimized for stream processing: evolving the general-purpose processor into a structure suited to the stream-type processing that will matter in the future. Put that way, SCEI and NVIDIA are approaching the same goal from opposite directions. Seen like this, it is no strange thing that SCEI and NVIDIA agreed in vision. One can also understand why "the ideal processor" is the point on which the two directions converge.
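
(The stream model both companies are converging on, one small program applied independently to every element of a large data set, can be sketched minimally in C; this is my own illustration, with a shader or an SPE running many copies of the same kernel side by side rather than a serial loop.)

```c
#include <stdio.h>

#define N 8

/* The "kernel": a small program applied independently to one element.
   On a shader or an SPE, many copies of this run in parallel. */
static float kernel(float x)
{
    return 2.0f * x + 1.0f;   /* stands in for any per-element work */
}

int main(void)
{
    float in[N] = {0, 1, 2, 3, 4, 5, 6, 7};
    float out[N];

    /* Stream processing: no element depends on any other, so these
       iterations could all execute at once. */
    for (int i = 0; i < N; i++)
        out[i] = kernel(in[i]);

    for (int i = 0; i < N; i++)
        printf("%g ", out[i]);
    printf("\n");
    return 0;
}
```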

From Kutaragi's explanation, we can confirm that several options were examined for the graphics architecture of PLAYSTATION 3. The first was a plan to let a single Cell processor handle graphics processing as well. The SPE, Cell's data-processing core, has SIMD (Single Instruction, Multiple Data) execution units, and can basically be regarded as the same kind of thing as a programmable shader, which likewise has a SIMD structure. But making Cell do graphics is, as one would expect, not realistic, because it shaves away Cell's capability as a CPU.

Next was a plan to load two Cells and use one of them exclusively for graphics. It is presumed that this plan included extending the Cell architecture for graphics, with SPEs adapted to graphics processing; in that case, execution units for graphics-specific processing would presumably have been loaded as well. In any event, the plan for a Cell-based graphics chip is said to have been dropped at a fairly early stage.

Incidentally, even with the present PS3 architecture, Cell can be used for graphics processing. NVIDIA's Kirk has made clear that, with the combination of Cell and RSX, both pre-processing and post-processing of 3D graphics can be done on Cell's SPEs. For example, it is possible to do deformation of vertex data, such as displacement mapping, on the SPE side.

The reason SCEI does not place eDRAM (DRAM installed in RSX) is, as written before, clear from the picture resolutions being supported. One can also recognize the judgment that, in order to realize high shader processing performance, die area cannot be consumed by eDRAM. This differs fundamentally in conception from PS2's Graphics Synthesizer, which took a special graphics architecture built on the wide bandwidth of its eDRAM. Looking at the information disclosed about RSX, the architecture is quite strongly NVIDIA-colored.

With PlayStation 2, SCEI solved the problem of compatibility in hardware, by loading the chipset of the old PS as a sub-processor. That is because, without a hardware basis, nearly 100% compatibility cannot be guaranteed. Doing complete hardware emulation in software demands enormous CPU power. This is especially critical on a machine like PS2 that exposes its hardware, where developers can freely access the resources if they try.

What is clear at present is that PS3, too, basically aims at "perfectly complete" compatibility. For that reason, the hardware-based approach to compatibility continues with PS3 as well. This time, however, Cell's high processing power is used to take software-based (emulator) compatibility for part of it. The fact that Cell was deliberately made bi-endian means that compatibility on the CPU side is taken by Cell itself. At an early stage of the joint development with IBM, SCEI is said to have conveyed that bi-endian operation was necessary for the sake of compatibility. Incidentally, this time compatibility covers two generations, PS and PS2, and both PS and PS2 carry CPUs of the MIPS architecture.
 
From IGN:

More PS3 With Kutaragi
Ken-chan discusses backwards compatibility and Cell graphics.
by Anoop Gantayat
June 12, 2005 - We've provided you with translations/summaries of two Ken Kutaragi interviews from Japanese tech website PC Watch (be sure to read both part 1 and part 2 of the interview here). Good things come in threes, though, so we were pleased to open up our browsers early Monday morning Japan time for part three of Kutaragi's PS3 commentary.


This interview starts off with Kutaragi offering a reason that the PS3 made use of a specialized graphics unit (GPU) from NVIDIA (the RSX), rather than a GPU based on the Cell processor. "The seven SPEs (Synergistic Processor Element) of Cell can be used for graphics," reveals Kutaragi. "In fact, many of the E3 demos were made without a graphics chip, with only Cell used for all graphics. However, this means of use is wasteful."
Kutaragi reveals that there was once the idea of using two Cell chips in the PS3, with one used as the CPU and the other used for graphics. However, this idea was killed when it was realized that Cell isn't appropriate for the functionality required for shaders, software tools that are used to draw images to the screen. The decision to go with a separate GPU was made in order to create the most versatile architecture possible.

Backwards compatibility is also on the cards in the latest interview. Referring to the means of backwards compatibility used in the PS3, Kutaragi reveals, "We use a combination of hardware and software." While Kutaragi won't specify which hardware components are being brought over from the PS2 to the PS3 (the specs don't seem to reveal anything), he hints that some hardware solutions were required in order to max out compatibility, because some PS2 games do things with the hardware that shouldn't theoretically be possible.

In the area of backwards compatibility, Kutaragi finds some time to take shots at the competition (he also does this quite a bit in part 1 of the interview). "With the Xbox next generation coming in November of this year, the current Xbox will become last generation. With that, the Xbox will kill itself. The only way to save it is to have 100% backwards compatibility from the first day. However, it seems that [Microsoft] cannot make that commitment -- on a technology level, it's difficult."

With the PS3 approaching its expected September playable debut, we expect to hear more from Kutaragi in the coming months. Stay tuned!
 
Incidentally, even with the present PS3 architecture, Cell can be used for graphics processing. NVIDIA's Kirk has made clear that, with the combination of Cell and RSX, both pre-processing and post-processing of 3D graphics can be done on Cell's SPEs. For example, it is possible to do deformation of vertex data, such as displacement mapping, on the SPE side.

Post processing in Cell? FSAA in Cell?
 
two said:
Incidentally, even with the present PS3 architecture, Cell can be used for graphics processing. NVIDIA's Kirk has made clear that, with the combination of Cell and RSX, both pre-processing and post-processing of 3D graphics can be done on Cell's SPEs. For example, it is possible to do deformation of vertex data, such as displacement mapping, on the SPE side.

Post processing in Cell? FSAA in Cell?


No, he's talking about image filters, DOF, blur, or whatever else you want to do.

You could do the same thing on an Xbox 360 (or even a PS2) if you want to waste the CPU cycles there instead of on the GPU.
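
(For the curious: a post-process filter of the kind ERP means is just a pass over the finished framebuffer. A minimal C sketch of my own, on a grayscale buffer for brevity, doing a 3-tap horizontal box blur; a DOF or bloom pass has the same overall shape.)

```c
#include <stdio.h>

#define W 8
#define H 3

/* Post-processing as a CPU pass: read the rendered frame, write a
   filtered copy. Nothing here needs the GPU. */
static void box_blur_h(const unsigned char *src, unsigned char *dst)
{
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            int xl = (x > 0)     ? x - 1 : x;   /* clamp at the edges */
            int xr = (x < W - 1) ? x + 1 : x;
            int sum = src[y * W + xl] + src[y * W + x] + src[y * W + xr];
            dst[y * W + x] = (unsigned char)(sum / 3);
        }
    }
}

int main(void)
{
    unsigned char frame[W * H] = {0}, blurred[W * H];
    frame[1 * W + 3] = 255;                 /* a single bright pixel */

    box_blur_h(frame, blurred);

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            printf("%4d", blurred[y * W + x]);
        printf("\n");
    }
    return 0;
}
```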
 
ralexand said:
So can anyone explain why they chose not to use edram according to that interview?

For the GPU: 200-300 mm^2 of silicon would be needed to hold an HD back buffer + Z-buffer (without tiling the buffers the way ATI's Xenos does).

For CELL (not according to that interview, but one that was done at ISSCC): IBM's 90nm node doesn't sport eDRAM - yet. Besides, the cores in CELL run hot, which eDRAM dislikes (data deteriorates faster). eDRAM also increases cost (extra processing steps) and decreases performance.

Cheers
Gubbi
 
ERP said:
two said:
Incidentally, even with the present PS3 architecture, Cell can be used for graphics processing. NVIDIA's Kirk has made clear that, with the combination of Cell and RSX, both pre-processing and post-processing of 3D graphics can be done on Cell's SPEs. For example, it is possible to do deformation of vertex data, such as displacement mapping, on the SPE side.

Post processing in Cell? FSAA in Cell?


No, he's talking about image filters, DOF, blur, or whatever else you want to do.

You could do the same thing on an Xbox 360 (or even a PS2) if you want to waste the CPU cycles there instead of on the GPU.

Thanks, ERP 8)
 
ralexand said:
So can anyone explain why they chose not to use edram according to that interview?


from that thread on GAF that I posted a link to on page 10 of this thread:
gofreak said:
- There is no eDram in PS3, because they couldn't put enough on the die to support a 1920x1080 frame let alone 2. Kutaragi thinks the transistor cost of eDram is too high when you consider what the same amount of transistors buys you elsewhere in terms of more shading power.
 
- There is no eDram in PS3, because they couldn't put enough on the die to support a 1920x1080 frame let alone 2. Kutaragi thinks the transistor cost of eDram is too high when you consider what the same amount of transistors buys you elsewhere in terms of more shading power.
So we should assume that ATI removed shading power from the XGPU to include the eDRAM unit, i.e. it should have less shading power than the other ATI chips shipping this year.
 
from that thread on GAF that I posted a link to on page 10 of this thread:
gofreak wrote:
- There is no eDram in PS3, because they couldn't put enough on the die to support a 1920x1080 frame let alone 2. Kutaragi thinks the transistor cost of eDram is too high when you consider what the same amount of transistors buys you elsewhere in terms of more shading power.

Do they honestly think that the bandwidth available to the RSX will be enough for two 1080p displays?

So we should assume that ATI removed shading power from the XGPU to include the eDRAM unit, i.e. it should have less shading power than the other ATI chips shipping this year.

Well, it seems that Xenos is not very traditional, so they may have saved transistors elsewhere and still have the shader power. I believe Xenos has 8 ROPs? So right there could be a savings, if the RSX has a more traditional pipeline setup.
 
No, he's talking about image filters, DOF, blur, or whatever else you want to do.

You could do the same thing on an Xbox 360 (or even a PS2) if you want to waste the CPU cycles there instead of on the GPU.

With all due respect, ERP, why do Sony, NVIDIA, and company always show off the very close relationship between CELL and RSX? CELL has to be doing something with graphics. Wouldn't all the time spent talking about it at the E3 conference, in presentations before E3, and by NVIDIA and Sony after the conference, be wasted otherwise? I have to believe that Sony and NVIDIA are talking about this in the open to get devs doing it from the start.

The way I understand it, if devs don't use the SPEs for those things, they won't be using CELL to its full potential.
 
marconelly! said:
He's also talking about the displacement mapping, which to me sounds like the most interesting proposition.
Displacement mapping = texture fetch in the vertex shader.
So... he's basically just confirming that SPEs can process vertices.

I would think a depth-based tessellation scheme would be a much more interesting use of Cell than displacement mapping.
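
(As a concrete picture of what "displacement mapping on the SPEs" amounts to, here's a CPU-side C sketch of my own, not Sony or NVIDIA code: fetch a height per vertex from a texture and push the vertex along its normal before the GPU ever sees it.)

```c
#include <stdio.h>

#define NUM_VERTS 4
#define MAP_SIZE  4   /* tiny height map, for illustration only */

typedef struct {
    float px, py, pz;   /* position */
    float nx, ny, nz;   /* unit normal */
    float u, v;         /* texture coordinates */
} Vertex;

/* Nearest-neighbor "texture fetch" from the height map. */
static float sample_height(const float *map, float u, float v)
{
    int x = (int)(u * (MAP_SIZE - 1));
    int y = (int)(v * (MAP_SIZE - 1));
    return map[y * MAP_SIZE + x];
}

/* Displacement mapping as a CPU pass (stand-in for an SPE job):
   displace each vertex along its normal by the sampled height. */
static void displace(Vertex *verts, int n, const float *map, float scale)
{
    for (int i = 0; i < n; i++) {
        float h = sample_height(map, verts[i].u, verts[i].v) * scale;
        verts[i].px += verts[i].nx * h;
        verts[i].py += verts[i].ny * h;
        verts[i].pz += verts[i].nz * h;
    }
}

int main(void)
{
    const float height_map[MAP_SIZE * MAP_SIZE] = {
        0.0f, 0.1f, 0.1f, 0.25f,
        0.1f, 0.5f, 0.5f, 0.1f,
        0.1f, 0.5f, 0.5f, 0.1f,
        0.5f, 0.1f, 0.1f, 1.0f,
    };
    /* A flat quad facing +z. */
    Vertex quad[NUM_VERTS] = {
        {0,0,0, 0,0,1, 0.0f,0.0f}, {1,0,0, 0,0,1, 1.0f,0.0f},
        {0,1,0, 0,0,1, 0.0f,1.0f}, {1,1,0, 0,0,1, 1.0f,1.0f},
    };

    displace(quad, NUM_VERTS, height_map, 0.5f);

    for (int i = 0; i < NUM_VERTS; i++)
        printf("v%d: z = %.3f\n", i, quad[i].pz);
    return 0;
}
```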
 
There's no doubt Cell can do T&L... even on a PC with DirectX you can disable hardware T&L on the video card and have it done in software by the CPU.
 
jvd said:
They honestly think that the bandwidth avalible to the rsx will be enough for two 1080p displays ?

I haven't been following much of the next-generation talk, but did Sony actually say they would push for running dual 1080p output, as opposed to, say, something like dual 720p?
 