Next gen consoles: hints?

Panajev2001a said:
I seriously would like to know how slow it would be to sample textures using some APUs as TMUs, if needed, given what we know of the APUs.

Peak of either 1 Vector FP/FX op (we have MADD) or 1 Scalar FP/FX op per cycle.

I wrote this a while back in a slightly different context and I assume an array of four FMAD (not FMAC) units. In other words, a VU0 or a VS (or APU). It's also assuming that the memory controller has already loaded the color vectors for all eight texels and we have the texture coordinates already resolved.

Me two weeks ago said:
But look at something as "simple" and frumpy as a trilinear filter. If I remember correctly you must calculate a blend factor for each of eight texels, each of which requires multiplying together the fractional parts (or one minus the fractional parts) of the texcoords. Then you need eight vector operations to compute the partial sum of each texel's color value times its blend factor. Then accumulate the partial sums for each mip map. Then interpolate between the values you have from the two mip maps. If you want an aniso filter, repeat this process "x" number of times. All in all, that's about 50-400 [vector] operations, just for one texture lookup...

In other words, it's a pretty nasty performance killer. Every time I hear "software texture filtering" I shudder. :LOL:
 
Akira, that might still be a performance killer... but if you have 3-4 APUs free (and out of 32 there might be some APUs free) it means that you have around 32 FP ops per cycle to use... 10-15 cycles would give you enough time to do something nice with them and would not constitute a performance hit out of this world, IMHO.
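For a concrete picture of the arithmetic being counted here, below is a minimal scalar sketch in C of one trilinear sample: per-texel weights from the fractional texcoords, a bilinear blend in each of two mip levels, then a lerp on the fractional LOD. The names and data layout are purely illustrative and say nothing about the actual APU instruction set; the "around 32 FP ops per cycle" figure above presumably assumes something like 4 APUs each issuing a 4-wide MADD (8 FLOPs) per cycle.

```c
/* Minimal, purely illustrative sketch of the per-sample arithmetic of a
 * trilinear filter, written as plain scalar C. A real VU/APU routine would
 * vectorize this; names and layout are assumptions, not the APU ISA.
 * Assumes the eight texels are already fetched and the texcoords already
 * resolved, as in the post above. */

typedef struct { float r, g, b, a; } Color;

/* Bilinear blend of one mip level's four texels:
 * t[0]=(0,0), t[1]=(1,0), t[2]=(0,1), t[3]=(1,1).
 * fu, fv are the fractional parts of the texcoords. */
static Color bilerp(const Color t[4], float fu, float fv)
{
    float w00 = (1.0f - fu) * (1.0f - fv);
    float w10 = fu * (1.0f - fv);
    float w01 = (1.0f - fu) * fv;
    float w11 = fu * fv;

    Color out;
    out.r = t[0].r * w00 + t[1].r * w10 + t[2].r * w01 + t[3].r * w11;
    out.g = t[0].g * w00 + t[1].g * w10 + t[2].g * w01 + t[3].g * w11;
    out.b = t[0].b * w00 + t[1].b * w10 + t[2].b * w01 + t[3].b * w11;
    out.a = t[0].a * w00 + t[1].a * w10 + t[2].a * w01 + t[3].a * w11;
    return out;
}

/* Trilinear: bilinear blend in each of the two mip levels, then a lerp
 * between the two results on the fractional LOD. */
static Color trilerp(const Color mip0[4], const Color mip1[4],
                     float fu0, float fv0, float fu1, float fv1, float flod)
{
    Color a = bilerp(mip0, fu0, fv0);
    Color b = bilerp(mip1, fu1, fv1);
    Color out;
    out.r = a.r + (b.r - a.r) * flod;
    out.g = a.g + (b.g - a.g) * flod;
    out.b = a.b + (b.b - a.b) * flod;
    out.a = a.a + (b.a - a.a) * flod;
    return out;
}
```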
 
The only places where HDTV will be a concern are the USA and Japan; it looks like the rest of the world is too busy doing other things. The GC already has the ability to display a 480p image, and I am sure the Xbox can do 1080i (not sure about PS2, but I am under the impression that you can get 480p out of it).

Whoever said Sony will have superior graphics compared to ATi must be smoking crack and needs to share with the rest of us. ATi is centuries ahead of Sony in terms of video processing technology; even MS knows this and is also using ATi. PS2 is weak compared to GC and Xbox graphics.

As for the GC not playing DVDs, yeah, that's OK, as I would much rather put the disc in my Sony player and let it play automatically without having to use a game controller.
 
Panajev2001a said:
Akira, when we do texture sampling for a Pixel or Vertex Shader, not for texture mapping, we would not do tri-linear filtering.

Not necessarily. For a vertex shader we will most likely be doing many displacement texture samples - and if we only do bilinear then we will run into the same problems we do on color textures - abrupt and disconcerting changes in our level of detail. Now 1-D textures that can encode arbitrary functions or floating-point normal maps have different rules, of course...

NB: There was one error in my post. I actually assumed FMAC (which can only accumulate the previous result) instead of FMAD. :oops: The correct answer is ~45 V-FLOPS rather than ~52 V-FLOPS.
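On the displacement-LOD point above, here is a small hedged sketch (plain C again; all names are hypothetical and not from any real shader API) of picking a fractional mip level for a displacement map from view distance and blending two levels of displacement -- essentially the same filtering cost being discussed, applied to a heightfield:

```c
#include <math.h>

/* Hypothetical sketch: continuous LOD for a displacement map chosen from
 * view distance, then a blend of two mip levels of displacement so the
 * displaced geometry doesn't pop when the level changes (the "abrupt and
 * disconcerting changes" mentioned above). Illustrative only. */

float displacement_lod(float view_distance, float base_distance, float max_lod)
{
    /* Move one level further down the mip chain per doubling of distance. */
    float lod = log2f(view_distance / base_distance);
    if (lod < 0.0f)    lod = 0.0f;
    if (lod > max_lod) lod = max_lod;
    return lod;
}

/* h0 and h1 are heights sampled from the two mip levels that bracket the
 * LOD; the fractional part of the LOD blends them. */
float filtered_height(float h0, float h1, float lod)
{
    float f = lod - floorf(lod);
    return h0 + (h1 - h0) * f;
}
```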
 
So, wait. How can you tell me Cell/Broadband Engine/PS3 isn't going to be anywhere near what Tim's intentions are if:

(a) He hasn't commented on it in 10 years
(b) You don't know what the hell he intended?

I believe that recently (within the year) he said that the power wouldn't be there for 10 years. Not that he said this 10 years ago.

I believe that is a misunderstanding on your part. Unless I'm messing up my reading.
 
That's how I read it as well. (Though of course we don't know how long it's been since he said it. ;) )

YeuEmMaiMai said:
The only places where HDTV will be a concern are the USA and Japan; it looks like the rest of the world is too busy doing other things. The GC already has the ability to display a 480p image, and I am sure the Xbox can do 1080i (not sure about PS2, but I am under the impression that you can get 480p out of it).

You can indeed get 480p out of PS2 (some 2-3 dozen games offer it, I believe, including visually-impressive ones like Jak II). Xbox indeed can do 1080i, but I think only a handful of them actually do--same with 720p. 480p is basically a standard, though.

On the GameCube, I'm not aware of any 720p nor 1080i titles, though I think with both it and the PS2 it is possible--just nothing any game developer would at all consider worth the trade-offs or other problems. (As with Xbox, since I think only 10-20 titles offer either.)

In the meanwhile, HDTV will definitely be an issue for next generation's consoles. 2005-2006 still may see them not too widespread, but they'll also be riding the same box until 2010 and beyond, so if ANY of them exclude support, I will be sorely annoyed. Even Progressive scan is rather shruggable at the moment, and for this generation in general, but through next generation it will be much more affordable and much more the norm.
 
In the meanwhile, HDTV will definitely be an issue for next generation's consoles. 2005-2006 still may see them not too widespread, but they'll also be riding the same box until 2010 and beyond, so if ANY of them exclude support, I will be sorely annoyed. Even Progressive scan is rather shruggable at the moment, and for this generation in general, but through next generation it will be much more affordable and much more the norm.

Heh. I was in the store today helping my new brother-in-law pick out an HDTV for their house. One grand for a 32-inch. He could have gotten a 60-inch normal projection TV for 200 more. So I don't see HDTV being a must for at least another gen.
 
Well nothing is ever really a "must" anyway, but I don't think anyone is going to ride towards 2010 at the earliest while ignoring it. Certainly I'll be pissed if the prices don't go down, screen sizes and qualities don't go up, and the networks still languish and don't bother supporting it much... and I'm a notoriously under-teched nerd! :p
 
cthellis42 said:
Well nothing is ever really a "must" anyway, but I don't think anyone is going to ride towards 2010 at the earliest while ignoring it. Certainly I'll be pissed if the prices don't go down, screen sizes and qualities don't go up, and the networks still languish and don't bother supporting it much... and I'm a notoriously under-teched nerd! :p

You have to understand the userbase of HDTV is small - perhaps not even 1% of what NTSC TVs are at. I don't really see it picking up anytime soon. The only way for that to happen is if they don't make normal screens anymore.

Because even when you can get a 60-inch HDTV for $1200, you're going to be able to get the non-HDTV version for less. I hate to say it, but that's how it is.
 
Certainly not "anytime soon," but five years from now is a lot of time to ramp (now that it's actually GETTING some real push), and that will only be halfway into next generation anyway.

It's always going to be up to the developers to decide to actually SUPPORT the resolutions, and they'll closely follow the TV industry anyway, but if any of the next gen consoles for some reason don't have the CAPACITY to go out to HDTV resolutions--or make it suffer disproportionate penalties to do so--I will think it rather asinine.
 
Vince said:
I'm trying to remember where I saw a console developer propose utilizing Loop Subdivision with a basic wavelet compression scheme for the next generation platforms. Sound familiar to anyone?
Here? : Incredibly Dense Meshes

And for Panajev... do you remember what I told you about texture sampling and filtering in the APUs? It would be very nice if they would implement a texld-like instruction in the APU instruction set.

ciao,
Marco
 
Vince said:
What?!? Then why are we even discussing DXNext? Why have we been talking about PC architectural limitations?

Where did this come from?

Umm, Vince, the thread is titled "Next Generation Console Hints", and the DX Next article was posted in relation to some of the directions MS is likely to be taking with the XBox2, yes?

I thought you were all big on "this is the console forum, we talk about consoles".

akira888 said:
In other words, it's a pretty nasty performance killer. Every time I hear "software texture filtering" I shudder. :LOL:

Yeah, the processing requirements are still fairly huge, and that's why the 3Dlabs P10 still has dedicated units despite the fact that most of the rest of the chip is quite programmable.

Panajev2001a said:
Akira, when we do texture sampling for a Pixel or Vertex Shader, not for texture mapping, we would not do tri-linear filtering.

That's not necessarily a given - IIRC, if you want LOD with displacement mapping then filtering is an easy way to achieve that. Generally speaking you do want some kind of LOD with displacement mapping, otherwise the displacement detail stays just as complex all the way into the screen, which is unnecessary.
 
Sorry, just noticed this:

Panajev2001a said:
I see something ironic, though, and you prolly saw it before me... while we advocated the fact that VS and PS are merging in terms of functional units, on such a PlayStation 3 we would divide Vertex Shading and Pixel Shading between the two chips for best efficiency (unless we take the penalty, sample on the BE with some dedicated APUs and all...).

Yes, while functionally it would appear that the APUs of the BE and Visualiser can be applied to both cases, at the moment the division between the two chips would suggest that it's logical (in terms of a usage scenario) to divide the work: geometry processing on the BE and fragment processing on the other. Had this all been one chip then that logical split wouldn't appear to be the case. I suspect that even if there is a very high bandwidth interconnect between the BE and Visualiser it's not going to match the internal bandwidth of the chips, and if that is the case then you are going to have to be careful about passing lots of operations back and forth between the two.

Also, another funny thing you noticed (much better arguing when you are not trying to deny something you feel is true as well): in such a scenario Vertex Processing would be the area to push strongly, as you can much more easily off-load that to the Visualizer than off-load the Pixel Shading part onto the Broadband Engine, due to the texture issue we just mentioned.

I'd still like to hear more about the Visualiser's sampling and fragment processing abilities, though; I'd like to know how it operates with texture data as an input and how efficient that will be.
 
cthellis42 said:
That's how I read it as well. (Though of course we don't know how long it's been since he said it. ;) )

YeuEmMaiMai said:
The only places where HDTV will be a concern are the USA and Japan; it looks like the rest of the world is too busy doing other things. The GC already has the ability to display a 480p image, and I am sure the Xbox can do 1080i (not sure about PS2, but I am under the impression that you can get 480p out of it).

You can indeed get 480p out of PS2 (some 2-3 dozen games offer it, I believe, including visually-impressive ones like Jak II). Xbox indeed can do 1080i, but I think only a handful of them actually do--same with 720p. 480p is basically a standard, though.

On the GameCube, I'm not aware of any 720p nor 1080i titles, though I think with both it and the PS2 it is possible--just nothing any game developer would at all consider worth the trade-offs or other problems. (As with Xbox, since I think only 10-20 titles offer either.)

In the meanwhile, HDTV will definitely be an issue for next generation's consoles. 2005-2006 still may see them not too widespread, but they'll also be riding the same box until 2010 and beyond, so if ANY of them exclude support, I will be sorely annoyed. Even Progressive scan is rather shruggable at the moment, and for this generation in general, but through next generation it will be much more affordable and much more the norm.

The GameCube only has like 3 megs of video RAM; wouldn't it have to drop down to 16-bit (or less) color to do 1080i? PS2 probably could do it at 24 bits, without any textures.
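As a rough sanity check on those numbers (assuming a plain full-resolution buffer, no tricks): 1920 × 1080 is about 2.07 million pixels, so roughly 4 MB at 16 bits per pixel and roughly 6 MB at 24 bits, i.e. neither fits comfortably in ~3 MB of embedded memory. A single interlaced field (1920 × 540) is roughly 2 MB at 16 bits and roughly 3 MB at 24 bits, which is about where PS2's 4 MB of eDRAM would leave next to nothing for textures, as guessed above.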
 
I'd still like to hear more about the Visualiser's sampling and fragment processing abilities, though; I'd like to know how it operates with texture data as an input and how efficient that will be.

I've already told you, it doesn't exist. Many of the things you want to know, the patent hasn't gone over in terms of the VS.

Here's what we know.

The entire Visualizer is basically a Broadband Engine cut down to render graphics.

According to the patent, it has a 4GHz clock speed and 512 GFLOPS / 512 GOPS of peak performance.

It has a CRTC and a Pixel Engine, as well as an Image Cache, which is where the e-DRAM will be.

No one knows about the texture sampling, and no one can talk about efficiency; however, you can infer that it would be logical to do shader operations on the APUs.
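For scale, that figure is just clock times width: 512 GFLOPS at a 4 GHz clock works out to 128 floating-point operations per cycle across the whole Visualizer, however those end up being divided among its APUs.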

I suspect that even if there is a very high bandwidth interconnect between the BE and Visualiser it's not going to match the internal bandwidth of the chips, and if that is the case then you are going to have to be careful about passing lots of operations back and forth between the two.

Toshiba and Rambus demonstrated the world's fastest parallel interface in October, running at 6.4GHz. This is Redwood, which has been licensed for use in the Broadband Engine.

Fast as it is, I agree that this won't top PS3 e-DRAM at hundreds of GBs per second.
 
Dio said:
Back to the Transputer again? This kind of architecture is usually considered to be extremely hard to get working close to its peak efficiency (on general code - how well specific divisions of general code, such as game general code or renderer general code work I don't have any specific data on).
Oh I don't have any illusions about efficiency in general purpose code - but I would like to think that a 4GHz unit will be fast enough to do most of the general-purpose housekeeping without breaking a sweat, even if we underutilize it in those sections.
Meanwhile I am under the impression that the real calculation hogs, which include stuff like rendering and physics, would lend themselves well to this kind of processing. It remains to be seen if the actual hw will reflect that, though.
That said, performance in general purpose code could always be viewed as having a lot of redundancy, rather than lower efficiency :oops:

DaveB said:
But the issue appears to be that the implementation it's being put to here suffers from some of the same pitfalls that Vince was citing, due to the fact that there are two separate units.
I suppose Vince just has a lot of faith in Sony/Toshiba bandwidth solutions :p

DaveB said:
The other more fundamental question is: do you want your 'GPU' to act functionally equivalent to the 'CPU', or spend its transistor budget being focused on tasks that are commonly required for a 'GPU'?
That depends on what kind of tasks we're talking about. There are certain parts of fixed functionality I would like to see them include, but beyond that I like my freedom.

I'd still like to hear more about the Visualiser's sampling and fragment processing abilities, though; I'd like to know how it operates with texture data as an input and how efficient that will be.
Get in line ;)
Anyway, I am not aware of anything being known on that yet, but I also am quite sure this kind of info will fall under NDA when it does become available. So it may be a while before any of us will know much about this.

cthellis42 said:
On the GameCube, I'm not aware of any 720p nor 1080i titles, though
GCN's display controller doesn't go beyond 480p.
 
Toshiba and Rambus demonstrated the world's fastest parallel interface in October, running at 6.4GHz. This is Redwood, which has been licensed for use in the Broadband Engine.

Fast as it is, I agree that this won't top PS3 e-DRAM at hundreds of GBs per second.

What's the width of that bus? That's a pretty impressive speed. Is that raw, or is that one of those octal-rate thingies?

If they used a 256-bit-wide bus, that would about match the internal bandwidth.
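As a back-of-the-envelope check (assuming the 6.4GHz figure is the effective per-pin data rate, which is what the octal-rate question is getting at): 6.4 Gbit/s per pin × 256 pins ≈ 1.6 Tbit/s, or roughly 205 GB/s, which would indeed be in the same ballpark as "hundreds of GBs per second" of e-DRAM bandwidth; a narrower 64-bit link would be closer to 51 GB/s.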
 
nAo said:
Vince said:
I'm trying to remember where I saw a console developer propose utilizing Loop Subdivision with a basic wavelet compression scheme for the next generation platforms. Sound familiar to anyone?
Here? : Incredibly Dense Meshes

And for Panajev... do you remember what I told you about texture sampling and filtering in the APUs? It would be very nice if they would implement a texld-like instruction in the APU instruction set.

ciao,
Marco

You even bolded my name... :( I feel like I have just been picked up by the ear and scolded :(
 
Fafalada said:
Dio said:
Back to the Transputer again? This kind of architecture is usually considered to be extremely hard to get working close to its peak efficiency (on general code - how well specific divisions of general code, such as game general code or renderer general code work I don't have any specific data on).
Oh I don't have any illusions about efficiency in general purpose code - but I would like to think that a 4GHz unit will be fast enough to do most of the general-purpose housekeeping without breaking a sweat, even if we underutilize it in those sections.
Meanwhile I am under the impression that the real calculation hogs, which include stuff like rendering and physics, would lend themselves well to this kind of processing. It remains to be seen if the actual hw will reflect that, though.
That said, performance in general purpose code could always be viewed as having a lot of redundancy, rather than lower efficiency :oops:

DaveB said:
But the issue appears to be that the implementation it's being put to here suffers from some of the same pitfalls that Vince was citing, due to the fact that there are two separate units.
I suppose Vince just has a lot of faith in Sony/Toshiba bandwidth solutions :p

DaveB said:
The other more fundamental question is: do you want your 'GPU' to act functionally equivalent to the 'CPU', or spend its transistor budget being focused on tasks that are commonly required for a 'GPU'?
That depends on what kind of tasks we're talking about. There are certain parts of fixed functionality I would like to see them include, but beyond that I like my freedom.

I'd still like to hear more about the Visualiser's sampling and fragment processing abilities, though; I'd like to know how it operates with texture data as an input and how efficient that will be.
Get in line ;)
Anyway, I am not aware of anything being known on that yet, but I also am quite sure this kind of info will fall under NDA when it does become available. So it may be a while before any of us will know much about this.

cthellis42 said:
On the GameCube, I'm not aware of any 720p nor 1080i titles, though
GCN's display controller doesn't go beyond 480p.

Fafalada, do you think that by dedicating like 2-4 APUs on the Broadband Engine, they could act as software TMUs decently enough, for what you know of them?
 