Yamauchi on the PS3: "...beginning of a new world..."

Laa-Yosh said:
The problem is that RSX's blending units are more suited for this task; on Cell, you'd have to use the SPEs, which only work with 32-bit floats. It's overkill for the job,

Yes, but unless your game is using all 7 SPEs for other tasks, you'll have one or more to spare, so why not?

and it takes a massive amount of bandwidth to work with a 128-bit framebuffer. As I've said, it's been discussed on B3D before, with PS3 developers, and the conclusion was that it's probably not possible in practice.

Neither was AA on the PS2.
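For a rough sense of scale on that bandwidth claim, here's a back-of-the-envelope sketch (every figure is an assumption: a 1280x720 FP32 RGBA target at 60fps, ~3x average overdraw, read-modify-write blending):

```c
#include <stdio.h>

int main(void)
{
    /* Back-of-the-envelope traffic for a 128-bit (FP32 RGBA)
       framebuffer -- all figures here are assumptions. */
    const double w        = 1280.0, h = 720.0;
    const double bytes_pp = 16.0;  /* 4 channels x 4-byte floats  */
    const double overdraw = 3.0;   /* assumed average overdraw    */
    const double fps      = 60.0;

    double buffer_mb = w * h * bytes_pp / (1024.0 * 1024.0);
    /* Blending is read-modify-write, so count the traffic twice. */
    double traffic_gbps = w * h * bytes_pp * overdraw * 2.0 * fps
                          / (1024.0 * 1024.0 * 1024.0);

    printf("Buffer size:   %.1f MB\n", buffer_mb);      /* ~14.1 MB  */
    printf("Blend traffic: %.1f GB/s\n", traffic_gbps); /* ~4.9 GB/s */
    return 0;
}
```

And that's blend traffic alone, before texture and geometry reads, which is why a 128-bit target is such a heavy ask.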
 
I don't wish to stifle this debate, but aren't we talking about possible techniques that won't even be considered until late second- or third-generation development titles for the PS3? That's notwithstanding the outlined bandwidth issues, of course.

Unless there are architectural changes from the G70 in RSX that permit HDR+AA, the vast majority of games will not feature AA, and anything that does will just be an exception case.

Personally, I'd prefer to engage in speculation as to what changes are necessary (if it's even feasible) to allow HDR+AA in a G70 architecture. I don't profess to have a technical background in 3D, so please have mercy ;)
 
Powderkeg said:
And again, RSX can read/write directly to XDR, and Cell can perform the same processes as the logic in the eDRAM.

What's the difference?

Actually it can't.
The Xenos daughter die has additional information.
Note that in a recent response Dave got from ATI, one of the architects stated that the actual bandwidth figure is slightly higher than the stated 32 GB/s. That's because, in order to do the AA, the daughter die requires additional "sideband" information (minimally coverage and the normal of the surface). It also has knowledge of every pixel written to a destination pixel, which is required for correct AA. This information is not exposed at the end of the pipeline; once the data is in the framebuffer, it is lost and Cell can't do the job.
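A hypothetical sketch of the distinction (all field names invented for illustration): the daughter die sees per-fragment data that a resolved framebuffer simply no longer contains.

```c
#include <stdint.h>

/* What the Xenos daughter die can see per fragment while the frame
   is being built -- layout and names are hypothetical. */
struct fragment_sideband {
    uint32_t color;          /* the value being blended in          */
    uint8_t  coverage_mask;  /* which MSAA samples this poly covers */
    /* ... plus surface information such as the normal ...          */
};

/* What is left once the frame has been resolved out to memory:
   one final color per pixel, with the per-sample history gone --
   this is all Cell could ever read back after the fact. */
struct resolved_pixel {
    uint32_t color;
};
```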

Cell could postprocess an image to give a sort of poor man's AA (much like a Photoshop filter), but, ignoring bandwidth and CPU load issues, the memory cost of additional intermediate buffers is fairly daunting.
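For the curious, a minimal sketch of that "Photoshop filter" style of post-process AA in C (the grayscale buffer and threshold are assumptions for illustration; a real filter would work on RGB):

```c
#include <stdlib.h>

/* Minimal "poor man's AA": detect luminance edges in a finished
   8-bit grayscale frame and blur only those pixels. Borders are
   left untouched for brevity. */
static void postprocess_aa(const unsigned char *src, unsigned char *dst,
                           int w, int h, int threshold)
{
    for (int y = 1; y < h - 1; y++) {
        for (int x = 1; x < w - 1; x++) {
            int i  = y * w + x;
            int dx = abs(src[i + 1] - src[i - 1]);  /* horizontal gradient */
            int dy = abs(src[i + w] - src[i - w]);  /* vertical gradient   */

            if (dx + dy > threshold) {
                /* On an edge: blend with the 4 neighbours. */
                dst[i] = (unsigned char)((src[i] * 4 + src[i - 1] +
                          src[i + 1] + src[i - w] + src[i + w]) / 8);
            } else {
                dst[i] = src[i];
            }
        }
    }
}
```

Note this only softens edges after the fact; unlike real MSAA it has no sub-pixel information to work with, which is exactly the point made above.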
 
MasaC said:
How about the other way around?

Can RSX do the AA and the Cell do the HDR?
Wanting the CPU to do AA and the GPU to do HDR, or vice versa, is kinda like wanting the CPU to render the scene's red and green components and the GPU to render the blue and alpha components. HDR is intrinsic to the rendering process and needs to be included in the render pipeline at the pixel-shader level. HDR isn't a process but an accuracy thing (though I believe HDR is rendered somewhat differently to normal rendering).
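To make the "accuracy thing" concrete: an HDR pipeline shades in floats that can exceed 1.0, and only at the very end is the range squeezed into what the display can show. A minimal Reinhard-style tonemap sketch (the exposure parameter is an assumption):

```c
#include <math.h>

/* HDR is about range, not a bolt-on pass: the scene is lit in
   floating point, and this final step compresses it to 0..255. */
static unsigned char tonemap(float hdr_value, float exposure)
{
    float v      = hdr_value * exposure;
    float mapped = v / (1.0f + v);          /* Reinhard: [0,inf) -> [0,1) */
    float srgb   = powf(mapped, 1.0f / 2.2f); /* rough gamma encode       */
    return (unsigned char)(srgb * 255.0f + 0.5f);
}
```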

Again, like most next-gen topics, the forum has already covered this extensively. Do a search on Cell's use in post-processing and AA and the like and you'll find epic threads full of all the relevant bits of info.
 
one said:
Super offtopic, but chip companies are not innocent nice guys... why would they offer their brand-new, shiny 65nm lines for cheaper than 90nm lines even when yield is still low? They have customers from all over the world, not only MS. Fabs may charge per working chip rather than per wafer, but the chip price will include the estimated number of wafers wasted in production. Those chip companies have to make a profit out of customers' pockets. Besides, the Xenos daughter die uses eDRAM, so practically they can't change the fab from NEC for that part.

It's usually per wafer.
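Either way, a toy calculation shows why yield gets baked into the effective chip cost under per-wafer pricing (every figure here is invented):

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative only -- all figures are made up. */
    const double wafer_cost     = 5000.0; /* $ per wafer          */
    const double dies_per_wafer = 200.0;
    const double yield          = 0.40;   /* low early-node yield */

    double cost_per_good_die = wafer_cost / (dies_per_wafer * yield);
    printf("Effective cost per working chip: $%.2f\n",
           cost_per_good_die);            /* $62.50 at these numbers */
    return 0;
}
```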
 
jvd said:
Of course. I play at 1600x1200 and love it when I can have 6x FSAA on my X800 XT PE.

It makes the game look much better.

Also remember people will be playing these systems on 30-60 inch TVs (maybe bigger), and the aliasing will be much more noticeable on a bigger screen than on a PC monitor.

This completely ignores that people will be watching the big screen from considerably farther away than a computer screen, so it is more a wash than a matter of it being "much more" noticeable. At 50+ inches, picture quality is already suffering (even at HDTV resolutions) from dwindling DPI; further softening from AA isn't going to suddenly make that "ok". Suffice it to say, drawing associations between what you see on a computer monitor at 12" and an HDTV from across the room makes for a pretty strained analogy.
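The distance point can be put in numbers: what matters is pixels per degree of visual angle, not raw screen size. A quick sketch (screen sizes, resolutions, and viewing distances are all assumed):

```c
#include <math.h>
#include <stdio.h>

static const double PI = 3.14159265358979;

/* Pixels per degree of visual angle for a given screen. */
static double pixels_per_degree(double diag_in, double h_px, double v_px,
                                double distance_in)
{
    double aspect    = h_px / v_px;
    double width_in  = diag_in * aspect / sqrt(aspect * aspect + 1.0);
    double px_per_in = h_px / width_in;
    /* Width of one degree of visual angle at this distance. */
    double deg_in = 2.0 * distance_in * tan(PI / 360.0);
    return px_per_in * deg_in;
}

int main(void)
{
    /* Assumed: 19" 1600x1200 monitor at 24" vs 50" 720p TV at 96". */
    printf("Monitor: %.0f px/deg\n",
           pixels_per_degree(19.0, 1600, 1200, 24.0));  /* ~44 */
    printf("HDTV:    %.0f px/deg\n",
           pixels_per_degree(50.0, 1280, 720, 96.0));   /* ~49 */
    return 0;
}
```

At these assumed figures the monitor and the big-screen TV land in the same ballpark, which is exactly the "more a wash" point.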
 
Just asking

Shifty Geezer said:
Wanting the CPU to do AA and the GPU to do HDR, or vice versa, is kinda like wanting the CPU to render the scene's red and green components and the GPU to render the blue and alpha components. HDR is intrinsic to the rendering process and needs to be included in the render pipeline at the pixel-shader level. HDR isn't a process but an accuracy thing (though I believe HDR is rendered somewhat differently to normal rendering).

So what was the whole Getaway demonstration about? I thought they were showing off the HDR capabilities of the CELL? Just curious.
 
mckmas8808 said:
So what was the whole Getaway demonstration about? I thought they were showing off the HDR capabilities of the CELL? Just curious.


The Getaway demonstration was a demo; in-game, like PGR3, is very different and much more difficult to do. Bizarre has been working on PGR3 for 2 years.
 
mckmas8808 said:
So what was the whole Getaway demonstration about? I thought they were showing off the HDR capabilities of the CELL? Just curious.
It was a demo of Cell's overall processing capabilities, showing how versatile it was and, primarily, how it could 'bring a virtual city to life'. Remember Harrison saying as much? (It was Harrison, wasn't it? I'm hopeless at names!) The fact it had a nice HDR renderer isn't surprising considering Cell works in floats and so naturally has a higher dynamic range for lighting, which they leveraged. NOT including HDR effects would have had Cell calculating everything with extended float range and not using it. Kinda like entering a race in a car that can go 170 MPH, revving it to redline, and then only using first gear: you squeeze the engine for performance but don't apply it to best effect.
 
Shifty Geezer said:
The fact it had a nice HDR renderer isn't surprising considering Cell works in floats and so naturally has a higher dynamic range for lighting which they leveraged. NOT including HDR effects would have had Cell calculating everything with extended float range and not using it.

So can real games use the CELL to implement HDR? You make it sound like an obvious yes.
 
First, games that have HDR rendering through Cell will have to resort to full software rendering, leaving the RSX to sit idle, which is IMHO quite a stupid thing to do.

Second, I've already expressed my doubts about that Getaway demo being software rendered... I think it's using the GPU to do the graphics.
 
Laa-Yosh said:
First, games that have HDR rendering through Cell will have to resort to full software rendering, leaving the RSX to sit idle, which is IMHO quite a stupid thing to do.

Second, I've already expressed my doubts about that Getaway demo being software rendered... I think it's using the GPU to do the graphics.

Well, if you think it's using the GPU, then things would start to make sense. But since Sony says otherwise, that's what's causing all of this confusion in my head and in some others'.
 
Sony hasn't exactly been the most honest company in the past when it comes to their products and demonstrations.
 
drpepper said:
That can be said for all companies. What's your point?

I thought my point was pretty obvious: take much of what Sony says with a grain of salt. Yes, the same can be said for all companies (some to a greater degree than others), but the subject here was Sony and the PS3. So what is your point?
 
Laa-Yosh said:
First, games that have HDR rendering through Cell will have to resort to full software rendering, leaving the RSX to sit idle, which is IMHO quite a stupid thing to do.
Actually I imagine you would use deferred shading for that.
I agree that wasting all the shader potential on the GPU side would be stupid, though, so you'd actually want to split the HDR workload between RSX and Cell, which could be an interesting avenue of future research.
And there are also forms of software AA that work with this method at reasonable cost.
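A rough sketch of the kind of split being suggested (the G-buffer layout and division of labour are assumptions for illustration, not a known PS3 pipeline):

```c
#include <stdint.h>

/* Hypothetical deferred split: RSX fills a G-buffer per pixel, and
   Cell's SPEs then do HDR lighting/tonemapping (plus software AA)
   as an image-space pass. Layout invented for illustration. */
struct gbuffer_texel {
    float    depth;
    int16_t  normal_xy[2];   /* packed view-space normal        */
    uint32_t albedo;         /* RGBA8 surface colour            */
    uint16_t material;       /* shininess, emissive index, etc. */
};

/* Per-frame flow, sketched in comments:
   1. RSX rasterises the geometry, writing a gbuffer_texel per
      pixel -- cheap shaders, no lighting yet.
   2. G-buffer tiles are DMA'd into SPE local store.
   3. Each SPE lights its tile in FP32, tonemaps, and runs an
      edge-aware filter for software AA.
   4. Final LDR tiles are DMA'd back out for display. */
```

The appeal of the deferred approach here is that the expensive HDR work becomes a streaming, tile-at-a-time job, which is the access pattern the SPEs are built for.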
 
Azrael said:
Sony hasn't exactly been the most honest company in the past when it comes to their products and demonstrations.

Don't be a troll; you could say that about Microsoft, Nintendo, or pretty much any company for that matter. All this is is a PR statement, nothing more. What are you expecting them to say, "The PS3 is crap, I can't believe we are developing for this dump!"?

Anyway this is far from on topic, so let's get back to it.
 
Fafalada said:
Actually I imagine you would use deferred shading for that.
I agree that wasting all the shader potential on GPU side would be stupid though - so you'd actually want to split the HDR workload between RSX and Cell, which could be an interesting avenue of future research.
And there are also forms of software AA that work with this method at reasonable cost.

Huh? What? And are you serious? You make it sound like some form of HDR and/or AA can be done with Cell's help. This has been debunked by the smartest people here. Can you please explain a little more?

Thanks.
 