MS: "Xbox 360 More Powerful than PS3"

Jawed said:
You'd never guess it from their Rep :cry:

Jawed

I've got to say, I dislike the whole rep system overall. Everyone here has a real 'reputation' linked to their names and based on the quality of their prior posts. For example, no one's looking at your or my rep ratings when they start reading our respective posts, Jawed; if there are preconceived notions as to the content of our posts, they exist beyond the rep bars.

When London-Boy can go from like seven green bars to three red bars overnight, you know the system's kind of bogus to begin with. ;)
 
DeanoC said:
It's got nothing to do with alpha or final or even the same hardware. It's simply that half-sized framebuffers are faster on all systems... Surely this is obvious if you know how GPUs work. It applies equally to Xenos, RSX, G70, G80, R520, etc.

Sorry if you've addressed this already, but I'm presuming this is the case for a G70-based development platform, is that so?
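To put rough numbers on the 'half-sized framebuffer' point, here's a back-of-the-envelope sketch in C. The 720p target and 4x multisampling are illustrative assumptions, not figures from the thread:

#include <stdio.h>

int main(void)
{
    /* Illustrative render target: 720p with assumed 4x multisampling */
    const long samples = 1280L * 720L * 4L;

    const long rgba8_bytes = samples * 4;  /* INT8 RGBA: 4 bytes per sample */
    const long fp16_bytes  = samples * 8;  /* FP16 RGBA: 8 bytes per sample */

    printf("RGBA8 colour buffer: %ld MB\n", rgba8_bytes >> 20);  /* ~14 MB */
    printf("FP16  colour buffer: %ld MB\n", fp16_bytes >> 20);   /* ~28 MB */

    /* Blending, resolves and post-processing all touch these bytes, so
     * halving the per-sample size roughly halves colour-buffer bandwidth
     * no matter which GPU is doing the work. */
    return 0;
}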
 
DeanoC said:
We don't know yet, as we still have lots of things to explore. We've got a speed increase with this, so we can have more AA than before.
We are now running at 720p with a 'good' amount of AA ('cause we believe this will be the normal way the game is played). Just to clarify the 1080p quote, which also seems to have been misunderstood: the game will support and run well at all resolutions Sony say we must support, but we believe most people will be seeing it at 720p, so that's the res we use mainly in development.

Hmm, does that mean the user could select the resolution at which the game renders? Or will the game software somehow detect what kind of display is connected?

Similar to the way X360 users can select the output resolution?

So you would select the render resolution in the game, but possibly also use the PS3's system browser to select the output resolution?

Console games haven't let you select rendering settings the way PC games do. Maybe that becomes necessary with all the display choices becoming available? (It looks like there are several 1080p models from different brands, usually at the top of their lines, but the bread and butter are still 720p DLPs and close-to-720p plasmas and LCDs. By next year, at least in the US, 1080p displays may account for more than 50% of available models.)
 
BenQ said:
Deano, you shouldn't answer Nfactor's questions. He's taking your words and going off to another forum and bashing you and the PS3 with them. Just a little heads-up.
I find that very funny, considering Nfactor is a big fan of Sony and the PS3.
Btw, hey Nfactor (SW.com)

So DeanoC,
after reading a lot of this thread, I was just wondering how you feel your HDR method compares to the likes of Half-Life 2's or PGR3's HDR systems?
 
DeanoC said:
Well, to be fair, it's not the holy grail; it has some issues, as you'd expect. It's not magic, it's simply trading shader instructions for bandwidth. You have to do a colour space conversion, which costs you shader instructions. Also, blending becomes more complex (LERP_RGB != LERP_NAO32). In a way, that's what the talk about programmable shaders has really been about: us devs finding our own solutions to the problems rather than whatever solution the hardware vendors think up. They solved the problem of HDR with a bigger buffer and FP16 hardware; Marco solved it with shader instructions and the existing INT8 hardware.

As for not talking... I have enough restrictions due to real NDAs... it seems madness that things that aren't covered by NDAs should be restricted because a few people have agendas to fill. Software is what's interesting... hardware is meh...

Isn't this what ATI was doing on their SM2.0 hardware? Games like Half-Life 2: Lost Coast used this, right?
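To make the trade-off DeanoC describes concrete, here is a minimal C sketch of a LogLuv-style packing (log-space luminance plus chromaticity in four INT8 channels), the general family his description fits. The matrix, ranges, and channel layout are illustrative assumptions; this is not NAO32 itself:

#include <math.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b, a; } rgba8;

/* Pack linear HDR RGB into four 8-bit channels: chromaticity is bounded,
 * so one byte per axis is enough; luminance goes to log space and is
 * split over two bytes for roughly 16-bit precision. */
static rgba8 encode_logluv(float r, float g, float b)
{
    /* Linear sRGB -> CIE XYZ (D65) */
    float X = 0.4124f * r + 0.3576f * g + 0.1805f * b;
    float Y = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    float Z = 0.0193f * r + 0.1192f * g + 0.9505f * b;

    /* CIE u'v' chromaticity, both well under 0.62 for sRGB inputs */
    float d = X + 15.0f * Y + 3.0f * Z + 1e-6f;
    float u = 4.0f * X / d;
    float v = 9.0f * Y / d;

    /* Log2 luminance over an assumed [2^-16, 2^16) range */
    float Le = (log2f(fmaxf(Y, 1e-6f)) + 16.0f) * 2048.0f;
    uint16_t L16 = (uint16_t)fminf(fmaxf(Le, 0.0f), 65535.0f);

    rgba8 out = {
        (uint8_t)(L16 >> 8),                 /* log-luma, high byte */
        (uint8_t)(L16 & 0xFF),               /* log-luma, low byte  */
        (uint8_t)fminf(u * 410.0f, 255.0f),  /* u', Ward's scale    */
        (uint8_t)fminf(v * 410.0f, 255.0f),  /* v'                  */
    };
    return out;
}

Decoding inverts each step (reassemble the log-luma bytes, exp2, then u'v' back through XYZ to RGB), which is where the extra shader instructions go; and because the encoding is non-linear, lerp(encode(a), encode(b)) != encode(lerp(a, b)), which is exactly the LERP_RGB != LERP_NAO32 blending caveat.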
 
jvd said:
Isn't this what ATI was doing on their SM2.0 hardware? Games like Half-Life 2: Lost Coast used this, right?
I can try to answer you... if someone points me to some real explanation of what Valve is doing :)
Regarding what ATI proposed in the past about blending, the answer is yes/no... and regarding what ATI is suggesting in their SDK about HDR rendering, the answer is no, we are not using any of the methods they (well, Humus, I suppose... ;)) evaluated/proposed.
 
nAo said:
I can try to answer you... if someone points me to some real explanation of what Valve is doing :)
Regarding what ATI proposed in the past about blending, the answer is yes/no... and regarding what ATI is suggesting in their SDK about HDR rendering, the answer is no, we are not using any of the methods they (well, Humus, I suppose... ;)) evaluated/proposed.

Okay, I'm going to ask a question, and it's straight to the point. Is your NAO32 method better than Humus' method at creating an HDR effect? I've seen his HDR post and it's amazing.
 
mckmas8808 said:
Okay, I'm going to ask a question, and it's straight to the point. Is your NAO32 method better than Humus' method at creating an HDR effect? I've seen his HDR post and it's amazing.
nice try ;)
 
zidane1strife said:
Heh, but it likely is, or you'd probably have used his ;)
Humus' is INT16. It would still have twice the framebuffer bandwidth, etc.; as such, it's not comparable.

Uttar
 
DeanoC said:
Well, to be fair, it's not the holy grail; it has some issues, as you'd expect. It's not magic, it's simply trading shader instructions for bandwidth. You have to do a colour space conversion, which costs you shader instructions. Also, blending becomes more complex (LERP_RGB != LERP_NAO32). In a way, that's what the talk about programmable shaders has really been about: us devs finding our own solutions to the problems rather than whatever solution the hardware vendors think up. They solved the problem of HDR with a bigger buffer and FP16 hardware; Marco solved it with shader instructions and the existing INT8 hardware.

As for not talking... I have enough restrictions due to real NDAs... it seems madness that things that aren't covered by NDAs should be restricted because a few people have agendas to fill. Software is what's interesting... hardware is meh...

Brings to mind the Jaguar's 16-bit CRY mode for graphics - it gave '24-bit' colour in 16 bits, but Gouraud shading was a bit strange at times :)
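For those who never saw the Jaguar: CRY split a 16-bit pixel into a hue index and an intensity byte. A rough C sketch of the idea, with the layout recalled from memory and so best treated as an assumption:

#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb8;

/* Hypothetical 16x16 table of saturated hues, filled in elsewhere */
extern const rgb8 cry_hue_table[16][16];

static rgb8 cry_decode(uint16_t pixel)
{
    uint8_t c = (pixel >> 12) & 0xF;  /* 4-bit cyan axis  */
    uint8_t r = (pixel >> 8) & 0xF;   /* 4-bit red axis   */
    uint8_t y = pixel & 0xFF;         /* 8-bit intensity  */

    rgb8 hue = cry_hue_table[c][r];
    rgb8 out = {
        (uint8_t)((hue.r * y) / 255),
        (uint8_t)((hue.g * y) / 255),
        (uint8_t)((hue.b * y) / 255),
    };
    return out;  /* '24-bit' colour from 16 bits of storage */
}

Shading that interpolates only Y is cheap and smooth; interpolating the C/R indices steps through discrete table entries, hence the occasionally strange Gouraud results.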
 
Mintmaster said:
But, as I always say on these forums, it will always come down to developer skills. I think Sony has the upper hand there, and if any game looks substantially better on one platform than the other, good coding will be the reason, not hardware.


Hmm, I think most multi-platform games looked as good or better on Xbox than on PS2.

So if multi-platform games look better on PS3 than on Xbox 360, it stands to reason it won't be because of Sony's developers. We have proof of that from last gen.

Anyway, this MS statement does mean something to me. Bash MS all you want, but they're not given to talking about system power idly.

I only think they might be basing this on raw numbers, which unfortunately don't mean a lot. That's the problem I see with it.
 
Alstrong said:
Is it a technique you wouldn't mind discussing at a conference? :)

(Thought I'd ask directly this time ;) )
Sincerely, I don't think it deserves a conference, 'cause I haven't invented anything new; a lot of other people did all the work I needed over the past decades :)
Maybe I could write a small report and publish it somewhere on some website.
 
nAo said:
Sincerely, I don't think it deserves a conference, 'cause I haven't invented anything new; a lot of other people did all the work I needed over the past decades :)
Maybe I could write a small report and publish it somewhere on some website.

It would be nice if Beyond3D were one of those "some websites".
 
nAo said:
Sincerely, I don't think it deserves a conference, 'cause I haven't invented anything new; a lot of other people did all the work I needed over the past decades :)
Maybe I could write a small report and publish it somewhere on some website.
We need a console site, Dave. We really do.

Other than that, yeah, nAo, B3D is the site you're looking for!

PS: Don't forget to add me to credits! :p
 
Uttar said:
Humus' is INT16. It would still have twice the framebuffer bandwidth, etc.; as such, it's not comparable.

Uttar
The main point of Humus' demo was not the HDR rendering itself, nor the framebuffer used. It was the fact that source textures don't have to use FP16.

He used a single-channel INT16 texture for luminance and a DXT texture for colour. This allowed him to get 'good enough' HDR texture filtering on hardware without FP16 filtering; it used a lot less space and ran faster thanks to the lower bandwidth. Precisely the kind of thing you'd use on RSX, despite its ability to filter a four-channel FP16 texture.
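A minimal sketch of that reconstruction, assuming the split described above; the sampler helpers and the luminance scale are hypothetical:

#include <stdint.h>

typedef struct { float r, g, b; } rgbf;

/* Hypothetical fetches: a filtered DXT colour in [0,1] and a raw
 * single-channel 16-bit luminance sample */
extern rgbf     sample_dxt_colour(float s, float t);
extern uint16_t sample_luminance16(float s, float t);

static rgbf sample_hdr(float s, float t)
{
    rgbf c = sample_dxt_colour(s, t);

    /* Map the INT16 luminance to an assumed [0, 64) HDR range */
    float L = (float)sample_luminance16(s, t) * (64.0f / 65535.0f);

    /* HDR texel = cheap LDR chroma * wide-range luminance, with no
     * FP16 storage or FP16 filtering needed anywhere */
    rgbf out = { c.r * L, c.g * L, c.b * L };
    return out;
}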
 
Devs make a rare appearance and suddenly every layman becomes a friggin' expert. No wonder such posts have all but disappeared.
 