MS: "Xbox 360 More Powerful than PS3"

emacs said:
Whenever I see the word "magnitude" used in this context I think of a logarithmic scale. As such, I don't think there will be a ten (10) or a one hundred (100) fold increase in power. Of course, defining "power" is often left to the marketing/PR departments.
A few weeks ago I heard a "commoner" say that the PS3 was supposed to be 15 times more powerful than the Xbox 360. That's the Sony hype machine at work.
 
Perhaps the PS3 isn't powerful enough to handle FP16 with good framerates? Anyway, trade-offs like that are commonplace in game development. It's always a balance of quality versus performance.

Deano, when are more screenshots going to come out? I wanna see more of your game. Are you taking any cues from
 
I said this before when Sony announced 1080p support: it was merely a bullet point, not anything I expect to be realized in games. Look at it reasonably: if high-def gaming is something a small fraction of the market can enjoy, 1080p support in games will be something a fraction of a fraction can enjoy. Meaning I don't expect any console developers to support anything above 720p or 1080i. There's no need for it and it's a waste of time/money.
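For a sense of scale, the pixel arithmetic behind that argument is straightforward (plain resolution maths, nothing console-specific):

Code:
// Simple pixel-count arithmetic: how much extra fill and shading work
// 1080p asks for relative to 720p, and relative to one 1080i field.
#include <cstdio>

int main() {
    const double p720  = 1280.0 * 720.0;   // 921,600 pixels per frame
    const double p1080 = 1920.0 * 1080.0;  // 2,073,600 pixels per frame
    const double i1080 = 1920.0 * 540.0;   // one interlaced field of 1080i

    std::printf("1080p / 720p  = %.2fx the pixels\n", p1080 / p720);   // 2.25x
    std::printf("1080p / 1080i = %.2fx per field\n",  p1080 / i1080);  // 2.00x
}

Roughly 2.25x the pixels per frame versus 720p, which is fill rate, shading and bandwidth that could otherwise go on effects.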
 
expletive said:
Apologies in advance for this n00b question:
What is INT8? Is this something that would be available on Rev and 360 as well?

<SNIP>

(I'm trying to think of the roadmap of 1080p displays as well, not just the consoles, and wondering when there will be a "critical mass" of content for them. AFAIK, most of today's 1080p displays only accept a 1080i signal anyway.)
We still haven't managed to buy a 1080p display in the UK yet... so at the mo, having to use fewer effects to support a TV standard we can't even buy feels a bit stupid. But if it's required we can do it; obviously we would not be able to use the same level of effects as at a lower resolution.

As Phil said, INT8 is the ol' fashioned 32-bit framebuffer. Basically the method Marco is using can be done on any machine capable of running SM2 or above pixel shaders... (possibly it might work on even lower machines with some trickery...)
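For anyone wondering what "HDR in an INT8 framebuffer" can look like in practice, here's a minimal sketch of the general idea: instead of raw RGB, store a log-encoded luminance plus chroma in the four 8-bit channels. To be clear, this is only an illustration of that family of techniques; the exact colour space and constants Marco uses aren't spelled out in this thread, so the 32-stop log window and the crude chroma packing below are assumptions made up for the example.

Code:
// Illustrative only - NOT Ninja Theory's shader code. Packs an HDR colour
// into a 32-bit RGBA8 value as: 16 bits of log2(luminance) + 2 x 8-bit chroma.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

struct RGB { float r, g, b; };

// Rec.709-style luminance weights (illustrative constants).
static float luminance(const RGB& c) {
    return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
}

static uint8_t toByte(float x) { // clamp to [0,1] and quantise to 8 bits
    return static_cast<uint8_t>(std::lround(std::clamp(x, 0.0f, 1.0f) * 255.0f));
}

// Encode: bytes 0-1 hold log-luminance (high/low), bytes 2-3 hold chroma.
static void encode(const RGB& c, uint8_t out[4]) {
    const float Y = std::max(luminance(c), 1e-6f);
    // Map log2(Y) from an assumed [-16,+16] stop window into [0,1],
    // then spread it over 16 bits so banding stays tiny.
    const float logY = std::clamp((std::log2(Y) + 16.0f) / 32.0f, 0.0f, 1.0f);
    const uint32_t Le = static_cast<uint32_t>(logY * 65535.0f);
    out[0] = static_cast<uint8_t>(Le >> 8);
    out[1] = static_cast<uint8_t>(Le & 0xFFu);
    out[2] = toByte(c.r / (Y * 4.0f)); // crude chroma; clamps for very saturated colours
    out[3] = toByte(c.g / (Y * 4.0f));
}

static RGB decode(const uint8_t in[4]) {
    const float logY = (in[0] * 256.0f + in[1]) / 65535.0f;
    const float Y    = std::exp2(logY * 32.0f - 16.0f);
    RGB c;
    c.r = (in[2] / 255.0f) * 4.0f * Y;
    c.g = (in[3] / 255.0f) * 4.0f * Y;
    // Recover blue from the luminance equation used in encode().
    c.b = (Y - 0.2126f * c.r - 0.7152f * c.g) / 0.0722f;
    return c;
}

int main() {
    const RGB hdr{12.0f, 6.0f, 1.5f}; // a bright value well above 1.0
    uint8_t packed[4];
    encode(hdr, packed);
    const RGB back = decode(packed);
    std::printf("in : %.3f %.3f %.3f\nout: %.3f %.3f %.3f\n",
                hdr.r, hdr.g, hdr.b, back.r, back.g, back.b);
}

The point is just that a plain 32-bit target can carry far more luminance range than "one 8-bit value per channel" suggests, which is why this kind of approach runs on any SM2-class part.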
 
Qroach said:
Perhaps the PS3 isn't powerful enough to handle FP16 with good framerates? Anyway, trade-offs like that are commonplace in game development. It's always a balance of quality versus performance.

Unless Deano wants to elaborate on performance pre-precision switch - which would have been on non-final hardware to boot, I'm guessing - we can't really infer anything about whether or not FP16 would have been usable for Heavenly Sword from his comments thus far. The switch allows things to be faster (that doesn't mean it was too slow to start with) and there was a good non-framerate-related reason given for them switching too - MSAA. In other words, the motivation was not necessarily unacceptable performance (but rather the chance of better performance and IQ). Devs have that choice on PS3, I guess. Maybe this is also a hint at whether MSAA with HDR will be a feature in Nvidia's 90nm refreshes as rumoured (which RSX is said to be tied to, but then again, maybe they're independent).

As for 1080p, I think Sony's major reason for including that option is that Blu-ray movies will be 1080p. They have to provide for it. Sure, they showed trailers in 1080p at E3, but I wouldn't consider that a reflection of the resolution distribution in the final products.
 
Alpha_Spartan said:
A few weeks ago I heard a "commoner" say that the PS3 was supposed to be 15 times more powerful than the Xbox 360. That's the Sony hype machine at work.

That was from those brilliant "news" reporters at E3.

I saw it myself on tv and was like... :rolleyes:

;)
 
aaronspink said:
Quick question: anyone know if the FlexIO interface is serial or parallel? How about the Xenon-Xenos interface?


I'm not sure this will answer your question. The FlexIO within the PS3 uses unidirectional signal channels, meaning the read and write channels are separated. This in theory provides lower latency and less chance for error between reads and writes. As far as I know, the Xbox 360 still uses the common bidirectional bus system found in today's PCI Express-based PCs.
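To illustrate why separated directions can matter, here's a toy model. All of the numbers in it (bus width, turnaround cost, traffic mix) are made up for the example and are not FlexIO or Xbox 360 FSB figures, and it ignores pin-count differences; the point is only that a shared bidirectional bus pays a turnaround penalty every time the transfer direction flips, while a unidirectional pair lets the two directions run side by side.

Code:
// Toy bus model with invented numbers - not real FlexIO or 360 timings.
#include <algorithm>
#include <cstdio>

int main() {
    const double bytesPerCycle  = 8.0;    // hypothetical 64-bit data path
    const double turnaroundCyc  = 4.0;    // hypothetical cost to reverse the bus
    const double readBytes      = 4096.0; // traffic in one slice of work
    const double writeBytes     = 4096.0;
    const int    directionFlips = 64;     // how often reads and writes interleave

    // Shared bidirectional bus: reads and writes serialise, plus turnaround time.
    const double sharedCycles = (readBytes + writeBytes) / bytesPerCycle
                              + directionFlips * turnaroundCyc;

    // Dedicated unidirectional channels: no turnaround, directions overlap,
    // so the finish time is set by whichever direction carries more traffic.
    const double splitCycles = std::max(readBytes, writeBytes) / bytesPerCycle;

    std::printf("shared bidirectional bus : %.0f cycles\n", sharedCycles);
    std::printf("unidirectional pair      : %.0f cycles\n", splitCycles);
}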
 
ROG27 said:
Ahhh....before you jump to conclusions, my fine, furry friend, you should realize that once again I have not implied whatsoever that the Unreal 3 engine in any way, shape, or form is pushing the PS3 hardware...while, contrary to popular belief, the 360 is chugging along at a sub-par framerate with loads more time for optimization. The reason I mentioned GoW being able to run flawlessly on the PS3 is that Expletive tried to argue that things on 360 will match the quality of things we will see on the PS3. I'm pointing out that the Unreal 3 engine is not the pinnacle of what is achievable, but rather the bare minimum of what we should see in the next gen (a multi-platform/unoptimized engine). I think games like MGS4, Motorstorm and Killzone will surprise you when they come out because of how close they come to their projected targets. This, in itself, is no insignificant leap from what is "here and now" on the XBOX 360.

You can tap-dance around the issue all day long if you want. The fact is the best confirmed realtime stuff for both systems is completely and totally comparable.

On one hand you have stuff like Mass Effect, Gears of War, Too Human, and Chromehounds, on the other you have stuff like MGS4, Mobile Suit Gundam and Heavenly Sword.

If you really think that the best realtime stuff we've seen so far is not pretty much on par, I would suggest that's your own BIAS.

KZ was CG; the mere fact you would even mention it speaks volumes to me about your level of objective reasoning.
 
[Mounts high horse] Why don't all you guys wait until CES (January) and the Sony press conference (February) to form an opinion on which system is providing better graphics? All this console penis envy is getting real old... real old. [Mounts high horse]
 
nao,

What kind of trade offs are you talking about?

Quality versus speed. In this case quality = precision; however, adding AA is certainly going to improve the visual quality in a different way. To be honest, I think FP16 will be a waste for most games to use, considering how the overwhelming majority of fixed-pixel LCD televisions can't display the same range of colors as a plasma or tube TV.


Titanio,

The switch allows things to be faster (that doesn't mean it was too slow to start with) and there was a good non-framerate-related reason given for them switching too - MSAA.

I don't think you need to stretch what was said to such lengths. Deano did say it's faster and they can add MSAA, so to me that sounds like something that wasn't possible before the switch. Not a big deal really, but still an interesting decision.

As for 1080p, I think Sony's major reason for including that option is that Blu-ray movies will be 1080p. They have to provide for it. Sure, they showed trailers in 1080p at E3, but I wouldn't consider that a reflection of the resolution distribution in the final products.

Not necessarily. Sony was touting 1080p for "games", which is why many videos of games at E3 were made at 1080 resolution. They said "look, our games run in 1080p" compared to our competition. It was a bigger-dick contest; it's pretty obvious, anyone can see that. I really don't expect developers to support a feature that can't be used by the majority of the market. It's a waste of their time.
 
Deano, are you guys releasing any new screenshots soon? I'm dying to hear some more technical details about the rendering as well.

This is currently the only must-have game on my list for PS3. I've been interested in it since I first heard you guys were working on it. :)
 
Titanio said:
Unless Deano wants to elaborate on performance pre-precision switch - which would have been on non-final hardware to boot, I'm guessing - we can't really infer anything about whether or not FP16 would have been usable for Heavenly Sword from his comments thus far.
Well, hopefully it's fairly obvious that 32-bit integer framebuffers are faster than 64-bit ones. So why wouldn't you use 32-bit integer framebuffers if the quality is the same?

There's no real loss of quality... float RGB space doesn't make much sense for HDR, so we don't use it.

Bet NVIDIA and ATI are pissed about wasting all that silicon for FP16 stuff :devilish:
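For anyone who wants the back-of-the-envelope numbers behind "faster", here's the colour-buffer footprint at 720p; simple arithmetic only, not measured figures from either console:

Code:
// Colour-buffer footprint at 720p: 32-bit (INT8 per channel) vs 64-bit (FP16
// per channel), with and without 4x multisampling. Pure arithmetic.
#include <cstdio>

int main() {
    const double pixels = 1280.0 * 720.0;   // 720p
    const double MB     = 1024.0 * 1024.0;

    std::printf("INT8  (4 B/px)          : %5.1f MB\n", pixels * 4.0       / MB);
    std::printf("FP16  (8 B/px)          : %5.1f MB\n", pixels * 8.0       / MB);
    std::printf("INT8  (4 B/px) + 4xMSAA : %5.1f MB\n", pixels * 4.0 * 4.0 / MB);
    std::printf("FP16  (8 B/px) + 4xMSAA : %5.1f MB\n", pixels * 8.0 * 4.0 / MB);
}

Every blended or multisampled pixel also moves twice the bytes with FP16, so halving the footprint roughly halves the colour-buffer bandwidth as well.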
 
Qroach said:
Deano, are you guys releasing any new screenshots soon? I'm dying to hear some more technical details about the rendering as well.

This is currently the only must-have game on my list for PS3. I've been interested in it since I first heard you guys were working on it. :)
Don't ask me, Sony control every little bit of info released...
 
Qroach said:
I don't think you need to stretch what was said to such lengths. Deano did say it's faster and they can add MSAA, so to me that sounds like something that wasn't possible before the switch.

Sure, but the fact that what's possible with their current implementation wasn't possible before doesn't mean that what they were using before was unusable ;)

Qroach said:
Not necessarily. Sony was touting 1080p for "games", which is why many videos of games at E3 were made at 1080 resolution. They said "look, our games run in 1080p" compared to our competition. It was a bigger-dick contest; it's pretty obvious, anyone can see that. I really don't expect developers to support a feature that can't be used by the majority of the market. It's a waste of their time.

I agree it won't be used in the vast majority of cases.

DeanoC said:
Well, hopefully it's fairly obvious that 32-bit integer framebuffers are faster than 64-bit ones.

Of course, but faster doesn't imply it was too slow with a 64-bit buffer. Unless you're saying that now? :)

DeanoC said:
So why wouldn't you use 32-bit integer framebuffers if the quality is the same?

There's no real loss of quality... float RGB space doesn't make much sense for HDR, so we don't use it.

Whilst this is your experience with HS so far, do you think you could never construct scenes that would challenge that? Would certain scenes be easier to handle with higher precision? I remember you talking before about even the limitations of FP16 with your cloud renderer - is it the case that things are mostly as they were before, but not entirely? Are there literally no compromises? It's something I've wondered since the debate over higher precision erupted.
 
Maybe in the later part of the PS3's life (say 2009+), when 1080p TVs become widely available, we will see flagship games such as Gran Turismo supporting 1080p.
 
Deepak said:
Maybe in the later part of the PS3's life (say 2009+), when 1080p TVs become widely available, we will see flagship games such as Gran Turismo supporting 1080p.
I would expect support much sooner from that franchise. They already support 1080i.
 
Titanio said:
Of course, but faster doesn't imply it was too slow with a 64-bit buffer. Unless you're saying that now? :)
Well anything slower than the fastest IS too slow ;-)

Titanio said:
Whilst this is your experience with HS so far, do you think you could never construct scenes that would challenge that? Would certain scenes be easier to handle with higher precision? I remember you talking before about even the limitations of FP16 with your cloud renderer - is it the case that things are mostly as they were before, but not entirely? Are there literally no compromises? It's something I've wondered since the debate over higher precision erupted.
Of course it's always possible to need more dynamic range than you have, but then it's hard to make decent art with too wide a range of lighting conditions anyway. We will have roughly the same usable range with FP16 or Marco's INT8 colour space; if we ever needed FP32 HDR I'd consider it an engine/art failure.
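As a rough sanity check on "roughly the same usable range", here's the arithmetic comparing FP16's luminance range with a log-encoded luminance packed into an INT8 target. The 32-stop window for the log encoding is an assumed design choice for the example, not a figure from Deano or Marco:

Code:
// Dynamic-range arithmetic only, no rendering involved.
#include <cmath>
#include <cstdio>

int main() {
    // FP16: largest finite value is 65504, smallest positive normal is 2^-14.
    const double fp16Stops = std::log2(65504.0 / std::pow(2.0, -14.0));

    // Hypothetical log-luminance encoding covering a fixed 32-stop window.
    const double windowStops   = 32.0;
    const double stepPer8bits  = windowStops / 255.0;   // one 8-bit channel
    const double stepPer16bits = windowStops / 65535.0; // two 8-bit channels

    std::printf("FP16 usable range         : ~%.0f stops\n", fp16Stops);
    std::printf("8-bit log luminance step  : %.3f stops per code\n", stepPer8bits);
    std::printf("16-bit log luminance step : %.5f stops per code\n", stepPer16bits);
}

So a log-encoded INT8 buffer can cover a window comparable to FP16's roughly 30 usable stops; the trade-offs tend to show up in step size and in operations like blending, rather than in raw range.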
 
jvd said:
Take a look at RE4, it easily stands up to Doom 3. The lighting is great in the game.


Although it's really a waste of time talking to you.

I felt that the Doom 3 port to Xbox was abysmal; it was about on par graphically with the on-foot sections of Rebel Strike (which had some nice lighting and bump mapping going on for GameCube, and were about what the PC version of Doom 3 was degraded to on Xbox; wasn't the Xbox Doom 3 limited to 2 light sources at a time?).

BTW, people keep referring to the MGS4 demo as having awesome graphics... have you guys seen more of it than I have, or at least a direct feed? It doesn't look so good from what I saw of it; I don't even recall it having bump mapping.
 