ATI - PS3 is Unrefined

AlphaWolf said:
I suppose I should have said, dynamic branching performance.

Unless it's something they have seriously refined for RSX over G70, it's not going to be strong in this area compared to Xenos.

Ok then. :smile:
 
weaksauce said:
No I mean an early devkit.

Where is Xenos superior? Framebuffer bandwidth? Anything else?

John Carmack was annoyed with Sony because he didn't get devkits sooner and because they weren't complete.
"He knows from a technology horsepower standpoint that it'll do everything that we want it to do, so we're committed to it."
http://www.bdgamer.net/?itemid=19874

also:
http://www.joystiq.com/2005/09/26/doom-3-super-edition-in-the-works-for-ps3/
:LOL:
He only mentions 360 development in his most recent interview. That might be a sign that his next engine will run best on the 360's GPU, kind of like Nvidia vs. ATI with Doom 3. He also seems like the kind of person who would make use of advanced features like MEMEXPORT.

http://blogs.guardian.co.uk/games/archives/2005/12/27/a_brief_john_carmack_interview.html
 
TurnDragoZeroV2G said:
Much faster if we ask the other question as well: where is RSX, so far as NV has detailed it, superior?

Correct me if I'm wrong:
Better shading performance, better flop performance, a higher-clocked core, a third more transistors in the core. It's got twice the bandwidth to main RAM if you want to count the XDR memory, but then it hasn't got eDRAM, so they both have advantages over each other there.

robofunk, well, he doesn't really say he's not developing for PS3, so... Maybe you're right; on the other hand, though, he still knows how to code for OpenGL, so I don't think there's going to be a problem. :eek:

Let's look at it like this then:
PS3 - Epic
360 - id

:smile:
 
weaksauce said:
robofunk, well, he doesn't really say he's not developing for PS3, so... Maybe you're right; on the other hand, though, he still knows how to code for OpenGL, so I don't think there's going to be a problem. :eek:
I'm fairly certain his preference for OpenGL in the past was down to it being a superior development environment, plus portability. He's had nothing but praise for the Xbox 360 SDK, so he probably has no problem with it being DirectX, aside from it not being portable.
 
MasaC said:
I'm talking about games rendering two different angles and outputting each to its own video output, with both outputs feeding a pair of 3D glasses (obviously the game won't have to render at full 1080p). Since there are two separate video signals, there won't be any of the annoying flickering seen in previous attempts at 3D glasses using only one video signal.

The difference in angle should be the same as what the distance between our eyes gives us when we look at something; that is what gives us depth perception. You could do this on the PS3 by connecting a pair of 3D glasses to both HDMI outputs.
IIRC that's what Metal Gear Acid 2 is doing with PSP...
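For what it's worth, the camera setup being described is simple enough to sketch. This is purely illustrative Python, not anything from a Sony SDK, and the 64mm eye separation is only the commonly quoted human average:

```python
# Hypothetical sketch of a stereo camera setup: render once per eye, with the
# camera offset sideways by half the eye separation in each direction.
EYE_SEPARATION = 0.064  # ~64mm average interpupillary distance, in metres

def eye_positions(camera_pos, right_vector):
    """Return (left, right) eye positions offset along the camera's right axis."""
    half = EYE_SEPARATION / 2.0
    left = [c - half * r for c, r in zip(camera_pos, right_vector)]
    right = [c + half * r for c, r in zip(camera_pos, right_vector)]
    return left, right

# Example: camera at eye height, with +x as its right vector.
left_eye, right_eye = eye_positions([0.0, 1.7, 0.0], [1.0, 0.0, 0.0])
# Render the scene from left_eye to one HDMI output and right_eye to the other;
# with two separate signals there is no field-alternating flicker to fight.
```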
 
weaksauce said:
Correct me if I'm wrong:
Better shading performance,

What "shading performance," exactly?

better flop performance

Flops huh. ;) I always wanted a crack at that game.

Value your brain cells? Heh. And I am just an idiot when it comes to this.

Mm. G70 has a significantly higher peak than R520. R520: ~625MHz with 16 pipes, each with 1 full ALU and a mini-ALU, for a total of 12 flops per clock per pipe. Comes out around 120 GFLOPs, right? G70: two full ALUs per pipe and 24 pipes, for a total of either 16 or 27 flops per clock per pipe, depending on whether the FP16 normalize counts. Comes out around 165 GFLOPs at 430MHz, I think. That's just pixel shaders, of course. But when I see "performance," I naturally link that to what it actually gets: 165 vs. ~120 ('course, using NV's numbers from that old G70/RSX/Xenos chart doesn't help it here; that was made by NV, no?). Despite that huge difference, G70 doesn't gain much of a lead over R520 very often. Of course, one of its ALUs is often busy being borrowed for texture operations, so that's to be expected (conversely, how limited is ATI's mini-ALU? Is there anything besides ADD that it can do?). It certainly doesn't seem to get as big an advantage as its flops numbers would suggest (the taste in my mouth reminds me of a BBQ sandwich I once ate from KFC; man, I knew I should have put it down by the time the next morning came around).
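The arithmetic in that paragraph as a quick sketch, using the pipe counts and per-pipe flop rates assumed above (paper peaks, not measurements):

```python
# Peak pixel-shader GFLOPs = clock (MHz) * pipes * flops per pipe per clock.
def pixel_gflops(clock_mhz, pipes, flops_per_pipe):
    return clock_mhz * pipes * flops_per_pipe / 1000.0

# R520: 16 pipes, one full ALU plus a mini-ALU, taken as 12 flops/clock/pipe.
r520 = pixel_gflops(625, 16, 12)   # ~120 GFLOPs
# G70: 24 pipes, two full ALUs, taken as 16 flops/clock/pipe
# (27 if the FP16 normalize unit is counted).
g70 = pixel_gflops(430, 24, 16)    # ~165 GFLOPs
print(f"R520 ~{r520:.0f} GFLOPs, G70 ~{g70:.0f} GFLOPs (pixel shaders only)")
```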

If RSX ends up with the same pipeline structure as G70, with 24+8 pipes, then you'll get about 211 GFLOPs of pixel shading against Xenos' 240 minus vertex processing. If they're pushing roughly the same amount of geometry, then subtracting the equivalent of 8 vertex processors (then again, Huddy seems to say a lot that vertex processing isn't the limit in any but a few cases) brings Xenos down to about 200. An even smaller advantage than G70 has over R520. Gotta question how valuable the extra component is for pixel shading, however.
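And the same back-of-the-envelope accounting for this paragraph, assuming a G70-style RSX at 550MHz and 48 Xenos ALUs at 10 flops/clock (which is where the 240 figure comes from; these are assumptions, not confirmed specs):

```python
# RSX pixel shaders only: 24 pipes * 16 flops/clock at 550MHz.
rsx_pixel = 550 * 24 * 16 / 1000.0         # ~211 GFLOPs

# Xenos unified pool: 48 ALUs * 10 flops/clock at 500MHz...
xenos_total = 500 * 48 * 10 / 1000.0       # 240 GFLOPs
# ...minus the equivalent of 8 vertex units' worth of geometry work.
vertex_budget = 500 * 8 * 10 / 1000.0      # ~40 GFLOPs
xenos_pixel = xenos_total - vertex_budget  # ~200 GFLOPs left for pixel work
```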

a higher-clocked core

Considering how massively parallel GPUs are, that alone kinda seems inadequate for a comparison.

a third more transistors in the core.

~301M vs. 232M+20M (if the ROPs don't count, then we should be using a minus sign on RSX's number too, right?). But it does of course still have the numbers. Then again, Xenos is technically missing half of its ROPs (with an unknown trade-in value with regard to its fixed-function, but 4-sample, multisampling), not to mention it only needs scheduling logic for 3 SIMD arrays, with no second full ALU or mini-ALU to worry about, as opposed to 4 or 6 quads plus multiple vertex shaders in whatever their respective arrangements are. *shrug* Then add more registers and extra logic for the threads it keeps in flight. And so on, etc., etc. So, uh... tell me when you've added and subtracted all that. I'm not that hard-working. :oops:

It's got twice the bandwidth to main RAM if you want to count the XDR memory, but then it hasn't got eDRAM, so they both have advantages over each other there.

Yeah, so that's a big trade-off, but if Xenos' greater framebuffer bandwidth counts, then RSX's greater flexibility regarding its framebuffer bandwidth should certainly count as an advantage too. It brings up a rather good consideration as well: if a game doesn't need that 4 Gigapixels/16 Gigasamples per second of fillrate from Xenos, then there's a ton of bandwidth really going to waste there. But, heh.
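For reference, that quoted fillrate falls straight out of the eDRAM design as ATI has described it: 8 ROPs at 500MHz, each writing 4 multisamples per clock.

```python
# Xenos' quoted fillrate: 8 ROPs on the eDRAM die at 500MHz,
# each able to write 4 multisampled samples per clock.
rops, clock_mhz, samples_per_pixel = 8, 500, 4
gpixels = rops * clock_mhz / 1000.0      # 4.0 Gpixels/s
gsamples = gpixels * samples_per_pixel   # 16.0 Gsamples/s with 4x MSAA
```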
 
Sis said:
A) I've still not heard that Blu-ray movies will be encoded at 1080p. In fact, this would be an excellent avenue for doubling up a movie studio's money by selling a 720p version, then offering a quasi-Superbit 1080p version. However, I could be wrong, and 1080p may be the standard encoding.
Maybe, maybe not. No standard has been announced; speculation is on 1080i as the standard, progressing to 1080p later on. 1080i will almost certainly be utilised over 720p in the majority of cases, though (all authored movies I've heard of are 1080i or 1080p).

Sis said:
B) If a few games rendered at a higher resolution are enough to crown a system as "higher-def", then the original Xbox should be considered hi-def. But that would be silly.
PS3 has a lot more advantages than just a few games (or more than a few, who knows): a true high-definition video format, true high-definition output, and support for the highest current (consumer) resolution.
Just having Blu-ray makes it "higher-def" than the Xbox 360. That's a fact.

Sis said:
C) Claiming that you can output 2 1080p signals is disingenuous at best.
How so? The PS3 can output two 1080p signals at once. They're very sincere and upfront on that point.
 
boltneck said:
Heck, even John Carmack chose the Xenos platform over the RSX platform for future development.

I would never use that as an indication of the superiority of a chip, at least not if we're talking about performance. After all, he preferred the NV30 over the R300 for development just because it supported longer shaders.
 
Nicked said:
Maybe, maybe not. No standard has been announced; speculation is on 1080i as the standard, progressing to 1080p later on. 1080i will almost certainly be utilised over 720p in the majority of cases, though (all authored movies I've heard of are 1080i or 1080p).

In the case of a movie at 24fps: generally speaking, the video stream would be 50 or 60 fields per second (3:2 pulldown, in the 60Hz case) and would allow you to reconstruct the original progressive picture by pairing up the appropriate fields. So a 1080i movie should allow a good player to build and output a true 1080p picture, provided the source material is film.

I would hope this is also how a decent player would cope with 720p output: reconstruct the whole frame and scale it down, rather than scaling each field (which I would expect to exhibit some kind of flickering or shimmering).

The alternative would be to encode at 24fps, but if you're doing that, you'd want to be 1080-progressive anyway; interlacing would be a really bad idea at that rate.

Once/if movies move up to a faster shooting rate, as with video, then we might want to see full 1080/60 encoding.
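Here's a minimal sketch of that field pairing, assuming the standard 3:2 pulldown cadence for 24fps film in 60Hz land (purely illustrative code, not actual player logic):

```python
def pulldown_32(num_frames):
    """Map 24p film frames to 60i fields using the 3-2 cadence."""
    fields = []
    parity = 0
    for i in range(num_frames):
        for _ in range(3 if i % 2 == 0 else 2):  # 3,2,3,2,... fields per frame
            fields.append((i, parity))           # (source frame, top/bottom)
            parity ^= 1                          # fields alternate top/bottom
    return fields

fields = pulldown_32(24)
assert len(fields) == 60   # 24 film frames -> 60 interlaced fields per second
# Inverse telecine: every field from frame i carries the same film image,
# so pairing fields by source frame rebuilds the full progressive picture.
assert sorted({i for i, _ in fields}) == list(range(24))
```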

Having seen one of the new 1080p plasmas recently, I personally can't wait for a player capable of 1080p to match it. And then a win on the lottery so I can afford it all including a house big enough to put it in.
 
On Carmack: his choice of Xenos as his primary dev platform for now seems mostly related to tools and software environment, particularly the API. He's said himself that PS3 is more powerful from both a CPU and a "graphics operations" standpoint; just not as significantly so, from his POV, as others might consider (particularly those who place more importance on CPU power than Carmack evidently does).

TurnDragoZeroV2G said:
If RSX ends up with the same pipeline structure as G70, with 24+8 pipes, then you'll get about 211 GFLOPs of pixel shading against Xenos' 240 minus vertex processing.

8 G70 vertex pipes and 24 G70 pixel pipes at RSX's 550MHz comes out at ~255 GFLOPs. That doesn't count the mini-ALU in the pixel shaders or the FP16 normalise, on the positive side, or texture addressing, on the negative side. Xenos' count for 32-bit GFLOPs is closer to 210-220 IIRC. They're just paper figures, of course.

On the bandwidth point: if you put the framebuffer in VRAM, as I think will pretty much always be the case, RSX will always have more bandwidth for everything other than the framebuffer. Depending on CPU bandwidth usage, the difference could be quite significant, twice or more even.
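Rough numbers behind that, using the public bandwidth figures (25.6GB/s XDR and 22.4GB/s GDDR3 on PS3; one 22.4GB/s GDDR3 pool on 360, whose framebuffer traffic goes to eDRAM instead). The framebuffer traffic value is a made-up example, since the real figure depends on the game:

```python
XDR, GDDR3_PS3, GDDR3_360 = 25.6, 22.4, 22.4  # GB/s, public figures

fb_traffic = 10.0  # hypothetical GB/s eaten by the framebuffer in PS3's VRAM

# Bandwidth left over for textures, CPU data, and everything else:
ps3_rest = XDR + (GDDR3_PS3 - fb_traffic)  # ~38 GB/s in this example
x360_rest = GDDR3_360                      # 22.4 GB/s, shared with the CPU
print(ps3_rest / x360_rest)                # ~1.7x here; 2x or more once the
                                           # CPU bites into 360's single pool
```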

It's very difficult to make a comprehensive comparison between the two chips like this, though, particularly since we don't know everything about either chip, and any comparison you make will pit information gleaned from Xenos articles that came via ATI against wholly independent analysis of G70 tech (i.e. warts and all) from reviews etc. A couple of other advantages on RSX's side might be blending on FP16 buffers, and more cache (as suggested by Inis; they think a lack of cache on Xenos is what forced them to split into two shader passes what can be done in one on nVidia hardware, for example). Those are things you won't find out from Xenos articles!
 
boltneck said:
:rolleyes: Oh, you mean like an Athlon X2 of some kind and an SLI 7800GTX setup. Woweee...

Epic is and has been in the hip pocket of Nvidia for years. There is no reason at all for them to be choosing the RSX over what is a clearly superior Xenos chip, superior in all the things they have said are so important over the last couple of years.

It's just more blatant hypocrisy, bias, and truly misleading spin from Epic.

Heck, even John Carmack chose the Xenos platform over the RSX platform for future development.


The PS3 kit used a 2.4GHz Cell with a new card from Nvidia, which was never confirmed to be the 7800 GTX but was assumed to be.


Also, you should stop eating up ATI PR like this. Their knowledge of the RSX is as good as yours; they don't know, they're assuming.


Even Bill Gates has said that both consoles are Ferraris. Other articles say that the PS3 is a little faster in some areas and the 360 faster in others, based on looking at some of the consoles' specs. ATI already said that unified pipelines would put them ahead of the RSX; now they say their chip can outdo the RSX by a healthy margin, when even people inside MS have said the opposite.


Epic doesn't need to be a hypocrite; they have been very neutral until now. In fact, MS needs GoW 100 times more than Epic needs MS.


Also, that Carmack comment proves nothing, as he also preferred the Xbox as his platform of choice, just like Hideo prefers the PS3 and Itagaki prefers the Xbox 360. Nothing new.
 
One thought I would like to share: why didn't Sony ask for a variant combining elements of both the G70 and the G80, since the latter was due to release in mid-2006?
 
AlphaWolf said:
I suppose I should have said, dynamic branching performance.

Unless it's something they have seriously refined for RSX over G70, it's not going to be strong in this area compared to Xenos.

But only for fragment shaders. The more granular MIMD, dual-issue vertex shader units in G70/RSX should have better dynamic branching performance than the 'coarser' 32-issue (16-ALU) SIMD unified shader units in Xenos...
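A toy illustration of that granularity point, if it helps: whenever a SIMD batch of pixels disagrees on a branch, the whole batch pays for both paths, so wider batches waste more work. The batch sizes and 10% branch rate below are purely illustrative, not hardware specs (1 behaves like a per-vertex MIMD unit; 32 echoes the '32-issue' figure above):

```python
import random

def wasted_fraction(batch_size, p_branch, n_pixels=1 << 16):
    """Fraction of shader work wasted when mixed batches execute both paths."""
    random.seed(0)
    taken = [random.random() < p_branch for _ in range(n_pixels)]
    wasted = 0
    for start in range(0, n_pixels, batch_size):
        batch = taken[start:start + batch_size]
        if 0 < sum(batch) < len(batch):  # batch disagrees: run both sides
            wasted += len(batch)
    return wasted / n_pixels

for size in (1, 4, 16, 32, 64):
    print(size, round(wasted_fraction(size, p_branch=0.10), 3))
# Waste climbs with batch width: per-element granularity wastes nothing,
# while wide batches end up mixed (and paying double) almost everywhere.
```

Real branches are spatially coherent rather than random, which softens this considerably, but the trend is the point.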
 
mckmas8808 said:
HELLO McFly! You don't have to have a 1080p HDTV for the second screen. A small 15" flat screen or a regular SDTV will work.
No shit, I established that a few pages back. The discussion here centers around TWO 1080p displays. Developers aren't going to waste bandwidth on something as useless as a trivial display in 1080p. In short, we won't see 1080p PS3 games, period.
 
Alpha_Spartan said:
No shit, I established that a few pages back. The discussion here centers around TWO 1080p displays. Developers aren't going to waste bandwidth on something as useless as a trivial display in 1080p. In short, we won't see 1080p PS3 games, period.

Aren't you a bit opinionated over something you know nothing about...

How many people used the 720p option in Soul Calibur 2 (XBOX)? How many people used the 1080i option in GT4?

Why is it so hard to think that the option to view games in 1080p will be there, seeing how we've had HD resolution options on consoles for years already?

Saying "we won't see 1080p PS3 games, period" is a bit much. It won't be in every game - by far - but i fully expect the option to be there, especially just for PR reasons in 1st party games (Sony trying to do the my-dick-is-bigger-than-yours-thing and telling the devs to put the option there or else!!).
 
TurnDragoZeroV2G said:
What "shading performance," exactly?



Flops huh. ;) I always wanted a crack at that game.

Value your brain cells? Heh. And I am just an idiot when it comes to this.

Mm. G70 has a significantly higher peak than R520. R520: ~625MHz with 16 pipes, each with 1 full ALU and a mini-ALU, for a total of 12 flops per clock per pipe. Comes out around 120 GFLOPs, right? G70: two full ALUs per pipe and 24 pipes, for a total of either 16 or 27 flops per clock per pipe, depending on whether the FP16 normalize counts. Comes out around 165 GFLOPs at 430MHz, I think. That's just pixel shaders, of course. But when I see "performance," I naturally link that to what it actually gets: 165 vs. ~120 ('course, using NV's numbers from that old G70/RSX/Xenos chart doesn't help it here; that was made by NV, no?). Despite that huge difference, G70 doesn't gain much of a lead over R520 very often. Of course, one of its ALUs is often busy being borrowed for texture operations, so that's to be expected (conversely, how limited is ATI's mini-ALU? Is there anything besides ADD that it can do?). It certainly doesn't seem to get as big an advantage as its flops numbers would suggest (the taste in my mouth reminds me of a BBQ sandwich I once ate from KFC; man, I knew I should have put it down by the time the next morning came around).

7800GTX, 430MHz: ~199 GFLOPs, 32-bit programmable

X1800XT, 625MHz: ~170 GFLOPs, 32-bit programmable

No, with those numbers there is NO HUGE difference of the sort you suggest.

TurnDragoZeroV2G said:
If RSX ends up with the same pipeline structure as G70, with 24+8 pipes, then you'll get about 211 GFLOPs of pixel shading against Xenos' 240 minus vertex processing. If they're pushing roughly the same amount of geometry, then subtracting the equivalent of 8 vertex processors (then again, Huddy seems to say a lot that vertex processing isn't the limit in any but a few cases) brings Xenos down to about 200. An even smaller advantage than G70 has over R520. Gotta question how valuable the extra component is for pixel shading, however.

Xenos, 500MHz: ~216 GFLOPs, 32-bit programmable*

GTX512/RSX, 550MHz: ~255 GFLOPs, 32-bit programmable

* Not 240 GFLOPs, per a recent MS leak and another Japanese site: each of the 48 ALUs is rated at 9 flops/cycle.
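All four of those figures fall out of the same peak arithmetic, assuming 16 flops/clock for NV pixel pipes, 12 for ATI's full+mini pair, 10 for vertex units, and 9 per Xenos ALU as the leak claims:

```python
# Peak GFLOPs = clock (MHz) * units * flops per unit per clock / 1000.
def gflops(clock_mhz, units, flops_per_unit):
    return clock_mhz * units * flops_per_unit / 1000.0

gtx_430 = gflops(430, 24, 16) + gflops(430, 8, 10)  # ~199 (7800GTX)
x1800xt = gflops(625, 16, 12) + gflops(625, 8, 10)  # ~170
xenos   = gflops(500, 48, 9)                        # ~216 (9 flops/ALU/clock)
rsx_550 = gflops(550, 24, 16) + gflops(550, 8, 10)  # ~255 (GTX512/RSX clock)
```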
 
london-boy said:
Aren't you a bit opinionated over something you know nothing about...

How many people used the 720p option in Soul Calibur 2 (XBOX)? How many people used the 1080i option in GT4?

Why is it so hard to think that the option to view games in 1080p will be there, seeing how we've had HD resolution options on consoles for years already?
Okay, perhaps you're right and PS3 could support 1080p (with trivial graphics, of course, due to obvious bandwidth limitations); I just don't see people playing a game in 720p with the secondary display in 1080p.
Saying "we won't see 1080p PS3 games, period" is a bit much. It won't be in every game - by far - but i fully expect the option to be there, especially just for PR reasons in 1st party games (Sony trying to do the my-dick-is-bigger-than-yours-thing and telling the devs to put the option there or else!!).
I don't think it's "a bit much" at all. In fact, it's probably the only thing I'm damn certain about. Face it: 1080p came free with RSX's PC heritage; it's not as if the rest of the system is built around supporting it for anything other than movies (witness the 128-bit graphics memory bus). There is a recent Phil Harrison interview where he only mentions 1080p for Blu-ray movies. Then there is a Sony presentation, I believe, that shows dual HDMI for things like extended displays and secondary displays with trivial graphics.

I'm sure Sony could upscale PSX and PS2 games to 1080p just to say they have it, but all we'd get is Enhanced JaggyVision (TM). I don't believe they'll do that, though. The lion's share of PS3 titles will be in 720p/1080i. Don't hold your breath for MGS4 in 1080p. I'll continue to spout this until I'm blue in the face.
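For a rough sense of the bandwidth argument, here's the per-pass framebuffer cost at each resolution, assuming a common 32-bit colour + 32-bit Z layout (illustrative arithmetic only; real scenes touch these buffers many times per frame):

```python
def fb_gb_per_s(width, height, bytes_per_pixel=8, fps=60):
    """One full write pass over a colour+Z framebuffer, per second of 60fps."""
    return width * height * bytes_per_pixel * fps / 1e9

print(fb_gb_per_s(1280, 720))   # ~0.44 GB/s per pass at 720p
print(fb_gb_per_s(1920, 1080))  # ~1.0 GB/s per pass at 1080p; multiply by
                                # overdraw, blending and AA for the real cost
```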
 
Alpha_Spartan said:
Okay, perhaps you're right and PS3 could support 1080p (with trivial graphics, of course, due to obvious bandwidth limitations); I just don't see people playing a game in 720p with the secondary display in 1080p.

I don't think it's "a bit much" at all. In fact, it's probably the only thing I'm damn certain about. Face it: 1080p came free with RSX's PC heritage; it's not as if the rest of the system is built around supporting it for anything other than movies (witness the 128-bit graphics memory bus). There is a recent Phil Harrison interview where he only mentions 1080p for Blu-ray movies. Then there is a Sony presentation, I believe, that shows dual HDMI for things like extended displays and secondary displays with trivial graphics.

I'm sure Sony could upscale PSX and PS2 games to 1080p just to say they have it, but all we'd get is Enhanced JaggyVision (TM). I don't believe they'll do that, though. The lion's share of PS3 titles will be in 720p/1080i. Don't hold your breath for MGS4 in 1080p. I'll continue to spout this until I'm blue in the face.

How did you get the RSX White Papers?
 