ATI - PS3 is Unrefined

Vysez said:
The early PS3 Devkits had lower-clocked DD2 Cells, and SLI'd (or not) 6800Us.
The part about lower clocks is right, but I'd love to know where the talk about SLI comes from. There's never been a CEB model with SLI. It's always been a single 4x PCIe slot across a bandwidth-restricting southbridge.

Vysez said:
Which ones? The only three demos that were running, live, with the PS3 Devkit were the Duck demo, the UE2K7 and the Fight Night ones.
That ain't true... the Alfred Molina face demo and the exploding gas station one were both definitely running in realtime, being controlled by Phil, as our group in SCEE wrote 'em.

Dean
 
one said:
Have you watched the pre-E3 Sony conference? If you don't think Tim Sweeney is a liar...
Errr, I wasn't talking about Tim, I was asking about Vy's assertion stating "There were PS3 Devkits sent to developers months prior to E3. The early PS3 Devkits had lower-clocked DD2 Cells, and SLI'd (or not) 6800Us."
 
From my point of view, I BET:


Early life cycle (first 6 months of each console):

X360
CPU:
50% of devs use 1 PPE; the other 50% use 2 or 3 PPEs
5% use all 6 hardware threads (DOA4)
GPU:
no tiled rendering, no render tag, no procedural textures, no memexport, no core locking
AA titles take a performance hit of 35-45% (no eDRAM AA); maybe HDR+AA produces a 50-55% hit @ 720p (see the tiling arithmetic after this post)

PS3
CPU:
90% of devs use 1 PPE; the other 10% use some SPEs for physics (no vertex computing)
GPU:
devs use RSX as a G70

Situation:
CPU: PS3 similar to X360
GPU: PS3 similar to X360


Second-gen titles (6 to 18 months into the life cycle):

X360
CPU:
30% of devs use 1 PPE; the other 70% use 2 or 3 PPEs
40% use all 6 hardware threads, for AI and other things
GPU:
yes tiled rendering, no render tag, yes procedural textures, no memexport, yes core locking
Some titles start to use global illumination (Blue Dragon)

PS3
CPU:
70% of devs use 1 PPE; another 20% use some SPEs for physics, and 10% use SPEs for vertex computing
GPU:
90% of devs use RSX as a G70; 10% use Cell to help with vertex calc

Situation:
CPU: PS3 has better physics sim than X360; X360 can have better AI
GPU: X360 has better antialiased, cleaner graphics than PS3, which shows a few more polygons


End of life cycle (after 36 months):

X360
CPU:
10% of devs use 1 PPE; the other 90% use 2 or 3 PPEs
80% use all 6 hardware threads
GPU:
yes tiled rendering, yes render tag, yes procedural textures, yes memexport, yes core locking, yes tessellator

Global illumination is widespread in games

PS3
CPU:
10% of devs use 1 PPE; the other 90% use SPEs for physics AND vertex computing
GPU:
+ vertex computation from Cell
+ post effects from Cell

Situation:
CPU: PS3 better than X360
GPU: X360 better than PS3
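
As a rough sanity check on the eDRAM/AA figures in the bet above, here's a back-of-envelope sketch (my addition, assuming the commonly cited Xenos layout of 4 bytes of color plus 4 bytes of depth per sample): 720p fits the 10MB of eDRAM in a single tile without AA, but 2xAA and 4xAA force 2 and 3 tiles, and every extra tile means re-submitting the geometry that crosses tile boundaries, which is where an AA performance hit would come from.

```c
/* Back-of-envelope Xenos eDRAM tiling arithmetic (a sketch; assumes the
 * commonly cited 4 bytes of color + 4 bytes of depth per sample). */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double edram_mb = 10.0;              /* Xenos daughter die   */
    const int width = 1280, height = 720;      /* 720p                 */
    const int bytes_per_sample = 4 + 4;        /* color + depth        */

    for (int aa = 1; aa <= 4; aa *= 2) {       /* 1x, 2x, 4x MSAA      */
        double mb = (double)width * height * aa * bytes_per_sample
                    / (1024.0 * 1024.0);
        int tiles = (int)ceil(mb / edram_mb);  /* eDRAM tiles needed   */
        printf("%dxAA @ 720p: %4.1f MB -> %d tile(s)\n", aa, mb, tiles);
    }
    return 0;
}
```

This prints 1, 2, and 3 tiles for 1x, 2x, and 4x AA; how big the hit actually is depends on how much geometry straddles tile boundaries, which is why the percentages above can only be guesses.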
 
SynapticSignal said:
End of life cycle (after 36 months):

X360
CPU:
10% of devs use 1 PPE; the other 90% use 2 or 3 PPEs
80% use all 6 hardware threads
GPU:
yes tiled rendering, yes render tag, yes procedural textures, yes memexport, yes core locking, yes tessellator

Global illumination is widespread in games

PS3
CPU:
10% of devs use 1 PPE; the other 90% use SPEs for physics AND vertex computing
GPU:
+ vertex computation from Cell
+ post effects from Cell

Situation:
CPU: PS3 better than X360
GPU: X360 better than PS3

Devs right now are doing post-processing on Cell already - Hideo, for example. I think you'll find that Cell and its SPEs will start being utilized sooner than everybody thinks. All first-party Sony titles WILL use the SPEs; after all, it was the in-house Sony titles on PS2 that really used the vector units. And by all means don't put the 360 GPU above the PS3's, as the 360 still takes a performance hit when using HDR+FSAA with tile rendering - in fact I believe the performance hit is greater than ATI and M$ said it is.
 
!eVo!-X Ant UK said:
Devs right now are doing post-processing on Cell already - Hideo, for example. I think you'll find that Cell and its SPEs will start being utilized sooner than everybody thinks. All first-party Sony titles WILL use the SPEs; after all, it was the in-house Sony titles on PS2 that really used the vector units. And by all means don't put the 360 GPU above the PS3's, as the 360 still takes a performance hit when using HDR+FSAA with tile rendering - in fact I believe the performance hit is greater than ATI and M$ said it is.

Excuse me for my unbiased bet.

You think ATI and Microsoft understated the hit? (And not "M$" - ******s write words like that, or "$ony". Please keep the replies calm and educated; we are all civil people. Don't turn the thread into a "Sony vs Microsoft" fan war.)

I know what Hideo says, and I've seen the video where Cell applies post-processing to the MGS4 demo.
But Hideo talks about post-processing as modifying the color temperature. I mean a different thing: devs will modify the frame buffer with advanced effects, and modifying the color is not that advanced.
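
To make the distinction concrete, here's a hypothetical sketch (my own illustration, not Kojima Productions' code) of the "simple" kind of post-process: a color-temperature shift is one multiply per channel over the frame buffer, which is why it isn't very advanced next to effects like depth-of-field or motion blur.

```c
/* Hypothetical sketch of a "simple" post-process: a color-temperature
 * shift is just one multiply per channel across the frame buffer. */
#include <stdint.h>

static uint8_t clamp255(int v) { return v > 255 ? 255 : (uint8_t)v; }

/* Warm or cool an RGBA8 frame buffer in place.
 * warmth > 1.0 boosts red and cuts blue; warmth < 1.0 does the opposite. */
void color_temperature(uint8_t *fb, int pixel_count, float warmth)
{
    for (int i = 0; i < pixel_count; i++, fb += 4) {
        fb[0] = clamp255((int)(fb[0] * warmth));  /* R */
        /* fb[1] (G) is left untouched */
        fb[2] = clamp255((int)(fb[2] / warmth));  /* B */
    }
}
```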

I think I gave an unbiased point of view, but if thinking that the X360 GPU can have some advantages hurts you, I'm sorry for you...


To all users:
Please don't turn this thread into a console war.
 
SynapticSignal said:
PS3
CPU:
70% of devs use 1 PPE; another 20% use some SPEs for physics, and 10% use SPEs for vertex computing
GPU:
90% of devs use RSX as a G70; 10% use Cell to help with vertex calc

And why exactly would a developer sitting in front of a PS3 devkit today do such a thing?
 
Hardknock said:
The interview comes from Edge magazine.

Q&A: Richard Huddy European developer relations manager, ATI

How do you think your work on the 360 measures up to PS3?

I take a fairly robust view on this. The Xbox 360 GPU is designed to be a console GPU - that's what we set out to produce when we started the collaboration with Microsoft; let's build a really powerful, really flexible kind of general purpose GPU which doesn't have performance cliffs where if you do certain things suddenly the performance crashes down by a factor of two or something like that; let's have things pretty predictable and easy to work with, and let's generate about the best performance that we can - so we went for things like the unified shaders and so on. The PS3 has been designed in a quite different way because of the way the process worked. We sat down with Microsoft and said: 'This is what we think we can build', and they said: 'Yes, but what about...?' And they started picking holes in our design, so we came up with a collaborative design. They didn't put a spec in front of us and say: 'How much for this?' That definitely wasn't the dialogue - in fact that would make it more of a monologue; it would be kind of bidding on prices and so on. Instead what we have is a very collaborative design.

With the PS3 my understanding of what happened is that they had three different internal hardware solutions - at one point, for example, as I understand it there was a proposal to use multiple Cell processors just to handle the graphics. And towards the end of the process, as the story goes, they took a look at the three internal tenders and decided that none of them would actually do; none of them would deliver the kind of performance and quality that games programmers could use and would make for a good cost-effective console, so they had to go out and shop around. And one of the places they shopped was Nvidia, and what Nvidia did was say: 'Well, you've got this relatively short timeframe, you've got roughly this kind of budget, I'll tell you what we'll do: we'll do you a good price on what is essentially the 7800GTX'. So that's a PC chip, and if you look at the architecture of the two consoles you can see we've done bizarre things that they haven't. We've built ten megabytes of dedicated eDRAM which knows how to antialias and so on, because that's a specific way of addressing a console's problem. It's bizarre in a PC sense but a special skill for a console builder. Whereas the PS3 has 256 meg of system memory and 256 meg of graphics memory that it communicates with through what is effectively a PCI Express bus. It uses GDDR3 fast memory, it's essentially a PC graphics design bolted on to a Cell processor and 256 meg of fast system memory...

You make it sound so unrefined!

[Laughs.] Well, yeah, but the tragedy is that it is unrefined. There's a lot of brute force in there - I'd be the last person to admit it, but the truth is that the 7800GTX is a pretty powerful piece of hardware, but it's not very elegant, it hasn't got the kind of: 'Well, how do we design this to be the best possible console we can build for this money?' Instead it's been put together at the end of quite a complicated process. We have two very different design processes. If Microsoft had come to us and said: 'All right, what are we going to do about this graphics chip, then? Let's sign the contract and let's go', but then we'd got two thirds through and they'd said: 'Look, you guys aren't going to deliver - now what are you going to do?' and then walked away from us, they would have ended up with a design very much like the PS3 in some essential characteristics - it would have had to use bought-in components. And our GPUs instead are custom-designed components, and that's one of the fundamental reasons why I think Xbox 360 technology is likely to outperform PlayStation 3 technology by a pretty healthy margin in the long run.

So how about this one: can those E3 PS3 demos be achieved on Xbox 360?

Well, why not take another combative line here? I think it's more likely that they can be realised on an Xbox 360 than they can be on a PS3. Those things are movies generated using whatever DCC software the houses had in mind. The Epic demo was running on a PC, and it was done using an early 7800 in SLI mode, so that was a high-end PC demo, but the movies were generated as movies and dressed up as: 'This is what you can expect from a PS3', but that's probably overstating what the PS3 can do a little bit. Indeed, it's well beyond what we expect the PS3 to be able to do. So I guess we'll just have to see what happens...

lol lol lol
 
DeanA said:
The part about lower clocks is right, but I'd love to know where the talk about SLI comes from. There's never been a CEB model with SLI. It's always been a single 4x PCIe slot across a bandwidth-restricting southbridge.

I think, not sure, that it came from PSM at least, and probably a few sites too - plus the incredible otherworldly look of the realtime demos, combined with the talk of the 7800s being scarce. The only demos one would consider likely to be 6800 are the I-8 one and Sega's phantom thing.

DeanA said:
That ain't true... the Alfred Molina face demo and the exploding gas station one were both definitely running in realtime, being controlled by Phil, as our group in SCEE wrote 'em.

Dean

Dunno if this is under NDA or not, but if not I'd be happy if you could comment on the following:
Those were with a 7800, right? GTX? Overclocked or stock speed? And a 2.4GHz Cell? How many SPEs? How did they stress the h/w - was it barely taxing it, or was it pushing some bounds (the Molina demo in particular)? As for the Molina demo, dunno if this is tech nonsense or not, but was it using the highest level of detail textures the h/w can handle, or can it go even further?

When I read this:
if you look at the architecture of the two consoles you can see we've done bizarre things that they haven't.

What comes to my mind (yeah, probably not the case, but still) is the following:
if you look at the architecture of the two consoles you can see we've done bizarre things that they haven't, like compromise aniso performance for no good reason :LOL:
edited
 
SynapticSignal said:
excuse me for my unbiased bet

So very unbiased, yes. Informed and informative are clearly different things in this case, however.

I think more people will simply have a problem with crystal ball gazing, wrapped in figures and percentages pulled from your arse - to give it an air of science or measurement - followed up with lashings of "i'm unbiased. sorry if it hurts you to think system x has advantages" (paraphrased), and topped quite spectacularly with "let's not turn this into a console war". Delightful, Synaptic.
 
if you look at the architecture of the two consoles you can see we've done bizarre things that they haven't.

I know he is talking about the GPU, but isn't it ironic that the same people who use Cell being a "bizarre" CPU as an argument against it then use the same "bizarre" argument as something good when it's their GPU?

Fredi
 
zidane1strife said:
Dunno if this is under NDA or not, but if not I'd be happy if you could comment on the following
I can't comment on the precise differences between the graphics card types that have been mentioned on these forums (A, B, and C) - due to a mix of NDAs, and the fact that I just can't remember... it was so long ago! Regarding performance, all I can probably say is that the performance (of A and B) was way less than the current Type C cards. The two demos I mention were running on the development hardware described by Kutaragi-san at E3 (CEB series), so that'd be 2.4GHz. Probably can't discuss the split between PPU and SPEs - sorry.

Worth pointing out that the kit is called 'Cell Evaluation Board' for a reason - it doesn't exactly represent the performance of a 'real' PS3 (eg: that southbridge really gets in the way, whereas RSX doesn't suffer from having to go through it)... especially the earlier units with DD1.0.

WRT the Molina demo... From memory, I don't think it taxed the machine too much, to be honest. It certainly runs a lot faster on current evaluation systems (3.2GHz DD2.0, Type C graphics). And no, it wasn't using the largest texture sizes the h/w can handle.
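
Some rough numbers behind Dean's southbridge point, as a sketch (my assumptions, not Dean's: the CEB's x4 slot is PCIe 1.x at 250MB/s per lane per direction, and the final PS3 matches Sony's published Cell-to-RSX FlexIO figures):

```c
/* Rough comparison of the CEB devkit's GPU link vs. the final PS3's
 * Cell<->RSX FlexIO link. Assumes PCIe 1.x lane rates for the CEB and
 * Sony's published FlexIO figures for the retail design. */
#include <stdio.h>

int main(void)
{
    double ceb_link  = 4 * 0.25;  /* x4 PCIe 1.x: ~1 GB/s each way */
    double flexio_wr = 20.0;      /* Cell -> RSX (GB/s), published */
    double flexio_rd = 15.0;      /* RSX -> Cell (GB/s), published */

    printf("CEB GPU link: ~%.1f GB/s per direction\n", ceb_link);
    printf("PS3 FlexIO  : ~%.0f GB/s write / ~%.0f GB/s read\n",
           flexio_wr, flexio_rd);
    return 0;
}
```

On those assumptions the retail machine has well over an order of magnitude more CPU-to-GPU bandwidth than the devkit, consistent with Dean's point that CEB numbers undersell a 'real' PS3.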
 
Titanio said:
So very unbiased, yes. Informed and informative are clearly different things in this case, however.

I think more people will simply have a problem with crystal ball gazing, wrapped in figures and percentages pulled from your arse - to give it an air of science or measurement - followed up with lashings of "i'm unbiased. sorry if it hurts you to think system x has advantages" (paraphrased), and topped quite spectacularly with "let's not turn this into a console war". Delightful, Synaptic.

Yes, a very unbiased bet.

And if you have a problem with the meaning of "bet", I suggest you use a dictionary.

And we don't need your trolling and ironic personal attacks. If you are able to talk about the topic, limit yourself to the topic.
The mods should see your reply, because this kind of useless reply turns the discussion into garbage.

If you want to discuss the advantages of Xenos, here we are, but if you want to troll, I'll start to ignore you, after reporting your replies.
 
Well... this isn't the best period for ATI right now, so they need to talk a lot of smack about the competition - which all began right before the R520. That didn't turn out so well for them. My guess is they don't know anything about the RSX.
 
SynapticSignal said:
Yes, a very unbiased bet.

And if you have a problem with the meaning of "bet", I suggest you use a dictionary.

You're not betting anything, that's the problem. There's nothing for you to lose :) You're simply saying "I bet", as an intro to speculated futures.

SynapticSignal said:
If you want to discuss the advantages of Xenos, here we are, but if you want to troll

Anything other than a discussion of Xenos advantages is trolling now?

If you want me to invest time in a discussion of your original post, I'll do so, but I don't think you'd be happy with it. In brief, even as a speculated future - and all the liberties that bestows on you to pretty much say anything you want - it's quite flawed.
 
You're all assuming that using SPUs is entirely up to individual developers like it mostly was with PS2 and the VUs.

Times have moved on. Even on PS2 middleware was available which utilised the extra processors. On PS3 this is likely to be a lot more widespread - and Sony themselves are supplying a lot more in the way of libraries and stuff themselves.

So just by plugging in some standard libraries we're getting some usage of SPUs.

Meanwhile, actually using the SPUs is an awful lot easier than it was to use, for example, VU0. They're self-contained, so we don't have to manage them externally; they have a proper set of capabilities; and they have a proper compiler, so we don't have to hand-roll assembly for them...

It might take a while for developers to all be making the best possible use of the whole system, but already most developers will be using SPUs, and I doubt we'll see any titles that don't make some attempt at usage. It's not going to be nearly as tough to utilise properly as the PS2 was.
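
A minimal sketch of what that looks like in practice, using the openly documented libspe2 API from IBM's Cell SDK rather than Sony's NDA'd PS3 toolchain (the 'physics_task' program name is hypothetical): the SPE side is ordinary C built with its own compiler, and the PPE side just loads and runs it.

```c
/* Minimal SPE dispatch via IBM's open libspe2 API (Cell SDK). Sony's
 * PS3 toolchain is NDA'd and differs; this only illustrates the
 * "proper compiler, no hand-rolled assembly" point above. */
#include <libspe2.h>
#include <stdio.h>

/* SPE-side program, compiled separately with spu-gcc and embedded in
 * the PPE binary; 'physics_task' is a hypothetical name. */
extern spe_program_handle_t physics_task;

int main(void)
{
    spe_context_ptr_t spe = spe_context_create(0, NULL);
    if (!spe) { perror("spe_context_create"); return 1; }

    spe_program_load(spe, &physics_task);

    /* Blocks until the SPE program stops; real code would run one
     * PPE thread per SPE context to keep several SPEs busy at once. */
    unsigned int entry = SPE_DEFAULT_ENTRY;
    spe_context_run(spe, &entry, 0, NULL, NULL, NULL);

    spe_context_destroy(spe);
    return 0;
}
```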
 
Titanio said:
You're not betting anything, that's the problem. There's nothing for you to lose :) You're simply saying "I bet", as an intro to speculated futures.

If it is so "simple", then why don't you understand?

Anything other than a discussion of Xenos advantages is trolling now?

Anything that personally attacks a user is trolling, which is exactly what you are doing.


If you want me to invest time in a discussion of your original post, I'll do so, but I don't think you'd be happy with it. In brief, even as a speculated future - and all the liberties that bestows on you to pretty much say anything you want - it's quite flawed.

OK, since you want to be polemical and off-topic and continue a useless discussion about me personally, you'll be ignored and reported.
Grow up, boy.
 
DeanA said:
I can't comment on the precise differences between the graphics card types that have been mentioned on these forums (A, B, and C) - due to a mix of NDAs, and the fact that I just can't remember... it was so long ago! Regarding performance, all I can probably say is that the performance (of A and B) was way less than the current Type C cards. The two demos I mention were running on the development hardware described by Kutaragi-san at E3 (CEB series), so that'd be 2.4GHz. Probably can't discuss the split between PPU and SPEs - sorry.

So can you tell us all more about the Getaway demo? Sony have gone on record as saying it was software rendered on a kit - how was that done? Or indeed, why?

I'm just curious as it always seemed a bit of a strange choice/claim.
 
DeanA said:
WRT the Molina demo... From memory, I don't think it taxed the machine too much, to be honest. It certainly runs a lot faster on current evaluation systems (3.2GHz DD2.0, Type C graphics). And no, it wasn't using the largest texture sizes the h/w can handle.

Can you mention any specifics about what Cell was doing in this demo? Phil Harrison mentioned rather generally that it was helping with some heavy-duty number crunching relating to lighting, taking some load off the GPU. Can you shed any more light, within the bounds of your NDA?
 
MrWibble said:
So can you tell us all more about the Getaway demo? Sony have gone on record as saying it was software rendered on a kit - how was that done? Or indeed, why?
Sorry, but no... I wasn't involved in that demo, so it's not really my place to speculate about its development.
 