What's the current status of "real-time Pixar graphics"?

That's a very good question.

I'd say there was no doubt that the Xbox is superior to the PS2 in visuals, the main reasons being
- the difficulty of implementing per-pixel mipmapping, and hence trilinear filtering, on PS2 (see the sketch after this list)
- the lack of anisotropic filtering
- the lack of compressed textures
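
To illustrate what per-pixel mipmapping actually asks of the hardware, here's a minimal textbook-style sketch of per-pixel LOD selection plus the trilinear blend, assuming the rasteriser supplies texture-coordinate derivatives per pixel. `bilinear()` is a hypothetical stand-in for the usual 2x2 filtered fetch, and this is illustrative only, not any console's actual pipeline; as I understand it the GS approximates LOD from interpolated Q with per-primitive constants rather than from true derivatives, which is why matching this on PS2 is awkward.

```c
/* Textbook-style sketch of per-pixel mip selection and trilinear
 * filtering - illustrative only, not any console's actual pipeline.
 * Assumes the rasteriser supplies texture-coordinate derivatives
 * (du/dx, dv/dx, du/dy, dv/dy) per pixel. */
#include <math.h>

/* Pixel footprint in texel space: take the longer of the two axes. */
float mip_lod(float dudx, float dvdx, float dudy, float dvdy)
{
    float rho_x = sqrtf(dudx * dudx + dvdx * dvdx);
    float rho_y = sqrtf(dudy * dudy + dvdy * dvdy);
    float rho   = rho_x > rho_y ? rho_x : rho_y;
    return log2f(rho > 1.0f ? rho : 1.0f);  /* clamp to LOD 0 when magnifying */
}

/* Trilinear: bilinear taps from the two nearest mip levels, blended by
 * the fractional LOD. bilinear() is a hypothetical 2x2 filtered fetch. */
float trilinear(float u, float v, float lod,
                float (*bilinear)(int level, float u, float v))
{
    int   level = (int)lod;
    float frac  = lod - (float)level;
    return (1.0f - frac) * bilinear(level, u, v)
         + frac * bilinear(level + 1, u, v);
}
```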

However, the console game market seems to be reasonably tolerant of this because the comparison is with the previous generation (PS1's texturing quality and polygon snapping sucked so bad it was unplayable for me) rather than with the PC market.

If the PS2 had a technology 'lead' over the (PC / Xbox) architectures here labelled 'brute force', it was a six-to-twelve-month lead at most. PS3 may do better; my personal opinion (but of course I would say this because I work for ATI) is that PS3 will only be on a par with what you could do with a £600 PC on the day it is released. (To defend that choice of price: inevitably, the PC is more expensive because of the lack of format royalties.)

The PS3 will probably pull ahead in actual realised quality for a period, because PC games don't target the bleeding-edge market.
 
Going back to the topic at hand, what kind of effects and performance is a £600 PC or PS3 (and GC2, XBOX2) going to be able to pull off come 2005/6?

We still won't be at Toy Story level, but on a TV screen at 720×576 resolution and 60fps I think even a Radeon 9800 and a midrange P4 or Athlon XP could pull off an emulation of Toy Story that would fool most people.
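
A quick back-of-the-envelope on that budget; the fill rate for an R350-class part and the overdraw figure below are my assumptions, purely for illustration, not benchmarks:

```c
/* Back-of-the-envelope pixel budget for '720x576 at 60fps'. The 3
 * Gtexel/s fill rate and the 4x overdraw are assumed figures. */
#include <stdio.h>

int main(void)
{
    const double pixels_per_frame = 720.0 * 576.0;           /* ~415K  */
    const double pixels_per_sec   = pixels_per_frame * 60.0; /* ~24.9M */
    const double fill_rate        = 3.0e9;  /* texels/s, assumed       */
    const double overdraw         = 4.0;    /* assumed                 */

    /* How many filtered texture samples per visible pixel can we spend
     * and still hold 60fps? */
    double samples_per_pixel = fill_rate / (pixels_per_sec * overdraw);

    printf("%.1fM pixels/s -> ~%.0f samples per pixel of headroom\n",
           pixels_per_sec / 1e6, samples_per_pixel);
    return 0;
}
```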
 
Dio said:
what you could do with a £600 PC on the day it is released. (To defend that choice of price: inevitably, the PC is more expensive because of the lack of format royalties.)

Wow!!! I'd hold off on upgrading my rig... I mean, supah PCs, and cheap ones, are arriving soon out of nowhere...

100s of GB/s of bandwidth... approx. 2B+ transistors for VS and PS hardware... at multi-GHz speed... featuring the world's smallest and highest-perf transistors... IIRC their 65nm transistor has better perf. than the 45nm one Intel had in the labs... nigh-45nm hardware (that is supah large and can only break even, be financially viable in the long run, at 45nm)

Hardware that will likely be sold at $400 (the PS2's original price in Japan was equivalent to $380), with $200-300 in losses per unit...
 
zidane... everything you stated there is speculation, however likely it is. You also have to realise that some of the specifications you go on about may be happening on the PC side as well, or that the PC side will still be able to keep up in terms of visuals when comparing the PS3 to the state-of-the-art PC game.

One reason is that it will take a while for developers to get the hang of the PS3 and learn its secrets, whilst on the PC it has always been about incremental updates. By the time the PS3 begins to become familiar to developers, the PC architecture may have caught up with or gone further past the visual prowess of the PS3.

I am speculating too - I didn't want to turn this thread into the inevitable 'PS3 will kick the PC's ass'.
 
Tahir said:
We still won't be at Toy Story level, but on a TV screen at 720×576 resolution and 60fps I think even a Radeon 9800 and a midrange P4 or Athlon XP could pull off an emulation of Toy Story that would fool most people.
I'd agree, although it would undoubtedly be a 'processed' scene rather than 'the original data'. Any ray-tracing effects in particular aren't really feasible. (Is there a demo coder to prove me wrong?)
 
I'm not so sure about technical parity between PCs and PS3 at its introduction, for the simple fact that at first a PS3 will cost several hundred dollars more to produce than a high-end PC card (if it's not a bit better, then what a waste!). OTOH I think by 2005/2006 real-time 3D will be at a point where even huge differences in hardware will equate to a very minimal perceived difference (diminishing returns). By the time "Playstation 4" is here, even low-end hardware should be able to do what is basically photorealistic graphics in real time. Oh, and we'll have flying cars ;)
 
Josiah said:
I'm not so sure about technical parity between PCs and PS3 at its introduction, for the simple fact that at first a PS3 will cost several hundred dollars more to produce than a high-end PC card
That's why I said a £600 PC. I'm assuming the PS3 won't ship for much more than £300.

Even in the early days, I can't see Sony burning more than £150 per console. (Note also that retail margins on PC sales are generally smaller than RRP'ed electronics retail margins).
 
Tahir said:
Going back to the topic at hand, what kind of effects and performance is a £600 PC or PS3 (and GC2, XBOX2) going to be able to pull off come 2005/6?

We still won't be at Toy Story level, but on a TV screen at 720×576 resolution and 60fps I think even a Radeon 9800 and a midrange P4 or Athlon XP could pull off an emulation of Toy Story that would fool most people.

No, not an R350 / Radeon 9800, but perhaps an R5xx would come close to pulling off a version of Toy Story that could fool a lot of people.

(thus Xbox 2??)
 
One reason is that it will take a while for developers to get the hang of the PS3 and learn its secrets, whilst on the PC it has always been about incremental updates. By the time the PS3 begins to become familiar to developers, the PC architecture may have caught up with or gone further past the visual prowess of the PS3.

I am speculating too - I didn't want to turn this thread into the inevitable 'PS3 will kick the PC's ass'.

Well... I hope so... but still, it'd be difficult. I mean, are the PC CPUs gonna contribute anything to the graphics, or are they just gonna sit idle? I mean, even upcoming tech doesn't seem to provide decent bandwidth to connect it all...

and that's not even talking about dev. costs for games...
 
Does the CPU run at idle when you play a game like Halo for example, with your PC now?

No. It is a different design strategy and nothing else. Sony's strategy may well be better suited to visual performance and quality. We shall wait and see; the PS3 is only 3 years and 2 months away at the most. ;)
 
Dio said:
If the PS2 had a technology 'lead' over the (PC / Xbox) architectures here labelled 'brute force', it was a six-to-twelve-month lead at most. PS3 may do better; my personal opinion (but of course I would say this because I work for ATI) is that PS3 will only be on a par with what you could do with a £600 PC on the day it is released.

I think the question you need to ask yourself before making such a projection is along the lines of, "If Sony had launched ~1.8 years later, concurrently with the Xbox, just how big a performance delta would there be?"

You know, there's an awful lot they could have done going from 250 to 150nm. Just looking at the sheer geometric scaling (which is a naive measure), what they could have done with the EE and GS designs is impressive... and likely to be dwarfed by the current STI ones. IMHO, Microsoft is in for a rude awakening this time around if they shoot for temporal parity.
 
Vince said:
I think the question you need to ask yourself before making such a projection is along the lines of, "If Sony had launched ~1.8 years later, concurrently with the Xbox, just how big a performance delta would there be?"
As I said, 6-12 months, but that would also have been a very different strategy for Sony to operate, because it would have essentially cut one and a half chip shrinks out of the price ramp. It's all very well to say 'Look what they could have done with two 250mm² chips on 150nm', but then they would not have been able to keep the PS2 consistently just under the price of the Xbox without making a lot less profit than they find desirable, or even none at all.

I think the economics of the console market (which are pretty consistent - launch at £250-400 depending on target market segment, ramp down to £50-100 over four years) preclude any console launched around the same time having more than a slight technology lead, simply because you can't afford to put in more transistors.
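
A rough sketch of why: the same design shrinks roughly quadratically with the process node, which is what drives the four-year price ramp. The 240mm² launch die below is an assumed figure, just to show the trend; real shrinks never scale this cleanly:

```c
/* Ideal-geometry die shrink across the process nodes discussed in this
 * thread. The 240mm^2 launch die is an assumed figure. */
#include <stdio.h>

int main(void)
{
    const double nodes[] = { 250.0, 180.0, 150.0, 90.0, 65.0 }; /* nm */
    const double launch_area = 240.0;  /* mm^2, assumed launch die    */

    for (int i = 0; i < 5; i++) {
        /* Area scales with the square of the feature size, so cost per
         * die falls roughly in step (yield and wafer cost aside). */
        double scale = nodes[i] / nodes[0];
        printf("%3.0fnm: ~%5.1f mm^2 for the same design\n",
               nodes[i], launch_area * scale * scale);
    }
    return 0;
}
```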

My original argument is this: Sony said the EE + GS were revolutionary, enabling cinematic rendering, and the press lapped it up, especially with all the 'PS2 can simulate nuclear detonations' stuff. From what I know of the EE and the GS, they are decent enough, but nothing 'special', and they are hell on earth to get working well. I shouldn't even complain about the latter - it means us asm hackers are still in demand! But in particular, I find PS2 image quality poor compared to Xbox (and Gamecube), in the same way that PS1 image quality was poor compared to N64 - certainly a long way from 'cinematic rendering'. But I never heard that criticism then, and rarely hear it even now - just more PR dross about how revolutionary the EE is.

Now I hear the same claims for Cell and, well, a CPU is a CPU (or maybe 256 CPUs in this case, I don't know). It certainly sounds harder to program; maybe it is more powerful.

Rule #1: don't believe the hype.

(Actually, that is rule #3. Rule #2 is 'never get into a land war in Asia' and Rule #1 is 'never take the short cut when the long cut will get you there in time')
 
I find PS2 image quality poor compared to Xbox (and Gamecube)

In many games, but there are plenty of exceptions...

I know the PS3 might not have 1 TFLOPS... but it sounds promising. I mean, two full-blown chips contributing to graphics calcs, chips that will likely fill the die to nigh the financially viable top at 45nm (65nm giants sold at a loss). On top of that, everything will likely be connected with supah bandwidth and memory, and you also have some good flexibility... if done right, it could be more than quite decent.

Many other CPUs are taking different routes, and the bandwidth just doesn't seem to be there... thus, the way I see it, we're most likely left with just one single chip (unless things change) to do most of the graphics for us in the PC arena.

ed
 
Everything 'sounds' promising when promoted by the right PR people.

Well, the technical side of the PS2's PR was quite accurate, IIRC... the tech demos and figures given were quite honest... true, their interpretation was blown a little out of proportion, but still...
 
Dio said:
As I said, 6-12 months, but that would also have been a very different strategy for Sony to operate, because it would have essentially cut one and a half chip shrinks out of the price ramp. It's all very well to say 'Look what they could have done with two 250mm² chips on 150nm', but then they would not have been able to keep the PS2 consistently just under the price of the Xbox without making a lot less profit than they find desirable, or even none at all.

And, with all due respect, you were also wrong in what you said. Or, more precisely, you aren't wrong - just copping out of the true issue at hand.

If you intend to look at what STI is doing with Cell as per its relation to the competition (which you were - I can quote you if you'd like), then you first need to see that this generation will have release dates that are very near each other - 2H2005 with a range of +/-1Q at a high probability. Thus, we can conclude that there will be no lithography/process advantage for Xbox as we previously saw.

Continuing with this logic, it's impossible not to ask the question you stated: "Look what they could have done with two 250mm² chips on 150nm". Which I think demonstrates my point very well. Hopefully Microsoft will tap an entity like Intel to do the back-end design and manufacturing of the X2GPU; otherwise STI will have the superior process technology even if they're on the same 65nm node.

Dio said:
I think the economics of the console market (which are pretty consistent - launch at £250-400 depending on target market segment, ramp down to £50-100 over four years) preclude any console launched around the same time having more than a slight technology lead, simply because you can't afford to put in more transistors

I agree on the economic aspects, which is why process technology is paramount in feasibly yielding a superior part. Sony is inherently advantaged due to their in-house (or STI, by extension) development pipeline and rapid shrinks. Unlike, say, ATI, they can and have fielded ICs that approached 300mm² in the mass market for this very reason. From what I've heard, the 14.8mm × 14.8mm (~219mm²) R300 is the largest you've ever done, by a bit, and that particular core didn't ship in 10M+ devices either.

My thinking (which I shouldn't stake a position on, but oh well) is that the 'Graphics ICs' will be relatively close when it comes to shading and such done explicitly on the graphics IC, but the overall 'system' will be heavily in Sony's favor.

My original argument is this: Sony said the EE + GS were revolutionary, enabling cinematic rendering, and the press lapped it up, especially with all the 'PS2 can simulate nuclear detonations' stuff.

I think you'll also find that Sony was not the source of much of this "Hype" - yet people keep stating it was. In fact, SCE was pretty forthcoming in releasing specifications on the realistic performance of the Graphics Synthesizer:

http://ntsrv2000.educ.ualberta.ca/nethowto/examples/m_ho/ps2graphic.html

Even here you can see that it's the dude who's talking "hype" in regard to the specs Sony released.


Now I hear the same claims for Cell and, well, a CPU is a CPU (or maybe 256 CPUs in this case, I don't know). It certainly sounds harder to program; maybe it is more powerful.

Um, you are just a bit misinformed. Perhaps you should visit the console forum some time, or alternatively ask in a PM.

(Actually, that is rule #3. Rule #2 is 'never get into a land war in Asia' and Rule #1 is 'never take the short cut when the long cut will get you there in time')

HA! :LOL:
 
Vince said:
Or, more precisely, you aren't wrong - just copping out of the true issue at hand.

If you intend to look at what STI is doing with Cell as per its relation to the competition (which you were - I can quote you if you'd like), then you first need to see that this generation will have release dates that are very near each other - 2H2005 with a range of +/-1Q at a high probability.
I don't think I was ever talking about anything other than PS3 vs 'PC type' architecture, which includes the current Xbox - I've never made any statements about any Xbox successor! I made a comment about the price of the PS2 vs. the price of the Xbox, but this was looking backwards, not forwards.

I think you'll also find that Sony was not the source of much of this "Hype" - yet people keep stating it was.
Yes and no. Emotion Synthesis - hence Emotion Engine - was a big part of Sony's PR. http://www.geocities.com/TimesSquare/Arcade/5630/h_play2a.html This was pushed to the press - who then, as we've both noted, made up even more hyperbole.

Now I hear the same claims for Cell and, well, a CPU is a CPU (or maybe 256 CPUs in this case, I don't know). It certainly sounds harder to program; maybe it is more powerful.
Um, you are just a bit misinformed. Perhaps you should visit the console forum some time, or alternatively ask in a PM.
I know that it is a multi-'cored' CPU with a variable number of cores - Ken Kutaragi has said 1000, and inevitably anything Kutaragi-san says is going to be associated with Playstation. I know that the Cell designers are claiming, as with the Emotion Engine, similar boosts to interactivity, simulation, etc., and, again much as with the Emotion Engine, associating a lot of 'biological'-type words. And I know that traditionally, multiprocessing architectures are harder to program efficiently than single-processor architectures.
http://news.com.com/2100-1001-948493.html
http://www.eetimes.com/story/OEG20010313S0113

I don't know for sure it will end up in the PS3, of course, but I don't think I ever said that I did know. What was I wrong about that caused you to say 'misinformed'? I made a comment that included a few of these facts in inconclusive statements, which seem to be reasonably justified by the quick Google search...
 
Dio said:
I don't think I was ever talking about anything other than PS3 vs 'PC type' architecture, which includes the current Xbox

Ok, neat. I was commenting on this though:

Dio said:
It's all very well to say 'Look what they could have done with two 250mm² chips on 150nm', but then they would not have been able to keep the PS2 consistently just under the price of the Xbox without making a lot less profit than they find desirable, or even none at all

And what you're saying isn't true looking forward, and even in hindsight is wrong. The truth is, it's all relative - so if SCE had launched almost 2 years later, with the Xbox in 2001, the PS3 would launch in 2H2007 and you'd see the "enhanced" EE+GS go down to 65nm in 2005 and 45nm in 2007, and integrate then. You're just effectively sliding the process down a notch, which intrinsically relates to the time period. So the economics remain viable for this generation.

Yes and no. Emotion Synthesis - hence Emotion Engine - was a big part of Sony's PR.

True. But I always put this in the same vein as SmartShader, Pixel Tapestry, the whole CineFX/Cinematic Computing BS, or even the terms GPU and VPU.

The insane hype was entirely community-generated, if you look at the comments, and you can hardly blame Sony for that - much in the same vein as you can't blame ATI for much of the community support and cheering that's come from their recent ass-kicking.

I know that it is a multi-'cored' CPU with a variable number of cores - Ken Kutaragi has said 1000, and inevitably anything Kutaragi-san says is going to be associated with Playstation.

I can assure you Kutaragi never said this. He said 1000 GFLOPS performance.

Right now, the Cell-derived MPU looks to be 4 cannibalized "traditional superscalar cores", with a DMAC and 8 APUs (each containing 4 FPUs & 4 FXUs + 128K SRAM; forget the register sizes) under each. There is then 32-64MB of eDRAM crossbarred to those DMACs. So it's a 32-"way" architecture at heart (much like a steroid-using big brother to the EE) built on a 65nm sSOI/low-k process. The Cell-derived graphics IC has 4 cores (PEs) that have half the number of APUs and 4 Pixel Engines (one per PE), each with an image cache and eDRAM shared as in the MPU. It would appear that these are just for the "back-end" dumb ops like sampling and other highly pipelined/iterative tasks, with the "front-end" being the FP-heavy MPU and GPU. I'd hazard something similar to an advanced Unified Shader Architecture with more basic pixel pipes behind it. Oh, and with Yellowstone/XDR as the main RAM.

This might not be the exact final embodiment, but it should be close in overall theory. There is actually quite a lot of information around, coming from members of SCE and IBM's contact team and from patents; there are some good threads hidden somewhere in the console forum. You should drop by sometime, at least early in the select few technical threads before they go to shit. ;)

What was I wrong about that caused you to say 'misinformed'? I made a comment that included a few of these facts in inconclusive statements, which seem to be reasonably justified by the quick Google search...

Um < looks up > Anyways, I'm running to dinner soon and this discussion isn't for this forum anyways, so... I'll talk to you some other time. Thanks for the chat, good luck at ATI.
 
I can assure you Kutaragi never said this. He said 1000 GFLOPS performance.

Right now, the Cell-derived MPU looks to be 4 cannibalized "traditional superscalar cores", with a DMAC and 8 APUs (each containing 4 FPUs & 4 FXUs + 128K SRAM; forget the register sizes) under each. There is then 32-64MB of eDRAM crossbarred to those DMACs. So it's a 32-"way" architecture at heart (much like a steroid-using big brother to the EE) built on a 65nm sSOI/low-k process. The Cell-derived graphics IC has 4 cores (PEs) that have half the number of APUs and 4 Pixel Engines (one per PE), each with an image cache and eDRAM shared as in the MPU.

[Image: ps3.jpg - block diagram of the Broadband Engine (left) and GPU (right)]


A better picture, in case anyone wants to see it.

8 APUs per PE; 4 PEs make up the "Broadband Engine". The BE is the chip on the left.

The GPU on the right is basically a modified BE, with half the number of APUs. Said APUs on the GPU could crunch shader programs rather well, I think.

External memory will be the new Rambus XDR, with the chips connected by new Rambus Redwood.

Of course this is by no means final and can change at any time, although the basic "gist" is there.
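
To sanity-check those counts against the 1000 GFLOPS figure mentioned earlier, here's a quick tally. The unit counts come from the diagram above (patent-derived rumors, not official specs); the 4GHz clock and one op per FPU per cycle are purely my assumptions:

```c
/* Tallying the rumored layout in the diagram above (patent-derived
 * numbers from this thread, not official specs). */
#include <stdio.h>

int main(void)
{
    const int pes_per_be   = 4;   /* PEs in the Broadband Engine           */
    const int apus_per_pe  = 8;   /* APUs under each PE                    */
    const int fpus_per_apu = 4;   /* each APU: 4 FPUs + 4 FXUs + 128K SRAM */
    const double ghz       = 4.0; /* assumed clock - pure speculation      */

    int apus = pes_per_be * apus_per_pe;   /* the '32-way' figure  */
    int fpus = apus * fpus_per_apu;        /* 128 FP units         */

    double gflops      = fpus * ghz;       /* 1 op/FPU/cycle       */
    double gflops_madd = gflops * 2.0;     /* multiply-add counts as 2 */

    printf("%d APUs, %d FPUs: %.0f GFLOPS, or %.0f with madd - "
           "the ~1 TFLOPS ballpark\n", apus, fpus, gflops, gflops_madd);
    return 0;
}
```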
 