PS3 - who's really benefitting?

dr3amz

mapping.jpg


Virtual Displacement Mapping

Also known as “parallax mapping”, this effect is actually an enhanced version of normal mapping. It can produce a stunningly real three-dimensional appearance on flat surfaces, without the intensive tessellation computations and extreme polygon counts required for true displacement mapping. In addition to a base texture, it requires a corresponding normal map and height map to work. With SMARTSHADER HD, performance of this technique is similar to that of standard normal mapping.
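The core trick is small enough to sketch outside of a shader: shift the texture lookup along the tangent-space view direction, by an amount proportional to the sampled height. A minimal Python sketch of that offset math (the scale/bias constants are illustrative, not from any particular game or demo):

```python
def parallax_offset(u, v, height, view_ts, scale=0.04, bias=-0.02):
    """Offset-limited parallax mapping: nudge the texture lookup (u, v)
    along the tangent-space view direction, proportional to the height
    sampled from the height map. view_ts = (x, y, z), z away from surface."""
    vx, vy, _vz = view_ts
    h = height * scale + bias  # scale-and-bias the [0, 1] height sample
    # Offset-limited variant: no divide by vz, which tames the "swimming"
    # artifacts at grazing angles at the cost of a shallower effect.
    return (u + h * vx, v + h * vy)
```

In a real implementation this runs per pixel in the fragment shader, and the offset UVs are then used to sample the base texture and normal map.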

For an even more realistic effect, Virtual Displacement Mapping can be combined with horizon mapping to capture self-shadowing effects. It also works nicely with 3Dc, which can compress the normal maps and allow more detailed ones to be used.
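3Dc helps here because a tangent-space normal is unit length, so only two of its three components need to be stored; the shader rebuilds the third. A sketch of that reconstruction step in Python (the real work happens per pixel in the shader, of course):

```python
import math

def reconstruct_normal_z(x, y):
    """3Dc (ATI's two-channel normal-map compression) keeps only the x and
    y components of a unit-length tangent-space normal. The shader derives
    z from the unit-length constraint: z = sqrt(1 - x^2 - y^2)."""
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))
```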

I'm not techy enough to go deeper into this - I just say what I see :D

Oh, and not forgetting subsurface scattering. The point is - who really sees PS3.0 as a major benefit right now? :?:

marble.jpg


I've read a lot about PS3.0, and pretty much everyone knows it's not going to be an issue this generation anyway - and if developers use tech such as 3Dc (which a lot have stated they will), then more for ATI :D
 
The title of your post is a little apologist/fanboyish in nature, because of course we all know ATi "needs" SM3.0 to satisfy the geeky nature in all 3D graphics aficionados. :)

Of course, as there are no games using 3.0 shaders yet (DX9.0c isn't even released to expose that functionality), ATi doesn't have to worry about anything. And even when SM3.0 games are available, there won't be an earth-shattering difference in visual quality unless the developer deliberately decides to make it that way, since piling on too heavily with the shaders will reduce performance to a virtual standstill.

Still, HAVING IT is always better than NOT HAVING IT, and there's not really any way around that argument, which makes your position somewhat flawed IMO. :)

Is it a big flaw for ATi? No, not really, especially as it seems they have the performance edge right now. But from a technical point of view, it is still there.
 
I think history justifies Ati's decision. With Tomb Raider the only exception, it took a long while for game devs to adopt SM2.0.
Besides, SM3.0 demands 32-bit shaders (correct me if I'm wrong), and that seems a waste of transistors to me. It's just for visuals after all, not mathematical research.
It would make more sense if MS first set a standard for what the different AF and AA modes are supposed to look like exactly.
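For a sense of what the 32-bit vs 24-bit argument is about: FP32 stores 23 mantissa bits against FP24's 16, so FP24 starts rounding away detail sooner. A rough Python illustration, truncating a double's mantissa to a chosen width (a simplification that ignores rounding modes and the formats' different exponent ranges):

```python
import struct

def quantize_mantissa(x, mantissa_bits):
    """Crude illustration of shader precision: keep only the top
    `mantissa_bits` of a Python float's (FP64) 52-bit mantissa.
    FP32 keeps 23 stored mantissa bits; ATI's FP24 keeps 16."""
    if x == 0.0:
        return 0.0
    bits = struct.unpack('>Q', struct.pack('>d', x))[0]
    drop = 52 - mantissa_bits
    bits &= ~((1 << drop) - 1)  # truncate the low mantissa bits
    return struct.unpack('>d', struct.pack('>Q', bits))[0]
```

For example, a value like 1 + 2^-20 survives at 23 mantissa bits but collapses to 1.0 at 16 - which is the kind of fine shading detail the precision debate was about.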
 
Sandwich said:
I think history justifies Ati's decision.

I just wonder how games would evolve if one company didn't go for more features before the other? Do you guys think that SM3.0 games would begin appearing just as fast if Nvidia had waited another year or so to bring out a SM3.0 part? They would definitely have had a smaller, cheaper, cooler-running chip that could possibly clock faster and give ATI a closer fight. So is their decision to go SM3.0 a bad one, and if so, bad for who?

I haven't really seen anyone address this and was wondering what people's opinions are on the issue. And please, comments like "but where is SM3.0 today" are not welcome :D
 
trinibwoy said:
Sandwich said:
I think history justifies Ati's decision.

I just wonder how games would evolve if one company didn't go for more features before the other? Do you guys think that SM3.0 games would begin appearing just as fast if Nvidia had waited another year or so to bring out a SM3.0 part? They would definitely have had a smaller, cheaper, cooler-running chip that could possibly clock faster and give ATI a closer fight. So is their decision to go SM3.0 a bad one, and if so, bad for who?

I haven't really seen anyone address this and was wondering what people's opinions are on the issue. And please, comments like "but where is SM3.0 today" are not welcome :D

Well, wasn't it ATI that pipped NV a while back with the inclusion of PS1.4? I remember the extra demo that was added to 3DMark2001 for it.

PS3.0 will obviously just develop as part of the natural progression, but I agree that ATI made the right decision not to include it this time. Their Ruby tech demo (sorry to go on) really backs this up. Compared to Nalu or Timburry, Sailboat etc. - what do you see in those NV demos that is any better?

It's gonna take a long time for games to use PS3.0 properly - we've only just had games like Far Cry etc. which are finally starting to hit previous-gen cards hard.

But this is two years on from the release of the 9700 Pro. And it's already been shown that what you can do in PS3.0 you can do with 2.0 - by the time games etc. start using it, ATI will surely have it, if not PS4.0 etc.

And isn't 3Dc more beneficial at this point in time? :?:
 
trinibwoy said:
Sandwich said:
I think history justifies Ati's decision.

I just wonder how games would evolve if one company didn't go for more features before the other? Do you guys think that SM3.0 games would begin appearing just as fast if Nvidia had waited another year or so to bring out a SM3.0 part? They would definitely have had a smaller, cheaper, cooler-running chip that could possibly clock faster and give ATI a closer fight. So is their decision to go SM3.0 a bad one, and if so, bad for who?

I haven't really seen anyone address this and was wondering what people's opinions are on the issue. And please, comments like "but where is SM3.0 today" are not welcome :D

I think we should appreciate that Nvidia made the effort at least. This is good for game devs and good for all PC gamers in the long run.
ATi took the trouble when they were the first to release a PS1.4 card and later a PS2.0 card... someone always has to be first.
 
trinibwoy said:
Sandwich said:
I think history justifies Ati's decision.

I just wonder how games would evolve if one company didn't go for more features before the other? Do you guys think that SM3.0 games would begin appearing just as fast if Nvidia had waited another year or so to bring out a SM3.0 part? They would definitely have had a smaller, cheaper, cooler-running chip that could possibly clock faster and give ATI a closer fight. So is their decision to go SM3.0 a bad one, and if so, bad for who?

I haven't really seen anyone address this and was wondering what people's opinions are on the issue. And please, comments like "but where is SM3.0 today" are not welcome :D

Well, wasn't it ATI that pipped NV a while back with the inclusion of PS1.4? I remember the extra demo that was added to 3DMark2001 for it.

PS3.0 will obviously just develop as part of the natural progression, but I agree that ATI made the right decision not to include it this time. Their Ruby tech demo (sorry to go on) really backs this up. Compared to Nalu or Timburry, Sailboat etc. - what do you see in those NV demos that is any better?

It's gonna take a long time for games to use PS3.0 properly - we've only just had games like Far Cry etc. which are finally starting to hit previous-gen cards hard.

But this is two years on from the release of the 9700 Pro. And it's already been shown that what you can do in PS3.0 you can do with 2.0 - by the time games etc. start using it, ATI will surely have it, if not PS4.0 etc.

And isn't 3Dc more beneficial at this point in time? You show me some games that look as good as Ruby and I'll eat my words - it's gonna be a long time before you see graphics of that quality in-game, and this is without PS3.0.

So to me it really is no loss - and those who really try to brag about PS3.0 are either just fanboys or just clinging to the one thing the 6800 has that the X800 doesn't.

Trying to remain impartial here, but from the tech demos/specs released, I see ATI as offering more in the way of image quality/effects than NV are (or at least displaying more). ;)
 
dr3amz said:
It's gonna take a long time for games to use PS3.0 properly - we've only just had games like Far Cry etc. which are finally starting to hit previous-gen cards hard.

But this is two years on from the release of the 9700 Pro. And it's already been shown that what you can do in PS3.0 you can do with 2.0 - by the time games etc. start using it, ATI will surely have it, if not PS4.0 etc.

And isn't 3Dc more beneficial at this point in time? :?:

LoL. dr3amz, I knew you would come back with an ATI-centric twist :devilish: You're getting predictable, man :LOL: :LOL:

Anyway, we all know that ATI will be ready when SM3.0 goes mainstream, but my question was more: would it have taken longer for SM3.0 to come to fruition if it wasn't for these first steps?

It's all good though...a market flooded with video cards that my current rig can't max out. My monitor can't even do 1600x1200 :(

Can't wait to upgrade in the fall....I could do it now but I'm gonna wait till a couple months after Socket 939 goes live :)
 
dr3amz said:
Trying to remain impartial here, but from the tech demos/specs released, I see ATI as offering more in the way of image quality/effects than NV are (or at least displaying more). ;)

I doubt that you're trying very hard.
 
ATI has higher image quality and speed, and while we don't have PS3.0, I am confident that we'll look back on this a year from now with the R420 as the winner, even if only by a slight margin. Sure, in the long run ATI may suffer a bit for not having it... but I doubt any games will go PS3.0-only and not have a fallback in PS2.0.
 
Bjorn said:
dr3amz said:
Trying to remain impartial here, but from the tech demos/specs released, I see ATI as offering more in the way of image quality/effects than NV are (or at least displaying more). ;)

I doubt that you're trying very hard.

Well, it's hard to be impartial when a company like Nvidia is involved - you don't get much slimier or dirtier.

Yes, the 6800 seems a great card too, but again, ATI wins it. I'm just expecting to see the 6800 suffer in DX9 apps as the NV30 did, and it's already started off that way. :?

I used to be a great fan of Nvidia until the Radeon 8500 appeared (I swapped my GeForce3 for one), and regardless of what is said about the GeForce4, it really did nothing in terms of progression other than max the clocks out. The tech demos, for example, were bitterly disappointing - Wolfman, lol...

You've gotta give credit to ATI for holding the top spot for 2-3 years running now. Nvidia have been on the back foot since the 9700 Pro, and since then they've done nothing but lie and cheat their way through the years.

I just don't have any faith in them. How many driver revisions will they ship with their cards to try and look better in one test than another, and so on?
 
dr3amz said:
Yes, the 6800 seems a great card too, but again, ATI wins it. I'm just expecting to see the 6800 suffer in DX9 apps as the NV30 did, and it's already started off that way. :?

Uhhh dude... what are you talking about? None of these new cards are 'suffering' in anything :rolleyes:

I used to be a great fan of Nvidia until the Radeon 8500 appeared (I swapped my GeForce3 for one), and regardless of what is said about the GeForce4, it really did nothing in terms of progression other than max the clocks out.

Huh? Isn't that basically what ATI did? R420 = speed-bumped R300?

You've gotta give credit to ATI for holding the top spot for 2-3 years running now. Nvidia have been on the back foot since the 9700 Pro, and since then they've done nothing but lie and cheat their way through the years.

I just don't have any faith in them. How many driver revisions will they ship with their cards to try and look better in one test than another, and so on?

This part was OK, but please don't turn into another oblivious IHV floozy with comments like the above :p I don't even have a fave color, much less a fave graphics card company... blech!!!
 
I thought vertex shader 3.0 was supposed to be all the rage, not pixel shader 3.0? Both of which the R420 doesn't have.
 
dr3amz said:
I used to be a great fan of Nvidia, until the Radeon 8500 appeared (i swapped my geforce3 for one) and regardless of what is said about the geforce4 it really did nothing in terms of progression other than max the clocks out. The tech demos for example were bitterly disappointing - wolfman lol...

So you swapped to an 8500, despite its very slow FSAA, just for more features - shouldn't you love the NV4x then?
 
Eronarn said:
trinibwoy said:
Huh? Isn't that basically what ATI did? R420 = speed-bumped R300?

This time, though, their speed bump actually won and has similar features. :D

Huh? The R420 doesn't have similar features. It's an FP24, SM2.0 part (no FP16 blending, no geometry instancing, no vertex texturing...) vs full SM3.0 capabilities.

And AFAIK, the GF4 was definitely the fastest card when it was released.
 
Bjorn said:
Huh? The R420 doesn't have similar features. It's an FP24, SM2.0 part (no FP16 blending, no geometry instancing, no vertex texturing...) vs full SM3.0 capabilities.

Call me when that's useful to a substantial number of people. :rolleyes:
 
dr3amz said:
I used to be a great fan of Nvidia until the Radeon 8500 appeared (I swapped my GeForce3 for one), and regardless of what is said about the GeForce4, it really did nothing in terms of progression other than max the clocks out. The tech demos, for example, were bitterly disappointing - Wolfman, lol...

Eronarn said:
Bjorn said:
Huh? The R420 doesn't have similar features. It's an FP24, SM2.0 part (no FP16 blending, no geometry instancing, no vertex texturing...) vs full SM3.0 capabilities.

Call me when that's useful to a substantial number of people. :rolleyes:

Jesus, man... are you that dense? Bjorn's comment was a rebuttal of the above statement. By dr3amz's logic he should want NV4x over R420, but he obviously favors ATI this time around. The discussion had nothing to do with usable features. What usable features of the 8500 were so compelling back then? Can't believe people try to twist shit just to argue an irrelevant point.
 