PS3 - who's really benefitting?

Really, what's the point of this thread other than to be another fanb0y proclamation? 'Shadeposer' posted a similar thread a few weeks ago and the forum members handled it quite well (funny, too - I might add :) though it looks like the mods deleted the topic post). I see this thread as no different from the 'nVidia' version :LOL:.
 
Bjorn said:
Eronarn said:
trinibwoy said:
Huh? Isn't that what ATI basically did? R420 = speed-bumped R300?

This time, though, their speed bump actually won and has similar features. :D

Huh? The R420 doesn't have similar features. It's an FP24, SM2.0 part (no FP16 blending, no geometry instancing, no vertex texturing...) vs full SM3.0 capabilities.

And afaik, the GF4 was definitely the fastest card when it was released.

yes it was, but not for long, due to the release of the 9700pro.

the reason i swapped to a Radeon 8500 was that i was more impressed with the image quality of the radeon at the time.

I loved my gf3 to bits, but it was getting old - and after waiting anxiously for nvidia's new card to bring new things to the table, it didn't. back then speed wasn't a MAJOR issue, not when the geforce4 was first released anyway. and having watched the tech demos for the 3 and then the 4, i felt very disappointed by nvidia's decision to just bump up core speeds and get lazy on their throne.

So i took a sidestep for better IQ. yeah, ATI had teething problems then, but they were soon sorted.

I decided i was going to get the next nvidia card at this point, after the geforce4 - or see what ati would come up with.

as we know, ATI hit first with the 9700pro and just blew everything away. to see demos like the Natural Light one (debevec) in real time 2 years ago was just stunning, and it rekindled the passion for fancy 3d graphics in me :D i was sold.

i have to admit i was still waiting for the next nvidia card, but the way nvidia started acting after the release of the 9700pro just made me sick. any company that tries to deceive their potential buyers like they did will just keep on losing customers. they really thought they were infallible, and as such didn't see the 9700pro coming.

the nv30 launched and, as we all know, flopped in comparison. it's hard not to get anti-product when you are anti-company, as the 2 go hand in hand. and although i know i would never have to deal with the IHV, i couldn't justify giving them any more of my money.

the 6800 is without doubt a fantastic card, but i can't say i'm unhappy to see the x800 outdoing it again.

when i speak out, it really is directed at nvidia rather than their product - i just wish they knew the meaning of honesty and trust.

i'm just one poor soul whose money they lost, but that's how things go - you keep conning the world and soon they get smart. there are many former nvidiots (i was one) that have now turned to ATI.

Nvidia simply aren't in a position anymore whereby they can say/act how they want, because each time the competition beats them, slowly but surely more and more sites/customers move away.

Just my opinion, you don't have to agree :D not a flame or anything :)
 
trinibwoy said:
dr3amz said:
I used to be a great fan of Nvidia, until the Radeon 8500 appeared (i swapped my geforce3 for one) - and regardless of what is said about the geforce4, it really did nothing in terms of progression other than max out the clocks. The tech demos, for example, were bitterly disappointing - wolfman lol...

Eronarn said:
Bjorn said:
Huh? The R420 doesn't have similar features. It's an FP24, SM2.0 part (no FP16 blending, no geometry instancing, no vertex texturing...) vs full SM3.0 capabilities.

Call me when that's useful to a substantial number of people. :rolleyes:

Jesus man... are you that dense? Bjorn's comment was a rebuttal of the above statement. By dr3amz's logic he should want NV4x over R420, but he obviously favors ATI this time around. The discussion had nothing to do with usable features. What usable features of the 8500 were so compelling back then? Can't believe people try to twist shit just to argue an irrelevant point.

and no i shouldn't - because back then fsaa really was too much of a performance hit to use fully, due to the card and cpu.
 
dksuiko said:
Really, what's the point of this thread other than to be another fanb0y proclamation? 'Shadeposer' posted a similar thread a few weeks ago and the forum members handled it quite well (funny, too - I might add :) though it looks like the mods deleted the topic post). I see this thread as no different from the 'nVidia' version :LOL:.

get a clue? if you label someone a fanb0y because they have an opinion which you don't agree with, or which is maybe too strong for you, then you need some help defining the word `fanb0y` :rolleyes:
 
I think the benefit of SM 3.0 depends on the game developers that use it - if it's like other forms of structured programming, it could mean more re-usability of shaders, and possibly that the hardware can keep working with the shader in memory without switching it out for something else. Those switches could be expensive. I do program, but not much with shaders, so I am not the best one to comment.
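
Something like this is what I am thinking of (just a hypothetical sketch, all names made up): with SM 2.0 you often end up with one compiled shader per material variant and switch between them with SetPixelShader, while with SM 3.0 a bool constant plus a dynamic branch can fold the variants into a single shader that never has to be switched out.

Code:
#include <d3d9.h>

// ps_3_0 "uber" shader kept as a C string: one shader, the variant is
// picked at runtime by a dynamic branch on a bool constant (only
// guaranteed to be a real branch from SM 3.0 up).
static const char g_uberShaderSrc[] =
    "bool useShadow : register(b0);                      \n"
    "sampler diffuseMap : register(s0);                  \n"
    "sampler shadowMap  : register(s1);                  \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR          \n"
    "{                                                   \n"
    "    float4 c = tex2D(diffuseMap, uv);               \n"
    "    if (useShadow)              // dynamic branch   \n"
    "        c.rgb *= tex2D(shadowMap, uv).r;            \n"
    "    return c;                                       \n"
    "}                                                   \n";

// Per-draw setup becomes a cheap constant update instead of a shader
// switch: the same shader object stays bound for every material.
void SetMaterial(IDirect3DDevice9* dev,
                 IDirect3DPixelShader9* uberShader, BOOL useShadow)
{
    dev->SetPixelShader(uberShader);
    dev->SetPixelShaderConstantB(0, &useShadow, 1);
}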

Most of this SM 2.0 vs SM 3.0 debate is PR driven: NVidia wants developers to use 3.0, while ATI states it's a waste and doesn't provide it.

One interesting thing I have yet to see an answer to: do NVidia's drivers convert SM 2.0 shaders into SM 3.0 on the fly? If so, I could see NVidia having a significant advantage - of course, this will likely depend on DirectX 9.0c. If it is so, then once DirectX 9.0c arrives, games running on 6800s would instantly be faster than when running on DX9.0b.

Of course this is just my opinion.
 
dr3amz said:
the reason i swapped to a Radeon 8500 was that i was more impressed with the image quality of the radeon at the time.


I take it you never applied full AF with it then :)

dr3amz said:
I loved my gf3 to bits, but it was getting old - and after waiting anxiously for nvidia's new card to bring new things to the table, it didn't. back then speed wasn't a MAJOR issue, not when the geforce4 was first released anyway. and having watched the tech demos for the 3 and then the 4, i felt very disappointed by nvidia's decision to just bump up core speeds and get lazy on their throne.


Heh, I must ask: when did you buy your GF3 and then your 8500? If I remember right, the 8500 came out in Sept-Oct of 2001 and the GF4 came out in March of 2002. The GF3 came out in March of 2001. You must be impatient if you couldn't wait a couple of months :) And what do you mean speed wasn't a major issue back then? Speed was probably more of an issue back then than it is now. People want eye candy now, not brute-strength speed.

The GF4 added a second vertex shader unit over the GF3, which increased its performance greatly.


dr3amz said:
i have to admit i was still waiting for the next nvidia card, but the way nvidia started acting after the release of the 9700pro just made me sick. any company that tries to deceive their potential buyers like they did will just keep on losing customers. they really thought they were infallible, and as such didn't see the 9700pro coming.


I am trying to picture this timeline for you. In March-April of 2001 you get a GF3, then let's say in Dec-Jan of 2001-2002 you get an 8500, and then a 9700 Pro in Sept of 2002? You are one busybody when it comes to purchasing video cards :)

Oh, and btw, in this time frame in which you decided you didn't like the way Nvidia was acting, ATI had the whole Quake/Quack issue :)

Amazing how less than 9 months can change your opinion :)

dr3amz said:
when i speak out, it really is directed at nvidia rather than their product - i just wish they knew the meaning of honesty and trust.

Personally I wouldn't trust either company, because anybody taking your money will bend the truth in order to do it.

dr3amz said:
Nvidia simply aren't in a position anymore whereby they can say/act how they want, because each time the competition beats them, slowly but surely more and more sites/customers move away.


Well, maybe I was reading different reviews, but it appears both cards have benchmarks where they win and others where they lose. Saying one side in this battle is a clear winner is naive IMO. This round is the perfect round because both cards are pretty damn close. Nvidia still has some issues in FarCry, but other than that I can't see any real beatings they took. Hell, I even saw a benchmark where the 6800U was running ahead of the X800 XT Platinum in Shadermark 2.0.

That tells us both companies came out firing on all cylinders, and we as customers get the benefit of there being no clear winner.
 
one of the misconceptions some people have is not realizing that it isn't much effort to recompile existing hlsl 2.0 shaders to the 3.0 target, and it doesn't break compatibility....
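
something like this (just a hypothetical sketch with made-up names - the profile strings and the D3DX call are the real ones, error handling trimmed):

Code:
#include <string.h>
#include <d3d9.h>
#include <d3dx9.h>

// Compile the same HLSL source against a different target profile.
// Nothing in the source has to change; only the profile string does.
LPD3DXBUFFER CompileForTarget(const char* src, const char* profile)
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;
    D3DXCompileShader(src, (UINT)strlen(src),
                      NULL, NULL,        // no macros, no include handler
                      "main", profile,   // e.g. "ps_2_0" or "ps_3_0"
                      0, &code, &errors, NULL);
    if (errors) errors->Release();
    return code;   // NULL on failure
}

// usage: one source file, two shader models
// LPD3DXBUFFER sm2 = CompileForTarget(srcText, "ps_2_0");
// LPD3DXBUFFER sm3 = CompileForTarget(srcText, "ps_3_0");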
 
The shame of it is, one of the big reasons to want to use VS3.0 would be displacement mapping. However, the NV40 doesn't have any higher-order surface support, which greatly limits the potential. Even N-patches would have been nice, for all their faults. And of course ATI doesn't support VS3.0 at all, which is just disappointing.
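
For illustration, a rough sketch of what VS3.0 vertex texturing buys you for displacing a mesh (hypothetical names, HLSL kept as a C string; the fetch has to be tex2Dlod with an explicit mip level, since the vertex stage has no derivatives):

Code:
// vs_3_0 displacement sketch: sample a height map per vertex and push
// the vertex along its normal before the usual transform.
static const char g_displaceVS[] =
    "float4x4 worldViewProj : register(c0);                     \n"
    "float    heightScale   : register(c4);                     \n"
    "sampler2D heightMap    : register(s0); // vertex texture   \n"
    "void main(float4 pos : POSITION, float3 nrm : NORMAL,      \n"
    "          float2 uv : TEXCOORD0,                           \n"
    "          out float4 oPos : POSITION)                      \n"
    "{                                                          \n"
    "    // explicit-LOD fetch: the only legal form in vs_3_0   \n"
    "    float h = tex2Dlod(heightMap, float4(uv, 0, 0)).r;     \n"
    "    pos.xyz += nrm * h * heightScale;                      \n"
    "    oPos = mul(pos, worldViewProj);                        \n"
    "}                                                          \n";

Without HOS the mesh density is fixed up front, so the displacement can only move vertices the mesh already has - which is exactly why the lack of tessellation hurts.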
 
dr3amz said:
and no i shouldn't - because back then fsaa really was too much of a performance hit to use fully, due to the card and cpu.

So tell us again WHY you weren't impressed by the GF4 although it was definitely faster than the 8500?

It's pointless of course given the several threads you've started aimed at ATI glorification and all of the criticism you've levelled at anyone criticizing ATI in any way.

Well I think RUBY sucks!!! :p
 
trinibwoy said:
dr3amz said:
and no i shouldn't - because back then fsaa really was too much of a performance hit to use fully, due to the card and cpu.

So tell us again WHY you weren't impressed by the GF4 although it was definitely faster than the 8500?

It's pointless of course given the several threads you've started aimed at ATI glorification and all of the criticism you've levelled at anyone criticizing ATI in any way.

Well I think RUBY sucks!!! :p

lol no i haven't - i'm trying to think of you as a neutral but you're really coming off as nv biased ;)

regarding time scales... hmm, i pretty much always had the latest card out - although i skipped the geforce2 range (i had the original creative annihilator pro 256ddr? wow, nice card) in favour of waiting for the geforce3.

i don't really remember the time scales exactly, but i picked up my 8500 around december/january - just before the gf4 came out. i did it because i'd seen the difference in image quality, and that made me switch to the 8500, especially as performance was on par with the gf3 ti500. as i work in the industry i get parts at cost price, so selling them on 6 months later and buying a newer card with the cash isn't an issue for me :D

the geforce4 came out, which i was gonna buy - but to me it really wasn't worth it over the geforce3 OR the 8500. yes, of course something like the ti4600 was in the end, but i figured that after waiting that long for nv to announce the model and actually get it out, i'd be better off waiting to see what ati had to offer.

so i did, and i'm glad i did - tell me i made a bad decision? :D

when i say something is suffering, i'm comparing it to the competition - obviously it's still producing fantastic results, but it's still suffering in comparison to the competition. for a card/company that wants to be number 1, that is.

i remember the nv30/9700 fiasco also. this reeks of it again, and ati came through it the stronger. i just think they will again this time.

fact is, nvidia got it (mostly) right this time, releasing at launch the kind of card that previously took them 3-4 months to fix.

we're also gonna see further improvements from both camps as drivers for these cards mature.

so i'm not saying the 6800 is bad, just that i have a preference for the x800/ati atm because of what nvidia has done previously.

and if you want to bring up the quack scenario, there are far more cases of nvidia cheating by comparison - it's just so one-sided. and since the 9700pro, ati have been happily beavering away refining their product, whilst nvidia mostly spent the time coming out with more false numbers, slating the opposition, and trying to start wars with products they claimed weren't representative of how games were going to develop.

their comments alone did a lot of damage to the credibility of 3dmark03 at the time, and all over the fact that the nv30 didn't perform as promised. that wasn't 3dmark's fault.

so funny how nvidia flaunt 3dmark scores at their launch now, isn't it. :rolleyes:
 
dr3amz said:
get a clue? if you label someone a fanb0y because they have an opinion which you don't agree with, or which is maybe too strong for you, then you need some help defining the word `fanb0y` :rolleyes:

Then what exactly were you doing when you stated: "So to me, it really is no loss - and those that really try to brag about ps3 are either just fanboys or just gripping onto one thing that the 6800 has that the x800 doesnt. " when, at the very same time, you were advocating 3dc? Hypocrisy?
 
dksuiko said:
dr3amz said:
get a clue? if you label someone a fanb0y because they have an opinion which you don't agree with, or which is maybe too strong for you, then you need some help defining the word `fanb0y` :rolleyes:

Then what exactly were you doing when you stated: "So to me, it really is no loss - and those that really try to brag about ps3 are either just fanboys or just gripping onto one thing that the 6800 has that the x800 doesnt. " when, at the very same time, you were advocating 3dc? Hypocrisy?

3dc is an open standard, and i was stating that ati are demonstrating it now - nv are not, they're too busy pushing ps3 - which really doesn't seem to be an issue if you're using 3dc anyway?
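
just to show what i mean (a rough sketch, names made up - and the exact channel swizzle depends on how the driver exposes the format, so don't quote me on the .rg): 3dc only stores the x and y of each normal, and the pixel shader rebuilds z, which is where the compression/bandwidth win comes from - and it runs on any sm2.0 part, no ps3 needed:

Code:
// ps_2_0-level HLSL kept as a C string: decode a two-channel (3Dc-style)
// normal map by reconstructing the z component in the shader.
static const char g_fetchNormalSrc[] =
    "sampler2D normalMap : register(s0);                        \n"
    "float3 FetchNormal(float2 uv)                              \n"
    "{                                                          \n"
    "    // two stored channels, expanded from [0,1] to [-1,1]  \n"
    "    float2 xy = tex2D(normalMap, uv).rg * 2 - 1;           \n"
    "    // z rebuilt from the unit-length constraint           \n"
    "    float z = sqrt(saturate(1 - dot(xy, xy)));             \n"
    "    return float3(xy, z);                                  \n"
    "}                                                          \n";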
 
Uhh... that one flew right over your head, didn't it?


Nevermind, it's futile.
 
dksuiko said:
Uhh... that one flew right over your head, didn't it?


Nevermind, it's futile.

no, i got your point. but my original point was: do you really need ps3 when 3Dc performs as it does? at least for now? :D
 
I find the now-revealed X800 series rather mediocre... As someone who is interested in the "general programming with GPUs" scene, the new cards from ATI are... hmm... really not that good.

There seems to be decent shading power, it's just that the sucky Linux/OGL performance + a 2-year-old feature set really aren't cutting it. There is so much more "expressive power" with a full SM3.0 part when you are trying to implement generic algorithms as optimized GPU versions.

If people still have to wait ONE YEAR for the (finally) real SM3.0/SM3.0+ part from ATI, things are really fuck*d up.
 
eSa said:
I find the now-revealed X800 series rather mediocre... As someone who is interested in the "general programming with GPUs" scene, the new cards from ATI are... hmm... really not that good.

There seems to be decent shading power, it's just that the sucky Linux/OGL performance + a 2-year-old feature set really aren't cutting it. There is so much more "expressive power" with a full SM3.0 part when you are trying to implement generic algorithms as optimized GPU versions.

If people still have to wait ONE YEAR for the (finally) real SM3.0/SM3.0+ part from ATI, things are really fuck*d up.

I feel the same way right now, but I hope good news will come tomorrow (well, later today). At least Dave alluded that we will get some glimpse of ATI's future plans/roadmap... :idea:
 
Colourless said:
The shame of it is, one of the big reasons to want to use VS3.0 would be displacement mapping. However, the NV40 doesn't have any higher-order surface support, which greatly limits the potential. Even N-patches would have been nice, for all their faults. And of course ATI doesn't support VS3.0 at all, which is just disappointing.
I don't see how HOS is really that essential. I like vertex texturing for things like cloth, water, and soft-body simulation. I really think these sorts of techniques could greatly improve game immersion. A fixed mesh is just fine by me.

I definitely think ATI won the performance crown again this time (especially seeing the RTHDRIBL results), but vertex texturing opens up a plethora of possibilities. If I were buying one of these cards, it would likely be the NV40. However, I'm going to wait for the unified-pipe generation.

Hopefully we'll see some render-to-vertex-array or uberbuffer/superbuffer stuff for R3xx/R4xx soon. They should suffice for the above applications.
 
LeStoffer said:
eSa said:
I find the now-revealed X800 series rather mediocre... As someone who is interested in the "general programming with GPUs" scene, the new cards from ATI are... hmm... really not that good.

There seems to be decent shading power, it's just that the sucky Linux/OGL performance + a 2-year-old feature set really aren't cutting it. There is so much more "expressive power" with a full SM3.0 part when you are trying to implement generic algorithms as optimized GPU versions.

If people still have to wait ONE YEAR for the (finally) real SM3.0/SM3.0+ part from ATI, things are really fuck*d up.

I feel the same way right now, but I hope good news will come tomorrow (well, later today). At least Dave alluded that we will get some glimpse of ATI's future plans/roadmap... :idea:

why? how many years did it take to get a ps1.4 part from nvidia? how many years to get a real ps2.0 part from nvidia?
 