Devs Speak on PS3.0

nAo said:
just don't fit, at least with the scenario you depicted.
I don't think EA (or, just to name another: Epic Games) needs nVidia's (or ATI's) money at all, but I could be wrong...

Maybe you're right, but the facts are never as black and white as you're making them out to be.

EA Corp is rich and powerful. EA Corp makes a lot of money. EA PC makes a small fraction of that money.

Can you think of any reason EA would specifically disable features for non-nVidia products, even when those features can be enabled through external third-party programs and work fine, with higher performance to boot?

Aaron Spink
speaking for myself inc.
 
ninelven said:
I don't think Carmack, Sweeney, etc. are poor.

They may not be poor, but then again, neither Spielberg nor Lucas is poor. Both engage in promotion-for-profit marketing schemes with their products. These schemes offset the cost of producing the entertainment and thus require Spielberg/Lucas to use less of their own money. Remember, just because Carmack/Sweeney aren't poor doesn't mean that their companies are rolling in cash.

Aaron Spink
speaking for myself inc.
 
Keeping this thread off topic isn't helping, threatening to post things that I've posted in other forums is also trolling, and generally carrying on about it after the person has said they will no longer comment is just as bad.

Back on topic

Anyway, you can't escape it: Nvidia supports 3.0 shaders and ATI won't (as far as we all know so far). If you want to run games using these features, then it will have to be Nvidia until ATI releases something better.

All the shader sets are the natural progression of the previous versions, and all are important; you can't downplay any of them.
 
Anyway, you can't escape it: Nvidia supports 3.0 shaders and ATI won't (as far as we all know so far). If you want to run games using these features, then it will have to be Nvidia until ATI releases something better.

If I can't see the features working with my eyes, what exactly am I missing out on?
 
If I can't see the features working with my eyes, what exactly am I missing out on?

Exactly nothing, apart from maybe some performance. What YOU can see and make out is irrelevant to what is actually happening.

We are talking about the performance end of the cards here; as the flagship cards, they get the most press and they've been benchmarked. I don't know about other people, but if I pay a lot of money for a high-end video card, I expect to get a full feature set out of it for the latest technology.

I'm looking forward to STALKER personally; apparently it's going to be using PS3.0. The game already looks awesome, and I can't wait to see what it looks like on the latest hardware.
 
I don't know about other people, but if I pay a lot of money for a high-end video card, I expect to get a full feature set out of it for the latest technology.

That sure didn't stop FX buyers from making their decision. Actually, don't you have an FX card?
 
reever said:
Anyway, you can't escape it: Nvidia supports 3.0 shaders and ATI won't (as far as we all know so far). If you want to run games using these features, then it will have to be Nvidia until ATI releases something better.

If I can't see the features working with my eyes, what exactly am I missing out on?

According to the NV40 launch video, you will be missing out on a lot of "Dudes" on the screen; apparently more "Dudes" than SM 2.0 is capable of without severe slowdown.

If ATi's compression techniques are what they suggest, you shouldn't see the slowdown (I'd imagine), though I am unsure whether you will still be capable of seeing more "Dudes" on screen.
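As I understand it, the feature behind the "Dudes" demo is SM 3.0-era geometry instancing: one draw call submits the character mesh along with an array of per-instance transforms, instead of one call per character. A rough CPU-side sketch of why that helps (the function name and the overhead figure are invented purely for illustration, not any real API):

#include <stdio.h>

#define NUM_DUDES 1000
#define DRAW_OVERHEAD_US 50   /* assumed fixed CPU/driver cost per draw call */

/* hypothetical stand-in for submitting one draw call to the driver */
static int submit_draw(int num_instances)
{
    (void)num_instances;      /* per-instance work happens on the GPU */
    return DRAW_OVERHEAD_US;  /* only the fixed submission cost matters here */
}

int main(void)
{
    int cost_per_dude = 0, cost_instanced = 0;
    int i;

    /* Pre-instancing style: one draw call per character, overhead paid N times. */
    for (i = 0; i < NUM_DUDES; i++)
        cost_per_dude += submit_draw(1);

    /* Instanced style: one call covers every character. */
    cost_instanced = submit_draw(NUM_DUDES);

    printf("one draw per dude: %d us of submission overhead\n", cost_per_dude);
    printf("instanced draw:    %d us of submission overhead\n", cost_instanced);
    return 0;
}

The point being that the per-call overhead is paid once rather than once per character, which is why more characters fit in a frame.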

Something must be great about it, or else why come out with it? If R500 is to have it, what's wrong with early adoption by nVidia if they already have a use for it now?

Besides, R500 is PCI-Express, unless ATi decides to build another AGP card, though I thought R420 was it?

AGP owners will get extended life out of their SM 3.0 nVidia cards due to just that: SM 3.0 support.

I'd be pissed about the HL2 coupon if that is why I got an ATi product. I'd also be pissed about my $399 or $499 ATi product being replaced by a similarly priced R500 in 8 months because it will then offer SM 3.0 support.

After a $399 or $499 purchase, I'm not so sure I'd wanna jump so quickly to a PCI-E native mobo/videocard, plus new memory, though... that might just be my way of thinking. :D
 
Princess_Frosty said:
If I can't see the features working with my eyes, what exactly am I missing out on?

Exactly nothing, apart from maybe some performance. What YOU can see and make out is irrelevant to what is actually happening.

I don't know about other people, but if I pay a lot of money for a high-end video card, I expect to get a full feature set out of it for the latest technology.

So, judging from your posts, a few years back you'd have chosen an ATi R100 over a GeForce2, given its support for 16x aniso, 3D textures, hw matrix skinning and excellent image quality.

Or wouldn't you?
 
For the sake of clarity...

I own an Asus FX 5900 Ultra which I've BIOS-flashed to an FX 5950 Ultra and then overclocked to 550/975, which incidentally is an overclock of 100MHz on the core and 125MHz on the memory. It will actually overclock more than that, but I see no reason to push it any further; that's a fairly massive overclock for stock cooling.

If your reason for posting this is to try and imply that the FX5900 Ultra cards are not fully DX9.0-compatible hardware, then try again.

And just for the record, in several reviews I've read, the overclocked FX 5950 Ultras beat all other overclocked cards in a lot of benchmarks, including DX9 games such as Halo.

I won't comment on older hardware any more, as this is off topic and again will end in trolling. The discussion is about Shader 3.0, and that's all I'll comment on in any depth.
 
So, judging from your posts, a few years back you'd have chosen an ATi R100 over a GeForce2, given its support for 16x aniso, 3D textures, hw matrix skinning and excellent image quality.

Or wouldn't you?

I am guessing that the GeForce2 had much, much better performance than the R100?

The difference this time around is that the GeForce 6800U is obviously a very fast card, and a big upgrade over any of the current-generation cards with respect to performance, and arguably even features.
 
If I can't see the features working with my eyes, what exactly am I missing out on?

Apparently, PS 3.0 is supposed to be a more efficient way of processing instructions than PS 2.0, even if image quality is unchanged.
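One concrete (if simplified) reading of that claim: PS 3.0 adds dynamic branching in the pixel shader, so per-pixel work can be skipped when a test fails, whereas PS 2.0 effectively evaluates everything and masks the result. A toy CPU analogy, with the shadow test and pixel count made up purely to illustrate the point:

#include <stdio.h>

#define PIXELS 1000000

/* pretend half the pixels are in shadow */
static int in_shadow(int px) { return px & 1; }

/* stand-in for an expensive per-pixel lighting computation */
static double expensive_lighting(int px) { return px * 0.5 + 1.0; }

int main(void)
{
    long evals_ps20 = 0, evals_ps30 = 0;
    int px;

    for (px = 0; px < PIXELS; px++) {
        /* PS 2.0 style: no real branching, so the lighting is computed
         * for every pixel and the shadowed result is simply masked out. */
        double lit = expensive_lighting(px);
        evals_ps20++;
        if (in_shadow(px))
            lit = 0.0;
        (void)lit;

        /* PS 3.0 style: dynamic branching skips the work entirely
         * for pixels that fail the test. */
        if (!in_shadow(px)) {
            (void)expensive_lighting(px);
            evals_ps30++;
        }
    }

    printf("PS 2.0-style lighting evaluations: %ld\n", evals_ps20);
    printf("PS 3.0-style lighting evaluations: %ld\n", evals_ps30);
    return 0;
}

Same image either way; the PS 3.0 path just does roughly half the lighting work in this contrived case.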
 
If your reason for posting this is to try and imply that the FX5900 Ultra cards are not fully DX9.0-compatible hardware, then try again.

How about you answer the question? They may manage PS/VS 2.0, but that is hardly the full feature set (compatible != fully supported); they are lacking DX9 capabilities that would make it "full". But did this stop people looking at FXs from buying them? Or are you just going to shift your logic here and now make it all about performance and not features?
 
I wouldn't say much better, but overall it was faster, especially after the Ultras.

And he is implicitly comparing NV40 with R420, so in reality there is no difference.

Apparently, PS 3.0 is supposed to be a more efficient way of processing instructions than PS 2.0, even if image quality is unchanged.

That doesn't sound like much from a consumer perspective if the PS2.0 card outperforms the PS3.0 one, does it?

OTOH if you were a developer....
 
That doesn't sound like much from a consumer perspective if the PS2.0 card outperforms the PS3.0 one, does it?

Obviously a card with slow PS 2.0 performance and full support for PS 3.0 will not be as desirable (for most) as a card with fast PS 2.0 performance and no support for PS 3.0. However, the 6800U is definitely very fast in PS 2.0 performance.

As for exactly how the NV40 compares to the R420, unfortunately we still have to wait a bit to see more :)

Purchasing decisions are always based on numerous factors. Performance in various games, features, image quality, price, stability, etc.
 
jimmyjames123 said:
If I can't see the features working with my eyes, what exactly am I missing out on?
Apparently, PS 3.0 is supposed to be a more efficient way of processing instructions than PS 2.0, even if image quality is unchanged.
What a ridiculous statement. It's completely dependent on what the application is doing.

-FUDie
 
What a ridiculous statement. It's completely dependent on what the application is doing.

What a ridiculous reply! Care to expand more, or are you just angry tonight? How about explaining in detail what the advantages of PS 3.0 vs PS 2.0 are?

From what I have read, there is a gain in efficiency when using PS 3.0 vs PS 2.0. In other words, the game can be programmed to run more efficiently with a given set of effects using PS 3.0 vs PS 2.0. If this were not the case, then it would be completely pointless for developers like Crytek to even bother to code add-ons for PS 3.0.
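For what it's worth, one source of that efficiency is the instruction limit: a ps_2_0 shader gets 64 arithmetic instruction slots, while ps_3_0 guarantees at least 512 plus real loops, so an effect that would need several passes (each re-rasterizing the geometry and hitting the framebuffer) under PS 2.0 can run in a single pass. A back-of-the-envelope sketch, with the per-pass overhead figure and the example effect cost invented for illustration:

#include <stdio.h>

/* per-shader arithmetic instruction slot limits (from the DX9 shader specs) */
#define PS20_SLOTS 64
#define PS30_SLOTS 512    /* minimum guaranteed; loops raise the effective count */

#define PASS_OVERHEAD 100 /* invented cost of re-rasterizing + framebuffer traffic */

/* number of passes needed to execute `instructions` worth of shading */
static int passes_needed(int instructions, int slots)
{
    return (instructions + slots - 1) / slots; /* ceiling division */
}

int main(void)
{
    int effect_cost = 200; /* a hypothetical effect: 200 arithmetic instructions */

    int ps20_passes = passes_needed(effect_cost, PS20_SLOTS);
    int ps30_passes = passes_needed(effect_cost, PS30_SLOTS);

    printf("PS 2.0: %d passes, %d units of pass overhead\n",
           ps20_passes, ps20_passes * PASS_OVERHEAD);
    printf("PS 3.0: %d pass, %d units of pass overhead\n",
           ps30_passes, ps30_passes * PASS_OVERHEAD);
    return 0;
}

Same effect, same output, but the PS 3.0 path avoids the multi-pass overhead entirely.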
 