ATI B.S. Bingo FUD

Mendel said:
Ratchet said:
Mendel said:
there are no winners in the graphics industry when things get like they are now...
What do you mean? When things get competitive? I think the 6800GT and X800XL users out there would disagree.

Well, last time I checked (when I bought my 6800GT), ATI cards were nowhere to be seen, not in this country anyway, and Nvidia's offerings were hugely expensive, so I almost had to sell my aunt and uncle to get this thing...

But that's not what I meant. I meant the mudslinging contest that is bound to follow. I meant the apparent unprofessionalism. I meant the funds put into PR bullshit versus engineering products. If they start to hire employees as immature and unprofessional for their engineering departments as they apparently do for their PR departments, I don't think we will ever see another working product from either company.

Europeans pay the production costs. In the States the competition is too fierce, and nobody wants to punish their home market with obnoxious prices. The chips can't be pushed forward when even the current ones can barely be shipped working in sufficient volume, not to mention that new features, even if they could be implemented in hardware, will have no API support before Longhorn. So the only option is to attack the competitor in some other way.


and it's just the same in English, and the same in Swedish. ;)


EDIT: I've posted a cleaned-up English abstract of this text below. :)
 
Nappe1 said:
Europeans pay the production costs. In the States the competition is too fierce, and nobody wants to punish their home market with obnoxious prices. The chips can't be pushed forward when even the current ones can barely be shipped working in sufficient volume, not to mention that new features, even if they could be implemented in hardware, will have no API support before Longhorn. So the only option is to attack the competitor in some other way.
Moi! (That's about as much Finnish as I've learned in 4.5 months living here... :( )
 
t0y said:
Moi! (That's about as much Finnish as I've learned in 4.5 months living here... :( )

that's quite a lot. :)

As for the post I wrote, I partially agree with Mendel. Still, I have been watching this "Graphics Cards Circus" long enough to notice that when competition goes neck and neck, it means cheaper prices in America and higher prices in Europe. This is because the companies consider the U.S./Canada markets to be so important. Then again, everything in Europe is usually a bit more expensive anyway, so it's quite easy to make us pay the development costs and say it's all about tolls, taxes, etc.

Another point is that while the companies are struggling to deliver the currently "available" chips in sufficient volume, it would seem there's not much room for improvement right now (as the implementation of SLI could indicate as well). And even if there were the possibility of developing better hardware (as a competitive weapon instead of all this FUD), there's no API support for new features before Longhorn. No API support means no games or software, which in turn means that no one is going to pay extra for them.

So, the final conclusion? Both companies are running at full steam already. There's no other way to boost sales than FUD and very aggressive marketing campaigns. As the semiconductor side keeps improving yields, both companies will right away use all of that potential. After that, they live waiting for the next bigger step, and in the meantime they have to keep boosting sales, which means more marketing.

Again, these are all my own opinions, based on what I have seen during the last 10 years of watching the PC market.




Has anyone thought about how dependent both companies are on Taiwanese semiconductor manufacturers, especially TSMC? What happens if something stalls TSMC's operations for a few months, or even half a year?
 
BZB said:
- SM3.0: We've seen nothing significant in games using SM3.0. Nvidia's version is slow (especially on anything other than the very top-end product) and can't do branching without a big performance hit. You will need a new card to run any game with significant SM3.0, including the ones from Nvidia. ATI said they would put SM3.0 in their cards when it was worth expending the transistors for it and it wouldn't cause the rest of the chip to suffer just to produce a marketing checkbox at this time.

You can't say something like that in a thread Chris Ray will see.
 
Bouncing Zabaglione Bros. said:
- SM3.0: We've seen nothing significant in games using SM3.0. Nvidia's version is slow (especially on anything other than the very top-end product) and can't do branching without a big performance hit. You will need a new card to run any game with significant SM3.0, including the ones from Nvidia. ATI said they would put SM3.0 in their cards when it was worth expending the transistors for it and it wouldn't cause the rest of the chip to suffer just to produce a marketing checkbox at this time.

It remains to be seen what benefit early adoption of SM3 will bring. It may turn out that the NV40 implementation is not good enough for the next gen engines, or it may turn out that NV40 provides decent enough performance.

If my 6800GT runs the UE3 engine and future D3 engine games well, I'll be happy with my purchasing decision, particularly if R4x doesn't. Unlike some, I'm on about a five-year upgrade cycle, so buying the latest available technology in the hope of some future-proofing was a priority.

But if you upgrade every year or so, then I agree that SM3 is of little benefit right now.

Tom
 
ivzk said:
Snip Chris Ray Snip

It really does make you wonder how I got brought up in all this. Your desire to flame must be insatiable. Heck, I haven't even commented on this article in any forum. Having learned my lesson from the last Nvidia slides, I stay far away from this kind of PR now.

In any event, continue your regularly broadcast flaming.
 
I was just trying to warn BZB that, according to you, SM3.0 has been more than proven, and that you may strongly disagree with his opinion that it has not been proven whatsoever. That's all.
 
Eh? SM 3.0 is just as proven as SM 2.0B or SM 2.0. It's an extension of the capabilities of the prior shader models. Believing either would just fall to its knees by loading an SM 3.0/2.0B profile is pretty absurd, yes. That's always been my position. You use it or you don't; how you use it is entirely a different matter. Personally, I find the idea that you have to use more than 512 instructions, Dynamic Flow Control, Geometry Instancing, and Dynamic Branching all at once to be considered an SM 3.0 application rather absurd. *Edit* As far as I'm concerned, if you use one benefit of SM 3.0, or even SM 2.0B, over the prior models, then you have gained something from it.
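For concreteness: none of those features is a single on/off switch at the API level, either. A Direct3D 9 application typically reads the device caps and picks whichever shader profile the hardware exposes. A minimal sketch; the helper name and the 512-instruction-slot test for 2.0b-class parts are my own illustration, not anything from this thread:

Code:
// Sketch: query D3D9 caps and pick the best pixel shader profile.
#include <d3d9.h>

const char* PickPixelShaderProfile(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                  D3DDEVTYPE_HAL, &caps)))
        return "ps_1_1";  // be conservative if the query fails

    // NV40-class parts report pixel shader 3.0 in the caps.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return "ps_3_0";

    // 2.0b-class parts (e.g. R420) report 2.0 but with extended
    // limits, such as 512 instruction slots instead of the base 96.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) &&
        caps.PS20Caps.NumInstructionSlots >= 512)
        return "ps_2_b";

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "ps_2_0";

    return "ps_1_4";  // older-hardware fallback
}

The engine then loads whichever compiled shader matches, which is why "using SM 3.0" and "requiring SM 3.0" are very different claims.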


What does my opinion of SM 3.0 have to do with this article anyway, since you so diligently brought it up? I have already speculated on what your answer will be, but I will humor you with a response at this point. Though not for long. This topic isn't labeled "My Opinion of SM 3.0", nor is it even relevant.
 
Your opinion on SM3.0 has absolutely nothing to do with this article. I just know that it differs greatly from BZB's opinion, as well as from the opinion of the poster below BZB's post. I was looking more for a brief discussion where I might be able to pick something up, instead of having to read through a 10-page thread discussing the same thing.

He makes a post, you make a countering post, and there: I got the gist of it in a very short amount of reading.
 
Sure. I do find it interesting that people seem to label Shader Model 3.0 as some kind of magic switch that you either turn on or off; it perplexes me a bit. Whether that's due to Nvidia's marketing or just a simple misunderstanding of it is as good a question as any. The whole point of each new shader model has been to extend the limitations of the prior ones with more flexibility. That is, of course, my opinion. I also feel there is no reason to call it any more unproven when the technology is clearly working.

A few examples:

3DMark05
Far Cry
ShaderMark

Are these programs using the absolute full potential of Shader Model 3.0? No. Are they using it? Yes. And in many cases they are benefiting from it compared to the prior shader models, due to the increased flexibility. I honestly don't see how this shows the technology to be flawed or unproven. I mean, seriously: many of our current titles and SM 2.0 effects could be done in 1.4. That doesn't make ATI's implementation of Shader Model 2.0 unproven either. There are of course IQ enhancements, such as the increases in floating-point precision, but in many cases those are the few immediately noticeable effects.

P.S. My above examples also cover SM 2.0B.
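That matches how the tooling works, for what it's worth: the same HLSL source can generally be compiled against several profiles, and a title "uses SM 3.0" simply by passing ps_3_0 as the compile target where the caps allow it. A hedged sketch with the D3DX compiler; the file name and entry point are hypothetical:

Code:
// Sketch: one HLSL source compiled against several shader models.
#include <d3dx9.h>
#include <cstdio>

LPD3DXBUFFER CompileForProfile(const char* profile)
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;

    HRESULT hr = D3DXCompileShaderFromFileA(
        "lighting.psh",   // hypothetical HLSL source file
        NULL, NULL,       // no #defines, default #include handling
        "main",           // hypothetical entry point
        profile,          // "ps_2_0", "ps_2_b" or "ps_3_0"
        0,                // no compiler flags
        &code, &errors, NULL);

    if (FAILED(hr)) {
        // e.g. the shader needs more instructions or more flexible
        // branching than this profile allows.
        if (errors)
            printf("%s: %s\n", profile,
                   (const char*)errors->GetBufferPointer());
        return NULL;
    }
    return code;  // caller creates the shader and Release()s the buffer
}

A shader that fits within ps_2_0 will build for all three targets; one that leans on long loops or dynamic branching will only build for ps_3_0, which is exactly the "extension, not magic switch" relationship described above.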
 
The technology is working. No doubt about that part. I guess the thing I have problems understanding is as follows:

- The 9700 Pro was the first SM2.0 card.

It ran all the 2.0 benchmarks pretty damn well. But if you load it up with one of today's games, like Far Cry or similar, it's not the most enjoyable experience, according to many people.

- The 6800 series is the first true SM3.0 card.

It runs everything out there with SM3.0 without a hitch. The question is how it will cope with tomorrow's games, with more and more SM3.0-oriented programming in the code.

Is there a parallel between the two series of cards? Was the jump from SM1.4 (or whatever it was) to SM2.0 a much more significant jump than the one from 2.0 to 3.0? Is the 6800 series powerful enough to be deemed more future-proof right now?

EDIT: I forgot all about the FP side of things.
 
SM2.0 was embraced by developers when it came with the 9700. SM3.0 is similarly being embraced by developers now that it has come with the 6800. Software doesn't get developed overnight; it takes time.

SM2.0 yesterday is as relevant as SM3.0 today. Unless you feel it's not relevant because ATI did not release any hardware supporting SM3.0, in which case there is really nothing I can put forward to you.
 
Smurfie said:
SM2.0 was embraced by developers when it came with the 9700. SM3.0 is similarly being embraced by developers now that it has come with the 6800. Software doesn't get developed overnight; it takes time.

SM2.0 yesterday is as relevant as SM3.0 today. Unless you feel it's not relevant because ATI did not release any hardware supporting SM3.0, in which case there is really nothing I can put forward to you.

The relevance of SM3.0 depends on when the next version of DX comes out, or WGF, or whatever it will be called.

The sooner that comes out, with hardware that takes advantage of it, the less important SM3.0 will be.

Is it good to have SM3.0 cards? Yes.

Will it mean your future games will run well on an SM3.0 card, or better than on an SM2.0 card? No.

There is no reason in my mind why they would spend more time optimizing for such a small base as SM3.0; even in the next year it will be much smaller than the base of SM2.0-capable cards on the market.
 
ivzk said:
It ran all the 2.0 benchmarks pretty damn well. But if you load it up with one of today's games, like Far Cry or similar, it's not the most enjoyable experience, according to many people.
I beg to differ! "Today's game" Far Cry came out well before the 9700 Pro was old.

I originally played Far Cry on my 9700 Pro and loved it. :p
 
I've got Digi's back on this one. It may not have played with every option maxed on my 9700 Pro, but the IQ looked pretty damn good and it gave me an enjoyable experience!
 
I think the thing about R300 is that it introduced a lot of new features (like SM2.0) that were actually very usable from the get-go. Up till then, we'd seen all major new features come in either as a checkbox feature for marketing, or as an advanced feature for developers to use in developing games designed to run on updated hardware 18 months down the line.

The impressive general (and SM2.0 in particular) performance gave ATI a product in R300 that could run the first generation of SM2.0 titles quite well. R300 didn't just allow developers to write for SM2.0, it created a market for those SM2.0 titles too.

Now we're back to SM3.0 as a marketing tickbox and development tool, but one not giving enough performance to be a usable consumer product for running significant SM3.0 games; we'll need the next gen of SM3.0 hardware for that. Those games are in development now thanks to the 6800, but you'll need a better SM3.0 card than that to actually play them.

R300 was a startling blip in the way that new features come to market in games and hardware, and in the way the market gets created for those new features. We shouldn't really be surprised that we've gone back to the more usual pattern: new functionality introduced "for the developers" in the first iteration, and then "for everyone else" the next time around.
 
Bouncing Zabaglione Bros. said:
I think the thing about R300 is that it introduced a lot of new features (like SM2.0) that were actually very usable from the get-go.
Like what? 3DMark03? The 9700 Pro came out in Aug/Sep; 3DMark03 was four months later. Hmm, but I get your point.


Bouncing Zabaglione Bros. said:
Now we're back to SM3.0 as a marketing tickbox and development tool, but one not giving enough performance to be a usable consumer product for running significant SM3.0 games; we'll need the next gen of SM3.0 hardware for that. Those games are in development now thanks to the 6800, but you'll need a better SM3.0 card than that to actually play them.
Wrong. The current SM3.0 cards from NVDA will play the next SM3.0 games, just at lower res. But time will tell, because we really don't know whether what you say is true.
 
Hey, I think SM 3.0 is important too, and if it isn't being used by developers, it probably should be.

My only issue with SM 3.0 right now is the supposed necessity of having it when it isn't being used much yet. (From my understanding, IMHO, and all that rot. ;) )

More features are almost always a good thing, 'specially when they're new features that are part of some form of universal specification, so all card makers can include them. :)
 
jvd said:
The relevance of SM3.0 depends on when the next version of DX comes out, or WGF, or whatever it will be called.

The sooner that comes out, with hardware that takes advantage of it, the less important SM3.0 will be.
That hardware will still have to support SM1-3 with SM3 being the most flexible of them.

Just a quick fact: there are still many games that top out at PS1.4 in the days of SM3 hardware. Now, if you remember, NVIDIA didn't support PS1.4 at all; they jumped straight from SM1 to SM2. As for SM3: R520 will support it, and WGF support will come sometime in 2006, which gives SM3 a nice 2+ years as the best SM and 1+ years of being supported by both major h/w vendors.
 
karlotta said:
Like what? 3DMark03? The 9700 Pro came out in Aug/Sep; 3DMark03 was four months later. Hmm, but I get your point.

HL2, Far Cry, and Halo, off the top of my head, are all playing SM2.0 surprisingly well on R3x0 cards, but there are a few others.

karlotta said:
Wrong. The current SM3.0 cards from NVDA will play the next SM3.0 games, just at lower res. But time will tell, because we really don't know whether what you say is true.

SM3.0 branching will probably bring current SM3.0 cards down to unplayable levels; even Nvidia advises developers to stay away from one of the main advantages of SM3.0 because there is such a big performance hit on first-gen SM3.0 hardware.

Besides, I doubt that anyone looking to play a UE3 game or Stalker in SM3.0 will consider lowering the res to 800x600 adequate performance; they'll be looking to get a second-gen SM3.0 card, pretty much as I suggested.
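On the branching hit mentioned above: as I understand it, first-gen SM3.0 hardware evaluates branches over large blocks of pixels at once, so unless every pixel in a block takes the same path, the block effectively executes both paths. A toy cost model of that behavior; the block size and cycle costs are made-up numbers, purely illustrative:

Code:
// Toy cost model for coarse-grained dynamic branching.
// If all pixels in a block agree, the block runs one path;
// if they diverge, it effectively pays for both.
#include <cstdio>

int BlockCost(const bool* takesIf, int blockSize, int costIf, int costElse)
{
    bool anyIf = false, anyElse = false;
    for (int i = 0; i < blockSize; ++i) {
        if (takesIf[i]) anyIf = true;
        else            anyElse = true;
    }

    if (anyIf && anyElse)
        return costIf + costElse;   // divergent block: both paths run
    return anyIf ? costIf : costElse;
}

int main()
{
    const int N = 4;
    bool coherent[N]  = { true, true, true, true };
    bool divergent[N] = { true, false, true, false };

    // Say the lit path costs 40 cycles and the shadowed path 10.
    printf("coherent:  %d cycles\n", BlockCost(coherent,  N, 40, 10)); // 40
    printf("divergent: %d cycles\n", BlockCost(divergent, N, 40, 10)); // 50
}

With very coarse granularity, real scenes rarely stay coherent, so the "skip work with a branch" win mostly evaporates; it takes finer-grained branching in later hardware generations for it to pay off.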
 