Why did ATI ditch the R300 architecture?

Ailuros said:
Flipper was in a very advanced development stage when the acquisition took place.

I believe SugarCoat's point was that ArtX brought ATI into the "console world" with Flipper; it has nothing to do with whether Flipper was ready or not when they acquired ArtX. It just means it opened the door to the console world for them, which led to apparently well-received work with Nintendo and Microsoft.
Without ATI acquiring ArtX, I don't think we'd see Revolution with ATI graphics. XB360 perhaps, but not Revolution.
 
They are getting beaten at a lot of price points.

They are not getting good returns on a per-transistor or per-clock basis.

They are just kinda "meh" right now.

They look good because of the X1900, but if recent history is any indication, the 7900 is going to scale much better and thoroughly trump it.

And look at how comparatively small the G70 chip is that Nvidia is competing well with: only 302 million transistors, while the X1800 struggles against it at 320+ million.

Nvidia is just adding math units, plus the texturing ability to keep up with them, not all this other stuff, and they end up faster.
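
To make the "per transistor" point concrete, here's a toy calculation. The transistor counts are the ones above; the fps figures are hypothetical placeholders just to illustrate the metric, not real benchmark results:

```c
#include <stdio.h>

/* Toy "performance per transistor" comparison. Transistor counts are
 * the ones from the post (G70 ~302M, X1800 ~320M); the fps figures are
 * hypothetical placeholders, not real benchmark numbers. */
int main(void)
{
    const double g70_transistors_m   = 302.0; /* millions */
    const double x1800_transistors_m = 320.0; /* millions */
    const double g70_fps   = 100.0; /* hypothetical: assume rough parity */
    const double x1800_fps = 100.0; /* hypothetical: assume rough parity */

    printf("G70  : %.3f fps per Mtransistors\n", g70_fps / g70_transistors_m);
    printf("X1800: %.3f fps per Mtransistors\n", x1800_fps / x1800_transistors_m);
    return 0;
}
```

At parity in fps, the smaller chip wins the efficiency metric by about 6%, which is the whole argument: the same performance out of fewer transistors is cheaper silicon.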
 
Kaotik said:
I believe SugarCoat's point was that ArtX brought ATI into the "console world" with Flipper; it has nothing to do with whether Flipper was ready or not when they acquired ArtX. It just means it opened the door to the console world for them, which led to apparently well-received work with Nintendo and Microsoft.
Without ATI acquiring ArtX, I don't think we'd see Revolution with ATI graphics. XB360 perhaps, but not Revolution.

Read through the whole chain of my thought in the thread, not just one sentence; Dave caught what I wanted to say, and yes, I did concentrate mostly on an engineering perspective.

Also look at what Dave noted; by that reasoning, with or without ArtX, with someone as talented as Orton at the company's wheel (always in a relative sense), I doubt they couldn't have entered the console market, or any other market they wished for.

Most spotlights fall, and will fall, on the Xbox 360; that "perhaps" doesn't make it sound like you're absolutely sure they wouldn't have managed to enter the console market after all ;)
 
Daryl said:
ATI has a lot of stuff that sounds neat, but if it's not used right now, what good is it? As I say, people who buy these real high-end cards buy a new one every six months anyway. So what good is looking to the future?

Most of the new technology is included in all of ATI's new chips, X1300/X1600/X1800/X1900 (ignoring the strange Fetch4 omission from X1800 and the X1300's lack of the ring-bus memory controller).

People who buy the low-end and mid-range chips probably won't upgrade every six months, so they will benefit in the future. I'd say the X1600 is a prime example of this: in theory at least, it should be much more future-proof than competing NVidia chips. Buy it as a mainstream chip now and it could well offer decent performance in future games for longer than its competitors.

Remember, although high-end chips get most of the attention, low-end and mid-range chips garner the vast majority of the sales.
 
ophirv said:
My simple question is this:

If the move from FP24 to FP32 doesn't give much visible improvement to the naked eye, what is the point of adding so many transistors per pipeline?

I have seen a lot of SM2.0 vs SM3.0 screenshots from various games, and although there was a difference, it wasn't very noticeable.

I just think that when dealing with realtime graphics, we should settle for less precision in order to get more performance.
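
For a sense of scale, here's a minimal sketch of the precision gap, assuming R300-style FP24 means 1 sign / 7 exponent / 16 mantissa bits versus IEEE 754 FP32's 23 mantissa bits (the exponent-range difference is ignored here):

```c
#include <math.h>
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Simulate FP24 by rounding an FP32 value's 23-bit mantissa down to
 * 16 bits. This sketch only models mantissa precision, not the narrower
 * exponent range, and skips NaN/Inf handling. */
static float quantize_to_fp24(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);
    bits += 1u << 6;              /* round to nearest at the cut point */
    bits &= ~((1u << 7) - 1u);    /* drop the low 7 mantissa bits      */
    memcpy(&x, &bits, sizeof x);
    return x;
}

int main(void)
{
    /* A texture-coordinate-scale value: an error of one part in 2^17
     * starts to matter once textures and render targets get large. */
    float v = 1234.5678f;
    float q = quantize_to_fp24(v);
    printf("fp32 : %.7f\n", v);
    printf("fp24 : %.7f\n", q);
    printf("relative error: %g (fp24 step is about 2^-17 = %g)\n",
           fabsf((v - q) / v), pow(2.0, -17));
    return 0;
}
```

Roughly speaking, FP32 buys about seven extra mantissa bits (a ~128x finer step), which shows up in long dependent-texture-read chains and large coordinates far more than in final output colors, which is why simple screenshot comparisons rarely reveal it.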
I want the best graphics picture possible; I couldn't care less if a game plays at 600FPS.
To me GPUs are a visual experience that can bring you into the game, and ATI has been excellent at this.

I have noticed in reviews where games in actual gameplay are limited in fps, yet there's a big hoo-ha if in benchmarks NV is faster at 120fps vs 118fps.

To me this is downright stupidity. I want to be engrossed in a gaming experience, and IQ is the only way that happens. Detail so real you can reach out and touch it.

Maybe ATI's next move should be an SPU (smell processing unit) that will really bring you into the game lol
 
Daryl said:
They are getting beaten at a lot of price points.

They are not getting good returns on a per-transistor or per-clock basis.

They are just kinda "meh" right now.

They look good because of the X1900, but if recent history is any indication, the 7900 is going to scale much better and thoroughly trump it.

And look at how comparatively small the G70 chip is that Nvidia is competing well with: only 302 million transistors, while the X1800 struggles against it at 320+ million.

Nvidia is just adding math units, plus the texturing ability to keep up with them, not all this other stuff, and they end up faster.

So I see you really like NV, that's great. When looking at women, do you prefer pretty ones or ugly ones? Because right now you're basically talking about fps, and that's fine, many want that. But there are many like myself who really enjoy a quality visual experience.

Do you have a link to the G71 specs or not? If not, where do you get your information on the big bad G71? Link please.
 