ATi responds to the challenge of NV30 with R350

Status
Not open for further replies.
Derek Smart [3000AD] said:
martrox said:
Come on, Derek, please find something else to say besides the driver thing.........


Unluckily for you, I don't give a shit. So, please don't start.

I don't think he cares whether or not you care. He voiced his opinion, which happens to differ from yours. Tough.

Best,
DDM_Reaper20
 
DDM_Reaper20 said:
Speaking from the perspective of a user, I can say I'm pleased with how my Radeon 9700 Pro performs. RTCW, Quake 3, Independence War 2 -- all play extremely well, with eye candy and all. Those are the three games I have played extensively lately.

Well, that's nice; now siddown. And if you can't post ON TOPIC, don't post at all. This thread is NOT about drivers. I didn't make it about drivers. I made a passing comment within the scope of the post I was making.

I'm not going to argue that there are bugs,

Naaaah, really? :rolleyes:

I'm just wondering whether they are really THAT bad.

Well, considering your whole three - AGED/AGING - games, I'd say go talk to those who are playing other games.

On top of that, it's not as if games are released without any bugs whatsoever. I could name quite a few games that were crawling with the little suckers when they came out.

Again, if you can't post ON TOPIC then DON'T POST!!! This is NOT about buggy games and/or drivers, for that matter. Only a complete idiot would infer that because games are buggy, it's OK for drivers to be buggy. Gimme a break.

As to NV30 spanking R300, I guess we'll know when the card is released, not before.

No shit, Sherlock.

Snap said:
Derek Smart [3000AD] said:
Unluckily for you, I don't give a shit. So, please don't start.

"You didn't have to read it." :rolleyes:

You're kidding me, right? Why didn't you tell HIM that he didn't have to read my original post?

You fanATIcs are so one-sided it's embarrassing.
 
Derek Smart [3000AD] said:
This thread is NOT about drivers. I didn't make it about drivers. I made a passing comment within the scope of the post I was making.

You made an off topic post ranting about drivers - seems like you did make it about drivers.

Again, if you can't post ON TOPIC then DON'T POST!!!

Take a leaf...
 
Derek Smart [3000AD] said:
martrox said:
Come on, Derek, please find something else to say besides the driver thing.........

You didn't have to read it. It was part of the point I was making. We have regurgitated threads and posts here all the time. For example, this very one discussing this whole ATI vs nVidia thing - AGAIN - I don't see you telling everyone that they should find something else to say besides this nVidia vs ATI thing.

BTW, all my software is working very well with my 9700, as they all do with my TI4600.....

Unluckily for you, I don't give a shit. So, please don't start.

I was very nice and respectful in asking you just what most in this forum would have asked you, I used no foul language, and I didn't make it a personal attack. Again, respectfully, I ask you to start adding to this forum, not taking away from it.
 
I was going to dissect my post line by line, but since I'm smart enough to know better, I'm not going to. All I can say is, if you know the meaning of the term SCOPE then you should know that there were two references to drivers in my post - BOTH - within the scope of the post I was making, as it DIRECTLY relates to NEW HARDWARE.

In case there was any doubt,

NEW HARDWARE USUALLY EQUALS PREMATURE AND/OR BAD DRIVERS

Here is the definition of the word scope, in case there was any doubt about its use therein.

martrox said:
I was very nice and respectful in asking you just what most in this forum would have asked you, I used no foul language, and I didn't make it a personal attack. Again, respectfully, I ask you to start adding to this forum, not taking away from it.

wot?

Since when was shit (a) a foul term within the realm of a forum board, and (b) inappropriate?

If you were offended by it, then sub it for any term you deem worthy of your poor eyes. I'll try to use other terms as appropriate - if only to prevent you and your buddies from taking the thread (yet another) off-topic with your banter.

Further, I saw no evidence of a personal attack anywhere in my post. You must have your monitor upside down.

Let me draw your attention to the registration terms.
 
@ Derek Smart

I guess it's hopeless to try to have a reasonable discussion with you, as you are awfully quick to look down your nose on people, and you routinely apply double standards ("don't do as I do, do as I say"). Posting off-topic, then yelling at others for doing the same -- how, uh, smart. Not.

BTW, I don't give a fig about your opinion; your being a game developer does not make you some kind of deity.

Now, I'm gonna toss BattleCruiser 3K at this baby. Wonder how many bugs I'll find . . . ?

Best,
DDM_Reaper20
 
All I can say is, if you know the meaning of the term SCOPE then you should know that there were two references to drivers in my post - BOTH - within the scope of the post I was making, as it DIRECTLY relates to NEW HARDWARE.

The thread was discussing new hardware, not drivers - hardware development is not going to stop for software development, is it?

NEW HARDWARE USUALLY EQUALS PREMATURE AND/OR BAD DRIVERS

Likewise for NV30. Given they are currently rushing for the Christmas season, we'll have to see what the quality of their first drivers is like.
 
Derek Smart [3000AD] said:
But remember that the 8500 was supposed to be their answer to the GF3 Ti. heh, they (ATI, in case you were wondering) failed.

In performance terms at launch the 8500 may not have been a GF3 performance killer - but in price and feature terms it was; the 8500LE was marginally cheaper than the GF3 Ti200 and the 8500 was way cheaper than the Ti500 (before Christmas 2001, ~£200 vs £300). Certainly now, in the majority of situations, the 8500 outperforms the GF3 and Ti500, and the LE outperforms the Ti200.

nVidia probably sold more GF3 Ti200s though, on brand loyalty and driver rep.
 
Hmmm

Ok, well, being in the IT world and programming world, I can say that there is the total possibility of anyone overclocking a Radeon 9700 Pro past 400 MHz. I have talked to many people and they have showed me how they have overclocked it to insanity (one guy lately told me he overclocked it to 450 MHz, I think). Now for the 400 MHz question with a 0.15-micron process, I would have to say that if they use the new fad (which is replacing all copper on the board with fiber, aka fiber optics, network-wise), they would be able to increase performance by around 3000% if they can get the stupid hardware to be fast enough to handle it. :-?

From what I can calculate, if ATI would use fiber they would be able to boost their memory to about 1500 MHz to 2500 MHz. :D I have seen some network components using fiber instead of copper, and those units were handling about 3.5 terabytes a sec (if I remember correctly). Plus, there are some companies - I have talked to some people on those projects - that are creating memory that uses pure fiber.

Till later ;)
Raystream

P.S. This is no crap; there are some articles that have just been released about it, so do some searching and you will find them. Right now I cannot get you any links (stupid HOMEWORK). Sorry :cry:
 
Randell said:
Derek Smart [3000AD] said:
But remember that the 8500 was supposed to be their answer to the GF3 Ti. heh, they (ATI, in case you were wondering) failed.

In performance terms at launch the 8500 may not have been a GF3 performance killer - but in price and feature terms it was; the 8500LE was marginally cheaper than the GF3 Ti200 and the 8500 was way cheaper than the Ti500 (before Christmas 2001, ~£200 vs £300). Certainly now, in the majority of situations, the 8500 outperforms the GF3 and Ti500, and the LE outperforms the Ti200.

nVidia probably sold more GF3 Ti200s though, on brand loyalty and driver rep.

I quite agree.

And your last bit says it all. And right there is the rub. I've said it before and I'll keep saying it until doomsday - no matter how fast the ATI cards get, they're not going to get the level of brand loyalty and driver rep that nVidia has, until they clean up their act.

Sure, nVidia finally has stiff competition in the re-incarnation of ATI, but the lessons learned from 3dfx and all the other chip makers come down to the same thing: make one false move and you'd probably get away with it. Keep making several repetitive false moves and it's curtains.

As I've said before, the ATI hardware engineers are, well, simply put, rocket scientists. The leap from previous ATI generation cards to the 8xxx series is nothing short of a miracle, really. Especially given the prior generation. With the advent of the 9xxx (driver problems notwithstanding), they seem hell bent on keeping it that way. From all accounts, the 9xxx series has pretty much solidified this aspect, and there's no way they could possibly goof on future hardware generations - even if they had the driver developers do the hardware (ok, I couldn't resist).

What's going to kill them? The marketplace. No matter how fast the next-gen cards are, they have to really do a lot more than they're doing currently to win gamer confidence and loyalty. And by loyalty, I don't mean those fair-weather freaks who just went out and bought the 9700 because it was the fastest thing in town. I'm talking about loyalty like you find with nVidia and even Matrox. Loyalty where, even if the next card beats your favorite by this much, you won't switch. That kind of loyalty.
 
Derek Smart said:
But remember that the 8500 was supposed to be their answer to the GF3 Ti. heh, they (ATI, in case you were wondering) failed.

You are wrong. The 8500 was not made to compete with the GF3 Ti. It was made to compete with the GF3.

And again you talk about "embarrassingly bad drivers".

I gave you my list of games that I've tried with no bugs in the last thread you hi-jacked with the garbage that you spew. Still waiting on you to tell me what's wrong with them.

Two words, Derek: GROW UP!
You are like a child screaming incessantly at the top of his lungs. At this point I could not care less if ATI made your game do the hokey pokey and turn itself about. I'm tired of hearing about it. And frankly, just from the behavior I've seen from you in various internet forums, I would just as soon play "Britney's Dance Beat" till my eyes bled than install your game. You've had your fun here. Now go back under the bridge where you belong, you troll.
 
I'm hoping that R350 is to R300 what the GF2 GTS was to the GF1 - that is, higher clock speeds combined with TWICE the texture units, and DDR-II (256MB worth). Doubling the texture units to 16 (arranged 8:2) will give the high-end R9xxx series the punch it needs to rival and perhaps surpass the NV30 - especially if the NV30 has a 128-bit bus with bandwidth-saving features. A 400 MHz clock for R350 sounds reasonable, but I doubt it will be much faster, since the complexity will be higher due to twice the TMUs per pipe as well as any other features. A 400M vert rate will be achieved naturally, but the P.S. & V.S. version will be 2.xx, not 3.0.

ATI needs to get onto a faster product release cycle. I'd expect R350 to be on shelves by no later than March '03, but as early as January - with R400 being ATI's fall '03 product to pound the NV35 with (no way will nVidia have NV40 ready by next year).
 
If people want other people to stop hi-jacking threads about driver talk then I suggest they stop talking about drivers. Just let the topic die of starvation.

As for ATI launching a R350 around NV30s premiere, I just don't see it. I think a speed-bumped R300 with faster memory (not DDR2 since it would most likely be outrageously expensive on an already pricey product) is a lot more likely.

Those doubting Thomases figuring the R300 can't do much more than 325 on the current process - weren't you the same guys who said it couldn't even do as much as 325 before the thing was launched? ;) Quit pretending to be chip-designing experts when it's clear you're not, mmkay? :D

After all, the P4 maxed out at 2 GHz on a 0.18-micron process and could be taken quite a bit higher than that via overclocking. It's all about how it's designed internally, not the process. (Well, obviously not ALL about, but to a large extent anyway.)

You might remember the first GF3 chips wouldn't overclock much at all. A while later the GF3 Tis came along on the same process rules (just tweaked a bit), using the same die. People took the Ti 500s to 300 MHz+.

I would not be surprised to see an 'ultra' version instead of a 'pro' come Christmas-time. Say around 400 MHz core clock, maybe 750-800 MHz memory. R350? Naah, I don't think ATI or their partners are ready for that yet; the chip's probably not even finished yet.


*G*
 
yet another DSTC.
Back on topic - why does everyone presume the NV30 will "kill/slaughter/massacre/maim" (insert favorite killing-machine descriptor here) the R300?
Given nVidia's public stance on 256-bit buses, and their focus on "better pixels, not faster pixels", I expect this round to see a flip-flop of previous rounds. ATI used to have more features and slower performance. This time around, I figure nVidia will have the feature crown but ATI will maintain the performance crown.

One possible reason for the R350 release would be to bring the 9700 feature count up to NV30 levels (it doesn't lack much to get there) and, of course, to rain on nVidia's launch parade.
 
Althornin said:
yet another DSTC.
Back on topic - why does everyone presume the NV30 will "kill/slaughter/massacre/maim" (insert favorite killing-machine descriptor here) the R300?
Given nVidia's public stance on 256-bit buses, and their focus on "better pixels, not faster pixels", I expect this round to see a flip-flop of previous rounds. ATI used to have more features and slower performance. This time around, I figure nVidia will have the feature crown but ATI will maintain the performance crown.

One possible reason for the R350 release would be to bring the 9700 feature count up to NV30 levels (it doesn't lack much to get there) and, of course, to rain on nVidia's launch parade.

Well said...
 
Althornin said:
yet another DSTC.

Only thing I have to say - apart from the fact that this person spent FAR more effort building that site than anyone reasonably free of obsessions should :) - is, jaysis, Derek's almost 40 years old! I thought him to be some upper-20s hothead, LOL!

Anyway, as for R300 lacking features of NV30, haven't we established the other way around is true as well? Do we see Nvidia worry about that?

I don't think ATI will lose much sleep over some shorter instruction limits in their pixel/vertex shaders compared to nVidia's design; their respective parts will likely NEVER run realtime apps using that many instructions anyway. What other features are lacking in the R300, do we know?


*G*
 
Grall said:
Anyway, as for R300 lacking features of NV30, haven't we established the other way around is true as well? Do we see Nvidia worry about that?

No. They're too busy worrying about building their chip. :eek:

--|BRiT|
 
EXACTLY.

So why do people seem to think ATI is so concerned about this? I don't get it.

They got shorter shaders, but they have texturing from floating-point cubemaps and stuff. Which of the two will be more useful? I believe Democoder slagged the cubemap support, claiming it would kill fillrate (my apologies if I blamed the wrong guy here! :)), but you wouldn't have to use screen resolution for each face in your map, right? What if you just did like 256*256 pixels or such (probably with simplified pixel shaders and such) - would that be such a horrifyingly bad idea? ;)

*G*
 
Evildeus said:
The point is, we have some more "rumors" saying that NV30 will go into mass production in November: that's good for nVidia.

Actually, it says, "when Nvidia’s 0.13-micron NV30 chips hit the market in volume after November." Note "chips" and "after". That sounds like volume production of *chips* (which still need to become cards) in December to me, though there is possibly some wiggle room there if the author was being imprecise in his language.

And, no, I'm not interested in igniting another "meaning of 'is' is" type of discussion ala "tape-out" or "broken". Just pointing out that the quote provided is not entirely clear, and certainly points to at least December for whatever it is trying to convey, not November.
 