R400 by the end of this year? Surely a mistake (?)

Isn't competition grand!!!

Hats off to ATi if they have the R350 available in quantity in February, just as the GeForce FX appears.

Does anyone know if NV35's development was delayed by the stuff-up with NV30 in the fab? I know it would be horrible from a marketing / finance perspective to release NV35 too soon after NV30 - but could they technically do this, or was NV35 also delayed 6 months by 0.13 micron?
 
DaveBaumann said:
RV350 and R350 taped out last week

CMKRNL reported on the 22nd of Nov that NV31, R350 and RV350 had taped out "in the last couple of weeks". Given that his process guesses were on the money, and given his track record...

[edit] Plus, an earlier DG report stated that RV350 is scheduled for ramp-up later this month.

And IIRC MuFu backed it also...
 
R350 and RV350 taped out in November, so it seems feasible to me that they're now going into production. This is pretty much an indication that ATi is serious about the 6-month "kicker" they've talked about. I guess people will have to swallow their skepticism now. This doesn't look good for Nvidia, though, since they've only recently managed to get the NV30 into production themselves.

It looks to me like the NV30 is going to hit the market not more than a month before the R350. Maybe that's why they put the huge heatsink on it and pushed the card to 500 MHz. It's not because the card is inefficient compared to R300 as some have claimed, but because they know they'll have to counter the R350 which will be coming out the door right on its heels.
 
it'll be neat to watch NV30 being "squeezed" between R300 and R350 :)

hooray for ATI.

not that I want them to dominate and stagnate the industry by gaining complete supremacy over 3D, but right now, they have air superiority :)


If R350 taped out in Nov and is in production now, with release in Feb-March and then R400 coming down the pike in the summer, Nvidia had better get its act together. maybe they have it together...maybe NV35 and NV40 are not delayed.

Although...


we should have had NV35 in the spring, or at least by this fall - right now - if NV30 had been on time; either spring 2002 or, at the latest, fall 2002. but that was not to be. Nvidia is almost a year behind their own self-imposed 6-month cycle. even the most hardened Nvidiot would admit Nvidia is at least 6 months behind, but I see it as nearly a year, and so do others.

so about NV35....

it technically should/could be taping out within several weeks, then NV40 taping out in the summer... but that's all speculation. it's hard to say where exactly Nvidia is. they are so out of whack, at least it appears that way. JC said Nvidia was a half step behind, and he's pretty tight with Nv.
:) I generally would not say something to contradict him.

ATI though.. hats off to them indeed! they're firing on all cylinders :eek:
 
Hm, just a question - in your opinion, with ATi back in the game, won't it lead to even shorter product cycles than they are now?

Surely it's a great thing for the industry and technology, but it's not that great for customers. Or will they (do they) take this into consideration, so they'll keep it in a reasonable range?
 
Rambler said:
Hm, just a question - in your opinion, with ATi back in the game, won't it lead to even shorter product cycles than they are now?

Surely it's a great thing for the industry and technology, but it's not that great for customers. Or will they (do they) take this into consideration, so they'll keep it in a reasonable range?

Why do people keep complaining about short product cycles and being "forced" to upgrade? It's not like games have been getting that much more demanding.

It's really only good news for the customer. It's bad news only for those obsessed with having the latest and greatest thing.
 
Why doesn't nvidia just release the nv40 6 months after the nv30 and skip the nv35? I presume they would have had a separate design team working on the nv40, designing and building that; I wouldn't have thought that the delays with the nv30 and tsmc's problems with .13u would have caused the nv40 to be delayed?


But then again, it would depend on what sort of speeds the nv35 is reaching and whether nvidia can get it to compete with the r400 from both a speed and technology standpoint.
 
Mintmaster, I really don't see how this is good for customers. Prices usually don't even fall that much over such a short period of time.
Plus, nothing was forcing me to upgrade until recently. I've had my GF1 SDR since it came out and it did the job well throughout the years.

Secondly, the reason I asked is that the devs are lagging behind the hw development, and it seems the gap is only widening... IMHO
So if they won't catch up with the new hw quicker, what good is it to have a new product out when there's no software to use it on except a few technology demos?

Maybe I'm completely wrong, but that's how I see it.
 
FSAA, AF.

Fast product cycles are far better for the consumer than having only one manufacturer.

The consumer gets lower prices (gf3 £350 at release, 9700 Pro £300-£260, and that's with a year's inflation to take into account) and more choice, and it forces cards into the budget and OEM sectors quicker, allowing developers to program more effects as they know those effects aren't only going to be seen by the few percent with £300 cards.

The top card's latest tech (AA/AF excepted) rarely gets used within its lifetime, whether there's a fast product cycle or not, as few people will program for the top card.

EDIT: changed "competition" to "consumer". Not quite sure why I typed a completely different word from what I was thinking :/
 
Shorter product cycles could be bad for the industry. IHVs will have less time to innovate, less time to produce stable products, and will get a smaller return on their investment, meaning less investment, so less innovation, etc. ISVs will simply lag even further behind the HW, making new innovative HW even less useful to the "real" consumer...
 
what could be bad for the 'industry' (not the consumer, because of the discounting involved) would be an oversupply of chips that ended up as unsold stock, either on shop shelves or stuck with board manufacturers, as new products came out.

I remember seeing, on some sites, Gf2s selling for more than Gf3s weeks/months after the Gf3's release.


How do ATI/NVidia handle this to keep board makers happy?
 
Randell said:
what could be bad for the 'industry' (not the consumer, because of the discounting involved) would be an oversupply of chips that ended up as unsold stock, either on shop shelves or stuck with board manufacturers, as new products came out.

I remember seeing, on some sites, Gf2s selling for more than Gf3s weeks/months after the Gf3's release.


How do ATI/NVidia handle this to keep board makers happy?
By keeping them well informed of the roadmap, etc. Every manufacturer these days generally buys enough parts per month to build the inventory they intend to sell that month.

That way you don't end up holding the bag.
 
Bambers said:
The consumer gets lower prices (gf3 £350 at release, 9700 Pro £300-£260, and that's with a year's inflation to take into account) and more choice, and it forces cards into the budget and OEM sectors quicker, allowing developers to program more effects as they know those effects aren't only going to be seen by the few percent with £300 cards.

Hm, I doubt the quick transition to the OEM sector - weren't TNT2s used in OEM systems until recently? (I hope I'm not talking complete nonsense here.)
Plus, wasn't the gf3 also an exception among launches? IIRC every top version of a given card in the last year debuted at ~$400 (gf4, r300), and isn't GFFX expected to be even higher ($400-500)?

FSAA, AF.

Yes, well, I wasn't talking technology-wise on purpose, because, as you said, most of the tech goes unused within its lifetime.
 
Actually, the TNT2 M64 was the card many OEMs chose to bundle with their system up until quite recently (if they don't still do it).

This not only means DX6-class features, but also roughly half the memory bandwidth of a regular TNT2, due to the 64-bit memory bus width.

It is always nice to see a new generation of accelerators being pushed down into the OEM price range...
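
Just to put rough numbers on that halving - a back-of-the-envelope sketch, assuming a ~150 MHz SDR memory clock on both parts purely for illustration (actual clocks varied by board):

# Peak theoretical bandwidth = bytes per transfer * transfers per second.
# The ~150 MHz SDR clock used below is an assumption, not an exact spec.
def peak_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    return (bus_width_bits / 8) * mem_clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(128, 150))  # TNT2, 128-bit bus: ~2.4 GB/s
print(peak_bandwidth_gb_s(64, 150))   # TNT2 M64, 64-bit bus: ~1.2 GB/s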


With Regards
Kjetil Høiby
 
Kaizer said:
Actually, the TNT2 M64 was the card many OEMs chose to bundle with their system up until quite recently (if they don't still do it).

This not only means DX6-class features, but also roughly half the memory bandwidth of a regular TNT2, due to the 64-bit memory bus width.

It is always nice to see a new generation of accelerators being pushed down into the OEM price range...


With Regards
Kjetil Høiby

You are a little bit out of date. The card of choice at the moment is going to be either integrated video or the Geforce 4 MX.
 
from what I've seen it's also the Rage128, with the Gf4MX or the Radeon9000 as the next upgrade option.
 
Kaizer said:
Actually, the TNT2 M64 was the card many OEMs chose to bundle with their system up until quite recently (if they don't still do it).

With Regards
Kjetil Høiby

I assume your clock stopped at least a year ago... :LOL:

EDIT:
Compaq Presario: integrated savage/integrated intel/gf4 mx420/r9000/aiw r9000/gf4 ti4200
HP Pavilion: integrated savage/integrated intel/gf4 mx420/aiw r7500/gf4 ti4200/r9700 pro
Dell: integrated intel/rage 128/gf4 mx420/gf4 ti4200/r9700 tx/r9700 pro
Gateway: integrated intel/gf4 mx440/gf4 ti4200/r9700 pro

There is no TNT2.
 
Entropy.
Perhaps, since it's only the start of the second quarter of their "Business Cycle" year, that is what they may be referring to. If that's the case, then they are right in line with what the other officious types from ATI have been saying. The end of this year would be the end of October 2003, if I have it right. So there ya go. I could be full of hot, boiling hydrochloric acid, but that's my guess.

;)
 