trinibwoy said: It is truly amazing to see people bitching about power consumption on GPUs while being happy with 90+W CPUs. Considering the complexity, speed and function of modern GPUs, I would expect people to be more forgiving of them. You want speed without power?

I don't think anybody is happy with it, per se - there is some bitching re: Prescott, alright.
BRiT said: It seems like you're implying that as something for nVidia to boast about, whereas I don't think it should be viewed as a good sign. It's because one of those nVidia architectures was stillborn when compared to ATI's. ATI had a better architecture from the get-go and didn't need to completely revamp it, since it just worked.

No, I'm not implying nVidia has something to boast about. I'm implying ATI has something to be ashamed of by not bothering to put forth a forward-looking architecture with the Radeon X800.
nVidia did exactly what was expected of them: they fixed the performance problems of their previous architecture while at the same time expanding significantly upon its featureset. This is what they've done with every new architecture, and they also did it within roughly the same timeframe as the previous two (18 months).
ATI, on the other hand, decided that they would be better served by attempting to outperform the competition with brute force rather than bothering to give us better features for use in future games. It really heartens me that this cycle seems to be swinging back to nVidia, because the last thing I want to see is a cycle of nVidia and ATI one-upping one another in performance in current games only, without bothering to improve the technology.
digitalwanderer said: You nuts? I got a 35W mobility Athlon just to save a bit-o-juice. 90+W?!?! That's nuts!

trinibwoy said: I got a 45 watter so I could overclock the shit out of it. Guess we have different priorities.

35 watters OC better; that's why I got it instead of the 45 watter.
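For scale, here is a quick back-of-the-envelope sketch (Python) of what those TDP figures cost to run. The 24/7 full-load pattern and the $0.10/kWh electricity price are illustrative assumptions, not numbers from the thread:

[code]
# Rough yearly energy cost for the CPU TDPs mentioned above.
# Assumptions (not from the thread): constant full load 24/7,
# electricity at $0.10/kWh.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10  # USD, assumed

def yearly_cost(watts):
    """Cost of running a constant load of `watts` for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000.0
    return kwh * PRICE_PER_KWH

for label, tdp in [("35 W mobile Athlon", 35),
                   ("45 W Athlon", 45),
                   ("90+ W desktop CPU", 90)]:
    print("%s: ~$%.2f/year" % (label, yearly_cost(tdp)))
[/code]

Under those assumptions the 35W part saves roughly $48 a year over a 90W one; real savings depend entirely on actual load and local rates.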
anaqer said: I don't think anybody is happy with it, per se - there is some bitching re: Prescott, alright.

Also, you should consider that the role of a CPU is somewhat more important than that of a 3D card, even for gamers.
digitalwanderer said: Just got me OCZ DDR Booster-thingy from DH today too; I'm looking to hit some new highs. 8)
MuFu said: Sounds like the R580 might be quite revolutionary in its own little way (and the R520, to a lesser extent). Certainly more so than the R4x0. Stay tuned.
Mariner said: The NV35 was little more than modified NV30 technology with improved performance. The relationship between the RV410 and R420 is the same as that between the NV36 and NV35/38. Did you criticise the NV36 when it was released in the same way as you seem to be criticising the RV410? After all, that chip's performance was worse than the RV350's, which was available at the time.

The difference is timing. nVidia is moving on to the NV4x architecture. ATI is still milking the R3xx architecture, despite having released it a bit sooner than nVidia released the NV3x.
Hellbinder said: I would have to disagree with that. It's more like ATI *claimed* they were going with brute force but actually ended up with just barely enough clock-speed room to hang on. To me, brute force should have had much better per-clock results.

Huh? Well, ATI pretty much doubled the per-clock results. I don't think I've ever heard improving pipeline efficiency called a brute-force approach.
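To put rough numbers on the per-clock point: theoretical pixel throughput is pipelines x clock, so doubling the pipeline count doubles the per-clock result before clock speed enters into it. A minimal Python sketch using the commonly quoted 9700 Pro (R300) and X800 XT PE (R420) configurations; treat the exact figures as assumptions:

[code]
# Theoretical fillrate = pixel pipelines x core clock.
# Pipeline counts and clocks below are the commonly quoted specs
# for the 9700 Pro (R300) and X800 XT PE (R420) - assumptions here.
chips = {
    "R300 (9700 Pro)":   {"pipes": 8,  "clock_mhz": 325},
    "R420 (X800 XT PE)": {"pipes": 16, "clock_mhz": 520},
}

for name, c in chips.items():
    fillrate = c["pipes"] * c["clock_mhz"]  # Mpixels/s
    print("%s: %d pixels per clock, ~%d Mpix/s"
          % (name, c["pipes"], fillrate))
[/code]

Going from 8 to 16 pipelines doubles the work done per clock; the extra ~200MHz on top of that is the part that usually gets called brute force.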
DaveBaumann said: They are both guilty of the same thing: NVIDIA "milked" the NV2x for several years whilst ATI released a more advanced shader technology. It suited NVIDIA's business at the time, though, so why not?

Several? Um, it was about 1.5 years. And ATI's more advanced shader technology? It was released as competition for the NV2x. This is a completely different scenario, as nVidia has released two architectures to ATI's one, with a spacing of 18 months. It's looking like ATI will wait at least 30 months before replacing the R3xx architecture.
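The spacing claims are easy to sanity-check against rough launch windows. A small Python sketch; the dates are approximate public launch months and should be treated as assumptions:

[code]
# Approximate month gaps between architecture launches.
# Dates are rough public launch windows - assumptions, not
# figures taken from the thread.
from datetime import date

launches = {
    "NV20 (GeForce3)":    date(2001, 2, 1),
    "NV30 (GeForce FX)":  date(2002, 11, 1),
    "NV40 (GeForce 6)":   date(2004, 4, 1),
    "R300 (Radeon 9700)": date(2002, 8, 1),
}

def months_between(a, b):
    return (b.year - a.year) * 12 + (b.month - a.month)

print(months_between(launches["NV20 (GeForce3)"],
                     launches["NV30 (GeForce FX)"]))  # ~21 months
print(months_between(launches["NV30 (GeForce FX)"],
                     launches["NV40 (GeForce 6)"]))   # ~17 months
[/code]

Counting from the R300's August 2002 launch, 30 months lands in early 2005, which is where that figure comes from.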
Chalnoth said: This is a completely different scenario, as nVidia has released two architectures to ATI's one, with a spacing of 18 months.

Completely different? Doesn't look that way. The NV3x was not a completely different architecture from the NV2x. ATI had a very good platform with the R300 and they are milking it for all it's worth. The R300-based products have been very successful for ATI; only geeks would complain about ATI's good business sense.
FUDie said: The NV3x was not a completely different architecture from the NV2x.

You can always draw parallels, but to say that the NV3x could not be considered a new architecture is to claim you are blind. It is very easy to see where the design of the NV3x came from, but it is still vastly different from the NV2x.
3dcgi said: I find it funny that people think the R420 was conservative. When I first heard the specs I thought it was very aggressive; I still do. It's just that nVidia was even a little more aggressive. I'd bet the R420 and NV40 are two of the biggest chips ever sold in the consumer market.

Chalnoth said: It was very conservative in terms of featureset. The most conservative "new architecture" we've yet seen.

Yes, but the quote I responded to specifically mentioned transistor count, not features. I just want to make it clear what I was talking about.
DaveBaumann said: ...and then they managed to get the R300 out well before the launch of the NV30, and a long time before any were sold.

Didn't the NV30 paper-launch before the R300?