eight shader units for R520

3dcgi said:
I find it funny that people think the 420 was conservative. When I first heard the specs I thought it was very aggressive, and I still do. It's just that Nvidia was even a little more aggressive. I'd bet the R420 and NV40 are two of the biggest chips ever sold in the consumer market.
It was very conservative in terms of featureset. The most conservative "new architecture" we've yet seen.
 
The GeForce4 was a refresh part. But yes, the situation compares well. Except that in that case ATI was on a similar release cycle. This time nVidia has released two new architectures to ATI's one.
 
Chalnoth said:
This time nVidia has released two new architectures to ATI's one.

It seems like you're implying that's something for nVidia to boast about, whereas I don't think it should be viewed as a good sign. One of those nVidia architectures was stillborn compared to ATI's. ATI had the better architecture from the get-go and didn't need a complete revamp, since it just worked.
 
BRiT said:
It seems like you're implying that's something for nVidia to boast about, whereas I don't think it should be viewed as a good sign. One of those nVidia architectures was stillborn compared to ATI's. ATI had the better architecture from the get-go and didn't need a complete revamp, since it just worked.
No, I'm not implying nVidia has something to boast about. I'm implying ATI has something to be ashamed of for not bothering to put forth a forward-looking architecture with the Radeon X800.

nVidia did exactly what was expected of them: they fixed the performance problems of their previous architecture while at the same time expanding significantly upon its featureset. This is what they've done with every new architecture, and they also did it within roughly the same timeframe as the previous two (18 months).

ATI, on the other hand, decided that they would be better served by attempting to outperform the competition with brute force rather than bothering to give us better features for use in future games. It really heartens me that this cycle seems to be swinging back to nVidia, because the last thing I want to see is a cycle of nVidia and ATI one-upping one another in performance in current games only, without bothering to improve the technology.
 
DaveBaumann said:
Paju, read back earlier in the thread for the reasons why one and not the other.

As for the risks - it's thought that the unified architecture isn't new for ATI, as this is what the defunct R400 was going to be; it has since been extended for the XBox and will likely form the basis of their Longhorn graphics architecture.

Thanks for the info. Could it really be the case that both the R420 and R520 are just evolutionary steps in ATI's GPU line, and the R400 and R500 the real revolutionary GPUs?

I'm not saying that R420 is, or R520 will be, bad - not at all. I'm just wondering how long they can extend the lifespan of the R300 core. I'd say the R300 architecture was designed very well. Otherwise it would not have been possible to "trash" a completely new architecture and redesign immediately without putting anything on the market (fractions of the new architecture are shown in the R420 and most likely will be in the R520).

One thing which may end up happening, now that the R400 has been redesigned into the R500 (XBox 2) and will eventually turn into the R600 (PC), is that we might get rather bug-free hardware without major flaws right from its introduction. The GeForce FX suffered from a few serious flaws, but the 6800 series fixed those (or was it earlier?). Well, the R600 is far, far away at the moment (late 2006, I assume), so almost nothing about it is certain yet.

I still think that it's quite a risky business to develop such an extensive leap in architecture and introduce it in a console, even though they have some experience with it from the R400. The XBox 2 simply cannot fail against the PS3 or the next GC, and is expected to sell a lot.
 
paju said:
I'm not saying that R420 is ... bad - not at all.
Oh, I am. You can always make an older architecture run faster than one that bothers to incorporate new technology. With fewer transistors to deal with, it becomes easier to deal with heat and clocking concerns. And yet, nVidia still managed to get their mid-range GeForce 6600 GT part to clock higher than anything ATI has yet offered.
 
Chalnoth said:
paju said:
I'm not saying that R420 is ... bad - not at all.
Oh, I am. You can always make an older architecture run faster than one that bothers to incorporate new technology. With fewer transistors to deal with, it becomes easier to deal with heat and clocking concerns. And yet, nVidia still managed to get their mid-range GeForce 6600 GT part to clock higher than anything ATI has yet offered.

The NV35 was little more than modified NV30 technology with improved performance. The relationship between RV410 and R420 is the same as that between NV36 and NV35/38.

Did you criticise the NV36 when it was released in the same way you seem to be criticising the RV410? After all, that chip's performance was worse than the RV350's, which was available at the time.
 
Chalnoth said:
And yet, nVidia still managed to get their mid-range GeForce 6600 GT part to clock higher than anything ATI has yet offered.

And cards that run with high VCores can generally be clocked higher. [Edit] And unless I'm mistaken, 9600 XT ran at 500MHz.
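As a first-order sanity check on the Vcore/clock trade-off being debated here (this is standard CMOS scaling, not anything specific to these particular chips): dynamic power grows with the square of the core voltage and linearly with frequency, while the attainable clock itself rises roughly with voltage, so chasing clocks through higher Vcore drives power up close to cubically.

```latex
% Dynamic (switching) power of a CMOS chip:
%   \alpha = activity factor, C = switched capacitance,
%   V = core voltage, f = clock frequency
P_{\mathrm{dyn}} \approx \alpha \, C \, V^{2} f
% To first order, the attainable clock scales with voltage:
f_{\max} \propto V
% So raising the clock by raising Vcore gives roughly
% P_{\mathrm{dyn}} \propto V^{2} f \sim f^{3}.
```

That is why a higher-Vcore part can clock higher but also runs hotter and draws noticeably more from the PSU.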
 
DaveBaumann said:
Chalnoth said:
And yet, nVidia still managed to get their mid-range GeForce 6600 GT part to clock higher than anything ATI has yet offered.

And cards that run with high VCores can generally be clocked higher. [Edit] And unless I'm mistaken, 9600 XT ran at 500MHz.
Yes, and run hotter, and draw more power from the PSU. There's no way I'm buying a 100W+ card, and I don't care how fast it is, so I'm really hoping it won't come to that.
BTW, any takes on R520/NV47 power consumption?
 
It is truly amazing to see people bitching about power consumption on GPUs while being happy with 90+W CPUs. Considering the complexity, speed and function of modern GPUs, I would expect people to be more forgiving of them. You want speed without power? :rolleyes:
 
trinibwoy said:
It is truly amazing to see people bitching about power consumption on GPUs while being happy with 90+W CPUs. Considering the complexity, speed and function of modern GPUs, I would expect people to be more forgiving of them. You want speed without power? :rolleyes:

Of course we want it; that doesn't necessarily mean we expect it, though. I'm extremely disappointed in modern CPUs, though: they have been advancing at a snail's pace.
 
Yeah, damn the laws of physics! :devilish:

Sounds like r580 might be quite revolutionary in its own little way (and r520 to a lesser extent). Certainly more so than r4x0. Stay tuned.
 