eight shader units for R520

trinibwoy said:
It is truly amazing to see people bitching about power consumption on GPU's while being happy with 90+W CPU's.
I don't think anybody is happy with it, per se - there is some bitching re: Prescott alright.
Also you should consider that the role of a CPU is somewhat more important than that of a 3D card, even for gamers. ;)
 
trinibwoy said:
It is truly amazing to see people bitching about power consumption on GPU's while being happy with 90+W CPU's. Considering the complexity, speed and function of modern GPU's I would expect people to be more forgiving to them. You want speed without power? :rolleyes:
You nuts? I got a 35w mobility Athlon just to save a bit-o-juice.

90+W?!?! That's nuts!
 
trinibwoy said:
It is truly amazing to see people bitching about power consumption on GPU's while being happy with 90+W CPU's. Considering the complexity, speed and function of modern GPU's I would expect people to be more forgiving to them. You want speed without power? :rolleyes:

This is just my personal experience, but I think this power stuff is blown way out of proportion.

When I was thinking about getting an NV40, I went out and changed my power supply from a 400W to a 520W based on some things I was reading about power. I had a 5700 Ultra in the machine with its 3.2GHz P4.

When I got my 6800GT, I think my power usage actually went down, along with the noise. As for noise, my CPU fan is way worse than the GPU fan.


I can understand the power concerns for overclockers, but that is not what GPU manufacturers intended. Also, people with Shuttles and their small power supplies may be asking too much of such a box.

If dual-core CPUs and SLI GPUs are in our future, I think larger power supplies will be the norm.
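
For what it's worth, the sums behind my PSU decision were never more sophisticated than the rough sketch below; the wattage figures in it are placeholder assumptions for illustration, not measurements.

Code:
# Crude system power budget: add up rough component draws and compare with the PSU rating.
# All wattages below are placeholder assumptions for illustration, not measured values.
components_w = {
    "P4 3.2GHz CPU": 90,
    "GeForce 6800GT": 65,
    "Motherboard + RAM": 40,
    "Drives, fans, etc.": 35,
}

psu_rating_w = 400
total_w = sum(components_w.values())
headroom_w = psu_rating_w - total_w

print(f"Estimated draw: {total_w}W of {psu_rating_w}W ({headroom_w}W headroom)")
if headroom_w < 0.2 * psu_rating_w:  # keep roughly 20% margin as a rule of thumb
    print("Cutting it close - a bigger supply might be worth it.")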
 
Chalnoth said:
BRiT said:
It seems like you're implying that as something for nVidia to boast about, whereas I don't think it should be viewed as a good sign. It's because one of those nVidia architectures was stillborn compared to ATI's. ATI had a better architecture from the get-go and didn't need to completely revamp it, since it just worked.
No, I'm not implying nVidia has something to boast about. I'm implying ATI has something to be ashamed of by not bothering to put forth a forward-looking architecture with the Radeon X800.

nVidia did exactly what was expected of them: they fixed the performance problems of their previous architecture while at the same time expanding significantly upon its featureset. This is what they've done with every new architecture, and they also did it within roughly the same timeframe as the previous two (18 months).

ATI, on the other hand, decided that they would be better served by attempting to outperform the competition with brute force rather than bothering to give us better features for use in future games. It really heartens me that this cycle seems to be swinging back to nVidia, because the last thing I want to see is nVidia and ATI one-upping one another in performance in current games only, without bothering to improve technology.

You know,

I would have to disagree with that. It's more like ATI *claimed* they were going with brute force but actually ended up with just barely enough clock speed headroom to hang on.

To me, brute force should have had much better per-clock results. If ATI's X800 architecture were truly delivering "brute force" then the X800 PE should be wiping the floor with the 6800U across the board.
 
digitalwanderer said:
You nuts? I got a 35w mobility Athlon just to save a bit-o-juice.

90+W?!?! That's nuts!

I got a 45 watter so I could overclock the shit out of it. Guess we have different priorities :LOL:
 
trinibwoy said:
I got a 45 watter so I could overclock the shit out of it. Guess we have different priorities :LOL:
35 watters OC better, that's why I got it instead of the 45 watter. ;)

Just got me OCZ DDR Booster-thingy from DH too today, I'm looking to hit some new highs. 8)
 
anaqer said:
I don't think anybody is happy with it, per se - there is some bitching re: Prescott alright.
Also you should consider that the role of a CPU is somewhat more important than that of a 3D card, even for gamers. ;)

Yeah Prescott sounds like a nightmare but I don't have any first hand experience with the beast.

I'll have to disagree with you on the importance of the CPU though. For many of us here, the only time we use the full potential of our CPUs is when playing games anyway. And when playing modern games the GPU is fast becoming the major bottleneck. Sure, the CPU is more important for general applications, but for most of them a 1GHz 25W A64 would be just fine.
 
digitalwanderer said:
Just got me OCZ DDR Booster-thingy from DH too today, I'm looking to hit some new highs. 8)

Your watts are smaller than my watts :p Those weren't out yet when I picked mine up. It's treating me pretty well though. It's the main reason I haven't jumped on the A64 bandwagon yet. An XP at 2.5G is still a mighty fast beast.
 
MuFu said:
Sounds like r580 might be quite revolutionary in its own little way (and r520 to a lesser extent). Certainly more so than r4x0. Stay tuned.

Now why can't you guys stick to the sound principle of chronology and concentrate on fully leaking R520 before you set your sights on R580? :LOL:
 
Mariner said:
The NV35 was little more than modified NV30 technology with improved performance. The relationship between RV410 and R420 is the same as that between NV36 and NV35/38.

Did you criticise the NV36 when it was released in the same way as you seem to be criticising the RV410? After all, the performance of that chip was worse than that of the RV350, which was available at the time.
The difference is timing. nVidia is moving on to the NV4x architecture. ATI is still milking the R3xx architecture, despite having released it a bit sooner than nVidia released the NV3x.
 
Hellbinder said:
I would have to disagree with that. It's more like ATI *claimed* they were going with brute force but actually ended up with just barely enough clock speed headroom to hang on.

To me, brute force should have had much better per-clock results.
Huh? Well, ATI did pretty much double the per clock results. I don't think I've ever heard improving pipeline efficiency called a brute force approach.
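As a rough back-of-the-envelope comparison (the clock and pipeline figures below are approximate launch specs quoted from memory, purely to illustrate the argument):

Code:
# Theoretical peak pixel fillrate = pixel pipelines x core clock.
# Figures are approximate launch specs from memory, used only to illustrate the point.
cards = {
    "Radeon 9800 XT (R360)":     {"pipes": 8,  "clock_mhz": 412},
    "Radeon X800 XT PE (R420)":  {"pipes": 16, "clock_mhz": 520},
    "GeForce 6800 Ultra (NV40)": {"pipes": 16, "clock_mhz": 400},
}

for name, c in cards.items():
    fillrate_gpix = c["pipes"] * c["clock_mhz"] / 1000  # Gpixels/s, theoretical peak
    print(f"{name}: {c['pipes']} pixels per clock, ~{fillrate_gpix:.1f} Gpix/s")

On those numbers the per-clock pixel throughput doubled from R360 to R420 while the clock itself rose only about 25%, which is exactly the point about it not being clock speed alone.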
 
nVidia is moving on to the NV4x architecture. ATI is still milking the R3xx architecture, despite having released it a bit sooner than nVidia released the NV3x.

They are both guilty of the same thing: NVIDIA "milked" NV2x for several years whilst ATI released a more advanced shader technology. It suited NVIDIA's business at the time though, so why not.
 
DaveBaumann said:
They are both guilty of the same thing: NVIDIA "milked" NV2x for several years whilst ATI released a more advanced shader technology. It suited NVIDIA's business at the time though, so why not.
Several? Um, it was about 1.5 years. And ATI's more advanced shader technology? It was released as competition for the NV2x. This is a completely different scenario, as nVidia has released two architectures to ATI's one, with a spacing of 18 months. It's looking like ATI will wait at least 30 months before replacing the R3xx architecture.
 
Chalnoth said:
DaveBaumann said:
They are both guilty of the same thing: NVIDIA "milked" NV2x for several years whilst ATI released a more advanced shader technology. It suited NVIDIA's business at the time though, so why not.
Several? Um, it was about 1.5 years. And ATI's more advanced shader technology? It was released as competition for the NV2x. This is a completely different scenario, as nVidia has released two architectures to ATI's one, with a spacing of 18 months. It's looking like ATI will wait at least 30 months before replacing the R3xx architecture.
Completely different? Doesn't look that way. NV3x was not a completely different architecture than NV2x. ATI had a very good platform with R300 and they are milking it for all it's worth. The R300-based products have been very successful for ATI, only geeks would complain about ATI's good business sense.

-FUDie
 
FUDie said:
Completely different? Doesn't look that way. NV3x was not a completely different architecture than NV2x. ATI had a very good platform with R300 and they are milking it for all it's worth. The R300-based products have been very successful for ATI, only geeks would complain about ATI's good business sense.
You can always draw parallels, but to say that the NV3x could not be considered a new architecture is to claim you are blind. It is very easy to see where the design of the NV3x came from, but it is still vastly different from the NV2x.

And since ATI appears to be losing ground to nVidia once again in the high-end space, I don't think it was good business sense, and thank God for that. The last thing I want to see is a company who decides to focus on performance at the expense of features succeed.
 
Several? Um, it was about 1.5 years.

NV20 was reviewed around March 2001; NV25 was launched in February 2002 and was further refreshed later that year with NV28; NV30 was announced in November 2002 but wasn't available until ~April 2003, and realistically they didn't have a high-end product until NV35 later that year.

And ATI's more advanced shader technology? It was released as competition for the NV2x. This is a completely different scenario, as nVidia has released two architectures to ATI's one, with a spacing of 18 months.

Depending on how you slice the timings, so had ATI. They released R200 after NV20 (a little before NV25, IIRC) and then managed to get R300 out well before the launch of NV30, and a long time before any were sold, so they had two significantly different architectures in the same timespan in which NVIDIA was still using NV2x.

The situations are very similar; they are just at different points for the two companies.
 
Chalnoth said:
3dcgi said:
I find it funny that people think the 420 was conservative. When I first heard the specs I thought it was very aggressive, and I still do. It's just that Nvidia was even a little more aggressive. I'd bet R420 and NV40 are two of the biggest chips ever sold in the consumer market.
It was very conservative in terms of featureset. The most conservative "new architecture" we've yet seen.
Yes, but the quote I responded to specifically mentioned transistor count, not features. Just want to make it clear what I was talking about.
 