NV40 3DMark 2003 scores revealed - The Inquirer

The Baron said:
It's annoying me, because from what I know about it, I actually like the cooler so far (at least compared to NV30 and NV38).

Of course, being better than the NV30 / NV38 cooler is not exactly saying a whole helluva lot. ;)
 
Joe DeFuria said:
The Baron said:
It's annoying me, because from what I know about it, I actually like the cooler so far (at least compared to NV30 and NV38).

Of course, being better than the NV30 / NV38 cooler is not exactly saying a whole helluva lot. ;)

"Badda-boom, TING!"

You earned that rimshot Dr. Defurious. 8)
 
radar1200gs said:
Of course the 9800 won't be any slower than when purchased. The only problem is that when you go to play Half-Life 2 on it, it will have (relatively speaking) 9600-like performance compared to R420. So you paid top dollar for mid-range performance (when you cash your voucher in).

That's part of why I stuck with the GF4. I was originally hoping (along with plenty of others) that NV30 would prove to be a top-end card actually worth the asking price. It turned out not to be; however, NV40 looks much more promising in this regard.
I don't want to tip over the hornets' nest again, but with posts like these, why do some of you even bother to reply? B3D had trouble with its signal-to-noise ratio a few years back, IIRC, and the solution was to ignore the flamebait, not to bite. No need to scald your tongues with something so unsavory.
 
radar1200gs said:
I don't know if the following is true or not, but I have a feeling Brilinear was introduced on NV3x because the broken condition of the chip meant traditional FSAA/Aniso put too much of a strain on performance (FSAA was broken on the NV30's nVidia used at launch). This is probably why various claims made for NV30's FSAA were never realised (free @ 4x, gamma corrected).
This paragraph shows how little you know. It takes years to design a chip. You don't make the chip, get it back, then go and add new features to it because that will add probably another year to the schedule. Face it, brilinear was in the NV3x design from the start.

The whole "gamma corrected" AA claim was a total joke from the get-go. Some NVIDIA marketing guy claimed that you could do the gamma correction in the shader. While it's true that you can gamma correct results in the shader, you can't do it per sample as multisampling would require, hence the guy was confused, misinformed or outright lying.

-FUDie
 
If those rumors about power requirements are right, I guess this is the end of the road. I wonder how they would manage to keep acceptable power consumption in next-gen GPUs (R500/NV50) while doubling the performance.

I guess both companies (especially NVIDIA) are shooting themselves. They set the bar so high this round that it will be really difficult to improve performance further. I mean, what will be next... 1.6 GHz DDR3 (effective 3.2 GHz) RAM? 200+ watt GPUs? A 512-bit bus? A 1+ GHz GPU? I do not think 0.09u will save them either...

It is going to be really interesting to see how they manage to pull off another 2x improvement in R500 and NV50 :?
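(As a rough sanity check on those hypothetical specs: a 512-bit bus fed at an effective 3.2 GHz would imply roughly 205 GB/s of raw memory bandwidth. The numbers below simply work through that arithmetic; they come from the speculation above, not from any real part.)

```python
# Hypothetical specs from the speculation above, not any shipping card.
bus_width_bits = 512          # speculated bus width
effective_rate_hz = 3.2e9     # 1.6 GHz DDR clock -> 3.2 GT/s effective

bytes_per_transfer = bus_width_bits // 8          # 64 bytes moved per transfer
bandwidth_gb_s = effective_rate_hz * bytes_per_transfer / 1e9

print(bandwidth_gb_s)  # 204.8 (GB/s of raw bandwidth)
```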
 
silhouette said:
If those rumors about power requirements are right, I guess this is the end of the road. I wonder how they would manage to keep acceptable power consumption in next-gen GPUs (R500/NV50) while doubling the performance.

I guess both companies (especially NVIDIA) are shooting themselves. They set the bar so high this round that it will be really difficult to improve performance further. I mean, what will be next... 1.6 GHz DDR3 (effective 3.2 GHz) RAM? 200+ watt GPUs? A 512-bit bus? A 1+ GHz GPU? I do not think 0.09u will save them either...

It is going to be really interesting to see how they manage to pull off another 2x improvement in R500 and NV50 :?

Out of curiosity, what makes ya think the R420 is going to suck a ton of go-go juice? Low-k should help ATi avoid that.
 
I don't get it. Why doesn't NVIDIA just do what 3dfx was going to do with the Voodoo 5 6000 and use an external AC outlet for power? This would probably be the simplest, easiest, and most convenient method for users.
 
I was thinking the same thing. Surely it'd be easier for consumers to find another outlet on the surge protector than to dig around their case for two separate molex lines, and this would probably be standard from now on for the high-end parts. Maybe the IHVs don't have enough room on the backplate to fit an AC connector?
 
surfhurleydude said:
I don't get it. Why doesn't NVIDIA just do what 3dfx was going to do with the Voodoo 5 6000 and use an external AC outlet for power? This would probably be the simplest, easiest, and most convenient method for users.

1) How well was that idea received on the Voodoo?

2) Cost.
 
digitalwanderer said:
"waterblock on the card with a little pump/radiator you need a drivebay free for"

You heard that one too? :devilish:
 
surfhurleydude said:
I don't get it. Why doesn't NVIDIA just do what 3dfx was going to do with the Voodoo 5 6000 and use an external AC outlet for power? This would probably be the simplest, easiest, and most convenient method for users.

Probably the stigma attached to it, although I agree it would probably be easier for people.
 
I thought AMD wasn't suffering from heat issues?
I thought it was more some kind of internal timing issue holding them back :?:
 
AlphaWolf said:
surfhurleydude said:
I don't get it. Why doesn't NVIDIA just do what 3dfx was going to do with the Voodoo 5 6000 and use an external AC outlet for power? This would probably be the simplest, easiest, and most convenient method for users.

1) How well was that idea received on the Voodoo?

2) Cost.

At the time, the idea was seen as almost ridiculous, as even Molex connectors were thought of as unfashionable... However, I think that at this point in the industry, the idea is probably much, much better than having two Molexes requiring their own dedicated power lines.
 
surfhurleydude said:
AlphaWolf said:
surfhurleydude said:
I don't get it. Why doesn't NVIDIA just do what 3dfx was going to do with the Voodoo 5 6000 and use an external AC outlet for power? This would probably be the simplest, easiest, and most convenient method for users.

1) How well was that idea received on the Voodoo?

2) Cost.

At the time, the idea was seen as almost ridiculous, as even Molex connectors were thought of as unfashionable... However, I think that at this point in the industry, the idea is probably much, much better than having two Molexes requiring their own dedicated power lines.

Well, I personally find both ideas sort of repulsive, but I think the two-Molex solution will be a much easier sell to their AIB partners, especially when you consider that the PEG variant will likely only require one Molex. I can't see AIB partners liking the idea of eating $5-10 or whatever for the cord and brick.
 
arrrse said:
I thought AMD wasn't suffering from heat issues?
I thought it was more some kind of internal timing issue holding them back :?:

Athlon 64 3200+ is pushing out 89 watts of heat and the 3400+ should push out something like 95 watts :D
 
I thought 89W was the ceiling for the current 130nm A64 process, of which the 3700+ might be the end of the line?
 
"I don't believe you" there, I said it.

could you show me some thermals info that shows 64's near that output. Please.
 
Doomtrooper said:
arrrse said:
I thought AMD wasn't suffering from heat issues?
I thought it was more some kind of internal timing issue holding them back :?:

Athlon 64 3200+ is pushing out 89 watts of heat and the 3400+ should push out something like 95 watts :D

This needs to be corrected. The Athlon 64 consumes about 50-60W or so on average. The TDP (read: max power consumption for AMD) for the entire 130nm line of Athlon 64s and Opterons is 89W, and they never even reach that. Conversely, if we look at, say, Prescott, the TDP (read: typical power consumption for Intel) is 103W; the max is around 120W or so. AMD doesn't have any thermal issues right now and likely isn't going to have any in the near future, thanks to SOI. I may not have the GPU knowledge a lot of you guys have, but if there's one thing I know, it's CPUs :)
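(Editor's note: the vendor-specific TDP definitions described above can be made concrete with a little arithmetic. All figures come from the post itself; the 55W "typical" value is an assumed midpoint of the quoted 50-60W range.)

```python
# Figures quoted in the post above; the 55 W "typical" is an assumed midpoint.
a64_typical = 55.0        # Athlon 64 average draw (W), per the post
a64_tdp = 89.0            # AMD TDP: a hard ceiling for the whole 130nm line
prescott_tdp = 103.0      # Intel TDP: closer to typical draw
prescott_max = 120.0      # Prescott's approximate worst case, per the post

# How far a typical Athlon 64 sits below its rated ceiling:
amd_headroom = (a64_tdp - a64_typical) / a64_tdp            # ~0.38 (38%)

# How far Prescott's worst case overshoots its own TDP:
intel_overshoot = (prescott_max - prescott_tdp) / prescott_tdp   # ~0.17 (17%)
```

The asymmetry is the whole point of the correction: AMD's number is a bound you stay under, Intel's is an estimate you can exceed, so comparing the two 89W and 103W figures directly is misleading.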
 