Josh on NV40 & R420

Rookie

Newcomer
Josh's viewpoint on NV40 & R420, some useful info as always:

http://www.penstarsys.com

What surprised me is that Josh said there are 2 versions of the NV40 :oops:. One is fabbed by TSMC and the other is from IBM.

Quote:

"First off it appears that there are actually two chips that can fall into the NV40 spec, one made by TSMC and the other by IBM. If I were to hazard a guess, the 210 million transistor model is made by TSMC and the 175 million transistor product will be from IBM. I have no idea how these two will stack up, or even if they will be released at the same time."

True or false?

Fixed the typo...
 
ack why say r300 and nv40 in the same breath. IF they can't beat a 2 year old ati card then they have a ton of problems
 
jvd said:
ack why say r300 and nv40 in the same breath. IF they can't beat a 2 year old ati card then they have a ton of problems
If you bothered to read the linked article you'd see it was clearly a typo on the part of the threadstarter.
 
Both cards will be fast, even though the ATI product will probably not feature PS 3.0 functionality. This is not a bad thing as PS 3.0 will not be widely adopted until Q1 2005. This gives ATI ample time to get their next gen product out the door that will support PS 3.0 functionality. This does give NVIDIA quite a bit of mindshare though, as people willing to shell out $400+ for a video card may be more interested in being a bit more future proof rather than having the absolute fastest PS 2.0 card on the planet.
Just like they flocked to FX 5800/5900 over R9700/9800? Can we assume that the people shelling out $400+ for a video card will make a more informed purchase than those scrounging around at $80, and thus whichever card is faster in PS2.0 and with AA+AF will sell better, rather than whichever has more features (NV30 had PS&VS2.0+ to R300's PS/VS2.0 and FP16&32 to FP24, yet supposedly ATi has an overwhelming share of the high end)? I was as psyched about DX9 as anyone else here, but here we are 1.5 yrs after R300 debuts and just now are DX9-ish titles hitting the market in relative numbers. I wonder if ppl will wise up to potential vs. performance? Heck, I'm not even sure I've learned my lesson yet. ;)
 
Pete said:
Just like they flocked to FX 5800/5900 over R9700/9800? Can we assume that the people shelling out $400+ for a video card will make a more informed purchase than those scrounging around at $80, and thus whichever card is faster in PS2.0 and with AA+AF will sell better, rather than whichever has more features (NV30 had PS&VS2.0+ to R300's PS/VS2.0 and FP16&32 to FP24, yet supposedly ATi has an overwhelming share of the high end)?

Yes, the NV3X had PS/VS 2.0+, but it also lacked a lot of features, MRT for example. And they both supported DX9 SM 2.0. Now it's probably SM2.0 vs SM3.0, which makes the difference more obvious.
 
Pete said:
Can we assume that the people shelling out $400+ for a video card will make a more informed purchase than those scrounging around at $80, and thus whichever card is faster in PS2.0 and with AA+AF will sell better, rather than whichever has more features (NV30 had PS&VS2.0+ to R300's PS/VS2.0 and FP16&32 to FP24, yet supposedly ATi has an overwhelming share of the high end)?

No. There are always the fanboys and the insecure :p
 
It seems kind of strange to offer two separate NV40s. If anything, I would expect that we will either only see one of the designs (if there are indeed two), or that one of them is the NV40GL.

From what we've heard in the past, it seems like most of nVidia's volume will be from TSMC, so if one of these is the NV40GL, then it will likely be the one from IBM.
 
StealthHawk said:
Pete said:
Can we assume that the people shelling out $400+ for a video card will make a more informed purchase than those scrounging around at $80, and thus whichever card is faster in PS2.0 and with AA+AF will sell better, rather than whichever has more features (NV30 had PS&VS2.0+ to R300's PS/VS2.0 and FP16&32 to FP24, yet supposedly ATi has an overwhelming share of the high end)?

No. There are always the fanboys and the insecure :p

Not only that, but surely the fact that todays high-end is tomorrows low-end makes checkbox features important at some point.
 
Quitch said:
...Not only that, but surely the fact that todays high-end is tomorrows low-end makes checkbox features important at some point.


In fact, this is no longer true. The R300 never fell to the low end, nor did the NV30. They were replaced by cheap cores that have the features, but not the power to run them. So the only place those checkbox features matter is at launch; it is all about the first impression. As examples, where is the software or drivers that support:
- VideoShader on R300?
- R100 Key Frame Interpolation?
- Adaptive Displacement Mapping on Parhelia?
- Nvidia Shading Rasterizer?

After a company has its next-generation cores out, it seems to hope that the older chips don't exist at all, or will be forgotten entirely.

(I've actually noticed that ATI has many more of these "hopefully they forget what we demoed at launch and are happy with games" kinds of things than nVidia.)

Another funny thing is Matrox... they do not have a fancy name for it, nor have they really hyped their way of doing real-time effects on a video stream, but they do have full support from Adobe. Maybe that's why the Parhelia & RT-100 combination is still the market leader in real-time video editing.
 
Yeah, Josh's info would be interesting... if it hadn't all been discussed elsewhere countless times, in more detail and accuracy.
 
Rookie said:
Quote:

"First off it appears that there are actually two chips that can fall into the NV40 spec, one made by TSMC and the other by IBM. If I were to hazard a guess, the 210 million transistor model is made by TSMC and the 175 million transistor product will be from IBM. I have no idea how these two will stack up, or even if they will be released at the same time."

True or false?
It's likely true that it's being fabricated at both places.

It's likely false that there are any major differences between the two versions at a functional level.

Sigmatel manufactures many of its parts at more than one fab. There's a separate tapeout effort for each fab, as most fabs aren't completely electrically compatible and the backend work must be run again using different libraries, etc. They're still the same part, though.
 
Nappe1 said:
As an examples, where is software / drivers that supports:
[...]
- Nvidia Shading Rasterizer?
I tend to believe several, if not most OpenGL games do.
 
Bjorn said:
Yes, the NV3X had PS/VS 2.0+, but it also lacked a lot of features, MRT for example. And they both supported DX9 SM 2.0. Now it's probably SM2.0 vs SM3.0, which makes the difference more obvious.
But did even the semi-knowledgeable consumer (prosumer :)) know about MRTs? I'm thinking consumers believe the box/hype and prosumers believe the benchmark graphs, but neither probably knows about MRT, centroid sampling, etc. Heck, they probably don't know the diff. b/w RG and OG AA. I'm not sure ppl will see a box proclaiming DX9 SM3.0 vs. 2.0+ as much different than DX9 SM2.0+ vs. 2.0. It might be the same situation in terms of features, and it'll probably boil down to benchmarks and AA screenshots in the high end for prosumers (again).

(But to continue on the future-proof point, even when it was discovered that ATi was faster at DX9 than nV, the argument that the 9700P would be more future-proof than the 5800 has been slow to bear out, because there just aren't that many DX9-heavy games, and the difference between DX8 and 9 isn't that great in screenshots. I'd still recommend ATi for its superior AA, but its superior DX9 performance still seems to be a future issue, even 1.5 yrs later. The flood should begin now, though; I consider Far Cry the herald of DX9 things to come, and ATi does seem to outperform nV in the Xbit benches. We'll see what the 1.1 patch does for nV shader perf., though. :))

Anyway, I'm hoping for big things from both IHVs. :)
 
Nappe1 said:
Quitch said:
...Not only that, but surely the fact that todays high-end is tomorrows low-end makes checkbox features important at some point.
In fact, this is no longer true.
Sure it is, just in a slightly different way. The low-end parts that follow haven't yet had better technology than their high-end counterparts, that's for sure. If anything, they've typically had fewer features.

In other words, if ATI's R420 does lack PS 3.0, then so will ATI's coming value parts.
 
Haha, Taz loves me.

I had just posted a few tidbits that I had gleaned from here and there, and yes, most of it has been widely circulated. The thing that I did find very interesting is that there do appear to be two NV4x chips coming out at very close to the same time, one from IBM and the other from TSMC. Could this be the FX6x00 and FX6x00 Ultra? A $400 price point (IBM) card, and a higher-end $500 (TSMC) offering?

I don't know for sure, but that "may" be the case. I was of course not told anything that specific by NVIDIA, or even that there were two chips.

Not trying to be a Faud here, just passing on some info that I had gathered.
 
It could very well be, or it could also be that the 175 million transistor part was supposed to be introduced this April, while the 210 million transistor part was going to be a fall "refresh" product. This could still be the case, but perhaps the 210 million part was ready well before they were expecting, and they decided to release it as well.

From my limited knowledge of fabrication and chip design, each particular design for a fab uses a "standard cell" library, provided either by a third party or by the fab partner. In this case, the IBM standard cells are quite different from the TSMC standard cells. So, would it really be in NVIDIA's best interest to port the same chip to each major fab? That would involve thousands upon thousands of man-hours and computing years to make sure the design would work with two different standard cell libraries. So basically, what I think NV would do is choose which design will be manufactured where (e.g. the FX 5700 is only manufactured by IBM, while the FX 5900/5950 is only manufactured by TSMC).

So, if this is truly the case, then there may be two high-end NVIDIA chips in the wild shortly. How will they measure up? I have no clue.

EDIT: some poor spelling and editing on my part.
 