NV40 Test samples, pics inside

Is that really a final design or only a reviewer's board? It doesn't look professional enough to be from NV, and where's the green-toned HS? And like others have said: only 4 caps??
 
Regarding the whole voltage regulation circuit... looking at the red board from weeks before, it seems more and more like the two power connectors are required because of... the memory?! :LOL: I mean, there are two clearly separate parts of the Vreg section, a 3-phase part and a 2-phase part... how would that make sense?

Or maybe one is for the 12V rail and the other for 5V (which would be useful given the limited current most PSUs provide on the 12V rail).
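Just to put some back-of-the-envelope numbers on that rail-split idea (every figure below is my own guess for illustration, not anything measured off an NV40):

Code:
# Rough current-budget sketch for splitting board power across two PSU rails.
# Every number here is an assumption for the sake of argument, not a measured
# NV40 figure.

def rail_current(power_w: float, rail_v: float) -> float:
    """Current (A) a rail must supply to deliver power_w watts at rail_v volts."""
    return power_w / rail_v

board_power = 100.0              # assumed total board draw in watts
gpu_share, mem_share = 0.7, 0.3  # assumed split between GPU and memory Vreg sections

gpu_power = board_power * gpu_share
mem_power = board_power * mem_share

print(f"GPU section fed from 12V:   {rail_current(gpu_power, 12.0):.1f} A")
print(f"Memory section fed from 5V: {rail_current(mem_power, 5.0):.1f} A")
print(f"Everything from 12V alone:  {rail_current(board_power, 12.0):.1f} A")

Even with a total draw somewhere around 100W, no single rail has to push more than about 6-7A when it's split that way, whereas putting it all on 12V alone would add another ~8A on top of whatever the CPU is already pulling from that rail.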
 
Richthofen said:
well that's a problem both IHVs will face. Both have the same access to the latest memory technology. Maybe both IHVs will start with some modestly clocked cards and will refresh them a little bit later when faster memory becomes available.

I already addressed that earlier. Both of them might not have the same access. Or if they do, there might only be enough supply for one vendor (whichever one is either willing to pay more, or has the better relationship with the memory supplier).

If they overclock the memory - well, no problem at all as long as they guarantee that it will work flawlessly. And that is what they have to do in such a scenario. I see no problem there.

I addressed that earlier too:

Joe DeFuria said:
Even if nVidia does ship "overclocked" boards, who said they would be "cheating" the consumer? That would be you, not anyone else. As long as nVidia and its partners guarantee it to work, where's the cheating?

The point is, if you're going to overclock the memory (and guarantee operation), you would expect some rather hefty cooling to serve that purpose...so that would rather nicely explain the large cooling solution for memory (GDDR3) that, as far as we know, should run relatively cool.
 
Few more images - this time with the NV screen printing.
http://www.darkcrow.co.kr/Home/home.asp?idx=#3018

I still don't see the logic in 2 power connectors as it's all from one power source. I thought at first it might have been because they physically ran out of PCB space to run the extra tracks, but looking at the rear of the prototype there seems to be plenty of room.

hmm.. unless the supplied cable has a noise suppressor between the 2 connectors.... (just a thought).
 
THe_KELRaTH said:
Few more images - this time with the NV screen printing.
http://www.darkcrow.co.kr/Home/home.asp?idx=#3018

Too bad no one has seen fit to take a profile shot of the NV40. It'd put a lot of discussions about the cooler height to rest...

Oh, and does anyone else notice a difference in how the fan is connected on the first two pics in that thread?

It seems like in the first pic the cooler has been "mirrored"* compared to the other pics.


*Sorry, I can't come up with a better word for it, but if anyone would be so kind as to tell me the proper English word, I'd appreciate it :)
 
It seems like in the first pic the cooler has been "mirrored"* compared to the other pics.
Yes, that was what I was talking about when I said the 'marketing' pic (ie the one with the green sticker) has a nearly correct fan but with the opposite handedness.
I can see no particular reason why they would swap the handedness like that.

If you look at the red PCB pics with no heatsink, the white line outlining where the heatsink will go indicates that the green-sticker version is the correct handedness, at least for the shroud part.
The red PCB pic also shows clearly that the RAM cooler is physically separate from the GPU cooler.

I looked up pics of the reference boards for the FX 5800 -> FX 5950 & all seem to have the same bizarre fan setup with the blades all wrong.
I thought it was only on the 5800 that they'd done it so badly :oops:

:idea: Or perhaps I just need to unlearn my knowledge of fluid dynamics & learn The Way It's Meant to Flow ;)
 
It's amazing how all those PR pics are designed not to show the height of the cooler, especially as it's something everyone wants to know. If the cooler is good enough for Nvidia to use on their flagship product, surely it's good enough to show properly in the pictures?
 
Oh Nvidia. Dear Nvidia.

shootfoot.jpg
 
All I can say is, if they are making a 2-slot solution, they'd better make it vent outside the case. I would actually like that :) As opposed to being annoyed by it.
 
{Sniping}Waste said:
Nvidia can only get their GDDR3 from Samsung, while Micron is producing for ATI cards only. The GDDR3 spec came from ATI, who partnered with Micron to produce it. My Micron source (he did most of the IC layout on it) says they are selling all the GDDR3 they can make up to 833/1666 now and reports high yields. If this is true, and he has never lied to me, ATI will have all the GDDR3 they want now and no shortage. As for Nvidia, they have to get theirs from Samsung, and Samsung did very little research on GDDR3, putting most of their research into GDDR2. From what I hear, Samsung is having yield problems with GDDR3 now and can't supply faster than 600.

Here is the latest available info on Samsung's memory:
http://www.digitimes.com/NewsShow/MailHome.asp?datePublish=2004/4/12&pages=A5&seq=20
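
For anyone trying to line those numbers up: figures like "833/1666" are the physical clock and the doubled (DDR) data rate, and the bandwidth that actually matters also depends on the bus width. Quick sketch below - the 256-bit bus is just an assumed example for illustration, not a confirmed NV40 spec:

Code:
# Quick bandwidth arithmetic for DDR-style memory figures like "833/1666".
# The 256-bit bus width is only an assumed example, not a confirmed spec.

def bandwidth_gb_s(clock_mhz: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s: clock * 2 (DDR) * bus width in bytes."""
    return clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

for clock in (600, 833):
    print(f"{clock} MHz -> {clock * 2} MT/s data rate, "
          f"{bandwidth_gb_s(clock, 256):.1f} GB/s on a 256-bit bus")

So if Samsung really can't deliver above 600 MHz, that's roughly 38 GB/s versus the ~53 GB/s the 833 MHz parts would give on the same bus.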
 
Bouncing Zabaglione Bros. said:
It's amazing how all those PR pics are designed not to show the height of the cooler, especially as it's something everyone wants to know. If the cooler is good enough for Nvidia to use on their flagship product, surely it's good enough to show properly in the pictures?

There are five board pics in the official GeForce 6800 material and one is a profile shot, but it isn't very useful. All it shows is the metal bracket with 2 DVIs and composite video out; no sign of the heatsink.

Zvekan
 
Stacked as in 2 or more dies are packed and interconnected one on top of the other in a single package.

It's probably more relevant to the world of system memory, where there is a limit to how many separate chips can live on a DIMM module.
 
My take on the height of the fan is that it's the same height as the caps, which means it'll fit in a single slot. (At least going from the pictures with the green stickers.)

Unless they've got really really tall caps, which it doesn't look like.

The other picture is a bit more ambiguous. It definitely looks taller, though if you notice, the heatsink isn't screwed down (note the missing screw near the 'check' by the AGP connector), so it might not be flush with the PCB.
 
radar1200gs said:
Stacked as in 2 or more dies are packed and interconnected one on top of the other in a single package.

It's probably more relevant to the world of system memory, where there is a limit to how many separate chips can live on a DIMM module.

Yeah, I figured out what "stacked" means; I was just amused by how he spins the case to further support the supremacy of GDDR3.
 
Dumb question: what are y'all talking about when you say, "but it only has four caps!"?

Ok, I get that "caps" are "capacitors"....but what's the big deal with 4? Should there be more or less, and why? (Please excuse my ignorance, and my thanks in advance for any explanation)
 
digitalwanderer said:
Dumb question: what are y'all talking about when you say, "but it only has four caps!"?

Ok, I get that "caps" are "capacitors"....but what's the big deal with 4? Should there be more or less, and why? (Please excuse my ignorance, and my thanks in advance for any explanation)

I'm almost as ignorant as you, so someone correct me if I'm wrong....

Think of a capacitor as a "power buffer". It stores energy, and "releases it" in a controlled manner. These cards require clean power, and the "raw" power coming directly off the power supply is not going to be clean enough.

Cards from nVidia and ATI in the past have used many more capacitors. So the question is...given NV40 uses more power, why are there fewer capacitors?

I'm not particularly concerned about this myself....I'm no electrical engineer, but I'd assume either these particular capacitors have a higher rating, or there is some other means to control the power. The reason for things being "different" this time would likely come down to cost. (Whatever they're doing this time costs more than what they did in the past, but the increased power consumption would require it...)
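
To put some made-up numbers behind the "power buffer" idea (none of these are actual NV40 values, just an illustration of why fewer caps usually means bigger or better caps, or a faster regulator):

Code:
# Illustrative "power buffer" arithmetic: how long can bulk capacitance hold
# up a rail during a sudden load step before the voltage droops too far?
# All values below are made-up examples, not actual NV40 figures.

def holdup_time_us(capacitance_uf: float, droop_v: float, load_a: float) -> float:
    """Microseconds until the rail droops by droop_v volts: dt = C * dV / I."""
    c_farads = capacitance_uf * 1e-6
    return c_farads * droop_v / load_a * 1e6

total_uf = 4 * 1500   # assumed: four 1500 uF caps
droop = 0.05          # assumed: 50 mV of allowed droop
load = 20.0           # assumed: 20 A load step

print(f"Hold-up time: {holdup_time_us(total_uf, droop, load):.0f} us")

That works out to something like 15 microseconds of ride-through - plenty if the regulator reacts quickly, nowhere near enough if it doesn't. Which is why fewer capacitors isn't automatically a problem, as long as the regulator and the caps they did fit are up to the job.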
 