NVIDIA: Beyond G80...

NVIO has, as far as I remember, always been described as supporting the SLI connection (amongst other things) because that functionality isn't within G80 itself.

Jawed
 
NVIO has, as far as I remember, always been described as supporting the SLI connection (amongst other things) because that functionality isn't within G80 itself.

Jawed

How do you explain, then, that it exists in G84/G86, since they obviously sport an SLI edge connector and do not have an NVIO chip?
What has been generally accepted is that NVIO includes RAMDAC, TMDS and HDCP functionality, because it was assumed that moving those to an external chip was beneficial to yields given G80's huge 90nm die; but no mention of SLI also being included was ever made.
At least not officially.
 
It was always my theory, and I think Demirug hinted as much as well, that NVIO would make it possible to need only one of those for a GX2 instead of duplicating the logic in both G80s.
 
[...]but no mention of SLI also being included was ever made.
I'm off to bed. There's a big pile of stuff that isn't "official"...

http://www.beyond3d.com/content/reviews/1/3

You can also see dual SLI connectors that feed in to NVIO (pictured below), NVIDIA's brand new I/O processor for handling signal input and output to their next generation of GPUs. NVIO marshalls and is responsible for all data that enters the GPU over an interface that isn't PCI Express, and everything that outputs the chip that isn't going back to the host. In short, it's not only responsible for SLI, but the dual-link DVI outputs (HDCP-protected), all analogue output (component HDTV, VGA, etc) and input from external video sources.

Jawed
 
I was wondering the same thing. Is NVIDIA just playing the numbers game? I doubt there would be any G90 (NV55) derivatives this early. The complete 50 product range hasn't even come out yet.

Even implying it's a G90 derivative would have so many consequences, DX10.1 support etc...
 
Yeah, I saw that too. G80 was supposedly late, which would imply they were shooting for the refresh soon, but most likely it's no big deal.
 
yea, why was DX4 not released? presumably it was optimised for some bitboyz chip but there was a giant conspiracy involving NV, ATI, S3, powerVR etc. to make them fail.
 
yea, why was DX4 not released? presumably it was optimised for some bitboyz chip but there was a giant conspiracy involving NV, ATI, S3, powerVR etc. to make them fail.

IIRC, at the time it was mentioned that it was because the number 4 is considered unlucky in Japan. Something to do with the word for 4 also sounding like the word 'death'.
 
yea, why was DX4 not released? presumably it was optimised for some bitboyz chip but there was a giant conspiracy involving NV, ATI, S3, powerVR etc. to make them fail.

After DirectX 3, we had planned a DirectX 4 for December 1996 that would allow access to some special features that Cirrus Logic was going to put into laptop video chips (I think, it's been 9? years). When the chips got delayed, we opted not to ship DirectX 4 as it had us in a huge rush (3 months between 3 & 4) for no reason. We had also told the game developer community about DirectX 5 that was targeting summer of 1997, and so we decided to simply skip DirectX 4 rather than confuse people. DirectX 5 shipped on July 16, 1997 - and to this day, people ponder about what happened to DirectX 4. So much for avoiding confusion.
http://craig.theeislers.com/2006/02/directx_then_and_now_part_1.php
 
IIRC, at the time it was mentioned that it was because the number 4 is considered unlucky in Japan. Something to do with the word for 4 also sounding like the word 'death'.

That's Taiwan/China, but even the Pentium 4 sold well there...
 