NV30 in a multi-chip / board configuration...

McElvis

I've asked this question in another thread, but it got quickly buried ;)

In any of the previews or demonstrations from the GeForce FX launch, was anything mentioned about multi-chip / board configurations?
 
McElvis said:
I've asked this question in another thread, but it got quickly buried ;)

In any of the previews or demonstrations from the GeForce FX launch, was anything mentioned about multi-chip / board configurations?



Did see something about a multichip version somewhere...not final yet as they are trying to decide between Whit & Pratney or Rolls Royce for the cooling engine.
 
WaltC said:
McElvis said:
I've asked this question in another thread, but it got quickly buried ;)

In any of the previews or demonstrations from the GeForce FX launch, was anything mentioned about multi-chip / board configurations?



Did see something about a multichip version somewhere...not final yet as they are trying to decide between Whit & Pratney or Rolls Royce for the cooling engine.

Prat and whitney ???

r u drunk
 
No need for Pratt and Whitney. As you can see on this website, even a small jet engine can provide the cooling necessary to keep beer cold on a hot New Zealand day, or in this case cool a videocard that produces several megawatts of thermal energy.

[Image: engine1.jpg]

[Image: beerat2.jpg]


As a side benefit, the noise level of such a small jet is a mere 125 dB at 1 meter away, a far cry from the 150 dB you'd get from an F-18... I mean an NV30 taking off next to you. If nVidia had any brains they would use such a cooler for their NV30. It would cool both the videocard and their beer.
 
Please, people...

I know many of you are old enough to have owned 386 and 486 computers. Do you remember the scoffing when Intel revealed that you had to mount a fan on the heatsink for the new Pentium 60/66 processors?

I remember seeing cartoons showing engineers with their hair blowing in the wind because of the monstrous fans mounted to the case, with the caption being one engineer stating, "See, I told you heat wasn't going to be a problem!"

It should be obvious by now that the CPU and VPU are both going to require a case and cooling design that can dissipate lots of heat. That is, until someone comes up with a semiconductor that doesn't bleed off heat like current technologies do.

For a couple years now I've been wondering why motherboard designers and case designers won't spec out something that leaves plenty of room for at least one VPU and a large cooling solution.

I can't remember what 3GIO has been finally named, but does anyone know if they're finally going to come up with a case design that allows for the VPU to occupy as much space and cooling capacity as the CPU?

I'm all for quiet computers, but the problem is that it's hard to be quiet when you're trying to move air through a tiny space with a limited-size fan. Logic dictates that more space should be given to the VPU card, and the case design and airflow system should cater to large heatsink designs on the video card.

Something akin to how Dell boxes have a fan on the back of the case with a plastic duct to direct the cool air onto the processor, removing the need for the fan to be placed directly on the processor, and allowing for a larger fan that moves more air at slower RPMs and is thus quieter.
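The "bigger fan, slower RPM, same airflow" intuition above can be sketched with the ideal fan affinity laws. A rough back-of-the-envelope model (the constant `k` is made up, and real fans, ducts, and blade designs deviate from this), assuming airflow scales with RPM times the cube of fan diameter:

```python
# Rough fan-law sketch. Assumption: ideal fan affinity laws, where
# airflow ~ k * D^3 * N (D = diameter, N = rpm). The constant k is
# arbitrary here; it cancels out in the comparison below.

def airflow(diameter_mm: float, rpm: float, k: float = 1e-9) -> float:
    """Approximate airflow as k * D^3 * N (ideal fan affinity law)."""
    return k * diameter_mm**3 * rpm

# How fast must a 60 mm fan spin to move the same air as a
# 120 mm fan turning at 1200 rpm?
target = airflow(120, 1200)
small_rpm = target / airflow(60, 1)
print(round(small_rpm))  # -> 9600, i.e. 8x the rpm, and far noisier
```

Since noise rises steeply with RPM, the big slow fan ducted onto the heatsink wins easily over a small fan screaming at eight times the speed.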
 
Intel have been working on trying to specify cooling solutions for PCs for years, but so far they have failed to make it stick.

IMHO the main problem is the pick-and-mix nature of the PC - if your motherboard, fan and case don't come from the same manufacturer it's hard to get anything other than the 'generic' solutions to work.

Dell have an advantage in this respect, of course.
 
At least most P4 motherboards have the area around the socket cordoned off so there is room for the cooling solution. I don't know how many times I've been mounting the heatsink and have had trouble with the clip because a capacitor was too close to the socket, or there was actually an SMT component almost directly under the clip notch on the socket.

They may not be able to dictate the rest of the case, but at least they are finally getting a bit of room around and above the processor reserved for cooling purposes.
 
Haha, that's funny. Not.

I wish we could quit these incessant jabs. Just answer the question and leave your snide comments at the door.
 
ATI's R300/R9700 supports the ability to combine the processing power of up to 256 VPUs.

I am sure that NV30/GFFX supports a similar capability, if not more.
(Imagine 512 of them, hehe. No, I don't know anything on that, though.)


Even just a few dozen would produce mind-blowing realtime 3D.

Remember, one of the high-end board builders for 3dfx put 32 VSA-100s
on a single board (the AAlchemy systems from Quantum3D, I think).

holy crap.


I hope that XBox 2 goes in this direction, because I do not see Nvidia coming up with a single GPU that can match the power of PS3's CELL/GS3 chipset(s) - PS3 is slated to have absolutely massive, massive bandwidth and parallelism.

Perhaps Nvidia will try to put multiple NV50 cores on a single die, or MS will pressure them into finally using several GPUs in a single system, even if NV50 is one chip. I am not saying Nvidia will stop making single-chip GPUs, just that they should use more of them for XB2.
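For what it's worth, the simplest way these multi-chip boards share a frame is to hand each chip its own slice of the screen. A hypothetical sketch of a scanline-band split (the function and numbers are illustrative, not any vendor's actual scheme):

```python
# Hypothetical sketch: divide a frame's scanlines across N chips,
# as evenly as possible. Illustrative only; real multi-chip boards
# (e.g. VSA-100 SLI) used finer-grained interleaving.

def split_scanlines(height: int, n_chips: int) -> list[range]:
    """Assign each chip a contiguous band of scanlines."""
    base, extra = divmod(height, n_chips)
    bands, start = [], 0
    for chip in range(n_chips):
        rows = base + (1 if chip < extra else 0)
        bands.append(range(start, start + rows))
        start += rows
    return bands

# A 1024x768 frame across 32 chips: each one only renders 24 scanlines.
bands = split_scanlines(768, 32)
print([len(b) for b in bands[:4]])  # -> [24, 24, 24, 24]
```

Fill rate scales almost linearly with a split like this, which is why even a few dozen chips look so attractive on paper; the catch is that geometry, memory, and bandwidth costs are duplicated per chip.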
 
RussSchultz said:
Haha, that's funny. Not.

I wish we could quit these incessant jabs. Just answer the question and leave your snide comments at the door.
Damn - did someone crap in your tea this morning?
 
I'm certain that nVidia would like to get these chips in the hands of the truly high-end systems: those used for movie-style 3D graphics. They've probably already got a team working on evangelizing the technology of the GeForce FX to these markets, and attempting to see if they can convince people that they're good enough for the job (That, or they're having a third party do it...).

That, and they might be attempting to sell them to, say, SGI for use in their high-end visualization systems.

These are the only places where I would expect to see a multichip NV30 get any use. It would just cost far too much (In power/heat, not to mention actual money) to do it for a consumer-level solution.
 
megadrive0088 said:
I hope that XBox 2 goes in this direction, because I do not see Nvidia coming up with a single GPU that can match the power of PS3's CELL/GS3 chipset(s) - PS3 is slated to have absolutely massive, massive bandwidth and parallelism.
People said the same about the GS/Emotion Engine combination in the PS2. I doubt many would say the PS2 is the most capable console now.

There is no reason the same pattern will hold true for the future - any predictions now are wild speculation.
 
Althornin said:
RussSchultz said:
Haha, that's funny. Not.

I wish we could quit these incessant jabs. Just answer the question and leave your snide comments at the door.
Damn - did someone crap in your tea this morning?

No, it was just the Nth thread where somebody was yapping about loud fans. Haha, the joke is funny once. I don't want to read it in every single thread.
 