NVIDIA GF100 & Friends speculation

Depends on how much time they really had. I doubt it'll be a scaled up 104, since I'd expect that if they use all "104 ideas" for the high end it'll most likely happen later on.

Good point. I have to say I'm not sure how NVIDIA schedules those things.

And what would they do with all the GTX470 inventory? Breed on it? :LOL:

Nah. Those things are full of hard corners and pins sticking out of everywhere, it would be painful and unpleasant — unless of course you're into that sort of thing.

Seriously though, I have no idea how much GTX 470 inventory remains, but I guess there could be quite a bit left.

Time is a major constraint, as everyone knows with those things. Still, making a mistake once is dumb; making the same mistake twice is dumber. The smaller GF10x variants appearing on A1 could indicate that things have gotten a lot better than they were at the start.

Sure. It's still not enough, though, and I wonder exactly why, but perhaps there's no simple answer to that question.
 
And what would they do with all the GTX470 inventory? Breed on it? :LOL:
Selling them as Teslas or Quadros would come to mind, don't you think? That obviously only works for chips with fully working ROPs/MCs, or by creating alternative SKUs.

Clocks shouldn't be a factor here, since the pro versions are already clocked lower.

That doesn't rid the board partners of their respective inventories of course.
 
Selling them as Teslas or Quadros would come to mind, don't you think? That obviously only works for chips with fully working ROPs/MCs, or by creating alternative SKUs.
No need for more SKUs: Quadro 4000/5000 (and 5000M) already have ROPs/MCs disabled (two disabled on 4000/5000M, one disabled on 5000, apparently based on the amount of memory).
 
Ah, wasn't too sure about that! So, all Nvidia would need to do is buy back the chips from their board partners in order to use them on high-priced pro-models. ;)
 
Ah, wasn't too sure about that! So, all Nvidia would need to do is buy back the chips from their board partners in order to use them on high-priced pro-models. ;)

I hope you don't expect me to shelve that under "serious suggestions" on your behalf :LOL:
 
No, but they'd probably end up with a profit anyway if they'd do that. ;)

BTW, weird theories about GF100+ :)
 
Heard a rumour that a refresh of GF100 is coming; they're aiming at a December release.

Now here's the kicker, to counter AMD's Radeon HD 6000 series, they're gonna rebrand it to GTX 5xx :LOL: Wonder what'll happen to the other GF1xx chips, will they be rebranded as well?? :LOL:
 
Heard a rumour that a refresh of GF100 is coming; they're aiming at a December release.

Now here's the kicker, to counter AMD's Radeon HD 6000 series, they're gonna rebrand it to GTX 5xx :LOL: Wonder what'll happen to the other GF1xx chips, will they be rebranded as well?? :LOL:

You're not alone; see also Carsten's comment above.
 
Now here's the kicker, to counter AMD's Radeon HD 6000 series, they're gonna rebrand it to GTX 5xx :LOL: Wonder what'll happen to the other GF1xx chips, will they be rebranded as well?? :LOL:

If GF10x parts make it into the 5xx series AND are differentiated somehow (higher clocks, more units enabled, etc.), then that's fair game, as they are essentially different products. Same goes for any new chip based on GF100. For the lower-end parts I wouldn't expect 5xx versions to be released until at least mid next year, even if based on current silicon, i.e. another potential six-month thrashing.
 
Second sighting, as they did show a GT420 OEM right on their website. ;)
http://www.dailydigitals.com/nvidia-geforce-gt420-oem-graphic-card.html

I've googled for the GT430 and found some information: a German online retailer is said to have listed a Gigabyte OC model with 1GB ddr3, priced at €72.62.
http://www.presence-pc.com/actualite/GeForce-GT430-40877/
(there's a picture whose source I don't know)

This leaves me wondering what exactly the differences between the 430 and the 420 are. Do only the clocks vary?
I'm waiting for a cheap dual-slot GT430; that would do the job, as I'd run Crysis on a CRT monitor at most.
 
420, 430 and 440 have more differences than just clocks.
Well, apparently one of them has only 48 shaders. Since you said "differences" there needs to be something else different too. AFAICT that can only be ROPs/memory. I guess from 64bit ddr3 (8 rops) up to 128bit gddr5 (16 rops???) everything is possible...

So GT430 seems to be 2 SMs / 128bit ddr3.
Looks like GT420 is 1 SM / 128bit (or 64bit? or both?) ddr3 (at least the card nvidia had at the website was 128bit ddr3)
Maybe GT440 then is 2 SMs / 128bit gddr5, though I'm not sure gddr5 would help much.
 
Well, apparently one of them has only 48 shaders. Since you said "differences" there needs to be something else different too. AFAICT that can only be ROPs/memory. I guess from 64bit ddr3 (8 rops) up to 128bit gddr5 (16 rops???) everything is possible...

So GT430 seems to be 2 SMs / 128bit ddr3.
Looks like GT420 is 1 SM / 128bit (or 64bit? or both?) ddr3 (at least the card nvidia had at the website was 128bit ddr3)
Maybe GT440 then is 2 SMs / 128bit gddr5, though I'm not sure gddr5 would help much.

Anandtech's 400M article has a list of GF108 parts, and only 128-bit DDR3 is there for both the 96 and 48 CUDA core versions. http://www.anandtech.com/show/3887/nvidia-400m-dx11-top-to-bottom/2

The Quadro article http://www.anandtech.com/show/3961/nvidia-launches-quadro-2000-600 then shows the 96 CUDA core part with 128-bit memory and 8 ROPs.

So all GF108 parts seem to be 128-bit DDR3 with 8 ROPs. The difference is then 1 or 2 SMs, and clocks.
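Since the question of whether GDDR5 would help keeps coming up, here's a rough peak-bandwidth sketch. The transfer rates below are assumptions for illustration, not confirmed GF108 specs:

```python
# Rough memory-bandwidth comparison for the rumoured GF108 configs.
# All transfer rates are assumed figures, not confirmed specs.

def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak bandwidth = bus width in bytes * effective transfer rate."""
    return bus_width_bits / 8 * effective_mt_s * 1e6 / 1e9

# 128-bit DDR3 at an assumed 1800 MT/s (900 MHz DDR)
ddr3 = bandwidth_gb_s(128, 1800)    # 28.8 GB/s
# 128-bit GDDR5 at an assumed 3600 MT/s would double that
gddr5 = bandwidth_gb_s(128, 3600)   # 57.6 GB/s
# 64-bit GDDR5 at the same assumed rate lands back at DDR3 level
narrow = bandwidth_gb_s(64, 3600)   # 28.8 GB/s

print(f"128-bit DDR3:  {ddr3:.1f} GB/s")
print(f"128-bit GDDR5: {gddr5:.1f} GB/s")
print(f"64-bit GDDR5:  {narrow:.1f} GB/s")
```

Under those assumptions, GDDR5 on the same 128-bit bus would roughly double bandwidth, so whether it "helps" is really a question of whether these small parts are bandwidth-limited at all.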
 
NVIDIA's GeForce GT 430: The Next HTPC King?

http://www.anandtech.com/show/3973/nvidias-geforce-gt-430

[...]With GT 430, NVIDIA has basically surrendered to AMD on performance. In a very unusual manner, you won’t find NVIDIA extolling the virtues of the card’s performance over AMD’s lineup. Even in our press briefing there was little said about gaming performance beyond the fact that it’s faster than the GT 220 and that NVIDIA believes it’s a meaningful upgrade over Intel’s IGP products due to their greater compatibility with games. Instead NVIDIA is largely selling this card upon its virtues as an HTPC card, or as NVIDIA likes to call the broader market segment: Digital Media PCs.

NVIDIA’s ace in the hole is that they have 1 thing right now that AMD doesn’t: a complete 3D stereoscopy strategy. On the hardware side this is due to the fact that GF104/106/108 all have support for HDMI 1.4a, which is necessary for full resolution 3D television/Blu-Ray and is an advantage afforded to them by the fact that AMD’s products are too old to incorporate support for HDMI 1.4a. On the other side NVIDIA has a coherent 3D strategy, with 3D Vision hardware for PC monitors, and thanks to the HDMI support for sending Blu-Ray 3D to TVs (and later this year, 3D gaming through 3D TV Play). And of course NVIDIA has bitstreaming audio capabilities for compressed lossless audio formats, bringing them up to par with AMD's audio offerings and a step up over the GT 200 series which could only support LPCM.

The long and the short of matters is that for gaming performance NVIDIA is already beat; as we’ll see even a Radeon HD 5570 DDR3 can confidently beat the GT 430, never mind the Radeon HD 5670 which can currently be found for even cheaper than the GT 430 once you factor in rebates.

Funny enough NVIDIA won’t give us the actual size of GF108 (they haven’t done this for any Fermi parts), but using a ruler we’d estimate the size to be 11mm x 10.5mm, or roughly 116mm2.
So chopping half a mm off each dimension: 105mm².
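Just to make the arithmetic explicit (the dimensions are Anandtech's ruler estimate, so everything here is approximate):

```python
# Reproducing the GF108 die-size estimates from the ruler measurement.
w, h = 11.0, 10.5                # estimated die dimensions in mm
area = w * h                     # 115.5 mm^2, i.e. "roughly 116mm2"
trimmed = (w - 0.5) * (h - 0.5)  # 10.5 mm x 10.0 mm = 105 mm^2

print(f"as measured: {area} mm^2, half a mm trimmed: {trimmed} mm^2")
```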
 
Any idea what they did to only have 4 rops? It appears it's 2 memory controllers, but only one L2 partition and one quad-rop block?
(I've argued for a long time that GF104/106 have too many ROPs anyway, but it's still somewhat surprising imho that they only have 4 ROPs on a 128-bit bus with GF108; I would have expected 8... Of course, for raw colour fillrate it makes no difference anyway, since pixel output is limited to 2 per SM, and given that the emphasis isn't on fast AA this might even make sense, especially since these cards only use ddr3, so bandwidth is actually lower than it would be on 64-bit gddr5.)
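A small sketch of that fillrate argument, using the 2 pixels/SM/clock cap mentioned above. The core clock and the 1 pixel/ROP/clock ROP rate are illustrative assumptions, not confirmed GF108 figures:

```python
# Sketch: why extra ROPs don't raise raw colour fillrate when the
# SMs' pixel output is the bottleneck. Clock is an assumed figure.

PIXELS_PER_SM_PER_CLOCK = 2  # pixel output cap per SM, as noted above

def colour_fillrate_mpix(sms, rops, core_mhz):
    # Effective fill is limited by whichever is smaller:
    # SM pixel output or ROP throughput (assumed 1 pixel/ROP/clock).
    return min(sms * PIXELS_PER_SM_PER_CLOCK, rops) * core_mhz

# Hypothetical GF108: 2 SMs, 4 ROPs, assumed 700 MHz core clock
print(colour_fillrate_mpix(2, 4, 700))  # 2800 Mpix/s
# Doubling to 8 ROPs changes nothing with only 2 SMs feeding them:
print(colour_fillrate_mpix(2, 8, 700))  # still 2800 Mpix/s
```

Under those assumptions, 4 ROPs already match the 2 SMs' combined pixel output, so the extra ROPs would only have paid off for blending/AA throughput, not raw fill.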
 