Silus said: Right now, GF104 either refers to the X2 card or there is a mid-range chip that taped out, without much fuss or news about it.
Or GF104 references a binned GF100 that will be used in an X2 product?
I was expecting this from NV, but the question remains: will they be able to support 3 monitors with only one card, or will you have to SLI them?
HardOCP just did extensive testing of Eyefinity at different resolutions across an array of cards and games, based on playable (30 to 50 fps) framerates. Kyle's conclusion was 5850 at a bare minimum, 5870 most balanced, 5890 best with caveats, for real gaming.
http://www.hardocp.com/article/2010/01/05/amds_ati_radeon_eyefinity_performance_review
Fermi needs a minimum of 100 to 120 fps for a workable 3-monitor 3D set-up. Based on HardOCP's benchmarks, it's going to take substantially more than 5890 performance to retain anything near enthusiast/gamer settings.
But Nvidia are masters at the PR game; it will be fascinating to see what they put together.
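For what it's worth, a quick back-of-the-envelope sketch of where a 100 to 120 fps figure for a 3-monitor 3D set-up could come from (assuming stereoscopic 3D renders each frame twice, once per eye, and assuming a 50 to 60 fps per-eye playable target; both assumptions are mine, not HardOCP's):

```python
# Rough arithmetic for the framerate a 3-monitor 3D set-up would need.
# Assumption: stereoscopic 3D renders every frame twice (once per eye),
# so the GPU has to deliver 2x the per-eye framerate.
# The 50-60 fps per-eye "playable" target is an illustrative assumption.

PER_EYE_TARGET_FPS = (50, 60)  # assumed playable range per eye
EYES = 2                       # stereo rendering doubles the work

low, high = (fps * EYES for fps in PER_EYE_TARGET_FPS)
print(f"Required rendered framerate: {low} to {high} fps")
# -> Required rendered framerate: 100 to 120 fps
```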
Silus said: What I meant was that the GF104 codename was used to reference the Fermi-based GeForce X2 card.
I speculated in the past that if GF100 (single-chip high-end) was quite a bit faster than the HD 5870 (let's assume LegitReviews' number: 36%), NVIDIA doesn't really need a GF100 X2 to beat the HD 5970. A mid-range chip X2 should be more than enough.
Most considered it to be a flawed speculation, because there was no news of a tape-out for such a chip.
Right now, GF104 either refers to the X2 card or there is a mid-range chip that taped out, without much fuss or news about it.
Isn't that what I said?
Kaotik said: This would contradict every single generation of nVidia codenames since they started to have more than one chip per generation.
What I meant was that the GF104 codename was used to reference the Fermi-based GeForce X2 card.
I speculated in the past that if GF100 (single-chip high-end) was quite a bit faster than the HD 5870 (let's assume LegitReviews' number: 36%), NVIDIA doesn't really need a GF100 X2 to beat the HD 5970. A mid-range chip X2 should be more than enough.
Most considered it to be a flawed speculation, because there was no news of a tape-out for such a chip.
Right now, GF104 either refers to the X2 card or there is a mid-range chip that taped out, without much fuss or news about it.
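To make the arithmetic behind that speculation explicit, here is a minimal sketch with everything normalized to the HD 5870. Only the 36% GF100 advantage comes from the post above (LegitReviews); the CrossFire/SLI scaling factors and the mid-range chip's share of GF100 performance are illustrative assumptions, not known numbers:

```python
# Illustrative arithmetic for the "mid-range chip X2" speculation above.
# Everything is normalized to HD 5870 = 1.0. Only the 36% GF100 advantage is
# taken from the post (LegitReviews); every other number is an assumption.

HD_5870 = 1.0
GF100 = HD_5870 * 1.36         # ~36% faster than HD 5870 (LegitReviews figure)
HD_5970 = HD_5870 * 2 * 0.78   # two 5870-class GPUs at ~78% CrossFire scaling (assumed)

MIDRANGE_VS_GF100 = 0.75       # assumed: mid-range Fermi chip at ~75% of GF100
SLI_SCALING = 0.80             # assumed: ~80% scaling for a dual-GPU card

midrange_x2 = GF100 * MIDRANGE_VS_GF100 * 2 * SLI_SCALING

print(f"HD 5970      ~ {HD_5970:.2f}x HD 5870")      # ~1.56x
print(f"mid-range X2 ~ {midrange_x2:.2f}x HD 5870")  # ~1.63x
# Under these assumptions a dual mid-range card lands at or slightly above the
# HD 5970, which is the gist of the "mid-range X2 is enough" argument.
```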
Yes, but their codename hierarchy has still always remained the same.
This is nVidia you're talking about? Masters of unambiguous naming conventions....
Well, it looks like you're in luck now that we have pix of a real card.
To get an idea of how big the card will be, i.e. whether it will fit into my case.
Nvidia demonstrates a live Unigine DX11 benchmark with tessellation:
http://pc.watch.impress.co.jp/docs/news/event/20100107_340953.html
This would contradict every single generation of nVidia codenames since they started to have more than one chip per generation.
I didn't understand you to mean that GF104 was a binned GF100 part, no. I can see that that might be what you meant. :shrug:
GF104 as a dual-board chip is still based on that unverified 'source' (a laid-off engineer); I wouldn't put too much effort into arguing over it.
And the truth is, a dual-chip card isn't necessary unless the performance of GF100 / the single chip is too close to the 5xxx equivalent for comfort, and they want a dual chip to maintain the performance crown. If GF100 is all it's cracked up to be, I don't know how necessary that dual-chip card is (granted, it could be two cut-down cores).
GF100 is not cracked up to beat Hemlock, so a dual-chip card will always be necessary to retake the performance crown.