"it still does not explain Nvidia decision to not support Dx10.1 - at all - in Tesla - or do you think they will be able to add support in their refresh and shrink?"
They can't support it in, say, the GT200-based Tesla and not support it in the GTX 280; that would be bad PR. I think they will have support one way or another (i.e. adding it on the shrink, or, if it's already there, adding code to the driver) once they bring out GT200-based mid-range devices. Or they will just ignore it completely and try to pressure Microsoft into another DX refresh with a couple of new features on top of 10.1, where they have the initiative... DX refreshes are pretty much a game of Microsoft-blessed semi-proprietary APIs (NVIDIA will have a harder time getting this than ATI did with 10.1, of course).
Another CJ-provided Nvidia GTX 200 series slide has turned up at vr-zone.
This time it's a performance chart:
http://www.vr-zone.com/articles/GeForce_GTX_280_&_GTX_260_Gaming_Performance/5817.html
If you happen to get your hands on anything better in the next two weeks be sure to share it with us.
You mean disseminate marketing material? I'm not in that business.
Does this apply to us too? (*cough* NV *cough*)
Neither am I.
"Compare that graph to this Nvidia-supplied one."
I'll take your word for it, but I'm not aware of too many independent sites that use that colour scheme on their graphs - vr-zone certainly don't. *shrug*
Did that make the graph so wrong?
"Visually" deceiving customers through colorful charts with scale issues is not something that ATI is immune to, either.
I offered no opinion on whether ATI or Nvidia is less sinful, just that it's obviously marketing material.
That they feel it necessary to resort to one of the oldest tricks in the book does tell me they're trying to hide something. While I offer no opinion on what that might be, there are certainly plenty of people here who might - if they choose to.
Of course they're trying to hide something; these slides are part of an NDA'd PDF file! NDA means "Non-Disclosure Agreement".
Somehow I doubt they would try to hide something with PowerPoint or PDF presentations, and then hand over the actual hardware so that websites around the world could test against (or in support of) those "official" findings by Nvidia.
It seems to my conspiracy-addled brain that the "leak" of that slide is meant to distract from the rumours about the thermal envelopes of certain upcoming products.
"But [...] the need for 6pin+8pin power plugs for HD4870 (non X2, mind you) does."
What makes you think 6+8-pin is required for HD4870?
Jawed
"Still, 6pin+6pin for a single GPU product that is supposedly cooler than 9800 GTX (at least the 65nm version, I don't know about G92b yet)? 160W TDP for a GDDR5 card?"
Yes, 160W (1GB card?) seems to be about the same as 9800 GTX. G92b should be cooler/lower-power.
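For anyone following the plug debate, the arithmetic is straightforward: the PCI Express spec allows up to 75W from the slot, 75W per 6-pin plug, and 150W per 8-pin plug. A quick sketch (function name is just for illustration):

```python
# Spec power limits: PCIe slot delivers up to 75 W, a 6-pin aux plug
# up to 75 W, and an 8-pin aux plug up to 150 W.
PLUG_WATTS = {"6pin": 75, "8pin": 150}
SLOT_WATTS = 75

def power_budget(*plugs: str) -> int:
    """Total spec power available to a card with the given aux plugs."""
    return SLOT_WATTS + sum(PLUG_WATTS[p] for p in plugs)

print(power_budget("6pin", "6pin"))  # 225 W - plenty of headroom for a 160 W TDP
print(power_budget("6pin", "8pin"))  # 300 W - only needed for a much hotter part
```

So a 160W card fits comfortably in a 6+6-pin budget; requiring 6+8-pin would suggest a TDP well above 225W.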
"It will be interesting to compare HD4870 X2 with both GTX 2xx in that regard, when the time comes."
Judging by these