NVIDIA GT200 Rumours & Speculation Thread

It still does not explain Nvidia's decision not to support DX10.1 - at all - in Tesla.
Or do you think they will be able to add support in their refresh and shrink?
They can't support it in, say, a GT200-based Tesla and not support it in the GTX 280; that would be bad PR. I think they will have support one way or another (i.e. adding it on the shrink, or, if it's already there in the silicon, adding code to the driver) once they bring out GT200-based mid-range devices. Or they will just ignore it completely and try to pressure Microsoft into another DX refresh with a couple of new features on top of 10.1, where they have the initiative... DX refreshes are pretty much a game of Microsoft-blessed semi-proprietary APIs (NVIDIA will have a harder time getting this than ATI did with 10.1, of course).
 
Compare that graph to this Nvidia-supplied one. I'll take your word for it, but I'm not aware of too many independent sites that use that colour scheme on their graphs - VR-Zone certainly doesn't. *shrug*

Did that make the graph so wrong?
"Visually" deceiving customers through colorful charts with scale issues is not something that ATI is immune to either.

Also, remember that the previous slide was made when RV670 hadn't yet launched (the 8800 GT launched first), while this time NV is comparing the GTX 280/260 with a known quantity (the HD3870 X2), not RV770.
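
For what it's worth, the axis trick is trivial to reproduce. Here is a minimal sketch (with made-up scores, not actual benchmark numbers) of how truncating the y-axis inflates a small performance gap:

```python
# Minimal sketch of the truncated-axis trick; the FPS numbers are invented.
import matplotlib.pyplot as plt

cards = ["HD3870 X2", "GTX 280"]
fps = [52, 60]  # hypothetical scores, roughly 15% apart

fig, (honest, shady) = plt.subplots(1, 2, figsize=(8, 3))

# Zero-based axis: a ~15% gap looks like ~15%.
honest.bar(cards, fps)
honest.set_ylim(0, 70)
honest.set_ylabel("FPS")
honest.set_title("Zero-based y-axis")

# Truncated axis: the same gap now fills most of the plot.
shady.bar(cards, fps)
shady.set_ylim(50, 62)
shady.set_title("Truncated y-axis")

plt.tight_layout()
plt.show()
```

Same data in both panels; only the y-axis range changes.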
 

I offered no opinion on whether ATI or Nvidia is less sinful, just that it's obviously marketing material.

That they feel it necessary to resort to one of the oldest tricks in the book does tell me they're trying to hide something. While I offer no opinion on what that might be, there are certainly plenty of people here who might - if they choose to ;)
 

Of course they're trying to hide something - these slides are part of an NDA'd PDF file! NDA means "Non-Disclosure Agreement", right? :p ;)

Somehow I doubt they would try to hide something with PowerPoint or PDF presentations and then hand over the actual hardware so that websites around the world could test against (or in support of) those "official" findings by Nvidia, all of this roughly two weeks before the NDA expiration date.
 

It seems to my conspiracy-addled brain that the "leak" of that slide is meant to distract from the rumours about the thermal envelopes of certain upcoming products ;)
 

Well, ~240W TDP for the GTX 280 and ~180W TDP for the GTX 260 don't surprise me.
But a 160W TDP and the need for 6-pin + 6-pin power plugs on the HD4870 (non-X2, mind you) does. It makes me wonder where an HD4870 X2 would stand, and whether it would be that far from 240W itself. ;)
 
What makes you think 6+8-pin is required for HD4870?

Jawed

Obviously, a typo.
Still, 6-pin + 6-pin for a single-GPU product that is supposedly cooler than the 9800 GTX (at least the 65nm version; I don't know about G92b yet)?
And a 160W TDP for a GDDR5 card (wasn't lower power - or a similar power envelope at higher speeds - one of the main selling points of that technology)?

It will be interesting to compare the HD4870 X2 with both GTX 2xx cards in that regard, when the time comes.
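
As a point of reference, the PCI Express spec puts hard ceilings on board power: 75W from the slot, 75W per 6-pin plug, and 150W per 8-pin plug. A quick sketch (using the rumoured TDP figures from this thread, which are not confirmed numbers) of which connector layout covers which TDP:

```python
# Power ceilings per the PCIe spec: slot 75W, 6-pin 75W, 8-pin 150W.
SLOT, SIXPIN, EIGHTPIN = 75, 75, 150

ceilings = {
    "slot only":      SLOT,                      # 75W
    "slot + 6-pin":   SLOT + SIXPIN,             # 150W
    "slot + 6+6-pin": SLOT + 2 * SIXPIN,         # 225W
    "slot + 6+8-pin": SLOT + SIXPIN + EIGHTPIN,  # 300W
}

# Rumoured TDPs from this thread (not confirmed).
for card, tdp in [("HD4870", 160), ("GTX 260", 180), ("GTX 280", 240)]:
    fitting = [name for name, watts in ceilings.items() if watts >= tdp]
    # smallest layout whose ceiling still covers the TDP
    best = min(fitting, key=ceilings.get)
    print(f"{card} @ ~{tdp}W -> {best} ({ceilings[best]}W ceiling)")
```

So a 160W board is just over what the slot plus a single 6-pin can deliver (150W), which would explain the 6-pin + 6-pin layout on the HD4870, while ~240W pushes the GTX 280 past the 225W ceiling of 6+6 and into 6+8 territory.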
 
Yes, 160W (for the 1GB card?) - that seems to be about the same as the 9800 GTX. G92b should be cooler/lower power.

Jawed
 