NVIDIA GF100 & Friends speculation

The difference is that nVidia is actively reaching out to developers, in effect saying, "Let us help you with that. We can make your game run faster!" And also bear in mind that nVidia's developer relations program is highly unlikely to exclude optimizations that would also help ATI hardware; thus the TWIMTBP program likely improves game performance for both IHVs, but obviously the program's main focus is going to be nVidia hardware, not least because nVidia's support surely knows much more about nVidia hardware than ATI hardware.
I would like to think that there are no free lunches in the world. Nvidia isn't sponsoring anything to make it run efficiently on their competitor's hardware. If they find a performance increase in a codepath, they aren't going to share it. It makes perfect sense not to.

Besides, the point wasn't about how good Nvidia's devrel was; it was about 4A having nothing to lose in this case, well, except TWIMTBP, if that was ever on the line. Like I said, since we do not know what was in play, we can't really conclude one way or another.
 
I would like to think that there are no free lunches in the world. Nvidia isn't sponsoring anything to make it run efficiently on their competitor's hardware. If they find a performance increase in a codepath, they aren't going to share it. It makes perfect sense not to.
The way I understand it, the biggest win from nVidia's perspective isn't the relative performance vs. ATI hardware but the sticker on the box, which means that every gamer who buys that game gets an ad for nVidia as well. If improving performance on ATI hardware as well as their own is the price they have to pay to keep developers coming back to them (developers surely don't want to alienate ATI users), then so be it.
 
As far as I know, ATI doesn't have a comparable developer relations program, so they would hardly have reason to show ATI their code. ATI might just say, "That's nice. Why are you showing this to us?"

The difference is that nVidia is actively reaching out to developers, in effect saying, "Let us help you with that. We can make your game run faster!" And also bear in mind that nVidia's developer relations program is highly unlikely to exclude optimizations that would also help ATI hardware; thus the TWIMTBP program likely improves game performance for both IHVs, but obviously the program's main focus is going to be nVidia hardware, not least because nVidia's support surely knows much more about nVidia hardware than ATI hardware.

Clearly you have not read articles such as this:

http://www.techreport.com/articles.x/18620

You're subscribing to complete fallacies if you think our dev rel just ignores developers. :rolleyes:
 
As another data point, some years ago there was talk about Intel's Montecito chip being packaged with a TEC (thermoelectric cooler). My exposure to that market is nil, so I don't know if this caught on.
Some of the discussion about the leakage of the very large Itanium die at the upper end of its power envelope hinted that it might actually have saved power to put a Peltier on the chip, because reducing the chip's temperature prevented more watts of leakage than the cooler itself consumed.

The scheme also promised to increase the reliability of the silicon and to help in cramped server environments.
There was some amount of "OMG ITANIC IS ON FIRE" mockery, but I thought the idea of hitting a spot on the leakage/temperature curve where a Peltier saves power was kind of neat.
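The leakage/temperature trade-off described above can be put in rough numbers using the common rule of thumb that subthreshold leakage roughly doubles every ~10 °C. All the constants here are illustrative guesses for the sake of the arithmetic, not Montecito data:

```python
def leakage_watts(temp_c, base_w=40.0, ref_c=85.0, doubling_c=10.0):
    """Toy leakage model: leakage roughly doubles every ~10 C.
    base_w, ref_c and doubling_c are made-up illustrative values,
    not measured Itanium figures."""
    return base_w * 2 ** ((temp_c - ref_c) / doubling_c)

# Suppose a Peltier (TEC) draws 15 W but pulls the die from 85 C to 60 C:
tec_draw_w = 15.0
saved_w = leakage_watts(85.0) - leakage_watts(60.0)  # leakage avoided
net_w = saved_w - tec_draw_w  # positive => the TEC saves power overall
```

With these numbers leakage drops from 40 W to about 7 W, so the cooler more than pays for its own draw; with a smaller leakage component it would not, which is why the idea only made sense for a very large, leaky die running hot.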
 
This is the second or third time they have done such a thing; outlining is quite different from actually getting it done.

And some more times in the last week.

http://blogs.amd.com/play/2010/03/26/behind-the-scenes-with-the-amd-gaming-evolved-team/
http://blogs.amd.com/play/2010/03/29/amd-supported-games-top-the-charts/

Interesting that they keep mentioning Bad Company 2 as a title they worked on with the devs to get the most out of, considering it has flickering texture issues and much longer loading times on ATI cards. If this is an example of a game that ATI have worked on, it doesn't make a very good impression.

Other games like Just Cause 2 have issues with Eyefinity, such as being unable to see the quick-time events needed to hijack helicopters.
 
This is the second or third time they have done such a thing; outlining is quite different from actually getting it done.

Well, talking is easy, and cheap to boot. It would be nice to get candid feedback from a code monkey exposed to TWIMTBP, though. All we get are fluff statements in interviews with development executives and from PR.
 
And some more times in the last week.

http://blogs.amd.com/play/2010/03/26/behind-the-scenes-with-the-amd-gaming-evolved-team/
http://blogs.amd.com/play/2010/03/29/amd-supported-games-top-the-charts/

Interesting that they keep mentioning Bad Company 2 as a title they worked on with the devs to get the most out of, considering it has flickering texture issues and much longer loading times on ATI cards. If this is an example of a game that ATI have worked on, it doesn't make a very good impression.

Other games like Just Cause 2 have issues with Eyefinity, such as being unable to see the quick-time events needed to hijack helicopters.

Heh, that AMD blog used the poll at [H]ardOCP to, and I quote, "make it clear that gamers want the richer, more realistic, and more immersive graphical innovations like those that AMD products offer"

PR truly does wonders :)
 
Heh, that AMD blog used the poll at [H]ardOCP to, and I quote, "make it clear that gamers want the richer, more realistic, and more immersive graphical innovations like those that AMD products offer"

I would love to know how many of those gamers actually plonked down the cash :)
 
And some more times in the last week.

http://blogs.amd.com/play/2010/03/26/behind-the-scenes-with-the-amd-gaming-evolved-team/
http://blogs.amd.com/play/2010/03/29/amd-supported-games-top-the-charts/

Interesting that they keep mentioning Bad Company 2 as a title they worked on with the devs to get the most out of, considering it has flickering texture issues and much longer loading times on ATI cards. If this is an example of a game that ATI have worked on, it doesn't make a very good impression.

Other games like Just Cause 2 have issues with Eyefinity, such as being unable to see the quick-time events needed to hijack helicopters.

It's a DX11 thing; the GTX 480 has the problem.
 
Because both IHVs try to get their consumers the best gaming experience possible?

Sure, but if it's a DX11 bug, wouldn't we need a Microsoft patch to the runtime or a game patch to work around the issue? Though I've still not seen it reported anywhere that the GTX 470/480 are affected, and ATI's DX10 cards have the same issue.

I've seen reports of games loading more slowly on ATI hardware for a while now; even Humus mentioned that Just Cause 2 loaded slower on ATI hardware, something to do with the shader compiler.

I only hope that if ATI do fix the issue, it's a fix that will affect more than just BC2.
 
Sure, but if it's a DX11 bug wouldn't we need a Microsoft patch to the runtime or a game patch to work around the issue?

I think he meant it's a BC2/DX11 bug, not a problem with the DX11 api itself.

Though I've still not seen it reported anywhere that the GTX 470/480 are affected, and ATI's DX10 cards have the same issue.

'Cause reviewers were probably so busy writing down fps numbers that they wouldn't have noticed anyway.
 
Also apparently ATI didn't get access to the final game code either.

http://hardforum.com/showpost.php?p=1035518165&postcount=27

I just can't fathom why they would purposely not give ATI access to the final code to tweak their drivers, if TWIMTBP is just supposed to be Nvidia offering support like they claim.


That kind of crap has been going on for years. What software company would not want their game to run well on all hardware rather than targeting one brand? This type of devious marketing has been demonstrated by Intel as well. I remember the lead STALKER programmer telling me via email to dump my Radeon 9700 for a 5800 Ultra because it was a "better" shader-throughput card... ughhh.
As consumers we have the ultimate power; after all, we pay for the product. If only the majority of consumers knew they were being duped or misled on their purchase, it would be a different story. Why not optimize for the two big players and let the best hardware win?
Since game developers are making more profit than the movie industry globally (we are talking billions of dollars), I don't buy into poor game developers needing "support" from ATI/Nvidia.
So if a game title doesn't run well on my hardware, it won't be bought, simple as that. Man, I get pissed off that this stuff goes on today with all the excellent hardware to develop on.
 