AMD: R9xx Speculation

Get some sweet-looking demos out using adaptive tessellation that run well on their hardware? What better way to prove their point than to produce something that runs faster and looks better than the uber-tessellated stuff nVidia is pushing? Then everybody will see for themselves that nVidia's approach is wasteful and pointless.

Not demos, but we can see for ourselves soon enough: http://gamevideos.1up.com/video/id/31916/bigger
 
That's gotta be a mistake, AT... AMD isn't pushing tessellation! :oops:
Of course they don't! ATI invented tessellation for themselves. Games like Deus Ex: Human Revolution, DiRT 2, Battlefield: Bad Company 2, Aliens vs. Predator, S.T.A.L.K.E.R.: Call of Pripyat, etc. are meant for AMD internal use only! Therefore Sontin and Trini are correct, AMD isn't pushing tessellation whatsoever :p
 
Sontin has been sent on his way. I'd suggest learning from his story, rather than dancing wildly around the celebration fire. If you feel the need to do some heavy IHV evangelism, just don't. Contribute to debates; if Trini is wrong, at least attempt to outline why - ceilingcat knows that both sides of that debate have some merit.
 
Sontin has been sent on his way. I'd suggest learning from his story, rather than dancing wildly around the celebration fire. If you feel the need to do some heavy IHV evangelism, just don't. Contribute to debates; if Trini is wrong, at least attempt to outline why - ceilingcat knows that both sides of that debate have some merit.
/me shuts up and flies right

Sorry Alex, couldn't resist. I shall endeavor to behave. :oops:
 
This tessellation mania is really interesting. nVidia, who completely ignored tessellation for years with G80, G92 and GT200 and caused its removal from the DX10 spec, now presents itself as the company that gave tessellation to gamers. If I sum it up: tessellation wasn't important in 2006, 2007 and 2008. It was completely irrelevant in 2009, especially by the year's end, and during the first half of 2010. At that time, nVidia said that AMD was punishing gamers by implementing DX11 tessellation... But nVidia changed its mind, and tessellation became essential and extremely important in H2/2010...

//late :(
 
Sontin has been sent on his way. I'd suggest learning from his story, rather than dancing wildly around the celebration fire. If you feel the need to do some heavy IHV evangelism, just don't. Contribute to debates; if Trini is wrong, at least attempt to outline why - ceilingcat knows that both sides of that debate have some merit.
I'm sorry, I'm cute... I mean - I'll behave :p
 
Personally, it seems AMD is out to promote tessellation everyone can use, while Nvidia is pushing tessellation only the 470 and 480 can use.

I know which team I choose to support.
 
Why would a video card support Ethernet over HDMI?

Capturing video/audio from an integrated AV device; passthrough of integrated features of the TV, such as Google TV; passthrough of encoded streams for the TV to process (for example, PiP or commentary); file transfer from an integrated USB/SD card slot, without an additional cable.

Reasons off the top of my head.
 
This tessellation mania is really interesting. nVidia, who completely ignored tessellation for years with G80, G92 and GT200 and caused its removal from the DX10 spec, now presents itself as the company that gave tessellation to gamers. If I sum it up: tessellation wasn't important in 2006, 2007 and 2008. It was completely irrelevant in 2009, especially by the year's end, and during the first half of 2010. At that time, nVidia said that AMD was punishing gamers by implementing DX11 tessellation... But nVidia changed its mind, and tessellation became essential and extremely important in H2/2010...

//late :(
(my bold)
I've heard this repeated time and again on forums. Is there any hard evidence for it? I'd love to see the whole story.
 
It's not about that, but about monitor color temperature calibration.

How this could've gone unnoticed between the beta and the release is beyond me, though.
It's been said that if you move the sliders for brightness, contrast, color or hue, things like borders, panes and buttons - and in particular gray areas - turn pink, so I don't think it's limited to just the Kelvin slider. There's also an issue where, once you do go back to previous drivers, hue and brightness don't save the values you chose.
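For anyone wondering what the Kelvin slider actually computes: roughly speaking, it maps a color temperature to an RGB white point that the rest of the color pipeline gets scaled by. Here's a rough C sketch - constants rounded from Tanner Helland's widely circulated black-body curve fit, so this is an approximation of the idea, not AMD's actual driver code:

```c
#include <math.h>
#include <stdio.h>

/* Approximate mapping from color temperature (Kelvin) to an RGB white
 * point. Constants rounded from Tanner Helland's published curve fit
 * of the black-body locus; illustrative only, not AMD driver code. */
static void kelvin_to_rgb(double kelvin, double rgb[3])
{
    double t = kelvin / 100.0;

    /* Red channel */
    rgb[0] = (t <= 66.0) ? 255.0
                         : 329.6987 * pow(t - 60.0, -0.1332047);
    /* Green channel */
    rgb[1] = (t <= 66.0) ? 99.4708 * log(t) - 161.1196
                         : 288.1222 * pow(t - 60.0, -0.0755148);
    /* Blue channel */
    rgb[2] = (t >= 66.0) ? 255.0
           : (t <= 19.0) ? 0.0
                         : 138.5177 * log(t - 10.0) - 305.0448;

    for (int i = 0; i < 3; i++) {            /* clamp to [0, 255] */
        if (rgb[i] < 0.0)   rgb[i] = 0.0;
        if (rgb[i] > 255.0) rgb[i] = 255.0;
    }
}

int main(void)
{
    double rgb[3];
    kelvin_to_rgb(6500.0, rgb);              /* a typical default */
    printf("6500K -> R=%.0f G=%.0f B=%.0f\n", rgb[0], rgb[1], rgb[2]);
    return 0;
}
```

With a broken slider, whatever this computes presumably gets applied to everything, which would explain gray UI elements turning pink.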
 
(my bold)
I've heard this repeated time and again on forums. Is there any hard evidence for it? I'd love to see the whole story.
Hard evidence is impossible in such cases (do you really think Microsoft will come out and publicly state it?), same as with the Assassin's Creed DX10.1 affair - just logical reasoning and inside sources. As in: Nvidia fans won't ever believe it, while the rest of the world won't have a hard time assuming it's probably true.
 
You mean like Froblins?

I think you and others are missing the point, intentionally or otherwise. It's not that AMD hasn't been talking about tessellation. I must be the only person here who sees the irony in a company pitching a 7th-generation feature against the competition's 1st-generation one while simultaneously complaining that the competition has "too much" performance. Also, after 7 generations, if the best you can come up with is stuff like this, you're asking for trouble.

Let me put it this way. If Huddy approached you right now and said "Hey Jawed, nVidia is pushing this over-tessellated crap. What do you recommend we do to show people the right way to do things?"

Will you tell him to dust off "Froblins" - a proprietary DX10.1 implementation that is not only visually uninspired but irrelevant in the DX11 world? Will that be your counter to nVidia's marketing money and aggressive tactics? There is such a thing as being too techie; a little pragmatic thinking goes a long way.

Not demos, but we can see for ourselves soon enough: http://gamevideos.1up.com/video/id/31916/bigger

Yes, that's exactly what they need to do. Put the focus on them and what they're doing, not what nVidia is doing TO them. :)
 
I think you and others are missing the point, intentionally or otherwise. It's not that AMD hasn't been talking about tessellation. I must be the only person here who sees the irony in a company pitching a 7th-generation feature against the competition's 1st-generation one while simultaneously complaining that the competition has "too much" performance. Also, after 7 generations, if the best you can come up with is stuff like this, you're asking for trouble.

Let me put it this way. If Huddy approached you right now and said "Hey Jawed, nVidia is pushing this over-tessellated crap. What do you recommend we do to show people the right way to do things?"

Will you tell him to dust off "Froblins" - a proprietary DX10.1 implementation that is not only visually uninspired but irrelevant in the DX11 world? Will that be your counter to nVidia's marketing money and aggressive tactics? There is such a thing as being too techie; a little pragmatic thinking goes a long way.
How can you miss the point after ten answers? I'll try to keep it as simple and short as possible:

1. Nobody is complaining about Nvidia pushing tessellation as such; AMD and the rest point out that it's absurd to push tessellation of which some 75% is discarded, as in HAWX (see the sketch below).

2. AMD is helping implement tessellation in many games (the list I mentioned above isn't even complete, yet you speak as if AMD isn't doing anything), but they don't pay devs to harm the competition, and that isn't enough to overcome the cheques Nvidia is writing. That's the pragmatic thinking you should be talking about. Cheque > proper implementation of anything: DX10.1, tessellation, AA, or whatnot.
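For the curious, here's a back-of-envelope C sketch of where a figure like "75% discarded" can come from: GPUs shade fragments in 2x2 quads, so a triangle covering only about one pixel still pays for four shaded samples. This is my own toy arithmetic to illustrate the point, not AMD's or Nvidia's numbers:

```c
#include <stdio.h>

/* Toy model of quad-shading waste. Rasterizers shade in 2x2 quads, so
 * a pixel-sized triangle still lights up a whole quad: roughly three
 * extra "helper" samples per tiny triangle. Real hardware differs in
 * the details; this only illustrates the trend. */
static double wasted_shading_fraction(double avg_tri_area_px)
{
    /* Assume each triangle shades about one quad's worth of helper
     * samples on top of its useful area. */
    double shaded_samples = avg_tri_area_px + 3.0;
    return 1.0 - avg_tri_area_px / shaded_samples;
}

int main(void)
{
    double areas[] = { 0.5, 1.0, 8.0, 64.0 };  /* avg triangle sizes */
    for (int i = 0; i < 4; i++)
        printf("avg triangle area %5.1f px -> ~%2.0f%% of shading wasted\n",
               areas[i], 100.0 * wasted_shading_fraction(areas[i]));
    return 0;
}
```

At one pixel per triangle you waste about 75% of the shading work; at 8 pixels per triangle it drops to roughly a quarter.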
 
How can you miss the point after ten answers? I'll try to keep it as simple and short as possible:

1. Nobody is complaining about Nvidia pushing tessellation as such; AMD and the rest point out that it's absurd to push tessellation of which some 75% is discarded, as in HAWX (see the sketch below).

2. AMD is helping implement tessellation in many games (the list I mentioned above isn't even complete, yet you speak as if AMD isn't doing anything), but they don't pay devs to harm the competition, and that isn't enough to overcome the cheques Nvidia is writing. That's the pragmatic thinking you should be talking about. Cheque > proper implementation of anything: DX10.1, tessellation, AA, or whatnot.

I notice you selectively didn't quote his fourth paragraph (well, it was more of a sentence), which praised AMD.
 
I think you and others are missing the point, intentionally or otherwise. It's not that AMD hasn't been talking about tessellation. I must be the only person here who sees the irony in a company pitching a 7th-generation feature against the competition's 1st-generation one while simultaneously complaining that the competition has "too much" performance. Also, after 7 generations, if the best you can come up with is stuff like this, you're asking for trouble.

Let me put it this way. If Huddy approached you right now and said "Hey Jawed, nVidia is pushing this over-tessellated crap. What do you recommend we do to show people the right way to do things?"

Will you tell him to dust off "Froblins" - a proprietary DX10.1 implementation that is not only visually uninspired but irrelevant in the DX11 world? Will that be your counter to nVidia's marketing money and aggressive tactics? There is such a thing as being too techie; a little pragmatic thinking goes a long way.

Yes, that's exactly what they need to do. Put the focus on them and what they're doing, not what nVidia is doing TO them. :)

AMD is doing something very concrete: helping developers and pushing adaptive tessellation, which is the only sane way to tessellate. Sadly, some developers *cough* Ubisoft *cough* refuse to listen because of marketing contracts with NVIDIA.
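Something like this, in spirit - a minimal C sketch of distance-adaptive tessellation factors. This is my own illustration of the idea; in a real DX11 title the equivalent code would live in the hull shader's patch-constant function, and engines often use projected screen-space edge length rather than plain camera distance:

```c
#include <stdio.h>

/* Distance-adaptive tessellation: full detail up close, tapering to no
 * subdivision in the distance. Parameters are made-up example values. */
static float adaptive_tess_factor(float dist,       /* patch to camera   */
                                  float near_dist,  /* full detail here  */
                                  float far_dist,   /* min detail here   */
                                  float max_factor) /* e.g. 16, up to 64 */
{
    /* Map distance into [0, 1], then blend the factor from max down to 1. */
    float t = (dist - near_dist) / (far_dist - near_dist);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return 1.0f + (max_factor - 1.0f) * (1.0f - t);
}

int main(void)
{
    for (float d = 0.0f; d <= 100.0f; d += 25.0f)
        printf("distance %5.1f -> tess factor %4.1f\n",
               d, adaptive_tess_factor(d, 5.0f, 100.0f, 16.0f));
    return 0;
}
```

The point being: geometry only gets subdivided where the extra triangles are actually visible.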

And the fact is that it works very well: as far as I'm aware, all currently available DX11 games feature tessellation, and they all run very well on Radeons. Now, I guess people who spend all day playing Heaven and Stone Giant will be better off with Fermi, but I'm sure NVIDIA has more than enough GTX 470s/480s in stock for them.
 
I notice you selectively didn't quote his fourth paragraph (well, it was more of a sentence), which praised AMD.
That's what AMD has been doing all along, and I mentioned loads of examples, which Trini chose to ignore. The praise afterwards looks... interesting, and contradictory to many of his posts.
 
Nvidia also recommended that developers aim for a minimum of 8 pixels per triangle where possible. They are just at the point where they can push it further with a smaller penalty than AMD, and use it for marketing purposes. :rolleyes:

On the other hand, the HAWX 2 benchmark results are still high for AMD cards, so it's not a situation where the game would be unplayable on AMD cards because of Nvidia. ;)
From hardware.fr:
http://www.hardware.fr/articles/804-17/dossier-amd-radeon-hd-6870-6850.html
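If anyone wants to see what that guideline means in practice, here's a small C sketch of turning a minimum-pixels-per-triangle target into an edge tessellation factor - my own illustration of the idea, not Nvidia's actual tooling. A triangle of about 8 px² has edges of roughly 4 px, so you cap the factor so that generated edges don't get any shorter than that:

```c
#include <stdio.h>

/* Derive an edge tessellation factor from a projected patch-edge
 * length, clamped so triangles stay above a minimum pixel size.
 * All values here are example numbers, not vendor recommendations. */
static float edge_tess_factor(float edge_len_px, float min_tri_edge_px)
{
    float f = edge_len_px / min_tri_edge_px;  /* sub-edges per patch edge */
    if (f < 1.0f)  f = 1.0f;                  /* never below 1            */
    if (f > 64.0f) f = 64.0f;                 /* DX11 tessellator cap     */
    return f;
}

int main(void)
{
    /* A patch edge spanning 200 screen pixels, with >= 4 px triangle
     * edges as the target (~8 px^2 triangles): */
    printf("tess factor = %.1f\n", edge_tess_factor(200.0f, 4.0f));
    return 0;
}
```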
 
Nvidia also recommended that developers aim for a minimum of 8 pixels per triangle where possible. They are just at the point where they can push it further with a smaller penalty than AMD, and use it for marketing purposes. :rolleyes:

On the other hand, the HAWX 2 benchmark results are still high for AMD cards, so it's not a situation where the game would be unplayable on AMD cards because of Nvidia. ;)
From hardware.fr:
http://www.hardware.fr/articles/804-17/dossier-amd-radeon-hd-6870-6850.html

The 5670 and below would certainly be close to unplayable at max settings, even at 1680x1050.
 