ATI's DX11 tessellation to be aided by the shader core?

Hmm... Some sentences seem a bit naive to me.

Nvidia uses 16 tessellation units that are emulated within the CUDA cores.

This was what Charlie said. It was never proved that there isn't any dedicated hardware for tessellation in GF100.

Currently, ATI is working on changing this within the drivers so that the Stream Processors will aid the Tessellation Unit with processing tessellation.

Hello? They are already aiding the Tesselation Unit by doing Domain and Hull Shaders.
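
(For context: in the D3D11 pipeline, the hull and domain shader stages are ordinary programmable shaders that already execute on the shader core; only the tessellator stage between them is fixed-function. Here is a minimal C++ sketch of wiring those stages up, assuming compiled hs_5_0/ds_5_0 bytecode is already in hand; the function and parameter names are illustrative, not from any driver or SDK sample:)

```cpp
// Illustrative D3D11 tessellation setup (device/context creation and shader
// compilation omitted). The hull shader (HS) and domain shader (DS) execute
// on the programmable shader core; the fixed-function tessellator sits
// between them and expands each patch into tessellated geometry.
#include <d3d11.h>

void BindTessellationStages(ID3D11Device* device,
                            ID3D11DeviceContext* context,
                            const void* hsBytecode, SIZE_T hsSize,   // hs_5_0 blob
                            const void* dsBytecode, SIZE_T dsSize)   // ds_5_0 blob
{
    ID3D11HullShader*   hs = nullptr;
    ID3D11DomainShader* ds = nullptr;
    device->CreateHullShader(hsBytecode, hsSize, nullptr, &hs);
    device->CreateDomainShader(dsBytecode, dsSize, nullptr, &ds);

    // Tessellation requires patch-list topology instead of plain triangles.
    context->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);

    context->HSSetShader(hs, nullptr, 0);  // runs on the shader ALUs
    context->DSSetShader(ds, nullptr, 0);  // runs on the shader ALUs
    // ...bind VS/PS and Draw() as usual; only the expansion step in the
    // middle is handled by the dedicated tessellation unit.
}
```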

They're also working to further improve tessellation by making the cache work better between the tessellation unit and the SP's, which is going to be in the 11.0 Catalyst drivers.

Isn't this tied to the hardware? If the hardware had been designed that way, the drivers would have had this long ago; they would not be waiting for Fermi to come out. Maybe this will be addressed in SI, but I doubt something like this can be done on Evergreen with software tweaks alone.
 
I heard this is coming out right after DX 9.1. :LOL:


I get it! :LOL:
 
I loved that DX 9.1 rumor... It was so deliciously desperate... The guys who released and fixed the original Longhorn OS to show "what could've been" should make sure they include a DirectX version that is "optimized for the FX architecture" like 9.1 was supposed to be...
 
Just for the record, I gave it a shot with 10.5 beta on my 5850 crossfire and I saw no great difference. Just a couple of frames, perfectly within the error margin.
 

What 10.5 beta? :smile:
But not these betas: http://www.rage3d.com/board/showthread.php?t=33961625 ?

Seriously: Are there no ways to improve Evergreen's tessellation performance? Shader replacements, better LDS usage, etc.?
 
Generally, 3D hardware vendors don't retrofit new features onto existing products... They want us to buy new hardware entirely instead, so that they'll make more money. :) Also, it's probably easier to just improve tessellation in the next generation of GPUs than to try and fix the current product line.

My wager is that as usual, wild internet rumors are just that. ;)
 
Generally, 3D hardware vendors don't retrofit new features onto existing products...

I would contend that, actually. Generally speaking, where we can, we try to backfill features to the prior generation where it is capable of supporting them. Occasionally a feature has been qualified first on newer hardware and there has been a lag in getting it onto the older parts. For example, CCC recently added some new video capabilities, and they are not Evergreen-specific.
 
On the other hand, ATI announced a few months back that you guys would not spend much effort on improving OpenCL performance for the 4xxx series. So the door definitely swings both ways. :p
 
The real ugly thing that you did was the removal of SGSSAA under OpenGL for HD4000 owners.
I don't care about that, or anything SS-related at all, really. What I REALLY miss is the alpha-to-coverage type "antialiasing" for billboard/foliage polygons, which they removed and replaced with supersampling that, even on a crossfire 4890 setup, is so damn slow it's totally unusable, even in titles like WoW that are half a decade or more behind the curve on a technical level.

Alpha to coverage looked great and ran fast. I don't understand at all why the fuck it was removed and replaced with something that's just total crap and simply can't be used. Forget Crysis... the GPU would have a coronary infarction just thinking about it. :cry:
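
(For anyone who hasn't used it: alpha to coverage is cheap precisely because it's just a render-state flag; the pixel shader's output alpha is converted into an MSAA coverage mask, so foliage edges get smoothed at ordinary MSAA cost with no extra shading passes. A rough D3D11 sketch of the state setup follows, purely illustrative; games of that era would use the D3D9 or OpenGL equivalents such as GL_SAMPLE_ALPHA_TO_COVERAGE instead:)

```cpp
// Illustrative: enabling alpha to coverage via a D3D11 blend state.
// With an MSAA render target bound, each pixel's output alpha is turned
// into a per-sample coverage mask, antialiasing alpha-tested foliage
// and billboards at roughly MSAA cost.
#include <d3d11.h>

ID3D11BlendState* CreateAlphaToCoverageState(ID3D11Device* device)
{
    D3D11_BLEND_DESC desc = {};
    desc.AlphaToCoverageEnable = TRUE;  // the key switch
    desc.RenderTarget[0].BlendEnable = FALSE;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

    ID3D11BlendState* state = nullptr;
    device->CreateBlendState(&desc, &state);
    return state;
}

// Usage: context->OMSetBlendState(state, nullptr, 0xFFFFFFFF);
```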

Other than the extreme idle power usage of the 4890, this is the only thing that really bugs me with these cards. Well, the crappy texture AF too, and the less effective MSAA compared to G80 (particularly on lines, which don't seem to be AA'd at all), but those I can ignore more easily.

However, WoW on my now almost ancient G80s in SLI, with 16x AA, alpha to coverage turned on, and max AF, looks awesome and runs slick as snot. ATI's 8x AA + max AF is noticeably more jagged, textures shimmer where they don't on the G80s, and of course there's no alpha to coverage, so there are crawling ants around the edges of every damn tree and plant. That sucks.
 