caveman-jim
Also has anyone seen any discussion/data on Kepler DX11.1 support?
This again! Any clock increase AMD could manage, NVIDIA could just match.
The HD 7970's memory clocks at 5500 MHz; that is what I am referring to.
> Also has anyone seen any discussion/data on Kepler DX11.1 support?

DX11.1 is supported.
> Do you guys know if the 680's adaptive vsync engages automatically on all games, or is it something that must be forced via control panel or only available on new games that support it? Didn't see that explained on the reviews I saw.

It's in the control panel, two settings actually.
A CUDA toolkit that supports GK104 will be out in the near future (4.2 RC: it's 4.1 + some bug fixes + Kepler support).
Yes, GK104 has CC 3.0.
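For anyone who wants to confirm this once the toolkit lands, compute capability can be queried at runtime with the CUDA runtime API. A minimal sketch, assuming a CUDA-capable device and a toolkit with Kepler support are installed:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    // Enumerate CUDA devices; bail out if none are visible.
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // A GK104 part such as the GTX 680 should report CC 3.0 here.
        std::printf("Device %d: %s, CC %d.%d\n", i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

Build with `nvcc` as usual; the `major`/`minor` fields of `cudaDeviceProp` are the compute capability.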
Apparently, there are at least three review articles with different explanations of the Kepler architecture. Who is right?
It's not like the 680 has abandoned compute. It still does very well in a number of cases. It doesn't look like Nvidia has placed much priority on the areas where it doesn't, however.
I'm not sure how much AMD plans to follow. Its path has already been outlined, and compute/integration enhancements are in store.
The question is whether AMD intends to push ahead on facets of its graphics domain that it has modestly improved or tweaked: areas the 680 has exploited.
The reason Nvidia have dropped the compute from GK104 is that GK110 is still on the way. I expect all of the compute will be back in, and it will depress overall performance in that chip, but it is clear that the 104 is almost purely game focussed. Nvidia have completely outmanoeuvred AMD this gen by stripping the compute from GK104 and pitching it to gamers while leaving their big die for Tesla applications. AMD will have to make a similar move or be left behind next generation.
With the 8 special full-rate 64-bit units, I now fully expect that the GPGPU card from the Kepler generation will not actually be a graphics card.
Something 500mm²-ish with no 32-bit units, triangle setup, interpolation, etc. at all, just fully decked out with 64-bit shader groups. It would be a monster.
Interesting, was the GTX 680 supposed to be the 670Ti?
http://www.geforce.com/Active/en_US...ce-gtx-680/NVCPL-WindowsDesktopManagement.png