NVIDIA Kepler speculation thread

http://www.abload.de/img/diablo-3-ao-1080p8tug9.png

What should we understand from this graph? AFAIK ambient occlusion is purely shader intensive? It looks like GK104 is shader limited here, otherwise it wouldn't take the heaviest hit with AO enabled. OTOH, following that logic, the GTX260 takes the lowest hit, but you would expect the opposite, since its texture/pixel power is relatively stronger and its shader power weaker than the others'.
 
I think AO is also depth-buffer access bound, depending on the implementation.

p.s.: Probably the limitation is in the memory bandwidth -- 115 vs. 111 GB/s in favour of the GTX 460.
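The bandwidth point can be made concrete with some back-of-the-envelope arithmetic. This is only a rough sketch under assumed figures (1080p, 16 depth taps per pixel, 32-bit depth, no texture-cache reuse) and not a claim about NV's actual HBAO implementation:

```python
# Back-of-the-envelope: depth-buffer read traffic of an SSAO-style pass.
# All figures below are illustrative assumptions, not measured numbers.

WIDTH, HEIGHT = 1920, 1080       # 1080p render target
SAMPLES_PER_PIXEL = 16           # assumed AO kernel size
BYTES_PER_DEPTH_READ = 4         # 32-bit depth value

def ao_depth_traffic_gb(fps):
    """Worst-case depth-read traffic per second in GB (ignores caching)."""
    reads_per_frame = WIDTH * HEIGHT * SAMPLES_PER_PIXEL
    bytes_per_sec = reads_per_frame * BYTES_PER_DEPTH_READ * fps
    return bytes_per_sec / 1e9

# At 60 fps this comes to roughly 8 GB/s of raw depth reads for the AO
# pass alone, on top of everything else the frame does -- enough that a
# few GB/s of bandwidth difference between cards can show up in results.
print(ao_depth_traffic_gb(60))
```

In practice the texture cache absorbs much of this, but the wider the AO sampling radius, the worse the cache hit rate, which is why depth-sampling AO techniques tend to lean on memory bandwidth.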
 

Thanks fellix. It's Nvidia's driver AO; the game doesn't support AO natively. I guess NV uses the same AO method for all profiles. You seem to be right -- I found Tridam's article about NV's AO, I hope it helps. Lots of Z-buffer talk, not that I understand all of it :)
 
Diablo 3 has a baked AO solution; using driver-based SSAO (actually HBAO) on top of it messes with image stability and quality.
 

If they used AO from the Nvidia driver, that could explain it (I don't know how it behaves with old cards). The problem with this technique is that it replaces the "standard AO" (HBAO etc.) with Nvidia's optimised technique, so it's hard to compare if it isn't optimised for older cards (or goes unused on old cards if the game doesn't use AO, HBAO or SSAO). This can change performance, but quality-wise it's really hard to see a difference until you look for it (a little like FP demotion). How the driver interacts with old cards when using this new feature, though, I can't tell.

Anyway, Diablo 3 is still in beta; I don't even know why they're doing this type of benchmark right now.
 
I found a pic on another forum suggesting that the Nvidia drivers support, or use, DX11.1 on Win7. Wouldn't we need to download and install a new DX setup?

Edit:
Found another thread discussing it. It looks like something new in the 302.59 beta.
 

Could it be possible to implement DX11.1 without hardware support? I mean, there hasn't been any mention of Kepler supporting DX11.1... at least not until now.
 
The latest driver releases from NV are stating DX11.1 support for all DX11-capable hardware. Could be a bug in the driver's control panel. :???:
 

Can you or anyone check DXDiag's System tab to see what DirectX version it reports? What about the DDI in the Display tab?
 

Actually, there has been mention of it: Muropaketti specifically asked whether it supported 11.1, and the answer was "yes, but who cares".
[Translated from Finnish:] On top of NVIDIA still holding back Kepler's compute specifications, we also had to ask separately whether the GK104 graphics chip comes with DirectX 11.1 support. The answer is yes, but according to NVIDIA's Drew Henry (General Manager for NVIDIA's PC GPU Business Unit), "Who cares?".

http://muropaketti.com/artikkelit/naytonohjaimet/nvidia-geforce-gtx-680-gk104
 
DXDiag is limited by the runtime support of the OS. Since the DX11.1 runtime is not yet available for Win7, it doesn't matter if the hardware (or the drivers) exposes something newer.
 

OK, thanks. I was thinking that perhaps the drivers were updating the DirectX files or something.
 
Are these things on par performance-wise with Ivy Bridge?
It's at the same time funny and sad how NV digs its own grave with these rebrands, and rebrands, and again rebrands... :cry:
The GT610 (aka GT520) can't even quite keep up with the SNB HD3000, but the GT630 will be faster than the HD4000 (not sure about the GT620 -- probably very dependent on the benchmark, as the pathetic bandwidth will hurt more in some benchmarks than in others). There's actually not that much of a difference between the GT630 DDR3 and GDDR5 versions (below 10% overall), so I suspect the GDDR5 version will mostly exist on paper only. Either way, the GT630 doesn't look like a worthwhile upgrade to me if you have an IVB HD4000, but they probably aren't intended for that anyway.
 