NVIDIA Kepler speculation thread

If you think about it, there's no point in getting a GTX 680 at 550€ when you can get two 560 Tis at 450€...

Except for saving about 150W under load, getting new features, not having to deal with micro-stuttering and all the headaches of SLI, etc.
 
The box label says "DirectX 11" -- no 11.1 support?

They won't necessarily print DX11.1 on the box. If I remember right, my Sapphire box doesn't show 11.1 (though it does mention Microsoft's second generation of DX11), while the HIS one lists DX11.1 support in the specs (I'd need to check, as I've thrown the box away).

We'll need GPU-Z to confirm, but I don't see any reason why the GTX 6xx wouldn't support version 11.1.
 
They won't necessarily print DX11.1 on the box. If I remember right, my Sapphire box doesn't show 11.1 (though it does mention Microsoft's second generation of DX11), while the HIS one lists DX11.1 support in the specs (I'd need to check, as I've thrown the box away).

We'll need GPU-Z to confirm, but I don't see any reason why the GTX 6xx wouldn't support version 11.1.

nVidia didn't implement the required features perhaps?
They never supported DX10.1 before their DX11 cards, either
 
He sounds like they don't give a damn about DX11.1:

http://forum.beyond3d.com/showpost.php?p=1612768&postcount=60

Yes, it sounds like DX11.1 is a special AMD marketing feature. They hope it will sell more products for them... Different priorities, I dare say.

GT215 and GT216 did support DX10.1.

Because Windows 7 requires it for a card to be "W7 ready" certified? :???:

Except for saving about 150W under load, getting new features, not having to deal with micro-stuttering and all the headaches of SLI, etc.

It's not worth it for almost 200€ more; you could pay a lot of electricity bills with that amount of money. And then, the idea of two cards is much more seductive, like two is more than one. :???:
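
Rough numbers on that, for the sake of argument (just a back-of-the-envelope sketch; the 0.25€/kWh electricity price is my assumption, not from the thread):

[code]
// Back-of-the-envelope: how long until a ~150W load-power saving
// pays back a ~200 EUR price gap. Electricity price is an assumption.
#include <cstdio>

int main()
{
    const double price_gap_eur = 200.0;  // GTX 680 vs. two 560 Tis, from the thread
    const double watts_saved   = 150.0;  // load-power difference quoted above
    const double eur_per_kwh   = 0.25;   // assumed electricity price

    const double kwh_to_break_even   = price_gap_eur / eur_per_kwh;            // 800 kWh
    const double hours_to_break_even = kwh_to_break_even / (watts_saved / 1000.0);

    std::printf("Break-even: %.0f kWh, i.e. ~%.0f hours at full load\n",
                kwh_to_break_even, hours_to_break_even);
    return 0;
}
[/code]

That works out to roughly 5,300 hours under full load, so the power saving alone doesn't pay back the premium any time soon.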
 
Launching 4 months later, when they know the competition's clocks... I'm guessing the rumored 950MHz was the original spec and it was bumped up to 1006MHz precisely to win by that percentage. That's if it's even a stock-clocked model; it may be a factory-OC version. The power/size differences are nice, I suppose, but I'd like to see it in practice a little more, and see how memory- and bandwidth-limited it is as well.

4 months, 10 weeks... what's the difference? :D
 
The box label says "DirectX 11" -- no 11.1 support?

It will probably be there, but either way it's not going to be a big deal. Developers are not going to alienate masses of vanilla DX11 users just to implement what are likely to be unnoticeable features/gains with DX11.1. There may be a handful of titles with optional DX11.1 support within the next year or two, but again, whatever DX11.1 support is there will be entirely transparent and unnoticeable.
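
For what it's worth, that kind of transparent support is exactly what D3D11 device creation allows: a title asks for feature level 11_1 first and quietly falls back to 11_0 on older runtimes. A minimal sketch of the pattern (assuming Windows 8 SDK headers; this is the standard runtime idiom, not taken from any particular title):

[code]
// Ask for feature level 11_1, fall back to vanilla 11_0 transparently.
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

HRESULT CreateDeviceWithFallback(ID3D11Device** dev,
                                 ID3D11DeviceContext** ctx,
                                 D3D_FEATURE_LEVEL* got)
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_1,   // take 11.1 if runtime/driver offer it
        D3D_FEATURE_LEVEL_11_0,   // otherwise vanilla DX11
    };

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
                                   dev, got, ctx);

    // A pre-11.1 runtime rejects any request that mentions 11_1 outright
    // with E_INVALIDARG, so retry asking for 11_0 only.
    if (hr == E_INVALIDARG)
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               &levels[1], 1, D3D11_SDK_VERSION,
                               dev, got, ctx);
    return hr;
}
[/code]

The game then just checks the returned feature level and enables the 11.1 bits if they're there, which is why vanilla DX11 users would never notice anything.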
 
So?
For God's sake, GPU-Z uses a DB; it does not read the configuration from the chip.
If it doesn't recognize GK104 correctly (DB not updated), it will show borked info.
Use EVGA Precision. After all, it's being written by Unwinder ATM.

GPU-Z uses the DB for some things, not everything. Clocks, for example, aren't read from the DB, that's for sure, since factory-OC'd cards show different default clocks from "stock" models, and there's no way every single factory-OC'd card could be added to the DB in time.
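
For what it's worth, clocks can be read straight from the driver rather than from any database. A minimal sketch using NVIDIA's NVML library (my choice for illustration; no claim that GPU-Z actually uses it, and it assumes a driver recent enough to ship NVML):

[code]
// Query current core/memory clocks from the driver via NVML.
// No vendor database involved; values come from the running driver.
#include <cstdio>
#include <nvml.h>

int main()
{
    nvmlDevice_t dev;
    unsigned int gfx_mhz = 0, mem_mhz = 0;

    if (nvmlInit() != NVML_SUCCESS)
        return 1;

    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &gfx_mhz);
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_MEM, &mem_mhz);
        std::printf("core: %u MHz, memory: %u MHz\n", gfx_mhz, mem_mhz);
    }

    nvmlShutdown();
    return 0;
}
[/code]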
 
GPU-Z also detects the number of active processing clusters - remember the HD 4830 and HD 6850 launches. ;)

But with Kepler, clock detection seems a bit hard and also driver-dependent: with 300.99, GPU-Z 0.5.9 showed a current clock of 1006MHz.
 
GPU-Z also detects the number of active processing clusters - remember the HD 4830 and HD 6850 launches. ;)

But with Kepler, clock detection seems a bit hard and also driver-dependent: with 300.99, GPU-Z 0.5.9 showed a current clock of 1006MHz.

He's using the bundled drivers that come with the card, so it should be OK, but nVidia might have anticipated party poopers and only sent reviewers the drivers that read clocks properly. :?:
 