(I wish future GPU architectures also stayed on 28nm: this is so much more interesting!)
Mmmmm... it seems those benchmarks aren't correct, perhaps fake or placeholders pending the final publication.
How so?
The improvement should be mostly because of the caching, considering the compute deficit of the GTX 980 compared to the 780 Ti. Now if only NV could make up their mind and polish their OpenCL runtime.

Very, very nice. If the Luxmark scores are legit, then that's a major comeback.
The data for the other video cards (both nvidia and amd) doesn't match reality. videocardz has also deleted the news post.
That's sad if true. Expreview usually posts legit numbers.
The improvement should be mostly because of the caching, considering the compute deficit of the GTX 980 compared to the 780 Ti. Now if only NV could make up their mind and polish their OpenCL runtime.
How is that different in Maxwell, since both architectures are dual-issue?

That deficit is very much application-specific. The 780 Ti requires much more ILP to hit peak utilization. In workloads with few opportunities to dual-issue math instructions, the 980 has a ~40% advantage.
How is that different in Maxwell, since both architectures are dual-issue?
I'm interested, though, to know about FreeSync support, or in other words full DisplayPort 1.2a support.
For those asking what happened to the Expreview review: I was asked by them to take it down. But fear not, another review just got leaked, with even more charts, slides, and even pictures of unreleased cards. Enjoy.
http://videocardz.com/52552/nvidia-geforce-gtx-980-and-gtx-970-press-slides-pictures-charts
How is G-Sync technically different anyway? Wouldn't surprise me if nvidia could enable FreeSync with a driver update. (But of course they won't for now, no reason to alienate their monitor manufacturing partners.)

Tom Peterson's talk on G-Sync back on August 22 (youtube/pcper) was pretty explicit about having "no plans to support that [1.2a optional extensions / adaptive sync]" (51:37).
"shader: raster ordered view"
Can you elaborate more (or quote exactly what they said)?
How is G-Sync technically different anyway? Wouldn't surprise me if nvidia could enable FreeSync with a driver update. (But of course they won't for now, no reason to alienate their monitor manufacturing partners.)