Is Nvidia deliberately downgrading Kepler performance in favour of Maxwell?

Fermi's WDDM 2.0 drivers are still pending, right?

They will be released around Christmas... (or maybe a bit earlier; I think that's just their deadline for it, since DX12 games should be out around that time)...

I really don't see why they couldn't release it already... other than maybe getting some people to upgrade anyway.

And let's be honest, I'm not sure what the performance of an old Fermi GPU like a 460-480 would be in AAA games at this point, DX12 or not.
 
Fermi cards pretty much all have <2GB on them (most have 1-1.5GB) so their relevance in the DX12 era is questionable IMO. 2GB is somewhat of a practical minimum requirement even now. Still it's nice that they'll be supported.
 
Here is something, unfortunately only with 3DMark, but there really doesn't seem to be any purposeful degradation of performance.

http://www.bytemedev.com/the-gtx-780-ti-sli-end-of-life-driver-performance-analysis/


I'm not sure the problem was ever about 3DMark performance... And the conclusions speak for themselves, since they just explain why there could be a degradation of performance in games (or at least less-than-optimal optimization of them).

Funnily enough, I'm pretty sure that if we compared those 3DMark numbers with game benchmarks, they could even be used to show that it could well be a deliberate choice in the games used by reviewers... since the driver performance gains in games seem to be the inverse of what you see there.

Personally I think it is pretty normal that Nvidia optimizes its drivers and tunes performance for its latest architecture first; what company wouldn't? Now, if they are doing it purely from a marketing point of view, that's another story, but that's impossible to verify, to be honest...

But what really bugs me is seeing him take the fact that consoles use AMD GCN GPUs and offer it as an explanation, with this (and really, that's the best part of the article):

All The Console Ports Have Been Virtually Terrible
- Batman: Arkham Knight – Sales have been stopped, it was so bad.
- Assassin’s Creed Unity – Appalling and terrifying release.
- Watch_Dogs – Hatefully now known as Patch_Dogs.
- VRAM requirements have tripled over night, with no perceivable increase in image quality.
- The architecture in the consoles' GPUs is GCN. Games are being designed from the ground up to work well on AMD hardware.

This is the most hilarious thing I have read in the last six months: is it really suggesting that those games run badly on Kepler because they are optimized for AMD GCN GPUs, or because they were console ports?

Every game he lists is a GameWorks title... all developed in tight collaboration with NVIDIA, with a lot of money, code and time injected into them by Nvidia... and not one of those games has run well on an AMD-GPU PC system...

"Vram requirements have tripled over night.. " huuum, i dont think AMD GPU's have tripled their Vram hardware, but strangely, i know an other company who have release a gpu with 6 and 12GB of Vram...
 
He will be testing more games shortly...

Now, to speak to driver optimizations for the latest hardware first: I agree that would be the most pertinent point.

Talking about console programming: low-level programming doesn't translate at all to high-level APIs in the PC environment, so when moving from console to PC it has to be handled with care, and QA time is pretty much doubled if it's an actual port. It's always easier to make the PC version and the console version at the same time.

GameWorks, as integrated as it is, usually doesn't need to reach into base engine programming or game programming to be added, so no, it's not GameWorks that broke those games. And if those GameWorks features are turned off, performance should come back up, more so on AMD hardware, and of course performance should increase on all hardware.

How many games have we seen in the past with poor performance when they were ported from a console to PC? This seems to be a fairly common occurrence throughout every console generation.

Publishers don't really care about PC versions of games as much as the console versions, so if the developer is working on a port, rest assured the publisher will push them to get it out as soon as possible while cutting corners and spending as little money as possible.
 
skymtl is a well-known Nvidia fanboy; his Titan Z review had it faster than the 295X2 while it was the opposite elsewhere. He has been frequently criticised on other forums for such reviews.

Anyway, the results, if true, actually show that there was crippling going on, because he has the 780 Ti running faster than the 290X in games like Shadow of Mordor and Hitman: Absolution, whereas the 290X is well ahead in them at other review sites, perhaps with older drivers. He himself mentions the 700 series being unoptimized for games like GTA V when it first came out, and includes gems like this:

Unlike Dragon Age which is an AMD-centric title, Dying Light was a showcase for a few of NVIDIA’s GameWorks technologies. Here, the GTX 980 surges ahead of the R9 290X and GTX 780 Ti. While the GTX 780 Ti’s performance was more than adequate, its distance from a Maxwell-based card that it should be competing against has grown. However there is no indication of performance crippling or anything else on NVIDIA’s part.

Truth be told, I can’t put a reason to this one given the modest 2.8GB worth of framebuffer our settings require. It could be that the GTX 980’s core architecture is a bit more efficient at processing the draw calls in this game.

Hmmm, what could it be?? :???::???::???:

His SoM bench,

http://images.hardwarecanucks.com/image//skymtl/GPU/GTX-780-TI-R9-290/GTX-780-TI-R9-290-37.jpg

vs.

Anandtech's

http://images.anandtech.com/graphs/graph9390/75458.png

and Techpowerup's,

https://tpucdn.com/reviews/AMD/R9_Fury_X/images/som_2560_1440.gif

From a >10% advantage in favor of 290X over 780Ti, he's managed to finagle a ~5% improvement in favor of 780Ti. Truly outstanding! Maybe it's the 1600p resolution which AMD aren't optimizing for, who knows?? :p
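To put rough numbers on that swing (treating ~10% and ~5% as the figures in play, so this is just a back-of-the-envelope check, not anything taken from the linked charts):

290X / 780 Ti elsewhere ≈ 1.10
290X / 780 Ti at HC ≈ 1 / 1.05 ≈ 0.95
ratio between the two results ≈ 1.10 / 0.95 ≈ 1.16

In other words, roughly a 15-16% relative discrepancy between sites for the same pair of cards.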
 
Yeah, years ago HC was a really good review site... but I don't know, over the last few years something has happened... his results are completely strange. I know him from XtremeSystems... and he used to be a good, credible source, but not anymore... I mean, just look at the review and forum dates; it looks like the site is slowly dying.
 
Link is broken

Working here, but seriously, their conclusion about the 290X is the exact inverse of everything we have seen at other sites: at HC the 290X is still 6% slower than the 780 Ti (it was 9% before), and the 980 is now 7% faster (something like that). I don't know how to reconcile that with all the reviews where the 290X is now clearly faster (by +10% and more).
 

Yep, now it works, thanks
 