NVIDIA Kepler speculation thread

The way shared memory and global memory are differentiated is different from CPUs and from GCN. I'm not certain from an ISA standpoint whether that concept will persist. Bonaire's level adds a flat addressing mode that tries to handle regular and LDS data as part of a single space, although there are hazards the software must watch out for.

NVIDIA GPUs have been exposing a flat address space for generic pointers since Fermi. The same pointer can be used to point to global memory, shared memory, or thread local memory.

Optionally, you can make the pointers explicit and gain a performance advantage in certain situations.
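A minimal CUDA sketch of what that looks like in practice (illustrative only, the names are mine): the same device function takes a plain pointer that may point into global or shared memory, and the generic load resolves the address space at run time. When the compiler can prove which space a pointer refers to, it can emit the cheaper space-specific loads instead, which is where the performance advantage comes from.

Code:
// Illustrative sketch: one device function, callable with pointers into
// either global or shared memory. The function names are hypothetical.
__device__ float sum4(const float *p)        // generic pointer
{
    return p[0] + p[1] + p[2] + p[3];
}

__global__ void reduceExample(const float *in, float *out)
{
    __shared__ float tile[4];                 // on-chip shared memory
    int t = threadIdx.x;

    if (t < 4)
        tile[t] = in[t];                      // stage data into shared memory
    __syncthreads();

    if (t == 0) {
        out[0] = sum4(in);                    // pointer into global memory
        out[1] = sum4(tile);                  // pointer into shared memory
    }
}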
 
Which titles are "heavy compute driven"?
Dirt Showdown?

Also there's a lot of talk about GK110 being "gaming focussed" but I thought GK110 included all the extra compute capability that was lacking in GK104. So it's pretty similar to Tahiti in that respect isn't it?
GK110 wasn't gaming focused, since it supports ECC and a faster DP rate than GK104. I don't count new features like dynamic parallelism, as I'm pretty sure those were desired for GK104 too but weren't going to be ready in time.
 
It is very simple: Tahiti is bigger, more power hungry, and offers much lower margins overall compared to GK104. Tahiti is doing what R600 did against the 8800 GT, with the GTX 680 in the 8800 GT role, while Titan is the new 8800 Ultra.
History repeats itself.

And yeah, architectures obviously have their weaknesses and strengths, but AMD arranged things in such a way that Nvidia, with the smaller chip, can achieve the same performance and gain noticeably higher margins...

Then go and run some GPGPU workloads on the GK104. Oh wait, suddenly it's slower than Fermi or Northern Islands :rolleyes:

So then why haven't they?

It's been over a year since GK104 was released, and 28nm is now a mature process, yet no magical GK104-class Pitcairn has been seen.

Seems more like AMD knows reasons why it cannot or should not be done.

Because Tahiti's yields and performance are good enough that there's no need to make a completely new chip to compete with your own product in the gaming market?
 
NVIDIA GPUs have been exposing a flat address space for generic pointers since Fermi. The same pointer can be used to point to global memory, shared memory, or thread local memory.

Optionally, you can make the pointers explicit and gain a performance advantage in certain situations.
I forgot about how that is exposed to software.
I was thinking more about how the configurable L1/shared memory pool differs from CPUs, which put it all into the general cache hierarchy, and from GCN's physically separate pool.

I muddled things up by mentioning Bonaire's flat addressing scheme, aside from my impression that it has to be an intermediate step to something less awkward.
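As an aside on that configurable pool: on Fermi and Kepler the L1/shared split is chosen through the CUDA runtime, either device-wide or per kernel, and the runtime treats the setting as a hint. A minimal sketch (kernel name is hypothetical, error checking omitted):

Code:
#include <cuda_runtime.h>

__global__ void myKernel(float *data)
{
    // ... kernel body omitted ...
}

int main()
{
    // Device-wide preference: give the larger partition to shared memory.
    // Which splits are actually available depends on the architecture.
    cudaDeviceSetCacheConfig(cudaFuncCachePreferShared);

    // Per-kernel override for a kernel that benefits more from L1.
    cudaFuncSetCacheConfig(myKernel, cudaFuncCachePreferL1);

    myKernel<<<1, 32>>>(nullptr);
    cudaDeviceSynchronize();
    return 0;
}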
 
A title where AMD was heavily involved in making the engine is hardly conclusive proof. Next ;)
There are tons of other games that use DirectCompute, like Sleeping Dogs, Tomb Raider, and Metro 2033, where the 7870 LE and 680 are tied.

AMD was no more heavily involved in Dirt Showdown than Nvidia was in Metro 2033. Just because one IHV doesn't do as well in a game title (see AMD hardware in Civ 5) doesn't mean that it isn't relevant.

Regards,
SB
 
(Expecting an "I'm right no matter what you say. Period." answer)

If I say that I am not satisfied with AMD and in particular with the underwhelming Tahiti performance, then there is an obvious reason.

And I think it is time for me to switch camps and punish AMD: my next card should definitely be an NVIDIA GeForce.
 
AMD was no more heavily involved in Dirt Showdown than Nvidia was in Metro 2033. Just because one IHV doesn't do as well in a game title (see AMD hardware in Civ 5) doesn't mean that it isn't relevant.

Regards,
SB

Proof? GPU PhysX aside, how was Nvidia involved in Metro 2033? Did they develop the renderer?
In the grand scheme of things it is irrelevant. If other engines were to show the same characteristics, okay. But if it's just the one...
 
If I say that I am not satisfied with AMD and in particular with the underwhelming Tahiti performance, then there is an obvious reason.

And I think it is time for me to switch camps and punish AMD: my next card should definitely be an NVIDIA GeForce.

I have a 7950; before that I owned a GTX 580. Compared to Nvidia, AMD's driver support is the really important issue. I read this on forums all over the web, and fanboys aside, almost every AMD card owner reports problems with AMD's support (stuttering, black screens, flickering, issues even in 2D such as web browsing, etc.). In my direct experience these kinds of problems are much rarer with Nvidia (and I also love features like adaptive vsync), so my next video card will definitely be an Nvidia. I don't care if it costs €80-100 more than a comparable AMD card, or if the AMD one comes with ten triple-A games in the bundle, because it will be money well spent. (Again, excuse me if my English is not perfect; I hope you understand what I mean.)
 
This usually scales with CUs.

So, simply calculating the die area for one CU and dividing any imaginary available space by this number won't lead anywhere, since there are other units that need to be replicated as well. Is that what we're both saying?
 
Proof? GPU PhysX aside, how was Nvidia involved in Metro 2033? Did they develop the renderer?
In the grand scheme of things it is irrelevant. If other engines were to show the same characteristics, okay. But if it's just the one...

Proof? Sorry mate, the burden of proof is on you for making the claim that AMD went to extraordinary lengths to make Dirt Showdown perform better on AMD hardware than Nvidia hardware. You are the one making the claim, not I.

Hell, it isn't like it's me going around saying Nvidia did something underhanded in Civ5 to make it perform vastly better on that game (which has significant compute), way better than comparisons in other games at the time would have warranted.

Just like Nvidia doesn't "develop" the engines for their partners, neither does AMD. Just like some AMD partnered titles where they were involved perform better on Nvidia hardware, there are also Nvidia partnered titles where they were heavily involved that perform better on AMD hardware. Well, except for a brief period of time when Nvidia possibly had partners remove some performance enhancing features (DX10.1) because their cards didn't support them.

Regards,
SB
 
Proof? Sorry mate, the burden of proof is on you for making the claim that AMD went to extraordinary lengths to make Dirt Showdown perform better on AMD hardware than Nvidia hardware. You are the one making the claim, not I.

Hell, it isn't like it's me going around saying Nvidia did something underhanded in Civ5 to make it perform vastly better on that game (which has significant compute), way better than comparisons in other games at the time would have warranted.

Just like Nvidia doesn't "develop" the engines for their partners, neither does AMD. Just like some AMD partnered titles where they were involved perform better on Nvidia hardware, there are also Nvidia partnered titles where they were heavily involved that perform better on AMD hardware. Well, except for a brief period of time when Nvidia possibly had partners remove some performance enhancing features (DX10.1) because their cards didn't support them.

Regards,
SB

I didn't claim that. I just said it is a bad example, since one of only two IHVs was directly involved, to a larger extent than I've ever read about before.
You can read up on AMDs involvement here:
http://blogs.amd.com/play/2012/06/20/dirt-showdown-on-gcn/

Can you show something similar for Metro 2033 as you claim? A healthy dose of skepticism is better than being naive.
 
Pleased to see the impact of Gaming Evolved and Never Settle; it highlights that AMD are investing more in developer relations and in bringing games forward than NVIDIA are. :)
 
This is the Nvidia Kepler thread, not an AMD thread. Seriously, the mods here don't do their job of cleaning up threads when they're filled with AMD fanboys.
 
One needs to own both a GeForce and a Radeon to always be able to have the best game experience.
 
Proof? Sorry mate, the burden of proof is on you for making the claim that AMD went to extraordinary lengths to make Dirt Showdown perform better on AMD hardware than Nvidia hardware. You are the one making the claim, not I.
FWIW (and not that I want to join the discussion about who was involved more with whom...), AMD has been very openly advertising its involvement in the Showdown engine.
http://blogs.amd.com/play/2012/06/20/dirt-showdown-on-gcn/

"Optimizing for Graphics Core Next

We like to say that Graphics Core Next was built for ultra settings, and these results bear that out. But it’s not just the hardware, as we’ve been working very closely with Codemasters to optimize the DiRT Showdown engine for GCN."

And a little less explicitly here:
http://blogs.amd.com/play/2012/05/31/race-hard-party-hard-with-dirt-showdown/

There, it's only mentioned how similar Dirt's lighting engine is to the one in AMD's own Leo demo.
 
GameSpy wrote: "Dirt: Showdown delivers bargain-basement entertainment value for the high, high price of $50. With its neutered physics, limited driving venues, clunky multiplayer, and diminished off-road racing options, discerning arcade racing fans should just write this one off as an unanticipated pothole in Codemaster's trailblazing Dirt series."

PC Gamer wrote: "Dirt: Showdown provides thrills while it lasts, but afterwards you're left wanting the deeper experience of its parents".


http://en.wikipedia.org/wiki/Dirt:_Showdown
 