NVIDIA Kepler speculation thread

I myself just cannot fathom buying a card with under 4GB of GDDR5 at this point. If these rumours turn out to be true, then Nvidia have once again gone the cheap route on memory for their high-end cards (excluding Titan).
I didn't do an exhaustive investigation, but only went to Newegg to search for '7970 6GB', '7870 4GB', 'GTX 680 4GB', and 'GTX 670 4GB'. I got 1 Sapphire 7970 out of a total of 85 7970s ($600), 9 GTX 680s (starting at $520) out of a total of 41, and 6 GTX 670s (starting at $420).

How do you conclude a 'once again' from this data?

Are you saying that this Sapphire is the only AMD GPU you can fathom buying?

How many other potential buyers do you expect will impose the same constraints? How many game developers will treat this as a consideration for the foreseeable future?

It's kinda funny: I remember GPU vendors being ridiculed for putting too much RAM on cards to trick customers into thinking it would help with performance.
 
How do you conclude a 'once again' from this data?
I'm not even sure what this has to do with anything, as I was talking about reference cards.

Are you saying that this Sapphire is the only AMD GPU you can fathom buying?
Did I say that?

How many other potential buyers do you expect will impose the same constraints? How many game developers will treat this as a consideration for the foreseeable future?
Why don't you ask them?


The reason I say Nvidia went the cheap route is that they use the same 2GB of RAM on the 650 Ti Boost and GTX 660 as they use on the GTX 680. AMD at least distinguishes the 7900 series from the others by giving it an additional 1GB of RAM.

In addition to that, 4Gb GDDR5 chips have become available; not using them for the high-end cards is inexcusable (for the 700 series).
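
For a rough sense of what 4Gb chips change, the arithmetic is simple: one 32-bit GDDR5 chip per 32 bits of bus width, so capacity is (bus width / 32) × chip density, doubled in clamshell mode. A quick sketch of the math (my own illustration, not anything from Nvidia's board specs):

```python
# Back-of-the-envelope GDDR5 capacity math: one 32-bit chip per 32 bits
# of bus width in a standard layout; clamshell mode doubles the chip count.

def card_capacity_gb(bus_width_bits, chip_density_gbit, clamshell=False):
    """Total memory in GB for a given bus width and chip density."""
    chips = bus_width_bits // 32 * (2 if clamshell else 1)
    return chips * chip_density_gbit / 8  # 8 Gbit = 1 GB

print(card_capacity_gb(256, 2))                  # GTX 680: 2.0 GB
print(card_capacity_gb(256, 4))                  # same bus, 4Gb chips: 4.0 GB
print(card_capacity_gb(384, 2))                  # rumoured 780: 3.0 GB
print(card_capacity_gb(384, 4))                  # 780 with 4Gb chips: 6.0 GB
print(card_capacity_gb(384, 2, clamshell=True))  # Titan's 6.0 GB from 2Gb chips
```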

Combine that with the new consoles being right around the corner, and it makes having a 2 or 3GB high-end card look short-sighted.
 
It's kinda funny: I remember GPU vendors being ridiculed for putting too much RAM on cards to trick customers into thinking it would help with performance.
Well, there's of course a difference between putting 4GB on a GT 630 (there are a few of these) and putting the same amount on a high-end card. Still, if you think you need 6GB, you deserve to pay $1k for the Titan, imho. And unless Nvidia restricts the OEMs from putting more memory on it, you should still be able to get a 6GB 780; it will just be more expensive, as it should be.
Granted, you could argue that games might benefit from more graphics memory in the future, mostly because the next-gen consoles have 8GB, but I wouldn't bet on 3GB becoming insufficient anytime soon. Well, maybe if you use SLI at some ludicrous HD resolution.
For those who want more memory for "future proofing", just get the more expensive versions. Most people buying these will probably upgrade to a faster card in a year anyway, so there's no reason they should pay more for no gain.
 
I'm not even sure what this has to do with anything, as I was talking about reference cards.
Can I buy a reference card on their website?

Did I say that?
You're complaining about 3GB and not being able to 'fathom' going below 4GB, when you could very well argue that 4GB is as pointless as 2GB on a GTX 650 Ti Boost... If it's almost impossible to buy a 4GB+ card from one very popular GPU vendor, you could also argue that, dare I say it, you're trying to find arguments to support a foregone conclusion.

The reason I say Nvidia went the cheap route is that they use the same 2GB of RAM on the 650 Ti Boost as they use on the GTX 680. AMD at least distinguishes the 7900 series from the others by giving it an additional 1GB of RAM.
Maybe Nvidia thinks it's unfair to deprive 650 owners of their 2GB? And so they throw it at them even if it will never serve a purpose? ;)

Look: if you really think 4GB is a sine qua non when buying a GPU today, fine. I may not agree, and neither do the AMD vendors on Newegg, but who knows which game released in the coming two years will absolutely require it.

Just cut the indignant 'being cheap' BS when the facts in the stores point to just the opposite. If the demand is there, plenty of board makers will up the total to 6GB; don't worry.
 
Perhaps I was exaggerating a bit concerning the 600 series, but if Nvidia offers the 770 with 2GB as standard, then that is them being cheap.

This is a new generation of cards, arriving right before new consoles and just after 4Gb RAM chips became available that allow them to double memory sizes; having the same amount of RAM as the previous generation (hell, AMD had 2GB on the 6900 series) just isn't good enough for their premium cards.

This upcoming generation of cards is the time for them to upgrade RAM densities; if not now, then it's going to be another year or more.

Going by the rumours, doubling the 770's RAM seems like something they could do, but because the 780 has 3GB they wouldn't do it without doubling the 780's RAM as well, and 6GB seems very unlikely.

I can only hope the rumours are false.
 
I specifically got a 3GB 660 Ti for that reason, but still, I am afraid it won't be enough!! :(

Killzone's PS4 demo consumed 3.0GB of PS4 memory just for graphics!! It consumed another 1.5GB for the rest of the data, for a crazy aggregate of 4.5GB!

http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall

That was before they had the 8GB available to them.

Also, I think people are not understanding me correctly; I'm not saying that 2GB/3GB cards won't be able to play the games. I'm asking why you would want to pay $400+ for a card that can probably only play the games at the same res/AA as the console version. Am I not understanding why people buy high-end cards? I could have sworn it was to play the games with superior settings (not just framerate).
 
I'm asking why you would want to pay $400+ for a card that can probably only play the games at the same res/AA as the console version.

The amount of memory in a console and the amount of memory on a video card are not directly comparable like that. There is no reason to assume that a computer system with 2GB of dedicated GDDR5 video memory and 16GB of DDR3 system memory will perform worse, or have the same limitations, as a console with 8GB of combined GDDR5 memory.

Don't get me wrong - I like the extra memory for compute purposes. But then again, that is probably why the card Nvidia aimed at people like me has 6GB (the Titan). From the game perspective, I think the major reason you will be playing games that look a lot like they do on consoles is that it doesn't make a lot of sense to generate costly assets for each platform.
 
I don't know exactly how engines or drivers deal with this, but needing 3.5GB for graphics doesn't mean all of it needs to be stored in the fastest kind of memory at all times. AFAIK drivers swap textures between GPU memory and main CPU DRAM all the time, except for those explicitly tagged by the engine as needing to be in GPU memory. (How common is this?)

While this may result in an occasional slowdown due to swapping here or there, I don't think it will have a major impact on the way engines are written, since all of that should be handled transparently by the driver.
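
To make that concrete, here's a toy model of the kind of residency management I mean: an LRU pool of textures in VRAM, with evictions spilling over to system RAM. This is purely my own illustration of the idea; real driver heuristics are far more sophisticated:

```python
from collections import OrderedDict

class TextureResidency:
    """Toy LRU model of driver-managed texture residency (illustration only)."""

    def __init__(self, vram_budget_mb):
        self.vram_budget = vram_budget_mb
        self.in_vram = OrderedDict()  # texture id -> size in MB, LRU order
        self.in_sysram = {}           # evicted textures wait here
        self.used = 0

    def touch(self, tex_id, size_mb):
        """Mark a texture as used this frame, uploading it to VRAM if needed."""
        if tex_id in self.in_vram:
            self.in_vram.move_to_end(tex_id)  # refresh LRU position
            return
        self.in_sysram.pop(tex_id, None)
        # Evict least-recently-used textures until the new one fits.
        while self.used + size_mb > self.vram_budget and self.in_vram:
            victim, victim_size = self.in_vram.popitem(last=False)
            self.in_sysram[victim] = victim_size
            self.used -= victim_size
        self.in_vram[tex_id] = size_mb
        self.used += size_mb

mgr = TextureResidency(vram_budget_mb=2048)
mgr.touch("rock_albedo", 64)   # uploaded to VRAM
mgr.touch("sky_cubemap", 128)  # uploaded; older textures evicted only if full
```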
 
I don't know exactly how engines or drivers deal with this, but needing 3.5GB for graphics doesn't mean all of it needs to be stored in the fastest kind of memory at all times. AFAIK drivers swap textures between GPU memory and main CPU DRAM all the time, except for those explicitly tagged by the engine as needing to be in GPU memory. (How common is this?)

While this may result in an occasional slowdown due to swapping here or there, I don't think it will have a major impact on the way engines are written, since all of that should be handled transparently by the driver.

This should be explained by the company in a user-friendly style and should reach all customers. In no case should it be discussed only here.
 
This should be explained by the company in a user-friendly style and should reach all customers. In no case should it be discussed only here.
I have no idea what you're talking about. GPU companies should explain how driver memory management works to their non-tech customers? What possible upside would this have?
 
I have no idea what you're talking about. GPU companies should explain how driver memory management works to their non-tech customers? What possible upside would this have?

I think he has a point. They obviously don't want to go into the details of memory management, but something that explains why the "3GB" they so proudly plaster on the box of their highest-end $500 GPU is not worse than the "8GB" Sony and MS will be advertising for their new consoles, which will sell for a similar price, could be a good idea.

Joe average could easily look at those numbers and conclude the consoles are "more than twice as fast!"

Personally, I'm still not convinced 3GB will be enough to always retain parity, let alone superiority. Or 4GB, for that matter. Until someone like ERP, 3dilittante, Gubbi, sebbbi, etc. explains why it would be enough, and specifically how they think system RAM will make up for the deficit in the real world (not just what is possible in theory with dedicated developers), I'm going to remain dubious.
 
I don't know exactly how engines or drivers deal with this, but needing 3.5GB for graphics doesn't mean all of it needs to be stored in the fastest kind of memory at all times. AFAIK drivers swap textures between GPU memory and main CPU DRAM all the time, except for those explicitly tagged by the engine as needing to be in GPU memory. (How common is this?)

While this may result in an occasional slowdown due to swapping here or there, I don't think it will have a major impact on the way engines are written, since all of that should be handled transparently by the driver.

You want to take a look at the streaming vs. non-streaming pools there. Additionally, the render target size and breakdown are interesting, given that they are targeting 1080p output and no MSAA has been applied yet.
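
For a sense of scale on the render targets: at 1080p, each 32-bit target is roughly 8MB, and MSAA multiplies that. A quick calculation using an assumed four-target G-buffer plus depth (my guess at a layout, not Guerrilla's documented setup):

```python
# Render-target size at 1080p. The G-buffer layout below (four 4-byte colour
# targets plus a 4-byte depth/stencil) is an assumption for illustration,
# not Killzone's actual configuration.

WIDTH, HEIGHT = 1920, 1080

def target_mb(bytes_per_pixel, msaa=1):
    return WIDTH * HEIGHT * bytes_per_pixel * msaa / 2**20

gbuffer_no_msaa = 4 * target_mb(4) + target_mb(4)
gbuffer_4x_msaa = 4 * target_mb(4, msaa=4) + target_mb(4, msaa=4)
print(f"no MSAA: {gbuffer_no_msaa:.1f} MB")   # ~39.6 MB
print(f"4x MSAA: {gbuffer_4x_msaa:.1f} MB")   # ~158.2 MB
```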
 
Joe average could easily look at those numbers and conclude the consoles are "more than twice as fast!"
They could also look at their 8GB of system memory and say the additional 2GB of video memory gives them 10GB. That may have a little grain of truth to it.

The demo hasn't been fully optimized, for one thing, but there are a few questions I do have.
One: does the demarcation of system, shared, and GPU memory mean this isn't a fully HSA solution, or is it just a choice of graph labelling?
This could inflate the graphics side if there needs to be a pinning scheme for the graphics memory allocation. Similarly, the developers may have opted to play it safe and not try to do too much juggling of immediately needed and not-needed data, which could inflate the numbers.

Second, how much of the video memory part of that graph is just them saying "this data is used by the GPU" versus "this data is being heavily used by the GPU"?
For a PC with decent system memory, things like the streaming pool would be significantly smaller and possibly subsumed by a larger system memory buffer. The sizing would be based in part on the fact that the PS4's GDDR5 is the one-stop shop for everything before streaming from slower physical storage.

It could be a matter of convenience and an artifact of the common memory pool. A PC could load tons of graphics data into system memory that isn't being used immediately, but on the PS4 that would still show in the graph as GPU memory. The non-streaming pool may have subcomponents of both heavily used data and graphics-related data that can be managed by the system.
 
They could also look at their 8GB of system memory and say the additional 2GB of video memory gives them 10GB. That may have a little grain of truth to it.

This is where my thinking was going with the GPU vendors. I.e. put a simple bar chart on the back of the box which adds their "3GB" to the "typical" 8GB system memory of your average gaming PC to give an "awesome" total of "11GB", thus making the GPU appear better to Joe Average.

As you say, the truth of that is quite limited, but it's one option they have to avoid looking subpar without going into the intricacies of memory management.
 
What the hell is all this conversation about "ZOMG 8GB CONSOLE WILL LOOK BETTER TO JOE?"

Is the system RAM going to be printed on the exterior of the box? At what other point in the history of consoles have the manufacturers called out the total quantity of RAM, the speed and capability of the various processors, or even the spindle speed of the hard drive inside? Here's a hint: NEVER.

Consoles are appliances; the only difference is the amount of storage space for saving your games and downloaded content. All other computing factors (RAMs, jiggahurtzes, turdbo metabits and doohickey whack-jobbery) are meaningless, as none of that will change during the lifespan of the unit without negatively affecting compatibility with future titles.

The only people who will know about this are the people who care to dig. And if they're so dumb as to attribute ZOMG 8GB OF RAM to somehow being better than a 3GB 7970GE or 6GB TITAN, then they're too damned dumb to care about the actual technical differences anyway.
 
What the hell is all this conversation about "ZOMG 8GB CONSOLE WILL LOOK BETTER TO JOE?"

Is the system RAM going to be printed on the exterior of the box? At what other point in the history of consoles have the manufacturers called out the total quantity of RAM, the speed and capability of the various processors, or even the spindle speed of the hard drive inside? Here's a hint: NEVER.

Consoles are appliances; the only difference is the amount of storage space for saving your games and downloaded content. All other computing factors (RAMs, jiggahurtzes, turdbo metabits and doohickey whack-jobbery) are meaningless, as none of that will change during the lifespan of the unit without negatively affecting compatibility with future titles.

The only people who will know about this are the people who care to dig. And if they're so dumb as to attribute ZOMG 8GB OF RAM to somehow being better than a 3GB 7970GE or 6GB TITAN, then they're too damned dumb to care about the actual technical differences anyway.

There'll be plenty of forum-goers who (rightly or wrongly) see any 3GB GPU as being unable to remain competitive with the 8GB consoles over the long term. I'm sure they'll be more than happy to spread that message among their non-forum-going friends. If you doubt that, try starting a thread on NeoGAF with a title along the lines of "3GB is more than enough to compete with the next-generation consoles" and see what kind of response you get.

I'm not saying they're right, but if 3GB really is enough, then that's something the GPU vendors may want to make at least some basic attempt to educate the masses about.
 