NVIDIA Kepler speculation thread

From Fudzilla: "GTX 760 Ti, 770, 780 detailed."

According to them,

760 Ti = 670
770 ≈ 680; the cooler will help the 770 deliver over 5% higher performance than the 680
780 is a GK110 with a 256-bit bus.

I don't believe the bus width… :rolleyes: especially when memory clocks are already at 6 Gbps for the 670 and 680. The other stuff I can believe though.

Yeah, it'll definitely be 320-bit. If it were 256-bit, we'd have a case of the GTX 770 being faster than or equal to the GTX 780, very much like the GTX 460 vs. GTX 465 scenario.
 
The 680 was already showing very obvious signs of being bottlenecked by bandwidth. With 10% higher core clocks and 14% more shaders than the 670, it was only 7% faster at launch.
So "over" should be "at least."
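A quick back-of-the-envelope check in Python (my own arithmetic, using the launch numbers above and assuming perfect scaling as an upper bound):

```python
# If the 680 scaled perfectly with its extra clocks and shaders over the 670,
# we'd expect roughly the product of the two gains (an upper bound, of course).
clock_gain = 1.10   # ~10% higher core clock
shader_gain = 1.14  # ~14% more shaders

ideal = clock_gain * shader_gain
print(f"ideal scaling:      {ideal:.2f}x")  # ~1.25x
print("observed at launch:  1.07x")         # well short -- bandwidth is the likely limiter
```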
Ah, I guess I can't read numbers when they're in word form. GPU Boost 2.0 should be a good boost in performance and perf/watt as well. AnandTech revealed that NVIDIA's initial implementation of GPU Boost used very conservative voltages to ensure stability in even the worst of worst-case scenarios.
 
http://videocardz.com/41297/nvidia-geforce-gtx-770-pictured



GTX 770 with Titan cooler?
 
The "770" looks like a Paint job based on the 780 card, with the first '7' copied over the '8'.

And $400 seems quite unrealistic, especially with that cooler.
 
Increasing ppi on current 22-23 inch monitors is essential for any real improvement in image quality. Once ppi gets into the retina range, around and over 300 ppi on a 22 inch display, then I think I will feel better.

4K on 22 inch would be only about 200 ppi! But 8K would be 401 ppi; we are far from that, at least for now...
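For anyone who wants to check those figures, the ppi arithmetic is just the diagonal in pixels divided by the diagonal in inches; here's a quick Python sketch (my own, using the resolutions above):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"22in 1080p: {ppi(1920, 1080, 22):.0f} ppi")  # ~100
print(f"22in 4K:    {ppi(3840, 2160, 22):.0f} ppi")  # ~200
print(f"22in 8K:    {ppi(7680, 4320, 22):.0f} ppi")  # ~401
```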
 
I am not sure how relevant the ppi stuff is to the Kepler architecture, but wanted to weigh in on something I think has been cited a bit incorrectly.

The "retina range" of ppi depends on distance from the device. The "Retina display" term Apple came up with specifies approximately 1 arc minute of viewing angle per pixel at 12 inches (0.3048 meters). To achieve 1 arc minute per pixel at 12 inches, you need approximately 300 ppi.

For a 22" monitor, that puts the "retina range" for viewing at 2.5 feet (0.76 meters) at approximately 1080p resolution (assuming 20/20 vision, and avoiding the issue of pixel layout). So unless you sit closer than 2.5 feet, you are already pushing the limits of human vision at 1080p on most average computer monitors. A 22" 8k display at 2.5 feet is a colossal waste of money (and hence you will probably never see one).
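As a sketch of that arithmetic (my own Python, using the 1-arc-minute criterion above):

```python
import math

ONE_ARCMIN = math.radians(1 / 60)  # the ~20/20 acuity limit discussed above

def retina_ppi(distance_in: float) -> float:
    """ppi at which one pixel subtends 1 arc minute at the given viewing distance."""
    return 1.0 / (distance_in * math.tan(ONE_ARCMIN))

print(f"at 12 in: {retina_ppi(12):.0f} ppi")  # ~286, i.e. Apple's ~300 ppi claim
print(f"at 30 in: {retina_ppi(30):.0f} ppi")  # ~115, close to a 22in 1080p panel (~100 ppi)
```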

I think the real driver of higher resolutions is multi-monitor setups. Combining three 22" 1080p monitors (5760x1080) gives about 6.2 megapixels; combining three 27" 2560x1440 monitors gives over 11 megapixels. Those are the kinds of setups I think top-end video cards are trying to drive. For most of them, you currently need 2 or more cards. It would be interesting if they could get it down to 1 card.

I think the whole 300ppi thing is a case of a marketing strategy for one application (i.e. Apple's cell phones) being pushed into a realm where it doesn't really apply (computer monitors). I think AlphaWolf was politely hinting at this with his comment. Basically, a cell phone that will be held close to your face will require a much higher pixel density to get the same effect as a monitor several feet away.
 
Videocardz found a now-removed Chinese forum thread with a purported roadmap of some upcoming NVIDIA cards.

[Image: NVIDIA GeForce 700 Series roadmap]


(Titan is GK110-400.)

WhyCry (Videocardz) said:
As for the questions you might ask: no, I didn't make this chart, and no, I have not added anything besides the watermarks and some blurriness. I think this picture looks quite legit (considering how many fakes I've seen). The color scheme and the fonts are correct. However, as always, treat it with a grain of salt. Maybe this is indeed a real chart from NVIDIA, but it may be old, so for instance the launch dates could be different.
I wonder if the 780 will have a 384-bit bus or a 320-bit bus with a weird memory configuration.
 
I have not added anything besides the watermarks and some blurriness.

Why?
He's not the source, so why does he feel he has the right to watermark it? And why add blurriness? It seems like an odd thing to do.

I call BS, but I don't believe any of the other 700 series rumours anyway.
 
I've been under the impression that the watermark thing is a fairly common practice (although that doesn't make it right). Also the blurriness might be to hide a watermark from the forum thread (it's in a location one would expect a watermark to be).
 
If this and the leaked pictures of the 770/780 using the Titan-lookalike cooler are true, I wonder what it will do for Boost 2.0, which GK104 is capable of as well (with an older driver, I've seen the temp target at 94°C). Probably pretty high clocks on an unchanged ASIC?
 
The "retina range" of ppi depends on distance from the device.

Good points, but I have some remarks.
I sit at a comfortable distance from my 22 inch 1080p monitor, and sometimes my vision gets so used to it that the image looks ugly and grainy. I can see the pixels themselves from perhaps more than a full arm's length away. So yeah, 8K on 22 inch would be a good solution in the future.

Also, I don't have space, either on my desk or in my room in general, for three 22 inch monitors - it's a stupid and ridiculous waste of precious space...
 

You can check "The Image Processing Handbook, 5th Edition" by Russ, or any one of numerous neurological studies, but it is a measured scientific fact that human visual acuity tends to bottom out around 1 arc minute of viewing angle. There are physical constraints in the way our eyes are built that make this true. Note that this is not my opinion; you can go look up the studies yourself if you want. This is a pretty well-researched fact.

At 2 feet - which is far closer than most people sit to their monitors - that works out to a pixel density of about 143 ppi. An 8k monitor would have a pixel density of over 400. So most (if not all) human beings with 20/20 vision would see no improvement from an 8k monitor. I won't claim nobody would, but out of ~7 billion people in the world there might be 1 or 2, given the tolerances on how these things were measured.
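(Checking that against the same 1-arc-minute criterion, in Python:)

```python
import math

# Pixel density at which one pixel subtends 1 arc minute from 24 inches away.
print(1.0 / (24 * math.tan(math.radians(1 / 60))))  # ~143 ppi
```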

That brings us to a question of economics. While stranger things have happened, I find it highly unlikely that monitor manufacturers will ever build a monitor so far out of spec with what the human eye can actually see. It costs them a lot more for what ends up being a marketing point on a slide. People won't see the difference between two monitors sitting side by side (because they physically can't), so they will buy the cheaper monitor.

The only way I can see 8k monitors being built is the "Monster Cable" effect. Something where people buy it not because it is actually better, but because it is expensive and has fancy marketing attached. I just plain do not see the margins in that for monitor manufacturers. Basically, I wouldn't hold your breath.
 

"Check your algorithms in motion" (who said that? I forgot; it was in some slides about AA algorithms)

One arc minute is, per your references, adequate for a photo frame showing your grand kids and doing an alpha transition between (already AA'ed by the camera) photos every minute. It might not be so adequate with the typical street light in a racing game coming closer, going from 1 pixel width to 2 pixels width then back to 1 pixel width in the distance, then alternating between 2-3 pixels width, and so on, as it drifts to one side and closer. You will notice the change in widths, in motion, because you are, as you just quoted, sensitive to 1 arc minute. The relative differences are huge at the 1-2 pixel width range, and not so noticeable at the 40-41 pixel range.

AA helps with that, of course. So you can have GPU AA, but also "monitor AA". I will speculate and say that with 4xMSAA, as an approximation, you will see a 1-1.25 pixel "width strobe" at what was before the 1-2 pixel range. Or at worst, a 0.25-0.50 pixel, gentle, alpha-blended vibration. "Monitor AA" will apply a reduction factor to those values, and if your GPU can handle the multiplied resolution, why not use both?
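To make that "width strobe" concrete, here's a toy 1D Python sketch (my own construction, with ordered-grid supersampling standing in for MSAA coverage):

```python
import numpy as np

def rasterize(line_start: float, line_width: float,
              n_pixels: int = 8, samples: int = 1) -> np.ndarray:
    """Per-pixel coverage of the interval [line_start, line_start + line_width),
    estimated from `samples` evenly spaced sample points inside each pixel."""
    offsets = (np.arange(samples) + 0.5) / samples        # sub-pixel sample positions
    xs = np.arange(n_pixels)[:, None] + offsets[None, :]  # absolute sample coordinates
    hit = (xs >= line_start) & (xs < line_start + line_width)
    return hit.mean(axis=1)

# A 1.5-pixel-wide line drifting right in quarter-pixel steps:
for x0 in np.arange(2.0, 3.01, 0.25):
    w1 = rasterize(x0, 1.5, samples=1).sum()  # apparent width strobes between 1 and 2 px
    w4 = rasterize(x0, 1.5, samples=4).sum()  # holds at 1.5 px, in 0.25-px coverage steps
    print(f"x0={x0:.2f}  1 sample: {w1:.2f} px   4 samples: {w4:.2f} px")
```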

IMHO 8k sounds a bit excessive today, but I'd love to have 3x 4k monitors with added 4x MSAA.
 

Since this is a topic that interests me, if you want an in depth discussion we can move to another thread. It is probably best to let this get back on topic. I will say that I agree with most of what you said.

I still stand by my primary assertion, though: the major problem driving the push for more pixels right now is multi-monitor setups, not higher pixel density. I think there is other low-hanging fruit that could lead to larger improvements in the games we play. For example, I really like the concept of letting light scatter slightly beneath the surface of skin on models (subsurface scattering) to produce more realistic characters.

I also think we're going to see more of a push for APU-type chips for compute users. Kepler is proving that for me. With Fermi it was fairly easy to balance compute loads; I haven't had the same success with Kepler. Kepler seems more powerful, but it has been more difficult for me to write good code for. There are also many functions I have to run for which a chip with fast atomic memory operations onboard would be great (like histogramming results from Monte Carlo simulations).
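Since histogramming with atomics came up, here's a minimal sketch of the idea using Numba's CUDA bindings (my choice of library purely for illustration; the kernel name, bin layout, and launch shape are assumptions, not anyone's production code):

```python
import numpy as np
from numba import cuda

@cuda.jit
def histogram_kernel(samples, lo, inv_bin_width, hist):
    """Each thread bins a strided slice of the samples; atomics resolve the races."""
    start = cuda.grid(1)
    stride = cuda.gridsize(1)
    for i in range(start, samples.shape[0], stride):
        b = int((samples[i] - lo) * inv_bin_width)
        if b >= 0 and b < hist.shape[0]:
            cuda.atomic.add(hist, b, 1)  # atomic add to the global-memory histogram

# Usage: bin one million Monte Carlo samples on [0, 1) into 64 bins.
samples = cuda.to_device(np.random.random(1_000_000).astype(np.float32))
hist = cuda.to_device(np.zeros(64, dtype=np.uint32))
histogram_kernel[256, 256](samples, np.float32(0.0), np.float32(64.0), hist)
print(hist.copy_to_host().sum())  # 1000000
```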
 