Nvidia BigK GK110 Kepler Speculation Thread

The primitive setup units are dedicated blocks. And most likely NV doubled the scan-out rate, from 8 to 16 fragments per setup unit, to go with the vastly increased processing rate.

12 would be a better fit, despite not being a power of two, though even that would still be overbuilt for ROP throughput.
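Quick back-of-the-envelope check, assuming the rumored GK110 layout of 5 GPCs (one setup/raster unit each) and 48 ROPs; both counts are speculation on my part, not confirmed specs:

```c
/* Scan conversion rate vs. ROP write rate for a hypothetical 5-GPC,
 * 48-ROP GK110. Both unit counts are assumptions, not confirmed specs. */
#include <stdio.h>

int main(void) {
    const int raster_units = 5;   /* assumed: one setup/raster unit per GPC */
    const int rop_count    = 48;  /* assumed ROP count for a 384-bit part   */

    for (int px = 8; px <= 16; px += 4) {
        int scan_rate = raster_units * px;  /* pixels rasterized per clock */
        printf("%2d px/setup unit -> %2d px/clk scanned vs %d px/clk ROP\n",
               px, scan_rate, rop_count);
    }
    return 0;
}
```

At 8 px per unit the rasterizers (40/clk) trail the ROPs; at 12 they already exceed 48/clk, and 16 just widens the gap.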
 
Is there any real utility to more fragments per setup unit, given that fillrate is rather unimportant nowadays and that we're moving toward highly tessellated, tiny triangles?
 
Won't it be bandwidth bound with almost double the shaders but only a 50% wider bus, or will the larger L2 make up for it? I have read people here say the 680 already has memory bandwidth problems, despite being better balanced than this monster.
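Some rough numbers, for what they're worth. The GTX 680 figures are public; the GK110 core clock and memory speed below are pure guesses, just to illustrate the bytes-per-FLOP ratio:

```c
/* Bytes-per-FLOP comparison. GTX 680 numbers are public; the GK110
 * clock and memory speed are guesses (the chip isn't out yet). */
#include <stdio.h>

static double gflops(int alus, double core_mhz) {
    return alus * 2.0 * core_mhz / 1000.0;      /* 2 FLOPs/ALU/clk (FMA) */
}

static double gbytes_s(int bus_bits, double gbps_per_pin) {
    return bus_bits / 8.0 * gbps_per_pin;       /* GB/s */
}

int main(void) {
    double f680 = gflops(1536, 1006), b680 = gbytes_s(256, 6.0);
    double f110 = gflops(2880,  850), b110 = gbytes_s(384, 6.0); /* guessed */

    printf("GTX 680: %4.0f GFLOPS, %3.0f GB/s -> %.3f B/FLOP\n",
           f680, b680, b680 / f680);
    printf("GK110?:  %4.0f GFLOPS, %3.0f GB/s -> %.3f B/FLOP\n",
           f110, b110, b110 / f110);
    return 0;
}
```

Under those guesses the two chips end up within about 5% of each other in bytes per FLOP, i.e. the 50% wider bus only keeps pace with the extra ALUs if the clocks come down.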
 
keldor314:
I do not think that fillrate per se is unimportant nowadays. On the contrary, with the trend toward 4K displays and multi-monitor setups, I think more ppc fillrate (and zpc, which also depends on it) is going to be as important as ever.

shiznit:
We haven't seen the clock speeds for GK110 in GeForce form yet. Compared to the Tesla products, the GTX 680 has about a 50% clock speed advantage, which at least partially offsets the higher ALU count.
 

A 3840x2400 display at 24" or slightly larger would certainly be interesting.
And stop compromising the refresh rate, make it 120Hz too :). That would require a DisplayPort bandwidth increase or using two DP connections together.
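A quick sanity check on why one link isn't enough (the blanking overhead is an assumption; I'm using DP 1.2 HBR2 rates):

```c
/* Can a single DisplayPort 1.2 link drive 3840x2400 @ 120 Hz, 24 bpp?
 * The ~5% blanking overhead is an assumption (CVT reduced blanking). */
#include <stdio.h>

int main(void) {
    const double h = 3840, v = 2400, hz = 120, bpp = 24;
    const double blanking = 1.05;                 /* assumed overhead */
    double need = h * v * hz * bpp * blanking / 1e9;

    /* DP 1.2: 4 lanes x 5.4 Gb/s, times 8b/10b coding efficiency */
    double dp12 = 4 * 5.4 * 8.0 / 10.0;

    printf("needed: %.1f Gb/s, one DP 1.2 link carries: %.2f Gb/s -> %s\n",
           need, dp12,
           need > dp12 ? "two links or a faster standard" : "fits");
    return 0;
}
```

That's roughly 28 Gb/s against 17.28 Gb/s of payload, so yes: two DP cables or a new standard.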

Then there is the Linux Steam box (and the same software on beige boxes), which allows multi-user setups without sending over a thousand dollars to Microsoft, Citrix etc. for software licensing.
Maybe it supports up to 8 users. You could eventually do a lovely turnkey LAN, with the gaming "server" also hosting the client software, even serving a basic Linux distro with thin-client software over network boot. You could come with your overheating 5-year-old laptop, your netbook, or some leftover toxic garbage from a decade ago, but you'd need a GPU with a lot of fillrate and bandwidth in the server.
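To put a very rough number on the fill side of that (the overdraw factor is a pure assumption):

```c
/* Aggregate fill for 8 thin clients at 1080p60. The overdraw factor
 * is an assumption; real game scenes vary a lot. */
#include <stdio.h>

int main(void) {
    const long long px_per_frame = 1920LL * 1080;   /* per user */
    const int users = 8, fps = 60;
    const double overdraw = 3.0;                    /* assumed  */

    double gpix = users * fps * px_per_frame * overdraw / 1e9;
    printf("~%.1f Gpixels/s of fill across %d users\n", gpix, users);
    return 0;
}
```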
 

What I saw in front of me several days ago was quite disappointing: Full HD on a 15-inch laptop running Windows 7. Because of the relatively high resolution and the small display, you got terribly small letters... The picture itself was awful. Scaling was completely messed up. Does this mean that Windows 7 can't adjust this in some user-friendly manner?
 

Not the type of problem I have seen on such laptops... maybe something was wrong.

On the contrary, I know that at 13" the DPI scaling of Windows 7 does bring some problems on occasion.

Here's a test of the Asus UX31's 13.3" 1080p display (the comment says it all):
http://www.anandtech.com/show/6194/asus-ux31a-putting-the-ultra-in-ultrabooks/4

With all the good we have to say about the LCD, we do need to offer one minor word of caution. Windows 7 still doesn’t handle DPI scaling perfectly, and 1080p in 13.3” makes this one of the highest density LCDs around. Windows 8 may improve on the situation, but for those who stick with Windows 7 you’ll still encounter the occasional quirk. ASUS ships with the DPI scaling set to 125% as mentioned earlier, and it’s really necessary if you want most text to be legible. Even with the minor issues with some applications, though, I’d take this sort of display ten times out of ten if given the option.
 

Was it set to 125% or 150% scaling? If you have 1080p on a 15" display at 100%, yeah, it's going to be quite small.
 
Well, there are only two easy solutions:
- Use only interfaces made of flat boxes, fonts, and vector icons, and throw away all your old GUI software. Wouldn't Metro-only Windows 8 work like that? (Using only text terminals also works, so that 1080p laptop would be appealing to people who only deal with command lines and computer code.)
- Apple's way, which works with specific display sizes and resolutions: render everything you can in high DPI, and for the rest (or even when in doubt) use pixel doubling, the most trivial scaling method (sketched below). No ugly bitmap scaling. Caveat: if an application isn't updated to say it supports high DPI, it will be displayed entirely in low DPI even though it could have used the sharper fonts (I believe). That's maybe an implementation detail, intended to avoid any bugs.

The Apple way of course doesn't work on a 1080p laptop: if you did that, you would be simulating a 960x540 display. Well, you'd be free to do it, but you'd lose a ton of screen real estate.
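A minimal sketch of that pixel-doubling fallback, with toy character "framebuffers" instead of real pixels: each logical pixel from a low-DPI app becomes a 2x2 block of device pixels, so legacy content looks blocky but never blurry:

```c
/* Pixel doubling: replicate each logical pixel into a 2x2 device block.
 * Toy example using characters as pixels. */
#include <stdio.h>

#define LW 4   /* logical width  */
#define LH 2   /* logical height */

static void pixel_double(const char src[LH][LW], char dst[2 * LH][2 * LW]) {
    for (int y = 0; y < LH; y++)
        for (int x = 0; x < LW; x++)
            dst[2*y][2*x] = dst[2*y][2*x + 1] =
            dst[2*y + 1][2*x] = dst[2*y + 1][2*x + 1] = src[y][x];
}

int main(void) {
    const char legacy[LH][LW] = { {'a','b','c','d'}, {'e','f','g','h'} };
    char device[2 * LH][2 * LW];

    pixel_double(legacy, device);
    for (int y = 0; y < 2 * LH; y++) {
        fwrite(device[y], 1, 2 * LW, stdout);
        putchar('\n');
    }
    return 0;
}
```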
 
In Apple's method, the 2x resolution that the desktop is rendered at doesn't have to be the display's native resolution. For instance, the 15" Retina MacBook Pro has the native 2880x1800 resolution which looks like (a twice as sharp) 1440x900, but also has other settings including a 3840x2400 resolution which looks like 1920x1200.

The same approach on that 1080p laptop would give resolutions like 2560x1440 which looks like 1280x720.
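So the "looks like" sizes all follow one rule: render at 2x the logical resolution, then scale the finished frame to the native panel. A quick illustration using the 15" Retina panel:

```c
/* "Looks like" resolution = render resolution / 2; the downscale factor
 * is render width over the native 2880-wide panel. */
#include <stdio.h>

int main(void) {
    const int native_w = 2880;                       /* 15" Retina MBP */
    const int render[][2] = { {2880, 1800}, {3840, 2400} };

    for (int i = 0; i < 2; i++) {
        int rw = render[i][0], rh = render[i][1];
        printf("render %dx%d -> looks like %dx%d, %.2fx downscale to panel\n",
               rw, rh, rw / 2, rh / 2, (double)rw / native_w);
    }
    return 0;
}
```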
 

The problem with the MacBook Retina is that you spend your time adjusting the resolution... for movies you switch... for the internet you switch the resolution down... for software that doesn't scale properly, you switch. (Thankfully the switch is easy to access.)

The next step is maybe auto-scaling... the OS analyzes what is displayed and switches, scales, and adjusts only the parts that need it, when they need it.
 
Lol, some computers of the 1980s would switch resolution on the fly, i.e. draw the top 20% of the screen at high res, then 60% at low res, and the bottom 20% at high res (or the reverse, or another arbitrary mix).

So you could have some high-res UI elements and colorful blocky graphics, or a colorful UI and e.g. a high-res 1-bit monochrome "3D rendering".
Of course, "high res" was something like 320x200 or 256x192, and "low res" was something worse.
 

The Amiga had its OS GUI written around this concept: you had multiple screens, and you were able to pan up/down or swap between them.

The Copper was an amazing chip. The Copper and Blitter were GPGPUs ante litteram.
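For anyone who never saw it, a conceptual sketch of how a Copper-style display list does that mid-frame switch. This is not real Amiga code: BPLCON0 and its HIRES bit are real, but the encoding, line numbers, and register values here are simplified for illustration:

```c
/* Copper idea: a list of (wait for beam position, write a register)
 * steps executed by the video chip itself, no CPU work per frame. */
#include <stdint.h>
#include <stdio.h>

#define BPLCON0 0x100u   /* Amiga display-control register offset */
#define HIRES   0x8000u  /* BPLCON0 bit 15: 640-wide hires mode   */

typedef struct {         /* simplified; the real Copper packs its */
    uint16_t wait_line;  /* WAIT/MOVE ops into 16-bit word pairs  */
    uint16_t reg;
    uint16_t value;
} copper_op;

/* Hires strip on top, lores playfield in the middle, hires again at the
 * bottom -- the 20/60/20 split from the post above. Line numbers and the
 * non-HIRES bits of the values are made up for the example. */
static const copper_op frame_list[] = {
    {   0, BPLCON0, HIRES | 0x1200 },  /* top:    hires              */
    {  51, BPLCON0,         0x4200 },  /* middle: lores, more planes */
    { 205, BPLCON0, HIRES | 0x1200 },  /* bottom: hires again        */
};

int main(void) {
    for (size_t i = 0; i < sizeof frame_list / sizeof frame_list[0]; i++)
        printf("wait for line %3u, then write 0x%04x to reg 0x%03x\n",
               (unsigned)frame_list[i].wait_line,
               (unsigned)frame_list[i].value,
               (unsigned)frame_list[i].reg);
    return 0;
}
```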
 
Those were the days, eh. :) No, I'm not really wishing to go back to them... Computers were horribly limited, and sometimes plain horrible to use back then, compared to what we have now.

Resolution switching could be done on the fly as long as the scan rate stayed the same. I think this actually goes as far back as the C64 or the Atari 800 series, and maybe even earlier. The Atari VCS, maybe...? Not sure you can actually talk about video modes with that machine though, considering the main CPU literally programmed the video scanout in real time.
 

That's not true; you're comparing from today's perspective, but back in the 90s (for the 80s I can't say anything :mrgreen:) I didn't feel any pain. Over time, once you get a genuinely more pleasant experience, you get used to it, and then it's painful to go back.
 
Maybe you had a natural talent for it, but personally I found Windows PCs in the 90s fairly painful to use, both hardware- and software-wise. On the software side, Win 3.x and 9x and their reliance on the ancient and outdated MS-DOS saw to that, along with no memory protection or garbage handling, and rudimentary-at-best multitasking support. On the hardware side, the ISA bus with DIP switches to manually configure resources was just the absolute pits. AT mobos with serial and parallel ports that had to be manually screwed into the chassis, no hotplugging anywhere; pull the cord out of a PS/2 connector and the computer friggin hung, for chrissakes. Not to mention what happened if you pushed the eject button while accessing an optical drive. :LOL:

No, I have no rose-colored glasses about that era. Well, a little, perhaps, but not much.
 
Do you want us to talk about the time when we had a round button on the monitor to change the colors of the screen? White text on a black screen, blue text on a yellow screen, red text on a white screen, etc.? lol. I think I remember the Atari being the first to display 4 colors at the same time on screen (or 8, I don't remember).
 