NVIDIA Maxwell Speculation Thread

Does GK110 (Big Kepler) support async compute scheduling in D3D12, and is it related to the dynamic parallelism feature in CUDA compute capability 3.5?

Dynamic parallelism is a bit different. It allows the GPU to create work without direction from the CPU. The closest thing Nvidia has to async compute is Hyper-Q, but I don't think it supports mixed compute and graphics workloads.

Can't know for sure though as it's always possible their drivers and hardware can do things that aren't exposed via CUDA APIs.
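For reference, here's a minimal sketch of what dynamic parallelism looks like on the CUDA side (compute capability 3.5+ hardware like GK110, built with something like nvcc -arch=sm_35 -rdc=true file.cu -lcudadevrt). The kernel names are made up for illustration; the point is just that a kernel can launch other kernels from the device, with no CPU round trip:

Code:
// Illustrative only -- not from NVIDIA's docs.
#include <cstdio>

__global__ void childKernel(int parent)
{
    // Work that was created on the device by the parent kernel.
    printf("child thread %d launched by parent thread %d\n", threadIdx.x, parent);
}

__global__ void parentKernel()
{
    // Each parent thread decides, on the GPU, to spawn additional work.
    // This is the "GPU creates work without direction from the CPU" part.
    childKernel<<<1, 4>>>(threadIdx.x);
}

int main()
{
    parentKernel<<<1, 2>>>();
    cudaDeviceSynchronize();  // waits for the parent grid and all child grids
    return 0;
}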
 
From Videocardz: "NVIDIA readying Quadro M5000 and Quadro M4000."

driver said:
NVIDIA_DEV.13F0.1152.103C = "NVIDIA Quadro M5000"
NVIDIA_DEV.13F1.1153.103C = "NVIDIA Quadro M4000"
NVIDIA_DEV.13F2 = "NVIDIA Tesla M60"
All three parts are said to use GM204.

Last year there were signs of a Tesla M40 but it hasn't (yet) been officially introduced as far as I know. Is there any way to tell if the M60 consists of one or two GPUs?

Also, while I was searching for info on potential future Teslas, I found the Tesla K8 (1x GK104, apparently introduced September 2014) whose board specification is here. I didn't see any news release or similar official announcement for the K8, which seems strange to me.
 
Nvidia updates Digits and cuDNN GPU-accelerated deep learning software
NVIDIA HAS ANNOUNCED a series of updates to its GPU-accelerated deep learning software that it claims will double deep learning training performance.

"The new automatic multi-GPU scaling capability in Digits 2 maximises the available GPU resources by automatically distributing the deep learning training workload across all of the GPUs in the system," Nvidia said.
"Using Digits 2, [our] engineers trained the well-known AlexNet neural network model more than two times faster on four Nvidia Maxwell architecture-based GPUs compared to a single GPU."

Nvidia said that the cuDNN 3 update also provides higher performance than cuDNN 2, enabling researchers to train neural networks up to two times faster on a single GPU.

"The new cuDNN 3 library is expected to be integrated into forthcoming versions of the deep learning frameworks Caffe, Minerva, Theano and Torch, which are widely used to train deep neural networks," explained the firm.

"It adds support for 16-bit floating point data storage in GPU memory, doubling the amount of data that can be stored and optimising memory bandwidth. With this capability, cuDNN 3 enables researchers to train larger and more sophisticated neural networks."

The Digits 2 Preview release is available now as a free download for registered developers, and the cuDNN 3 library is expected to be available in major deep learning frameworks "in the coming months".
http://www.theinquirer.net/inquirer...-cudnn-gpu-accelerated-deep-learning-software
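The FP16 storage bit is the interesting part technically: keep the tensors in GPU memory as 16-bit floats (half the footprint and half the memory bandwidth of FP32) and convert to 32-bit only when doing the arithmetic. This isn't cuDNN's actual code, just a rough sketch of the underlying trick using the conversion intrinsics from CUDA's cuda_fp16.h (the kernel name is made up):

Code:
#include <cuda_fp16.h>

// Data lives in memory as 16-bit halves; math is done in 32-bit floats.
// A 1 GB FP32 buffer shrinks to 512 MB when stored this way.
__global__ void scaleStoredHalves(__half* data, float scale, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float x = __half2float(data[i]);    // promote to FP32 for the math
        data[i] = __float2half(x * scale);  // store the result back as FP16
    }
}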
 
Will be upgrading to a 980 Ti... but found out Nvidia doesn't enable 10-bit color in their consumer cards, fml.

Where can we bring this up to Nvidia? 10-bit consumer monitors are a dime a dozen now, and AMD has it in their CCC. (thanks Dave!)
 
Will be upgrading to a 980 Ti... but found out Nvidia doesn't enable 10-bit color in their consumer cards, fml.
Not huge demand, I guess... but if you definitely need 10-bit color with a GeForce, I believe it's enabled in the Linux/FreeBSD drivers, though I'm not sure that's still the case. The link below is from when the feature was added; to check whether it's still supported, you'd have to read the current Linux driver manual.
http://www.nvidia.com/object/linux-display-ia32-295.20-driver.html

Or you can support AMD products ....
 
Will be upgrading to a 980 Ti... but found out Nvidia doesn't enable 10-bit color in their consumer cards, fml.

Where can we bring this up to Nvidia? 10-bit consumer monitors are a dime a dozen now, and AMD has it in their CCC. (thanks Dave!)

10-bit monitors are a dime a dozen? Where!?
 
Who really needs 10-bit color anyway? It's far beyond human ability to perceive. What we'd REALLY need is more color-accurate backlighting for monitors.
 
Who really needs 10-bit color anyway? It's far beyond human ability to perceive. What we'd REALLY need is more color-accurate backlighting for monitors.
Here's a very simple exercise: display a 256-step grayscale (or R-only, G-only, B-only) color ramp on your screen, preferably on a larger monitor (e.g. 27"). Observe the transitions from one color to the next. It's really that simple.
 
Those aren't consumer applications however, but rather professional.
Smooth gradients come up regularly. Try making a simple Windows background image with a smooth color gradient. On a 1920x1080 screen, 256 shades (of any color) produce clearly visible bands about 8 pixels wide (1920/256 = 7.5) instead of a smooth image. A 10-bit display gives 1024 shades, which is pretty good (bands are only about 2 pixels wide).

But instead of 10-bit color, I would prefer 16-bit float HDR displays. Current displays produce images that try to look like printed media (normalized brightness), instead of trying to look like the real world (no limit to brightness). HDR displays would make games (and movies and photos shot in HDR) look so much better.
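If you'd rather see those band widths as numbers than take my word for it, here's a quick throwaway sketch (the kernel and buffer names are mine, not from any real tool) that quantizes a 1920-pixel-wide ramp to 8 and 10 bits and reports the average band width:

Code:
#include <cstdio>

// Quantize a full-width horizontal ramp (0..1) to 2^bits shades.
__global__ void quantizeRamp(unsigned int* level, int width, int bits)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    if (x < width) {
        float t = (float)x / (float)(width - 1);    // smooth 0..1 gradient
        unsigned int steps = 1u << bits;            // 256 or 1024 shades
        unsigned int q = (unsigned int)(t * steps);
        level[x] = (q < steps) ? q : steps - 1;     // clamp the top pixel
    }
}

int main()
{
    const int width = 1920;
    unsigned int host[width];
    unsigned int* dev;
    cudaMalloc(&dev, width * sizeof(unsigned int));

    int depths[2] = {8, 10};
    for (int d = 0; d < 2; ++d) {
        quantizeRamp<<<(width + 255) / 256, 256>>>(dev, width, depths[d]);
        cudaMemcpy(host, dev, sizeof(host), cudaMemcpyDeviceToHost);

        int bands = 1;                              // count runs of equal shade
        for (int x = 1; x < width; ++x)
            if (host[x] != host[x - 1]) ++bands;
        printf("%2d-bit: %4d bands, ~%.1f px wide\n",
               depths[d], bands, (float)width / bands);
    }
    cudaFree(dev);
    return 0;
}

That comes out to about 7.5 px per band at 8 bits and about 1.9 px at 10 bits on a 1920-wide gradient, which is where the 8 px / 2 px figures above come from.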
 
Current displays produce images that try to look like printed media (normalized brightness), instead of trying to look like the real world (no limit to brightness).
I don't think that would be a good idea. Looking at a bright display quickly gets very tiring for the eyes. And there's been research saying that light from bright blue LEDs (as used in computer backlights, for example) has negative effects on the eye, even on a long-term/permanent basis, and so on.
 
I don't think that would be a good idea. Looking at a bright display quickly gets very tiring for the eyes. And there's been research saying that light from bright blue LEDs (as used in computer backlights, for example) has negative effects on the eye, even on a long-term/permanent basis, and so on.

Yup, my eyes start to bleed (simply become all red) if I look at a display with the brightness set too high.
The solution is to match the display brightness to the brightness of the surroundings.
Then the problem disappears magically. :D
 
I bought a BenQ BL3200PT LCD with the "low blue light" modes. Very comforting for the eyeballs. It also has a little control puck for menu and presets so I can switch to other modes with a click. It does have 10-bit capability, which my Radeon 6950 supports, but I couldn't see a difference with anything I do.
 
I bought a BenQ BL3200PT LCD with the "low blue light" modes. Very comforting for the eyeballs. It also has a little control puck for menu and presets so I can switch to other modes with a click. It does have 10-bit capability, which my Radeon 6950 supports, but I couldn't see a difference with anything I do.

I'm not sure if I'm seeing things... but there is a stock Win8.1 Metro wallpaper, the one with the 'S' swirl in the middle.

Click Personalize and change the accent of the background to light grey (fourth box starting from the bottom right), and the 'S' swirl to light red (same fourth box from the bottom left).

I switched between 8-bit and 10-bit and noticed clear banding from the bottom half of the swirl onwards...
 
I don't think that would be a good idea. Looking at a bright display quickly gets very tiring for the eyes. And there's been research saying that light from bright blue LEDs (as used in computer backlights, for example) has negative effects on the eye, even on a long-term/permanent basis, and so on.
There is roughly a 10,000x brightness difference between a white indoor room wall (no window, basic home lighting) and a white building wall directly hit by the sun on a bright summer day. Your eyes adapt to bright environments. A bright summer day is not tiring for the eyes in the long run, and being outside on a sunny day doesn't cause permanent damage to your eyes. Current displays are not even close to this brightness.

If the display is too small and dark surroundings cover most of the view, then there is a problem. But that problem is easily solved by bigger displays that cover most of your view. Head-mounted displays (such as the Oculus Rift) also solve it.

Currently one of the biggest problems (in addition to resolution) with believability in the Oculus (and other HMDs), for me, is the lack of HDR. The sky and highlights just look really flat compared to the real world.
 
Who really needs 10-bit color anyway? It's far beyond human ability to perceive. What we'd REALLY need is more color-accurate backlighting for monitors.
10-bit support (Radeons go up to 12-bit though, I think?) improves 8-bit colours too.
 