NVIDIA Maxwell Speculation Thread

In theory, gaming at 720p on a 1440p display should be the same as 720p on a 720p display. In practice, some displays have crappy scaling algorithms that add unnecessary blur anyway.
I actually tried this on a 1600x1200 CRT display: I disabled display scaling, used GPU scaling, and ran Assassin's Creed 2 at 800x600 with max settings. The resulting image had too much aliasing and blur, and was noticeably worse overall.
 
I actually tried this on a 1600x1200 CRT display: I disabled display scaling, used GPU scaling, and ran Assassin's Creed 2 at 800x600 with max settings. The resulting image had too much aliasing and blur, and was noticeably worse overall.

Scaling is not a problem on CRT monitors because, if memory serves, they actually display the pixels they're supposed to when you feed them a resolution lower than the maximum they can support.

But of course 800×600 will always look worse than 1600×1200.

However, 800×600 on a 1600×1200 LCD should not look worse than 800×600 on an 800×600 LCD, which just means you don't have to suffer a penalty for having a high-definition display.
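Since 800×600 to 1600×1200 is an exact 2× step, a scaler can simply duplicate each source pixel into a 2×2 block and add no softening at all; blur only shows up when the scaler blends neighbouring pixels instead. A rough C++ sketch of the two approaches on a grayscale frame (just an illustration, not any vendor's actual scaler code):

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // Exact 2x factor: duplicate every source pixel into a 2x2 block, no blending.
    std::vector<uint8_t> upscale2x_nearest(const std::vector<uint8_t>& src, int w, int h) {
        std::vector<uint8_t> dst(4 * w * h);
        for (int y = 0; y < 2 * h; ++y)
            for (int x = 0; x < 2 * w; ++x)
                dst[y * 2 * w + x] = src[(y / 2) * w + (x / 2)];
        return dst;
    }

    // Bilinear: every destination pixel is a weighted average of neighbouring source
    // pixels, which is exactly where the extra blur comes from.
    std::vector<uint8_t> upscale2x_bilinear(const std::vector<uint8_t>& src, int w, int h) {
        std::vector<uint8_t> dst(4 * w * h);
        for (int y = 0; y < 2 * h; ++y)
            for (int x = 0; x < 2 * w; ++x) {
                float sx = x * 0.5f, sy = y * 0.5f;               // destination -> source
                int x0 = (int)sx, y0 = (int)sy;
                int x1 = std::min(w - 1, x0 + 1), y1 = std::min(h - 1, y0 + 1);
                float fx = sx - x0, fy = sy - y0;                 // 0.0 or 0.5 at a 2x factor
                float top = src[y0 * w + x0] * (1 - fx) + src[y0 * w + x1] * fx;
                float bot = src[y1 * w + x0] * (1 - fx) + src[y1 * w + x1] * fx;
                dst[y * 2 * w + x] = (uint8_t)(top * (1 - fy) + bot * fy + 0.5f);
            }
        return dst;
    }

The nearest-neighbour path reproduces the 800×600 image pixel for pixel, just bigger; the bilinear path averages neighbours, which is the unnecessary blur mentioned above.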
 
On NVIDIA cards there's an option in the NVCP to have the GPU do scaling. Back in the day it would work on roughly every other driver release :) but I hope they've fixed that now.
 
Scaling is not a problem on CRT monitors because, if memory serves, they actually display the pixels they're supposed to when you feed them a resolution lower than the maximum they can support.
It's because they have good natural interpolation; it's an inherent characteristic of the electron beam. I actually tried both GPU and CRT scaling, and the latter was worse.

But of course 800×600 will always look worse than 1600×1200.
I get the aliasing point; in theory and in practice it's understandable. However, there was noticeable blur too.

However, 800×600 on a 1600×1200 LCD should not look worse than 800×600 on an 800×600 LCD, which just means you don't have to suffer a penalty for having a high-definition display.
Why do you think the LCD would be different than a CRT?

On NVIDIA cards there's an option in the NVCP to have the GPU do scaling. Back in the day it would work on roughly every other driver release :) but I hope they've fixed that now.
You mean it doesn't work?
 
In the past it would sometimes work and sometimes not, pretty much every other driver release. I haven't tried it in a while but when it did work it worked well.
 
Next week, March 25th, is Jen-Hsun's GTC 2014 keynote address. Will second-generation Maxwell be shown during the keynote?
 
Why do you think the LCD would be different than a CRT?

Because ultimately, LCDs have to output at their native resolution, whatever the input may be. So if the two differ, they need to interpolate. CRTs don't have that problem.
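To see why the interpolation is unavoidable at non-integer ratios, map each panel pixel back to the input image and check which source pixel it lands on. A small illustrative C++ snippet (my own example, not tied to any particular hardware):

    #include <cstdio>

    int main() {
        const int native = 1600;                  // panel's native width
        const int inputs[] = {800, 1024};         // exact 2x vs. non-integer 1.5625x
        for (int input : inputs) {
            printf("%d-wide image on a %d-wide panel:\n", input, native);
            for (int x = 0; x < 10; ++x) {        // first few panel pixels
                int src = x * input / native;     // source pixel this panel pixel falls on
                printf("  panel x=%d -> source x=%d\n", x, src);
            }
        }
        return 0;
    }

Going from 800 to 1600 wide, every source pixel covers exactly two panel pixels (0,0,1,1,2,2,...), so plain duplication works. Going from 1024 to 1600, the coverage alternates between two panel pixels and one (0,0,1,1,2,3,3,4,5,5,...), so the scaler has to either blend neighbours (blur) or show unevenly sized pixels.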
 
To be pedantic (which I like doing), the bigger CRTs use double scan to display resolutions that are low for them, like the standard VGA text mode (720x400 at 70Hz). Meaning the scanlines are displayed twice, or, from what I seem to have witnessed, they're spaced out as if every other line were black.
That can give a neat arcade-monitor look, and whatever the technical reason is, it's pretty sharp.
 
Next week, March 25th, is Jen-Hsun's GTC 2014 keynote address. Will second-generation Maxwell be shown during the keynote?
I'm expecting some sort of introduction for 2nd-gen Maxwell, similar to how GK110 was announced at GTC 2012. That being said, I think whether or not a "big" Maxwell is among the chips announced will depend on when it's scheduled to be released. If it's planned for this year then I'm sure we'll hear about it and get some details. If it's planned for later then maybe not.

[I'm making the assumption that the GM1xx codenames are 1st gen Maxwell and the leaked GM2xx codenames are 2nd gen Maxwell.]

Could there be an announcement of a K10 successor—a 2x GM204 Tesla part?
 
To be pedantic (which I like doing), the bigger CRTs use double scan to display resolutions that are low for them, like the standard VGA text mode (720x400 at 70Hz). Meaning the scanlines are displayed twice, or, from what I seem to have witnessed, they're spaced out as if every other line were black.
To be even more pedantic, I don't think that's quite correct. I've never heard of monitors doing double scan themselves, but double-scan modes do exist. They were typically used to display things like 320x240 (or 320x200) on more-than-VGA multisync monitors (which usually have a minimum horizontal sync frequency of about 31 kHz, corresponding to 640x480 at 60Hz). This is entirely a graphics card thing though; the monitor doesn't know about it (the display controller simply scans out each line twice).
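In other words, the scan-out conceptually looks like this (a hypothetical sketch of the display controller's behaviour, not real driver code): for a 320x240 mode the CRTC sends every framebuffer row twice, so the monitor still receives a 480-line, roughly 31 kHz signal it can sync to.

    #include <cstdint>
    #include <vector>

    using Scanline = std::vector<uint8_t>;            // one row of pixels

    // Double scan: emit each framebuffer row twice, doubling the line count on the wire.
    std::vector<Scanline> scan_out_double(const std::vector<Scanline>& framebuffer) {
        std::vector<Scanline> signal;
        signal.reserve(framebuffer.size() * 2);
        for (const Scanline& row : framebuffer) {
            signal.push_back(row);                    // line N
            signal.push_back(row);                    // line N again
        }
        return signal;                                // 240 rows in -> 480 lines out
    }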
 
[Attached images: 9xLRkHe.png, P66MuqA.png]
 
I assume that means DX12 has specific requirements that Nvidia has managed to get in there that not all AMD 5/6/7 series cards meet.
 
Pre-GCN, we're talking about far less flexible scheduling and control, and no read/write memory pipeline.

There are a number of elements, like programmable blending and the pixel-sync buffer type, that at the very least can be software-emulated with general reads, writes, and synchronization.

This could be why hardware going as far back as Fermi might be able to hit the checkboxes, while AMD's persistence with the antiquated VLIW architectures is why GCN is the bare minimum.
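As a CPU-side analogy for that emulation point (my own sketch, not any driver's code): programmable blending is just a read-modify-write of the destination pixel, and doing it from a shader needs general reads and writes plus some ordering so overlapping fragments don't race on the same pixel, which is what pixel-sync style primitives provide and what fixed-function blend units do implicitly.

    #include <mutex>
    #include <vector>

    struct RGBA { float r, g, b, a; };

    struct RenderTarget {
        std::vector<RGBA> pixels;
        std::mutex lock;          // stands in for per-pixel ordering / "pixel sync"
    };

    void blend_fragment(RenderTarget& rt, int index, RGBA src) {
        std::lock_guard<std::mutex> guard(rt.lock);   // serialize the read-modify-write
        RGBA dst = rt.pixels[index];                  // read current render-target contents
        rt.pixels[index] = {                          // any programmable operator would do;
            src.r * src.a + dst.r * (1.0f - src.a),   // here, classic "over" blending
            src.g * src.a + dst.g * (1.0f - src.a),
            src.b * src.a + dst.b * (1.0f - src.a),
            src.a + dst.a * (1.0f - src.a)
        };
    }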
 
No biggie. GCN has been around for a while now; if you're on anything less you should upgrade, especially by the time DX12 hits.
 
AMD put themselves in this position on their own, and now the market share speaks for them...

So, all DX11-compliant NV cards will support DX12 out of the box, but AMD's VLIW cards will not. Why not? Doesn't DX11 support somehow cover the basic needs and requirements? F&ck, sh&t, absolutely inexplicable.
 
Why expect a years-old version of an API to have the exact same requirements and minimum features as a future one?
Isn't a Microsoft API with the exact same requirements and features as DX12 called DX12?
 
I assume that means DX12 has specific requirements that Nvidia has managed to get in there that not all AMD 5/6/7 series cards meet.

No it doesn't.
The DX12 API will work on all Fermi/Kepler/Maxwell cards, just like DX11.2 works on all DX9+ hardware that has drivers supporting the API.
Fermi/Kepler/Maxwell will still be limited to D3D Feature Level 11_0 even with DX12, which may or may not bring Feature Level 12_0.

AMD hasn't said anything about which of their cards will support DX12, but most likely it'll be all of their DX11 hardware, just like NVIDIA's, and they'll most likely still be limited to Feature Levels 11_0 (VLIW) and 11_1 (GCN) even if DX12 brings something new (though there's a remote chance of GCN qualifying for Feature Level 12_0, if such a Feature Level exists, due to the XB1 using GCN hardware).
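For what it's worth, the API-version vs. feature-level split already works this way in D3D11: you pass in the feature levels you can live with and the runtime hands back the highest one the GPU/driver actually supports, so the API you code against and the hardware's feature level are negotiated separately. A minimal sketch using the existing D3D11 creation call (DX12's own API isn't public yet, so this is only an analogy):

    #include <cstdio>
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    int main() {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_1,   // e.g. GCN-class hardware
            D3D_FEATURE_LEVEL_11_0,   // e.g. Fermi/Kepler/first-gen Maxwell, VLIW
            D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0,
        };
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL granted = D3D_FEATURE_LEVEL_10_0;

        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            requested, (UINT)(sizeof(requested) / sizeof(requested[0])),
            D3D11_SDK_VERSION, &device, &granted, &context);

        if (SUCCEEDED(hr)) {
            // 'granted' is the highest level the hardware supports, regardless of
            // which runtime version (11.0, 11.1, 11.2, ...) the OS has installed.
            printf("granted feature level: 0x%04x\n", (unsigned)granted);
            if (context) context->Release();
            if (device) device->Release();
        }
        return 0;
    }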
 