NVIDIA Maxwell Speculation Thread

I always found that concept quite fascinating. So let's say I have a 1440p panel: would gaming at 720p be the same thing as playing at native res? No loss of IQ whatsoever? Not even a little bit?

In theory, gaming at 720p on a 1440p display should be the same as 720p on a 720p display. In practice, some displays have crappy scaling algorithms that add unnecessary blur anyway.
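The reason 720p-on-1440p can be lossless in theory is that it's an exact 2x integer ratio: every source pixel maps to a 2x2 block of identical output pixels, so no blending is needed. A minimal sketch (plain Python, with a tiny toy "frame" standing in for a real image; `integer_upscale` is a hypothetical name, not any driver's API):

```python
def integer_upscale(frame, factor=2):
    """Nearest-neighbour integer upscale: replicate each pixel
    factor x factor times. No interpolation means no blur -- the
    'no loss of IQ' case, unlike a display's bilinear scaler."""
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(factor)]
        out.extend([list(wide) for _ in range(factor)])
    return out

# Tiny 2x2 "frame" standing in for a 720p image:
frame = [[1, 2],
         [3, 4]]
scaled = integer_upscale(frame)
# Sampling every second pixel recovers the original bit-for-bit:
assert [row[::2] for row in scaled[::2]] == frame
```

Display scalers that blur do so because they apply bilinear/bicubic filtering even at exact integer ratios, where pure replication like this would suffice.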
 
What if we disabled panel scaling altogether and let the GPU handle it?

I know the concept is sound in theory, but I think that if it were really feasible it would have been exploited by now, for example by console makers, maybe even in tablets and smartphones too. Letting the system render at half res, achieving high performance with no loss of IQ, sounds like a pipe dream to be honest.
 

Apple did exactly this with their retina line of iPads -- the retina resolution had exactly quadruple the pixel count of the "standard" one, so that the X and Y axes each scaled by exactly 2x for backwards compatibility. Now, for console makers, you have to consider all the idiot fanboys who would've gone ape about half resolution. Nevertheless, several apps did do half resolution in one axis but full res in the other, IIRC.
 

Interesting. I wonder if there's any evidence to suggest that you should pick one axis over the other for such a mixed scheme.
 

Traditionally you focus on full or high vertical resolution. It was nice for scanline-oriented displays (i.e. CRTs), and a CRT or analog TV has a clearly identifiable vertical resolution, whereas the horizontal res is some analog, fuzzy thing.

Perhaps it simply looks good, I remember Duke Nukem 3D at 320x400 was kick-ass! (but a pain to get running)

Designing a scaler is also easier: you can simply do it on individual scanlines. Consoles may use e.g. 1440x1080 scaled to 1920x1080. Even Cinemascope is horizontal scaling, done with optics.
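A scanline scaler like the 1440x1080 → 1920x1080 case above only ever interpolates along one row, leaving the vertical resolution untouched. A minimal sketch of one such scanline, using linear interpolation (a toy illustration, not any console's actual filter; `scale_scanline` is a made-up name):

```python
def scale_scanline(line, out_width):
    """Linearly interpolate a single scanline to out_width samples.
    Applied row by row, this stretches only the horizontal axis,
    e.g. 1440 samples -> 1920, while vertical res stays untouched."""
    in_width = len(line)
    out = []
    for x in range(out_width):
        # Map the output position back into source coordinates.
        src = x * (in_width - 1) / (out_width - 1)
        i = int(src)
        frac = src - i
        nxt = line[min(i + 1, in_width - 1)]
        out.append(line[i] * (1 - frac) + nxt * frac)
    return out

# A 4-sample line stretched to 7 samples (same idea as 1440 -> 1920):
stretched = scale_scanline([0, 3, 6, 9], 7)
# -> [0.0, 1.5, 3.0, 4.5, 6.0, 7.5, 9.0]
```

Because each output row depends on only one input row, the hardware never needs to buffer more than a single scanline, which is what makes this kind of scaler cheap.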
 
8GB on a laptop GPU? WTH, that's more than any current desktop GPU including the Titan Black!!

I wonder if that means we'll start seeing 6 and 8GB becoming the standard in the higher end Maxwell desktop chips? Would be nice if so.
I guess that's quite possible. In the meantime, there were apparently 8GB R9 290X cards on display at CeBIT, as well as a "6GB 384-bit R9 270X" (I assume the manufacturer had no idea what the card on display actually was, though I guess it's possible a Tahiti-based card with that name really exists). That might indeed indicate future cards will have more memory.

Forgot the CeBIT link (in German): http://www.heise.de/newsticker/meld...-fruehestens-im-zweiten-Halbjahr-2139532.html
 

Actually, Sapphire has released an R9 290X with 8GB of memory.
 
Scrypt-based mining could make some use of a large local memory, since the parallel workload scales with the data size. But one would expect such a configuration on a beefier GPU, of course.
 
You also have to consider that this method will exacerbate aliasing: pixels now cover four times their original area, and so do the aliasing artifacts.
 
On JD and Taobao (China's Amazon/eBay), there are several laptop models with the GTX 850M/860M up for pre-order now (according to some e-retailers, they will ship in April).
 