NVIDIA: Beyond G80...

According to one rumor, Crysis has been delayed and will launch alongside G92 in November.
Looks like NVIDIA can influence game devs into timing their releases to match its new hardware, complete with TWIMTBP-optimized drivers ;)
 
I've never heard that before.
Isn't that only of interest to those who have a monitor that requires dual-link DVI to work (i.e., a 30 inch LCD, usually with 2560 x 1600 native resolution)?
To my knowledge, a single-link DVI output with HDCP can drive a 1920 x 1200 @ 60Hz display, more than enough for a 1080p Blu-ray or HD-DVD, but I might be mistaken.
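For anyone who wants to sanity-check that claim, here's a rough sketch of the pixel-clock math in Python. The 165 MHz ceiling per TMDS link is from the DVI spec; the total raster sizes are approximate CVT reduced-blanking figures, so treat the exact numbers as illustrative:

```python
# Rough check of why 1920x1200@60 fits on single-link DVI but 2560x1600@60
# needs dual-link. Totals (active + blanking) are approximate CVT
# reduced-blanking figures; exact timings vary per monitor.

SINGLE_LINK_MAX_MHZ = 165.0  # TMDS pixel-clock ceiling per DVI link

def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock required for a given total raster at a refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "1920x1200 @ 60 (RB)": (2080, 1235),
    "2560x1600 @ 60 (RB)": (2720, 1646),
}

for name, (h, v) in modes.items():
    clk = pixel_clock_mhz(h, v)
    links = 1 if clk <= SINGLE_LINK_MAX_MHZ else 2
    print(f"{name}: {clk:6.1f} MHz -> {links} TMDS link(s)")
# 1920x1200: ~154 MHz, fits one link; 2560x1600: ~269 MHz, needs two.
```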
Yeah I should have qualified that, 2560x1600 displays that need dual-link won't accept 1080p HDCP protected content from G80 at full resolution. Each of the links needs an HDCP key. X1K has the same problem.

Presumably NVidia's newer GPUs solve this problem. A mate has been talking about buying a 30" panel, so this issue was something I raised :LOL:

Jawed
 
G84/G86 do, although only the 8600 GTS has HDCP by default.

All HD 2000 cards support it.
 
All GeForce 8800s (GTS 320/640, GTX, Ultra) support it too, and there it's mandatory (in the lower segments only the 8600 GTS qualifies as a mandatory-HDCP product; it's optional on the 8600 GT, 8500 GT and 8400 GS).
 
The issue is not that G80 doesn't support HDCP, it's that it doesn't support HDCP over dual-link connections, only over single-link. That means that, for example, if you're using a 30" monitor, you can't play back encrypted video full-screen; you have to use a lower res and let the monitor upscale for you, which makes for poorer playback quality.

It's made worse by the fact that some monitors (for example, some 30" Dell screens) automatically switch from single-link to dual-link at a much lower resolution than is necessary. It is technically possible to operate single-link up to 1920x1200 @ 60Hz. But screens like the ones I'm talking about switch to dual-link at any resolution higher than 1366x768. That means that if you want to use your 8800GTX system to play back an encrypted 1920x1080 (i.e. 1080i or 1080p) video stream on your Dell 30" monitor, the video has to be downscaled to 1366x768, otherwise HDCP doesn't work.
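To make the interaction concrete, here's a minimal sketch of that constraint in Python. The mode list and the 1366x768 single-link threshold are illustrative values taken from this post, not anything out of a driver or spec:

```python
# Sketch of the constraint above: G80 can only do HDCP over a single link,
# and some 30" panels jump to dual-link above a fairly low resolution.
# MONITOR_SINGLE_LINK_MAX (1366x768) is the Dell example from this post.

GPU_HDCP_SINGLE_LINK_ONLY = True         # the G80 limitation
MONITOR_SINGLE_LINK_MAX = (1366, 768)    # panel switches to dual-link above this

# Candidate output modes, highest first (illustrative list)
MODES = [(2560, 1600), (1920, 1080), (1366, 768), (1280, 800)]

def max_protected_mode(source):
    """Highest mode at which HDCP-protected video actually displays."""
    for w, h in MODES:
        if w * h > source[0] * source[1]:
            continue  # no point upscaling beyond the source
        needs_dual_link = (w > MONITOR_SINGLE_LINK_MAX[0]
                           or h > MONITOR_SINGLE_LINK_MAX[1])
        if needs_dual_link and GPU_HDCP_SINGLE_LINK_ONLY:
            continue  # second link has no HDCP key, so playback is refused
        return (w, h)
    return None

print(max_protected_mode((1920, 1080)))  # -> (1366, 768): forced downscale
```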
 
nVidia
dual-link HDCP (mandatory): 8600 GTS
dual-link HDCP (optional): 8600 GT and 8500 GT

All other HDCP cards from nVidia are single-link HDCP only (not to be confused with regular dual-link DVI support)

No.
You are confusing "HDCP over single-link DVI" mode of a Dual-Link DVI port with a "single-link DVI" port.
All that matters is that, when playing back a Blu-ray or HD-DVD movie and outputting the video feed through the "Dual-Link DVI" port on any 8800, it will revert to single-link DVI in order to remain HDCP-compatible. It does not apply to any other content (desktop, games, DVD, etc.).

And, think about it:
Why would you need to upscale the video in the graphics card itself?

All HD movies are encoded at 1920 x 1080 progressive scan; a 30" LCD computer monitor was probably very far down the AACS LA's list of priorities, well behind near-to-medium-term mass-consumption displays. Every time you upscale you lose quality, whether it's the GPU or the display doing it. Are you willing to do that to a native 1080p HD source?
They never intended to promote a 2560 x 1600 resolution: first, it's a PC ratio (16:10); second, the cost of authoring and local processing is likely much higher; and third, there's no consumer LCD/Plasma/DLP/RPTV television capable of displaying such huge detail anyway...
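If it helps, the mode-switch behaviour above boils down to a couple of lines. A minimal Python sketch, with the content categories as my own illustrative labels rather than driver terminology:

```python
# Minimal sketch of the mode switch described above: on an 8800, the
# dual-link DVI port only drops to single-link operation when the content
# demands HDCP (Blu-ray/HD-DVD); desktop, games and plain DVD keep dual-link.

def link_mode(content: str) -> str:
    hdcp_required = content in {"blu-ray", "hd-dvd"}  # AACS-protected sources
    return "single-link DVI (HDCP)" if hdcp_required else "dual-link DVI"

for c in ("desktop", "game", "dvd", "blu-ray"):
    print(f"{c:8s} -> {link_mode(c)}")
```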
 
I'm talking about HDCP being only single-link despite the card having dual-link DVI. The reason I mentioned dual-link DVI is that it doesn't automatically make HDCP dual-link (e.g. the 8800 GTS has dual-link DVI but only single-link HDCP).

With single-link HDCP you only get 1280x800 or less on current 30" monitors when viewing HDCP-protected movies. That's caused by scaler issues (the same happens when browsing on one over a regular single-link DVI connection), not DRM ones (otherwise you would get 1080p over single-link HDCP).

From my guide:
8600 GTS 256MB (675/1450/2000) (5.4/32.0) (PCIe) 128-bit DL-HDCP D-DLDVI

8800 GTS 320MB (500/1200/1600) (10.0/64.0) (PCIe) 320-bit HDCP D-DLDVI
This should settle that I'm not confused. :smile:
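For anyone decoding the guide's shorthand, here's a rough parser sketch in Python. The field meanings are my own reading of the format, not the guide author's spec: the first triple looks like core/shader/memory clocks in MHz, the second pair looks like pixel fillrate (Gpixels/s) and memory bandwidth (GB/s) (2000 MHz x 128-bit / 8 = 32 GB/s checks out for the 8600 GTS), "DL-HDCP" reads as dual-link HDCP versus plain "HDCP" for single-link, and "D-DLDVI" as dual dual-link DVI outputs:

```python
import re

# Rough parser for the guide lines above. Assumed field layout:
# name mem (core/shader/mem MHz) (pixel fill Gpix/s / bandwidth GB/s)
# (interface) bus-width hdcp-type outputs

LINE = re.compile(
    r"(?P<name>.+?) (?P<mem>\d+MB) "
    r"\((?P<core>\d+)/(?P<shader>\d+)/(?P<memclk>\d+)\) "
    r"\((?P<fill>[\d.]+)/(?P<bw>[\d.]+)\) "
    r"\((?P<bus>\w+)\) (?P<width>\d+)-bit (?P<hdcp>[\w-]+) (?P<out>[\w-]+)"
)

for line in (
    "8600 GTS 256MB (675/1450/2000) (5.4/32.0) (PCIe) 128-bit DL-HDCP D-DLDVI",
    "8800 GTS 320MB (500/1200/1600) (10.0/64.0) (PCIe) 320-bit HDCP D-DLDVI",
):
    card = LINE.match(line).groupdict()
    dual_link_hdcp = card["hdcp"].startswith("DL")  # DL-HDCP vs plain HDCP
    print(f"{card['name']}: dual-link HDCP = {dual_link_hdcp}")
```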
 
I'm confused. I was thinking of getting a 30" Dell in the not too distant future. Would I be better off getting a 27" 1920x1200? I kind of don't understand the explanations. If it can play at 1920x1080, what is the matter with it displaying that as 1920x1080 on the 2560x1600 display, and setting it scaled aspect ratio? You have to stretch the image on a few games, too, to 1920x1200, since that is the highest some will go, but I'm sure it still looks great.

That is a very good feature of NVidia cards - you can fix the aspect ratio so that it fills the screen in perfect 16:9 or 4:3 and leaves vertical bars on either side while filling it to its full height. ATI cards don't do it - the monitor has to be able to do it - but NVidia cards can do it on all monitors.

So would that work?
 
Well, if I'm understanding the conversation correctly (and I'm probably not), what you suggest would only work for non-HDCP content on the G80-based cards. But most monitors will scale properly anyway, so it isn't that big a deal IMHO.
 
If it works for non-HDCP content, then that's okay. That isn't a compelling feature for me at present.
 
I'm confused. I was thinking of getting a 30" Dell in the not too distant future. Would I be better off getting a 27" 1920x1200?
You'll be fine so long as you're not using a G80-powered video card, or you're not playing back 1080i/1080p video, or the 1080i/p video you're playing back isn't HDCP-protected.
 
http://www.fudzilla.com/index.php?option=com_content&task=view&id=2307&Itemid=1

Nvidia plans to introduce at least one 65 nanometre chip in late Q3 or early Q4. The chip is codenamed D8M, where D stands for Discrete and M for Mainstream.

We don't know much about this chip, but it looks like it will be a variant of the G84 chip and will finally replace the 8600 GTS and GT cards in the mainstream market.

Nvidia is really tight with information about this project, but we know that this one is on the way. We believe there will be a 65 nanometre version of the mobile chip as well, as Nvidia desperately needs one.

Midrange and not the new gen G92?

Well, I think we will be seeing at least something from nVIDIA some time soon.
 
This would seem to fit with the way NVIDIA has acted since NV30. With NV30, they tried a brand-new process (130 nm) with a brand-new, complex enthusiast architecture. The less-than-stellar yields taught them a very painful lesson...

Since then, NVIDIA has been very careful to "debut" new process technology with performance or mainstream parts to ensure that any "growing pains" with the new process won't crush yields.

Then again, I could totally be wrong....lol
 
I'd just like to say that the NV30 had a lot more wrong with it than simply yields. But yes, yields may have been one significant factor in making the part late, and they have since refrained from going with brand-new processes for their new high-end chips.
 
LOL... I don't think you'll find anyone on the planet who would disagree with that... ;)
 
You'll be fine so long as you're not using a G80-powered video card, or you're not playing back 1080i/1080p video, or the 1080i/p video you're playing back isn't HDCP-protected.

GeForce 8800 GTS 640 MB, but I don't know when, if ever, I'll watch HDCP-protected content.
 
I'm confused. I was thinking of getting a 30" Dell in the not too distant future. Would I be better off getting a 27" 1920x1200? I kind of don't understand the explanations. If it can play at 1920x1080, what is the matter with it displaying that as 1920x1080 on the 2560x1600 display, and setting it scaled aspect ratio? You have to stretch the image on a few games, too, to 1920x1200, since that is the highest some will go, but I'm sure it still looks great.

That is a very good feature of NVidia cards - you can fix the aspect ratio so that it fills the screen in perfect 16:9 or 4:3 and leaves vertical bars on either side while filling it to its full height. ATI cards don't do it - the monitor has to be able to do it - but NVidia cards can do it on all monitors.

So would that work?

With fixed aspect ratio enabled, 1920x1080 will show horizontal bars on the top and bottom of the screen on both 27" and 30" monitors.

No scaling will centre the image on the 30", wasting those extra inches.

27" monitors have quite a bad dpi ratio (worst of all monitors infact), so the screen will look quite blocky if your sitting relatively close to it, and playing at a lower resolution on the 30" monitor will have the similar effect since the image will be stretched.

I would consider a 24" monitor, it has a good resolution for its size and an 8800 can run pretty much all games at that resolution. Though if you really want a bigger monitor then it has to be the 30".
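As a rough illustration of where those bars come from, here's the fixed-aspect-ratio math in Python; the panel and source sizes come straight from this thread:

```python
# Fit a source image into a panel while preserving aspect ratio, and report
# the size of the black bars that result.

def fit(src_w, src_h, dst_w, dst_h):
    scale = min(dst_w / src_w, dst_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    return w, h, (dst_w - w) // 2, (dst_h - h) // 2

for panel in ((2560, 1600), (1920, 1200)):  # the 30" and a 1920x1200 panel
    w, h, side, top = fit(1920, 1080, *panel)
    print(f"{panel}: image {w}x{h}, side bars {side}px, top/bottom bars {top}px")
# 30": 2560x1440 with 80px bars top and bottom; 1920x1200: 60px bars.
```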
 