nvidia "D8E" High End solution, what can we expect in 2008?

I am still using a 21" CRT capable of up to 2048*1536. Above 1600*1200 it stretches the content, which in practice gives me a form of 2x oversampling (on just one axis) for free. It's a nice added plus with no performance drop, but it doesn't eliminate aliasing and definitely doesn't come close to the level of 4xMSAA+TransparencyAA+AF at that resolution.
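Roughly, the arithmetic behind that claim (a quick sketch, assuming the tube only cleanly resolves about 1600*1200 and everything rendered above that just blends together on the phosphor):

```cpp
// Back-of-the-envelope: samples rendered per pixel the tube can actually
// resolve, under the assumption stated above (~1600x1200 effective).
#include <cstdio>

int main()
{
    const double rendered = 2048.0 * 1536.0;  // samples rendered per frame
    const double resolved = 1600.0 * 1200.0;  // pixels the mask roughly resolves
    std::printf("samples per resolved pixel: %.2f\n", rendered / resolved);  // ~1.64
    return 0;
}
```

That works out to roughly 1.6 samples per displayed pixel, in the same ballpark as the 2x-on-one-axis figure, and still nowhere near proper MSAA.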

What particular monitor is that? Is such behaviour (stretching of content) standard across most/all CRTs? It almost sounds like a good thing.
 
Nvidia Corp., the world’s largest supplier of graphics processors, said that it has no immediate plans to release graphics cards that offer higher speed than the current top-of-the-range GeForce 8800 Ultra, but added that customers seeking extreme performance will soon be able to install three graphics cards into one system to get incredible graphics rendering horsepower.
http://www.xbitlabs.com/news/video/...dia_s_Focus_for_Now_Says_Chief_Executive.html

IMO that is taking the statement out of context. :???:
 
That is really bad news if true. When are we going to see something new on the high-end? Is 8800GTX going to rule for another year? I hope not.

Personally I think that triple SLI, quadruple Crossfire and other -tuples are uninspiring and substandard :(. If things continue this way, in two years we will see computers sporting ten graphics cards :(.
 
Jen-Hsun's reply at the CC was specifically aimed at the holiday season, IMO. Of course, that would only make Tri-SLI's delay more ironic...
 
Huh? So what does a 3840x2400 22" monitor offer over a 1920x1200 22" monitor other than increased DPI?

Well assuming you increase resolution and DPI while keeping the physical dimensions the same...

That means that if you display the exact same image while scaling fonts upwards to match the increased DPI (i.e., keeping the image exactly the same), then...

You'll have more pixels rendering the same scene. Aliasing will be reduced but not removed. Text will be significantly easier to read, especially smaller fonts. I had a chance to use the old IBM 22" 3840x2400 monitor a few years back, and the display was just gorgeous. Even if you didn't scale text upwards to match the increased DPI, it was STILL extremely readable.
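For what it's worth, the raw numbers behind that (a quick back-of-the-envelope sketch, assuming both panels measure exactly 22 inches diagonally):

```cpp
// Pixel density and pixel count for the two 22" panels discussed above.
#include <cmath>
#include <cstdio>

static double ppi(double w, double h, double diag_inches)
{
    return std::sqrt(w * w + h * h) / diag_inches;  // pixels per inch
}

int main()
{
    std::printf("1920x1200 @ 22\": %.0f ppi, %.1f Mpixels\n",
                ppi(1920, 1200, 22.0), 1920 * 1200 / 1e6);
    std::printf("3840x2400 @ 22\": %.0f ppi, %.1f Mpixels\n",
                ppi(3840, 2400, 22.0), 3840 * 2400 / 1e6);
    // Same physical size, double the density on each axis,
    // so four times as many pixels covering the same scene.
    return 0;
}
```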

The problems at the time (and I think current 22" high-DPI panels still share them) were that color fidelity was average at best, pixel response was horrible (in the 50+ ms range), and the price was very high.

All of which was irrelevant for the market it was intended for, however, so those weren't really drawbacks compared to the benefits.

Regards,
SB
 
That is really bad news if true. When are we going to see something new on the high-end? Is 8800GTX going to rule for another year? I hope not.

Personally I think that triple SLI, quadruple Crossfire and other -tuples are uninspiring and substandard :(. If things continue this way, in two years we will see computers sporting ten graphics cards :(.

Or maybe it just means that there won't be "D8E", just "D9E" around Q2/08 or so?
 
Do you think the second generation will be able to run Crysis @ 60fps at 1680x1050 on Very High, or at least High?
 
There probably won't be any D8E, it's too late for that.

In 1H08 there will be D9E and R700; a D8E would need to be at least 1 billion transistors, which would fall into the same timeframe.

nVidia's D8 family is a slight tuning of G80 and it's not ready for full DX10 use, so nVidia is probably rushing the D9 family.
 
There probably won't be any D8E, it's too late for that.

In 1H08 there will be D9E and R700; a D8E would need to be at least 1 billion transistors, which would fall into the same timeframe.

nVidia's D8 family is a slight tuning of G80 and it's not ready for full DX10 use, so nVidia is probably rushing the D9 family.

It's not ready for DX10 use?
How so? Speed?
Is the HD2xxx/HD3xxx any better in DX10 in that regard?
Also, be on the lookout for something new by, say... 3rd of December.
 
Twinkie said:
"NVIDIA G98, 8800GTS 512, 3-way SLI on Dec 3rd"
Looks like the higher-margin 8C/256-bit G92 ASIC that Arun reckons Nvidia's been gilding the wafer-margin lily with.
 
It's not ready for DX10 use?
How so? Speed?
Is the HD2xxx/HD3xxx any better in DX10 in that regard?
Also, be on the lookout for something new by, say... 3rd of December.
We'll know when DX10 is actually being used... for the moment it's impossible to say (volumetric fog as the only "DX10-only" effect is a joke).

There are only a few facts pointing to G80 and G92 "flaws", revealed when R600 came out, but R600 could have issues too (in fact, it already has issues with DX9).

I seriously doubt either of the two manufacturers will push another first-gen DX10 product, and the "8800GT" name speaks for itself: nVidia doesn't want to step up, or it would have been called 8850 or 8900 given its performance. The last D8 will be the v2 8800GTS.

R600's virtualization is what I expect to be the first addition to nVidia's next design, as it could boost GPGPU even more than games.
 
We'll know when DX10 is actually being used... for the moment it's impossible to say (volumetric fog as the only "DX10-only" effect is a joke).

There are only a few facts pointing to G80 and G92 "flaws", revealed when R600 came out, but R600 could have issues too (in fact, it already has issues with DX9).

I seriously doubt either of the two manufacturers will push another first-gen DX10 product, and the "8800GT" name speaks for itself: nVidia doesn't want to step up, or it would have been called 8850 or 8900 given its performance. The last D8 will be the v2 8800GTS.

R600's virtualization is what I expect to be the first addition to nVidia's next design, as it could boost GPGPU even more than games.

Volumetric fog?
Then what do you call Crysis' dynamic global illumination in multiplayer mode, which is exclusive to the DX10 path?
There are plenty of titles using DX10 features right now (Crysis, Call of Duty 4, Bioshock, Call of Juarez, Company of Heroes, World in Conflict, etc.). Certainly a lot more than the number of DX9 titles that appeared in the equivalent span between that API's introduction and the first DX9-feature-using games.

Also, even if it can't run those games at ideal speeds (we need to take into consideration the added overhead introduced by having to run Vista for DX10), at the very least the current hardware allows developers to write future DX10 titles much faster than if they could only use the software-only emulators contained in the DX SDK.
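For context, the SDK's software-only path is the reference rasterizer. A minimal sketch of the usual device-creation fallback (standard public D3D10 API, nothing vendor-specific) looks roughly like this; without DX10 hardware you're stuck with the reference device, which runs entirely on the CPU and is far too slow for anything beyond correctness testing:

```cpp
// Minimal sketch: try to create a hardware D3D10 device and fall back to the
// SDK's reference rasterizer when no DX10-capable GPU/driver is present.
#include <windows.h>
#include <d3d10.h>
#include <cstdio>

#pragma comment(lib, "d3d10.lib")

int main()
{
    ID3D10Device* device = NULL;

    // Prefer a real DX10 GPU (default adapter).
    HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                                   NULL, 0, D3D10_SDK_VERSION, &device);
    if (FAILED(hr))
    {
        // No DX10 hardware: the software reference rasterizer still gives
        // correct output, but at a tiny fraction of the speed.
        hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_REFERENCE,
                               NULL, 0, D3D10_SDK_VERSION, &device);
    }

    std::printf(SUCCEEDED(hr) ? "Got a D3D10 device\n"
                              : "No D3D10 device available\n");
    if (device)
        device->Release();
    return 0;
}
```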
 
There probably won't be any D8E, it's too late for that.
Too late for what? :) D8E is most likely a GX2 board using two G92s in SLI.

In 1H08 there will be D9E and R700; a D8E would need to be at least 1 billion transistors, which would fall into the same timeframe.
Dunno about R700 but, as I've already said, I wouldn't expect a new high-end GPU from NV till the end of 2008.

nVidia's D8 family is a slight tuning of G80 and it's not ready for full DX10 use, so nVidia is probably rushing the D9 family.
G80 is part of the 'D8 family' ;)
As for it being ready for DX10 -- it's more ready than anything else out there.
 