nvidia "D8E" High End solution, what can we expect in 2008?

2x 300mm² dies, 2x 1GiB GDDR4 (AFR!) + >=250W power draw + a single PCB with a massive number of layers = a new record BOM for a consumer graphics card...

Well, every generation uses more power than the last; I think the 8800 GTX at peak used about 30% more than the 7900 GTX, so I don't think power requirements would be prohibitive. Why not set a new record? That would be neat.
 
Well, every generation uses more power than the last; I think the 8800 GTX at peak used about 30% more than the 7900 GTX, so I don't think power requirements would be prohibitive. Why not set a new record? That would be neat.

Not true for this generation.
8800 GTS/GTX -> 8800 GT = 145/185W -> 110W
2900 XT -> HD 3870 = 200W -> 105W

same/better performance for each with far lower TDP (especially for ATi)
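
For what it's worth, the percentage drops implied by those quoted TDP numbers look roughly like this (just back-of-the-envelope arithmetic on the figures above, nothing more):

# Percentage drop implied by the TDP figures quoted above (illustrative arithmetic only).
pairs = [("8800 GTX", 185, "8800 GT", 110),
         ("2900 XT", 200, "HD 3870", 105)]

for old_name, old_w, new_name, new_w in pairs:
    drop = (old_w - new_w) / old_w * 100
    print(f"{old_name} -> {new_name}: {old_w}W -> {new_w}W (~{drop:.0f}% lower TDP)")
# ~41% lower for the 8800 GT, ~48% lower for the HD 3870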
 
I'm talking about the 7900 GTX to the 8800 GTX (last-gen high end to current high end); what you posted is out of scope. Also, the GTX murders the GT; you need to read more or something.

Anyway, if ATI can do it, why not NVIDIA?

[attached image: r680.jpg]


GTX murdering GT. "same/better" = HAAHAHAHAHAHAH

red GTX, blue GT

[attached benchmark chart: 5876-TF3.gif]
 
Anyway, if ATI can do it, why not NVIDIA?

AMD has a 190mm² die, and the reference 3870 X2 is planned with only 2x 512MiB. ;)

Let's see what happens in January; I doubt NV will give away the crown (for a long time).

So G92's true power will be unveiled: nearly 3 TFLOPs as the highest consumer setup, from what I've heard.
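
As a rough sketch of where "near 3 TFLOPs" could come from, assuming the commonly cited 128 stream processors per G92, the usual MADD+MUL (3 FLOPs/clock) marketing count, and a ~1.5 GHz shader clock (all assumptions, and the multi-GPU numbers are naive multiplication, not measurements):

# Hypothetical peak-FLOPs sketch for a multi-G92 setup, using the usual
# MADD+MUL (3 FLOPs/clock) marketing count. SP count and clock are assumptions.
stream_processors = 128   # per G92 die (assumed full configuration)
flops_per_clock   = 3     # MADD (2 FLOPs) + MUL (1 FLOP)
shader_clock_ghz  = 1.5   # assumed shader clock

per_gpu_tflops = stream_processors * flops_per_clock * shader_clock_ghz / 1000
for gpus in (1, 2, 4):
    print(f"{gpus} GPU(s): ~{per_gpu_tflops * gpus:.2f} TFLOPs peak")
# Four G92s (e.g. two dual-GPU boards) land around ~2.3 TFLOPs at these clocks;
# a higher shader clock would be needed to genuinely approach 3 TFLOPs.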
 
Not true for this generation.
8800 GTS/GTX -> 8800 GT = 145/185W -> 110W
2900 XT -> HD 3870 = 200W -> 105W

same/better performance for each with far lower TDP (especially for ATi)
You forgot about the GX2/X2 cards, which should replace the top-end cards of the previous generation.
For those you should double the numbers.
 
I'm talking about the 7900 GTX to the 8800 GTX (last-gen high end to current high end); what you posted is out of scope. Also, the GTX murders the GT; you need to read more or something.

Anyway, if ATI can do it, why not NVIDIA?

GTX murdering GT. "same/better" = HAAHAHAHAHAHAH

red GTX, blue GT

A couple things here:

First, the GTX does not "murder" the GT until you go to extreme resolutions with extreme levels of AA. For the vast majority of PC gamers, the GT is just as good as the GTX. Case in point: I have a friend who recently built an SLI 8800 Ultra rig on a Q6600. He's running a 22" 1680x1050 LCD. There's not a single game out there where SLI makes a bit of difference to him, because his monitor is too small to take advantage of it. Most gamers are on 1600x1200/1680x1050 (or lower) monitors, with the vast majority still on 1280x1024. You don't need a GTX for 1.3-1.7 megapixel resolutions.
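
(For reference, the pixel counts behind those "megapixel" figures work out roughly like this; trivial arithmetic, included only to make the comparison concrete:)

# Pixel counts for the resolutions mentioned above.
for w, h in ((1280, 1024), (1680, 1050), (1600, 1200), (1920, 1200)):
    print(f"{w}x{h}: {w * h / 1e6:.2f} megapixels")
# 1280x1024 ~= 1.31 MP, 1680x1050 ~= 1.76 MP, 1600x1200 ~= 1.92 MP, 1920x1200 ~= 2.30 MP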

Second, were I in your shoes, I would do more reading and less talking, except to ask questions, of course.
 
I'm glad I have a GTX with my Dell 2407WFP. When the game allows AA, I run it, even though many claim AA isn't needed at 1920x1200. :oops:
 
Those that claim AA isn't needed at 1920x1200 haven't played at that resolution.
 
AA is always needed; the larger the pixel pitch, the higher the AA level necessary.

It would be interesting to see how little AA would be needed on a monitor like one of those 22.2" 3840x2400 monsters :D
 
It would be interesting to see how little AA would be needed on a monitor like one of those 22.2" 3840x2400 monsters :D

Along with Quad-SLI ... :love: {where's that damn :drool: smilie at anyways}
 
As stated multiple times, resolution has diddly to do with aliasing unless you can raise the resolution without increasing the size of the monitor.

The aliasing is significantly less noticeable on a 22" 3840x2400 resolution LCD for example, but it's still there.

Pixel density (DPI) has far more influence on reducing or worsening aliasing. But I don't suppose the urban myth of "high resolution = less aliasing" will ever be completely dispelled. /sigh.

I hope to God either Nvidia or ATI actually releases a single-chip high-end solution this spring. Otherwise I'll give both of them my middle finger. :p

Regards,
SB
 
As stated multiple times, resolution has diddly to do with aliasing unless you can raise the resolution without increasing the size of the monitor.

The aliasing is significantly less noticeable on a 22" 3840x2400 resolution LCD for example, but it's still there.

Pixel density (DPI) has far more influence on reducing or worsening aliasing. But I don't suppose the urban myth of "high resolution = less aliasing" will ever be completely dispelled. /sigh.

Huh? So what does a 3840x2400 22" monitor offer over a 1920x1200 22" monitor other than increased DPI?
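
(For a rough sense of scale, the density difference is about a factor of two; a quick illustrative calculation, assuming both quoted sizes are diagonal measurements:)

import math

# Approximate pixels-per-inch for the two 22"-class panels being compared.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'3840x2400 @ 22.2": ~{ppi(3840, 2400, 22.2):.0f} PPI')
print(f'1920x1200 @ 22.0": ~{ppi(1920, 1200, 22.0):.0f} PPI')
# Roughly 204 PPI vs 103 PPI: each pixel is about half the size, so edge
# stair-stepping is far less visible at normal viewing distance, though still present.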
 
Well, it wasn't an urban myth when people were using CRTs...

I am still using a 21" CRT capable of up to 2048x1536. Above 1600x1200 it stretches content, which practically means I get a form of 2x oversampling (on just one axis) for free. While it's a nice added plus with no performance drop, it doesn't eliminate aliasing and definitely doesn't come close to the quality of 4xMSAA + transparency AA + AF at that resolution.

Now if in theory such a monitor had higher video bandwidth and could stretch content on both the x and y axes for way higher resolutions, it might balance things slightly towards the opposite direction, but I honestly wonder how on God's green earth you're going to read any text in any game in something like that.
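
(To put a rough number on the "higher video bandwidth" point: the required pixel clock scales with resolution times refresh rate, and RAMDACs of that era typically topped out around 400 MHz, so much higher CRT resolutions run out of headroom fast. A simple illustrative sketch, ignoring blanking overhead:)

# Rough pixel clock needed for a few CRT modes versus a typical ~400 MHz RAMDAC.
# Blanking intervals are ignored, so real requirements are higher still.
ramdac_limit_mhz = 400
for width, height, refresh in ((1600, 1200, 85), (2048, 1536, 85), (3200, 2400, 85)):
    pixel_clock_mhz = width * height * refresh / 1e6
    verdict = "within" if pixel_clock_mhz <= ramdac_limit_mhz else "beyond"
    print(f"{width}x{height} @ {refresh} Hz: ~{pixel_clock_mhz:.0f} MHz ({verdict} a {ramdac_limit_mhz} MHz RAMDAC)")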

It's an urban myth for CRTs too, since the content stretching you get when exceeding the monitor's mask gives you a form of antialiasing for free anyway. In reality antialiasing is still being applied, so it's hardly the case that a high resolution makes any form of AA redundant.
 
but I honestly wonder how on God's green earth you're going to read any text in any game in something like that.
My old Sony 19-inch CRT could run 1600x1200, and though it looked great at that resolution, text was unreadable. At least for these old eyes. :cry:
 