nvidia "D8E" High End solution, what can we expect in 2008?

Slower than an 8800 Ultra in Crysis @ 2560x1600, only about half as fast, actually. Is this likely to be a framebuffer size constraint, or a memory bandwidth limitation, or an SLI issue?
 
Looks like the same story in COD4 and World in Conflict. 1920x1200 w/AA&AF is fine in just about every game, but move up to 2560x1600 with the filters on and you're lucky to get half the framerate you had @ 1920x1200.

Hopefully it's just a frame buffer limitation and someone comes out with a 2x1GB GX2.
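As a rough sanity check on the frame buffer theory, here's a back-of-the-envelope estimate in Python of what the render targets alone cost at 2560x1600 with 4xAA (the formats and per-sample storage are assumptions; actual driver allocation will differ):

# Rough render-target footprint at 2560x1600 (assumed: 32-bit colour,
# 32-bit depth/stencil, 4x MSAA stored per sample, double-buffered colour).
width, height = 2560, 1600
bytes_per_pixel = 4              # RGBA8
samples = 4                      # 4x MSAA

colour  = width * height * bytes_per_pixel * samples * 2   # front + back buffer
depth   = width * height * bytes_per_pixel * samples
resolve = width * height * bytes_per_pixel                  # resolved output

print((colour + depth + resolve) / 1024**2, "MB")           # ~200 MB

That's around 200 MB before a single texture or post-processing buffer is allocated, which doesn't leave much headroom in 512MB per GPU, so the frame buffer explanation is at least plausible.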
 


1920x1200 seems to be the sweet spot for the card. I think the narrower memory bus ruins 2560x1600, but that's not a problem.

[Attached: Crysis benchmark chart]
 
Performance is pretty much as expected, but what I find curious is the higher power consumption of the 3870 X2 compared to the 9800GX2, considering the lower transistor count and smaller fabrication process. Did AMD have to bump up the voltage a bit to get those things running at 825MHz?
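For what it's worth, dynamic power scales roughly with voltage squared times frequency (C*V^2*f), so even a modest voltage bump is expensive. A toy illustration in Python (the voltages below are pure guesses for illustration, not actual RV670 figures):

# Dynamic power scales roughly as C * V^2 * f.
# Both voltages here are illustrative guesses, not real RV670 specs.
def relative_power(v, f, v_ref, f_ref):
    return (v / v_ref) ** 2 * (f / f_ref)

# e.g. ~8% more voltage plus a 775 -> 825 MHz clock bump
print(relative_power(1.30, 825, 1.20, 775))   # ~1.25x the dynamic power

So a single-digit voltage increase on top of the clock bump could easily explain a double-digit rise in power draw.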
 
1) If 30fps is insufficient, does that mean that pretty much every console FPS game is "broken"? I won't deny 60fps is nicer, but 30fps is still plenty playable, especially in Crysis, which many have confirmed is smoother than the average game at low framerates.

This issue crops up every so often, and I'd like to add a few comments/insights even if it is a bit off topic.
The referenced 30fps was 30fps on average, which means that the frame rate will drop lower. One typical situation is when the action is intense: multiple opponents, explosions, smoke and other effects are all ingredients in these situations, so basically you get the lowest frame rates in exactly the situations where you need high frame rates the most.

(Personally, I would say that an average of 30 fps is way too low for a shooter, even the modern slow role-playing variants. I try to maintain framerates above 75fps at all times in all games, and back in the day of competitive gaming I never went below a rock-solid 125fps. Puttering around shooting virtual aliens in a single-player game is something else obviously, particularly in many modern games where you move like a slug on valium, but even there I hate missing shots because of frame-rate-related control issues.)

For many of us, control is far more vital to game immersion than graphics minutiae, and we will drop graphics fidelity until we get good perceived control. That's the difference between playing a game and merely looking at it. Different people have different priorities.

Console games aim for a certain minimum frame rate, so when you see 30 or 60 fps quoted for those platforms, it is the minimum that is meant, which typically means that the average has to lie at least twice as high. That is not to say that they always reach this design target, but it is obvious when they fail at these low frame rates combined with slow TV refresh rates.

The impact of low frame rates is lessened by slow movement, both of the character and the opponents. Angular (rotation) movement is particularly critical, so games that do not allow you to whip around don't advertise their control limitations as clearly as games that do. On the other hand people also get frustrated by sluggish motion.

Basically, in terms of control, there is no substitute for high frame rates, coupled with lowest possible latency input devices.
Neither all games nor all players have the same needs however, so trying to come up with a single value as a suitable target for gaming in general is both futile and misguided.
(The often-referenced 60 fps is simply the AC line frequency in the USA, which therefore became the refresh rate of their TV sets. As a number, it has no particular bearing per se on the issues involved.)
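To put a number on the average-versus-minimum point, here's a small Python sketch with invented frame times (a calm stretch, a firefight, then calm again); the values are made up purely to illustrate the shape of the problem:

# Invented per-frame times in ms: calm stretch, firefight, calm stretch.
frame_times_ms = [25, 26, 24, 25, 26, 45, 50, 48, 52, 47, 25, 24, 26, 25, 24]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps, worst frame: {min_fps:.1f} fps")
# average: 30.5 fps, worst frame: 19.2 fps

The run "averages 30 fps", yet the busiest frames, which are exactly where aim and reaction matter most, land under 20 fps.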

Sorry for the off-topic exposé.
 
HAL, I'd appreciate it if you did more than link and run. Please provide some context when linking.

I'm sorry Dave, but I'm afraid I can't do that.

(In light of recent events, I thought it was apt ;) )
 
This issue crops up every so often, and I'd like to add a few comments/insights even if it is a bit off topic.
The referenced 30fps was 30fps on average, which means that the frame rate will drop lower.

True, but the framerate quoted was 34.4 fps, not 30fps. Crysis is pretty steady in terms of its framerate within a single level, so an average of 34.4 fps would likely mean the framerate stays above 30fps for the vast majority of the time, and even when it does dip below that, it shouldn't drop below, say, 27fps, which is still playable as long as it's rare and momentary.

Console games aim for a certain minimum frame rate, so when you see 30 or 60 fps quoted for those platforms, it is the minimum that is meant, which typically means that the average has to lie at least twice as high. That is not to say that they always reach this design target, but it is obvious when they fail at these low frame rates combined with slow TV refresh rates.

If that's the intention then it does seem to be missed a hell of a lot. Most action-heavy "30fps" console games show slowdown of varying degrees, and even some "60fps" games do. Based on that, I'd expect the average framerate of an action-heavy "30fps" console game to be more like 35-40fps. That's sufficient to stay above 30fps most of the time with only occasional noticeable slowdown - pretty much the norm on consoles for that type of game.

Of course, some clever console games can dynamically change the level of detail if framerates start to get too low, which helps keep things above the 30fps mark - e.g. Lost Planet, which dynamically adjusts the level of AA used. Pretty clever IMO!
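The basic idea behind that kind of adjustment is simple enough to sketch. This is not Lost Planet's actual logic, just a minimal frame-time governor in Python with assumed AA steps and thresholds:

# Minimal sketch of a frame-time-driven AA governor (not any real game's code).
AA_LEVELS = [0, 2, 4, 8]            # assumed MSAA steps
TARGET_MS = 1000.0 / 30.0           # 30 fps target

def adjust_aa(idx, last_frame_ms):
    if last_frame_ms > TARGET_MS * 1.1 and idx > 0:
        return idx - 1              # too slow: step AA down
    if last_frame_ms < TARGET_MS * 0.8 and idx < len(AA_LEVELS) - 1:
        return idx + 1              # plenty of headroom: step AA back up
    return idx

idx = 3                             # start at 8x AA
for frame_ms in (30, 38, 41, 31, 24, 22):
    idx = adjust_aa(idx, frame_ms)
    print(f"{frame_ms} ms -> {AA_LEVELS[idx]}x AA")

Trading a bit of image quality for a stable framerate in the heavy scenes is a reasonable deal for a console title that has to hit its target on fixed hardware.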

Neither all games nor all players have the same needs however, so trying to come up with a single value as a suitable target for gaming in general is both futile and misguided.

Indeed. Which is why I take issue with the site not even bothering to benchmark the game at higher settings because they consider it unplayable. I would wager the majority of gamers would rather play at around 30fps on the highest details than 60fps on Medium.
 
Slower than an 8800 Ultra in Crysis @ 2560x1600, only about half as fast, actually. Is this likely to be a framebuffer size constraint, or a memory bandwidth limitation, or an SLI issue?

They should have die-shrunk the G80 instead of doing some G92 GX2 BS card. The G80 was awesome; why screw with a good thing and make a lesser card? Talk about milking the money train. If they continue to do this in the future, people will start to skip refreshes from them and just go after the generational-change chips.
 
I'm claiming the prize for best guess of transistor count. :smile:

The 1.4 B could refer to all transistors including the unused redundant ones.



My guess is we will see another monster GPU for the 9800GTX:

55 nm
1.2 B transistors

256 SP
64 trilinear texture units (64 TA 128 TF)
2 GHz shader
750 MHz core
512-bit bus, 150 GB/s
1 GB

So basically twice a G80, with roughly 3x the shader throughput, at about 1.5 TFLOPS
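Quickly checking those numbers in Python (assuming the usual 3 flops per SP per clock, i.e. MADD + MUL, which is how the ~0.5 TFLOP figure for G80 is normally derived):

# Back-of-the-envelope check of the guessed specs above.
sp_count      = 256
shader_ghz    = 2.0
flops_per_clk = 3                 # MADD + MUL per SP per clock

print(f"{sp_count * shader_ghz * flops_per_clk / 1000.0:.2f} TFLOPS")   # ~1.54

bus_bits    = 512
mem_eff_ghz = 2.35                # effective data rate needed for ~150 GB/s
print(f"{bus_bits / 8 * mem_eff_ghz:.0f} GB/s")                          # ~150

So the 1.5 TFLOP figure falls straight out of 256 SPs at 2 GHz, and 150 GB/s would need roughly 2.35 GHz effective memory on a 512-bit bus.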
 
I wish I knew enough to give a technical opening, but because I'm still a noobie in these parts I'll just stick to the big numbers noobies usually look for and leave the rest to other members here.

"9800 GTX"
dual G92 (dual pcb or dual GPU one pcb?????)
384 bit bus
1.5gb memory
full 128 SP per gpu
64 "TMU" per gpu
24 ROP per gpu
750mhz core
2000mhz shader
2.4Ghz GDDR4

WOW, GLAD I WAS SO WRONG!!!!!
 