will Doom III run at 60FPS on any current or upcoming card?

I haven't been looking at the latest benchies for the NV35 and 9800P 256MB
- all I want to know is: will I be able to run Doom III in high quality mode at, say, 800x600 at 60 FPS on present-generation PCs equipped with, say, an NV35 or ATI's next card?

60 FPS is needed to maintain a CG look that 30 FPS really cannot offer.
 
I think running at 800x600 would ruin the CG look more than a 30 fps framerate would. The gameplay would be more fluid/enjoyable at 60 fps, but I don't think that makes it look better.
 
Since I can watch nice realistic CG and real-life movies on a crappy NTSC set without "ruining" the realistic look, I'd say resolution isn't the issue in achieving a realistic look. It's primarily the lighting; second in line are AA and framerate.

You can run any games you want at 10000x10000 resolution and it ain't gonna make them look like an FMV if they have only vertex lighting.
 
DemoCoder said:
Since I can watch nice realistic CG and real-life movies on a crappy NTSC set without "ruining" the realistic look
This is a BS argument.
The fact is, you sit MUCH farther away from your TV set when you watch stuff on it. Thus, the lack of resolution is not nearly as apparent.
 
Althornin said:
DemoCoder said:
Since I can watch nice realistic CG and real-life movies on a crappy NTSC set without "ruining" the realistic look
This is a BS argument.
The fact is, you sit MUCH farther away from your TV set when you watch stuff on it. Thus, the lack of resolution is not nearly as apparent.

So are you willing to say that "Quake 3" at 1600*1200 looks more realistic than watching "The Matrix" DVD on your 480i TV set?

That's ridiculous.

Resolution is important, but it is not everything.
 
Actually, what he's saying, if I understand it right, is that The Matrix @ 1600x1200 on a computer screen would look better than The Matrix on a 480i TV :)


Uttar
 
Re: will Doom III run at 60FPS on any current or upcoming card?

megadrive0088 said:
I haven't been looking at the latest benchies for the NV35 and 9800P 256MB
- all I want to know is: will I be able to run Doom III in high quality mode at, say, 800x600 at 60 FPS on present-generation PCs equipped with, say, an NV35 or ATI's next card?

60 FPS is needed to maintain a CG look that 30 FPS really cannot offer.

AnandTech and [H]ardOCP were both able to do some benchmarks with Doom 3 from id. The results seem to show that with 4x AA and 8x AF, the nVidia 5900U with 256MB RAM ran very near 60 FPS at 1024x768 (Anand's has 53 FPS and [H] has 57.2 FPS). Both machines were (apparently) 3GHz/800MHz FSB P4s and used Detonator 40.03 drivers. But both ran D3 at the "Medium" detail level.

Tom's Hardware also had D3 and did run with "High" quality settings, but a couple of its numbers look really suspect. I wouldn't trust them.

The sites also did some testing of ATI cards, but those results seem much more speculative, as it appears optimal drivers were not used. The ATI cards did not fare as well, but again, take that with a grain of salt.
 
BoddoZerg said:
Althornin said:
DemoCoder said:
Since I can watch nice realistic CG and real-life movies on a crappy NTSC set without "ruining" the realistic look
This is a BS argument.
The fact is, you sit MUCH farther away from your TV set when you watch stuff on it. Thus, the lack of resolution is not nearly as apparent.

So are you willing to say that "Quake 3" at 1600*1200 looks more realistic than watching "The Matrix" DVD on your 480i TV set?

That's ridiculous.

Resolution is important, but it is not everything.
That's not what I said at all.
For the reading impaired :p here is a more in-depth explanation:
The idea that good-quality CGI only requires low resolution is a BS argument. The Matrix only looks good on your crappy NTSC set because you sit 15 feet away from it. Sit 18 inches away from it and tell me that it doesn't ruin the realism.

Good god, BoddoZerg, I don't know where you got the idea I was saying res matters more than graphics quality. It is neither stated directly nor implied.
All I said is plain as day.
 
Uttar said:
Actually, what he's saying, if I understand it right, is that The Matrix @ 1600x1200 on a computer screen would look better than The Matrix on a 480i TV :)


Uttar
Close.
What I was really trying to get across is that the reason CGI on TVs looks good is that you sit so far away from them, so the low resolution isn't as apparent. However, on your computer you are much closer to the display, and so resolution has a much larger impact.
Most of us would agree that 640x480 looks like crap in games, eh?
But do console games look like crap? (hehe) No, because you are so far away, the huge pixels are not apparent.
 
I think it has to do with the differing types of screens (TV vs computer monitor) more than the distance.
 
RussSchultz said:
You'll notice the difference once you've gone HDTV.

It's like AA vs. non-AA.
This is true. My roommate has a 55" HDTV. On that screen, the difference between standard TV and HDTV is astounding and instantly noticeable.

He hasn't hooked his 9800 Pro up to it yet, but I'm looking forward to seeing how that looks.
 
Althornin said:
DemoCoder said:
Since I can watch nice realistic CG and real-life movies on a crappy NTSC set without "ruining" the realistic look
This is a BS argument.
The fact is, you sit MUCH farther away from your TV set when you watch stuff on it. Thus, the lack of resolution is not nearly as apparent.

Balderdash. Are you telling me that if I sit close to an NTSC television set (umm, I have a 100+ inch projection screen, just for your info), NO MATTER WHAT VIDEO IS PLAYING, I will perceive the result as "not real"?

The single most important factor for realism is lighting, period. After that, it's AA; you need at least 64 samples per pixel, and temporal AA to boot, to achieve "film look". Even between FILM and VIDEO, the most important difference is lighting. If your lighting is wrong, your film looks like a crappy Spanish soap opera.

You can run flat-shaded or vertex-lit scenes down to atomic resolution for all I care, and they will still look synthetic.


Yet I can look at any photo taken by a 35mm camera, or a digital one, scanned and scaled down to 640x480 (or worse, 320x200), and it will look perfectly realistic and great.

Trust me, years of looking at VGA porn *close up* makes me an expert.


DVD video is 720x480 (color resolution is half that). VHS is something like 300x360. Are you telling me that if I watch the best CGI films on DVD, broadcast, VHS, or VCD, they're going to look fake? I see no discernible difference in the CGI films I have seen on film in the theater vs at home on DVD/VHS, except for overall quality. By your own reasoning, nothing you could possibly watch on TV could ever look real. (I do not sit "far" from my screen, and my screen fills my entire visual field.)


My experience is precisely the opposite. Low-resolution video or scans add enough visual noise to mask CG "perfection" that they add to the realism rather than detract from it, like running the video through a noise filter.

That's why you can watch crappy QuickTime video from a shaky cam and game cut scenes look amazing, and then when you see the real thing on your PC, it looks artificial and less than impressive.

Try this experiment. Take the best-looking CG you can find. Drop its resolution by 1/4 and run it through an "old film" filter in Adobe Premiere. Now tell me that the original looks less CG-ish.
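For anyone who wants to try that experiment without Premiere, here is a rough Python sketch of the same idea (assuming Pillow and NumPy are installed; the filenames, the 1/4 scale factor, and the grain strength are placeholder values, not anything specified in the thread):

```python
# Rough approximation of the "quarter-res + old film" experiment.
# Assumes Pillow and NumPy; filenames and grain strength are illustrative.
import numpy as np
from PIL import Image

def film_look(path_in, path_out, scale=0.25, grain=8.0):
    img = Image.open(path_in).convert("RGB")
    # Drop the resolution, then scale back up so both versions display
    # at the same size; bilinear keeps the softness of the low-res pass.
    small = img.resize((max(1, int(img.width * scale)),
                        max(1, int(img.height * scale))), Image.BILINEAR)
    soft = small.resize(img.size, Image.BILINEAR)
    # Add monochrome noise as a crude stand-in for film grain.
    arr = np.asarray(soft, dtype=np.float32)
    noise = np.random.normal(0.0, grain, size=arr.shape[:2])[..., None]
    out = np.clip(arr + noise, 0, 255).astype(np.uint8)
    Image.fromarray(out).save(path_out)

film_look("cg_frame.png", "cg_frame_filmlook.png")
```

Put the filtered output next to the pristine original; the point of the experiment is which one reads as less "CG".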
 
Balderdash. Are you telling me that if I sit close to an NTSC television set (umm, I have a 100+ inch projection screen, just for your info), NO MATTER WHAT VIDEO IS PLAYING, I will perceive the result as "not real"?
Yes, that is what I am saying.
I find the huge blocky pixels to look like ass, and to be hyper-unrealistic.
Nice way to get totally defensive and overblown, though.
All I said was the low resolution, when viewed up close, ruins the realism.


Yet I can look at any photo taken by a 35mm camera, or a digital one, scanned and scaled down to 640x480 (or worse, 320x200), and it will look perfectly realistic and great.
No, when viewed full screen and up close, it will look like a blocky, hyper-unrealistic mess.

As for your AA comments, I say balderdash.
AA is simply a hack to get around our limited display device technology. Give me higher res over AA any day. I'd take 1024x768 over 512x384 with 4x AA any day - it looks better and more realistic to me, especially when viewed up close on a monitor. The lack of resolution destroys realism.
 
DemoCoder said:
Trust me, years of looking at VGA porn *close up* makes me an expert.
I trust and agree with you. Just an example of a small, realistic picture of a lady:
[attached image: a_hedstrom_H1_645v_lady_4grid.jpg]
 
Well, clearly you've never tried to produce any CGI or video film. I've spent many an hour trying to achieve "film look" quality. I invite you to do either of the following:

1. Pick a CG scene and render it at high resolution WITHOUT AA or motion blur, at 60 fps (or 120 fps, or whatever you want).

2. Pick the same CG scene and render it at NTSC resolution with AA and motion blur, at 24 fps.

The fact is, whether you like it or not, we are used to the natural AA and noise filtering that film grain provides, and especially the temporal AA. Super-high-fps, high-res footage without any AA looks too clean, too artificial. You could shoot with high-speed film, but the result would look decidedly less "realistic" to anyone who grew up with "realistic" Hollywood film techniques.
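To make the motion-blur half of that concrete, here is a minimal sketch of temporal accumulation; render(t) stands in for a hypothetical renderer that returns one crisp HxWx3 float frame at shutter time t, and nothing here comes from any actual production pipeline. Averaging several sub-frame samples inside a single 1/24 s exposure is what produces the blur:

```python
# Minimal temporal accumulation: average sub-frame samples over one exposure.
# render(t) is a hypothetical callable returning an HxWx3 float image at time t.
import numpy as np

def exposed_frame(render, frame, fps=24.0, subsamples=16):
    t0 = frame / fps
    shutter = 1.0 / fps              # fully open shutter, for simplicity
    acc = None
    for i in range(subsamples):
        t = t0 + shutter * (i + 0.5) / subsamples
        sample = render(t)           # one crisp, temporally aliased image
        acc = sample if acc is None else acc + sample
    return acc / subsamples          # box-filtered motion blur
```

At 60 or 120 fps with subsamples=1 you get the "too clean" look being described; at 24 fps with many subsamples you get the smeared, film-like temporal response.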

I call to the witness stand, Pixar experts:

Rob Russ @ SDSU said:
Hello all,

I've worked mostly in video until now, but I'm preparing to render a big
project for 35mm film.

What experiences/advice can anyone share with me/the world about the
adventures of rendering for film res.

One specific question: I'm trying to decide if rendering at higher res with
less AA is better than rendering at a slightly lower res with more AA. Any
thoughts?? (Please feel free to volunteer any other jewels of wisdom,
though 8)
Coming from my experience doing renderings for print work - more AA and a
higher-quality rendering are more important than excessive resolution once
you reach a certain threshold. You'll probably have to do some testing to
see what works with the film recorder and film stock that you're using.

Once you have moving objects/animation, higher anti-aliasing and jitter
will do far more to suppress temporal artifacts than higher res/lower
anti-aliasing.

Render time difference may end up being a wash, though.
Mark Adams @ Pixar said:
I concur. I'd expect that supersampling methods for antialiasing will be
_far_ less prone to artifacts than resizing an overrendered image. Even if
tricks such as stochastic sampling weren't available, the required resolutions
for matching antialiasing detail would be quite large (assuming your quality
settings aren't set to "poor"). ;) There must be some inefficiency (i.e.,
wasted calculations) in such an approach, as well. "Slightly lower res"
just won't cut it--your overrenders would have to be much larger for you
to match antialiasing results without annoying artifacts.

Didn't we have this discussion already in the NVidia AA vs ATI AA vs 3dfx AA wars? You'd think that 2048x2048 would look way less artifacted than 512x512 with 16x stochastic supersampling, but it all depends on how you choose the samples. To match a good AA algorithm at 512x512, you might have to go to 4096x4096 or 8192x8192 to achieve the same level of artifact removal.
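To illustrate the "it all depends on how you choose the samples" point, here is a toy sketch (not any vendor's actual AA algorithm): for the same per-pixel sample budget, jittered (stochastic) sample positions break up the regular structure that an ordered grid produces, which is where the difference in visible artifacts comes from. shade(x, y) is a hypothetical scene function returning the colour at a sample position:

```python
# Toy comparison of ordered-grid vs jittered (stochastic) supersampling.
# shade(x, y) is a hypothetical scene function; not any real driver's AA.
import numpy as np

def grid_samples(n):
    # n*n ordered-grid offsets inside a unit pixel
    k = (np.arange(n) + 0.5) / n
    return np.array([(x, y) for y in k for x in k])

def jittered_samples(n, rng=np.random.default_rng(0)):
    # same n*n budget, but each sample jittered within its own grid cell
    base = grid_samples(n)
    return base + (rng.random(base.shape) - 0.5) / n

def shade_pixel(shade, px, py, offsets):
    # average the shaded colour over the chosen sample pattern
    vals = [shade(px + dx, py + dy) for dx, dy in offsets]
    return np.mean(vals, axis=0)
```

With a regular grid, aliasing shows up as coherent stair-stepping and shimmer; with jittered samples the same error budget turns into incoherent noise, which the eye forgives far more readily.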

It is the aliasing artifacts that really detract from realism. Even at ultra-high resolutions you will still have flicker and shimmer, and most importantly, without temporal antialiasing you will have temporal artifacts.

I have not seen ANY 1600x1200 PC game, not even Doom3, even APPROACH the level of realism of CGI renders to FILM or VIDEO on DVD or VHS when projected on a large screen TV. Not even CLOSE. And why? Lighting and antialiasing - the big difference between real-time CG and pre-rendered CG.
 
DemoCoder said:
I have not seen ANY 1600x1200 PC game, not even Doom3, even APPROACH the level of realism of CGI renders to FILM or VIDEO on DVD or VHS when projected on a large screen TV. Not even CLOSE. And why? Lighting and antialiasing - the big difference between real-time CG and pre-rendered CG.
Clearly, whether you like it or not, you have missed my point.
I AM NOT COMPARING ANY PC GAME TO SOME CG SEQUENCE.
There, do you get it yet?
Allow me to quote:
All I said was the low resolution, when viewed up close, ruins the realism.
Now, look back to my original comment.
Feel free to read the comments others made, and how I responded to them, so you will realize I AM NOT COMPARING THE LOOKS OF SOME GAME AT HIGH RES TO THE LOOKS OF SOME OTHER SOURCE MATERIAL.

Do you get it yet?


All I said was that your argument:
Since I can watch nice realistic CG and real-life movies on a crappy NTSC set without "ruining" the realistic look
is BS, because you are too far away for the aliasing that IS visible to really bother you. That's it. That's all.
 