Archie-
Then again I doubt the average consumer will care... Just the tech enthusiasts...
We are pretty much the only people that care now, and I don't expect that to change. Coming into the next gen, those with 1080i sets would have to be pretty much blind not to notice the difference, I would assume (provided they realize they need a ~$20 connector for their setup).
Randy-
Is this truly what we are proposing here? Is it actually feasible given the timeline?
It's pretty much a given. The XBox already supports 1080i and AA methods that, stacked, could already come close if it had the fill and, more importantly, the bandwidth to handle it. For the timeframe that the next consoles hit, I think everyone would be surprised if they all didn't support 1920x1080i natively, likely with AA.
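To put a rough number on what 1080i asks for, here's a back-of-envelope fill calculation; the overdraw figure and AA sample count are assumptions of mine for illustration, not anything from the thread:

```python
# Back-of-envelope fill requirement for 1920x1080i output, assuming the
# interlaced display draws one 1920x540 field every 1/60th of a second.
FIELD_W, FIELD_H = 1920, 540   # one interlaced field
FIELDS_PER_SEC = 60
OVERDRAW = 3                   # assumed average depth complexity
AA_SAMPLES = 4                 # assumed supersampling factor

pixels_per_sec = FIELD_W * FIELD_H * FIELDS_PER_SEC * OVERDRAW
print(f"{pixels_per_sec / 1e6:.0f} Mpixels/s")                 # ~187 before AA
print(f"{pixels_per_sec * AA_SAMPLES / 1e6:.0f} with 4x AA")   # ~746
```

Numbers like these are exactly why the fill and bandwidth caveat matters: the resolution itself is modest, but AA multiplies the cost quickly.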
Zidane-
So u think it needed a GScube(16ps2s!!!) for the rez and AA....rrrrrrrriiiiiiggghhhhttttt......
For actual usability? Yes, absolutely.
Marco-
You never have tearing with VSync off???? How is that even possible? Every time the game's framerate output desynchronizes from your monitor's refresh (and that's pretty much all the time) you get image tears. I see them all the time.
Keep the refresh @100Hz and the noticeable tearing isn't going to be that bad the overwhelming majority of the time (spinning around in an FPS is about the worst-case scenario).
I just think it has what it takes to be impressive, at least at its starting level.
Perhaps I didn't explain exactly how highly they thought of the game. The guys I was talking to made it sound like it was one of the greatest ever made. I brought up the negatives more than anything due to their reaction to the game, which I found to be decidedly mediocre. Compared to Halo, JKII, Metroid and the like, MOH is a very poor title IMO.
I would say it's *somewhat* better looking. As some people said already, why include Mafia in this discussion to begin with? It's not like it's a game you will run on a GF1 config and be satisfied with.
Mafia runs quite nicely on a GF1; in fact, it runs quite nicely on an integrated nForce 220 (the single-channel one, slower than a GF1) at 640x480.
Faf-
Btw, regarding VSync, I see flickering of the image at 60-70Hz refresh (it gives me headaches rather quickly too), and tearing is a heck of a lot more obvious than that. It's a matter of personal preference no doubt, but it's very bothersome to some.
Once in a while I notice it. Most people game with the refresh @60Hz, where it is a rather serious, pervasive issue. Up the refresh rate high enough and it is significantly reduced.
Yes. In theory, you're looking at a 5:1 difference in fillrate alone for volumes - which are fill limited pretty much always.
Going by practical tests with volumes on a GF2 in our own app, I'd give it an even more definite nod, since the real-world gap was considerably larger than that theoretical difference.
How much fill would you need? Running 640x480 with 4x overdraw, you have enough fill for six passes on a GF1 at 60FPS, twelve @30. The GF1 is nowhere near as imbalanced as the GF2 is: the GF1 has 91% of the bandwidth and 30% of the MTexel fill of the GTS. The GF1 actually had a very good balance to it.
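The six-pass figure falls out of simple arithmetic; a quick sketch, assuming the GF1's theoretical peak of ~480 Mpixel/s (4 pipes x 120MHz - real-world fill would be lower):

```python
# Fill-rate budget sketch for a GeForce 256 (GF1).
PEAK_FILL = 480e6           # pixels/second, theoretical GF1 peak (assumed)
WIDTH, HEIGHT = 640, 480
OVERDRAW = 4                # average depth complexity from the post

def passes_per_frame(fps):
    """How many full-screen passes fit in one frame's fill budget."""
    pixels_per_pass = WIDTH * HEIGHT * OVERDRAW   # 1,228,800 px
    return (PEAK_FILL / fps) / pixels_per_pass

print(f"{passes_per_frame(60):.1f}")   # ~6.5 passes at 60FPS
print(f"{passes_per_frame(30):.1f}")   # ~13.0 passes at 30FPS
```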
V3-
I didn't say to enable VSync, I just said to cap the fps at some level, like VSync does. But does the monitor actually display 59 fps? Or is 59 just an internal thing? Or do you get 30Hz on your monitor anyway in this case?
If your gfx card was drawing a frame every 1/59th of a second and you had VSync on, you would output 30FPS (that is what your monitor would display). If you had VSync off you would output 59FPS (that is what your monitor would display, although some frames would be offset by a varying amount depending on how much movement had occurred; that offset is the tearing we are discussing).
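The drop to 30FPS comes from the swap having to wait for the next vertical blank. A toy model of that quantization (idealized, assuming a fixed 60Hz refresh and a constant render time):

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def displayed_fps(render_time, vsync):
    if not vsync:
        # Frames are scanned out as soon as they finish; swaps that land
        # mid-refresh are what produce the tear line.
        return 1.0 / render_time
    # With VSync, the swap waits for the next vertical blank, so the
    # effective frame time rounds UP to a whole number of refresh periods.
    intervals = math.ceil(render_time / REFRESH_INTERVAL)
    return REFRESH_HZ / intervals

print(displayed_fps(1 / 59, vsync=True))    # 30.0 - misses every other vblank
print(round(displayed_fps(1 / 59, vsync=False)))  # 59 - but with tearing
```

A frame time of 1/59s just barely overshoots the 1/60s refresh period, so every frame misses its vblank and waits a full extra refresh - hence 30, not 59.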
Anyway, by capping the fps, your average would be more inline with the gaming quality you get.
Thing is, it depends on how the app is capped. Most rely on VSync, which has the problem I mentioned above. If you cap it per second, you run into problems if you can push 1000FPS: all your frames for that second would be drawn in the first tenth of a second, with no updates for the rest of it. The only good way to do it is to limit how fast a given frame can be drawn. The problem with this is that the one time you need the highest framerate in a PC game, while spinning around in an FPS, is going to be directly and negatively impacted in a sizeable fashion no matter how you implement the cap.
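A per-frame limiter of the kind I mean looks roughly like this; a minimal sketch, where the 100FPS target and the sleep-based wait are my own illustrative choices:

```python
import time

TARGET_FPS = 100
MIN_FRAME_TIME = 1.0 / TARGET_FPS   # no frame may finish faster than this

def limited_frame(render):
    # Per-frame cap: time the frame, then sleep off any surplus, so frames
    # are spaced evenly instead of bursting like a naive per-second cap.
    start = time.perf_counter()
    render()
    elapsed = time.perf_counter() - start
    if elapsed < MIN_FRAME_TIME:
        time.sleep(MIN_FRAME_TIME - elapsed)

# Even a no-op "render" is now held to MIN_FRAME_TIME pacing:
start = time.perf_counter()
for _ in range(10):
    limited_frame(lambda: None)
print(f"{time.perf_counter() - start:.2f}s for 10 frames")
```

Note the downside described above is visible here too: the limiter spends time sleeping precisely when the card could have delivered extra frames.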
Anyway, by capping the fps, your average would be more inline with the gaming quality you get.
Depends. 'Crusher' was an excellent test that showed the actual framerates you would see under real-world gaming; enabling VSync there gave you numbers that were simply too low to be viable. Other benches that have a lot of 'slow time' in them aren't very good at giving you a reasonable score. Check out UT2K3's 'Flyby' scores vs 'Botmatch': one (Flyby) is completely unrealistic and would be more in line with VSync enabled, while the other (Botmatch) is perfectly reasonable without VSync and gives you a good indicator of what to expect in game.