V3-
When benchmarking, I think it's better to cap the FPS at some sync level or playable framerate (30 to 120 FPS); that way the average won't be misleading. If the hardware is good enough for the game, then the average should be at that cap level. If it is below, then there is fluctuation. This is just from a game-playing perspective, IMO.
Say you cap the framerate at 60 FPS synched and the system you are testing is capable of drawing every single frame in 1/59th of a second: your framerate ends up at 30. There really is a good reason that VSync is disabled when testing (and for most people here at least, when playing).
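That quantization is easy to put numbers on: under VSync, a frame that takes even a hair longer than one refresh interval has to wait for the next vertical blank. A minimal sketch (the function name and the epsilon tolerance are my own, purely illustrative):

```python
import math

def vsynced_fps(frame_time: float, refresh_hz: int = 60) -> float:
    """Effective framerate when every frame must wait for the next vblank."""
    interval = 1.0 / refresh_hz
    # How many refresh intervals the frame occupies, rounded up.
    # The small epsilon keeps an exact 1/60 s frame from rounding up to 2.
    intervals = math.ceil(frame_time / interval - 1e-9)
    return refresh_hz / intervals

print(vsynced_fps(1 / 60))  # hits every vblank -> 60.0
print(vsynced_fps(1 / 59))  # just misses each vblank -> 30.0
```

So a card that is 98% fast enough for 60 FPS benches at exactly 30 with VSync on, which is why the cap makes averages lie rather than tell the truth.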
Randy-
If they are unable to spot them, then they aren't large differences, indeed- just noticeable to those who are trained to look for a certain thing.
Casuals chew up crap games and rave about them all the time (was talking to some guys at work who were blown away by how 'great' MOH Frontline was). Casuals notice only a minor difference between running games at 640x480 and 1600x1200. They definitely miss a lot of big differences.
Yes, try convincing your IT manager to equip an entire department from Alienware...
Most of our machines start off life as Compaqs (ProLiants), and are then overhauled prior to being placed on the network (RAM is usually upgraded, the HD formatted, and a proper/custom OS install done). We have about one IT guy for every three hundred computers at the company I work for, and most of their time is spent dealing with the POS OS2 machines (old junk that needs to be replaced) and the Linux ones (new custom IBM on their own sub-WAN, constant server issues; last time they go with Big Blue). The Windows PCs don't have any real problems.
...and don't install anything that could remotely make your computer useful for something. Usually all goes well, but once you do, certainly all bets are off.
Our machines at work have all the software installed that they need. Our newest addition is actually over a year old, a Win2K Compaq server (Compaq actually makes some nice servers); it hasn't gone down or had any issues yet. It doesn't even have a KB/mouse or monitor hooked up, although that is the norm for most of our servers (except the OS2 boxes; they crash all the time, once a month at least).
That would seem to reinforce that you really need to be a somewhat smart dude to keep your PC in the air. I don't think the people you find here or on a dedicated PC topic site are dummies (even then, there is usually a fairly high traffic PC Help forum on such a website, so what does that say?). ...but they certainly don't represent the average joe on their home PC, either.
Buy an Alienware and it is all taken care of for you.
They're all buzzwords. I'm not saying they aren't truly useful for something (especially where presentation on a computer monitor is concerned), but certainly they are trotted out more often than necessary- thus they become buzzwords.
Trilinear? We are talking about a 1998 feature that eliminates the very noticeable and annoying mip banding artifacts. Every current piece of hardware supports it one way or another. The problem is that some hardware doesn't support it in a 'friendly' fashion.
There's a big difference between nVidia listing off bulletpoint features explicitly and said bulletpoints not having an easy analogue on a certain Sony-designed part. You could either conclude that said features cannot be done at all, or that it simply isn't documented what effects are and are not possible.
The GS can do trilinear, and for conversations on what exactly the PS2 can and can't do, Faf is here to set us straight whenever we have something confused on that end.
The wise stance is simply not to make such a simple comparison between a PS2 GS and a GF1 on bulletpoints alone, unless your actual goal is to purposely introduce FUD.
FUD here? The average IQ of these boards is likely in the 150 range, and pretty much everyone here has a decent amount more than a basic understanding of what is what in the 3D market (actually, far more than basic for the overwhelming majority). You say anything out of line here and you will be called on it. Simply look at this thread.
It's all subjective, ultimately, if you are evaluating said titles on overall presentation, not just noting if you can see xyz feature being used or not.
Given this is a discussion on the tech end, I am discussing things on the tech end.
It's more of a question of "is someone willing to rewrite the code completely around a PS2", not "does JC think his game can be done on PS2". You should know that, but it is easier to believe the latter, of course. If it weren't for this one little game, you would just be searching for another game to conveniently make such a claim.
I don't think so. I use Doom3 because it is pretty much the worst-case scenario for what the PS2 was designed for, and it happens to be the best looking title on the horizon (not to mention Carmack has explicitly stated that the game was built around what was possible with a GF1).
Marco-
Art, style and polish are more important and effective looking than extra effects to many people.
I agree, though given they are both at the same level, the XB will come out ahead.
Sure the game is not antialiased like these caps, but the practice of releasing supersampled shots is nothing new.
And when those screenshots show up, they are dismissed as not being real time. This was the case for Rogue Leader and it was the case for PanzerDragoon, it has nothing to do with it being a PS2 game.
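For anyone wondering what the "supersampled shots" stunt actually is: render at a multiple of the target resolution, then box-filter down, so hard edges come out pre-smoothed. A toy grayscale sketch (the helper name and sample image are my own, purely illustrative):

```python
def downsample_box(pixels: list[list[float]], factor: int) -> list[list[float]]:
    """Average factor x factor blocks of a grayscale image (list of rows)."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [pixels[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white edge rendered at 2x the target size picks up
# in-between gray values when averaged down -- the clean, antialiased
# look of a press shot the actual game never shows.
hi_res = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [0, 1, 1, 1]]
print(downsample_box(hi_res, 2))  # [[0.0, 1.0], [0.5, 1.0]]
```

The hardware never has to sustain that supersampled resolution in real time, which is exactly why such shots get called out regardless of platform.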
Faf-
There's a 2x T&L speed increase there too.
Not to mention that GF2 will benefit a lot from lower resolutions too - both cards are just pathetic in terms of effective fillrate.
Core clock has no noticeable impact at low res in Mafia. I've run my GF2 (Gainward Pro450) up to 230 and down to 150 with no noticeable difference.
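A back-of-envelope check on why core clock barely matters at low resolution: compute the theoretical pixel fillrate and the fill-limited FPS ceiling it implies. The clock/pipeline figures below are the commonly cited paper specs for these cards, and the 3x overdraw factor is an assumption; effective fillrate is lower still once memory bandwidth enters the picture:

```python
def theoretical_fillrate_mpix(core_mhz: int, pixel_pipes: int) -> int:
    """Theoretical pixel fillrate in megapixels per second."""
    return core_mhz * pixel_pipes

def fill_limited_fps(fill_mpix: int, width: int, height: int,
                     overdraw: float = 3.0) -> float:
    """FPS ceiling if fillrate were the only limit, at a given overdraw."""
    return fill_mpix * 1_000_000 / (width * height * overdraw)

gf1 = theoretical_fillrate_mpix(120, 4)  # GeForce 256: ~480 Mpix/s
gf2 = theoretical_fillrate_mpix(200, 4)  # GeForce2 GTS: ~800 Mpix/s

# At 640x480 neither card is anywhere near fill-limited, so core clock
# changes vanish into other bottlenecks; at 1600x1200 both run dry fast.
print(round(fill_limited_fps(gf1, 640, 480)))    # ~521
print(round(fill_limited_fps(gf1, 1600, 1200)))  # ~83
```

Which matches the observation: at 640x480 the fill ceiling is hundreds of FPS, so dropping the core from 230 to 150 just moves a limit the game never touches.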
It should still run a lot better than on a GF1, especially if the GF1 was running it on a 300MHz P3, to even the playing field.
Well, if you paired it with a Pentium 2 300 (they never made a P3 300) then obviously that would change things. How would you go about getting all of the shader effects in D3 to run on the PS2? Very interested to hear that.
I mean, you must have an argument to support the assertion if you're so sure about it (other than high resolution, which has been used countless times for promotional purposes on all consoles while still using realtime imagery).
AA. As I already mentioned, neither RL nor PD got a pass when F5 or Sega pulled the same stunt.
Zidane-
Now I've heard of PS2 ingame models up to about 20000 at 60fps, and I've seen some really clean IQ out of it... To truly be SIGNIFICANTLY ahead in this area I'd expect at least a severalfold increase in geometry (heck, if we see Soul Calibur... and games with a 10X geometry increase are said to only look marginally better...), and some nasty good IQ...
The entire package, not just the poly counts. You can go ahead and check; I've never argued that the PS2 isn't very strong in geometry throughput.
Now as many have said, true, the rez and the AA will not be up to par with this image... but if we're running at 640x rez on the GF1 to get perf... I guess it's only fair this be the case for the PS2 too...
640x480 with no AA is natural for the PS2; that is why I said to use it for the GF1. Show me the PS2 running the title at 640x480 with 6x-8x AA in real time and I'll admit I was wrong; hell, I'll go buy a PS2 and the game too.
Phil-
Well, we weren't exactly comparing GTA3 as this game certainly does not do the hardware any justice.
It was a PS2 native game ported to the PC, where it runs better on GF1 hardware.
If anything, we were comparing The Getaway to Mafia, and I still think that comparing the outside locations, these games are equally impressive taking everything into account (from what I have seen and played, IMO).
I don't see it. The poly counts are comparable, that's all I see that is on equal ground.
ChryZ-
Could someone explain to me why we're comparing the GS vs the GF1?
Design philosophy. BTW- The Register ranks with the Enquirer in terms of credibility- next to none.