Still a lot of juice left in the current gen hardware

This is from John Davison's blog at 1UP.com

John's blog

If you can find the time to wander at E3, it can often net some of the best results. Today I was determined to find something to rave about at the Xbox booth.

Here's an observation from my wanderings: in front of me was Sega's Full Auto running on a 360 Alpha kit, and behind me was Burnout Revenge running on Xbox. Unless you look really closely, it's actually quite hard to tell which one is running on next-gen hardware. Full Auto is basically Burnout-with-guns, and the two of them have a lot in common: speed, explosions, lots of particle effects. The fact that two generations of game can look so similar is probably symptomatic of a number of things - Full Auto's earliness, Criterion's awesomeness, the 360 Alpha kit's inadequacies. Regardless, it's interesting to see the two games next to each other.

Criterion's Alex Ward told me as much on Wednesday. "You'll never, ever see what the Xbox is really capable of," he confessed. "No-one's anywhere near tapping into the power of the thing, and it's being replaced already. We're pushing the PS2 at about 60% with our stuff, and I bet we're not close to that with Xbox."

The SCEE Performance Analyser study at the end of 2003 said as much then. The best PS2 game at the time was pushing 8.7 million polygons / sec.

Performance Analyser paper

The PS2's theoretical maximum performance with all effects on is approximately 16 million polygons/second, according to the original spec sheet. The Xbox is around 30+ million polygons/second.
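As a rough back-of-the-envelope illustration (my own arithmetic, not from either spec sheet), those peak figures translate into per-frame polygon budgets like this:

```python
# Per-frame polygon budgets implied by the peak figures quoted above.
# The throughput numbers come from the post; the frame rates are just
# common target rates, picked for illustration.

PS2_PEAK = 16_000_000   # polys/sec with all effects on (spec sheet figure)
XBOX_PEAK = 30_000_000  # polys/sec, the approximate figure quoted above

for name, peak in (("PS2", PS2_PEAK), ("Xbox", XBOX_PEAK)):
    for fps in (30, 60):
        print(f"{name} @ {fps} fps: ~{peak // fps:,} polygons/frame")
```

Even at the theoretical peak, that's roughly 266k polygons per frame for the PS2 at 60 fps versus 500k for the Xbox, which lines up with the roughly 2x gap described here.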

Look at a game like Burnout 3, which pushes the PS2 very hard: the Xbox matches it effortlessly. Now, hearing Alex say that even Criterion are only getting 60% of the performance out of the PS2, and even less out of the Xbox, makes you think just how much juice there is left in the current-gen hardware.

It's a shame that these systems won't be pushed as hard as they can before they are replaced.
 
This is true, but has probably always been the case. It is hard to truly maximise the usage of a bit of hardware, and even harder to do so in a way that is actually *obvious* to people (look at the current "debate" on how games don't match up to tech-demos on PS2...)

However, the question is how useful it is to show people how much more performance could be extracted from a piece of hardware, when a newer machine could achieve the same - or, more likely, significantly better - results with much less effort.

It is generally accepted that the average performance of a piece of hardware will be quite a bit less than its maximum. The closer you want to get to the upper limit, the more effort is required, and diminishing returns dictate that sooner or later commercial efforts hit a glass ceiling. Also bear in mind that once a platform reaches some kind of saturation point in terms of interest and sales, minor technical improvements in software will not have any kind of serious impact commercially.

So it is of interest to us technical people, but of very little interest to the average consumer or publisher. Since people are willing to sporadically part with cash for a perceived jump in their experience, it will always be the case that manufacturers regularly bring out new machines regardless of how tapped or untapped their current platforms are.

There are still people producing new demos for Vic-20s and the like...
 
Actually, I dispute the numbers in that PS2 presentation anyway... other SCE presentations show higher numbers, and having had personal experience of running my game through the PA, I know it did better than 8.7 Mp/s in most common scenes. And it certainly wasn't the best game on PS2, for performance or otherwise! It was out well before this presentation, so I guess the sample set used here was small and possibly (mysteriously) didn't include the best titles (or mine!).

The theoretical max you mention is true, but some effects on PS2 are not worth turning on - fogging for example is madness, because it halves the fillrate and the same effect can be achieved much faster and more easily using other techniques. Polygon-edge AA is actually *broken* and shouldn't even be used. The max for textured shaded polys is 37.5M and it only drops to 30M for turning on the unnecessary fog feature.

That makes the 8.7M seem even lower - however as I've said, I know this value is somewhat low for reality.

I've seen games on the PA that beat 16 Mp/s in some circumstances, and while they don't sustain that for any length of time, it's certainly not a case of the PS2 hardware being maxed out - more a case of the game running average scenes and doing other processing instead of spending all its time on graphics...
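Putting the measured and theoretical figures side by side (a quick sanity check of my own, not something from the SCEE paper):

```python
# Fraction of the PS2's quoted peaks that the 8.7 Mp/s measurement represents.
measured = 8_700_000        # best game in the SCEE study, polys/sec
peak_textured = 37_500_000  # textured, shaded polygon peak quoted above
peak_fogged = 30_000_000    # same, with the fog feature enabled

print(f"vs textured/shaded peak: {measured / peak_textured:.0%}")  # 23%
print(f"vs fogged peak:          {measured / peak_fogged:.0%}")    # 29%
```

By that measure, the best title in the study was hitting less than a quarter of the quoted textured/shaded peak - consistent with the "plenty of juice left" reading, even allowing for the 8.7M figure being on the low side.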
 
The max for textured shaded polys is 37.5M and it only drops to 30M for turning on the unnecessary fog feature.

That makes the 8.7M seem even lower - however as I've said, I know this value is somewhat low for reality.

Don't forget the lights, which is probably why you see 8.7M instead of 37.5M. ;)
 
Run Ratchet and Clank or any well-liked later game through the PA and you'll see they push even fewer polygons.

Most of the later PS2 games push fewer polygons than the early ones. To be fair, a lot of that is just that early on people put all of their textures in VRAM and didn't have to fight the textures for the bus.

I hate it when developers or anyone else throw numbers like 60% utilisation out there. It's based on nothing; clearly they are using 100% of something, or they'd have put more stuff in the game.
 
So this is why you get slowdowns in games like Fable and Jade Empire.

If you ask me, they passed the limits of what these consoles can do quite a while ago, and that's exactly why the next gens are coming out soon.
 
To clarify, I'm not really disputing the original point.

There is probably more to be gotten out of the hardware this generation - Xbox in particular, since it's usually relegated to running code and assets that work on PS2.

There is just no way to put a number on it.
 
Realistically, "programming limits" and, more to the point, "publisher investment and timetable limits" are hit long before the hardware limits of the machine. But whose fault is this? Is it ANYONE'S "fault", or is it simply a matter of course in the industry?

I'd be curious to see how a game like Halo 2 compares to Burnout 3. (Or more likely PGR2, I guess, to keep the genres closer.) Basically, the idea is to remove "multi-platform contamination" and see how an Xbox-only title compares to a good-looking cross-platform game like Burnout 3 (especially since it was mentioned by the guy directly), and see what is "effortless" and what is simply the way developers end up programming on their respective platforms.

Heck, I'd also like to see similar looks at previous generations, to see how hard games pushed "the max" on their consoles all the way back to the 2600, but we don't seem to have the equipment for it. :p I do believe it's just more "the way things are done to get by" in a world filled with spiraling development costs, shrinking timetables, and all that jazz.

...and it will only be happening again next generation.
 
ERP said:
There is probably more to be gotten out of the hardware this generation, Xbox in particlular since it's usually relegated to running code and assets that work on PS2.

I don't know, I think games like Doom 3, Half-Life 2 and Halo 2 are really pushing the Xbox hard.
 
I don't understand this "pushing hardware". Is there some relation to framerate? :?
 
Alstrong said:
I don't understand this "pushing hardware". Is there some relation to framerate? :?
Nah, it's most likely about the speed of the code and what they could put on screen. If they had 10 characters on screen at once, then perhaps with more time they feel they could have 18 on screen at once with the same or better visuals.
 
I think the one console that was truly maxed out by the end of its life was the SNES. It was doing some ridiculous things graphically by the time the N64 came around that everyone thought were absolutely impossible on 16-bit systems.
 
Natoma said:
I think the one console that was truly maxed out by the end of its life was the SNES. It was doing some ridiculous things graphically by the time the N64 came around that everyone thought were absolutely impossible on 16-bit systems.

True, but it's not the same with the older systems, since ROM cart sizes kept getting bigger during their cycles. The devs in 1995 had more memory space to work with than devs in 1990. Now that isn't such an issue. I think the most milked system was the PS1 as far as the quality arc went. I'd throw the Neo Geo in there somewhere also.
 
Natoma said:
I think the one console that was truly maxed out by the end of its life was the SNES. It was doing some ridiculous things graphically by the time the N64 came around that everyone thought were absolutely impossible on 16-bit systems.

Well, the SNES had enhancement chips in the cartridges, like the Super FX, which helped it.

Anyway, I think the NES was pretty strained by the time the Super NES came out.
 
They used a computer graphics technique called ACM. It wasn't Super FX. Basically, I think it was SGI-rendered and then downsampled as much as possible to run on the SNES. That's what gave it its unique look.
 