Console Performance - Now or Later

Rockster

I often see people stating that unused console power is an advantage, citing it as making a system more future-proof and potent. Take the statement below:

"cvg.com - Monday 3-Sep-2007

We won't see games that really use PS3's full grunt until "four, five or six years down the line", says SCE president Kaz Hirai.

It's going to be a wee while before we clap eyes on games that squeeze every ounce of polygon-powering grunt out of PS3, according to Sony Computer Entertainment president Kaz Hirai."

There is always interest in what percentage of a console's power a particular title is using, as if such a number is even meaningful. And if utilization in a first-gen title is too high, the implication is that the console is in some way weak. To me, this seems crazy and counterintuitive. Why would I buy a console only to have most of it sit idle for years? Why should I play nerfed titles because it's difficult for developers to make efficient use of the hardware resources?

So what is the consensus here? Would you rather have a console whose performance isn't accessible but is capable of quality titles at some point down the line? Or a console whose performance is very accessible, delivering maximum quality from early in its lifespan (although a somewhat shorter one)? Thoughts?
 
It should not have to take four or more years to fully utilize the machine's capabilities. Then again, the end user may be pleased to see the games look better or gain interesting new effects each year, as long as the games that don't fully utilize the machine don't look or play too badly.
One could also ask what "squeeze every ounce"/"full grunt" actually means: is it a 5% improvement, 10%, 30%, etc. over what is presented now, to get as near as possible to the real-world processing power... ;)
 
I think it's as you say, Nebule. If the games get better looking over time, most end users might feel like devs are putting more and more energy into every game. To some extent I can understand that. It's impossible to get everything out of a system from day 1, so if games improve over time you can see, at least to some extent, where devs do and don't put effort into their games.
 
Complex hardware will always have room to grow, not least because of the inventiveness of developers. The Amiga was doing amazing things years after it was past its 'best'. For example, there's a 'clock port' on the motherboard that peripheral developers used to drive all sorts of peripherals, treating it as a generic expansion slot. The designers never intended it to be used that way, but until it was, the Amiga wasn't achieving 100% of its potential.

Furthermore, powerful hardware is often not user friendly, as the investment is in performance rather than developer ease. In the case of Cell, doing away with caches and using local store (LS) is such a trade-off. This means exploitation of the hardware takes time to progress as developers adjust to the programming models. A longer arc to best results is an inevitable part of having better performance, it seems to me.
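To make that trade-off concrete, here is a minimal C sketch of the two programming models. It is illustrative only: the function names and the LS_CHUNK size are made up for the example, and memcpy stands in for the explicit DMA transfers that real Cell code would issue through the SDK's MFC facilities.

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical staging-buffer size, standing in for a slice of an
   SPE's 256 KB local store. */
#define LS_CHUNK 4096

/* Cache-based model: just dereference memory and let the cache
   hierarchy fetch data transparently on a miss. */
float sum_cached(const float *data, size_t n)
{
    float sum = 0.0f;
    for (size_t i = 0; i < n; i++)
        sum += data[i];
    return sum;
}

/* Local-store model: the programmer must stage each chunk of data
   into a small on-chip buffer before touching it. memcpy is a
   stand-in for a DMA transfer; real Cell code would use the MFC
   DMA facilities instead. */
float sum_local_store(const float *data, size_t n)
{
    float ls[LS_CHUNK / sizeof(float)];  /* the "local store" buffer */
    const size_t chunk = LS_CHUNK / sizeof(float);
    float sum = 0.0f;

    for (size_t base = 0; base < n; base += chunk) {
        size_t count = (n - base < chunk) ? (n - base) : chunk;
        memcpy(ls, data + base, count * sizeof(float));  /* "DMA in" */
        for (size_t i = 0; i < count; i++)
            sum += ls[i];
    }
    return sum;
}
```

The second version is more code for the same arithmetic, and a production version would add double buffering so the next chunk transfers while the current one is summed; that kind of restructuring is exactly what developers needed time to internalize.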

The alternative is to go with lower overall performance that is tapped more fully from the start. IMO I'd prefer the better bang for my buck. As long as the initial improvements are notable over the last generation, there is enough immediate reward, plus the promise of keeping the platform competitive with modern technology as it gets older.
 
So what is the consensus here? Would you rather have a console whose performance isn't accessible but is capable of quality titles at some point down the line? Or a console whose performance is very accessible, delivering maximum quality from early in its lifespan (although a somewhat shorter one)? Thoughts?

There is no such thing as "accessible hardware".
What really matters is how it can be used.
You can use the same plain knife to cut sticks and also to make beautiful wood carvings. Does that mean the knife's potential is underused?
If people can find lots of inventive ways to use the hardware, it's good hardware.
 
The thing is, five years from now Microsoft will have released its next machine...

The development cycle of a modern game is very close to five years, which means no developer wants to target a platform that changes faster than their cycle.
 
I much prefer an easier-to-develop-for console (PS1, Dreamcast, Xbox, Xbox 360), as they generally get good results right away, better ones down the road, and allow devs more freedom because they aren't fighting with the console to get their game out the door.

If devs have a fixed amount of time, I'd rather the majority of it be spent on the game, not on fighting the machine to get the results they want.

For as short a lifecycle as the DC had, look at all the great-looking, great-running games that came out on it.

Devs were also up to speed quickly on the PS1 and were cranking out quality titles in very short order.

The original Xbox wasn't really taken seriously, but most of its titles looked good and didn't cost much to produce.

If it costs devs less time, it costs them less money to produce. That likely translates into more software, of better quality. The deeper, more complex consoles generally see only a handful of titles that take advantage of their potential over their lifetime.

See:
PS1 vs. Saturn
Xbox vs. PS2
 
I love the way games get better on consoles over time. It feels like you go through the whole cycle of upgrading your hardware, except you're not spending any money. :) Obviously, it's not the only factor that matters, but it does extend a console's lifespan a little - there's room to explore and innovate. True, that can be done in different ways, but generally it leads to some great and interesting innovations. It may seem trivial now, but eventually the thing that will keep the 360 in the running is its graphics chip, for instance, which isn't the easiest part of the system to master but is tellingly already by far the most interesting (imo, anyway).

I think a common mistake is to assume that because something is difficult to make the most of, it must be simply too complex, and that getting anything out of it at all is very hard. Really it's just that you have greater flexibility: you can do a lot of things in several different ways, and there are going to be better ways of doing things that are not immediately obvious. That is not to say that the Cell processor can give you great results in a short period of time, nor that the RSX is extremely difficult. But the fact that you can combine the two to work together in interesting new ways does offer potential that may take a while to be fully tapped. Same with Sixaxis, or Blu-ray storage. It may take a while before they're used in a way that really makes a game stand out, but so what? It will happen eventually.
 
I love the way games get better on consoles over time. It feels like you go through the whole cycle of upgrading your hardware, except you're not spending any money. :)...

That sentiment is widely shared, and it's the reason for my thread. It's just that I feel the opposite. I feel like I was ripped off with sub-par games that weren't able to properly make use of the hardware because the design was simply too difficult for developers to come to grips with quickly. It's not that things won't or shouldn't improve over time; they obviously will with any platform. It's the early adopters who I think get the shaft. They pay more, only to wait for the promises to come to fruition. At that rate you might as well wait for the library and the devs to come up to snuff, and get the platform after a few price drops. I just can't come to grips with the idea of idle hardware resources being in vogue.
 
Did the games please you? If they were fun and gave you entertainment, they did their job. If someone then improves on what they can do, does that detract from the experience? e.g. Woking FC play a few matches quite well, and I get all excited and cheer them on. Then midseason they pick their game up and play even better. Does that mean I was being ripped off in all those previous matches? I'd have thought things improving over time would be a good thing! 'Too difficult for developers' is a subjective term. People often need pushing to get the best out of them, stretching them to achieve better. Isn't it bad enough that all the developers we have these days are as lazy as it is, without making things easier for them and making them even lazier?! :p
 
That sentiment is widely shared, and it's the reason for my thread. It's just that I feel the opposite. I feel like I was ripped off with sub-par games that weren't able to properly make use of the hardware because the design was simply too difficult for developers to come to grips with quickly. It's not that things won't or shouldn't improve over time; they obviously will with any platform. It's the early adopters who I think get the shaft. They pay more, only to wait for the promises to come to fruition. At that rate you might as well wait for the library and the devs to come up to snuff, and get the platform after a few price drops. I just can't come to grips with the idea of idle hardware resources being in vogue.

Ok, but for someone like you who doesn't have the patience to keep his mind focused past the first two sentences of an already short post, I don't think any console is going to do. :rolleyes:
 
Promises are great n' all, but show me the money!!! After 4-5 years, hell, I'm ready for the next best thing, not waiting on promises to be delivered.

People, myself included, quickly forget the past and focus on the present. And that's ok. Why? Because the present is what they're trying to sell me. I judge games and consoles on their offerings at the current time. I expect things to improve over time. That's a given.
 
I'm interested in a balance of the two that leans more towards now than later. I have no interest in paying a high premium for a device that will not be effectively leveraged during its life cycle. At the same time, I would hope any console I'd buy would be forward thinking enough that there would be some improvement in the game experience over time.
 
I think it matters little that a console has room to grow over time. The fact that PS3 is at least matching its competitor's performance now, while still having room to grow, bodes well for its future.
 
I think it matters little that a console has room to grow over time. The fact that PS3 is at least matching its competitor's performance now, while still having room to grow, bodes well for its future.

Oh, I am sure the Xbox 360 still has room to grow too! ;)
 
Um, all the consoles, yes, even the Wii, will continue to improve graphically for as long as games are developed for them. Even if we still had X360s, PS3s and Wiis 10 years from now, there would always be room for improvement (of course, the improvement becomes less and less noticeable).

For me, whatever improvement is left in the consoles is so small anyway that I don't really care about it.

Especially not when I'm also going to own a high-end PC along with the consoles:

PS3/X360 are going to look terrible in 2-3 years in comparison to my gaming PC anyway...
 
There's not much difference between exotic silicon and poor development tools. Both lead to the same outcome of games showing great improvement over the lifespan of the console.

The main factor in pushing every last drop out of a console is financial. Systems whose games sold very, very well got milked technically, not because they had some magical engineering behind them.

The PS1 and NES are the best examples. Systems like the SMS and Saturn had the potential to show some amazing things if given the same opportunity.

Personally, consoles start to lose my interest after 3 years of their life. Consoles have traditionally only lasted 4-5 years; it's just that these days manufacturers release replacements earlier rather than milking the old tech. I mean, who was playing the Genesis in 1995 or the SNES in 1997? Not me.
 
For me, whatever improvement is left in the consoles is so small anyway that I don't really care about it.

No! There are several games that represent a performance jump that I would not describe as small.

Donkey Kong Country
Street Fighter 2 (SNES)
Yoshi's Island
Sega Rally/VF2 (Saturn)
Jak and Daxter
Shadow of the Colossus
Wipeout XL
Crash Bandicoot
Sonic the Hedgehog

And many many more, of course. All games that made me say "wow, I didn't realize this thing could even do that..."
 
There's always a point where developers overcome the learning curve on a system, and then the only "improvements" come from choices in art direction or from sacrificing performance, like Perfect Dark on the N64 and Shadow of the Colossus on the PS2. There were tons of games near the end of the SNES's life that had improved graphics that came with huge slowdowns. It's not that those graphics couldn't have been achieved earlier; the knowledge was there, developers just chose performance first. Near the end, performance always gets dropped because devs need to sacrifice something to make a game that looks interesting to a buyer, and not the same thing they did last time.
 