"Taming The Dragon: Next-Generation Asset Creation for PS3" (Some Lair info)


TrungGap said:
Let's consider a car. Can you imagine the variation in the parts that go into a car? Now imagine the fault tolerance the car needs in order to pass QA. The variation between different cars of the exact same model is considered huge. So when a car's gas mileage is rated for the model, you're going to get a fairly wide variation, and then you have to factor in the driver. Still, you're not going to get a car rated at 30 MPG that only gives you 15 MPG, or a 15 MPG car suddenly getting 30 MPG for another driver.

You are correct to say that the fault tolerance of an automobile is large. That is precisely why engine management is designed to run well below the performance limits of the individual components in the system, yet even so a performance variation of 5 to 10% can still occur.

Now imagine such a production automobile pushed to the limits of its components with advanced engine-management software, as would occur in Formula 1 racing (like a closed-box development environment), and you will see how performance can become unpredictable at the limits of the components. Of course a console is manufactured much more precisely than a car, but a console is also much, much more complicated and made of many more parts.

TrungGap said:
Yes, you are correct in your assumption that there is variation. But it's not going to be to the point where it matters, because if that were the case, developers would have factored it in when they designed the game/product.

Slowdown only occurs at the limit of the performance envelope, not in situations away from that limit, so it is apparent that developers write software that occasionally sits at the limit of the hardware. In situations where components are not stressed, performance variation between similar systems may not be noticeable, but at the limit of such systems, where many components are stressed simultaneously, unpredictability can occur. Remember, slowdown is only a small fraction of the total running time of a game.
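
To make that concrete, here is a toy sketch (my own illustration, nothing from any actual game): a fixed frame budget hides small timing differences entirely until one frame's workload crosses the budget, at which point the slowdown suddenly becomes visible.

[code]
// Toy fixed-budget game loop. The workload numbers are made up:
// frame 3 stands in for a hypothetical "stress" frame past the budget.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using Clock = std::chrono::steady_clock;
    const auto budget = std::chrono::milliseconds(33); // ~30 fps target

    for (int frame = 0; frame < 5; ++frame) {
        auto start = Clock::now();

        // Simulated per-frame work: well inside the budget except frame 3.
        auto work = (frame == 3) ? std::chrono::milliseconds(45)
                                 : std::chrono::milliseconds(20);
        std::this_thread::sleep_for(work);

        auto elapsed = Clock::now() - start;
        if (elapsed > budget)
            std::printf("frame %d: over budget -> visible slowdown\n", frame);
        else
            std::this_thread::sleep_for(budget - elapsed); // idle headroom hides variation
    }
}
[/code]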

If some experienced reviewers see slowdown and others see none, then for a very small fraction of total gameplay something is different. That is likely to be due to a difference in hardware or disc. Hardware age, abuse, or manufacturing differences can cause this. Did you know that an unnoticed momentary change in disc read speed can result in slowdown? Or that a small flaw in a RAM chip can cause a performance difference in situations where the RAM is stressed? Those are just two examples.
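
Here is a second toy model (my own hypothetical numbers, not measurements): the same small spread in disc read latency between two drives is invisible on a light frame, but turns into slowdown on one machine and not the other once the rest of the frame is already near the limit.

[code]
// Hypothetical numbers only: a 2 ms spread between two drives matters
// only when the rest of the frame has consumed the headroom.
#include <cstdio>

int main() {
    const double budget_ms = 33.3;                 // ~30 fps frame budget
    const double read_fast = 4.0, read_slow = 6.0; // assumed drive spread (ms)

    for (double other_work : {20.0, 29.0}) {       // light frame vs. stressed frame
        double fast_total = other_work + read_fast;
        double slow_total = other_work + read_slow;
        std::printf("work=%4.1f ms  fast drive: %s  slow drive: %s\n",
                    other_work,
                    fast_total > budget_ms ? "slowdown" : "ok",
                    slow_total > budget_ms ? "slowdown" : "ok");
    }
}
[/code]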

Overclocked, thank you for the positive comment. I am trying to improve my sentence structure.
 
ihamoitc2005 said:
You are correct to say that the fault tolerance of an automobile is large. That is precisely why engine management is designed to run well below the performance limits of the individual components in the system, yet even so a performance variation of 5 to 10% can still occur.

Now imagine such a production automobile pushed to the limits of its components with advanced engine-management software, as would occur in Formula 1 racing (like a closed-box development environment), and you will see how performance can become unpredictable at the limits of the components. Of course a console is manufactured much more precisely than a car, but a console is also much, much more complicated and made of many more parts.

Okay, true. In an F1 car, many components are pushed to the limits. However, within the operating limits, all components will function and the overall performance will not deviate by much. Remember, most of the time the difference between the fastest car and the slowest car isn't much, and those machines really are pushed to the limits (barring failure).

Now take an electronic device such as a game console or a PC. Designers build a safety margin into the operating tolerances. A timing deviation beyond the operational limits, caused by a flaw in the silicon of a memory chip, will produce a system failure; it will not cause a slowdown.

ihamoitc2005 said:
If some experienced reviewers see slowdown and others see none, then for a very small fraction of total gameplay something is different. That is likely to be due to a difference in hardware or disc. Hardware age, abuse, or manufacturing differences can cause this. Did you know that an unnoticed momentary change in disc read speed can result in slowdown? Or that a small flaw in a RAM chip can cause a performance difference in situations where the RAM is stressed? Those are just two examples.

I think it's more likely that users' perception is the cause. As others on this forum have posted earlier, different people perceive things differently. For me, 30 fps might be fine, while others think that when the frame rate dips to 30 fps it's a major slowdown. That's why we have benchmarks. And if you run those benchmarks on computers with the same configuration, you'll see that the variation isn't that huge.
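
To illustrate the benchmarking point (a minimal harness of my own, not a real benchmarking tool): timing the same fixed workload repeatedly and reporting the mean and standard deviation is how you separate real hardware variation from differences in perception.

[code]
// Minimal repeated-timing harness: run a fixed workload N times,
// report mean and standard deviation of the wall-clock times.
#include <chrono>
#include <cmath>
#include <cstdio>
#include <vector>

static volatile double sink; // keeps the optimizer from removing the work

double run_once() {
    auto t0 = std::chrono::steady_clock::now();
    double acc = 0.0;
    for (int i = 1; i <= 1000000; ++i) acc += 1.0 / i; // fixed workload
    sink = acc;
    std::chrono::duration<double, std::milli> dt =
        std::chrono::steady_clock::now() - t0;
    return dt.count();
}

int main() {
    std::vector<double> samples;
    for (int i = 0; i < 10; ++i) samples.push_back(run_once());

    double mean = 0.0;
    for (double s : samples) mean += s;
    mean /= samples.size();

    double var = 0.0;
    for (double s : samples) var += (s - mean) * (s - mean);
    double sd = std::sqrt(var / samples.size());

    std::printf("mean %.2f ms, stddev %.2f ms (%.1f%% of mean)\n",
                mean, sd, 100.0 * sd / mean);
}
[/code]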
 
ihamoitc2005 said:
You are correct to say that the fault tolerance of an automobile is large. That is precisely why engine management is designed to run well below the performance limits of the individual components in the system, yet even so a performance variation of 5 to 10% can still occur.
Have you got official figures to back that up? When an F1 team gets a new car for a race (and a new car is used every race), are they really seeing as much as a 10% difference between vehicles? Can I buy a standard Clio and get 36.8 MPG fuel consumption, then buy another Clio of exactly the same model and get 33 MPG? I'd like to see your evidence for this.
Now imagine such a production automobile pushed to the limits of its components with advanced engine-management software, as would occur in Formula 1 racing (like a closed-box development environment), and you will see how performance can become unpredictable at the limits of the components. Of course a console is manufactured much more precisely than a car, but a console is also much, much more complicated and made of many more parts.
Your argument is sound, but as I say, you present no evidence. It seems you have looked at how consoles are made and believe that, being so complex, there must be variations that account for different performance between units. But without evidence, it could well be, as I believe, that these variations are very small and have a negligible impact. I have a friend who works in electronic engineering designing chips. He was telling me ages back that although a chip works in binary, switches being on or off, the way chips actually behave can leave partial voltages hanging around, and this results in random rogue bits and switches. Chip designers and manufacturers work to minimise this effect, and do a fantastic job. Likewise, I imagine they minimise the other factors that can cause random variation in a total system.

Both points have been made, and until you or I can find actual research on variation in a console or a similar system, reiterating them isn't going to prove it one way or the other.
 
macabre said:
The article has an image of the Morpheus character's head scanned for The Matrix, so it's the same thing.
I wonder if game devs will have to buy these scanners, or if they can be rented from a special-effects studio?
In movies the CG work is mostly scanned from real models/sculptures, so using this method could bring more great artists from the movie industry into the games industry.

Nice to see a developer using my idea...;)

http://www.beyond3d.com/forum/showthread.php?t=22891
 
_phil_ said:
How is that your idea? I first used Cyberware input data and Cyslice for games about 7 years ago, and I'm far from the first, and not alone...

Your game used highly detailed laser-scanned maquettes? What game was that?
 
That never made it into the final game, but we worked on the idea of having in-game virtual actors, like a David Bowie for example. A 3D scan gives you free texture and a perfect base for the face topology. Cyslice is very good for rebuilding low-poly models. The combination has always been an interesting workflow to consider.
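
For anyone curious what that rebuild step looks like, here is a toy sketch (my own illustration, not _phil_'s actual pipeline): a brute-force "shrink-wrap" that snaps each vertex of a hand-built low-poly cage onto the nearest point of the dense scan cloud. Real retopology tools like Cyslice do this far more cleverly.

[code]
// Toy shrink-wrap: snap low-poly cage vertices to the nearest scan point.
// Brute-force nearest-neighbour search; a real tool would use a k-d tree.
#include <cfloat>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

float dist2(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

void shrinkWrap(std::vector<Vec3>& lowPoly, const std::vector<Vec3>& scan) {
    for (Vec3& v : lowPoly) {
        float best = FLT_MAX;
        Vec3 nearest = v;
        for (const Vec3& p : scan) {
            float d = dist2(v, p);
            if (d < best) { best = d; nearest = p; }
        }
        v = nearest; // low-poly vertex lands exactly on the scanned surface
    }
}

int main() {
    // Tiny stand-ins: a real scan cloud has millions of points.
    std::vector<Vec3> scan = {{0, 0, 0}, {1, 0, 0}, {0, 1, 0}};
    std::vector<Vec3> cage = {{0.1f, 0.1f, 0.2f}}; // rough low-poly vertex
    shrinkWrap(cage, scan);
    std::printf("snapped to (%g, %g, %g)\n", cage[0].x, cage[0].y, cage[0].z);
}
[/code]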
 