Reverend said:
sireric said:
I don't talk to ALL the developers that your company (or any other IHV) does, but the ones that I do talk to are pretty good when it comes to their knowledge of 3D graphics.
I never said that they were not knowledgeable about 3D graphics. I'm saying that the 24b limitations of the pixel shader have not been the issues that the developers have been complaining about. If you look at some of the work recently presented at Siggraph, or some of the work ISVs are talking about for the future, there are many other problems that need addressing before this one. Until either the source and destination formats expand, or numerically unstable shader code becomes a requirement, I do not think FP24 will generally be a limit. I can't speak for 2 to 5 years down the road, but for now that appears to be true.
You appear to be as knowledgeable about where software (games) can be going as you are about hardware.
Sarcasm noted. Amusing. End of thread for me.
Can you expand on why you think many things could (not should?) be 32b but not for games? So 32b really should be all that is/will ever be required for games?
For applications that do a lot of procedural operations, such as some of the CPU/VPU items I've seen (fluid dynamics, linear algebra solutions, etc.), 32b seems to be required to be useful (in fact, in some cases it's not enough). However, there aren't enough commercial apps in this category to justify this for the mainstream commercial market (might be a chicken-and-egg thing, though).
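To make that concrete, here is a minimal sketch (my own illustration, not from this thread; numpy has no FP24 type, so float16 and float32 stand in for "too few bits" and "enough bits") showing how a long-running accumulation, the core pattern of iterative solvers, degrades at low precision:

```python
# Illustration: long accumulations lose accuracy at low precision.
# float16/float32/float64 stand in for different hardware precisions;
# FP24 is not a native numpy type.
import numpy as np

N, STEP = 100_000, 0.0001                # exact sum is N * STEP = 10.0

for dtype in (np.float16, np.float32, np.float64):
    acc = dtype(0.0)
    step = dtype(STEP)
    for _ in range(N):
        acc = dtype(acc + step)          # round after every add, as hardware would
    print(f"{dtype.__name__:>8}: {float(acc):.6f}")

# float16 stalls far below 10.0 once acc + step rounds back to acc;
# float32 lands close to 10.0; float64 is essentially exact.
```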
As for games, most of them have more issues with source data quality than with the stability and precision of intermediate computations. The popular algorithms of today are rather stable mathematically, and the source data precision ends up being more of the issue (e.g. normalizing 8b source vectors doesn't require 24b of precision).
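To put a rough number on that parenthetical: the quantization step of an 8b source normal (~1/127 per component) dwarfs FP24's rounding (~2^-16 relative). A crude sketch below; trunc_mantissa is my own stand-in for FP24 (truncating a float32 to 16 mantissa bits), not actual hardware behavior:

```python
# Rough comparison: angular error from 8b source quantization vs. the extra
# error from FP24-style rounding of the normalized result.
import numpy as np

def trunc_mantissa(x, bits=16):
    """Crude FP24 stand-in: keep only `bits` mantissa bits of a float32."""
    u = np.array(x, dtype=np.float32).view(np.uint32)
    mask = np.uint32(((1 << 32) - 1) ^ ((1 << (23 - bits)) - 1))
    return (u & mask).view(np.float32)

def angle_deg(a, b):
    c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

rng = np.random.default_rng(1)
v = rng.normal(size=3)
v /= np.linalg.norm(v)                  # the "true" unit normal

q = np.round(v * 127) / 127             # 8b signed source data
n32 = q / np.linalg.norm(q)             # normalized at full float32
n24 = trunc_mantissa(n32)               # same result rounded to ~FP24

print("error from 8b source :", angle_deg(v, n32), "deg")
print("extra error from FP24:", angle_deg(n32, n24), "deg")

# Typically on the order of 0.1 deg from the 8b source vs. ~0.001 deg from
# FP24: the source data, not the intermediate precision, dominates.
```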
How much should an IHV try to improve 3D quality in games?
I'm not trying to be "smart" here but your comments appear to be "conclusive" in nature in several aspects.
How much should an IHV try to improve quality? As much as is possible, while making a balanced product with a mainstream target audience. We can make huge chips that cost thousands of dollars (i.e. yield 2 to 4 per wafer) that have every feature you want, in ample precision. That's fine, and I'm sure there's a market. But if you have to design a system that scales from $49 to $500 retail, you do have to decide what is important and what is not, and target a balance that achieves the maximum performance and quality while allowing for the lowest cost. This means studying the current and (near) future algorithms, finding their real bottlenecks, and addressing those. If your source data is 8b and the destination is 8b (FP16 next year), then I believe you'll find that 24b is generally fine.
My comments only appear to be conclusive because that's the way I view things. I might be (and often am) wrong, but that doesn't mean I don't believe my own opinions. I speak only for myself.
FP32 will not be enough for games. Your comments appear to say otherwise, in a declaration-type statement. Unless, of course, you're talking about the necessity of advancing other PC hardware (not video cards) before we start talking about the advancement of video cards.
Of course, this could just be about priorities. I may decide to make a game where I have a personal conviction that anything less than FP32 just won't do for me. And then the DevRels come in and convince me otherwise. Because I need to sell games!
Right now, we have simple apps with reasonably simple pixel shaders. We've just broken into being able to do some very cool things. We need to advance the products in a balanced way, where no one aspect gets way ahead of the others (unless mandated). That's my belief. The current set of shaders are rather stable numerically, and the source data is rather low precision. The artifacts and problems are mainly due to data sets at this time. As that improves, then the limitations of FP24 will show up.
You can certainly make an app that requires 32b, but I'm pretty certain that it will have other limitations (i.e. not be real time), or that it could be written in a way to be fine and dandy for 24b. For the first case, that's not really today's target market, and I find it hard to justify the cost of designing to accommodate that market.
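On "could be written in a way to be fine and dandy for 24b": the usual trick is rearranging the math so it never subtracts two nearly equal values. A generic float32 sketch of that idea (my own example, not an actual shader), using the classic 1 - cos(x) case:

```python
# Rearranging an expression so it needs fewer bits: for small x,
# 1 - cos(x) cancels catastrophically, while the algebraically
# identical 2*sin(x/2)^2 stays accurate at the same precision.
import numpy as np

x = np.float32(1e-3)

naive  = np.float32(1.0) - np.cos(x)              # subtracts two values near 1.0
stable = np.float32(2.0) * np.sin(x / np.float32(2.0)) ** 2
exact  = 2.0 * np.sin(np.float64(x) / 2.0) ** 2   # float64 reference

print(f"naive : {naive:.9e}")
print(f"stable: {stable:.9e}")
print(f"exact : {exact:.9e}")

# naive keeps only a few significant bits at float32; stable matches the
# float64 reference to full float32 precision. The same kind of rewrite
# is how shader code can be kept happy at reduced precision.
```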
My thinking is getting muddled from cold meds. Later.