DegustatoR
Legend
IIRC every NVIDIA chip since Riva TNT supports W-buffering in software emulation mode only, not in hardware. Or not?
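For reference on the hardware-vs-software question: a minimal Direct3D 9 sketch (illustrative, not from any shipping title) of how an application checks the rasterizer caps for hardware w-buffer support and enables it only when advertised. If the D3DPRASTERCAPS_WBUFFER bit is absent, the app stays on the ordinary z-buffer; any "software emulation" would have to come from the runtime or driver, not from this code.

```cpp
#include <d3d9.h>

// Enable w-buffering only if the device reports hardware support.
// Returns true when w-buffering was actually turned on.
bool EnableWBufferIfSupported(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return false;

    if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
    {
        // Hardware w-buffer: the depth test uses eye-space w instead of z.
        device->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW);
        return true;
    }

    // No hardware w-buffer advertised: fall back to a plain z-buffer.
    device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
    return false;
}
```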
Chalnoth
DegustatoR said:
1. Didn't see it. Some screenshots, maybe?
2. A balance of quality vs. speed.
3. Could be done in drivers. It isn't needed anyway with 6x SG MSAA.
4. Not an R300 problem.
5. I had a whole bunch of them available even back then. Besides, all the cards came with splitters.
6. Not visible 99% of the time. A good quality vs. speed balance again.
7. Enough even for now, never mind two years ago.

All this crap? It doesn't matter. Not one bit. You said there were no drawbacks to the R300. It doesn't matter how bad the drawbacks were, or whether they were in hardware or in software; these drawbacks were still there. Your statement was flawed. This is often the fault of people who decide they like something early on and later forget the things they don't like.

DegustatoR
Chalnoth said:
All this crap? It doesn't matter. Not one bit. You said there were no drawbacks to the R300. It doesn't matter how bad the drawbacks were, or whether they were in hardware or in software; these drawbacks were still there. Your statement was flawed.

There were NO drawbacks in R300. Did you even read what I said to you? You just pulled those "drawbacks" of R300 out of nowhere, since there were no drawbacks.
whql said:
1. Lack of w-buffering (caused z-buffer errors in Morrowind, for example)
Wasn't Morrowind actually due to vertex shader issues with the software?

IIRC, it was ATI that mentioned that it was a w-buffering issue.
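To make the Morrowind symptom concrete: a z-buffer stores post-projection z/w, which is hyperbolic in eye depth, while a w-buffer stores eye depth linearly; with a large far/near ratio, almost all z-buffer precision sits close to the camera, so distant geometry can z-fight in a game that expects w-buffering. A small standalone sketch, using standard D3D projection math and hypothetical near/far values:

```cpp
#include <cstdio>
#include <initializer_list>

int main()
{
    const float n = 1.0f, f = 10000.0f; // near/far planes (assumed values)

    for (float eyeZ : {2.0f, 100.0f, 5000.0f, 9990.0f})
    {
        // Classic D3D projected depth: z' = f/(f-n) * (1 - n/eyeZ).
        // Note how it is already ~0.99 at eyeZ = 100 of 10000.
        float zbuf = (f / (f - n)) * (1.0f - n / eyeZ);

        // W-buffer stores eye depth; normalized to [0,1] here purely
        // for comparison (an assumption; hardware scaling varies).
        float wbuf = (eyeZ - n) / (f - n);

        std::printf("eyeZ=%8.1f  z-buffer=%.6f  w-buffer=%.6f\n",
                    eyeZ, zbuf, wbuf);
    }
}
```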
Chalnoth
DegustatoR said:
There were NO drawbacks in R300. Did you even read what I said to you? You just pulled those "drawbacks" of R300 out of nowhere, since there were no drawbacks.

Yes, I read it. Of the issues you actually knew about, you argued them away, saying it was okay that those things weren't supported. That doesn't remove them from existence; it just makes them unimportant (at least to you). There's a difference.
DegustatoR
Chalnoth said:
Come on, be serious here. If there were no drawbacks of the R300 core, why would ATI bother to improve upon it?

I don't see ATI significantly improving on it over the past two years. R350? The same R300 with higher clocks. R360? The same R350. R420? From a software point of view it's the same R300, just more of it: pipes, clocks, caches... plus some very subtle tweaking of the pixel shader logic (2.0b) and a very simple modification of the DXTC ALUs (3Dc).
DegustatoR
Chalnoth said:
If you think the R300 architecture is the be-all and end-all of graphics, you are sorely mistaken. Amazing how you can call ATI's regurgitation of the same core a plus.

Where did I call it a plus? They are clearly beginning to fall behind with this strategy. But the sheer ability to reuse the R300 core for the past two years shows how good that core was when it came out.
DegustatoR
Chalnoth said:
Except you claim it was flawless. If it was flawless, there would be no reason to improve upon it.

There's always a reason to improve. There was no reason to fix anything, however, since nothing was broken.
DegustatoR said:
No, it's not. The r_hdr_ extensions are disabled in Doom 3; they are experimental. Even a 6800 won't help here. Trust me, I tried.
991060 said:
If there's anyone familiar with UE3's (or UE2's) material system implementation, I'd like to hear his (or her? Is there a "her" on this forum? I doubt it) comment on this particular topic, since I'm on a similar project right now. I've found that without a robust, scalable material system it's very hard to implement those advanced effects in a GAME project, within a limited budget and timeframe, of course.
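On 991060's question: not UE2/UE3 code, but a rough sketch of one common shape for a scalable material system, where a material is pure data (a feature mask plus named parameters) and the renderer derives a shader permutation from the mask, so a new effect becomes a new feature bit rather than a new engine code path. All names and fields here are hypothetical:

```cpp
#include <cstdint>
#include <map>
#include <string>

// Feature bits: each advanced effect a material can request.
enum MaterialFeature : std::uint32_t {
    FEAT_DIFFUSE_MAP = 1u << 0,
    FEAT_NORMAL_MAP  = 1u << 1,
    FEAT_SPECULAR    = 1u << 2,
    FEAT_EMISSIVE    = 1u << 3,
};

// A material is data only: a feature mask plus named parameters.
struct Material {
    std::string name;
    std::uint32_t features = 0;
    std::map<std::string, std::string> textures; // slot -> asset path
    std::map<std::string, float> scalars;        // e.g. "specPower"

    void enable(MaterialFeature f) { features |= f; }
    bool has(MaterialFeature f) const { return (features & f) != 0; }
};

// The renderer keys a shader-permutation cache on the feature mask,
// compiling each distinct combination once and sharing it.
std::uint32_t ShaderPermutationKey(const Material& m) { return m.features; }

int main()
{
    Material rock;
    rock.name = "rock_wall";
    rock.enable(FEAT_DIFFUSE_MAP);
    rock.enable(FEAT_NORMAL_MAP);
    rock.textures["diffuse"]  = "textures/rock_d.tga";
    rock.textures["normal"]   = "textures/rock_n.tga";
    rock.scalars["specPower"] = 16.0f;
    return ShaderPermutationKey(rock) ? 0 : 1;
}
```

The practical win is that artists combine effects in data while the permutation cache keeps the number of compiled shaders bounded, which is exactly the budget/time pressure the poster mentions.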
RejZoR said:
The only thing that bothers me...

Why do we have to pay full price for heated-up soup from yesterday?