ref rasterizer questions

Several questions:
1. Do you prefer the rasterizer DX10 hardware uses, or the GeForce 6/7 series rasterizer? I vastly prefer the 6/7 series rasterizer. It simply looks better: textures weren't always extremely blotchy up close, and things that were meant to be transparent weren't almost opaque. For example, water looks nearly completely white (and VERY shitty) on DX10 hardware rather than clear like on the 6/7 series.

2. Did Microsoft force NVIDIA and ATI to use it, or make it required for DX10 compliance?

OT question: could fp16/32 textures technically be trilinear filtered? I thought I read they could only be bilinear filtered.

Another OT question: where could I find a list of features that were in DX9 hardware but aren't in DX10 hardware?
 
Several questions:
1. Do you prefer the rasterizer DX10 hardware uses, or the GeForce 6/7 series rasterizer? I vastly prefer the 6/7 series rasterizer. It simply looks better: textures weren't always extremely blotchy up close, and things that were meant to be transparent weren't almost opaque. For example, water looks nearly completely white (and VERY shitty) on DX10 hardware rather than clear like on the 6/7 series.

2. Did Microsoft force NVIDIA and ATI to use it, or make it required for DX10 compliance?

This sounds very much like the good old "OpenGL has better colors" nonsense. Care to put up an example screenshot showing the difference?

OT question: could fp16/32 textures technically be trilinear filtered? I thought I read they could only be bilinear filtered.

They can be trilinear filtered.
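If you want to double-check what a given card reports, a minimal sketch of the D3D9-side query (error handling omitted; note the usage flag covers filtering as a whole, not each mode separately):

```
#include <d3d9.h>

// Sketch: ask the D3D9 runtime whether A16B16G16R16F textures are
// filterable at all on the default adapter.
bool SupportsFP16Filtering(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_FILTER, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}
```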
 
This sounds very much like the good old "OpenGL has better colors" nonsense. Care to put up an example screenshot showing the difference?

They can be trilinear filtered.
IIRC, I never explicitly said "OpenGL has better colors." But I guess it's possible that I'm not remembering correctly.

Just play a game on a 6/7 series card (Half-Life 2 is a good showcase of how much better the 6/7 series rasterizer was), or play any game and get as close as you can to a textured surface: it will not be blotchy like it will be on DX10 hardware.

I'm not surprised, since MS's spec only requires fp16/32 bilinear filtering. I guess MS doesn't require fp16/32 trilinear because it gives better IQ than bilinear.

Thanks for replying, Humus=]
 
You are free to play your Half-Life 2 in DX6/7 mode even on DX10 hardware if you like... What does Half-Life 2 have to do with the reference rasterizer, though? Of course, simply hand-waving about how DX6/7 was better than DX10 (???) is easy, but without screenshots it's totally useless.

Trilinear filtering on fp16/fp32 textures is not required in DX10 (it is in DX10.1), but both NV and ATI DX10 hardware support it. Also, fp16/fp32 is not actually your typical "game asset" format. Games still mostly use ARGB8 or DXT textures, while fp16/fp32 textures are used for some special effects.
 
iq, you are aware the reference rasterizer runs in software? To see its output you would have to have the DirectX SDK, which somehow I doubt you have, because if you did you would realise it runs on the CPU. No matter what graphics card you have, you get exactly the same image...
 
For example, water looks nearly completely white (and VERY shitty) on DX10 hardware rather than clear like on the 6/7 series.
That's really a matter of opinion. The DX10 version is supposed to be more realistic, by reflecting the bright white sky and adding white foam tops. In real life, water isn't very transparent either.
Did Microsoft force NVIDIA and ATI to use it, or make it required for DX10 compliance?
No, this is entirely application-controlled. You can use DX7 effects on DX10 hardware just as well. The application will typically choose the most realistic effects based on your hardware capabilities, though. But in the case of Half-Life 2 you can add -dxlevel 70, -dxlevel 80, -dxlevel 81, -dxlevel 90, -dxlevel 95 or -dxlevel 100 to the command line to limit the application to the corresponding DirectX level effects.
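For example, launching with "hl2.exe -dxlevel 81" (or putting -dxlevel 81 in the Steam launch options) forces the DX8.1 path even on DX10 hardware; the same idea applies to the other levels listed above.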
Another OT question: where could I find a list of features that were in DX9 hardware but aren't in DX10 hardware?
From a functionality point of view, anything that can be done with DX9 can also be done with DX10. You have to realize that modern hardware is flexible enough to implement several versions of DirectX and even OpenGL. The driver(s) take care of translating the commands from these different APIs into instructions the hardware can execute.

When talking about the interface itself, DX9 has a ton of things DX10 doesn't have. For instance, all fixed-function parts have been deprecated in favor of shaders. This is in contrast with previous versions, which kept the same (outdated) interfaces but extended them. Nothing is lacking in DX10, though; you just have to do things differently, in a way that more closely reflects how modern hardware works.
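To make that concrete, a rough sketch of the same work in both APIs (variable names and the constant-buffer slot are just illustrative):

```
// D3D9: the fixed-function pipeline exposes state directly on the device.
pDevice9->SetTransform(D3DTS_WORLD, &worldMatrix);
pDevice9->SetRenderState(D3DRS_LIGHTING, TRUE);

// D3D10: these calls no longer exist. The equivalent transform and
// lighting are written in a vertex shader, with the matrix supplied
// through a constant buffer instead:
pDevice10->VSSetConstantBuffers(0, 1, &pMatrixBuffer);
pDevice10->VSSetShader(pVertexShader);
```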
 
I think he means which D3D calls have disappeared from D3D10, and what happens if an old game makes one of these calls.
 
I think he means which D3D calls have disappeared from D3D10, and what happens if an old game makes one of these calls.
Doesn't matter. The old game will still load the old runtime DLL, which passes the old commands to the driver. The driver is still responsible for implementing these correctly.

Functionality never disappears. DX10 just cleans up the interface for the developers.
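A quick way to see this on a Vista-era machine (a sketch; it assumes the legacy runtimes are installed, which they are by default):

```
#include <windows.h>

// Even on DX10-class hardware the old runtime DLLs are still present,
// and their commands are still serviced by the installed driver.
HMODULE d3d8 = LoadLibraryA("d3d8.dll");   // old runtime loads fine
// An old game then calls Direct3DCreate8() from this DLL; the driver,
// not the game, is responsible for executing those legacy commands.
```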
 
iq, you are aware the reference rasterizer runs in software? To see its output you would have to have the DirectX SDK, which somehow I doubt you have, because if you did you would realise it runs on the CPU. No matter what graphics card you have, you get exactly the same image...
Yes, I realize all of that; however, DX10 hardware rasterization is different from the GeForce 6/7's rasterizer.
 
IIRC, I never explicitly said "OpenGL has better colors." But I guess it's possible that I'm not remembering correctly.

Just play a game on a 6/7 series card (Half-Life 2 is a good showcase of how much better the 6/7 series rasterizer was), or play any game and get as close as you can to a textured surface: it will not be blotchy like it will be on DX10 hardware.

I'm not surprised, since MS's spec only requires fp16/32 bilinear filtering. I guess MS doesn't require fp16/32 trilinear because it gives better IQ than bilinear.

Thanks for replying, Humus=]
Using trilinear on fp16/fp32 textures is uncommon in any case. These textures take up massive amounts of memory and bandwidth without offering any special benefit, which makes them unsuitable for regular textures. Because of this, mipmaps are rarely used in conjunction with fp16/fp32, which makes trilinear useless.
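To put rough numbers on "massive" (my arithmetic, not from the thread): RGBA fp16 costs 8 bytes per texel, so a 1024x1024 texture is 8 MB before mipmaps, while the same image as DXT1 is 0.5 MB, a 16x difference; fp32 doubles it again.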
 
Yes, I realize all of that; however, DX10 hardware rasterization is different from the GeForce 6/7's rasterizer.
Do you even know what a rasterizer does? If anything, coverage determination and attribute interpolation have gotten better in newer hardware, and DX10 tightens up the specs. None of the things that you've described have anything to do with the hardware rasterizer, let alone with DX10 vs previous APIs.

Seriously, any time you think that something has gotten worse in hardware, or that DX10 is in any way not superior to previous APIs, just realize that you're wrong and ask why you're wrong instead of making silly claims. Applications may make different trade-offs in different rendering paths, but that has nothing to do with the APIs or the hardware. If you've got a question about different rendering paths in different applications, please feel free to post some screenshots and ask.

I'm not trying to be a jerk, but don't make bold claims about things that you don't understand...

Using trilinear on fp16/fp32 textures is uncommon in any case. These textures take up massive amounts of memory and bandwidth without offering any special benefit, which makes them unsuitable for regular textures. Because of this, mipmaps are rarely used in conjunction with fp16/fp32, which makes trilinear useless.
That's true for color textures, but filterable shadow maps (VSMs, ESMs, etc.) make extensive use of high-precision texture filtering.
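For the curious, the core of a variance shadow map lookup shows why: the two moments come straight from a bilinearly/trilinearly filtered fp16/fp32 texture fetch, so the hardware filter does the statistical averaging. A sketch (after the standard Chebyshev bound, not any particular engine's code):

```
// m1 = E[depth], m2 = E[depth^2], read from a filtered moment texture.
float ChebyshevUpperBound(float m1, float m2, float t)
{
    float variance = m2 - m1 * m1;
    if (t <= m1)
        return 1.0f;                       // receiver is fully lit
    float d = t - m1;
    return variance / (variance + d * d);  // upper bound on P(depth >= t)
}
```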
 
(Sorry for repeating everyone a bit. Just trying to explain this a bit differently.)

Another OT question: where could I find a list of features that were in DX9 hardware but aren't in DX10 hardware?

DirectX does not specify how the hardware should implement any features. It specifies things like the required pixel/vertex shader instructions (for each shader model) and the maximum calculation error accepted for each instruction. DX10 removed the fixed-function pipeline from the API, but this has minimal effect on the hardware, as all the newest DX9 chips already implemented the fixed-function T&L and texture stages using their pixel and vertex shader pipelines. DX10 hardware can render all the same effects that all previous DX hardware could, and it can do so with significantly higher precision (and with improved performance).

Several questions:
1. Do you prefer the rasterizer DX10 hardware uses, or the GeForce 6/7 series rasterizer? I vastly prefer the 6/7 series rasterizer. It simply looks better: textures weren't always extremely blotchy up close, and things that were meant to be transparent weren't almost opaque. For example, water looks nearly completely white (and VERY shitty) on DX10 hardware rather than clear like on the 6/7 series.

2. Did Microsoft force NVIDIA and ATI to use it, or make it required for DX10 compliance?

Developers make their own choices about how to implement each effect. In this case the developer liked the shiny, more physically correct water more. However, it was likely too costly to implement on lower-end DX9 cards (it would require extra processing passes, etc.), so it's only available on DX10.

The developer could have used the very same water shader on DX10 cards as well, and the image quality would be identical (or likely slightly higher because of better instruction and sampling precision).

OT question: could fp16/32 textures technically be trilinear filtered? I thought I read they could only be bilinear filtered.

Trilinear filtering does nothing for textures that do not have mipmaps. Floating-point textures are mainly used as frame buffers (HDR rendering, linear depth buffer, etc.). Frame buffers do not (usually) have mipmaps and thus cannot be trilinear filtered.

As Andrew said, many new shadow-mapping techniques require mipmaps and trilinear/anisotropic filtering for best output quality. High-precision formats with full trilinear/anisotropic filtering support are very important for the future. Hardware support for trilinear/anisotropic depth-test sampling would also be a very nice addition to future DirectX releases (bilinear PCF has been supported since the old DirectX 7 GeForces).
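On DX10 you can query the filtering and mipmap capabilities per format; a minimal sketch (error handling omitted):

```
// Sketch: ask a D3D10 device whether fp16 textures support linear
// filtering and mipmapping (together these enable trilinear).
UINT support = 0;
pDevice->CheckFormatSupport(DXGI_FORMAT_R16G16B16A16_FLOAT, &support);
bool filterable = (support & D3D10_FORMAT_SUPPORT_FILTER_LINEAR) != 0;
bool hasMips    = (support & D3D10_FORMAT_SUPPORT_MIP) != 0;
```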
 
Thanks for that answer, sebbi, but what would explain the poor image quality of older games when running on new cards, e.g. System Shock 2?
 
Thanks for that answer, sebbi, but what would explain the poor image quality of older games when running on new cards, e.g. System Shock 2?

There are two primary feature drops on modern hardware that can have an impact on older games.
1. There is no dithering for 16-bit render targets. This can cause banding effects you haven't seen before with older hardware that supported dithering. This isn't an issue for newer games, as they all use at least 32-bit render targets.
2. Support for palette textures has been removed from all drivers, and probably from most hardware, too. Depending on the fallback path of the game, this can cause banding, too.

The drivers for older hardware may have contained special code paths to work around bad rendering behavior in games. Code paths that are no longer there for newer hardware.

Finally, sometimes we all believe a game looked better in the past until we start it on the same hardware configuration again.
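On point 1 above, a toy illustration of what that dithering bought (an assumption-laden sketch, not any specific hardware's algorithm):

```
// Quantizing an 8-bit channel to 5 bits (as in an R5G6B5 target)
// without dithering turns smooth gradients into visible bands:
uint8_t Quantize5(uint8_t v) { return v & 0xF8; }

// Older hardware added a small ordered-dither offset (e.g. from a
// Bayer matrix indexed by pixel position) before truncating, trading
// the bands for fine noise the eye averages out:
uint8_t Quantize5Dithered(uint8_t v, uint8_t bayer /* 0..7 */)
{
    int d = v + bayer;
    return (uint8_t)(d > 255 ? 0xF8 : (d & 0xF8));
}
```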
 
There are two primary feature drops on modern hardware that can have an impact on older games.
1. There is no dithering for 16-bit render targets. This can cause banding effects you haven't seen before with older hardware that supported dithering. This isn't an issue for newer games, as they all use at least 32-bit render targets.
2. Support for palette textures has been removed from all drivers, and probably from most hardware, too. Depending on the fallback path of the game, this can cause banding, too.

The drivers for older hardware may have contained special code paths to work around bad rendering behavior in games. Code paths that are no longer there for newer hardware.

Finally, sometimes we all believe a game looked better in the past until we start it on the same hardware configuration again.
The thing I don't get is why they don't just make the driver render internally at int8 RGBA. Would that work? Couldn't they also make the driver force a good replacement format when palette textures are called for, without any issues?
 
IIRC, I never explicitly said "OpenGL has better colors."

I didn't mean that you said it. It's just one of those classics that I've heard many times from average gamers who just don't have a basic understanding of how things work under the hood. They may have played some OpenGL games and found that the colors were better in those games compared to other games using DirectX, and incorrectly concluded that this was down to the API and not to artistic choices made in those games. I once interviewed a guy applying for a job at ATI who replied with that when I asked him what the main differences between OpenGL and DirectX are. Let's just say he didn't get the job. ;)

What I'm saying though is that your post sounded very much like a misunderstanding of similar proportions.
 
I didn't mean that you said it. It's just one of those classics that I've heard many times from average gamers who just don't have a basic understanding of how things work under the hood. They may have played some OpenGL games and found that the colors were better in those games compared to other games using DirectX, and incorrectly concluded that this was down to the API and not to artistic choices made in those games. I once interviewed a guy applying for a job at ATI who replied with that when I asked him what the main differences between OpenGL and DirectX are. Let's just say he didn't get the job. ;)

What I'm saying though is that your post sounded very much like a misunderstanding of similar proportions.
What he meant to say is that OpenGL games in general adopted 32-bit color first, and that OpenGL games were usually brighter and more colorful than D3D games. Both are kind of true, but not because of OpenGL itself (unless OpenGL had some tricks DX didn't at the time).

Did he explicitly say that it was because of OpenGL, or was it just implied?
 
The thing I don't get is why they don't just make the driver render internally at int8 RGBA. Would that work? Couldn't they also make the driver force a good replacement format when palette textures are called for, without any issues?
The driver can't just force all render targets from 16 bits to 32 bits... What if a game relies on the render target being 16 bits per pixel, as many older games did to render text? Add 100 exceptions for 100 games to the driver just to make them work?
Similar with palette textures... Palettes are set during frame rendering, which means you can change the palette every draw call but keep the texture exactly the same! Which again means you'd have to "recompile" the texture every time the palette gets changed... If a game has a fallback, great; if it does not, then it won't even work on today's hardware.
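To see why that amounts to recompiling, here is roughly what a driver-side fallback for 8-bit paletted (P8) textures would have to do on hardware without palette support (a hypothetical helper, not any actual driver's code):

```
#include <stdint.h>
#include <stddef.h>

// Expand a P8 texture to RGBA8. A driver emulating palette support
// would have to rerun this over the whole texture (and its mips)
// every time the application changes the palette.
void ExpandP8ToRGBA8(const uint8_t* indices, const uint32_t palette[256],
                     uint32_t* out, size_t texelCount)
{
    for (size_t i = 0; i < texelCount; ++i)
        out[i] = palette[indices[i]];
}
```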
 