Humus said: Some ASCII rendering this time.
Fun, but useless.
return tex3D(Ascii, float3(asciiBase, dot(tex, gray))) * float4(1.5 * normalize(tex), 0);
return tex3D(Ascii, float3(asciiBase, dot(tex, gray))) * float4(1.5 * normalize(tex), 0) + 0.01 / tex.xyzz;
maven said: I was thinking, maybe you could render to a larger texture (2x2 texels for each ASCII character), and then look up a character using the average intensity of the 2x2 block and ddx / ddy (computed from the texels in the block). This would probably need (at least) one more dependent lookup, and I don't have a clue whether it would have the desired effect...
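A rough sketch of what maven seems to be proposing, in plain Python rather than shader code (the function name and the 2x2 layout are illustrative assumptions, not anything from the demo): reduce each 2x2 character cell to an average intensity plus horizontal and vertical gradients, which would then index a dependent lookup into a glyph table.

```python
def block_lookup_key(block):
    """block: 2 rows x 2 columns of intensities rendered for one character cell."""
    (a, b), (c, d) = block
    avg = (a + b + c + d) / 4.0      # average intensity of the cell
    ddx = ((b - a) + (d - c)) / 2.0  # horizontal gradient across the block
    ddy = ((c - a) + (d - b)) / 2.0  # vertical gradient down the block
    return avg, ddx, ddy             # coordinates for a dependent glyph lookup

# A cell that is dark on the left, bright on the right:
print(block_lookup_key([(0.0, 1.0), (0.0, 1.0)]))  # (0.5, 1.0, 0.0)
```

Whether three lookup coordinates like these would actually pick visually sensible glyphs is exactly the part maven says he is unsure about.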
Grall said: 8) Hahaha, I don't believe it! LOL Humus, but you actually found a way for an ASCII demo to require DX9! ROFL, kudos man!
Uttar said: Or am I the only one who thinks that change makes it cooler?
Xmas said:Hm, seems like there is a fundamental difference in how NVidia and ATI treat divisions by zero or something, Uttar. Your version produces bright white on ATI cards where your screenshot shows black.
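Xmas's observation can be illustrated in plain Python (a hedged sketch: the function and values below are stand-ins, since GPUs of that era did not all follow IEEE semantics, and the exact results each vendor returned for x/0 are not stated in the thread). The added `0.01 / tex.xyzz` term blows up wherever a channel of `tex` is zero, so whatever the hardware defines that division to be decides whether the pixel saturates or stays dark:

```python
def shade_term(tex_b, divzero_value):
    # The 0.01 / tex.b style term from the modified shader; when tex_b == 0,
    # divzero_value is a stand-in for the hardware's division-by-zero result.
    term = divzero_value if tex_b == 0.0 else 0.01 / tex_b
    return min(term, 1.0)  # the framebuffer clamps color to [0, 1]

print(shade_term(0.0, float('inf')))  # 1.0 -> saturates to white (the ATI result Xmas describes)
print(shade_term(0.0, 0.0))          # 0.0 -> stays black (as in Uttar's screenshot)
```

The same shader source thus renders bright white on one vendor and black on the other without either driver being obviously "wrong".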
Mendel said: Humus, how about making an OpenGL version so I can try what happens when I turn on the ASCII text SmartShader on top of that ASCII demo?