How far can you realistically reduce texture usage?

boltneck

You can't completely do away with textures in games, can you?

Can you make a game that is completely shader-driven? How would that even be possible? You would still have to have some kind of base texture, then a normal map or bump map, which is basically another texture (correct?). Some shadow rendering techniques are texture-based, others stencil-based. Is all of that going to get replaced as well?

Is all that going to somehow get replaced with a shader program?

I just installed fakefactory's cinematic mod for HL2, which replaces virtually every texture in the game with 2048x2048 textures. It looks completely incredible, and it wasn't even made by professional texture artists. How can a shader be better suited to making completely realistic surfaces?

http://halflife2.filefront.com/screenshots/File/52458/7
 
boltneck said:
You can't completely do away with textures in games, can you?

Was just about to post a similar thread. I see people claiming R580 is more future-proof because the shader-to-texture ratio in future games is going to be higher. But for some reason they tend to imply that the absolute texturing demand is going to fall as well, which I doubt very much.
 
You can for sure get away with very, very few textures if you go to the "shader side", but the problem is that most objects that are not natural (i.e. anything other than rocks, moss, etc.) tend to look very unrealistic with procedural shading alone. Plus, procedural shading is hard to control, so even if some day all color textures are dead and everything is generated procedurally, you still want to be able to map things like where the oil is on this skin, where the rust is on this car, and where the bullet holes are on the wall. I think you can't get below this level of "driver" textures if you want your game to look realistic.

Maybe they assume that you will need fewer textures for a given material ... but I think once you drive all your shaders with textures like that, you will simply use the now-free texture units to drive even more parameters, so IMHO the texture count will probably remain the same.
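To make the "driver texture" idea concrete, here's a rough CPU-side sketch (illustration only; noise2D and shadeMetal are made-up names, and the noise is a cheap stand-in, not code from any real engine): a tiny artist-painted mask says where the rust goes, and procedural noise supplies the fine detail of what it looks like.

Code:
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

// Cheap stand-in for a real noise function (made up for this illustration).
static float noise2D(float x, float y)
{
    float n = std::sin(x * 12.9898f + y * 78.233f) * 43758.5453f;
    return n - std::floor(n);                 // pseudo-random value in [0,1)
}

// One "pixel" of a procedural rust material, steered by a low-res driver mask.
// rustMask comes from a tiny artist-painted texture (WHERE the rust is);
// the high-frequency detail is generated procedurally (WHAT the rust looks like).
Color shadeMetal(float u, float v, float rustMask)
{
    const Color metal = { 0.55f, 0.56f, 0.58f };
    const Color rust  = { 0.45f, 0.20f, 0.08f };

    // Two octaves of noise modulate the rust coverage inside the masked area.
    float detail = 0.6f * noise2D(u * 64.0f,  v * 64.0f)
                 + 0.4f * noise2D(u * 256.0f, v * 256.0f);
    float t = rustMask * detail;              // mask decides where, noise decides how

    return { metal.r + (rust.r - metal.r) * t,
             metal.g + (rust.g - metal.g) * t,
             metal.b + (rust.b - metal.b) * t };
}

int main()
{
    Color c = shadeMetal(0.3f, 0.7f, 1.0f);   // a fully masked (rusty) sample point
    std::printf("%.2f %.2f %.2f\n", c.r, c.g, c.b);
}

The point is that the driver texture can be tiny (128x128, say) because it only carries placement information; the expensive high-frequency detail is pure math.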
 
trinibwoy said:
Was just about to post a similar thread. I see people claiming R580 is more future-proof because the shader-to-texture ratio in future games is going to be higher. But for some reason they tend to imply that the absolute texturing demand is going to fall as well, which I doubt very much.

If you look at ATI's projections about texturing demand, they foresee that it will increase slightly. No one in their right mind thinks that texturing demand will drop; the prediction is that the trend we have seen in the last few years will continue, meaning that arithmetic demand will grow faster than texturing demand.

Also, some people seem to think that larger textures equal a higher demand for texturing power; this is not true.
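To illustrate with some made-up numbers (assumptions, purely for illustration): with mipmapping, the texel traffic per frame follows the number of pixels you shade, not the size of the source textures.

Code:
#include <cstdio>

int main()
{
    // All numbers here are assumptions for illustration only.
    const double pixelsShaded    = 1600.0 * 1200.0;  // pixels shaded per frame
    const double fetchesPerPixel = 8.0;              // texture fetches per shaded pixel (assumed)
    const double bytesPerTexel   = 4.0;              // uncompressed RGBA8

    // With mipmapping, each fetch touches roughly one new texel's worth of data,
    // whether the source texture is 512x512 or 4096x4096: the mip chain keeps
    // the sampled level at about one texel per screen pixel.
    double bytesPerFrame = pixelsShaded * fetchesPerPixel * bytesPerTexel;
    std::printf("~%.1f MB of texel traffic per frame, independent of texture resolution\n",
                bytesPerFrame / (1024.0 * 1024.0));
    return 0;
}

What a larger texture does cost you is memory footprint and, up close, cache pressure; that is a different thing from raw texturing rate.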
 
We won't be getting away from pre-made textures for a while. Think about the resolution of your monitor, and what happens when a camera can zoom in very close to a texture. When the number of pixels in that texture covers the entire screen pixel for pixel, then we'll see a stop in increasing texture sizes.

For rendering out to HD TVs, most models use 4096x2048 texture sizes to give objects a realistic look. We are getting there: Unreal Engine 3 is at 2048x2048, a quarter of the way there.
 
boltneck said:
You can't completely do away with textures in games, can you?
It's not about reducing textures. It's about increasing math operations faster than texture operations.
 
Chalnoth said:
It's not about reducing textures. It's about increasing math operations faster than texture operations.

Yeah, but how does that help R580 if future texturing demands are the bottleneck? I think that's the premise of the question, if I understand it correctly.
 
Bandwidth is the actual bottleneck. You can only build so much TMU capability into a GPU with given bandwidth before diminishing returns bite hard.

Since ATI's the focus at the moment: ATI has been evangelising new texture compression concepts, such as 3Dc for normal maps and a new single-channel variant for, if I remember right, things like bump maps and shadow maps.
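For what it's worth, the bandwidth saving from 3Dc comes from storing only two channels of the tangent-space normal (at 4 bits per pixel each, so 4:1 versus a 32-bit uncompressed normal) and rebuilding the third in the shader. A small sketch of that reconstruction step, written as plain C++ rather than shader code and with the block decoding itself omitted:

Code:
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// 3Dc (ATI2) stores only the X and Y components of a tangent-space normal,
// each as an independently compressed single channel; the pixel shader
// reconstructs Z from the unit-length constraint.
Vec3 reconstructNormal(float x, float y)       // x, y in [0,1] as sampled from the map
{
    float nx = x * 2.0f - 1.0f;                // expand to [-1,1]
    float ny = y * 2.0f - 1.0f;
    float nz2 = 1.0f - nx * nx - ny * ny;
    float nz = nz2 > 0.0f ? std::sqrt(nz2) : 0.0f;   // guard against filtering error
    return { nx, ny, nz };
}

int main()
{
    Vec3 n = reconstructNormal(0.7f, 0.6f);    // an arbitrary sampled texel
    std::printf("normal = (%.3f, %.3f, %.3f)\n", n.x, n.y, n.z);
}

It's also the "math is cheaper than fetches" trade in miniature: a couple of multiplies and a sqrt buy you a texture that's a quarter of the size.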

I think with GDDR4's ~75-90GB/s coming this year - fingers crossed - things could get pretty interesting.

Jawed
 
trinibwoy said:
Yeah, but how does that help R580 if future texturing demands are the bottleneck? I think that's the premise of the question, if I understand it correctly.
Increasing math ops faster than texture ops also reduces the bottleneck on texturing.
 
Jawed said:
Bandwidth is the actual bottleneck. You can only build so much TMU capability into a GPU with given bandwidth before diminishing returns bite hard.
Not really. At least, not yet. Eventually I expect this to be the case, but for the moment, I'm pretty sure that memory bandwidth is not a significant concern with respect to texturing.
 
Chalnoth said:
Not really. At least, not yet. Eventually I expect this to be the case, but for the moment, I'm pretty sure that memory bandwidth is not a significant concern with respect to texturing.
Memory bandwidth is definitely a concern.
 
Some of these problems will be alleviated when GDDR4 is in use... but that's still a band-aid for the problem... I suspect we will eventually get an R590 or some G7x variant that will use it...

Then again, my question is: what's the holdup with GDDR4? Low quantities? GDDR4 still not a standard? GDDR4 difficult to produce?
 
boltneck said:
I just installed fakefactory's cinematic mod for HL2, which replaces virtually every texture in the game with 2048x2048 textures. It looks completely incredible

Incredible!!! http://halflife2.filefront.com/screenshots/File/52458/2
While some of the textures are excellent (e.g. the faces), they're juxtaposed against washed-out ones (e.g. the walls or clothes), which ruins the whole effect; the PS1-era shading doesn't help either.

boltneck said:
Is all that going to somehow get replaced with a shader program?

Not completely, but procedural shaders are certainly going to help free up memory.
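As a deliberately simple example of a fully procedural material that needs no texture memory at all, here's a texture-free brick pattern sketched in plain C++ (shadeBrick and all its constants are made up for illustration; a real shader would do the same arithmetic per pixel on the GPU):

Code:
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

// A texture-free brick pattern: everything is derived from the UV coordinates,
// so the "texture" costs zero memory and zero bandwidth - only arithmetic.
Color shadeBrick(float u, float v)
{
    const Color brick  = { 0.60f, 0.25f, 0.20f };
    const Color mortar = { 0.70f, 0.70f, 0.65f };

    const float bricksPerRow = 8.0f;
    const float rows         = 16.0f;

    float row = std::floor(v * rows);
    float x   = u * bricksPerRow + (std::fmod(row, 2.0f) != 0.0f ? 0.5f : 0.0f); // offset odd rows
    float fx  = x - std::floor(x);                 // position within the current brick
    float fy  = v * rows - row;

    bool inMortar = (fx < 0.05f) || (fy < 0.10f);  // thin mortar lines between bricks
    return inMortar ? mortar : brick;
}

int main()
{
    Color c = shadeBrick(0.37f, 0.52f);
    std::printf("%.2f %.2f %.2f\n", c.r, c.g, c.b);
}

Patterns like this trade ALU for memory and bandwidth, which is exactly the trade being discussed in this thread; the downside, as noted above, is how hard they are to art-direct.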
 
I'm still waiting to see what people come up with on SM4/D3D10 technology. Sure, you can do a lot of procedural/dynamic stuff with the current pipeline, but D3D10 should allow for much more freedom of expression in the pipeline.

Curiously, I was reading an article the other day that hinted that we'd get much more sharing of art assets in future games. Given the stupid LOD required by "next gen" titles, most studios are waking up to the fact that reinventing all their basic assets for each project might not be a smart idea.

On the one hand it could end up being pretty crap if they all look the same; on the other hand, we could see specific studios adopting a similar artistic style across many games...

Jack
 
Lack of extensions that allow playing around with procedural generation

I think that the binding of D3D10 to Windows Vista is seriously affecting the widespread use of procedurally generated stuff... I doubt many developers today are willing to put a lot of time into something that only a fraction of the userbase will be able to use. We'll probably have to wait a long, long time before studios really start using this. Maybe not so long if it were exposed as an OpenGL extension, because then everyone could try it, but that's another story.

I remember reading an ATI presentation where they said that you could render directly to a vertex buffer (meaning you could generate or modify geometry) and that this extension would be available to developers soon. Does anyone know more details about that?
 
Anteru said:
I remember reading an ATI presentation where they said that you could render directly to a vertex buffer (meaning you could generate or modify geometry) and that this extension would be available to developers soon. Does anyone know more details about that?

I heard of that (maybe here, I don't know :)) as a means to do vertex texturing!?
 
Blazkowicz_ said:
I heard of that (maybe here, I don't know :)) as a means to do vertex texturing!?
It's sorta kinda similar, but not the same. It is a way for the output of the pixel shader to be read back into the vertex shader in a later pass, but it is not a replacement for vertex texturing.
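A CPU-side simulation of that data flow, purely illustrative (no real D3D or OpenGL calls, and all the names here are made up): pass one does per-texel "pixel shader" work into a float4 target, and pass two then treats that same buffer as vertex data.

Code:
#include <cstdio>
#include <vector>

struct Float4 { float x, y, z, w; };

// Pass 1: a stand-in for a pixel shader that displaces a grid of vertices,
// writing the results into what the GPU would treat as a render target.
std::vector<Float4> renderPositionsToTarget(int width, int height)
{
    std::vector<Float4> target(width * height);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            float u = x / float(width - 1);
            float v = y / float(height - 1);
            float displacement = 0.1f * u * v;          // any per-texel math you like
            target[y * width + x] = { u, displacement, v, 1.0f };
        }
    return target;
}

int main()
{
    // Pass 2: the "vertex buffer" is literally the pass-1 render target.
    std::vector<Float4> vertices = renderPositionsToTarget(64, 64);
    std::printf("first vertex: %f %f %f\n", vertices[0].x, vertices[0].y, vertices[0].z);
    return 0;
}

On real hardware the point is that the buffer never round-trips through the CPU; it is simply rebound from render target to vertex stream for the next pass.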
 
Anteru said:
I remember reading an ATI presentation where they said that you could render directly to a vertex buffer (meaning you could generate or modify geometry) and that this extension would be available to developers soon. Does anyone know more details about that?
Yeah, it's called 'R2VB' (Render-To-Vertex-Buffer). It's annoyed more than enough devs - sure, the AAA top-grade devs can afford to make an ATI X1k-specific path and a GeForce 6/7 path, but a lot of us don't have the time/resources for that. It means most people aren't going to use it much, because you'd have to implement a fairly advanced effect in two completely different ways :devilish:

I don't know about ATI's R2VB extension, but I've heard that vertex texturing on the Nvidia hardware is still fairly limited - both feature-wise (only one format) and performance-wise it's less than ideal.

Jack
 
JHoxley said:
Yeah, it's called 'R2VB' (Render-To-Vertex-Buffer). It's annoyed more than enough devs - sure, the AAA top-grade devs can afford to make an ATI X1k-specific path and a GeForce 6/7 path, but a lot of us don't have the time/resources for that. It means most people aren't going to use it much, because you'd have to implement a fairly advanced effect in two completely different ways :devilish:

You don't have to make different paths; the NVIDIA cards have no problem using render-to-vertex-buffer. It might even be preferable, considering the lackluster implementation of vertex texturing (a lackluster implementation is of course better than no support at all).
 