The lack of high-res S3TC textures in games

I was just wondering why only two games that I know of use high-res textures with S3TC... UT and SS2. That's pretty sad, since those games look simply spectacular. People always say "pump in the polys"; that's all well and good, but I am sick of bland textures. Is it because of nVidia or what?
 
I suspect nVidia's poor implementation over their last three (four?) product lines hasn't helped matters, but I don't think they're a major culprit.

Frankly, I'm also rather puzzled why texture compression hasn't become more prevalent.
 
I think that's odd too. It's not hard to support non-S3TC-capable hardware at the same time, either; it took me about an hour to write a function to decode textures in all DXT modes into raw RGBA for compatibility with older hardware. It also saves half the storage space on the game CD if you're using twice the width and height with DXT1, since DXT1 is an 8:1 saving over 32-bit RGBA, so quadrupling the pixel count still halves the size.
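A fallback decoder for the DXT1 case boils down to something like this (a simplified sketch with invented names, not the exact function; a complete one also handles the alpha blocks of DXT3/DXT5). Each 8-byte block holds two RGB565 endpoint colors plus sixteen 2-bit palette indices, which is where the fixed 4 bits per pixel comes from:

[code]
#include <stdint.h>

/* Expand a 16-bit RGB565 color to 8-bit-per-channel RGB. */
static void rgb565_to_rgb888(uint16_t c, uint8_t rgb[3])
{
    rgb[0] = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);  /* red   */
    rgb[1] = (uint8_t)(((c >>  5) & 0x3F) * 255 / 63);  /* green */
    rgb[2] = (uint8_t)(( c        & 0x1F) * 255 / 31);  /* blue  */
}

/* Decode one 8-byte DXT1 block into a 4x4 patch of RGBA8 pixels.
   'out' points at the patch's top-left pixel; 'stride' is the
   destination image width in pixels. */
void decode_dxt1_block(const uint8_t *block, uint8_t *out, int stride)
{
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));  /* little-endian */
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));

    uint8_t pal[4][4];                        /* four RGBA palette entries */
    rgb565_to_rgb888(c0, pal[0]); pal[0][3] = 255;
    rgb565_to_rgb888(c1, pal[1]); pal[1][3] = 255;

    if (c0 > c1) {
        /* Four-color mode: extra colors at 1/3 and 2/3 between endpoints. */
        for (int i = 0; i < 3; i++) {
            pal[2][i] = (uint8_t)((2 * pal[0][i] + pal[1][i]) / 3);
            pal[3][i] = (uint8_t)((pal[0][i] + 2 * pal[1][i]) / 3);
        }
        pal[2][3] = 255; pal[3][3] = 255;
    } else {
        /* Three-color mode: midpoint plus 1-bit "punch-through" alpha. */
        for (int i = 0; i < 3; i++) {
            pal[2][i] = (uint8_t)((pal[0][i] + pal[1][i]) / 2);
            pal[3][i] = 0;
        }
        pal[2][3] = 255; pal[3][3] = 0;
    }

    /* Bytes 4..7 hold the 2-bit palette indices, one row of four per byte. */
    for (int y = 0; y < 4; y++) {
        uint8_t row = block[4 + y];
        for (int x = 0; x < 4; x++) {
            const uint8_t *c = pal[(row >> (2 * x)) & 3];
            uint8_t *p = out + (y * stride + x) * 4;
            p[0] = c[0]; p[1] = c[1]; p[2] = c[2]; p[3] = c[3];
        }
    }
}
[/code]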
 
I believe there are actually more than two, but then I suppose there aren't many games that get used for benchmarks and also use high-res textures.

The reason high-res textures haven't "taken off", I believe, is that they usually aren't high on the list of priorities for devs.
 
Sigh... it's sad considering how absolutely magnificent the graphics are in those games compared to the textures of, let's say, a Wolfenstein or Jedi Knight 2.
 
borzwazie: Ah... :)


Sigh... it's sad considering how absolutely magnificent the graphics are in those games compared to the textures of, let's say, a Wolfenstein or Jedi Knight 2.

The texturing in those games is hardly bad, especially not RTCW's.
 
Actually, it IS due to nVidia.
Rather than encouraging image quality, nVidia has had a "PUMP THE FRAMERATES" mentality since day one. The result is that only one of the two possibilities of texture compression really gets used.
They encourage the use of S3TC for speed increases, not for quality increases. S3, however many problems their graphics cards had, at least encouraged developers to use compressed textures for better IQ, not sheer speed.
nVidia's mindset over the past two years has killed the usefulness of S3TC.
 
I'm sure UT 2003 will support compressed textures, as Epic has already stated the game will include 32-bit 1024x1024 textures... without 128 MB cards I don't think that option would be possible, but maybe I'm wrong.
I do know S3TC looked awesome when installing the compressed textures off the second CD.
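Rough numbers behind that guess (my own back-of-the-envelope arithmetic, not anything Epic published): a single 1024x1024 32-bit texture is 4 MB raw, so even a 128 MB card only holds a few dozen of them uncompressed, while DXT1 fits eight times as many.

[code]
#include <stdio.h>

/* Back-of-the-envelope: how many 1024x1024 textures fit in a fixed
   budget, raw 32-bit vs. DXT1 (mipmaps and frame/depth buffers ignored). */
int main(void)
{
    const long budget = 128L * 1024 * 1024;  /* a 128 MB card           */
    const long raw    = 1024L * 1024 * 4;    /* 4 MB per raw texture    */
    const long dxt1   = 1024L * 1024 / 2;    /* 512 KB per DXT1 texture */

    printf("raw 32-bit: %ld textures\n", budget / raw);   /*  32 */
    printf("DXT1:       %ld textures\n", budget / dxt1);  /* 256 */
    return 0;
}
[/code]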

[attached screenshot: ut-2.jpg]


A pretty good article here:

http://www.digit-life.com/articles/reviews3tcfxt1/
 
I'm fairly sure I read a comment by CliffyB (from Epic) that making some of those S3TC textures was a nightmare.
 
Althornin said:
Actually, it IS due to nVidia.
Rather than encouraging image quality, nVidia has had a "PUMP THE FRAMERATES" mentality since day one. The result is that only one of the two possibilities of texture compression really gets used.
They encourage the use of S3TC for speed increases, not for quality increases. S3, however many problems their graphics cards had, at least encouraged developers to use compressed textures for better IQ, not sheer speed.
nVidia's mindset over the past two years has killed the usefulness of S3TC.

Which is funny, 'cos while the V3 was beating their frame rates they WERE pimping IQ :LOL:
 
It would be nice if someone who's in game development could give a rundown on roughly how much time a texture artist needs to spend on a texture depending on its resolution... it's unlikely to be independent of the resolution, and it's also unlikely to be a linear function of it. But I'm no artist, so I couldn't begin to guess where exactly in between it would fall.

I'm sure that if you had to go through all the artwork after the fact to add the increased-resolution textures it would be hell...
 
"nVidias mindset over the past two years has killed the usefullness of S3TC."

It couldn't possibly be because developers haven't cared enough to do it? Or publishers haven't felt the expense was worth it?

Yes, it must be because the chip manufacturer has somehow mysteriously convinced everybody its only useful for making things faster, but not better looking.
 
RussSchultz said:
"nVidia's mindset over the past two years has killed the usefulness of S3TC."

It couldn't possibly be because developers haven't cared enough to do it? Or because publishers haven't felt the expense was worth it?

Yes, it must be because the chip manufacturer has somehow mysteriously convinced everybody it's only useful for making things faster, but not better looking.

Did I say "mysteriously"?
Oh wait, those are YOUR words. Get them out of my mouth, thank you very much.
There's not much mysterious about it: when the leading graphics card manufacturer pimps speed over IQ, developers listen. If nVidia had PUSHED for higher IQ like S3 did, more games would use compressed textures that way, which is pretty much undeniable, and that was the main point of my post: nVidia's failing was a failure to push for IQ enhancements.
And Tagrenith, it is funny. Because as soon as their FPS were the highest, their thrust became "faster", not "better looking". And I feel the lack of encouragement from nVidia hurt the use of TC that I like the most: good sharp textures.

If you disagree, fine: post some REASONS. Keep your words outta my mouth, though.
 
Redoing whole texture packages to update everything to high-res S3TC textures would be a pain, but for new artwork I don't think it should take significantly more time to do a high-res texture instead of a low-res one. I would guess most artists are working on higher-res versions of their work anyway and only scaling down for the final version.
 
Generating the compressed textures should be relatively trivial at creation time: simply an additional "Save as". There would likely be some additional effort making sure there were no significant compression errors/artifacts, but still, it's not as if it's a mountain of adversity...
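To put "relatively trivial" in perspective, even a naive per-block DXT1 compressor is only a screenful of code (a toy sketch with invented names; real tools search endpoints far more carefully, which is exactly where the artifact-checking effort goes):

[code]
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into RGB565, and back. */
static uint16_t pack565(const uint8_t *rgb)
{
    return (uint16_t)(((rgb[0] >> 3) << 11) | ((rgb[1] >> 2) << 5) | (rgb[2] >> 3));
}

static void unpack565(uint16_t c, uint8_t *rgb)
{
    rgb[0] = (uint8_t)(((c >> 11) & 31) * 255 / 31);
    rgb[1] = (uint8_t)(((c >>  5) & 63) * 255 / 63);
    rgb[2] = (uint8_t)(( c        & 31) * 255 / 31);
}

/* Squared RGB distance, used both to pick endpoints and assign indices. */
static int dist2(const uint8_t *a, const uint8_t *b)
{
    int dr = a[0] - b[0], dg = a[1] - b[1], db = a[2] - b[2];
    return dr * dr + dg * dg + db * db;
}

/* Encode one 4x4 patch of RGB pixels (in[16][3], row-major) into an
   8-byte DXT1 block. Endpoints are simply the two most distant pixels
   in the patch; a real offline compressor searches much harder. */
void encode_dxt1_block(const uint8_t in[16][3], uint8_t out[8])
{
    /* Pick the pair of pixels furthest apart as the endpoints. */
    int e0 = 0, e1 = 0, best = -1;
    for (int i = 0; i < 16; i++)
        for (int j = i + 1; j < 16; j++) {
            int d = dist2(in[i], in[j]);
            if (d > best) { best = d; e0 = i; e1 = j; }
        }

    uint16_t c0 = pack565(in[e0]), c1 = pack565(in[e1]);
    if (c0 < c1) { uint16_t t = c0; c0 = c1; c1 = t; }  /* c0 > c1 = opaque mode */

    /* Rebuild the same 4-color palette the decoder will see. */
    uint8_t pal[4][3];
    unpack565(c0, pal[0]);
    unpack565(c1, pal[1]);
    for (int k = 0; k < 3; k++) {
        pal[2][k] = (uint8_t)((2 * pal[0][k] + pal[1][k]) / 3);
        pal[3][k] = (uint8_t)((pal[0][k] + 2 * pal[1][k]) / 3);
    }

    out[0] = (uint8_t)(c0 & 0xFF); out[1] = (uint8_t)(c0 >> 8);
    out[2] = (uint8_t)(c1 & 0xFF); out[3] = (uint8_t)(c1 >> 8);

    /* Map every pixel to its nearest palette entry, 2 bits each. */
    for (int y = 0; y < 4; y++) {
        uint8_t row = 0;
        for (int x = 0; x < 4; x++) {
            const uint8_t *px = in[y * 4 + x];
            int bi = 0, bd = dist2(px, pal[0]);
            for (int k = 1; k < 4; k++) {
                int d = dist2(px, pal[k]);
                if (d < bd) { bd = d; bi = k; }
            }
            row |= (uint8_t)(bi << (2 * x));
        }
        out[4 + y] = row;
    }
}
[/code]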
 
IMO, the lack of large S3TC textures is largely due to the fact that, up until 12 months ago, the majority of consumers didn't have a card that supported S3TC. (Even today, V3s and TNT2s make up a large percentage of gamers' primary cards.)
 
I said "mysteriously" because your argument is mighty mysterious to me.

S3TC/DXTC reduces the bandwidth and memory footprint of textures.

A developer can use this advantage for faster rendering with the same-size textures.

Or for the same speed with larger textures.

It's the same method either way, and there's nothing preventing the developer from using it for either goal, or both if they wanted to.

It's completely a developer issue, and I just can't see how NVIDIA (the chip manufacturer) can be blamed for developers choosing one over the other, unless you're blaming their marketing campaign. It's like blaming Intel for Windows getting bloated. Intel's chips "enable" inefficient code because they can execute it quicker, but they certainly didn't cause WinXP on a 1 GHz machine to run the same as Win3.1 did on a 486 at 25 MHz.
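Putting concrete numbers on those two options (my own illustration, not figures from the thread):

[code]
#include <stdio.h>

/* DXT1 stores 4 bits/pixel vs. 32 for raw RGBA, so a developer can keep
   the same texture at 1/8 the footprint and bandwidth (the "faster"
   option), or double both dimensions and STILL use half the memory
   (the "better looking" option). */
int main(void)
{
    printf("1024x1024 raw RGBA: %d KB\n", 1024 * 1024 * 4 / 1024);  /* 4096 */
    printf("1024x1024 DXT1:     %d KB\n", 1024 * 1024 / 2 / 1024);  /*  512 */
    printf("2048x2048 DXT1:     %d KB\n", 2048 * 2048 / 2 / 1024);  /* 2048 */
    return 0;
}
[/code]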
 