The lack of high-res S3TC textures in games

Russ,

Don't doubt for a minute that nVidia helps steer developers in ways that favor their cards. How about all of those custom nVidia OpenGL calls found in the DroneZ and Vulpine benchmarks? Or other games designed to run on a GF3 (like Giants, again with custom nVidia calls to show off their hardware)? In these cases nVidia persuaded the developers to tailor their code to fit nVidia hardware. In fact you can see them pushing people to use alpha blend instead of alpha test in order to work around the lack of AA on alpha-tested textures that their GF3/4 cards suffer from. They have the largest video card install base, and developers do tailor to that.
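(For reference, the two techniques are literally a couple of GL state calls apart - a rough sketch of the trade-off, not code from any particular game:)

#include <GL/gl.h>

/* Alpha test: cheap cut-out edges and no depth sorting needed, but the
   edge is a hard per-pixel cut that multisample AA does not smooth. */
void set_alpha_test_state(void)
{
    glDisable(GL_BLEND);
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.5f);   /* keep texels with alpha > 0.5 */
}

/* Alpha blend: soft edges instead, but the transparent surfaces now have
   to be drawn back-to-front, usually with depth writes turned off. */
void set_alpha_blend_state(void)
{
    glDisable(GL_ALPHA_TEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glDepthMask(GL_FALSE);
}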

Truth is, the issue probably lies somewhere in the middle. I just find it very interesting that you say it cannot be nVidia when we know they have forced developers' hands before....
 
Althornin said:
Did I say "mysteriously"?
Oh wait, that's YOUR word. Get it out of my mouth, thank you very much.
There's not much mysterious about it - when the leading gfx card manufacturer pimps speed over IQ, developers listen. If nVidia had PUSHED for higher IQ like S3 did, then more games would use them - which is pretty much undeniable, and the main point of my post - nVidia's failing was a failure to push for IQ enhancements.
And Tagrenith, it is funny. Because as soon as their FPS were the highest, their thrust became "faster", not "better looking". And I feel the lack of encouragement from nVidia hurt the use of TC for what I like the most - good sharp textures.

If you disagree, fine - post some REASONS. Keep your words outta my mouth though.

Not to further add fuel to the fire that's already brewing, but you don't want Russ to speak for you, yet you just spoke for an entire company (regarding nVidia pushing for speed vs. IQ)? Can you provide some proof of this? Because our dev team must NOT be on that mailing list.
 
jb said:
Don't doubt for a minute that nVidia helps steer developers in ways that favor their cards. How about all of those custom nVidia OpenGL calls found in the DroneZ and Vulpine benchmarks? Or other games designed to run on a GF3 (like Giants, again with custom nVidia calls to show off their hardware)? In these cases nVidia persuaded the developers to tailor their code to fit nVidia hardware. In fact you can see them pushing people to use alpha blend instead of alpha test in order to work around the lack of AA on alpha-tested textures that their GF3/4 cards suffer from. They have the largest video card install base, and developers do tailor to that.

Truth is, the issue probably lies somewhere in the middle. I just find it very interesting that you say it cannot be nVidia when we know they have forced developers' hands before....

I'm saying it's not NVIDIA because there isn't a "FAST S3TC" path and a "BIGGER TEXTURE S3TC" path.

They're one and the same path!

I don't doubt that NVIDIA encourages developers to use techniques that favor their product, but this feature is agnostic when it comes to speed/detail. Are you suggesting that NVIDIA was mum about the thought that you could use this for larger textures and the developers were too stupid to see it on their own? You can't blame NVIDIA for developers using that extra bandwidth for speed instead of larger textures.

Of course, we could also blame the number of games with really crappy gameplay on NVIDIA, because they help developers make pretty games rather than good ones.
 
I don't doubt that NVIDIA encourages developers to use techniques that favor their product, but this feature is agnostic when it comes to speed/detail. Are you suggesting that NVIDIA was mum about the thought that you could use this for larger textures and the developers were too stupid to see it on their own?

I thought he was saying that NVIDIA dev rel would be shining as negative a light on S3TC as possible, since, after all, it's not agnostic in terms of quality.
 
Ty said:
Not to further add fuel to the fire that's already brewing, but you don't want Russ to speak for you, yet you just spoke for an entire company (regarding nVidia pushing for speed vs. IQ)? Can you provide some proof of this? Because our dev team must NOT be on that mailing list.
I said that was their focus (speed over IQ) up to the GF4.
Read any marketing from nVidia for proof.
As for you not wanting to add fuel to a fire, well, that's BS - if you didn't want to, then don't. Not too hard. I won't take away your responsibility for this one with your simple disclaimer, buddy. So suck it down like a man.
 
Althornin said:
Ty said:
Not to further add fuel to the fire that's already brewing, but you don't want Russ to speak for you, yet you just spoke for an entire company (regarding nVidia pushing for speed vs. IQ)? Can you provide some proof of this? Because our dev team must NOT be on that mailing list.
I said that was their focus (speed over IQ) up to the GF4.
Read any marketing from nVidia for proof.
As for you not wanting to add fuel to a fire, well, that's BS - if you didn't want to, then don't. Not too hard. I won't take away your responsibility for this one with your simple disclaimer, buddy. So suck it down like a man.

Obviously you lack a grasp of even simple marketing fundamentals. Let me boil it down for you: numbers - the bigger the better; the same thing your GF complains about, no doubt. When company X can claim 200 fps vs. company Y's 100, it doesn't matter one whit to the average consumer that Y can subjectively claim to look better.

As for the fuel-to-the-fire comment, I was referring to trying not to be so bombastic in the tone of my post - which is something you obviously have a problem with in real life. I have a feeling you "like to suck it down", eh?
 
RussSchultz said:
I don't doubt that NVIDIA encourages developers to use techniques that favor their product, but this feature is agnostic when it comes to speed/detail. Are you suggesting that NVIDIA was mum about the thought that you could use this for larger textures and the developers were too stupid to see it on their own? You can't blame NVIDIA for developers using that extra bandwidth for speed instead of larger textures.

Of course, we could also blame the number of games with really crappy gameplay on NVIDIA, because they help developers make pretty games rather than good ones.

The last part is sarcastic and unhelpful - I'll let you think on your own about how ridiculous it sounds.

However, as for the rest of it, where you appear to have a basic grasp on reality:
1) nVidia has influence over developers.
2) nVidia FAILS to push for higher quality textures, instead allowing developers to be lazy and do zero extra work.
3) This leads to TC being used solely as a speed-up.

If you cannot see this, I don't know what to say.
I'll say it again: S3 did NOT do this - they fought for better textures (re: UT), fought for better IQ. nVidia was content to let developers be lazy and not fulfill the promise of the technology. That is their failing, as I said before (and as you ignored before as well):
If nVidia had PUSHED for higher IQ like S3 did, then more games would use them - which is pretty much undeniable, and the main point of my post - nVidia's failing was a failure to push for IQ enhancements.
 
Ty said:
Obviously you lack a grasp of even simple marketing fundamentals. Let me boil it down for you: numbers - the bigger the better; the same thing your GF complains about, no doubt. When company X can claim 200 fps vs. company Y's 100, it doesn't matter one whit to the average consumer that Y can subjectively claim to look better.

As for the fuel-to-the-fire comment, I was referring to trying not to be so bombastic in the tone of my post - which is something you obviously have a problem with in real life. I have a feeling you "like to suck it down", eh?
Keep your personal attacks and insults in PM, ok?
Who says it's subjective?
Who says the consumer can't value IQ more than FPS?
Who says you are missing the whole goddamn point I made earlier? I DO!
Wow - your whole argument is:
1) I have a small penis
2) I suck cock
3) I have no comprehension of basic marketing

Well, you don't know me.
And I have demonstrated comprehension of more than basic marketing here. You are just not seeing it.
Please, grow up.
Post again when you move out on your own.
 
Well it may not be entirely relevant, but I'm a game developer...

Our team (Stormregion) is relatively new; we have only one game finished, but we designed that game for 32MB cards, counting on texture compression.
That was in the summer of 2000 - the GF2 had just been released as the first nVidia card to officially support DXTC, and there were no 64MB cards available.
Later we added anisotropic filtering as a detail setting that was on by default.

I think I can say we managed to create a title with significantly sharper textures than any game in the genre (which is 3D strategy games). We were following the competition during development, and many times we were shocked to see how badly they did in this area. (The only title we saw with quality textures was Battle Realms, but that supports neither rotation nor zoom...)

So my answer to the question: the support for high-res textures is there, but it's not widely used.
Why? Actually, I haven't got a clue.
But it's not because it's hard to support. (I even wrote a texture compression algorithm, not just decompression.)
And it's not because graphics artists can't do high resolution textures; there were times when we downscaled their textures after finding out that the highest resolution mipmap was never used in rendering...
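(To give an idea of how little is involved: decoding one 4x4 DXT1 block comes down to roughly the following - a from-memory sketch, not our production code. Compression is then "just" picking the two endpoint colors and the per-texel indices for each block.)

#include <stdint.h>

/* Expand a 16-bit RGB565 value into 8-bit-per-channel RGB. */
static void rgb565_to_rgb888(uint16_t c, uint8_t rgb[3])
{
    rgb[0] = (uint8_t)(((c >> 11) & 31) * 255 / 31);
    rgb[1] = (uint8_t)(((c >>  5) & 63) * 255 / 63);
    rgb[2] = (uint8_t)(( c        & 31) * 255 / 31);
}

/* Decode one 8-byte DXT1 block into a 4x4 tile of RGB888 texels. */
void dxt1_decode_block(const uint8_t block[8], uint8_t out[4][4][3])
{
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));
    uint8_t  pal[4][3];
    int      x, y;

    rgb565_to_rgb888(c0, pal[0]);
    rgb565_to_rgb888(c1, pal[1]);

    if (c0 > c1) {   /* 4-color mode: two interpolated colors at 1/3 and 2/3 */
        for (x = 0; x < 3; x++) {
            pal[2][x] = (uint8_t)((2 * pal[0][x] + pal[1][x]) / 3);
            pal[3][x] = (uint8_t)((pal[0][x] + 2 * pal[1][x]) / 3);
        }
    } else {         /* 3-color mode: midpoint color plus "transparent" black */
        for (x = 0; x < 3; x++) {
            pal[2][x] = (uint8_t)((pal[0][x] + pal[1][x]) / 2);
            pal[3][x] = 0;
        }
    }

    for (y = 0; y < 4; y++) {
        uint8_t row = block[4 + y];   /* 2 bits per texel, 4 texels per byte */
        for (x = 0; x < 4; x++) {
            int idx = (row >> (2 * x)) & 3;
            out[y][x][0] = pal[idx][0];
            out[y][x][1] = pal[idx][1];
            out[y][x][2] = pal[idx][2];
        }
    }
}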
 
Now you're changing your tune. First you said, "They encourage the use of S3TC for speed increases - not for quality increases." Now you say, "nVidia FAILS to push for higher quality textures, instead allowing developers to be lazy and do zero extra work."

So which is it? Does NVidia actively convince developers to only use low-res textures or are you complaining that NVidia doesn't push developers to use high-res textures enough?

Your previous argument was that nVidia strongarms developers into using low-res textures. I asked for proof of this because I've never heard of it, even though I am on a dev team. Your retort: "It's out there, you only have to see it". So basically it comes down to taking your word for it. Suuurree. Go sell crazy somewhere else (Rage3D, perhaps?) because we're full over here. ;)
 
Those participating in this thread who're feeling a little hot under the collar... please step back, take a deep breath, and calm down a little.
 
I guess we'll have to just agree to disagree.

I don't see NVIDIA having that much pull over what a developer does or doesn't do, beyond making things easier for them or funding their company (a la Dagoth Moor).

By educating developers about S3TC/DXTC, NVIDIA opens Pandora's box and offers fruit from the tree of life. The knowledge provides extra bandwidth, with no fetters on what it's used for. NVIDIA's part pretty much ends at "take this texture, see how easy it is to make it S3TC/DXTC. Look, it's smaller! Look, it's more efficient compared to the uncompressed one!"
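(With the ARB extension the whole pitch really is about one line of GL - a sketch, assuming a current context exposing EXT_texture_compression_s3tc and an RGBA image already sitting in memory; the driver does the compressing at upload time:)

#include <GL/gl.h>
#include <GL/glext.h>   /* defines GL_COMPRESSED_RGB_S3TC_DXT1_EXT */

/* 'pixels' points at a width x height RGBA8 image. Requesting a compressed
   internal format makes the driver compress the plain RGBA data for us. */
void upload_and_let_driver_compress(int width, int height, const void *pixels)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                 width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}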

Beyond that, what should they have done to not attract your ire?

They can't force houses to develop hi-res artwork, so what then?
 
Wow. Flames got high while I was typing my reply.

All I can say is calm down.

We are not talking about life and death, just games and video cards.

I think it's the responsibility of the video card makers to make video cards and to sell as many of them as possible with whatever marketing fits best.
And I think it's the responsibility of the game developers to make the best games they can.
I don't think the video card manufacturers are to blame when someone produces crappy graphics.

I know we made high resolution stuff, and guess what - it ran best on nVidia cards. The same cards everyone blames for bad quality, or for low speed when the quality is high.
The GeForce3 still has the best quality anisotropic filtering to date. (OK, I'm waiting for the replacement for our alpha Parhelia; I hope it'll beat that :))

So the possibilities are there.
 
There is actually one point you should notice:

Let's say you have a 256x256 hicolor (16-bit) texture on an object.
Replace that texture with a 512x512 DXT1 compressed texture.

1. The compressed one will occupy exactly the same amount of memory.
2. Unless you do something terribly wrong, it'll look much better.
3. It will be faster!

Those who find point 3 surprising need to rethink how mipmapping and trilinear filtering work...
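If you want to check the arithmetic, here's a throwaway snippet (not from any real codebase) that sums the full mip chains:

#include <stdio.h>

/* Bytes for a full mip chain, given base dimensions and bits per texel.
   (DXT1 is 4 bits/texel; below 4x4 a whole 8-byte block is still spent,
   but that rounding is noise next to the totals.) */
static unsigned long mip_chain_bytes(unsigned w, unsigned h, unsigned bits_per_texel)
{
    unsigned long total = 0;
    for (;;) {
        total += (unsigned long)w * h * bits_per_texel / 8;
        if (w == 1 && h == 1)
            break;
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
    }
    return total;
}

int main(void)
{
    printf("256x256 16-bit + mips: %lu bytes\n", mip_chain_bytes(256, 256, 16));
    printf("512x512 DXT1   + mips: %lu bytes\n", mip_chain_bytes(512, 512, 4));
    return 0;
}

Both come out around 170KB. And since each DXT1 texel is 4 bits instead of 16, every fetch pulls a quarter of the data through the memory bus and texture cache - that's where point 3 comes from.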

And while I agree that video card manufacturers should educate developers, I think developers should be smart enough to see for themselves what features are present in a video card and what they can be used for.
A great part of development is doing research on usable (and unusable) techniques...
 
:D
sorry, I can't hold back my laughter...

Everyone who understands even a little Finnish and has looked at Muropaketti's "...Parhelia..." and legendary "Bitboys" threads knows that Althornin and Ty were just warming up! ;)

I have seen (and sadly, also been part of) flame wars running to a few hundred replies in a few days. And all that because someone needed to prove they were right. (And as you know, the discussion goes nowhere that way.)

Angry arguments are the best way to make mistakes and learn from them, but if you don't recognize your own mistakes, you can't learn from them.

Althornin and Ty, before continuing, take that deep breath anyway. :) C'est le Graphique Carte (and if that's nowhere near French, neither am I.) :D

Anyways, just cards, not life. :)
 
RussSchultz said:
They can't force houses to develop hi-res artwork, so what then?

No, but they can encourage it. And they don't (or didn't).
S3 got Epic to do it. nVidia never even tried.

Ty: Apologize, asshat.
As for what else you said:
Now you're changing your tune. First you said, "They encourage the use of S3TC for speed increases - not for quality increases." Now you say, "nVidia FAILS to push for higher quality textures, instead allowing developers to be lazy and do zero extra work."

So which is it? Does NVidia actively convince developers to only use low-res textures or are you complaining that NVidia doesn't push developers to use high-res textures enough?
No, don't you get that they are one and the same thing? Failing to do one is, by proxy, encouragement of the other. I don't see why it is so hard to understand what I am saying.
And if your only new argument is that I changed what I was saying - well, I only restated what I said originally, but more clearly as to what I meant. This is known as clarification. No tune-changing here, buddy. Try to REFUTE my argument, don't character-assassinate me :rolleyes:
 
Nappe1 said:
:D

Althornin and Ty, before continuing, take that deep breath anyway. :) C'est le Graphique Carte (and if that's nowhere near French, neither am I.) :D

Anyways, just cards, not life. :)


Well, about your French: not bad, but try 'C'est la carte graphique' instead and it will look OK ;)
 
I'm not a game developer, but I do work on an app (http://www.shatters.net/celestia/) which uses high resolution S3TC textures. And I figured out how to do this in OpenGL using a tutorial from the developer section of the nVidia web site. I think it's ridiculous to claim that the lack of high resolution textures in games has anything to do with nVidia. It's much more likely that game developers simply don't bother to spend the time to create and distribute two separate sets of textures: one normal one for cards without S3TC, and another high resolution S3TC version for newer cards. Once S3TC is ubiquitous, I'm sure it will enjoy wider support in games.
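For the curious, the core of loading a pre-compressed texture is just a couple of extension calls - roughly like this (a simplified sketch, not Celestia's actual loader):

#include <GL/gl.h>
#include <GL/glext.h>

/* Upload one pre-compressed DXT1 mip level (e.g. data read straight from
   a .dds file). Assumes a current GL context exposing ARB_texture_compression
   and EXT_texture_compression_s3tc; glCompressedTexImage2DARB is fetched
   through the usual wglGetProcAddress/glXGetProcAddressARB mechanism. */
void upload_dxt1_level(int level, int width, int height, const void *blocks)
{
    /* DXT1 packs each 4x4 block of texels into 8 bytes. */
    int size = ((width + 3) / 4) * ((height + 3) / 4) * 8;

    glCompressedTexImage2DARB(GL_TEXTURE_2D, level,
                              GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                              width, height, 0, size, blocks);
}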

--Chris
 
Grue said:
It's much more likely that game developers simply don't bother to spend the time to create and distribute two separate sets of textures: one normal one for cards without S3TC, and another high resolution S3TC version for newer cards.

A couple of points:
No need to develop two separate sets of textures - you can just dither the high-res set down for regular users.
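(i.e. a dumb one-off pass over the art at build or install time - a sketch of the downscale half of it, with dithering to 16-bit as one more step on top; not anyone's actual tool:)

#include <stdint.h>

/* Halve an RGBA8 image by averaging each 2x2 block of texels.
   src is w x h, dst is (w/2) x (h/2); w and h are assumed even. */
void downscale_rgba8_half(const uint8_t *src, int w, int h, uint8_t *dst)
{
    int x, y, c;
    for (y = 0; y < h / 2; y++) {
        for (x = 0; x < w / 2; x++) {
            for (c = 0; c < 4; c++) {
                int a = src[((2 * y)     * w + (2 * x))     * 4 + c];
                int b = src[((2 * y)     * w + (2 * x + 1)) * 4 + c];
                int d = src[((2 * y + 1) * w + (2 * x))     * 4 + c];
                int e = src[((2 * y + 1) * w + (2 * x + 1)) * 4 + c];
                dst[(y * (w / 2) + x) * 4 + c] = (uint8_t)((a + b + d + e + 2) / 4);
            }
        }
    }
}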
And on the other point you make, I happen to think that it IS nVidia's fault for ALLOWING developer laziness.
Features need to be pushed. nVidia could have done this (as S3 did) - but they failed to. This is why it is their fault.
 