The Official S3 Discrete GPU Rumours & Speculation Thread

T-Buffer AA has a programmable grid. RG/OG/sparse, ATi's "temporal" AA, stochastic sampling... all of that is possible :)
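Roughly what those options look like, as a sketch (the offsets below are generic textbook layouts for illustration, not 3dfx's or anyone's actual grid):

```python
# Illustrative per-pixel sample offsets (in 0..1 pixel space) for a few 4x patterns.
# These are generic textbook layouts, not any specific hardware's grid.

ordered_grid_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# Rotated grid: the same square of samples rotated so that every sample
# sits on its own row and column.
rotated_grid_4x = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

# Sparse: hand-placed positions, also one sample per row and column,
# without the rigid rotated-square structure.
sparse_4x = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]

# A "programmable grid" just means offsets like these live in registers the
# driver can reload -- per frame for temporal AA, or pseudo-randomly
# jittered for stochastic sampling.
```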

It's quite silly to criticise 3Dfx for lacking HW AF. Enabling RGSS and setting an adequate texture LOD results in better texture quality than the competition's would-be AF could produce.

It's pretty damn unpopular to say that, but I agree 100%.

AF isn't the end-all, be-all, especially when it still ain't perfect.

Instead of AF they could use higher-res and uncompressed textures with RGSS and an adjustable LOD/mipmap bias.

Although I would at least like to have true trilinear instead of brilinear.

By the way, notice the perfect AF shown in the G80 IQ analysis. Notice how it stays smooth all the way into the distance, then notice how blurry Oblivion's textures can be straight ahead. There is a discernible difference between the G80's HQ mode and the perfect AF shown by the AF tester.
 
AF isn't the end-all, be-all, especially when it still ain't perfect.

Instead of AF they could use higher-res and uncompressed textures with RGSS and an adjustable LOD/mipmap bias.

Although I would at least like to have true trilinear instead of brilinear.
Waiting for screens from a card other than NV30-G7x to show me what's wrong here.

No TC = lower-detail textures and lower performance on various things, by the way. Inescapable. Cards would have to have more RAM and more internal cache, among other things, to match doing things smarter with compression.
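To put rough numbers on the RAM side of that, a back-of-the-envelope sketch (assuming plain RGBA8 versus a 4 bpp S3TC/DXT1-style format):

```python
# Rough memory cost of one 1024x1024 texture with a full mip chain:
# uncompressed RGBA8 (32 bits/texel) versus DXT1-style compression (4 bits/texel).
# The 4/3 factor approximates the extra cost of the mip chain.

width, height = 1024, 1024
mip_chain_factor = 4.0 / 3.0

uncompressed_bytes = width * height * 4 * mip_chain_factor    # 32 bpp
compressed_bytes   = width * height * 0.5 * mip_chain_factor  # 4 bpp

print(f"RGBA8: {uncompressed_bytes / 2**20:.1f} MiB")  # ~5.3 MiB
print(f"DXT1:  {compressed_bytes / 2**20:.1f} MiB")    # ~0.7 MiB
```

So skipping compression costs roughly 8x the memory (and bandwidth) per texture before you even talk about cache behaviour.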

Screwing with LOD can cause lots of texture aliasing.
By the way, notice the perfect AF shown in the G80 IQ analysis. Notice how it stays smooth all the way into the distance, then notice how blurry Oblivion's textures can be straight ahead. There is a discernible difference between the G80's HQ mode and the perfect AF shown by the AF tester.

What does this mean exactly? If you don't force AF in the driver control panel, Oblivion doesn't use it. If you do force it, it works fine. The distance in Oblivion is heavily mip-mapped and LOD'd and will be blurry no matter what. This is a technical limitation caused by our computers not being 100x faster and not having 100x more RAM.
 
Well, by changing the LOD bias you change where the mip maps come in. Without proper mip mapping, you get aliasing.
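In simplified form, where the mip maps "come in" is something like this (a sketch of the standard LOD computation, not any particular driver's exact math):

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_levels=11):
    """Pick a mip level from the worst-axis minification ratio.
    lambda = log2(texels covered per pixel) + bias; a negative bias
    selects sharper (lower) mips, at the cost of texture aliasing."""
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), num_levels - 1)

# A surface where each pixel covers 4 texels normally lands at mip 2;
# a -1.5 bias pulls it down to 0.5 (sharper, but more prone to shimmer).
print(mip_level(4.0))                 # 2.0
print(mip_level(4.0, lod_bias=-1.5))  # 0.5
```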
 
Well, by changing the LOD bias you change where the mip maps come in. Without proper mip mapping, you get aliasing.

Because supersampling FSAA increases the texture sampling rate, you can use textures that contain higher frequencies (-> negative LOD bias) without aliasing.
Of course, this does not hold true for multisampling, where the texture sampling rate is not increased.

http://en.wikipedia.org/wiki/Sampling_theorem#Aliasing
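A rough sketch of that argument in numbers (assuming NxN ordered supersampling and the usual log2-based LOD selection):

```python
import math

def safe_lod_bias(samples_per_axis):
    """With N texture lookups per pixel along each axis, the texture
    sampling rate rises by N, so (per the sampling theorem) roughly
    log2(N) of extra texture frequency fits before it aliases."""
    return -math.log2(samples_per_axis)

print(safe_lod_bias(1))  # -0.0 -> no SS, no headroom for a negative bias
print(safe_lod_bias(2))  # -1.0 -> 2x2 (4x) supersampling
print(safe_lod_bias(4))  # -2.0 -> 4x4 (16x) supersampling
```

Multisampling reuses one texture lookup per pixel, so the same reasoning gives no extra headroom there.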
 

Very much so. Have you actually tried it? On a surface that is rather prone to noise? Try it. And the magical "cleaning" effect of SS is considerably overstated, unless you do ridiculous amounts of SS (4x SS adds something along the lines of what 2x AF does). So, in short, no.
 
Very much so. Have you actually tried it? On a surface that is rather prone to noise? Try it. And the magical "cleaning" effect of SS is considerably overstated, unless you do ridiculous amounts of SS (4x SS adds something along the lines of what 2x AF does). So, in short, no.
I'm speaking about RGSS, which has double the EER of ordered-grid oversampling. And yes, I have all the officially launched 3Dfx hardware and even a Q3D HeavyMetal Mercury system, so I have quite a view of what RGSS can do :)
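A quick way to see the EER point (the sample positions here are just illustrative):

```python
def eer(pattern):
    """Edge-equivalent resolution: the number of distinct x and y sample
    positions, i.e. how many intensity steps a near-vertical or
    near-horizontal edge can produce within one pixel."""
    xs = {x for x, _ in pattern}
    ys = {y for _, y in pattern}
    return len(xs), len(ys)

ogss_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rgss_4x = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

print(eer(ogss_4x))  # (2, 2) -> 2x2 EER
print(eer(rgss_4x))  # (4, 4) -> 4x4 EER, double per axis for the same 4 samples
```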
 
I'm speaking about RGSS, which has double the EER of ordered-grid oversampling. And yes, I have all the officially launched 3Dfx hardware and even a Q3D HeavyMetal Mercury system, so I have quite a view of what RGSS can do :)

I had a longer reply to this, but this sums it up nicely: what?
 
Strange to think back to when ATI, S3, Tseng, Trident, and a few others ruled the 2D graphics world in the early/mid '90s.

Then came the paradigm shift to 3D graphics, and all of them failed miserably compared to the more nimble new companies (3dfx, NV, Matrox, and a couple of others I can't think of).

I was saddest to see Tseng fail. They were always my favorite. Cheap generic cards from random Taiwanese companies I'd never heard of that competed with much more expensive cards from Tier 1 AIBs, and in some cases even made it into the lineups of the Tier 1 AIBs.

ATI and S3 had the largest cash reserves of all of them, and while ATI eventually pulled through and got the hang of the 3D game, S3 never did quite catch up.

I'd be really interested to see S3 get back in the game, even if it's just at the low/mid range. Every time they release a new chip I keep hoping they've learned their lesson and will spend significant time and funding on making solid, stable drivers. And each time they've disappointed me.

Well, here's to hoping yet again that they don't do a half-arsed job on the drivers.

Regards,
SB



I was hoping that Cirrus Logic had gone ahead with their PC card based on the 3D engine of the M2... yes, 3DO's / Matsushita's M2. The window for that was 1996/1997, when Voodoo1 and PowerVR PCX2 were the fastest consumer 3D cards around. M2 was equal if not slightly better in IQ and 3D performance. Even with Voodoo2 coming out, an M2-based card would've been nice as a single-chip 3D accelerator. I'm assuming Cirrus Logic would have used the main M2 chip and not the PowerPC CPUs.

Present day: I too would be happy if S3 came back strong in the low-end to mid-range segments. There's obviously not enough competition.
 
I can confirm 4x RGSS cleans up much of the texture aliasing. LOD -1.5 was a very reasonable setting. The bilinear filtering wasn't even a big issue, as the mipmap boundaries were far away and close to each other, though of course it's far from perfect. With 16-bit@22-bit rendering and compressed 32-bit textures (where applicable) at 800x600 you had a pretty good compromise (on a 17" CRT) if the game wasn't too demanding.

I agree it's bad the thread has diverged, though maybe not so badly. S3 reintroduced RGSS with DeltaChrome! Will the new GPU support it? And is it safe to assume there's MSAA now? That GPU also looks awesome, as far as we can tell from 3DMark scores (which I presume aren't nearly as meaningless as they were in the 3DMark03 days).
 
I can confirm 4x RGSS cleans up much of the texture aliasing. LOD -1.5 was a very reasonable setting. The bilinear filtering wasn't even a big issue, as the mipmap boundaries were far away and close to each other, though of course it's far from perfect. With 16-bit@22-bit rendering and compressed 32-bit textures (where applicable) at 800x600 you had a pretty good compromise (on a 17" CRT) if the game wasn't too demanding.

I agree it's bad the thread has diverged, though maybe not so badly. S3 reintroduced RGSS with DeltaChrome! Will the new GPU support it? And is it safe to assume there's MSAA now? That GPU also looks awesome, as far as we can tell from 3DMark scores (which I presume aren't nearly as meaningless as they were in the 3DMark03 days).

What are we talking about? Some recent title? With rather detailed surfaces that are prone to noise? Or the stuff that was around when 3DFx actually existed? Because I'm quite certain you can see the huge difference in terms of surfaces between then and now.

About S3... dunno. The DeltaChrome and the following Chromes also had nice 3DMark numbers being pushed by S3's marketing. They also sucked badly due to inept drivers and a rather primitive architecture. Will this change? Maybe, who knows.
 
Bumping this thread. Have we any new info on S3's new chips? Anyone who thinks they will be able to compete with AMD and NVidia's low-end offerings?
 
Bumping this thread. Have we any new info on S3's new chips? Anyone who thinks they will be able to compete with AMD and NVidia's low-end offerings?

We have pictures of dies and names of SKUs:

D2 (90nm) left, D3 (65nm) right:
[die shot photo of the two chips]


SKUs:
Chrome 460 (D2)
Chrome 450 (D2)
Chrome 440 (D3)
Chrome 430 (D3)

And some more details about 430:
[spec table image for the Chrome 430]

http://www.3dnews.ru/news/videokarti_na_osnove_dx10_chipov_s3_graphics-270231/

But no info about the architecture so far... :cry:
 
Small and simple boards, lower clocks than previous generations - I don't have much hope.

Wasn't the Laguna a low-cost incarnation of the M2?
 