The Official S3 Discrete GPU Rumours & Speculation Thread

It's not obvious to me that it even makes sense. x86 cores have to support all sorts of legacy cruft and little niceties if you want half-decent single-threaded performance. Grafting all of this into a GPU-like core purely for the sake of being able to intermix the instruction streams doesn't seem to me like such a hot idea (more like con-Fusion!). Why is it necessary? What's wrong with heterogeneous cores?

Something doesn't add up.
 
Hmm? I guess I hadn't given that detail of the transition much thought. So backwards compatibility with existing API versions is out the window? Is that what you're saying?

Sorry, I meant AMD64 (64-bit with complete 32-bit backwards compatibility). I keep confusing these names :oops:
 
It's not obvious to me that it even makes sense. x86 cores have to support all sorts of legacy cruft and little niceties if you want half-decent single-threaded performance. Grafting all of this into a GPU-like core purely for the sake of being able to intermix the instruction streams doesn't seem to me like such a hot idea (more like con-Fusion!). Why is it necessary? What's wrong with heterogeneous cores?

Something doesn't add up.

In the early days of Fusion, i.e. early 2009 assuming AMD isn't late as usual, these instructions I'm talking about will just help the GPU (if on-die) and the CPU communicate, as they share a crossbar, memory controller, etc.

Maybe that is the maximum extent to which they'll play with the ISA; however, they might go beyond that. I remember Bob Drebin and Phil Hester saying something like "then you will have to consider whether you will execute gfx code separately or as x86". That isn't exactly what they said, but the meaning is what I described above. Anyway, I'll try to find the article and post the link ASAP.
 
This should be Destination Films D2:
[attached image: dfd2xq6.jpg]

source

... and the guy says he wants to post more details/specifications later.
 
"3DMark06 score is around 5800, priced at 155 dollars to 199 dollars."

X1950 Pro: 4706 ($136)
X1900 XT: 5677 ($230)
8600 GTS: 5292 ($164)
7950 GT: 5238 ($199)

Benchmarks from X-bit Labs, prices from Newegg (13 July), cheapest card listed.
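For a rough sense of value, here's a quick points-per-dollar comparison using only the scores and prices above (a minimal sketch: the rumoured S3 part is taken at ~5800 points and the midpoint of its $155-$199 range, and Python is used purely as a calculator):

```python
# 3DMark06 points per dollar, from the scores and prices quoted above.
# The rumoured S3 part uses ~5800 points and the midpoint of $155-$199.
cards = {
    "S3 (rumoured)": (5800, (155 + 199) / 2),
    "X1950 Pro":     (4706, 136),
    "X1900 XT":      (5677, 230),
    "8600 GTS":      (5292, 164),
    "7950 GT":       (5238, 199),
}

for name, (score, price) in cards.items():
    print(f"{name:14s} {score:5d} pts  ${price:6.2f}  {score / price:4.1f} pts/$")
```

If those numbers hold up, the S3 part would sit in roughly the same points-per-dollar territory as the X1950 Pro and 8600 GTS rather than clearly ahead of them.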
 
I still fondly recall my Savage4. If S3 can put out something decent, with decent drivers, I'd be interested. I'd wait until at least 3 driver sets were released, but I'd still be interested. More power to them; I doubt they could compete with ATI or NVidia, but I'll cheer them on for trying!
 
More competition would of course be great, but realistically I'd have a hard time believing that S3 will outperform ATI/NVidia's last-gen integrated solutions by any noticeable margin.

Hopefully I'll be pleasantly shocked.
 
I think they should focus on IQ and compatibility rather than performance.

If someone (Matrox, S3, Intel) came out with better IQ than ATI and NVIDIA, then ATI and NVIDIA should get out of the graphics business and out of engineering altogether.

None of that will happen.

A lot of people don't realise how many unnecessary image quality compromises ATI and NVIDIA force, how many features they leave out, and how poorly their products are engineered and manufactured, heat included.

They could reduce the heat, wattage, and voltage of their cards by 33% without sacrificing anything.

They could use AS5, run the chips at a lower voltage, put aluminum blocks on the bottom of all of their cards, allow raising the fan speed without a whole bunch of BS, and use both aluminum and copper for heat dissipation.
 
Okay ballsweat, since you're so damn smart and know how to do everything, why don't you start up your own company and go head to head with ATI and Nvidia? I'll tell you why: because you're talking out of your ass and don't have a leg to stand on.
 
Okay ballsweat, since you're so damn smart and know how to do everything, why don't you start up your own company and go head to head with ATI and Nvidia? I'll tell you why: because you're talking out of your ass and don't have a leg to stand on.

I'm not really smart, and I don't really know how to do everything.

I'm the one who has to put up with mediocre image quality, poor compatibility, and shitty drivers, and I think that should be fixed by the people who created it.

Although, when I read a game review on GameSpot and they give a great game a bad review, I feel the same way. It makes me want to tell GameSpot that I'd like to see them make a game.
 
I'm not really smart, and I don't really know how to do everything.

I'm the one who has to put up with mediocre image quality, poor compatibility, and shitty drivers, and I think that should be fixed by the people who created it.

Although, when I read a game review on GameSpot and they give a great game a bad review, I feel the same way. It makes me want to tell GameSpot that I'd like to see them make a game.

OTOH, you may just be some guy whining on a forum... hard to say. But knowing what you equate with great image quality... hmm... that's just about all I have to say. And it's rather naive to think that multi-million-dollar projects are hamstrung because they can't figure out how to use AS5 and aluminum blocks. As shocking as it may seem, the guys at nV and ATi know about these incredible innovations. The mystery as to why they're not using them is for you to solve (the first answer that comes to mind, namely "they're evil and they're not 3dfx", doesn't count; you'll have to think harder about it).
 
OTOH, you may just be some guy whining on a forum... hard to say. But knowing what you equate with great image quality... hmm... that's just about all I have to say. And it's rather naive to think that multi-million-dollar projects are hamstrung because they can't figure out how to use AS5 and aluminum blocks. As shocking as it may seem, the guys at nV and ATi know about these incredible innovations. The mystery as to why they're not using them is for you to solve (the first answer that comes to mind, namely "they're evil and they're not 3dfx", doesn't count; you'll have to think harder about it).


thanks=)

I know that they know they can use AS5 instead of the inferior thermal paste they use, but they choose not to, even though they could. It would make about a 3-5 degrees Celsius difference.

They choose not to because they know it's of no benefit to them.

I may like cooler temps, but nobody else cares about the temps.

About 3dfx: it's not really that I'm a 3dfx fanboy. It's what 3dfx did well that no one else has tried to do. I like 3dfx better for their technologies, which are still, IMO, better than a lot of the things NVIDIA and ATI do.

I bought two 8800 GTXs and my motherboard is a 680i, so it's not like I boycott NV products or anything.

If the Voodoo5 had done coverage sampling and NVIDIA had done T-buffer AA ever since the GeForce 2 GTS, then I would consider them on the same level.
 
I like 3dfx better for their technologies, which are still, IMO, better than a lot of the things NVIDIA and ATI do.
3dfx isn't around anymore because they couldn't keep up with what NVIDIA (especially) and ATI created. Simple as that. Other than Voodoo5's AA, their competing products weren't technologically ahead of NVIDIA post RIVA TNT.

3dfx had better texture filtering for a while tho. NV liked to do the blur.
 
I know that they know they can use AS5 instead of the inferior thermal paste they use, but they choose not to, even though they could. It would make about a 3-5 degrees Celsius difference.

Could you clarify: what's the concrete gaming benefit of a 3 °C lower die temperature?
 
A lot of people don't realise how many unnecessary image quality compromises ATI and NVIDIA force, how many features they leave out, and how poorly their products are engineered and manufactured, heat included.
Like lossless texture compression? What the fuck are you babbling about? (Am I allowed to troll my own forum?)
 
3dfx isn't around anymore because they couldn't keep up with what NVIDIA (especially) and ATI created. Simple as that. Other than Voodoo5's AA, their competing products weren't technologically ahead of NVIDIA post RIVA TNT.

3dfx had better texture filtering for a while tho. NV liked to do the blur.

I really should've just said AA; you're right. Maybe the W-buffer too, but I don't know if 3dfx invented it; if they did, then that.
 
Like lossless texture compression? What the fuck are you babbling about? (Am I allowed to troll my own forum?)

I'm sorry.

But anyway, I don't think they currently have any lossless TC that I know of. DXT and S3TC and even 3Dc are lossy.
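To make the "lossy" part concrete: DXT1/S3TC packs each 4x4 block of pixels (64 bytes of RGBA8) into 8 bytes, two RGB565 endpoint colours plus a 2-bit index per pixel, so the ratio is a fixed 8:1 and 16 arbitrary colours have to be approximated by at most 4. That fixed block size is also what lets the hardware fetch any texel at random. Here's a minimal, illustrative decoder sketch (not any vendor's actual code):

```python
import struct

def decode_dxt1_block(block: bytes):
    """Decode one 8-byte DXT1/S3TC block into 16 RGB pixels (illustrative only)."""
    c0, c1, indices = struct.unpack("<HHI", block)  # two RGB565 endpoints + 16 x 2-bit indices

    def expand565(c):
        # Expand RGB565 to roughly 8 bits per channel.
        return ((c >> 11) << 3, ((c >> 5) & 0x3F) << 2, (c & 0x1F) << 3)

    p0, p1 = expand565(c0), expand565(c1)
    if c0 > c1:
        # Four-colour mode: two interpolated colours between the endpoints.
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:
        # Three-colour mode plus transparent black.
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]

    # Each of the 16 pixels picks one of at most 4 palette entries --
    # that's where the loss comes from: 16 arbitrary colours become 4.
    return [palette[(indices >> (2 * i)) & 0b11] for i in range(16)]
```

A lossless scheme couldn't guarantee a fixed output size per block, which is exactly what random texel access in hardware needs, so it's no surprise the shipping formats are all lossy.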

Other compromises include:

Still not perfect AF (see the G80 IQ analysis).
CSAA and MSAA instead of T-buffer-style AA.
Driver code for DX6 has been screwed up starting with the 7x.xx series drivers.
16-bit output not internally rendered at 32-bit.
Some other things.
Too much distance fog, although I guess that's more of a DX issue.
 
Still not perfect AF (see the G80 IQ analysis).
I'll bet my trousers that you won't be able to tell the difference with a perfect AF implementation in a blind test.
CSAA and MSAA instead of T-buffer-style AA.
T-buffer = another implementation of multisampled rendering; you do know this, right?
Driver code for DX6 has been screwed up starting with the 7x.xx series drivers.
Who cares?
16-bit output not internally rendered at 32-bit.
See comment above.
Some other things.
Too much distance fog, although I guess that's more of a DX issue.
Do you know that applications control fog? Blame whoever wrote your favourite foggy game, not the graphics card.
 
I always thought the t-buffer as applied to AA was nothing more than rotated grid supersampling. The hardware devoted to it was probably very minimal.

---------------------------

ballsweat, I don't know why you think so highly of 3dfx. The most important IQ feature is AF, as it affects such a huge percentage of the pixels on the screen. I will never play a game without enabling AF if possible, even if the performance hit is big (which generally means its importance is big). 3dfx never had AF in any of its chips.

With AA, the real innovation was MSAA with colour compression, and 3dfx had neither. 4xAA with a >75% performance hit is rather useless. I saw games take a 70% hit on a GF3 since it had no colour compression, and being ordered grid, that was rather useless too.

Lossless texture compression is useless for a graphics chip. You'll never see it happen.

R300 is the first chip to really deliver on IQ at a usable cost. So in summary, I concur with Tim Murray and wonder what the fuck you're babbling about.
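To put rough numbers on the MSAA vs. supersampling point: supersampling (which, as noted above, is essentially what T-buffer AA amounts to) shades and textures every sample, multisampling shades once per pixel and only replicates coverage/depth, and colour compression cuts the framebuffer traffic for the mostly-identical samples. A back-of-the-envelope sketch, where the 70/30 shading/bandwidth split and the 1.5x compressed-traffic factor are made-up assumptions purely to illustrate the scaling:

```python
# Back-of-the-envelope cost of 4x AA relative to a no-AA frame of cost 1.0.
# The 70/30 shading/bandwidth split and the compression factor are assumptions.
SAMPLES = 4
SHADE_SHARE, BW_SHARE = 0.7, 0.3

def frame_cost(shade_every_sample: bool, colour_compression: bool) -> float:
    # Supersampling shades every sample; multisampling shades once per pixel.
    shade = SHADE_SHARE * (SAMPLES if shade_every_sample else 1)
    # Without compression, framebuffer traffic scales with the sample count;
    # compression collapses the mostly-identical samples of interior pixels.
    bandwidth = BW_SHARE * (SAMPLES if not colour_compression else 1.5)
    return shade + bandwidth

for name, cost in [("4x SSAA (T-buffer style)",       frame_cost(True,  False)),
                   ("4x MSAA, no colour compression", frame_cost(False, False)),
                   ("4x MSAA + colour compression",   frame_cost(False, True))]:
    print(f"{name:31s} ~{cost:.1f}x the work  (~{100 * (1 - 1 / cost):.0f}% fps hit)")
```

With those assumptions, 4x supersampling works out to roughly a 75% hit, in line with the figure quoted above, while MSAA with colour compression stays comparatively cheap.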
 