So what do you think about S3's DeltaChrome DX9 chip?

JD said:
I tend to agree with Trident when they say that programmable hardware is easier to design and write drivers for, and that this makes it possible for other, smaller companies to compete. It's also nice that MS is doing the HLSL that IHVs can plug into. Maybe this is why S3 went with 2.0+ shaders rather than plain 2.0.

While it may be easier to create working drivers and hardware, it's another thing altogether to create efficient hardware (and by efficient I mean efficient in all areas: a high image quality/performance ratio and a high actual-to-theoretical performance ratio). I really don't believe for a moment that S3 or Trident will ever be able to compete with nVidia or ATI in producing a truly good piece of programmable hardware, i.e. one that could actually compete against ATI's or nVidia's high-end parts if clocked higher and matched with better memory (perhaps requiring a die shrink to match theirs, considering that ATI and nVidia will have less trouble getting onto the more advanced processes).

First of all, S3 is a crap company and has been a crap company since their first crap attempt at producing a 3D video card. They've had crappy drivers and crappy hardware all throughout the 3D graphics era. Now that they're owned by an even crappier company, VIA, I see no reason for this to ever change.

Trident is similarly a company that has always shot for the low-end. While I have a slightly higher opinion of them than I do of S3, it's still pretty darned low. Trident's only hope is to attempt to focus on a niche market (low volume, high margin) in order to build up capital for significant R&D spending. Without that, they're doomed to failure.

As a side note, PowerVR has a ghost of a chance sometime in the distant future with their design wins for the MBX. I can, pretty easily, see PowerVR doing very well in the very low-power video segment (such as for extra-portable laptops and the ever-increasing number of handheld devices), which may, in time, allow for an eventual return to the desktop PC market (if PowerVR so chooses). I don't think they would do well if they attempted to get back in the desktop market anytime very soon.
 
Chalnoth said:
First of all, S3 is a crap company and has been a crap company since their first crap attempt at producing a 3D video card. They've had crappy drivers and crappy hardware all throughout the 3D graphics era. Now that they're owned by an even crappier company, VIA, I see no reason for this to ever change.

Ah, the typical open minded chalnoth post. MY life is now complete.
 
Chalnoth said:
Trident is similarly a company that has always shot for the low-end. While I have a slightly higher opinion of them than I do of S3, it's still pretty darned low.
I always felt the opposite.
 
Actually, on the hardware side S3 did do a good job; Savage2000 had REALLY good specs, and based on specs alone the S2k should have beaten the GeForce256. It's a shame the drivers were horrendous, really.
 
Tagrineth said:
Actually, on the hardware side S3 did do a good job; Savage2000 had REALLY good specs, and based on specs alone the S2k should have beaten the GeForce256. It's a shame the drivers were horrendous, really.
I don't think you can really make that distinction, though. That is, the drivers might be poor because the hardware's really hard to write drivers for. Just look at how S3 was never able to get hardware T&L working (that, and they were originally going to do CPU/GPU dynamic load balancing... which indicates a very weak T&L processor in the first place).
 
OpenGL guy said:
Chalnoth said:
Trident is similarly a company that has always shot for the low-end. While I have a slightly higher opinion of them than I do of S3, it's still pretty darned low.
I always felt the opposite.
Yes, I suppose S3 did bring the first form of texture compression to the PC. Perhaps I just have a lower opinion of S3 because they've failed so many times. Trident has never done well enough to fail so spectacularly.
 
Althornin said:
Chalnoth said:
First of all, S3 is a crap company and has been a crap company since their first crap attempt at producing a 3D video card. They've had crappy drivers and crappy hardware all throughout the 3D graphics era. Now that they're owned by an even crappier company, VIA, I see no reason for this to ever change.

Ah, the typical open minded chalnoth post. MY life is now complete.

Well, it should let you know that I at least have some respect for ATI and PowerVR. I have no qualms about totally bashing a company for which I have no respect. Furthermore, I don't think I said anything that was unwarranted. VIA's motherboard chipsets have always been buggy (well, for at least as long as I've owned them, which probably means always), particularly with regards to AGP. The Cyrix brand they own is as bad as it gets when it comes to PC CPUs, and S3 isn't far ahead.

But I will say that when I bought an S3 Virge once upon a time, it was far better than my previous card at 2D (videos were no longer choppy). Its 3D, however, was far from acceptable. I had one game that worked okay but didn't run nearly fast enough, another that not only ran very slowly but also had many rendering errors, and a third that ran pretty well but managed to be unstable. The way I see it, S3's success in 2D doesn't give me any cause to respect them.
 
Chalnoth said:
VIA's motherboard chipsets have always been buggy (well, for at least as long as I've owned them, which probably means always), particularly with regards to AGP.
I haven't had any problems with my KT266a, but note that I feel the same way as you when it comes to VIA's first revision releases (i.e. KT133 vs. KT133a and KT266 vs. KT266a). VIA seems to release chipsets with some problems then fixes them in a later rev. Not unheard of in the PC industry, but chipset problems can be very dangerous.
 
I've also noticed that. VIA is contemplating releasing a KT400A that supports 400 MHz DDR speeds; there's something in the current KT400 that makes it unstable at those speeds. I agree with you, Chalnoth, that the difference is between having the features and having them run optimized, though I have to give VIA/S3 credit for even attempting to bring something close to the GFFX/9700, with 2.0+ shaders to boot. I had an old 1 MB Trident card (upgradable to 2 MB) on my 486, oh the memories :)
 
I tend to agree about Via. I had so much trouble with them back in the day (KT133a) that I refuse to ever buy another one of their chipsets again.
 
Actually, the KT400 doesn't have a 1/6 divisor, and they're also adding an improved memory controller.
It can run at a 200 MHz FSB fine if your components can handle it (my KT333 is running at a 202 MHz FSB as we speak).
A lot of the damage to VIA's reputation came from Creative making a sound card that was out of PCI spec and abused the bus so badly it crashed systems.
Faster product updates happen with more competition, which they've had from SiS and nVidia, and for a while ALi.
I didn't see you bashing nVidia and ATI so much for just updating theirs to add only AGP 8x, and for other things they've done that are no different from VIA.
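To spell out why the missing 1/6 divisor matters, here's a minimal sketch of the arithmetic; the specific divisor options listed are illustrative assumptions, not a verified list of what these chipsets actually offer.

```python
# Rough sketch: how a fixed FSB-to-PCI divisor determines the PCI clock.
# Divisor/FSB pairings below are illustrative, not verified KT333/KT400 options.

PCI_SPEC_MHZ = 33.3

def pci_clock(fsb_mhz: float, divisor: int) -> float:
    """PCI clock that results from running the given FSB through a fixed divisor."""
    return fsb_mhz / divisor

for fsb, div in [(133, 4), (166, 5), (200, 5), (200, 6)]:
    clock = pci_clock(fsb, div)
    status = "in spec" if abs(clock - PCI_SPEC_MHZ) < 1.5 else "overclocked"
    print(f"FSB {fsb} MHz / {div} -> PCI {clock:.1f} MHz ({status})")

# Without a 1/6 divisor, a 200 MHz FSB pushes PCI to 40 MHz (200/5),
# well outside spec -- one reason stability can suffer at those speeds.
```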
 
psurge said:
What about multi-sampling? Wouldn't the blend have to be performed multiple times for a given shader result?
As I wrote above. Yes, multisampling is a problem.
If you want the framebuffer as input to the PS, then you need to average the subsamples of that pixel. And (at least in most cases) it would be best if the average is taken over just the subpixels that eventually will be written to.
But that's of course difficult if the PS is modifying Z.
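To make Basic's point concrete, here's a minimal sketch in Python-style pseudocode; the function name, the 4-sample layout, and the coverage handling are illustrative assumptions, not any real API.

```python
# Sketch of resolving multisampled framebuffer color for use as a pixel-shader input.
# Names and the 4-sample layout are illustrative only.

def resolve_for_shader_input(subsample_colors, coverage_mask):
    """Average only the subsamples this fragment will actually write to.

    subsample_colors: list of (r, g, b) tuples, one per subsample in the pixel.
    coverage_mask:    list of bools, True where the incoming triangle covers
                      the subsample (and the depth test passed).
    """
    covered = [c for c, hit in zip(subsample_colors, coverage_mask) if hit]
    if not covered:
        return None  # fragment writes nothing; no blend input needed
    n = len(covered)
    return tuple(sum(channel) / n for channel in zip(*covered))

# Example: a pixel with 4 subsamples, where the triangle covers only 2 of them.
framebuffer_samples = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
coverage = [True, True, False, False]
print(resolve_for_shader_input(framebuffer_samples, coverage))  # (1.0, 0.0, 0.0)

# The catch Basic raises: if the shader itself modifies Z, the coverage/depth
# outcome isn't known until after the shader runs, so you can't know up front
# which subsamples to average.
```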
 
As a side note, PowerVR has a ghost of a chance sometime in the distant future with their design wins for the MBX. I can, pretty easily, see PowerVR doing very well in the very low-power video segment (such as for extra-portable laptops and the ever-increasing number of handheld devices), which may, in time, allow for an eventual return to the desktop PC market (if PowerVR so chooses). I don't think they would do well if they attempted to get back in the desktop market anytime very soon.

If they want to keep their recent success going as far as the PDA/mobile market is concerned, they have to keep up development for future products too, and that's probably where the current development of Series5 kicks in, which I doubt is aiming at just one platform.

Whether they'll decide to re-enter the PC desktop market with that generation is still a question mark, due to more than one conditional involved, but the public statements (sparse as they are) do suggest that they're planning to.

It's an open secret that any possible success in any market depends more on the partner that picks up the licence than on ImgTec themselves. The tough task for them will be to pick the right partner, one that is committed to invest and stay.
 
Wait a second, Chalnoth, did you ever use a Savage 4 on an Intel motherboard? The card admittedly had serious issues on VIA chipsets, but it was very stable on my C300A @ 450 on an Abit BH6 w/ 440BX.

UT was awesome too. Q3 ran ok, as did Q2/Half-Life and a host of direct 3d games. The image quality was excellent for the time. It wasn't the fastest card, but it did very, very well for what I paid for it ($90).

I didn't get rid of the card until I upgraded to a FIC AZ11 (KT133) and a Duron 600. I promptly bought a GeForce2 MX because the Savage 4 would crash whenever various repeatable conditions occurred. S3 simply didn't agree with VIA at the time, in terms of hardware behavior/compatibility.
 
Yes, the idea that S3 should be tarred with the VIRGE brush is a little unfair. Savage3D and Savage4 were mostly excellent chips (albeit with, as you say, spotty compatibility). Even Savage2000 could have competed if they'd pushed it right.

You might as well have a go at ATI because their first chip didn't have a Z-buffer, or Matrox because they didn't do texture mapping, or nvidia because of NURBS, etc....
 
Tagrineth said:
Actually, on the hardware side S3 did do a good job; Savage2000 had REALLY good specs, and based on specs alone the S2k should have beaten the GeForce256.

At that time there was a comment in circulation, presumably from an S3 engineer, that they didn't get to finish (remove bugs from) the hardware because management wanted it so badly for the imminent Christmas season. Hence the hopeless battle to repair or circumvent the damage by driver hacks. It was, quite simply, broken hardware... hardly a "good job". Assuming it was an S3 engineer who claimed that and assuming he/she was telling the truth, of course!

IIRC, Savage2000 had "pixel/texel" [thanks again, 3dfx] fillrates of 250/500, compared to the GeForce's 480/480. Savage2000 was initially announced to launch at 175 MHz (350/700), but even then it wouldn't have beaten the GeForce in single-textured pixel fillrate. (And single texturing still occurs fairly often, even in games that use multitexturing for effects.)

But Savage2000's problems and GeForce's DDR version made that a moot point.
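For anyone who wants the arithmetic behind those numbers spelled out, here's a minimal sketch; the pipeline/TMU counts and clocks are assumptions chosen to be consistent with the figures quoted above rather than verified specs.

```python
# Back-of-the-envelope fillrate math behind the numbers quoted above.
# Pipeline/TMU counts and clocks are assumptions consistent with those figures.

def fillrates(clock_mhz, pixel_pipes, tmus_per_pipe):
    """Return (Mpixels/s, Mtexels/s) for a simple fixed-function pipeline."""
    pixels = clock_mhz * pixel_pipes
    texels = pixels * tmus_per_pipe
    return pixels, texels

# Savage2000: 2 pipes x 2 TMUs. At its announced 175 MHz -> 350/700;
# at roughly 125 MHz -> the quoted 250/500.
print(fillrates(175, 2, 2))   # (350, 700)
print(fillrates(125, 2, 2))   # (250, 500)

# GeForce 256: 4 pipes x 1 TMU at ~120 MHz -> 480/480.
print(fillrates(120, 4, 1))   # (480, 480)

# Even at 175 MHz, Savage2000's single-textured pixel rate (350) trails
# the GeForce's 480, which is the point being made in the post.
```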
 
Basic said:
psurge said:
What about multi-sampling? Wouldn't the blend have to be performed multiple times for a given shader result?
As I wrote above. Yes, multisampling is a problem.
If you want the framebuffer as input to the PS, then you need to average the subsamples of that pixel. And (at least in most cases) it would be best if the average is taken over just the subpixels that eventually will be written to.
But that's of course difficult if the PS is modifying Z.

If I understand you right, the validity of your concern depends on whether all that color/Z reading/writing takes place on an oversampled buffer or on a down-sampled buffer.
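To illustrate the distinction being drawn here, a small sketch of the two options; this is purely illustrative, with a placeholder 50/50 mix standing in for whatever blend the pixel shader would actually compute.

```python
# Two ways a framebuffer read/blend could interact with multisampling.
# Pure illustration; no real API is being modelled here.

def blend(src, dst):
    """Placeholder programmable blend: simple 50/50 mix per channel."""
    return tuple(0.5 * s + 0.5 * d for s, d in zip(src, dst))

def blend_oversampled(src, subsamples, coverage):
    """Option 1: run the blend once per covered subsample of the oversampled buffer.
    The shader sees the true per-subsample color, so no averaging is needed,
    but the blend (or the whole shader) runs multiple times per pixel."""
    return [blend(src, d) if hit else d for d, hit in zip(subsamples, coverage)]

def blend_downsampled(src, subsamples, coverage):
    """Option 2: blend once against the averaged (resolved) color, then write the
    single result to every covered subsample. Cheaper, but the input is only an
    approximation of what each subsample really held."""
    avg = tuple(sum(ch) / len(subsamples) for ch in zip(*subsamples))
    out = blend(src, avg)
    return [out if hit else d for d, hit in zip(subsamples, coverage)]
```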
 
In reply to S3 having always been a crappy company: I remember a time when my S3-powered Diamond Stealth 64 VRAM kicked the ass of all it surveyed in the land of video cards (until Matrox released their Millennium, that is).
It wasn't until the disaster that was the VIRGE that S3 got a bad name. :oops:
 