S3 as viable 3rd IHV?

Jimmers

Newcomer
Just noticed this over at xbit. I'd also like to see another player in the GPU field. Seeing as WinVis will require 3D acceleration, and Intel doesn't seem up to the task, I think S3 could carve out a nice niche for themselves dealing in entry and mid-range parts. Anyway, aren't S3's current offerings on par with NV's and ATI's?

The Article said:
Nadeem Mohammad (S3): The marketplace is screaming out for a solid third supplier of quality graphics products – and we will be that company. Both of the two companies you mentioned have expanded their business beyond graphics processors, and their sizes reflect that, so when S3 Graphics and VIA are viewed together as an entity, the relative sizes of their resources become quite different.
 
I don't even think they come close; they may be on par with the generation before last (5900, 9800), but I'm not so sure. I just hear about driver issues with them.
 
S3 has been trying for years, but they first need stable drivers, and AFTER that, performance...
 
S3 has not been able to bring their parts to market in a reasonable timeframe, and they never seem to have stable drivers available at introduction.
Most reviews of a product get written at launch, and if you can't have semi-decent drivers ready by then, I wonder why they even bother. It's bad enough that even "stable" drivers from Nvidia and ATI cause crashes; I can't imagine using drivers as bad as S3's have been described as being.
I don't think Intel will let the Vista-accelerated market go to the 3rd parties. If it's mainstream, they WILL have their fingers in the pie, you can be assured of that. If for some unknown reason I had to have a 3rd-party low-end graphics solution, I'd take Intel over S3.
That's not to say that S3 can't do it, but they need to show some resolve and determination to get out a product that is timely (i.e. still has a compelling price/performance ratio compared to the competition when it ships) and stable, with good IQ. If they do that, I can see them cutting into the OEM low-end market.
 
Sadly, I have come to the conclusion that as this industry matures, the barrier to entry at the top end is far too formidable for any reasonable expectation of a viable third player. The candidates are either too small (S3), or have better things to do with their money and energies (Sony, MS) and so would rather pay one of the Big Two for their needs.

Possibly if one of the Big Two crashed and burned we'd see someone else take their place over time, as the "antis" (of whichever flavor) flocked to them out of sheer cussedness and a touch of idealism... but otherwise, I think "it is what it is" at the top.
 
If S3 ever delivered a product on time, I'd say it'd have a shot. But it hasn't. Ever.
 
The Baron said:
If S3 ever delivered a product on time, I'd say it'd have a shot. But it hasn't. Ever.

Actually... Savage 3, Savage 4 and Savage 2000 were all delivered on time. Savage 4 was the best of that lot, with actually decent performance, drivers and features.

Savage 2000 was a Quake 3-playing monster of a machine but had completely broken T&L.

S3/VIA has the size and the base to get it done if they get serious about it. In fact, they are the only possible 3rd playa with a shot. Past driver and hardware issues don't have anything to do with a current product if they do it right. ATI already made this transition to get where they are now.

Their DeltaChrome technology is pretty good for what it is. The drivers are good. The IQ is good from everything I have seen, and it works. All they need to do is beef it up a lot and make a play for the big time.
 
A. I'm familiar with Savage3/4/2000, and yeah, they were good steps toward truly competitive products. Since then (the merger, pretty much), S3 has done NOTHING. I mean, DeltaChrome previews were from around Dec 03, but could you even buy one before mid-04? I certainly didn't see anywhere to do so. If they launch and have availability, they will sell chips in HTPCs and the like. You don't need a huge high-end chip to break into the market. Cater to a niche market, get devs using your drivers, and then slowly build up your hardware. S3 could do this if they weren't six months late with everything.

B. Yeah, I play with titles. :p A friend of mine and I determined that the true Chief Sploit Scientist is Randy Bryant, dean of the school of computer science here, but hey, I'm not seeing any other Sploit Scientists on the B3D boards.
 
Everyone likes the underdog. ATI managed it a while back, and they had pretty poor drivers compared to the competition.

Besides, whatever happened to XGI? Shouldn't they have taken over the market by now? :LOL:
 
Monty said:
Everyone likes the underdog. ATI managed it a while back, and they had pretty poor drivers compared to the competition.

Besides, whatever happened to XGI? Shouldn't they have taken over the market by now? :LOL:
I'm still waiting for the 8-way V8.

V8^2!
 
From the little I have read, S3 seems to be doing better tech-wise than XGI.
I think one solution would be if S3 and XGI merged or something...
 
So that they would have S3's asstastic drivers AS WELL AS XGI's craptastic hardware? Yeah, I think that would work. There have to be some hardware masochists around, and certainly they would flock to such a combination :)
 
Skrying said:
I don't even think they come close; they may be on par with the generation before last (5900, 9800), but I'm not so sure.
You always have to keep in mind that S3's texture filtering hardware is still "pure", while both ATI and NVIDIA* are pulling off pretty outrageous tricks to boost their performance.

E.g. you can pretty much subtract 30~40% from all the default benchmarks run by "reviewers" 'round the net on the latest NVIDIA tech*, because that's what you'll end up with once you switch to "High Quality" in the "fixed" drivers ... protecting your eyes from popping out of your skull in the process.

*I'm not interested in discussing who does worse, that's irrelevant here. NVIDIA was chosen because numbers are abundant and I only needed one example srykthx

The Chrome series can actually do full trilinear as fast as the big competition does bilinear, and faster than brilinear. This is a clear edge in quality and performance vs whatever the big two offer at the same pipe configuration.
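
To put numbers on "full trilinear vs. bilinear vs. brilinear", here's a minimal sketch of the textbook filtering math in Python. This is my own illustration, not anyone's actual hardware path, and the `band` knob in the brilinear function is a made-up parameter for demonstration, not a real driver setting:

```python
# Textbook filtering math, as a sketch: trilinear = two bilinear taps
# from adjacent mip levels blended by fractional LOD (hence roughly
# twice the texel fetches of bilinear); "brilinear" shrinks the blend
# to a narrow band so most pixels fall back to a single bilinear tap.

def lerp(a, b, t):
    return a + (b - a) * t

def bilinear(mip, u, v):
    """Four texel taps from one mip level (a 2D list), blended by the
    sub-texel offsets."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = lerp(mip[y0][x0], mip[y0][x1], fx)
    bot = lerp(mip[y1][x0], mip[y1][x1], fx)
    return lerp(top, bot, fy)

def trilinear(mips, u, v, lod):
    """Blend bilinear taps from the two mip levels straddling the LOD."""
    lo = min(int(lod), len(mips) - 2)
    return lerp(bilinear(mips[lo], u, v), bilinear(mips[lo + 1], u, v),
                lod - lo)

def brilinear(mips, u, v, lod, band=0.25):
    """Trilinear only within `band` of the mip crossover; cheaper
    bilinear elsewhere. `band` is an illustrative knob."""
    lo = min(int(lod), len(mips) - 2)
    frac = lod - lo
    if frac < 0.5 - band / 2:   # close enough to mip `lo`: one tap
        return bilinear(mips[lo], u, v)
    if frac > 0.5 + band / 2:   # close enough to mip `lo + 1`: one tap
        return bilinear(mips[lo + 1], u, v)
    t = (frac - (0.5 - band / 2)) / band
    return lerp(bilinear(mips[lo], u, v), bilinear(mips[lo + 1], u, v), t)
```

The cost argument falls straight out of the tap counts: bilinear reads four texels, full trilinear reads eight, and brilinear reads eight only inside the blend band.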

Unfortunately, there's a catch ... AF on the DeltaChrome S8 wasn't working right, IMO. I'd like to believe that this has been corrected in the S18. The few available reviews haven't covered this yet, so I still need evidence either way (no PCIe for me this year).

Skrying said:
I just hear about driver issues with them.
Right. There have been lots of showstoppers, and there still are some. IMO, what S3 needs to further improve their drivers from where they are now is market penetration, in terms of both money and user/developer feedback. I.e. they've accomplished all you can reasonably do with testing performed by just their own crew and a half-dozen external developers. Look at what happened to ATI when they started selling craploads of R300s, and you'll have a perfect example of why market penetration matters for driver quality.

I have some hope that the S18 and S19 chips might do a lot better than the S8 and S4, though. From what I have seen of the S8, I have reason to believe that there are quite a few hardware issues, and just working around such bugs can absorb huge amounts of time, money, and performance. You just can't get off the ground with a broken chip.

The S18 could very well be an "all bugs fixed" edition of DeltaChrome S4 with PCIe bolted on, judging by its performance. If true, this would be a much better base for driver development.

My patience is limited, but I can appreciate the progress they've made, and I still wish them all the best.
 
http://theinquirer.net/?article=27438
The ChromeS27 chip runs at a blistering 700MHz core, 700MHz memory.
There is a problem though: this chip is not a 7800GTX crusher; it is fast but narrow. The S27 has 4 vertex shaders and 8 pixel shaders, so for those paying attention, it is aimed solidly at the mid-range. Its target market is the 6600 and X700 range of chips, and it has a decent shot at being a contender in that market. The worst downside is that it is only PS and VS 2.0+, not 3.0.
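
A quick back-of-the-envelope on "fast but narrow", using the standard pipes-times-clock fillrate formula. The S27 numbers are from the article above; the 6600 GT (8 pipes, 500MHz) and 7800 GTX (16 ROPs, 430MHz) figures are those parts' published specs:

```python
# Peak theoretical pixel fillrate = pixel pipes x core clock (MHz).
def fillrate_gpix(pipes, core_mhz):
    return pipes * core_mhz / 1000  # Gpixels/s

print(fillrate_gpix(8, 700))   # ChromeS27: 5.6 - high clock, few pipes
print(fillrate_gpix(8, 500))   # GeForce 6600 GT: 4.0 - same width, slower clock
print(fillrate_gpix(16, 430))  # 7800 GTX: ~6.9, plus far more shader hardware
```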

The chip is less about raw performance than it is about low power. The GPU itself in the ChromeS27 takes 10-12W, 17-30W for the entire card. For this, you get 2800-3600 in 3DMark05. Not all that bad, and it can even run fanless if necessary, though the ones I saw had a commendably small fan.

One other nifty thing they do is run in a dongle-free, SLI-type two-card mode called MultiChrome. The nice thing about it is that S3 has no proprietary turf to defend, so you can use it on just about anything that supports two PCIe slots. In fact, you can use it in as many slots as you have, but I am told, somewhat off the record, that after three cards the performance doesn't increase enough to be worth it.



Last week, at the secret S3 fortress in the glaciated peaks outside of San Jose, I got to see it in action. I can say that it runs 3DMark05 just fine in dual-card mode, and gets a bit less than 2x the single-card speed. The performance is not stunning, but with a single card you can play most cutting-edge games at 1024*768. With two cards, 1280*1024 should be quite doable and fast.
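
One way to square "a bit less than 2x" on two cards with "not worth it past three cards" is a simple Amdahl-style scaling model. This is purely illustrative on my part, not anything S3 publishes about MultiChrome; the parallel fraction is chosen to match the dual-card result:

```python
# Amdahl-style model: if a fraction p of frame time scales across
# cards, speedup on n cards is 1 / ((1 - p) + p / n).
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

p = 0.95  # assumed parallel fraction, picked to give ~1.9x on two cards
for n in (1, 2, 3, 4):
    print(n, round(speedup(p, n), 2))
# -> 1.0, 1.9, 2.73, 3.48: each extra card buys proportionally less.
```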

There is also a lower-end version called the ChromeS25; it clocks at 600/400, does not support DDR-3, and does not do MultiChrome. Instead, you get AcceleRAM technology, a fancy term for using main memory instead of card memory. This leads to cheap cards that don't perform all that badly, certainly better than not having that memory available.
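
For a sense of why AcceleRAM "doesn't perform all that badly": the PCIe number below is the standard PCIe 1.x rate (250MB/s per lane, per direction), but the local-memory configurations are assumptions of mine purely for illustration, since the article gives no bus widths:

```python
# Memory bandwidth in GB/s = bus width (bits) / 8 * mega-transfers/s.
def mem_gbs(bus_bits, mtps):
    return bus_bits / 8 * mtps / 1000

print(mem_gbs(128, 800))  # assumed 128-bit local DDR at 800 MT/s: 12.8
print(mem_gbs(64, 800))   # assumed 64-bit budget board: 6.4
print(16 * 250 / 1000)    # PCIe 1.x x16, one direction: 4.0
# Slower than local VRAM, but much better than having no memory to spill into.
```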

Both cards, the S25 and S27, have a ton of video features available. Per-pixel gamma correction, a ton of blending modes, depth of field, and color fog start things out. Add on Chromotion 3.0, the latest S3 high-def video engine, and you have something. On the slides I got, S3 claims they can do WMV-HD decoding at much lower CPU usage than the GeForce 6600 series chips, and if the things I hear about the X1300 series are true, that one should be a walkover.

So, you have fast video, low power, and decent gaming performance. While it may not be my first choice for a dedicated box, it looks very attractive for media center applications. Fanless, low-power, and HD-capable? I like it.

How much will these chips cost? The S27 is going to cost around the $100 mark for a 128MB version, and the 256MB version will run about $130. For the performance of a mid-range 6600, that is not a bad thing.

Drivers and availability are another matter, something S3 has not had a stellar track record on of late. For a while now, I have been told by both insiders and partners that this time is really different for the company. I hope so; we need a third player in the GPU business, and competition always makes the consumer win. Let's hope this is the start of an S3 comeback, but I will wait for real-world benches before I say that for sure.
 
YeuEmMaiMai said:
Savage 2K was only good at 2 games: UT (with MeTaL) and Q3. Unfortunately, they could not get either to work very well under Win2K.
Running UT on a Savage 2000 was a nightmare in general. MeTaL was broken for months, and D3D and OpenGL were slow. It took at least 3 driver releases and a beta MeTaL renderer before it got playable, and even then, a rainbow effect would sometimes appear at mipmap transitions and around the edges of multitextured surfaces.

My Savage 4 actually ran UT better out of the box. At low resolutions (640*480) it was faster than my Voodoo3, even when using the S3TC high-res texture pack. At 800*600 they were about equal, with a slight lead for 3dfx, and at 1024*768 the Voodoo was the clear winner.
 
Hello everybody, I'm a newcomer.

In the Inquirer's article the ChromeS27 is presented as a rough equivalent of a "mid-range 6600". But judging from the 3DMark score, it sits somewhere between a 6800 and a 6800 GT. If so, at $100-130 it is very cheap, IMHO (aside from the driver problems that could discourage the end user; let's hope they're fixed).

PS: please excuse my English; it's not my mother tongue.
 