Deltachrome S8 first look

Rugor said:
All I can say is that what we've seen from S3 today looks a lot better than what we have seen from XGI. The S8 looks to be able to perform just about right for the targets they've set for it.

I would agree with that...the S8 looks to be an all-around better effort than the Volari.

However, the S8 is not shipping, and the Volari is.

If they can deliver performance between a 9600 and 9600 Pro for a price point below a 9600SE then they should do fine in the low end, and even into mainstream.

That all depends on when they ship. By February/March, I'm expecting ATI to shake up their low-end and mainstream segments. I'm guessing 9800 non-Pro-like performance in the $150-$200 range, and 9600 non-Pro at the <$100 range.

It's also probably a more manageable target than the high end.

Agreed...though not having a high-end can make it more difficult to peddle the low end.
 
I just hope that we do not witness another Savage 2000 disaster. I am currently on ATi and I do not see anything from S3 that would make me want to switch back...
 
RussSchultz said:
Hellbinder said:
You guys have to realize that the S8 is currently very, very beta. Especially the drivers. S3 Graphics says they are working on a *huge* update to their OpenGL driver now and it will be included in their next release.
I heard that when I owned my S4.
Actually, we (S3 in those days) did do a large OpenGL update for the Savage 4. I know because I worked on it :) The new ICD was a huge improvement and came out in early 2000 if I recall correctly.
 
One thing that is very interesting about the new S8 is that it essentially seems to be S3's take on the Radeon 9500 Pro, but with a smaller transistor count. Given the wealth of features in the 2D core, they have to be cutting corners somewhere in 3D.

One place they are saving in transistors is obvious, the memory controller. A single 128-bit memory controller without a crossbar is going to use a lot fewer transistors than four 64-bit controllers plus a crossbar. I'm also thinking, based on their performance, that S3 may have gone with simpler shader hardware.

I just hope that Tech Report's data showing it performs more like a 4x2 architecture than an 8x1 isn't the case. I'm getting very tired of companies misreporting their architecture. I don't really care what they use, so long as they tell the truth or don't bother to tell us at all.
 
Although I'm not completely sure, I think hierarchical/early-Z isn't even working yet.

I wouldn't call it a bad start overall, but I really wonder what took them so long.

Why would one need a patent for MSAA anyway? It costs more in hardware to implement (just look at the DC's transistor count), and given the available bandwidth and memory footprint I don't think the initially announced, over-ambitious 16x sample MSAA would really have been possible.
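Just to put rough numbers on that, here's a quick back-of-the-envelope sketch. The assumptions are mine (32-bit colour, 32-bit Z/stencil per sample, no colour or Z compression), not anything S3 has published:

```python
# Rough framebuffer footprint for multisampling, assuming 4 bytes of colour
# and 4 bytes of Z/stencil stored per sample and no colour/Z compression
# (an assumption for illustration; real hardware usually compresses both).
def msaa_footprint_mb(width, height, samples):
    bytes_per_sample = 4 + 4                   # colour + Z/stencil
    multisampled = width * height * samples * bytes_per_sample
    resolved = width * height * 4              # resolved single-sample buffer
    return (multisampled + resolved) / (1024 ** 2)

for samples in (2, 4, 16):
    print(f"1024x768 @ {samples}x: ~{msaa_footprint_mb(1024, 768, samples):.0f} MB")
# roughly 15 MB at 2x, 27 MB at 4x and close to 100 MB at 16x, i.e. most of
# a 128 MB card gone before a single texture is uploaded.
```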

Early pricing estimates I've read here and there seem more than reasonable; probably a cheap mainstream-to-value pack of solutions for users who don't care as much about high-resolution AA. I doubt it's anything but OGSS, by the way.

***edit:

I'm also thinking, based on their performance, that S3 may have gone with simpler shader hardware.

Getting warmer 8)
 
Bjorn said:
The interesting thing imo is that none of these cards support MSAA. Don't know if that's because it's hard to implement or because of patent issues but it sure causes some problems since it's hard to compete in the upper mainstream and high end without it.

Tech-Report initially said that DC would not support MSAA, but only 2x SSAA. Now in their recent early look preview they say that MSAA will be supported...does anyone know if DC will really use MSAA or not?

Quitch said:
IF they can get the drivers up to scratch. IF it is the drivers holding them back. Rather like nVidia and their so-called "driver problems".

That's a big "if".

I agree. I still remember the Savage 2000 and its weak drivers. I think it's much too early to say any performance problems lie exclusively with the drivers...and even if they do, I wonder whether S3 can get the drivers up to snuff.
 
I don't think DC is 4x2; at least you can't tell based solely on 3DMark's fill rate bench. I ran similar numbers with the 9500 PRO and actually found the DC ahead of it.
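To illustrate why the fill rate bench alone doesn't settle it, here's a rough sketch; the 300 MHz clock and the two pipeline layouts are illustrative assumptions, not confirmed DeltaChrome specs:

```python
# Theoretical fillrates for two hypothetical pipeline layouts at an assumed
# 300 MHz core clock (an illustrative figure, not a confirmed DC spec).
# Single-texturing fillrate tracks pixel pipes; multitexturing fillrate
# tracks total TMUs (pipes * TMUs per pipe).
CLOCK_MHZ = 300

def fillrates(pipes, tmus_per_pipe, clock_mhz=CLOCK_MHZ):
    single_tex = pipes * clock_mhz                  # Mpixels/s
    multi_tex = pipes * tmus_per_pipe * clock_mhz   # Mtexels/s
    return single_tex, multi_tex

for label, pipes, tmus in (("8x1", 8, 1), ("4x2", 4, 2)):
    st, mt = fillrates(pipes, tmus)
    print(f"{label}: {st} Mpix/s single-texturing, {mt} Mtex/s multitexturing")
# Both layouts give the same 2400 Mtex/s multitexturing peak; only the
# single-texturing number differs on paper, and in practice that test is
# often bandwidth-limited, which is why the bench alone can't settle it.
```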

As far as the AA confusion, I think that's caused by S3 having conflicting information. In the docs it lists MSAA and I believe up to 4x, but when I talked with them they acknowledged it was SSAA and up to 2x at 1024x768 max res. I then confirmed this in the driver control panel:

http://firingsquad.com/media/article_image.asp?fs_article_id=1405&pic_id=19

I think we'll just have to wait and see on that one, but if they really are limiting AA support to 10x7, I see no reason why they would do that unless it's using SSAA.
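The arithmetic is consistent with that. A rough sketch, assuming 2x SSAA simply doubles the internally rendered pixel count and 4 bytes each of colour and Z per internal pixel (my assumptions, not S3's documentation):

```python
# Rough cost of 2x ordered-grid supersampling: render at double the pixel
# count, then downsample. Assumes 4 bytes of colour plus 4 bytes of Z per
# internal pixel (an assumption, not anything from S3's documentation).
def ssaa_cost(width, height, factor=2):
    internal_pixels = width * height * factor
    footprint_mb = internal_pixels * (4 + 4) / (1024 ** 2)
    return internal_pixels, footprint_mb

for w, h in ((1024, 768), (1280, 1024), (1600, 1200)):
    pixels, mb = ssaa_cost(w, h)
    print(f"{w}x{h} + 2x SSAA: {pixels / 1e6:.1f} M internal pixels, ~{mb:.0f} MB")
# 1024x768 with 2x SSAA already costs roughly as much fillrate and buffer
# space as rendering at 1448x1086; beyond that the internal target quickly
# turns into a high-end workload, so a 10x7 cap would be consistent with SSAA.
```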
 
I wonder why people compare this S3 Graphics to the S3 Graphics that did the Savage 2000. As has been stated many times, there's not much more than the name left from the original S3 Inc. There was even discussion about whether S3 would be able to make drivers at all, because OpenGL Guy and almost all the rest of the old driver team were gone. (Sounds pretty ridiculous now, eh?)

But if you want to believe it's the same company, that's fine with me. In that light, for a company that has been practically dead on the desktop market for 4 YEARS, this is a pretty acceptable comeback IMO.

HTPC users will absolutely love the DeltaChrome S4, with passive cooling, 100% hardware HDTV support and pretty good-looking hardware de-interlacing features.
 
Rugor said:
All I can say is that what we've seen from S3 today looks a lot better than what we have seen from XGI. The S8 looks to be able to perform just about right for the targets they've set for it.

If they can deliver performance between a 9600 and 9600 Pro for a price point below a 9600SE then they should do fine in the low end, and even into mainstream.

It's also probably a more manageable target than the high end.

You're kidding me, right?
Consider how unstable the DeltaChrome OpenGL ICD is, and its lack of AF and FSAA in the current drivers (S3 hopes to offer 2x FSAA in the future, up to 1024x768 only; if you use a higher resolution, no FSAA for you). Many shaders are not working properly, ShaderMark 2 doesn't work properly and neither do other shader programs, the X2 benchmark doesn't work properly, etc.
And did I mention that S3 claims everywhere they support MSAA up to 16x or 4x, depending on which S3 paper you read, yet it will only support SSAA...

If you believe that the Volari is horrible, then the Deltachrome is much worse in all regards.
 
From the reviews I have seen, the S8 performs right about where it's expected to based on its projected price point. It gives performance in the same ballpark as an FX5600/R9600 and appears to show no application-specific optimizations. Meanwhile the Volari, despite having higher absolute performance, appears to fall short of its direct competitors (9800/5900) despite its higher price point.

Neither card has AF or AA functioning in the current driver builds, at least not that I'm aware of, so that's a wash on both.

However, right now, S3 appears to have come closer to their goal than XGI to theirs.
 
Rugor said:
Neither card has AF or AA functioning in the current driver builds, at least not that I'm aware of, so that's a wash on both.
However, right now, S3 appears to have come closer to their goal than XGI to theirs.

XGI has working 2x and 4x FSAA in driver 1.00.00 without any resolution limit.
 
Neither card has AF or AA functioning in the current driver builds, at least not that I'm aware of, so that's a wash on both.

Not sure about AA for DC, but AF definitely seems to be operational; the only question mark is whether fast trilinear is preventing applications from detecting it as real trilinear, or whether it's just bilinear after all.

Pics provided by PCGH's Thilo:

http://212.123.108.51/tbayer/DC_AF_Tester_1x.jpg
http://212.123.108.51/tbayer/DC_AF_Tester_8x.jpg
http://212.123.108.51/tbayer/DC_AF_Tester_16x.jpg


XGI has working 2x and 4x FSAA in driver 1.00.00 without any resolution limit.

We'll see what is operational, and how, once both efforts get analyzed in more depth. I hope that in either case the implementation actually deserves the term "FSAA"...
 
According to FS, the 9600 is EOL now.

Directly from Firingsquad's review
This could make the S8 a very worthy competitor to the remains of ATI’s RADEON 9600 (whose status has recently been changed to EOL)
 
Nappe1 said:
I wonder why people compare this S3 Graphics to the S3 Graphics that did the Savage 2000. As has been stated many times, there's not much more than the name left from the original S3 Inc. There was even discussion about whether S3 would be able to make drivers at all, because OpenGL Guy and almost all the rest of the old driver team were gone. (Sounds pretty ridiculous now, eh?)

But if you want to believe it's the same company, that's fine with me. In that light, for a company that has been practically dead on the desktop market for 4 YEARS, this is a pretty acceptable comeback IMO.

HTPC users will absolutely love the DeltaChrome S4, with passive cooling, 100% hardware HDTV support and pretty good-looking hardware de-interlacing features.
Of course, it's great for the niche home theatre PC market.

However, if you can't see what the S3 brand means... even if it is a new company, the old legacy will still be around until they do something to shake it.
 
Yeah, S3 hasn't exactly been known in the past for great drivers, but then again, neither has ATi, and look how far they've come...

I think the mere fact that the card can at least hold its own with very early drivers is reason enough to believe it will succeed.

The card described in my sig, on the other hand....
 
I don't disagree with most points here, yet it is also true that DC is extremely late to market. By the time it hits shelves, it'll have only a couple of months before it has to stand against ATI/NV's next-generation mainstream/value parts.

I don't think anyone can deny that it would have made a much better impact had it been released in H1 2003.

In the docs it lists MSAA and I believe up to 4x, but when I talked with them they acknowledged it was SSAA and up to 2x at 1024x768 max res. I then confirmed this in the driver control panel:

Their first presentation - if memory serves me well - talked about 16x sample MSAA. I was wondering back then how they would have gotten over the bandwidth and/or memory footprint hurdles.

Looking at the transistor count estimates and the board's final power consumption, I'd dare to guess that they might have cut out the originally planned MSAA implementation entirely; otherwise I don't see why they wouldn't have opted for just 2x MSAA instead of supersampling.

Could be they'll use their MSAA idea in future products, yet IMO they'd need a 256-bit bus and more than 256MB of onboard RAM to realise it properly.
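Put very roughly (the 300 MHz DDR memory clock here is an illustrative assumption, not a confirmed spec):

```python
# Peak memory bandwidth for two bus widths, assuming 300 MHz DDR memory
# (an illustrative clock, not a confirmed DeltaChrome spec).
def bandwidth_gb_s(bus_bits, mem_clock_mhz=300, ddr_factor=2):
    bytes_per_transfer = bus_bits / 8
    return bytes_per_transfer * mem_clock_mhz * ddr_factor * 1e6 / 1e9

for bus in (128, 256):
    print(f"{bus}-bit bus: ~{bandwidth_gb_s(bus):.1f} GB/s peak")
# ~9.6 GB/s vs ~19.2 GB/s. Without colour/Z compression, each extra stored
# sample adds colour and Z traffic on top of that, so a 16-sample mode would
# lean on bandwidth and framebuffer space far harder than a 128-bit board
# with 128-256 MB of memory can comfortably supply.
```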
 