DeltaChrome S8 first look

Hrm... Not what I would call a thrilling initial showing thus far. This part in particular caught my eye, though. DeltaChrome's DX9 take would appear to be much closer to ATi's than nVidia's, so what might be going on that it fails exactly the same ShaderMark shaders as the FX cards do?
 
cthellis42 said:
Hrm... Not what I would call a thrilling initial showing thus far. This part in particular caught my eye, though. DeltaChrome's DX9 take would appear to be much closer to ATi's than nVidia's, so what might be going on that it fails exactly the same ShaderMark shaders as the FX cards do?
I assume you mean Radeon cards in the last sentence. Just because two cards have similar specs doesn't mean the implementations are equivalent. In any event, performance is gated by two factors: the HW and the drivers. The drivers may be immature, or the HW may just be slow at shading. Tough to say at this point.

I found this comment amusing:
Tech Report Delta Chrome Preview said:
The DeltaChrome is running a little behind the curve in Splinter Cell. Honestly, I expected better performance given the fact that Splinter Cell uses Direct3D.
ATI's D3D driver is very good, so why should they (Tech Report) expect S3's performance to be closer? Also, Splinter Cell uses shaders, which don't appear to be a strong point for DeltaChrome (compare to the UT2003 results, which are closer).

Note that the 9600 Pro was used for the comparison. Things would fare even worse for DeltaChrome if they had used a 9600XT!

-FUDie
 
FUDie said:
I assume you mean Radeon cards in the last sentence. Just because two cards have similar specs doesn't mean the implementations are equivalent. In any event, performance is gated by two factors: the HW and the drivers. The drivers may be immature, or the HW may just be slow at shading. Tough to say at this point.

What he meant was that in ShaderMark 2.0, both GeForceFX and DeltaChrome boards are unable to render the same set of tests. He was commenting on how surprising that is, considering the DeltaChrome seems closer to ATi's architecture (which runs all the tests) than to nVidia's. Of course, your point is still relevant; there could be any kind of driver or hardware issue preventing those tests from running.
 
All this and Volari have done so far is confirm how complicated 3D boards are these days, and how difficult it is for anyone to enter the market and really compete on all fronts. It really brings into focus the R&D efforts of ATI and NVIDIA and the engineering capacity they both have to be able to churn out as many product variations as they do in a year.
 
DaveBaumann said:
All this and Volari have done so far is confirm how complicated 3D boards are these days, and how difficult it is for anyone to enter the market and really compete on all fronts. It really brings into focus the R&D efforts of ATI and NVIDIA and the engineering capacity they both have to be able to churn out as many product variations as they do in a year.

Sure does.

The interesting thing, imo, is that none of these cards supports MSAA. I don't know if that's because it's hard to implement or because of patent issues, but it sure causes some problems, since it's hard to compete in the upper mainstream and high end without it.
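
For reference, this is roughly how a Direct3D 9 game discovers whether MSAA exists on a card at all; a minimal sketch, assuming the default adapter and an X8R8G8B8 back buffer (the format and sample counts are just for illustration):

// Sketch: probing D3D9 multisample support the way a game's options
// screen would. If nothing above D3DMULTISAMPLE_NONE succeeds, the
// card can't be offered an AA setting at all.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    for (int s = 2; s <= 6; ++s)
    {
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,                // back buffer format, illustrative
            FALSE,                          // fullscreen
            (D3DMULTISAMPLE_TYPE)s, NULL);  // quality levels not needed here
        printf("%dx MSAA: %s\n", s, SUCCEEDED(hr) ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}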
 
I agree with Dave. It's really hard to come out with such a complicated part on the "first" DX9 try. I guess baby steps....
 
Does anyone know to what extent ATI and Nvidia have patents on various hardware features?

Not only does the current situation reaffirm the fact that bleeding-edge graphics is hard, but it might be that by virtue of being ahead, ATI and Nvidia have literally patented the means by which the other companies could have competed.

If they happen to have locked up the best algorithmic and implementation methods, then competitors have to rely on the faint hope that their design teams can find some new method that two large companies have missed despite years of competitive research.

Either that, or the newcomers have to remain relegated to a value sector, and rely on less efficient but available techniques.
 
Perhaps this shows that you shouldn't shoot for the top end with your first effort? After all, where did ATI linger all that time before it got it right?
 
You guys have to realize that the S8 is currently very, very beta. Especially the drivers. S3 Graphics says they are working on a *huge* update to their OpenGL driver now, and it will be included in their next release.

Considering that this card is actually a true 8-pipeline part, if they can get the clock speed up to 400MHz it should perform on par with a 9500 Pro (I think better), even with its bandwidth. Remember, it also uses a unique deferred rendering scheme to save bandwidth. I think they just need some more time for the drivers.
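
(A quick back-of-the-envelope check on that claim, as a sketch. The S8's current core clock is assumed at 300MHz, and the 9500 Pro's figures are from memory, so treat the numbers as illustrative.)

// Theoretical pixel fillrate = pipelines x core clock. Purely
// illustrative arithmetic; clocks as assumed above.
#include <cstdio>

int main()
{
    const int s8_pipes = 8, s8_now = 300, s8_hoped = 400;    // MHz, assumed
    const int r9500p_pipes = 8, r9500p_clk = 275;            // MHz, from memory

    printf("S8 today:     %d Mpixels/s\n", s8_pipes * s8_now);         // 2400
    printf("S8 at 400MHz: %d Mpixels/s\n", s8_pipes * s8_hoped);       // 3200
    printf("9500 Pro:     %d Mpixels/s\n", r9500p_pipes * r9500p_clk); // 2200
    return 0;
}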

Most importantly, this card is going to retail for $90, which even in its current form completely roasts the FX cards in that price range. Just imagine what this card can do with improved drivers and an improved core speed. It will likely be *the* card of choice for serious budget gamers, and a completely functional DX9 solution as well.

$90
 
IF they can get the drivers up to scratch. IF it is the drivers holding them back. Rather like nVidia and their so-called "driver problems".

That's a big "if".
 
Hanners said:
FUDie said:
I assume you mean Radeon cards in the last sentence. Just because two cards have similar specs doesn't mean the implementations are equivalent. In any event, performance is gated by two factors: the HW and the drivers. The drivers may be immature, or the HW may just be slow at shading. Tough to say at this point.
What he meant was that in ShaderMark 2.0, both GeForceFX and DeltaChrome boards are unable to render the same set of tests. He was commenting on how surprising that is, considering the DeltaChrome seems closer to ATi's architecture (which runs all the tests) than to nVidia's. Of course, your point is still relevant; there could be any kind of driver or hardware issue preventing those tests from running.
That's easy to explain. Look at what the FX is lacking compared to the Radeon products: MRTs, float buffers, etc. The ShaderMark tests in question are probably using one of these features, and DeltaChrome either doesn't support it in the drivers or doesn't support it in the HW.
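
To make that concrete, here's a minimal sketch of the kind of capability probe a D3D9 benchmark would run before enabling such a test. This is illustrative, not ShaderMark's actual code, and the FP16 format is just one example of a float buffer:

// Sketch: checking for the two features mentioned above (MRTs and
// floating-point render targets) via D3D9 caps and format queries.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // MRT support is exposed through the device caps.
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    printf("Simultaneous render targets: %lu\n",
           (unsigned long)caps.NumSimultaneousRTs);

    // Float buffer support: ask whether an FP16 texture can be a render target.
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,           // display mode format, illustrative
        D3DUSAGE_RENDERTARGET,
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);     // 16-bit float RGBA
    printf("FP16 render target: %s\n", SUCCEEDED(hr) ? "yes" : "no");

    d3d->Release();
    return 0;
}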

-FUDie
 
Hellbinder said:
..
Most importantly, this card is going to retail for $90, which even in its current form completely roasts the FX cards in that price range. Just imagine what this card can do with improved drivers and an improved core speed. It will likely be *the* card of choice for serious budget gamers, and a completely functional DX9 solution as well.

$90

They might actually sell a few cards for that price.
 
Hellbinder said:
You guys have to realize that the S8 is currently very, very beta. Especially the drivers. S3 Graphics says they are working on a *huge* update to their OpenGL driver now, and it will be included in their next release.

Considering that this card is actually a true 8-pipeline part, if they can get the clock speed up to 400MHz it should perform on par with a 9500 Pro (I think better), even with its bandwidth. Remember, it also uses a unique deferred rendering scheme to save bandwidth. I think they just need some more time for the drivers.

Most importantly, this card is going to retail for $90, which even in its current form completely roasts the FX cards in that price range. Just imagine what this card can do with improved drivers and an improved core speed. It will likely be *the* card of choice for serious budget gamers, and a completely functional DX9 solution as well.

$90

If they can hit a $90 street price, as mentioned by Digitimes, that would be really nice. The official figure I was given for the S8 was $150, but that's MSRP.

BTW, I tried dabbling with a little bit of overclocking, starting with Nitro speeds (315/315), but after 15 minutes of gaming the system locked up. This board is from August, though, so hopefully things will improve by the final revision.
 
Hellbinder said:
You guys have to realize that the S8 is currently very, very beta. Especially the drivers. S3 Graphics says they are working on a *huge* update to their OpenGL driver now, and it will be included in their next release.
I heard that when I owned my S4.
 
All I can say is that what we've seen from S3 today looks a lot better than what we have seen from XGI. The S8 looks to be able to perform just about right for the targets they've set for it.

If they can deliver performance between a 9600 and a 9600 Pro at a price point below a 9600SE, then they should do fine in the low end, and even into the mainstream.

It's also probably a more manageable target than the high end.
 