So what do you think about S3's DeltaChrome DX9 chip

Randell said:
Nappe1 said:
Randell said:
Do we only believe ATI and nVidia can pull it off now?

umh... this might sound fancy but I got this crazy thought...
If we don't believe newcomers, who will? What happens to the business if no one believes in change?

think about that. I doubt that I need to talk about my opinion on this particular case...

Don't get me wrong, I half like/half dislike S3. The Savage4, for example, had some great points, especially in respect of S3TC, IQ and trilinear speed; if only they hadn't crippled it with a 64-bit memory bus, it could have taken on the TNT2/V3 performance-wise. It also had some real problems - driver or silicon, I don't know - e.g. if you used MeTal to play UT online it would lock up solid.

However, they did have driver writers and customer support guys out in the forums helping people officially - something ATI, 3dfx and nVidia weren't doing at the time.

And competition helps, of course, Nappe1

okay, so it's time to bring my real thoughts behind my words to this board... here we go...

yes, I know that you can mock up a good movie in the final cut phase. BUT I don't hope for that, nor do I see any reason for drawing examples from S3's past. afaik, S3 under VIA has changed almost completely since the Diamond / independent times. When VIA bought the S3 division, they didn't even have a road map! :) And besides, I doubt that VIA bases its success on luck, so VIA would not have been pumping money into this "company with nothing" year after year without confidence that the money would eventually come back. (It does not necessarily mean that DeltaChrome will be DA CHIP! but more likely that it's a step in the right direction.) Third point: it has been over 3 years since S3 last brought out something that was (at least) meant to be competitive on the desktop market, and we all know that 3 years is a long period of time in this business. Things that happened during that time can be bad as well as good.

and... people here are talking about "who writes the drivers?" and at the same time forgetting that there are almost (or is it over already?) 6,000,000,000 people living on this globe. So I don't think finding the workforce for the task is hard. The more likely question should be whether three years is enough for a company like VIA to pull a world-class driver team out of the middle of nowhere.
 
There has been enough downsizing at the larger graphics companies for experienced driver writers to form that team... at least you would think...

Corporations think people are like machines; they want to run a tight ship: low overhead and maximum throughput.
 
Stating the obvious

S3 could make some huge gains in the mainstream and OEM market if they do things right.
 
Anyone notice on one of the control panels it says :

"System Memory : 256mb"
"Video Memory : 16mb"

Video memory = local on chip memory?!
 
IMO, doubtful.

More likely that was the spec of the graphics and system ram of whatever PC/Laptop the guy who snapped those (probably in development) driver control panels was running. If this is the case, it could be an indication of a lack of silicon.
 
nappe1:

I apologise in advance but I disagree with your sentiments about VIA being able to nurture their S3 team into a solid world leading/beating team.

In my experience VIA has always taken shortcuts. They released the KT133, soon replaced by the KT133A; they released the KT266, soon replaced by the KT266A; and they released the KT400, which initially performed slower than their own KT333 chipset.

I don't expect much from VIA or S3. My outlook for them is rather pessimistic, but it is based on previous experience and what they have done.

VIA lost out to SiS because SiS delivered what VIA promised: cheap chipsets with performance close to the more expensive motherboard-chipset solutions from AMD/NVIDIA/INTEL. VIA has lost massive marketshare in their specialist field because of their decisions; once they were scaring INTEL with their growth, now they have been tamed by their own incompetence. They even had problems implementing proper USB support at one point, because they cut corners.

I have noticed a trend and I am sure others have too:
When a company announces a product months in advance that plans to beat everyone on price and performance, they don't deliver.

When a company knows they have a killer product, they keep as tight-lipped about it as possible until the last moment, when they smack it in ya face.

This DeltaChrome has taken the former route rather than the latter (and even then they have executed it extremely badly - at least BitBoys did it with some style), and that alone tells me to be wary of what VIA/S3 will be releasing.

:(

Also nappe1 you should be happy about the underdogs winning now... the leaders were NVIDIA and now it is ATI. This year the situation could change and someone else takes a step into the limelight. Just please do not expect it to be VIA/S3.
 
I don't expect VIA/S3 to overtake ATI or Nvidia, but it does bring another DX9 product to the market. If it doesn't succeed, at least it may offer the opportunity for all the DX6 lovers to finally get a decent modern card without breaking the bank...

I would not downplay VIA..they have lots of cash.
 
Kristof said:
"DeltaChrome can employ not only front-to-back but also back-to-front Z occlusion culling."

Not sure what that means, but I think it simply means they support changing the Z compare mode without things going pear-shaped :LOL:

Wouldn't that just be where they have a two-way hierarchy, with one buffer that stores the max Z value per tile/block and another that stores the min?
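If that guess is right, the per-tile test could look something like this. This is a minimal sketch assuming each tile stores both its min and max depth; the function name, the string compare modes, and the three-way outcome are purely illustrative, not anything S3 has documented:

```python
# Hypothetical sketch of two-way hierarchical Z culling: a tile keeps both
# its minimum and maximum depth, so whole tiles can be rejected under either
# compare direction (front-to-back uses LESS, back-to-front uses GREATER).

def classify_tile(tile_zmin, tile_zmax, prim_zmin, prim_zmax, compare):
    """Return 'cull', 'pass', or 'ambiguous' for a primitive against one tile."""
    if compare == "LESS":            # front-to-back rendering
        if prim_zmin >= tile_zmax:   # primitive entirely behind the tile
            return "cull"
        if prim_zmax < tile_zmin:    # primitive entirely in front of it
            return "pass"
    elif compare == "GREATER":       # back-to-front rendering
        if prim_zmax <= tile_zmin:   # primitive entirely in front of the tile
            return "cull"
        if prim_zmin > tile_zmax:    # primitive entirely behind it
            return "pass"
    return "ambiguous"               # overlapping ranges: per-pixel Z test needed
```

Only the "ambiguous" case would need to fall through to the full per-pixel Z buffer; the other two resolve a whole tile with one comparison.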
 
Nite_Hawk said:
Hrm....

Lots of marketspeak; I guess we'll have to wait and see. Any comments on the "advanced deferred rendering" they are touting? I'm curious how much the two-pass rendering scheme for doing front-to-back culling actually saves you. How many programs are doing front-to-back these days? There were rumors that this is what Morrowind is doing, and that it gives it such horrible performance even on top-end systems.

Certainly some interesting tidbits in there, but I guess we'll have to get the opinions of some of the big boys around here. :)

Nite_Hawk

I thought Morrowind was doing back-to-front rendering?
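For what it's worth, the overdraw argument behind the order question is easy to sketch: with early Z rejection, submitting opaque objects nearest-first means hidden pixels fail the depth test before shading, while back-to-front (painter's) order shades everything. A toy distance sort, with made-up object names, would be:

```python
# Illustrative only: why submission order matters for Z-based culling.
# Sorting opaque draws front-to-back lets early-Z reject occluded pixels;
# back-to-front order maximizes overdraw instead.

def sort_front_to_back(objects, camera_pos):
    """objects: list of (name, (x, y, z)) world positions; nearest first."""
    def dist_sq(obj):
        _, (x, y, z) = obj
        cx, cy, cz = camera_pos
        return (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
    return sorted(objects, key=dist_sq)

draws = [("skybox", (0, 0, 100)), ("wall", (0, 0, 10)), ("player", (0, 0, 2))]
order = [name for name, _ in sort_front_to_back(draws, (0, 0, 0))]
# nearest first: player, then wall, then skybox
```

A coarse per-object sort like this is cheap; the cost people worry about is a full extra geometry pass, which is what the "two pass" rumor implies.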
 
firstly:
Anyone notice on one of the control panels it says :

"System Memory : 256mb"
"Video Memory : 16mb"

Video memory = local on chip memory?!
I could have sworn that I saw "256MB Frame Buffer" mentioned... I own a few early test cards with only enough RAM for the frame buffer and running basic functionality tests. Also, as a comment on the speeds shown in the overclocking panel, did you notice that the indicators were at the absolute bottom of the scale? Hell, the system they snapped those pics on might not even be a DeltaChrome but rather an older chip that the controls didn't even work on, just so that they could show some screenshots and get people impressed. My guess is they used a Voodoo3 2000 ;) :LOL:

Now on to my overall opinion... I have certainly been very wary of S3 Graphics products since my Savage4 Pro+. It was certainly a good chip, yet it was plagued by poor OpenGL drivers (one of the main reasons I upgraded from my Rendition Verite 1000 was to play OpenGL games). I constantly had to look for new beta drivers that would improve stability, and did finally find some good ones. But there was another issue: cooling. You wouldn't think one of those things could get as hot as it did, and no active cooling? I ended up putting one of those squirrel-cage slot fans next to it. At that point it was very stable and performed quite well, considering the 64-bit memory bus. I still, however, consider it to be the devil card... :devilish:

Also, I am wary of any company that hasn't released a competitive product for 3 years. Is the DeltaChrome the new Parhelia? It is certainly very similar, and until I see some actual silicon I won't give it any more status in my mind than merely another piece of competition. Competition is certainly good. I'm very happy to see (in no particular order) Trident, S3, Matrox, 3Dlabs, and SiS all back in the game, although it appears Matrox has stopped pulling money out of their ass and may just be fading away rather quickly. So, all we need now are my beloved Rendition and 3dfx (hell, throw in Lockheed and more announcements from BitBoys) and we'll be one big happy family again 8) yeah... right, THAT'S going to happen. I did yell "BitBoys" at the HardOCP workshop a few times and the guy with Anand just turned around and laughed at me :rolleyes: maybe it was because I had my 3dfx cap on :oops:
 
S3 could make some huge gains in the mainstream and OEM market if they do things right.

I think you guys could all be underestimating Columbia/DeltaChrome.

You are assuming that it will be a mainstream or OEM part. That's not what I'm seeing. If it's clocked at 300MHz, delivers on working hardware with reasonable drivers, and its various *deferred rendering* approaches actually do something... then it's going to be faster than the current 9700 Pro. It also looks like its shader tech could be better once the real details come out. Since no one really seems to know what their deferred-rendering speak is all about, I would not rush to judgement.

They may also have some serious FSAA performance under the hood as well.
 
one of the main reasons i upgraded from my Rendition Verite 1000 was to play OpenGL games

Damn, I miss those guys. The V1000 was my first real 3D accelerator, and the V2100 and V2200 were pretty good too. That is a company that, if it had not been gobbled up by Micron and turned into memory designers... wow... where would they be today?

After all, their 3000 series was going to have embedded RAM and an early version of a T&L engine: a geometry coprocessor that was to be supported by their own API, Redline... That would have been all the way back circa 1997/1998...

Imagine that.
 
Personally, I think this is great news. I just hope they deliver!
Correct me if I am wrong, but doesn't S3 have a long history of successful OEM deals from back in the day? If this chip performs 'well enough', I am sure they will win back a lot of their OEM partners.

When is it expected to hit the market?
 
misae said:
nappe1:
Also nappe1 you should be happy about the underdogs winning now... the leaders were NVIDIA and now it is ATI. This year the situation could change and someone else takes a step into the limelight. Just please do not expect it to be VIA/S3.

who else could that be?
afaik:
- Matrox: gone from mainstream / high-end 3D.
- Bitboys: gone.
- 3DLabs: since the P10 release, they have been a bit more tight-lipped about their market placement.
- Trident: yeah, sure they did launch the XP6 (or something...), but how about some reasonable speed?
- SiS: a trip to the World of Marketing Names with the OctaHyperTurboCraplizer?

The only one other than S3 with the possibilities to do this is PowerVR. Whether they want to is another question. Sorry Misae, but the fact is that for this year we have ATI and nVidia in the high end, and as possible newcomers and/or comebackers to the mainstream: S3, SiS and PowerVR (in the order of how possible it seems to be now).

and yes, I see a VERY boring year coming up. Of course 3DLabs could try mainstream market penetration with Creative, but somehow I doubt it. They are busy enough right now using driver tweaks to keep the Quadros and FireGLs behind in their own market. The high-end timeline is boring as hell. The next bigger step comes in summer/early autumn when ATI brings out the R400, and the only interesting point is whether nVidia will be able to keep up and bring the NV35 against it. But that's all.
 
Help... I noticed something strange as I keep rereading all their information...

They never actually say *8 pixel pipelines*. They repeatedly say
"V8 pipeline" and "8 pixel pipeline" (singular). Then there is this interesting tidbit...
Full 8 pixel pipeline
Unified super wide pipeline for seamless 2D/3D/Video context switching
What do you make of it?

Also check this out.
You will be assured of being the last player standing in the emerging mobile gaming trend when your laptop is equiped with DeltaChrome's V8 pipeline

They seem to be stating that the DeltaChrome will be offered feature-complete in both desktop and laptop segments. This is one serious laptop chip.
 
Taken from http://www.s3graphics.com/DeltaChromeDX9.html#PixelShaders:
Pixel Shader 2.0+
The ultra high bandwidth 8-pipe Pixel Shader engine with 96-bit pixel precision gives developers unconditional freedom to manipulate pixels and create movie-quality effects in real time, at quality levels not seen before.
I suspect that they just haven't been checking their grammar properly. How could they claim the 2400Mpixel/sec fill rate without 8 pixel pipelines unless they were running some silly core speed?
 
I applaud S3 for going with ver. 2.0+ shaders. Right up there with the GFFX. On the other hand I'm a bit skeptical, like most here, about performance and driver support. I think clockspeeds will play a bigger role than features; no one wants slow features. It would be nice to see more competitors in the gfx field. So yes, I'm a bit surprised by S3 :)
 
Neeyik said:
Taken from http://www.s3graphics.com/DeltaChromeDX9.html#PixelShaders:
Pixel Shader 2.0+
The ultra high bandwidth 8-pipe Pixel Shader engine with 96-bit pixel precision gives developers unconditional freedom to manipulate pixels and create movie-quality effects in real time, at quality levels not seen before.
I suspect that they just haven't been checking their grammar properly. How could they claim the 2400Mpixel/sec fill rate without 8 pixel pipelines unless they were running some silly core speed?

EDIT: Neeyik, they ARE stating 8 pixel pipelines. ;) (It took me a while before I noticed that you said without instead of with :) my original post is below.)


umh...
2400MPixels per second / 8 Pixel pipelines = 300 MHz :?:
it seems pretty clear to me :? And IMO, 300MHz does not sound silly at all...

Or am I missing something here?
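Just to spell out the sanity check on the quoted numbers (fill rate = core clock x number of pixel pipelines, with hypothetical helper names):

```python
# Quick check of the fill-rate arithmetic: a claimed peak fill rate in
# Mpixels/s divided by the pipeline count gives the implied core clock.

def implied_clock_mhz(fill_rate_mpixels, pipelines):
    return fill_rate_mpixels / pipelines

clock = implied_clock_mhz(2400, 8)      # 8 pipes -> 300.0 MHz, plausible
four_pipe = implied_clock_mhz(2400, 4)  # 4 pipes -> 600.0 MHz, the "silly" speed
```

So a 2400 Mpixel/s claim only makes sense at a sane clock if the chip really has 8 pixel pipelines.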
 