VIA to Launch Another "Paper" GPU This Year?

Hax, I'd just like to say that looking at the game, I don't buy it.

If it was an overall system performance problem, then the low performance would be consistent. This was not. It would perform almost okay for a moment or two, then slow down horridly, then run quickly, and so on.

In other words, the problems were most certainly drivers, not slow or inadequate memory bandwidth.
 
OVERLORD said:
Having used MANY tweaked S3 drivers in the past this gives me some authority bud! Now go moderate yourself!

Haha. You can't give specific points about the modified ICD, so you go on to make general attacks on S3 (which were completely irrelevant), then you go on to attack me, all the while still not having a point to bring up.

Using tweaked S3 drivers...on Savage3D. Sorry bud, you weren't there for the updated ICD. Like I said, no basis of experience.
 
OpenGL guy said:
It was working when I left S3 in 2001... What did you break hax? :D :D Which integrated chipset is this, anyway?

P.S. I did much of the ICD bringup on the early integrated parts.

It was on a Duron 900MHz with SDRAM, so I believe it was the VIA KM133.
 
Chalnoth said:
Hax, I'd just like to say that looking at the game, I don't buy it.

If it was an overall system performance problem, then the low performance would be consistent. This was not. It would perform almost okay for a moment or two, then slow down horridly, then run quickly, and so on.

In other words, the problems were most certainly drivers, not slow or inadequate memory bandwidth.

Mostly a true argument, from an external perspective. FWIW, the NB drivers were almost identical to the S4 drivers (for the ICD). The same source would run at least 2-3x faster on an S4 than on an NB. The only differences, from the driver perspective, were a different tiling layout and some limitations on the multitexturing side that affected memory.

There may be some stalls due to texture swapping; I can't say at the moment, as it's been ages since I've used an NB and looked at UT. I'd say with a fair amount of certainty that it is texture swapping, due to reasons I won't/can't comment on. I won't comment on the D3D driver, as I don't have high regard for it. Aside from the generalities, MHO is that the memory arbitration on the NB wasn't that great.

There were also engine and bandwidth issues, as 90% of the time the driver was waiting for the hardware, and the hardware would be severely backed up. From an architecture perspective, I don't really care for the NB. Some of the things that were done were just plain dumb.
 
OpenGL guy said:
It was working when I left S3 in 2001... What did you break hax? :D

Ha, you think it was working. Nothing, except passing your bugs to somebody else, and that was just about when I started not caring anymore. Low end stuff, bleh.
 
hax said:
Mostly a true argument, from an external perspective. FWIW, the NB drivers were almost identical to the S4 drivers (for the ICD). The same source would run at least 2-3x faster on an S4 than on an NB. The only differences, from the driver perspective, were a different tiling layout and some limitations on the multitexturing side that affected memory.

Consider this, for a moment:

Unreal Tournament was playable on my TNT 16MB at 800x600x32. The framerates weren't great, but it was playable. That video card had 120MHz memory, if I remember correctly. At PC133, the integrated Savage4 should have had just over half that available memory bandwidth. Since other things might also be using that memory bandwidth, let's take a conservative estimate and say that 1/4 of the TNT's memory bandwidth was usable.

If the TNT was memory bandwidth-limited at 800x600x32 (I don't believe it was...at least, not compared to cards like the GF2 GTS), then 1/4 of the memory bandwidth should make 640x480x16 playable. If there are stalls due to texture swapping, then there were a bunch of imbeciles over at S3/VIA designing the memory interface of the integrated Savage (or not designing, as it were...if they were just attempting to do the absolute minimum redesigning of the processor...).

And, if I remember correctly, the Savage4 was actually a fair bit higher-performing than the original TNT. That makes the integrated Savage a fair bit lower than 1/4 the power of the Savage4, at least under UT.
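The back-of-the-envelope estimate above can be sketched numerically. Note the clock figures and bus widths below are assumptions recalled from memory (as in the post), not confirmed specs:

```python
# Rough bandwidth comparison behind the argument above.
# Assumed figures: TNT with 120 MHz memory on a 128-bit bus;
# integrated Savage sharing PC133 SDRAM on a 64-bit bus.

def bandwidth_mb_s(clock_mhz, bus_bits):
    """Peak bandwidth in MB/s for single-data-rate memory."""
    return clock_mhz * bus_bits / 8

tnt = bandwidth_mb_s(120, 128)      # 1920 MB/s
pc133 = bandwidth_mb_s(133, 64)     # 1064 MB/s
print(pc133 / tnt)                  # ~0.55: "just over half"

# Conservative estimate: the CPU and the rest of the system eat into
# the shared bus, so assume only 1/4 of the TNT's bandwidth is usable.
usable = tnt / 4                    # 480 MB/s

# Color-buffer traffic per frame, as a rough proxy for the workload:
bytes_800x600x32 = 800 * 600 * 4    # ~1.92 MB/frame on the TNT
bytes_640x480x16 = 640 * 480 * 2    # ~0.61 MB/frame on the integrated part
print(bytes_800x600x32 / bytes_640x480x16)  # ~3.1x less data per frame
```

With roughly 1/4 the bandwidth but about 1/3 the per-frame traffic, 640x480x16 on the integrated part should have been about as playable as 800x600x32 was on the TNT, which is the crux of the argument.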

There may be some stalls due to texture swapping; I can't say at the moment, as it's been ages since I've used an NB and looked at UT. I'd say with a fair amount of certainty that it is texture swapping, due to reasons I won't/can't comment on. I won't comment on the D3D driver, as I don't have high regard for it. Aside from the generalities, MHO is that the memory arbitration on the NB wasn't that great.

Texture swapping from system memory to system memory? Talk about idiotic usage of the hardware...

From an architecture perspective, I don't really care for the NB. Some of the things that were done were just plain dumb.

It certainly seems so.
 
Randell said:
The ICD was only really any good for QIIIA though; other OGL games like Kingpin had problems all the time with my S4.

Still it took a long time for me to get over using Metal/S3TC on UT when I switched to a V3.

AFAIK Kingpin had problems on more cards than just the Savage? I seem to remember it being considered a pretty buggy game, but I don't know firsthand, not having played it.

IMO the Savage 2000 was actually pretty good from a hardware standpoint, but the driver support just wasn't there. If they had supported the card better with driver updates and workarounds for any problems in the silicon, it could have been a great mid-range card (it was almost as fast as a GF1 in Q3, without the benefit of T&L). Also keep in mind I don't have any firsthand knowledge of the design and whether or not it was actually good; I'm just observing that the speed was there. Maybe S3's mistake was trying for the T&L unit too soon?
 
My Savage 4 ran everything I wanted it to... back when a 300/450 was the norm. I didn't run into any compatibility issues... so long as it ran on the 440 BX.

Didn't like the FIC AZ-11 (VIA KT133) AT ALL. I ended up buying a GeForce 2 MX because it (the S4) hardlocked when you used transparencies... tried all kinds of things, it just didn't work. Was a great little card for the year I used it.

So... I don't know why you all are bashing it. Worked plenty good for the 80 bux I dropped on it. Course I never owned a S2000.

BTW, UT MeTaL ruled.
 
Chalnoth said:
hax said:
Blablabla...

Blablabla...

He more or less agreed with you that the integrated Savage 4 just sucked, so what are you arguing about? If it sucks, it sucks. Still, when has integrated graphics ever been any good? It's mostly useful in that you can load up Windows without having a real graphics card and do most things at the desktop, as well as play 2D games. I think you may have been expecting a little too much from integrated graphics.

What I don't understand is why the integrated chipset was based on the Savage 4 at all. At the time the integrated chipset was released, the Savage2000 was already old. True, it may have had bugs, but it was still much faster than the Savage 4. I guess VIA didn't feel it was worth trying to debug it or something? Personally, however, I think they'd have been better off coming up with a whole new core instead of basing it on the 3+ year old (is this right, or was it only around 2 years old?) Savage 4. But, like I also said, it gets the job done loading into Windows and for word processing/Diablo II, etc.
 
Nagorak said:
He more or less agreed with you that the integrated Savage 4 just sucked, so what are you arguing about? If it sucks, it sucks. Still, when has integrated graphics ever been any good? It's mostly useful in that you can load up Windows without having a real graphics card and do most things at the desktop, as well as play 2D games. I think you may have been expecting a little too much from integrated graphics.

I guess I'm trying to say that the Savage4 sucks much, much, much more than the specs seem to indicate.

And yes, I probably was expecting too much from a VIA integrated graphics chipset. Before seeing that crap perform, I had already installed and played around with my little brother's nForce hardware.
 
Chalnoth said:
And yes, I probably was expecting too much from a VIA integrated graphics chipset. Before seeing that crap perform, I had already installed and played around with my little brother's nForce hardware.

Hehe, that'll do it. ;)
 
I don't actually know what caused the hangs in UT, but I think they showed up more because the Metal driver stressed the system an awful lot, precisely because it was so efficient. It was very non-systematic for me, and varied a lot by what motherboard, CPU, etc. were installed.

I once hand-overclocked about 40 S3 boards using the Unreal flyby to test: if they didn't hang in about 90 minutes (that was inside a case with average to poor cooling), they were invariably good to go. It was an interesting experiment, seeing what the spread was; it taught me a lot about how conservatively the hardware companies set the BIOS.

I must admit I'm really encouraged by how many people noticed the Metal port was good. I thought for a long while it was slipped out so quietly nobody had noticed.

BTW, as far as I'm aware, Metal development had ceased before the integrated parts shipped. I certainly didn't do anything on it :)
 
OpenGL guy said:
OVERLORD said:
:oops: I didn't know I was talking to a MOD!

Having used MANY tweaked S3 drivers in the past this gives me some authority bud! Now go moderate yourself!
hax is not a "mod". He is a person who worked on the ICD at S3 (in case you didn't figure it out). I, also, worked on the ICD while at S3 (hence my handle here).

I know he's no MOD (that was an attempt at sarcasm, as he demanded closure; I guess you missed it). I thought I might be in the presence of some ex-S3 guys but didn't care to ask, as I was on my way home before I posted my comment yesterday. (It was 23:00 GMT last night; I couldn't continue chatting as I'd finished my shift.)

The driver situation was insane, either requiring renaming of executables or editing the registry for some games or versions of 3DMark to function correctly. How many schemes would I need to get all games functioning as the programmer intended? With the support S3 users got, the experience was more Plug and PRAY that you didn't have to revert back to a previous build.

Some drivers rendered previously functioning games unplayable. Corrupted textures, constant lockups, and even being dumped to the desktop after startup were aplenty.
 
hax said:
OVERLORD said:
Having used MANY tweaked S3 drivers in the past this gives me some authority bud! Now go moderate yourself!

Haha. You can't give specific points about the modified ICD, so you go on to make general attacks on S3 (which were completely irrelevant), then you go on to attack me, all the while still not having a point to bring up.

Using tweaked S3 drivers...on Savage3D. Sorry bud, you weren't there for the updated ICD. Like I said, no basis of experience.

Most had problems like Z-buffer errors, performance issues, or corrupt textures, sometimes requiring patches due to black textures/holes appearing on certain maps.

Having moved from the poor MATROX Mystique GL support to the 3DFX Voodoo Rush with its naff GLIDE support, and later the S3 Savage, why do you think some users never want to touch these guys, given their history? Most turned their backs on the user, having sold them a sub-par product.

It's all good and well quoting future driver support, but when it fails to materialise, don't blame the users for their frustrations, when these companies couldn't/didn't give a damn before.
 
OVERLORD said:
SNIP BORING RANT ON S3

Sorry buddy, but he is right.
I had an S4, and while it wasn't perfect, it performed decently, and had better IQ than a TNT (and free trilinear!, whereas trilinear was sloooow on a TNT, and wasn't even real trilinear, just dithered, ugh).

I LOVED the MeTaL UT port, with the high res textures.
Made UT look better than almost any game that came out for years afterwards. I remember a LAN, where a couple of friends were watching me play and they were PISSED that UT looked so crappy for them :) (even if it did run faster)
 
Althornin said:
OVERLORD said:
SNIP BORING RANT ON S3

Sorry buddy, but he is right.
I had an S4, and while it wasn't perfect, it performed decently, and had better IQ than a TNT (and free trilinear!, whereas trilinear was sloooow on a TNT, and wasn't even real trilinear, just dithered, ugh).

I LOVED the MeTaL UT port, with the high res textures.
Made UT look better than almost any game that came out for years afterwards. I remember a LAN, where a couple of friends were watching me play and they were PISSED that UT looked so crappy for them :) (even if it did run faster)

I only said as much in my 1st post before the S3 brigade let rip with their puerile comments. Now you know why most S3 users were less than happy with driver support, or the lack of it in some cases, even having to petition these juveniles. Unfortunately some in the industry still treat support as a game, failing to see how this can hurt their business after having already taken your hard-earned cash.
 
Althornin said:
(and free trilinear!, whereas trilinear was sloooow on a TNT, and wasn't even real trilinear, just dithered, ugh).

Little clarification on that:
The TNT had either slow trilinear, or bad looking trilinear. It wasn't both. That is, whenever multitexturing was enabled, the TNT used MIP map dithering, which looked like crap, but ran at about the same speed as normal bilinear. When only single texturing was used, it combined the two pipelines to output one trilinear pixel per clock. It was exceedingly slow (1/2 the performance...), but it looked very good.
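The trade-off above can be sketched in fill-rate terms. The 90 MHz core clock and two pixel pipelines below are assumed ballpark figures for the original TNT, not confirmed specs:

```python
# Fill-rate trade-off described above, in Mpixels/s.
# Assumption: ~90 MHz core clock, 2 pixel pipelines (original TNT ballpark).

CLOCK_MHZ = 90
PIPES = 2

# Multitexturing path: MIP-map dithering approximates trilinear,
# so each pipe still outputs one (ugly) pixel per clock.
dithered_fill = CLOCK_MHZ * PIPES          # 180 Mpix/s, bilinear-speed

# Single-texturing path: both pipes combine to produce one true
# trilinear pixel per clock, halving the fill rate.
trilinear_fill = CLOCK_MHZ * PIPES / 2     # 90 Mpix/s, "1/2 the performance"

print(trilinear_fill / dithered_fill)      # 0.5
```

So the card offered bilinear-speed dithered "trilinear" under multitexturing, or good-looking true trilinear at half the fill rate under single texturing, but never both at once.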

Additionally, if I remember correctly, the TNT lined up more with the Savage3D in release time. The Savage3D had horrid drivers, and the TNT easily trounced it in performance. The Savage4 was a definite improvement, but was again beset by poor drivers. It ran UT very well, but that's about it...in pretty much everything else, the TNT2 stomped it.

I LOVED the MeTaL UT port, with the high res textures.
Made UT look better than almost any game that came out for years afterwards. I remember a LAN, where a couple of friends were watching me play and they were PISSED that UT looked so crappy for them :) (even if it did run faster)

The great thing is that now with Vogel's OpenGL driver for UT, you can see those textures on any card that supports the GL extensions :)
 
OVERLORD said:
I only said as much in my 1st post before the S3 brigade let rip with their puerile comments. Now you know why most S3 users were less than happy with driver support, or the lack of it in some cases, even having to petition these juveniles.
Good way to support your position. :rolleyes:
Unfortunately some in the industry still treat support as a game, failing to see how this can hurt their business after having already taken your hard-earned cash.
Maybe you should go work in the real world and then you'll see that engineers have very little say as to what problems get addressed and when. If you have problems with S3, fine, but insulting people who worked on the project won't get you anywhere, especially because you have no clue about what went on internally.

And hax is correct, the Performance ICD was actually quite good. It supported the Savage MX/4 and all later chips. You were complaining about the Savage 3D: By the time the Performance ICD was released, S3 had already EOL'd (end-of-lifed) the product and we (i.e. the company) weren't supporting it anymore.
 
Returning to the initial question, I think that all of you should bear in mind that the vast majority of S3's best people left before/after the graphics division was bought by VIA, so chances are that these new chips have little in common with S3's ancient attempts. The only thing that links them is S3's old roadmap, which VIA seems to be following to a certain extent. On a side note, people, would you please CHILL!!
 
OpenGL guy said:
OVERLORD said:
I only said as much in my 1st post before the S3 brigade let rip with their puerile comments. Now you know why most S3 users were less than happy with driver support, or the lack of it in some cases, even having to petition these juveniles.
Good way to support your position. :rolleyes:
Unfortunately some in the industry still treat support as a game, failing to see how this can hurt their business after having already taken your hard-earned cash.
Maybe you should go work in the real world and then you'll see that engineers have very little say as to what problems get addressed and when. If you have problems with S3, fine, but insulting people who worked on the project won't get you anywhere, especially because you have no clue about what went on internally.

And hax is correct, the Performance ICD was actually quite good. It supported the Savage MX/4 and all later chips. You were complaining about the Savage 3D: By the time the Performance ICD was released, S3 had already EOL'd (end-of-lifed) the product and we (i.e. the company) weren't supporting it anymore.

Sorry bud for the rants and name calling, but I do live in the real world. I work in the IT industry, actually supporting users in the financial sector (maybe the reason for my affable nature), where minimizing lost time is a prerequisite for making money.

We all have to operate within the BUSINESS constraints, not just engineers.

But the defence of S3's position with regard to support is wasted. Better to have been more vocal during those times of duress than here, when it doesn't make any difference to the final outcome, which in S3's and 3DFX's case was poor business sense.

Questions sent directly/indirectly to S3 from numerous S3-related forums, at the time when it mattered most, remained unanswered.

One petition followed by another. Unanswered!

Now that you're championing S3's cause, how about answering a few questions?

What became of the Z400 & Z600 projects that were rumored? Did they ever get T&L fully operational? Why no DDR parts?
 