The Official S3 Discrete GPU Rumours & Speculation Thread

santyhammer

Newcomer
MOD Edit: This has now been hijacked to become the official S3 Discrete GPU Thread.

Not sure if anybody mentioned this before, but... news from November 24:

http://www.theinquirer.net/default.aspx?article=35932
http://vga.zol.com.cn/46/467761.html
http://www.hexus.net/content/item.php?item=6155

It appears S3 has been very busy developing a MONSTER in the shadows!
It's the first time I've seen a multi-core (up to 8) GPU approach, with a 1.2GHz clock, GDDR4/GDDR5, and DX10.1 compatibility. We will have to wait and see more, but it sounds good.

It appears it will be available in Q2 of 2007.
Any thoughts, comments, speculation or more info on this?
 
The drivers aren't the greatest, I agree, but the performance, when they work, doesn't suck; they are targeted at the lower mid-range.
 
The drivers aren't the greatest, I agree, but the performance, when they work, doesn't suck; they are targeted at the lower mid-range.

My post was more of a joke. Really, I will say that S3 will in no way be able to top Nvidia or AMD in the high-end sector. Hopefully the drivers are vastly improved over their previous efforts; then they'll at least possibly be able to provide a good low- to mid-range solution.
 
Ah, sorry, I misunderstood. Yeah, that's true; they don't have the resources to go up against them.

Drivers are shoddy at best. They have gotten better compared to 2-3 years ago, but there's still a lot left to wish for. :D
 
It would be good for the high-end market if we had more than two big players. It would be cool if S3 and Intel (Larrabee) joined the high-end GPU market.

I have a soft spot for S3: years ago I had a very fast 2D card of theirs. As a student I wanted to fiddle around with some of the 2D accelerator functions of that video card, but I couldn't find any info on the internet. So I emailed them to ask if they had the info available online. They didn't, but instead they sent me a big book with full information about how to program the card, what ports it used, etc. Really cool. :)
 
Small vendors like S3 most likely lack the resources to release anything high-end in a timely fashion (as Razor1 pointed out); most of the info above seems to concern low-end solutions, and even then with projected release dates at least half a year later than those of the two major IHVs.

As for Intel and high-end GPUs? I'd first want to see Intel enter the standalone graphics market and see how successful it is before I'd consider such a long-term plan plausible. Frankly (and yes, I could be completely wrong), I don't think Intel would approach the issue in any other fashion either, since the risk involved isn't exactly small.

Currently it seems that Intel has "lost" interest in the PDA/mobile market (gee, I wonder why, after the OMAP2 success rate...) and intends to work in the direction of ultra-mobile PCs (a new category between small laptops and PDAs), for which, by all indications (IMG's news blurbs), they'll use Imagination SGX IP. Whether any of the IP licensed so far makes it into IGPs for PCs/laptops in the future remains open, as does, of course, whether Intel is actually interested in anything standalone.
 
It would be good for the high-end market if we had more than two big players. It would be cool if S3 and Intel (Larrabee) joined the high-end GPU market.
The technology moves quickly enough that obsolescence is better at driving prices down than competition could ever be (barring a monopoly, of course).

Neither Intel nor S3 has ever shown more than a hint of being able to compete at the high end, which is the only place where pricing pressure would be noticeably helpful.

And finally, I rather like the idea of making devs' lives easier by having a small number of platforms to test on. Unless AMD limits ATI's ability to compete, the status quo works nicely.
 
Intel has the necessary engineering talent and know-how to compete in the high-end GPU market; they simply don't seem to care, although that may change due to AMD's possible focus on that area. I find it funny that people seem to think Intel is a bunch of retards, considering they have some of the best engineers in the business (and yes, I'm talking about graphics here, not CPUs). IMHO, they've been pushing crappy IGP after crappy IGP simply because there was no necessity for anything better, and companies generally tend to choose the path of least resistance to reach their goals.
 
Intel has the necessary engineering talent and know-how to compete in the high-end GPU market; they simply don't seem to care, although that may change due to AMD's possible focus on that area. I find it funny that people seem to think Intel is a bunch of retards, considering they have some of the best engineers in the business (and yes, I'm talking about graphics here, not CPUs).

With the one caveat that great math units do not make great GPUs.

IMHO, they've been pushing crappy IGP after crappy IGP simply because there was no necessity for anything better, and companies generally tend to choose the path of least resistance to reach their goals.

Why does that sound to me rather like a pitiful excuse? Their IGPs so far cover roughly anything 2D and some necessary multimedia functionality; for anything 3D - even for just an IGP - both the hardware and the drivers leave plenty of room for complaint. This time they definitely learned how to optimize for 3DMark:

http://www.xbitlabs.com/articles/chipsets/display/ig965-gf6150_12.html

Good for them that no one has yet bothered to analyze basic IQ, since what I've seen from them in terms of basic texture filtering was on the edge of an abomination. And no, designing something as simple as a TMU shouldn't cost a crapload of resources, nor would any experienced engineer neglect a proper implementation.

I would be willing to accept that the graphics team is severely under-resourced, but with that kind of resources you don't go out and design a high-end GPU (or you'll end up with a Parhelia-style shenanigan in the end), and finally I'd need a good explanation for why they have so far used third-party IP for graphics. Neither licenses nor royalties equal "0".
 
It's the first time I've seen a multi-core (up to 8) GPU approach, with a 1.2GHz clock, GDDR4/GDDR5, and DX10.1 compatibility.
You may wish to take into consideration that, when looking at G80 from a certain perspective, it is a 16-core chip running at 1.35GHz, with memory bandwidth equivalent to that of a chip with 2700MHz (5400MHz effective) GDDR4/GDDR5 on a 128-bit bus.
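
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python; the G80 figures assumed here (384-bit bus, 900MHz / 1800MT/s-effective GDDR3) are its published launch specs, and everything else follows from them:

```python
# Back-of-the-envelope check of the bandwidth equivalence above.
# Assumed G80 launch figures: 384-bit bus, 900MHz GDDR3 (1800MT/s effective).

def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
    return bus_width_bits / 8 * effective_mt_s / 1000

print(bandwidth_gb_s(384, 1800))  # G80: 86.4 GB/s
print(bandwidth_gb_s(128, 5400))  # 128-bit bus @ 5400MT/s effective: also 86.4 GB/s
```

Both come out to 86.4 GB/s, which is exactly the equivalence claimed.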


Uttar
 
You may wish to take into consideration that, when looking at G80 from a certain perspective, it is a 16-core chip running at 1.35GHz, with memory bandwidth equivalent to that of a chip with 2700MHz (5400MHz effective) GDDR4/GDDR5 on a 128-bit bus.
Yep, very true. I'm not sure how many "shader" units each S3 core has, but I bet not many, because 8 cores with the same number of shader units as the G80 would be something you could fry an egg on! But looking at the photos, it appears a small fan can cool them. Or perhaps the articles refer to 8 cores = 8 independent shading/stream units? We will see!
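
Just to put rough numbers on that guess, a minimal sketch; the 8-units-per-core figure is pure assumption for illustration (nothing in the articles confirms it), while G80's 128 stream processors at 1.35GHz are the published figures:

```python
# Rough peak-arithmetic comparison; the S3 per-core unit count is a pure
# guess for illustration (nothing in the articles confirms it).

def peak_gflops(alu_count, clock_ghz, flops_per_clock=2):
    """Peak GFLOPS assuming each ALU issues one MADD (2 flops) per clock."""
    return alu_count * flops_per_clock * clock_ghz

g80 = peak_gflops(128, 1.35)   # published G80 figures: ~345.6 GFLOPS
s3 = peak_gflops(8 * 8, 1.2)   # 8 cores x 8 units (guess): ~153.6 GFLOPS
print(g80, s3)
```

Even under that generous guess, such a chip would land well under half of G80's peak arithmetic, which fits the small-fan observation.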
 
Intel has the necessary engineering talent and know-how to compete in the high-end GPU market; they simply don't seem to care, although that may change due to AMD's possible focus on that area. I find it funny that people seem to think Intel is a bunch of retards, considering they have some of the best engineers in the business (and yes, I'm talking about graphics here, not CPUs). IMHO, they've been pushing crappy IGP after crappy IGP simply because there was no necessity for anything better, and companies generally tend to choose the path of least resistance to reach their goals.

What does it mean that Intel has some of the best engineers in the graphics business? NVIDIA and ATI have plenty of people who are among the best in the business. I would find it hard to believe that Intel has anywhere near that level of talent. And I also find it hard to believe that many key employees would leave NVIDIA or ATI to join Intel. Maybe a few looking to jump into a higher position, but other than that, why would you want to work for Intel when there is an opportunity to transform an industry?
 
I find it funny that people seem to think Intel is a bunch of retards, considering they have some of the best engineers in the business

What I think most people are usually referring to is the corporation. After all, you need only look at examples like the Pentium 4 and Itanium. There's certainly no shortage of excellent engineers doing amazing work, but there's definitely something amiss somewhere in the organization as a whole.
 
Imagine S3 entering the x86 CPU/GPU market; I don't doubt it at all. S3 is part of VIA, and VIA has an x86 license, so VIA does the CPU while S3 does the GPU. I can imagine what would happen if S3 and VIA were secretly working on such a project and then smacked both AMD and Intel by releasing a product, maybe in late 2007 / early 2008.
 