i915G review at IXBT - why no h/w vertex shaders?

Guden Oden

Of course there aren't any hardware vertex shaders since none are implemented in silicon in the GPU itself, but why isn't the driver simply emulating hardware vertex shaders in software?

Since the i915 series is aimed at Prescott and SSE3, there should be plenty of opportunity to write fine-tuned drivers that perform much better than the generic vertex shading path provided by DX9 itself. It would also let all the software which currently insists on hardware vertex shading actually run (a requirement which is weird as hell anyway, coz the only difference is that hardware will - hopefully - run faster compared to software shading).
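To give a feel for what such a CPU path looks like, here's a minimal hand-rolled sketch (my own illustration, not Intel's driver code) of the kind of SSE inner loop a driver-side vertex shader would spend most of its time in - a 4x4 matrix transform of a single vertex:

```cpp
// Minimal sketch, NOT Intel's driver: transform one vec4 vertex by a
// column-major 4x4 matrix with SSE intrinsics, the sort of inner loop a
// "software" vertex shader running on the host CPU boils down to.
#include <xmmintrin.h>

void transform_vertex(const float matrix[16],  // column-major 4x4
                      const float in[4],       // x, y, z, w
                      float out[4])
{
    __m128 col0 = _mm_loadu_ps(&matrix[0]);
    __m128 col1 = _mm_loadu_ps(&matrix[4]);
    __m128 col2 = _mm_loadu_ps(&matrix[8]);
    __m128 col3 = _mm_loadu_ps(&matrix[12]);

    // out = M * in: broadcast each input component and accumulate
    __m128 r = _mm_mul_ps(col0, _mm_set1_ps(in[0]));
    r = _mm_add_ps(r, _mm_mul_ps(col1, _mm_set1_ps(in[1])));
    r = _mm_add_ps(r, _mm_mul_ps(col2, _mm_set1_ps(in[2])));
    r = _mm_add_ps(r, _mm_mul_ps(col3, _mm_set1_ps(in[3])));

    _mm_storeu_ps(out, r);
}
```

A tuned driver would stream whole vertex buffers through loops like this (SSE3's horizontal adds help with dot-product-style layouts too), which is exactly why a Prescott-targeted software path could plausibly beat the generic DX9 one.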

Will Intel add this, or do they simply not give a damn, as integrated gfx is typically aimed at the office landscape and clueless OEM-buying newbs who simply don't know any better; people who really wouldn't need high-performance vertex shading anyway?

I would think most DX9 games would require hardware vertex shaders, since even simple test programs apparently do it, and what then is the point of even implementing DX9 in an integrated chipset? Longhorn? To quote Kefka: "Uwaa ha ha". By the time that OS launches, Intel will likely be on its next-NEXT gen of chipsets. ;)

Just strikes me as kind of odd. 3dfx were going to pioneer this, but a sudden bankruptcy got in the way of that, and then PowerVR were going to do the same with that Kyro2 derivative, but that chip got cancelled, or whatever. Dunno what happened there really. Now Intel had the chance to really show us what software can do compared to hardware, and they blew it without even trying! Damn, what a pity. :(
 
FWIW the Radeon 8500 based chipset from ATI (IGP 9100) has no hardware vertex shader either, and neither does Xabre from SiS. The justification is that SSE/3DNow vector floating point units on modern CPUs are fast enough to run vertex shaders without overly starving the pixel pipes. In the integrated sector low cost is top priority and money is saved any way it can be. Intel will "emulate" the shaders on the SSE unit, don't worry. :)

As for why integrated video is subpar, just remember that 75% of the population plays games rarely or never - rarely enough that they can't tell the difference between trash and quality graphics. This is the market Intel targets.
 
Dave B(TotalVR) said:
I'd like to see series 5 in a via chipset with dual channel DDR2 ram ;)

Seeing as how VIA owns S3 I don't believe that to be very likely, at least for the moment. :)
 
If GMA900 runs Longhorn's GUI properly, any additional optimizations and features will be gravy.

One should not *expect* gravy. True, gravy sometimes just happens. But to rely upon the appearance of gravy is... unwise.
 
Nebuchadnezzar said:
akira888 said:
Dave B(TotalVR) said:
I'd like to see series 5 in a via chipset with dual channel DDR2 ram ;)

Seeing as how VIA owns S3 I don't believe that to be very likely, at least for the moment. :)
S3 != Series5/PowerVR

Yeah, what I meant to say was that since Via already owns a graphics company (S3) I doubt they'll subcontract out with another firm (ImgTec/PVR) for their integrated boards.

But in retrospect stranger things have happened.
 
It's an Intel board, and what do Intel make big bucks off? Overpriced CPUs. So of course it makes sense for them not to include hardware vertex shaders, a feature which reduces the need for a higher speed CPU. This is just another stupid marketing gimmick for Intel to sell higher speed, higher priced CPUs. Lame.
 
I think there is a market for decent performing integrated graphics. GMA900 isn't what I hoped it would be. Of course, truth is, it CAN play Far Cry which is nice by itself (even at low resolution).

Course, Dave Orton has said publicly that they will have an integrated DX9 graphics core this year, and I expect something similar from NVIDIA, though without a license from Intel, NV's market is AMD only.

If we can see a simple core with near 9600 non-Pro speeds from ATI and near 5600-class integrated graphics from NV this year, I think they would have a market.

Of course, the top 10 selling PC games from June were all DX8 titles, with the exception of Far Cry.
 
mozmo said:
It's an Intel board, and what do Intel make big bucks off? Overpriced CPUs. So of course it makes sense for them not to include hardware vertex shaders

If you read my original post, you will see I was talking about hardware vertex shaders implemented via software (i.e. running on that overpriced CPU).

How does a program know there are hardware vertex shaders in a GPU anyway? Answer: it DOESN'T. It reads the caps bits provided by DX; if the caps say hardware shaders are present, then it assumes that is the case. It has no way of knowing whether it REALLY is hardware that runs those shaders or a piece of driver software on the host CPU.
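For the skeptics, here's a minimal sketch of that detection path (the API and caps fields are standard D3D9, the little helper function is my own): the application only ever sees the D3DCAPS9 struct the driver filled in.

```cpp
// Minimal sketch: how a DX9 app "detects" vertex shader support. It only sees
// what the driver reports in D3DCAPS9, so a driver that runs the shaders on
// the host CPU can still advertise VS 2.0 and the app is none the wiser.
#include <windows.h>
#include <d3d9.h>

bool ReportsVS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // This is purely the driver's claim; nothing here says the shader
    // units actually exist in silicon.
    return caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);
}
```

Whatever actually executes behind that caps bit - dedicated silicon or an SSE loop inside the driver - looks identical from this side of the API.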
 
Guden Oden said:
How does a program know there are hardware vertex shaders in a GPU anyway? Answer: it DOESN'T. It reads the caps bits provided by DX; if the caps say hardware shaders are present, then it assumes that is the case. It has no way of knowing whether it REALLY is hardware that runs those shaders or a piece of driver software on the host CPU.

Eh, that must be wrong. It's the hardware that gives that information through the DX caps; otherwise all PCs with DX installed but different GPUs would report the same caps, right?
 
Cyberdigitus said:
Guden Oden said:
How does a program know there are hardware vertex shaders in a GPU anyway? Answer: it DOESN'T. It reads the caps bits provided by DX; if the caps say hardware shaders are present, then it assumes that is the case. It has no way of knowing whether it REALLY is hardware that runs those shaders or a piece of driver software on the host CPU.
Eh, that must be wrong. It's the hardware that gives that information through the DX caps; otherwise all PCs with DX installed but different GPUs would report the same caps, right?
No, it's the driver that reports the HW's capabilities to the application. The driver can say whatever it wants about the HW's capabilities.
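On the application side the only real decision is which vertex-processing behaviour flag to pass at device creation - a rough sketch (the hwnd and present parameters are assumed to be set up elsewhere):

```cpp
// Rough sketch: pick hardware vertex processing only if the driver claims VS
// support in its caps, otherwise fall back to the DX9 runtime's own software
// vertex pipeline. Either way, who really executes the shaders is invisible.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceWithBestVP(IDirect3D9* d3d, HWND hwnd,
                                         D3DPRESENT_PARAMETERS* pp)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    DWORD flags = (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
                      ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                      : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, flags, pp, &device);
    return device;
}
```

If the driver doesn't advertise vertex shaders, the runtime's software pipeline steps in; if the driver "lies" and runs them on the CPU itself, the application can't tell the difference either.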
 
Unfortunately or fortunately (depending on who you are), Intel's GMA900 is based on tile-based rendering, but not tile-based deferred rendering. Most architectures from Voodoo till now render via tiles (and in quads?) to increase cache and internal memory efficiency etc., so in that sense they are all tile-based.

TBDR, on the other hand, first organises the whole scene into tiles and removes redundant texturing (and shading?) before anything is drawn - Simon, Kristof and co. can probably help you out on the finer points of this style of rendering, PowerVR's style anyway.

This technique has some awesome benefits, however it has yet to be proven as a high-end PC architecture (although Sega has licensed it for their next high-end arcades). Maybe if ATI and NVIDIA held more patents - or XGI and S3 licensed some tech - it would be more accepted?
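For anyone who hasn't seen the idea before, here's a very rough conceptual sketch of the two TBDR passes (simplified to screen-aligned quads with constant depth - nothing like PowerVR's real hardware, just the shape of the algorithm):

```cpp
// Conceptual TBDR sketch, not PowerVR's actual pipeline: bin primitives into
// screen tiles, then per tile resolve visibility for every pixel BEFORE any
// shading, so occluded fragments never get textured or shaded.
// Assumes quads are non-empty and already clipped to the screen, and that
// framebuffer has width*height entries.
#include <algorithm>
#include <cstdint>
#include <limits>
#include <vector>

constexpr int TILE = 32;  // tile size in pixels

struct Quad { int x0, y0, x1, y1; float depth; uint32_t color; };

void render(const std::vector<Quad>& quads, int width, int height,
            std::vector<uint32_t>& framebuffer)
{
    const int tilesX = (width + TILE - 1) / TILE;
    const int tilesY = (height + TILE - 1) / TILE;
    std::vector<std::vector<int>> bins(tilesX * tilesY);

    // Pass 1: binning - record which primitives overlap which tile.
    for (int i = 0; i < (int)quads.size(); ++i)
        for (int ty = quads[i].y0 / TILE; ty <= (quads[i].y1 - 1) / TILE; ++ty)
            for (int tx = quads[i].x0 / TILE; tx <= (quads[i].x1 - 1) / TILE; ++tx)
                bins[ty * tilesX + tx].push_back(i);

    // Pass 2: per tile, hidden-surface removal first (in what would be small
    // on-chip memory), then "shade" only the frontmost primitive per pixel.
    for (int ty = 0; ty < tilesY; ++ty)
        for (int tx = 0; tx < tilesX; ++tx)
            for (int y = ty * TILE; y < std::min((ty + 1) * TILE, height); ++y)
                for (int x = tx * TILE; x < std::min((tx + 1) * TILE, width); ++x) {
                    float bestZ = std::numeric_limits<float>::max();
                    int best = -1;
                    for (int i : bins[ty * tilesX + tx])
                        if (x >= quads[i].x0 && x < quads[i].x1 &&
                            y >= quads[i].y0 && y < quads[i].y1 &&
                            quads[i].depth < bestZ) { bestZ = quads[i].depth; best = i; }
                    if (best >= 0)
                        framebuffer[y * width + x] = quads[best].color;  // deferred shading step
                }
}
```

The payoff is that second pass: visibility is settled tile by tile before texturing or shading, so overdraw costs almost nothing - that's the "removing redundant texturing" bit.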
 