My Intel X3100 does DX10 :)

I've always been a fan of 'quirky' 3D accelerators, and over the years I've had some early Matrox cards, PowerVR, the Kyro II, and the Radeon 8500... but of all those, only the Radeon 8500 eventually lived up to its full potential. The rest were let down by poor driver support, which often made games not run, not perform adequately, or just not render properly, making them unplayable.

I don't think you should assume that the games were "faultless" - many weren't. Not checking caps etc. was common.
 
Thanks for the answers.
Unfortunately, convincing my friend to spend an extra £250+ on a laptop with a decent GPU is not going to be easy.
When I told him I found him a laptop for £259.00 he said "is that the cheapest you could find" :(
 
I don't think you should assume that the games were "faultless" - many weren't. Not checking caps etc. was common.

I know... The Kyro II especially suffered from that, because its deferred rendering nature required reasonably strict handling of BeginScene()/EndScene() and the availability of zbuffer/rendertarget contents and such. One of the fixes in 3DMark2001SE was actually to make it render properly on Kyro II.
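To illustrate what that strict handling means, here's a minimal sketch of the scene bracketing a deferred renderer expects (D3D9-style API for familiarity; the Kyro II was a DX7/DX8-era part, but the contract is the same):

```cpp
#include <d3d9.h>

// Minimal sketch: a deferred renderer like the Kyro II bins geometry and
// only rasterizes around EndScene(), so every draw call must sit inside a
// proper BeginScene()/EndScene() pair, and Z-buffer/render target contents
// can't be assumed to persist outside it.
void RenderFrame(IDirect3DDevice9* device, UINT numVerts, UINT numTris)
{
    device->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                  D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
    if (SUCCEEDED(device->BeginScene()))
    {
        device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                     numVerts, 0, numTris);
        device->EndScene();   // the tiler actually rasterizes around here
    }
    device->Present(NULL, NULL, NULL, NULL);
    // Drawing outside the pair, or reading Z mid-scene, may happen to work
    // on an immediate-mode card but break on a deferred one.
}
```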

In the end it's irrelevant though... Whether it's the drivers or the games that are the cause... if you cannot play a lot of games because of bugs, the videocard is just poor value for money. I traded in the Kyro II... I think for a GeForce2 GTS.
 
I know... The Kyro II especially suffered from that, because its deferred rendering nature required reasonably strict handling of BeginScene()/EndScene() and the availability of zbuffer/rendertarget contents and such. One of the fixes in 3DMark2001SE was actually to make it render properly on Kyro II.
There was that, but there were other things that just made one want to :rolleyes:
 
3DMark really isn't particularly representative here; plus the final score will be dominated by the CPU component of the scoring in these types of scenarios. If you really must look at 3DMark, then it's best to find something that has a breakdown of the individual tests and feature tests.
The only situation in which the discrete embedded PC GPUs have an order-of-magnitude performance advantage over the GMA IGP series (950 and up) is heavily vertex-bound scenarios, and then only when the GMA is of the vertex-neutered kind. Or, IOW, a capable vertex cruncher is order(s)-of-magnitude faster than a software SIMD pipeline; stop the press.

You could also say the discrete cores are infinitely better at FSAA, given the GMAs simply don't do it (the GMA 500 is expected to be the first GMA to address that).

FWIW, two things have been the source of suffering along the GMA line (and I mean that literally, from the position of somebody who's done development on those) - the flaky bandwidth under UMA (though deferred rasterization does help there) and sub-par driver support, the latter being the overwhelming factor by far. That said, discrete embedded PC GPUs are not bandwidth kings either, but at least their driver support is on another plane of existence compared to Intel's.
 
(the GMA 500 is expected to be the first GMA to address that)

You mean the PowerVR SGX535, aka the IGP in Poulsbo for the Menlow platform? Yeah, that's right, it's not Intel-developed technology...

EDIT:

The official 15.9 drivers are out, and the Release Notes do mention it's a DX10 driver.
 
The official 15.9 drivers are out, and the Release Notes do mention it's a DX10 driver.

Ah, thanks, I hadn't discovered them yet.
Just installed them... they do run 3DMark Vantage slightly better, but still nowhere near well enough to complete all tests and get a score.
There are also still some problems with the DX10 examples from the SDK.
 
The 15.9 driver might not have true DX10 support. Here's an earlier report about 15.9: http://xtreview.com/addcomment-id-4719-view-Intel-DX10-support.html

"Today we are counting the last hours of the first quarter, and it is possible to tell about the absence of directX 10 support for g35 intel chipset . DirectX 10 support for chipsets Intel g35, GM965 and GL960 will appear with drivers version 15.9, which will be released in the end of April. In the case with GM965 and GL960 mobile chipsets directX 10 support will be partial."

According to another, similar article, that "partial" support is instead described as "software".

Google translation:
http://translate.google.ca/translat...?q=Intel+15.9+driver+anamorphic+scaling&hl=en

" In addition, the old G35 chipset, Santa Rosa mobile platform of GM965, GL960 chip set will also be additional support DirectX 10 specifications, but will continue to be part of the software rather than fully support the hardware."

The driver version string for the 15.9 driver is 7.14.10.1472. This page from Intel: http://www.intel.com/support/graphics/sb/CS-020667.htm

says that the 2nd number in 7.14.10.1472 indicates the DX version. From the same page, we can deduce that the 15.9 driver is a DX9 driver.
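As a quick illustration of that decoding scheme (field positions as described on the Intel page; the 14 = DX9 / 15 = DX10 mapping is what that page implies):

```cpp
#include <stdio.h>

// Sketch: in a driver version string like "7.14.10.1472", the second field
// is the DDI/DirectX version the driver targets, according to Intel's page.
int main(void)
{
    const char* version = "7.14.10.1472";
    int os = 0, ddi = 0, hi = 0, lo = 0;
    if (sscanf(version, "%d.%d.%d.%d", &os, &ddi, &hi, &lo) == 4)
        printf("DDI field: %d -> %s driver\n", ddi,
               ddi >= 15 ? "DX10" : "DX9");
    return 0;
}
```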

Perhaps the 15.9 driver only enables DX10 support through software rendering. I have read that Intel had to rush the driver in order to get DX10 support through, since DX10 is a requirement for getting Vista Premium certification for new GPUs.
 
DX10 demands that you support every feature. If they use software for some things, fine, as long as it works.
They've already enabled some extra things since the beta release. For example, the predicated drawing example did not speed up with the beta driver when an occluder was in view, but it does now. If they can do that sort of thing all in hardware, then how bad can it really be?
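For reference, this is roughly what the SDK's predicated drawing sample boils down to, as far as I can tell (helper names are made up):

```cpp
#include <d3d10.h>

void DrawBoundingBox(ID3D10Device* device);     // hypothetical: cheap proxy
void DrawExpensiveObject(ID3D10Device* device); // hypothetical: real mesh

// Sketch: draw a cheap bounding volume inside an occlusion predicate, then
// let the GPU skip the expensive draw when the volume was fully occluded.
void DrawWithPredication(ID3D10Device* device)
{
    D3D10_QUERY_DESC qd = {};
    qd.Query = D3D10_QUERY_OCCLUSION_PREDICATE;
    ID3D10Predicate* predicate = NULL;
    if (FAILED(device->CreatePredicate(&qd, &predicate)))
        return;

    predicate->Begin();
    DrawBoundingBox(device);
    predicate->End();

    // FALSE here follows the usual sample usage: the draw gets skipped when
    // the predicate reports that no samples passed.
    device->SetPredication(predicate, FALSE);
    DrawExpensiveObject(device);
    device->SetPredication(NULL, FALSE);
    predicate->Release();
}
```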

I'm quite sure it's not rendering everything in software; most SDK samples run at reasonable framerates compared to its DX9 performance, and even Crysis and BioShock work okay-ish in DX10 mode.
So yes, the DX10 drivers are obviously not completely functional yet... some things are still bugged or just crash, but I'm quite sure they'll support most, if not all features of DX10 in hardware.

Looking at a version number says nothing. It supports DX10, therefore it is a DX10 driver (DDI version 10 in dxdiag).

Why do people just want to hack at Intel's IGPs and drivers so badly? Sure, they're not perfect, but at least be realistic. It's not rendering in software just because the version number is not what you expect, geez... Try using some common sense.
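And that's a check anyone can do themselves, instead of reading tea leaves from version strings; a minimal sketch:

```cpp
#include <d3d10.h>

// Sketch: just ask the runtime for a hardware D3D10 device. If this
// succeeds, the installed driver exposes the DX10 DDI, whatever its file
// version happens to be called.
bool HasHardwareDX10()
{
    ID3D10Device* device = NULL;
    HRESULT hr = D3D10CreateDevice(NULL,                       // default adapter
                                   D3D10_DRIVER_TYPE_HARDWARE, // not the ref rast
                                   NULL, 0, D3D10_SDK_VERSION, &device);
    if (FAILED(hr))
        return false;
    device->Release();
    return true;
}
```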
 
Scali: NVM, I heard that these are indeed DX10 drivers and the naming is a typo. The installation shows it as 7.15.xx.xx. Still doesn't change the fact that G45 is the real DX10 solution and GM965/G35 is a half-hearted one :p (I know DX10 needs a minimum feature set to get the certification, but GM965/G35 is really the minimum).
 
Scali: NVM, I heard that these are indeed DX10 drivers and the naming is a typo. The installation shows it as 7.15.xx.xx. Still doesn't change the fact that G45 is the real DX10 solution and GM965/G35 is a half-hearted one :p (I know DX10 needs a minimum feature set to get the certification, but GM965/G35 is really the minimum).

Actually, DX10 requires an exact feature set. Not min, max or medium: you either support all of its features or you don't support DX10. And from what Intel has posted, the G45 doesn't bring all that much to the table... 2 more EUs, higher clock frequency and a few architectural tweaks.
 
Indeed, X4000 will basically be a beefier, faster version of X3000. As far as we know, there are no extra features.
G965 is the half-hearted one. Apparently that chip was released even though the DX10 hardware wasn't fully functional, and therefore DX10 is not enabled on it. GM965 is an updated architecture, and DX10 is enabled by this driver.
Of course it could still be that some features are missing or broken on GM965 and have to be fixed/emulated in software (and that these will be fixed in X4000), but so far the DX10 support looks convincing. The examples with geometry shaders work, as does the predicated rendering... I'd say those are some of the biggest hardware changes in DX10. Considering this all works, I'm inclined to think that some of the examples that don't work, or don't work properly, fail mostly because of driver bugs, not because of major things missing/broken in the hardware. It seems that the main problems are memory management and precision/rounding control at this point (oh, and of course performance :)).

Okay, so the hardware is slow and the drivers are buggy. But they DO support DX10, can't you at least give Intel that?
 
Actually, DX10 requires an exact feature set. Not min, max or medium: you either support all of its features or you don't support DX10. And from what Intel has posted, the G45 doesn't bring all that much to the table... 2 more EUs, higher clock frequency and a few architectural tweaks.

There are still a few optional features which were mentioned in the MS presentation (see the sketch after this list):

Greater than single sampled MSAA = optional
32-bit FP Filtering = optional
RGB32 Rendertarget = optional (supported in Gen5, aka GMA X4500)
16-bit UNORM Blending = optional
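For what it's worth, those optional bits can be probed at runtime instead of read off a slide; a sketch, assuming I remember the D3D10 flag names correctly:

```cpp
#include <d3d10.h>
#include <stdio.h>

// Sketch: CheckFormatSupport/CheckMultisampleQualityLevels report what the
// driver and hardware actually expose for the optional DX10 features.
void ReportOptionalFeatures(ID3D10Device* device)
{
    UINT support = 0;

    device->CheckFormatSupport(DXGI_FORMAT_R32G32B32A32_FLOAT, &support);
    printf("32-bit FP filtering: %d\n",
           (support & D3D10_FORMAT_SUPPORT_SHADER_SAMPLE) != 0);

    device->CheckFormatSupport(DXGI_FORMAT_R32G32B32_FLOAT, &support);
    printf("RGB32 rendertarget:  %d\n",
           (support & D3D10_FORMAT_SUPPORT_RENDER_TARGET) != 0);

    device->CheckFormatSupport(DXGI_FORMAT_R16G16B16A16_UNORM, &support);
    printf("16-bit UNORM blend:  %d\n",
           (support & D3D10_FORMAT_SUPPORT_BLENDABLE) != 0);

    UINT levels = 0;   // >1-sample MSAA on a common backbuffer format
    device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM, 4, &levels);
    printf("4x MSAA:             %d\n", levels > 0);
}
```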

Okay, so the hardware is slow and the drivers are buggy. But they DO support DX10, can't you at least give Intel that?

Happy they are improving, but it's nowhere near enough. It's so horrible compared to the competition, it's hard to give them credit :p.

Anyway, Gen5 should do a lot in terms of performance. 25% more EUs (10 vs 8), higher clock (65nm), more memory bandwidth, >512MB DVMT, 2x hardware vertex processing throughput.

Currently, GM965 users report that they are getting pathetic performance in older games like Civ4 and NFSU (the first one!), and that these run much faster with software vertex processing than with hardware. X4500 should remedy that problematic performance.

I have a request. Would you mind testing Crysis in DX9 and DX10 using the game's benchmark and telling me how it performs? Thanks.

(I use G965 IGP)
 
You can add programmable tessellation, raytracing, thai massages or whatever else you can think of as optional features. These do not condition or affect support for DX10. In order to have DX10 support you implement all of the mandatory features, or you don't support it at all. There is no such concept as partial DX10 support.
 
Happy they are improving, but it's nowhere near enough. It's so horrible compared to the competition, it's hard to give them credit :p.

Not giving them credit is one thing. Spreading nonsense about how their DX10 support would only be 'partial' or is software-emulated or whatever is something else altogether.
Fact is, there is an official DX10 driver out, and the X3100 DOES support all the required DX10 features (there is no other way). The only problem at this point is that the drivers are still quite buggy, so not everything works properly. Many things already do, however.

I have a request. Would you mind testing Crysis in DX9 and DX10 using the game's benchmark and telling me how it performs? Thanks.

I don't have exact figures right now, I'll have to run the benchmarks again when I get home... but roughly I recall the DX9 test running at about 6 fps on average, and the DX10 benchmark at about 4 fps on average, with the same settings (because of a bug, it is not possible to choose any resolution other than 1280x800 on my system in DX10).

(I use G965 IGP)

Why are you in this discussion then? As I already said, the G965 is an older chip, where Intel has abandoned DX10 support. I have a GM965 (X3100), where DX10 simply works. Find a machine with such a chip and install the 15.9 driver, if you don't believe me.
 
Not giving them credit is one thing. Spreading nonsense about how their DX10 support would only be 'partial' or is software-emulated or whatever is something else altogether.
Fact is, there is an official DX10 driver out, and the X3100 DOES support all the required DX10 features (there is no other way). The only problem at this point is that the drivers are still quite buggy, so not everything works properly. Many things already do, however.

Not nonsense. It was an article about the 15.9 driver, saying the GM965 would have support for DX10, but in software.

Oh whatever, I told you it was in the Microsoft presentation. Search for it if you do not believe me. There are still things in DX10 that Intel graphics do not fully enable that ATI/Nvidia do. (And about that thai massage comment by Morgoth, you are just being a jerk, oh sorry I meant idiot)

Why are you in this discussion then? As I already said, the G965 is an older chip, where Intel has abandoned DX10 support. I have a GM965 (X3100), where DX10 simply works. Find a machine with such a chip and install the 15.9 driver, if you don't believe me.

Doesn't matter. The basic architecture of G965 is identical to GM965, and since I use it, driver development on both fronts is important to me. The 15.9 is a significant driver even for DX9, since it brings performance improvements that haven't been there since the first driver with hardware VS support, from what I found.

The lack of performance is ENOUGH to justify calling it a half-hearted implementation. Implementing it so it's not running at optimal speeds is half-hearted.

And there ARE optional features in DX10. DX10.1 makes those optional features mandatory: 32-bit FP Filtering, 16-bit UNORM blending, RGB32 Rendertarget.

http://translate.google.ca/translat...Intel+DX10+non+anamorphic&start=20&hl=en&sa=N

"New drive for desktop G35, notebook GM965/GL960 eventually bring the legends of the DX10 support, but because of structure, some specifications is still software support rather than hardware support,..."

I have an Intel presentation that lists some advancements of Gen 5 vs Gen 4:
-Significant improvement of transcendental instruction performance
-Increased latency coverage support at higher frequency
-Support for shaders with longer instruction lengths

http://softwarecommunity.intel.com/articles/eng/1487.htm

Shader precision goes down from G965's 32-bit to 24-bit for all the other IGPs. Things like max 2D texture size sit between the DX9 (2k x 2k) and DX10 (8k x 8k) requirements, at 4k x 4k.
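Anyone can verify those DX9-visible limits on their own chip through the caps, by the way; a quick sketch:

```cpp
#include <d3d9.h>
#include <stdio.h>

// Sketch: dump the texture size and shader version limits the driver
// reports through the DX9 caps (DX10 has fixed minimums instead of caps).
void PrintLimits(IDirect3D9* d3d9)
{
    D3DCAPS9 caps;
    if (FAILED(d3d9->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return;
    printf("Max texture: %lux%lu\n",
           (unsigned long)caps.MaxTextureWidth,
           (unsigned long)caps.MaxTextureHeight);
    printf("VS %lu.%lu / PS %lu.%lu\n",
           (unsigned long)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
           (unsigned long)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion),
           (unsigned long)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           (unsigned long)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
}
```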
 
Not nonsense. It was an article about the 15.9 driver, saying the GM965 would have support for DX10, but in software.

What article is that?
And it's still nonsense, because obviously the X3100 is using hardware acceleration for DX10.

Oh whatever, I told you it was in the Microsoft presentation. Search for it if you do not believe me. There are still things in DX10 that Intel graphics do not fully enable that ATI/Nvidia do. (And about that thai massage comment by Morgoth, you are just being a jerk, oh sorry I meant idiot)

So? Unless these things are part of the required DX10 featureset, I don't see how that makes the X3100 any less DX10-compatible.

Doesn't matter. The basic architecture of G965 is identical to GM965, and since I use it, driver development on both fronts is important to me. The 15.9 is a significant driver even for DX9, since it brings performance improvements that haven't been there since the first driver with hardware VS support, from what I found.

It does matter. The basic architecture is NOT the same. They're similar, but one can do DX10, the other cannot.
And the DX9-portion of the driver is not relevant to the DX10-portion.

The lack of performance is ENOUGH to justify calling it a half-hearted implementation. Implementing it so it's not running at optimal speeds is half-hearted.

You're using a G965, how can you judge DX10 performance in the first place?

And there ARE optional features in DX10. DX10.1 makes those optional features mandatory: 32-bit FP Filtering, 16-bit UNORM blending, RGB32 Rendertarget.

Yes, optional... meaning you don't have to implement them for full DX10 compatibility. Nobody ever claimed DX10.1 compatibility, so I don't see the relevance. nVidia doesn't have DX10.1 compatibility either.

"New drive for desktop G35, notebook GM965/GL960 eventually bring the legends of the DX10 support, but because of structure, some specifications is still software support rather than hardware support,..."

Why do they fail to mention what exactly is hardware, what is software, and why?
Since DX10 uses a unified shader architecture, if the hardware can run pixel shaders, it can run vertex and geometry shaders as well. So even if the drivers would at this point use software vertex shading (which wouldn't make sense to me, because they'd have to write a complete software VP path, while they already have the unified shaders going for pixel shading anyway), it would not be a limitation of the hardware.
Aside from that, whether you support a feature in software or hardware is irrelevant. As long as you support the required features, you're DX10-compatible, no matter how you support them.
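To make the unified-shader point concrete: SM4.0 is one instruction set across all stages, so the same source compiles against all three 4.0 profiles and runs on the same EUs (a sketch from memory; the entry point names are made up):

```cpp
#include <d3d10.h>
#include <string.h>

// Sketch: compile one HLSL source (assumed to define VSMain/GSMain/PSMain)
// for the vertex, geometry and pixel stages; same SM4.0 ISA underneath.
void CompileAllStages(const char* hlslSource)
{
    const char* profiles[] = { "vs_4_0", "gs_4_0", "ps_4_0" };
    const char* entries[]  = { "VSMain", "GSMain", "PSMain" };
    for (int i = 0; i < 3; ++i)
    {
        ID3D10Blob* code = NULL;
        ID3D10Blob* errors = NULL;
        D3D10CompileShader(hlslSource, strlen(hlslSource), "shaders.hlsl",
                           NULL, NULL, entries[i], profiles[i], 0,
                           &code, &errors);
        if (code)   code->Release();
        if (errors) errors->Release();
    }
}
```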

Shader precision goes down from G965's 32-bit to 24-bit for all the other IGPs. Things like max 2D texture size sit between the DX9 (2k x 2k) and DX10 (8k x 8k) requirements, at 4k x 4k.

If that's true, then they would not meet the DX10 specs; they would not pass WHQL testing on the drivers either, I suppose.
Another thing is that they list VS and PS as 3.0, while obviously the new DX10 driver runs SM 4.0, and it is too fast to be all software. At least the pixel shaders must be hardware. Vertex/geometry shaders *could* be software, but being unified shaders, they should be able to run on hardware.
So are they listing hardware features, or what the driver supported at the time of writing?
 
1. The article is the same one linked in the later part of my post.

It does matter. The basic architecture is NOT the same. They're similar, but one can do DX10, the other cannot.
And the DX9-portion of the driver is not relevant to the DX10-portion.

The basic architecture IS the same. G965 had hardware bugs that prevented DX10 from being enabled. Initially the part had been said to be DX10/SM4.0 for months before introduction, but after it came out, that claim slowly faded away. It's disabled for other reasons; it could be marketing too (otherwise how do you really differentiate G35 when it's basically G965 with a different name and a bug fix).

I already told you that the shader precision is higher on G965 than G35/GM965.

You're using a G965, how can you judge DX10 performance in the first place?

First, you already told me some details, and there are a couple of tests by individuals on the net. For gaming, which is what's most relevant for DX10, it's useless. I could understand it if the X3100 could at least play pre-DX9 games well, but for the most part it doesn't. Making a part that's otherwise GeForce 2-level, but making it DX10-compatible, isn't really nice.

About the article, it lists some of the specifications, not all. So there might still be finer details which are not done in hardware, but rather in software.

If that's true, then they would not meet the DX10 specs; they would not pass WHQL testing on the drivers either, I suppose.
Another thing is that they list VS and PS as 3.0, while obviously the new DX10 driver runs SM 4.0, and it is too fast to be all software. At least the pixel shaders must be hardware. Vertex/geometry shaders *could* be software, but being unified shaders, they should be able to run on hardware.
So are they listing hardware features, or what the driver supported at the time of writing?

In games, especially ones that are a couple of years old, software VS is much faster than hardware VS. Besides that, Intel might have rushed the 15.9 drivers to get DX10 support in, since it's part of the Vista Premium certification requirement for new PCs. They took a year to enable hardware VS, and then we found out that a significant amount of the time the hardware is really slow. As the article indicates, there might still be some parts that are done in drivers only, but really well optimized.

In software mode, the drivers call the pipeline the "processor-specific geometry pipeline", or PSGP, which allows them to take advantage of CPU-specific instruction set enhancements (like SSE). Before 15.9, some features might have been done in software with no optimization, but in 15.9 most are done in hardware and some are done with the PSGP.
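For context, the software-vs-hardware VS split is something the application chooses at device creation time in D3D9; a minimal sketch (window and present parameters assumed set up elsewhere):

```cpp
#include <d3d9.h>

// Sketch: create the device with software or hardware vertex processing;
// this is how people compare the two paths on these chips.
HRESULT CreateDeviceWithVP(IDirect3D9* d3d9, HWND hwnd,
                           D3DPRESENT_PARAMETERS* d3dpp, BOOL softwareVP,
                           IDirect3DDevice9** outDevice)
{
    DWORD flags = softwareVP ? D3DCREATE_SOFTWARE_VERTEXPROCESSING  // CPU/PSGP path
                             : D3DCREATE_HARDWARE_VERTEXPROCESSING; // GMA EUs
    return d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                              flags, d3dpp, outDevice);
}
```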
 