First (real) GMA X3000/G965 Benchmarks

I can't see why you'd do different MCHs for the X3000 (G965) and the 3000 (Q965), nor go back to the 945G's GMA 950 core for the G33.

My guess would be that they are the same core with parts disabled.
 
The 3000 and X3000 are indeed the same (or at least very similar; I wouldn't know of any difference, save maybe different clocks), even though the latter officially supports fewer features. The G33's GMA 3100, OTOH, really is a rehashed GMA 950 (just look at the Linux DRI drivers for i915/i965 and you'll see which chip is supported by which driver).
 

I am sure you wanted to say the former. XD

GMA X3000: hardware SM 3.0, Transform & Lighting, Vertex Shader, better Early Z
GMA 3000: everything the X3000 has except the features listed above
 
This is the official list. There's nothing inherent in the GMA 3000 that would keep it from doing these features. The Linux driver certainly runs T&L and vertex shaders (those two are the same thing anyway on any halfway modern hardware) on this chip without having to use software emulation. (Note, though, that it might make sense to do it in software on the GMA 3000 if it's clocked lower than the X3000; apparently the Windows drivers even do this sometimes with the X3000, since it may be faster.)
I don't think the X3000 (or the 3000) has any type of Early Z, though I could be wrong.
 

Interesting. Anyway, Early Z is confirmed by Intel. It's detailed in the GMA 3000 and X3000 developer's guide: http://softwarecommunity.intel.com/articles/eng/1487.htm

Maybe the lack of an Early Z implementation in the drivers is why the performance is so poor.

The differences between the 3000 and X3000 are under the heading "Business vs. Consumer SKU". Are you sure it's really the GMA 3000 on the Linux driver? What's the chipset? I know earlier Windows drivers referred to both the GMA 3000 and the GMA X3000 as "GMA 3000".
 
Ah yes, you're right. Just a bit you can set, it seems.

Well, this is confusing. Even on that site you linked (which is very nice, btw), there is a "3000 series" for the G31/G33 which is basically the old GMA 950. Then there's the GMA 3000, which is actually in the X3000 series...
It even lists the clock of the GMA 3000 as being the same as the GMA X3000's, so I wouldn't know of any hardware difference.
The Linux i965 driver supports the following chipsets: 965Q, 965G, 946GZ, 965GM, 965GME/GLE.
 
Well, according to that very link, the GMA 3000 (946GZ/Q965/Q963) runs vertex processing on the CPU and has a less capable form of Early Z. I'd assume the GMA 3000 is like the graphics version of a Celeron, with features disabled. And the G31/G33 use the GMA 3100.

The funny thing is that the 14.31+ drivers (15.6 on Vista) enable hardware T&L/VS on the X3000/X3100, and they switch between hardware and software T&L/shaders depending on the game. Intel says some games run faster in software mode than in hardware. I didn't notice that in too many games, but on the laptop (X3100) side there are lots of games that are much faster in software than in hardware.

The "switch" between hardware and software isn't even dynamic based on performance measurements, but rather a plain registry toggle, and it's a relatively simple procedure once you get used to it.

Whether the hardware is really that weak, or the software mode is just very well optimized for it, I don't know.
 
Funny behavior on the X3000

Doom 3 Demo: for some reason, 1280x1024 Ultra High doesn't run the game much worse than 640x480 Low on the X3000. The best config is, imo, 1024x768 High, because the fluctuation isn't as big as at 640x480 (which can go from 6-8 to 50 frames depending on the scene), and everything looks so much better.

In the FEAR MP Demo benchmark, it runs much better in DX9 than in DX8.

It must be limited by something if it doesn't perform much better at much lower settings. Perhaps shader performance is the limit? I'd assume it has to do with subpar drivers, because even with the same game engine the performance is greatly different: Team Fortress 2, Counter-Strike: Source, and Portal run much worse than Half-Life 2, or even Half-Life 2: Episode One.

In games like Portal that have bad performance, increasing graphics effects does not affect performance a lot.
 
So what does all this mean?
Don't trust Intel when it comes to graphics hardware. (They have disappointed me time after time.)

It looks like Nvidia will remain king, then.
 
I swapped my old ThinkPad T40 for a new ThinkPad T61 a couple of days ago, so I have a GMA X3100 (GM965) in here.

It runs World of Warcraft reasonably well at 1280x720 (windowed) with detail set to around medium.

I haven't had a chance to try any other tests, but if anyone has requests for tests to run, I'll swap in a lab hard drive and give them a whirl after work one day next week.
 
At the moment I am on this machine while waiting for my graphics card.

Gigabyte G33M-DS2R / C2D E6750 / 2 GB DDR2 PC2-6400 / WXP SP2 / driver 6.14.10.4906

If I can run any tests for you, please tell me.
I am playing NWN at 1680x1050 with medium/high details at 15 FPS (not a problem in this game).

DavidC: Could you tell me which registry entry in Windows switches between software and hardware?

PS: Please excuse my English.
 
Osamar: Hi. To find the place in the registry, search for the game's original file name with an underscore in front; for example, it's _3dmark05.exe for 3DMark05. A value of 0 is the default and means hardware T&L/VS, i.e. the "feature" is off. Setting the value to 1 enables software T&L/VS. If your game is not in the registry, create a DWORD value with the original file name and an underscore at the front of the name. You have to do that in every location where the list exists; just use F3 (the search shortcut) to keep searching and create the DWORD in all of them. There are about a dozen locations, though I think some are redundant.
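
Here's a rough sketch (not DavidC's exact procedure) of how you could hunt those per-game values down with Python's winreg module instead of pressing F3 a dozen times. The _portal.exe value name and the search root below are assumptions/examples; the real per-game DWORDs live wherever your Intel driver created them, so you may need to point the script at a different key.

Code:
import winreg

# Assumptions: the value name follows the "_" + exe-name scheme from the post,
# and the driver's per-application settings sit somewhere under the display
# driver class keys. Adjust GAME_VALUE and SEARCH_ROOT for your setup.
GAME_VALUE = "_portal.exe"
SEARCH_ROOT = r"SYSTEM\CurrentControlSet\Control\Class"

def find_value(root, subkey, name, results):
    """Recursively collect every key under subkey that holds a value 'name'."""
    try:
        key = winreg.OpenKey(root, subkey)
    except OSError:
        return  # no permission or key vanished; skip it
    with key:
        try:
            value, _type = winreg.QueryValueEx(key, name)
            results.append((subkey, value))
        except OSError:
            pass  # this key doesn't have the value
        i = 0
        while True:
            try:
                child = winreg.EnumKey(key, i)
            except OSError:
                break  # no more subkeys
            find_value(root, subkey + "\\" + child, name, results)
            i += 1

hits = []
find_value(winreg.HKEY_LOCAL_MACHINE, SEARCH_ROOT, GAME_VALUE, hits)
for path, value in hits:
    mode = "software" if value == 1 else "hardware"
    print(f"{path}: {GAME_VALUE} = {value} ({mode} T&L/VS)")

# To flip a found location to software mode you'd reopen it writable, e.g.:
#   with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
#                       winreg.KEY_SET_VALUE) as k:
#       winreg.SetValueEx(k, GAME_VALUE, 0, winreg.REG_DWORD, 1)

Running it just lists where the driver keeps the value for that game; the actual 0/1 flipping (0 = hardware, 1 = software T&L/VS) is the same toggle described above for regedit.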

It's more important for the GMA X3100, because the hardware is weaker and there is less memory bandwidth. On the X3000, many games that are slower in hardware mode on the X3100 are equal or faster for that reason. The X3500 looks to be 10-15% faster than the X3000, so yeah.

(Some intense effects need hardware to run smoothly, like ship battles in AOE3: while software gets better maximum fps, hardware runs faster in ship fights. In software, when ships start firing at each other, it won't be playable at any graphics settings.)

Iwod: Actually, for IGPs, ATI will be king very soon. Nvidia is better in discrete, yes.
 
Have we even seen anything concrete from Nvidia for IGPs? Maybe they were scared off by AMD. I think I recall something about Nvidia's IGP overheating and being delayed; I hope that's not the case...
 