First (real) GMA X3000/G965 Benchmarks

Hey guys, while searching online for information on the GMA X3000, I came across this thread. I just purchased a board with the GMA X3000 on it (to hold out until I have enough money for one of the newer DX10 cards). Unfortunately, I haven't been able to test it because one of the memory sticks in my dual-channel pack was defective. I'll probably receive the replacement next week. If it makes a difference, I have an E6600 (2.4GHz, 4MB L2) and 2GB of Crucial 667MHz CAS-3 memory (which is what's getting replaced right now).

Once I receive my new memory, I'll try running some benchmarks with the new 14.25 driver for you guys. Games I have: UT2004, HL2, Counter-Strike Source, HL2: Episode 1, and a few other random games. Things are looking really promising for the X3000 as more drivers with more support come out! :D
 
Sorry to resurrect an old thread... Ryan, did you receive the new memory? How about those benchmarks with the 14.25 drivers?
 
Sorry, I almost forgot about this thread! I received the new memory and have been testing everything out for a while.

I don't have FRAPS, so I can't measure the FPS exactly, but I can give you some pretty accurate estimates.

Half-Life 2 - It's playable enough at 800x600 (20-40 fps), but at 1280x768 it starts dropping pretty low, below 20 fps in active scenes. That's with most settings at High except water (simple reflections), trilinear filtering, no AA, and HDR turned off.

Counter-Strike: Source - Runs a bit better than Half-Life 2 when playing online. It's bearable at 1280x768, but for those who need all the extra frames they can get, 800x600 is the way to go.

Half-Life 2: Episode 1 - I kept getting a crash at the loading screen, so I forced DirectX Level 8.1 on it and it worked. It's definitely playable at the same settings as regular HL2 (not surprising), but once again HDR completely cripples the game (about 3-9 fps). Unfortunately, every once in a while I get a BSoD linked to the driver, seemingly out of the blue and completely unprovoked.

Unreal Tournament 2004 - The other Intel IGPs don't really have a problem running it, and neither does the X3000. In fact, it gets nice smooth framerates. However, there's a strange bug I've been encountering a lot. Basically, it turns UT2004 into something you might see when you're really, really drunk: random colors start appearing everywhere, and the whole world gets horrifically mutilated. Eventually this leads to a BSoD with an error linked to the X3000's driver.

Warcraft III: The Frozen Throne - Runs great at 1280x768 with all settings maxed out. I don't know if this has anything to do with the X3000, but I'll randomly get a crash message (memory could not be read) when other people don't. A reinstall didn't fix it, but I have a feeling it has nothing to do with the X3000.

I'm hoping some of the problems I've been encountering (mostly the BSoDs) will be fixed in a driver update, whenever Intel decides to release one. If you have any other games you want me to try running, let me know; I can try downloading a demo or something.

Sorry for the late update!
 
Ryan, I really appreciate the update. Your review seems to be the only one on the web with real-life gaming benchmarks. You rock!

Let's hope those new Intel drivers come out soon.
 
Your review of the 14.25 drivers seems to indicate it scores similarly to the numbers from the Chinese site: http://www.pconline.com.cn/diy/evalue/evalue/main/0608/856535.html

About Warcraft III: Blizzard always makes games catered to the mass market rather than shutting people out with ultra-high graphics requirements. Even the first Extreme Graphics makes it playable.

It's OK, but without numeric comparisons against older driver versions, it's hard to verify whether 14.25 actually does anything.

On a side note, I am glad to say 14.25 is a decent improvement over 14.24:

http://www.forum-3dcenter.org/vbulle...=321049&page=4

Google translation:
G965: "a graphics driver" not yet approved by Intel. "Supports the numerous hardware features." Compared to the old driver, the scores improve by approximately 10-15%. Only Doom 3 jumps, from 4 fps to 10 fps. Benchmarks at 1024x768.
 
Warcraft III will run on a Matrox G200 :) Not really well, but I believe it is playable.
 
So how does the X3100 compare to the FX 5500?

EDIT: Actually, the X3100 is the mobile part. For some reason X3100 users seem to be having more problems than X3000 users on the desktop. Anyway, the following are X3000 results tested on my system. Enjoy!

I'm not sure how it compares to the FX5500. All I know is that with XP and the best currently available drivers, it's about on par with AMD's X1250 IGP.

The drivers are really plaguing Intel. I hope they get fixed, because it looks like this will be their basic architecture for a couple of years to come.

Here are my new benchmarks with 14.31.1. There are no comparisons against previous driver versions, but they should give you an idea of what the G965 graphics can do.

My G965 benchmarks

System:
Intel Core 2 Duo E6600
Intel DG965WH
2x1GB Transcend DDR2-800 5-5-5-15
WD 360GD Raptor for OS and main hard drive
160GB Seagate 7200RPM 8MB SATA2 300 for newer games
14.31.1 graphics driver (aka 6.14.10.4864)
Windows XP SP2

Company of Heroes

800x600 (I can't run higher in every game, as my monitor only optimally supports 1024x768; some games run at 1024x768 and some don't)
Everything at High Settings, AA off

Average: 10.7 fps
High: 27.5 fps
Low: 3.6 fps

1024x768

Average: 27 fps
High: 77 fps
Low: 5.9 fps

Model Quality: High
Texture Detail/Physics/Effects Density/Model Detail: Medium
Shader Quality/Reflections/Building Detail/Tree Quality/Terrain Detail/Effects Fidelity: Low
Shadows/Object Scarring: Off

Average: 27.1 fps
High: 73.7 fps
Low: 6.8 fps

Everything low

Average: 27.3 fps
High: 74.7 fps
Low: 6.8 fps

Supreme Commander

Enabling Shadow Fidelity slows things down a lot; disabling it helps performance more than anything else. Everything Low at the default resolution (1024x768) runs the game at 20-30 fps, and I didn't notice lag. It seems to be limited to 30 fps somehow; probably fillrate becomes the limit at low detail.
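
For what it's worth, here's the back-of-envelope version of that fillrate guess as a minimal Python sketch. The throughput and overdraw numbers are purely my own assumptions for illustration, not measured X3000 specs:

```python
# Rough sanity check: if pixel throughput is the only bottleneck, the frame
# rate ceiling is just throughput divided by pixels drawn per frame.

def fps_ceiling(effective_mpixels_per_s, width, height, overdraw):
    """Upper bound on fps when pixel throughput is the limit."""
    pixels_per_frame = width * height * overdraw  # overdraw = avg writes per screen pixel
    return effective_mpixels_per_s * 1e6 / pixels_per_frame

# An assumed ~240 MPixels/s of effective (post-shading) throughput with ~10x
# overdraw from terrain, units and effects at 1024x768 lands right around 30:
print(round(fps_ceiling(240, 1024, 768, 10), 1))  # 30.5
```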

Everything High with Shadow Fidelity off plays at 6-12 fps. It is possible to play if you want to.

The optimal setting for performance and image quality is Fidelity at Low, then manually setting Level of Detail and Terrain to Medium/High; it doesn't impact performance too much. You can also play with Medium Fidelity and Low/Medium LoD and Terrain Detail.

With Anandtech's uATX benchmark settings (http://www.anandtech.com/mb/showdoc.aspx?i=3072), I get 8.366 average fps, which is on par with the other IGPs; bear in mind I use a faster CPU while Anandtech uses faster memory.

Command & Conquer 3

Command & Conquer 3 needs low settings to be playable. No need to go ultra-low, just low. Putting the overall quality slider all the way down gets you around 30-40 fps. The second-lowest setting is the most optimal Auto setting imo; that results in 25-30 fps. You may want to experiment with it for a better visual quality/performance balance.

I noticed a 50% performance improvement in Company of Heroes at 1024x768 with the MQ settings shown above, going from 14.31 to 14.31.1.

Age of Empires 3

It runs superbly at 1024x768 with Shader Quality at Medium and everything on, getting 25-35 fps. If you want to push it, you can go to High and get 15-20.

Battlefield 2

Has minor graphical glitches on the terrain (also mentioned by Intel). It gets 35-45 fps at 800x600 low quality, and 15-20 fps at 800x600 medium quality. I need to get a timedemo-like tool for this so I can test it properly lol

Bioshock

It reboots the computer with a bluescreen after a while. It gets around 10-15 fps at 800x600 with everything low.

World in Conflict

For some reason I can't get it to load the menu before the computer restarts with a bluescreen.

Wolfenstein: Enemy Territory

1280x1024, Normal: 34.7 fps
1280x1024, High: 32.4 fps

Half-Life 2 gets 30-50 fps at 1024x768 with everything on High. It's the demo though, so I don't know if the full game differs. The numbers in various threads I've seen don't differ much from mine.

Warcraft III got 57 fps with the 14.31.1 driver, using a custom game playback at 8x speed and measuring with FRAPS. The 14.32 driver gets 49 fps.

14.32 is the latest driver, but Intel currently has it pulled because it has numerous bugs. I know from testing that the Battlefield 2 texture corruption is gone with that driver, though, and I'm using it right now without many problems, luckily.

Additional info: Intel has found that some games run faster with software T&L than hardware T&L. So starting with the 14.31 driver, it can switch between software and hardware T&L depending on the app (well, it isn't really dynamic; there's a fixed list in the registry that maps certain games to a certain setting).
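
Out of curiosity, here's a minimal sketch of how you might peek at that kind of per-app list on Windows. To be clear, the registry path and value names below are hypothetical placeholders of my own; Intel doesn't document where the driver keeps these overrides:

```python
# Sketch only: enumerate value-name/value pairs under a driver registry key.
# NOTE: the key path is a made-up placeholder, not Intel's actual location.
import winreg

HYPOTHETICAL_KEY = r"SOFTWARE\Intel\GMA\AppProfiles"  # placeholder, not confirmed

def dump_overrides(path=HYPOTHETICAL_KEY):
    """Print value-name/value pairs under a key (e.g. 'game.exe' -> 0 or 1)."""
    try:
        key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
    except FileNotFoundError:
        print("no such key:", path)
        return
    with key:
        i = 0
        while True:
            try:
                name, value, _vtype = winreg.EnumValue(key, i)
            except OSError:  # raised once we run out of values
                break
            print(f"{name} = {value!r}")
            i += 1

if __name__ == "__main__":
    dump_overrides()
```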

In Vista, it's pretty chaotic, especially with the laptop X3100 chip. I hope they get this fixed.

EDIT2: The impression I got is that it's a pretty good IGP when the games are more modern. Games like Far Cry and BF2 seem to run well. But go older, to WoW, WC3, Civ4, or Q3A, and while it's acceptable to play, I wouldn't say it's faster than the GMA950.

Generally, the more shader-intensive (as opposed to pixel-fillrate-bound) a game is, the better it will perform.
 
Don't get the GMA 3100 confused with the GMA X3x00 series. The GMA 3100 is more like the GMA 950, architecturally. With regard to performance they are both pretty bad, however; no real advantage to either, AFAIK.
 
Swaaye, there's no non-X version on mobile. It just seems that for some reason the X3100 is somewhat different architecturally from the X3000, because people with the X3100 have more problems than people with the X3000.
 
Between GMA 3x00 and GMA X3100? The X3x00 has the unified shader architecture, which supposedly gives it better Direct3D compatibility. The X3x00 also has hardware vertex processing, which hopefully means some games that wouldn't let you use DX8/9 effects on the GMA 900, 950, 3000, and 3100 now will. Both are about the same speed though; very slow. GMA 950 and newer work great for Vista Aero, however.
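
One way to see the scheduling side of the unified-shader argument is a toy model (my own illustration, not anything from Intel's docs): with a fixed vertex/pixel split, whichever pool fits the workload worst becomes the bottleneck, while a unified pool of the same total size soaks up whatever mix the frame has:

```python
# Toy model of one benefit of unified shaders: fixed pools idle units when
# the workload mix shifts; a unified pool keeps every unit busy.

def frame_time_split(vertex_work, pixel_work, vertex_units, pixel_units):
    """Fixed pools: the frame finishes when the slower pool finishes."""
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def frame_time_unified(vertex_work, pixel_work, units):
    """Unified pool: total work just divides across all units."""
    return (vertex_work + pixel_work) / units

# A vertex-light, pixel-heavy frame on 8 total units:
print(frame_time_split(10, 90, 4, 4))   # 22.5 -- pixel pool is the bottleneck
print(frame_time_unified(10, 90, 8))    # 12.5 -- no idle units
```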
 
There are four versions of GMA 3xxx/X3xxx IGPs.

GMA 3000
GMA X3000
GMA 3100
GMA X3100

The GMA 3000, X3000, and X3100 are basically all the same architecture. The GMA 3000 is the version with disabled vertex shaders and OpenGL 1.4 support, the X3000 is the desktop version with full SM3.0 support and OpenGL 1.5, and the X3100 is the mobile variant. They all have unified shaders. Oh, and the X variants have Clear Video.

The GMA 3100 is a different part from those. I found out from a user who has the GMA 3100 (G33 chipset) that it's basically a GMA950 core with Clear Video.
 
It definitely was my impression that GMA 3000 was NOT unified, and simply based on their previous architecture. Do you have any sources/links for that? Thanks! :)
 
It definitely was my impression that GMA 3000 was NOT unified, and simply based on their previous architecture. Do you have any sources/links for that? Thanks!

Yes, both from Intel's site and from talks with a user who has a GMA 3000.

http://softwarecommunity.intel.com/articles/eng/1487.htm

Under Business SKU vs. Consumer SKU

The GMA 3000 and GMA X3000 are both listed as having 8 EUs (Execution Units).

One major difference between Gen 4 and the previous-generation IGPs is that Gen 4 doesn't have Zone Rendering. I know that zone-renderer-based IGPs get pretty close to maxing out their theoretical fillrate in 3DMark tests (there's a sketch of the idea after the numbers below). So I found a user with a 946GZ, which is a GMA 3000, who also got a G31, which from the naming sounds like a GMA 3100.

Here was his response:

G31 (DDR2-800, E4400) vs. 946GZ (DDR2-667, E6300):

3DMark score: 2150 / 1764

Game 1: 80.2 fps / 61.1 fps
Game 2: 13.7 fps / 11.9 fps
Game 3: 12.8 fps / 10.1 fps
Game 4: 11.8 fps / 10.4 fps

CPU test 1: 77.5 fps / 67.7 fps
CPU test 2: 16.7 fps / 14.6 fps

Single-texturing fillrate: 1503 MTexels/s / 731 MTexels/s
Multitexturing fillrate: 1506 MTexels/s / 1696 MTexels/s
Vertex shader: 4.4 fps / 5 fps
Pixel shader: 12.8 fps / 12.7 fps
Ragtroll: 8.6 fps / 8.9 fps

The G31 behaves similarly to the GMA950, and the 946GZ similarly to the G965.

The 946GZ and G965 with single-channel memory should score lower in fillrate tests than with dual-channel, while GMA950-based ones shouldn't be affected (I know this because I had a GMA950 motherboard and used a Celeron D with single-channel RAM).
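
The arithmetic behind that is simple; here's a quick sketch using theoretical peak numbers (real sustained bandwidth is lower, which is part of why the fillrate gain in the numbers below is well short of 2x):

```python
# Theoretical peak memory bandwidth for an IGP that renders straight into
# system RAM: one DDR2 channel is 64 bits (8 bytes) wide.

def ddr2_bandwidth_gb_s(mt_per_s, channels, bus_bytes=8):
    """MT/s x bytes per transfer x number of channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

print(ddr2_bandwidth_gb_s(667, 1))  # ~5.3 GB/s, single-channel DDR2-667
print(ddr2_bandwidth_gb_s(667, 2))  # ~10.7 GB/s, dual-channel
```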

946GZ SC (single channel):

Game tests
g1: 42.2 fps
g2: 8.9
g3: 7.9
g4: 8.2

CPU tests
test 1: 56.4 fps
test 2: 13.1 fps

Feature tests
fillrate single: 487.8 MTexels/s
fillrate multi: 1647.7 MTexels/s
vertex shader: 5 fps
pixel shader 2.0: 9.5 fps
Ragtroll: 6.4 fps

946GZ DC (dual channel):

Game tests
g1: 61.1 fps
g2: 11.9
g3: 10.1
g4: 10.4

CPU tests
test 1: 67.7 fps
test 2: 14.6 fps

Feature tests
fillrate single: 731.4 MTexels/s
fillrate multi: 1696.3 MTexels/s
vertex shader: 5 fps
pixel shader 2.0: 12.7 fps
Ragtroll: 8.9 fps

Now isn't that interesting :).
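
Since Zone Rendering is doing the heavy lifting in those numbers, here's a minimal sketch of the idea, my own simplification rather than Intel's actual implementation: bin triangles into screen tiles first, then rasterize one tile at a time in a small on-chip buffer, so color/Z traffic rarely touches system RAM. An immediate-mode part like the G965 reads and writes the full framebuffer in system memory instead, which is why its fillrate tracks memory bandwidth:

```python
# Minimal tile-binning sketch (the first stage of a zone/tile renderer).
TILE = 32  # hypothetical tile size in pixels

def bin_triangles(triangles, width, height):
    """Assign each triangle (by its bounding box) to the tiles it may cover."""
    tiles = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        x0, x1 = max(min(xs) // TILE, 0), min(max(xs) // TILE, (width - 1) // TILE)
        y0, y1 = max(min(ys) // TILE, 0), min(max(ys) // TILE, (height - 1) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tiles.setdefault((tx, ty), []).append(tri)
    return tiles  # each tile's list is then rasterized in on-chip memory

# e.g. two triangles on a 64x64 screen land in two separate tiles:
tris = [[(0, 0), (10, 0), (0, 10)], [(40, 40), (60, 40), (40, 60)]]
print(sorted(bin_triangles(tris, 64, 64)))  # [(0, 0), (1, 1)]
```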
 
If you look at how the specifications for GMA 3000 and 3100 just happen to align with GMA 950 instead of GMA X3000, I thought it was pretty clear the non-X's are just a refreshed GMA 950...

GMA 950 has actually outperformed GMA X3000 in tests I've seen. :) I set up a desktop C2D with 945G, and the GMA 950 benched almost as fast as a Radeon 8500 in 3DMark2001. Too bad it has big compatibility problems, with games not recognizing shader support and such. Morrowind wouldn't give me DX8 pixel water, and Max Payne 2 didn't see it as a DX9-compliant card.
 
If you look at how the specifications for GMA 3000 and 3100 just happen to align with GMA 950 instead of GMA X3000, I thought it was pretty clear the non-X's are just a refreshed GMA 950...

Yea well I saw two different cores for the 3000 and 3100 :).

GMA 950 has actually outperformed GMA X3000 in tests I've seen. I set up a desktop C2D with 945G, and the GMA 950 benched almost as fast as a Radeon 8500 in 3DMark2001. Too bad it has big compatibility problems, with games not recognizing shader support and such. Morrowind wouldn't give me DX8 pixel water, and Max Payne 2 didn't see it as a DX9-compliant card.

In older games the X3000 sucks, but it blows the GMA950 away in newer games. In games like Half-Life 2 and Far Cry (where the GMA950 has no compatibility problems), it's 2-3x faster with the newer drivers.

Of course, the GMA950 is faster in older games like, say, Quake 3, probably because it has more usable fillrate than the X3000. It's like they sacrificed top frame rates in older games to maintain playability in newer ones.
 
Update on Battlefield 2 performance. I had said:

"Battlefield 2

Has minor graphical glitches on the terrain(also mentioned by Intel). It gets 35-45 fps with 800x600 low quality, and 15-20 fps with 800x600 medium quality. I need to get timedemo like thing for this so I can test it properly lol"

I used FRAPS for testing. Then I heard a guy on another forum say it runs better without FRAPS. So I cranked the details all to High and put the resolution at 1024x768. Guess what? It goes from 5-15 fps with FRAPS running (too much variance to be playable) to playable without it. It lags a bit in the beginning, but after a few minutes most of the initial lag disappears. I'd estimate I'm getting 15-20 fps.

FRAPS impacts the performance of the X3000 much more than I thought. It's not a good benchmark program, at least for this IGP. The 14.32 driver also eliminates the texture glitch present in the previous drivers.
 
Well, I guess I'm the only one who cares enough to benchmark anyway. Here's one for Half-Life 2: Lost Coast (all numbers in fps).

Lowest
640x480: 40.7
800x600: 37.59
1024x768: 30.83

Medium (Model Detail/Texture Detail: Medium, Water Detail: Reflect World, all else same as Lowest)
640x480: 41.66 (not kidding, it came out a little better than Lowest; I ran Lowest again to confirm)
800x600: 35.17
1024x768: 29.39

High (Model Detail/Texture Detail/Shader Detail/Water Detail/Shadow Detail: High, Trilinear Filtering)
640x480: 37.37
800x600: 33.52
1024x768: 27.09

High + HDR + Bloom + 16x Anisotropic Filtering
1024x768: 19.03
 
Oh no, do continue posting benchmarks. I only have a MacBook (GMA950), so I can neither confirm nor deny your X3100 (or similar) scores. But I do enjoy reading them.
 