My Intel X3100 does DX10 :)

Funny thing I noticed:
I installed Windows 7 RC1 on my laptop because I wanted to try an x64 OS on it.
And it actually came bundled with a WDDM 1.1 driver for the X3100.
I also recently installed Windows 7 on my X61 tablet (with an X3100), and I noticed that osu! (the only game I really run on it) averages ~170 FPS with the same settings and driver version that get ~105 FPS on Vista.
 
I tried 3DMark05 and 3DMark06 on my laptop, and the scores in Win7 were much lower than in Vista.
But Vista is 32-bit and Windows 7 is 64-bit, so I'm not sure if it's all because of the drivers themselves. I've never tried Vista 64-bit on that machine.
 
Mine are both 32-bit versions. I'll try 3DMark05 when I get the time. (Note that I wouldn't be too surprised if I got similar results to yours; I expect osu! has a very different GPU utilization pattern than a normal 3D game.)
 
All I can say is that on my desktop PC I also have Vista and Windows 7... but they are both x64, and I use a GeForce 8800GTS there. In that case, Windows 7 does score slightly better in 3DMark06, and about equal in Vantage.
But I used the latest nVidia beta drivers, not the bundled Windows 7 drivers. I don't think I can get newer Intel drivers for Windows 7 anywhere... and even if I could, I wonder if Intel put in any time to optimize the drivers like nVidia has.
 

Could the lower Windows 7 scores be because the Vista driver installs the registry switch for software VS while the Windows 7 driver doesn't? That would mean Windows 7 is running hardware VS, and Intel has said that software VS is faster in 3DMark (as have a bunch of X3100 users online).
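For what it's worth, the hardware/software split isn't only a registry thing; D3D9 exposes the same choice when the application creates its device. A minimal sketch (the helper function is hypothetical, and this is the API-level knob, not Intel's registry switch):

Code:
// Sketch: choosing software vs hardware vertex processing in D3D9.
// CreateDeviceWithVP is a hypothetical helper; error handling trimmed.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceWithVP(IDirect3D9* d3d, HWND hwnd, bool softwareVS)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    // Software VS runs vertex shaders on the CPU, hardware VS on the GPU.
    DWORD vp = softwareVS ? D3DCREATE_SOFTWARE_VERTEXPROCESSING
                          : D3DCREATE_HARDWARE_VERTEXPROCESSING;

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, vp, &pp, &device);
    return device;
}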

Well, I found out something even more funky...
I thought I'd uninstall the driver and do a clean reinstall.
After that, my OpenGL support dropped down to 1.1 and only lists 2 extensions.
Previously I had about 63 extensions and 2.0 support.
Looks like Intel broke something.
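If anyone wants to verify what their driver is actually reporting after a reinstall, here's a quick sketch; it assumes you already have a current GL context on the calling thread:

Code:
// Sketch: print the GL version string and count the reported extensions.
// Assumes a GL context is already current on this thread.
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

void PrintGLCaps()
{
    printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));

    // In GL 1.x/2.x, GL_EXTENSIONS is one space-separated string.
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    int count = 0;
    if (ext && *ext)
    {
        count = 1;
        for (const char* p = ext; *p; ++p)
            if (*p == ' ' && p[1] != '\0') // ignore a trailing space
                ++count;
    }
    printf("extensions:  %d\n", count);
}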

I've read about that. I forget how the user fixed it, but there's definitely a driver initialization problem, and it might be intermittent.

Here's my report from GPU Caps Viewer 1.7.0:

Code:
<?xml version="1.0" encoding="ISO-8859-1" standalone="yes"?>
<!--**********************************************************
* GPU Caps Viewer v1.7.0
* (C)2007-2008 Jerome [JeGX] Guinot
* http://www.ozone3d.net/gpu_caps_viewer/
***********************************************************-->

<gpu_viewer>

	<system>
		<cpu 
			name="Intel(R) Core(TM)2 CPU          6600  @ 2.40GHz"
			core_speed_mhz="2397"
			num_cores="2"
			family="6"
			model="15"
			stepping="6"
		/>
		<memory 
			megabytes_physical_amount="2021"
		/>
		<operating_system 
			description="Windows XP build 2600 [Service Pack 3]"
			directx_version="9.0c"
			physx_version="0"
		/>
	</system>
	<graphics_controller>
		<renderer 
			model_from_opengl="Intel 965/963 Graphics Media Accelerator"
			model_from_drivers="Intel(R)  G965 Express Chipset Family"
			model_from_db="Unknown"
			device_description="Intel(R)  G965 Express Chipset Family"
			adapter_string=""
			vendor="Intel"
			vendor_id="32902"
			device_id="10658"
			drivers="6.14.10.5016 (12-12-2008)"
			gpu_codename="n.a."
			gpu_unified_sp="0"
			gpu_vertex_sp="0"
			gpu_pixel_sp="0"
			video_memory_size_megabytes="384"
			bios_string="Intel Video BIOS"
			cur_display_mode="1280x1024 @ 60 Hz - 32 bpp"
		/>
		<opengl_caps 
			opengl_version="156903792"
			glsl_version="1.10  - Intel Build 7.15.10.5016"
			arb_texture_units="8"
			vertex_texture_units="16"
			pixel_texture_units="16"
			geometry_texture_units="0"
			max_texture_size="2048"
			anisotropy="0"
			point_sprite_size="0"
			num_dynamic_lights="16"
			max_viewport_size="2048"
			max_vertex_uniform_components="512"
			max_fragment_uniform_components="1024"
			max_geometry_uniform_components="0"
			max_varying_float="41"
			max_vertex_bindable_uniforms="0"
			max_fragment_bindable_uniforms="0"
			max_geometry_bindable_uniforms="0"
			fbo="yes"
			max_mrt_draw_buffers="7"
			pbo="yes"
			s3tc="yes"
			ati_3dc="no"
			texture_rectangle="yes"
			floating_texture="no"
		/>
		<opengl_extensions num_extensions="69" >
			<li>GL_EXT_blend_minmax</li>
			<li>GL_EXT_blend_subtract</li>
			<li>GL_EXT_blend_color</li>
			<li>GL_EXT_abgr</li>
			<li>GL_EXT_texture3D</li>
			<li>GL_EXT_clip_volume_hint</li>
			<li>GL_EXT_compiled_vertex_array</li>
			<li>GL_SGIS_texture_edge_clamp</li>
			<li>GL_SGIS_generate_mipmap</li>
			<li>GL_EXT_draw_range_elements</li>
			<li>GL_SGIS_texture_lod</li>
			<li>GL_EXT_rescale_normal</li>
			<li>GL_EXT_packed_pixels</li>
			<li>GL_EXT_separate_specular_color</li>
			<li>GL_ARB_multitexture</li>
			<li>GL_EXT_texture_env_combine</li>
			<li>GL_EXT_bgra</li>
			<li>GL_EXT_blend_func_separate</li>
			<li>GL_EXT_secondary_color</li>
			<li>GL_EXT_fog_coord</li>
			<li>GL_EXT_texture_env_add</li>
			<li>GL_ARB_texture_cube_map</li>
			<li>GL_ARB_transpose_matrix</li>
			<li>GL_ARB_texture_env_add</li>
			<li>GL_IBM_texture_mirrored_repeat</li>
			<li>GL_EXT_multi_draw_arrays</li>
			<li>GL_NV_blend_square</li>
			<li>GL_ARB_texture_compression</li>
			<li>GL_3DFX_texture_compression_FXT1</li>
			<li>GL_EXT_texture_filter_anisotropic</li>
			<li>GL_ARB_texture_border_clamp</li>
			<li>GL_ARB_point_parameters</li>
			<li>GL_ARB_texture_env_combine</li>
			<li>GL_ARB_texture_env_dot3</li>
			<li>GL_ARB_texture_env_crossbar</li>
			<li>GL_EXT_texture_compression_s3tc</li>
			<li>GL_ARB_shadow</li>
			<li>GL_ARB_window_pos</li>
			<li>GL_EXT_shadow_funcs</li>
			<li>GL_EXT_stencil_wrap</li>
			<li>GL_ARB_vertex_program</li>
			<li>GL_EXT_texture_rectangle</li>
			<li>GL_ARB_fragment_program</li>
			<li>GL_EXT_stencil_two_side</li>
			<li>GL_ATI_separate_stencil</li>
			<li>GL_ARB_vertex_buffer_object</li>
			<li>GL_EXT_texture_lod_bias</li>
			<li>GL_ARB_occlusion_query</li>
			<li>GL_ARB_fragment_shader</li>
			<li>GL_ARB_shader_objects</li>
			<li>GL_ARB_shading_language_100</li>
			<li>GL_ARB_texture_non_power_of_two</li>
			<li>GL_ARB_vertex_shader</li>
			<li>GL_NV_texgen_reflection</li>
			<li>GL_ARB_point_sprite</li>
			<li>GL_EXT_blend_equation_separate</li>
			<li>GL_ARB_depth_texture</li>
			<li>GL_ARB_texture_rectangle</li>
			<li>GL_ARB_draw_buffers</li>
			<li>GL_ARB_pixel_buffer_object</li>
			<li>GL_WIN_swap_hint</li>
			<li>GL_EXT_framebuffer_object</li>
			<li>WGL_ARB_buffer_region</li>
			<li>WGL_ARB_extensions_string</li>
			<li>WGL_ARB_make_current_read</li>
			<li>WGL_ARB_pixel_format</li>
			<li>WGL_ARB_pbuffer</li>
			<li>WGL_EXT_extensions_string</li>
			<li>WGL_EXT_swap_control</li>
		</opengl_extensions>
	</graphics_controller>
	<graphics_drivers>
		<driver url="http://www.geeks3d.com/?page_id=752" />
	</graphics_drivers>
	<graphics_cards_reviews>
			<review url="http://www.geeks3d.com/?cat=3" />
	</graphics_cards_reviews>

</gpu_viewer>
 
Well, I don't want to create a whole new topic for this... but...
My Intel X3100 does DX11 :)
Yeah, okay, it only does it at the DX10 downlevel, and currently there's no DirectCompute support... But nevertheless, I got the Platform Update for Windows Vista working, installed the latest DirectX SDK and runtime, and the DX11 stuff runs :)

Okay, the beta runtimes from the March SDK ran as well, but I think I forgot to mention that :)
I've been using them for a while to develop my D3D11 engine anyway. It was quite practical that it worked on my laptop as well, so I didn't need to be at my desktop to do some work on D3D11.
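For anyone curious how "D3D11 on DX10 hardware" works at the API level: you pass D3D11CreateDevice a list of feature levels and let the runtime fall back to whatever the GPU supports. A sketch (error handling trimmed); on an X3100 this should come back with D3D_FEATURE_LEVEL_10_0:

Code:
// Sketch: create a D3D11 device that falls back through feature levels,
// so the D3D11 API is usable even on DX10-class hardware like the X3100.
#include <windows.h>
#include <d3d11.h>

ID3D11Device* CreateD3D11Device()
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, // what a DX10 part like the X3100 gets
    };

    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL got;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
        &device, &got, nullptr);

    return SUCCEEDED(hr) ? device : nullptr;
}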
 
For those who haven't seen it yet, Anandtech did a bit of GPU-overclocking in their Core i3 review:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3724&p=4
It actually works! It's still the same basic architecture as the X3100 that I started this topic on... but as far as I know, it wasn't possible to overclock those. Apparently things are different now that the GPU is integrated into the CPU. And it pays off too... they managed to get it to 1200 MHz, at which point it becomes a very good alternative to nVidia's or AMD's integrated graphics offerings.
 
Same basic architecture, maybe, but the additions related to HSR (hidden surface removal) REALLY helped performance. Intel calls it Gen 5.75 (the 4500 being Gen 5), but performance-wise it's more like Gen 7. The lack of decent HSR is probably why the X3100/4500 was so bandwidth-sensitive and why performance was below pathetic in certain games.
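For what it's worth, applications could partly compensate for weak HSR themselves with a depth pre-pass: lay down Z first, then shade only the pixels that are actually visible. A rough D3D9 sketch of the idea (not something these games necessarily did):

Code:
// Sketch: depth pre-pass to cut overdraw on GPUs with weak HSR.
#include <windows.h>
#include <d3d9.h>

void RenderWithZPrepass(IDirect3DDevice9* dev /*, scene */)
{
    // Pass 1: depth only. Disable color writes so this pass stays cheap.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0);
    dev->SetRenderState(D3DRS_ZFUNC, D3DCMP_LESSEQUAL);
    dev->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);
    // ... draw all opaque geometry with a trivial shader ...

    // Pass 2: full shading, but only where depth matches pass 1.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0x0F); // RGBA on
    dev->SetRenderState(D3DRS_ZFUNC, D3DCMP_EQUAL);
    dev->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
    // ... draw the same geometry with the real shaders ...
}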

That's probably also why hardware VS performance suffered. The chip was already short on bandwidth, and doing geometry processing in hardware only demanded more of it, hence the benefit of offloading T&L to the CPU in software. Even so, hardware VS scores in benchmarks like 3DMark generally reflected the chip's real-world gaming capabilities better than software VS scores did.
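Incidentally, D3D9's mixed mode lets an application flip between the two paths at runtime, which is handy for exactly this kind of comparison. A sketch, assuming the device was created with D3DCREATE_MIXED_VERTEXPROCESSING:

Code:
// Sketch: toggling between software and hardware vertex processing at
// runtime. Only legal on a D3DCREATE_MIXED_VERTEXPROCESSING device.
#include <windows.h>
#include <d3d9.h>

void BenchmarkBothVSPaths(IDirect3DDevice9* device)
{
    device->SetSoftwareVertexProcessing(TRUE);   // vertex shaders on the CPU
    // ... render and time the scene here ...

    device->SetSoftwareVertexProcessing(FALSE);  // vertex shaders on the GPU
    // ... render and time the scene again ...
}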
Here are some benchmarks I ran on my Core i5 661 (I'm using the iGPU):

Core i5 661 (GMA HD)

3DMark01 fill rate:
1.5 GTexels/s single texture
3.55 GTexels/s multi texture

(hardware VS / software VS)
3DMark01: 12919 / 12007
3DMark05: 4006 / 2245 / 3996
3DMark06: 2089 / 1716

Old GMA X3000

3DMark01 fill rate:
590 MTexels/s single texture
1.4 GTexels/s multi texture

(hardware VS / software VS)
3DMark01: 4357 / 7607
3DMark05: 896 / 1464
3DMark06: 545 / 723

I'm happy to say that hardware VS is much faster than software VS now; it's often 2x faster. The new drivers also make switching between the two easy, not that I need it, since hardware is so much better.
 