XP Media Center - Start automatically?

Sobek

Hey guys.

I was given a small HTPC system by a friend yesterday to tinker with, as I'm shortly going to be building one of my own for use at our office to display pictures and video to potential customers. The MCE interface has always caught my attention, so I've decided on that.

My question is: is there any way to make the Media Center interface run on startup? So, from the moment Windows logs in, it just opens straight into Media Center?

That's about it. Google is proving fruitless, so I hope someone has a solution. Thanks!
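
*edit* One thing I'm planning to try is simply launching the Media Center shell at logon via a Run key. Here's a rough sketch of the idea (Python purely for illustration; the ehshell.exe path is where MCE normally keeps its shell, so double-check it on your install, and the value name is just something I made up):

Code:
import os

try:
    import winreg                      # Python 3
except ImportError:
    import _winreg as winreg           # Python 2, which is what an XP box would have

# Usual location of the Media Center shell on MCE 2005 -- double-check it exists.
ehshell = os.path.expandvars(r"%windir%\ehome\ehshell.exe")

# Add a per-user Run entry so the shell launches at every logon.
# "MediaCenterShell" is just an arbitrary value name.
key = winreg.OpenKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\Windows\CurrentVersion\Run",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "MediaCenterShell", 0, winreg.REG_SZ, ehshell)
winreg.CloseKey(key)
print("Will launch %s at logon" % ehshell)

A plain shortcut to ehshell.exe dropped into the Startup folder should amount to the same thing.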
 
Ok..having another issue with it now.

The system MCE is installed on is an all-in-one s478 board with a Celeron D 2.26GHz and 512MB RAM. Latest drivers for the Intel onboard graphics (14.7), along with chipset and audio. Now, MCE works perfectly at EVERY resolution except 1680x1050... the native res of the screen I'm going to be using here at work.

Can anyone offer any suggestions as to why that mode alone results in a black screen? I get a mouse cursor, and I can double-click on the border of the screen to reduce the MCE GUI to a window, and that works... I can also close it by hitting the (invisible) 'X' at the top right. But in fullscreen mode... blank, nothing.

 
Still not working. So far I've tried:

Reinstalling all drivers: audio, video, chipset, codecs (ffdshow, Xvid). I've also tried adjusting acceleration options in DXDIAG in the hope that it would somehow provide an indication that the problem lies with the onboard video (no spare video card to test with at present, and I wouldn't want to use one in this case anyway).

Still... every single damn resolution works just fine, except 1680x1050. I can interact with it and all, I just can't see a damn thing. The HELP screen seems to work just fine through the GUI though... what the hell gives.

*edit* ...OK, so I reinstalled the video drivers again, went into DXDIAG and disabled Direct3D acceleration... and lo and behold, it all works fine, albeit at a slideshow pace (also, whenever I run it with this disabled it gives me the annoying 'your card is not supported' spiel). Re-enabling it results in the blank screen again.

I can't understand where the problem is...
 
This problem sounds weird, though I suppose it must be the display drivers. I tried Googling a bit and couldn't find anything useful, except maybe this:

turns out that MCE gives me a black screen if refresh rate is not 60
 
Ok... I gave up and took an FX 5200 out of one of my systems and threw that in (crammed it in, more like). And magically enough, everything works fine (so I guess Intel onboard is perfect for everything but MCE at 1680x1050? Pathetic).

Now my problem is that I can't use just DVI. If I only hook up a DVI-to-DVI cable from the screen to the FX, I get a blank screen and can never see anything, from the moment the system powers on to the Windows desktop. If I then plug in the analog connection, the screen comes on and all's good. But then, if I try selecting the Digital (DVI) input in the screen's OSD, it just blanks out and shows the message "No Digital Signal" (as if the cable were unplugged). I tried hitting OK on the Digital input and then quickly unplugging the analog, for the hell of it, but that made no difference. I've tried Clone display mode in the NVIDIA drivers, but it still only ever outputs an image to the analog connection.

Just one thing after another eh...any ideas?

*edit* Ugh... it just gets weirder.

Before, when I tried to set Clone mode, not only did it not seem to affect the DVI input, it also limited me to 1600x1200 on the resolution. Now I've just tried Clone again with both the DVI and analog plugged in (the display has both inputs; without the analog in I can't see anything in order to switch to DVI, and without the DVI in it of course doesn't detect it and I'd never be able to switch to it anyway). Clone mode is now allowing me to use the normal 1680x1050 without hassle, but I STILL can't switch over to the DVI input. When I select the Digital input in the screen's OSD, the screen flashes as if it were changing over, then just goes back to Analog (as evidenced by the little blue box that appears at the top right saying 'Analog'). If I try to set the DVI as the primary screen in Clone mode (or even just as a single display), it does the same thing: acts like it's switching to DVI, then pops back to Analog within seconds. Sometimes when I'm fiddling with it and clicking god knows what, it'll try to switch but end up saying "No DVI signal", even though the bloody cable is plugged in!

This is pissing me right off...A natively DVI screen that doesn't want to use DVI? Geez.
 
Related to your older problem, I eventually found this:
I remember a similar problem with my old video card. MCE was visible in a window, but when I went full screen, no picture (black). I think that MCE was "optimizing" my refresh rate. I fixed it by "locking" the settings with Powerstrip.
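
Before reaching for PowerStrip, it might be worth confirming what refresh rate Windows is actually driving the desktop at when it's set to 1680x1050. A quick way to check (a small Python sketch using the standard GDI GetDeviceCaps call via ctypes; nothing in it is MCE-specific):

Code:
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

HORZRES = 8       # GetDeviceCaps index: current desktop width in pixels
VERTRES = 10      # GetDeviceCaps index: current desktop height in pixels
VREFRESH = 116    # GetDeviceCaps index: current vertical refresh rate in Hz

hdc = user32.GetDC(None)                        # device context for the whole screen
width = gdi32.GetDeviceCaps(hdc, HORZRES)
height = gdi32.GetDeviceCaps(hdc, VERTRES)
refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)    # 0 or 1 means "hardware default"
user32.ReleaseDC(None, hdc)

print("Desktop is %dx%d @ %d Hz" % (width, height, refresh))

If that doesn't report 60 at 1680x1050, forcing 60 Hz in the display properties (or locking it with PowerStrip, as above) would be the first thing to try.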

Your new problem could be related to the DVI cable. Are you sure it's working fine? I've seen cases where a monitor or projector behaves just like you described because the cable signal wasn't strong enough; either the cable was too long or it was faulty. Though I also found this:
NVIDIA cards will not work with the DVI information (EDID) received from some monitors, and there is little you can do aside from leaving another monitor or a VGA dummy hooked up to clone for some resolutions to work. I have a set that works with ATI with nothing extra required, but will not work with any NVIDIA card that I've tried with recent drivers unless I clone a monitor to get any resolution over 800x600.

and this:
With DVI, the monitor only works with the timings the EDID has programmed, and it could well be that a bad setting is his problem; on the other hand, if the list of modes is not correct, it will stop logical resolutions from being loaded. I have one LCD monitor with a good chip and good EDID programming, and it works at any display setting listed; the one with a poor chip or poor programming has modes listed that don't exist. My LCD with proper EDID has worked with every card I've ever hooked it up to, first time every time, at 1280x720 @ 60 or any other display setting listed.
It's not an NVIDIA problem; NVIDIA is just stricter about using the display modes as sent over the DVI connection, and with some monitors that information just isn't relayed properly. There is a program that reads the EDID (http://www.entechtaiwan.com/util/moninfo.shtm), and to my surprise the software reads the programming on the monitor that does not work properly and everything appears fine, but the listed modes are very strange. With some monitors tweaking won't help. Other manufacturers may get around this by allowing logical display settings to be used without strict regard to the EDID info.

Not sure if those are helpful.
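
If you want to see what the panel is actually advertising without installing anything, the EDID block Windows has cached is also sitting in the registry. Here's a rough sketch that pulls it out and decodes the preferred timing (Python again, purely as an illustration; it reads the standard Enum\DISPLAY location and only looks at the first detailed timing descriptor):

Code:
try:
    import winreg                       # Python 3
except ImportError:
    import _winreg as winreg            # Python 2 on the XP box

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield every subkey name of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

def read_edids():
    """Yield (monitor id, raw EDID bytes) for every monitor Windows has cached."""
    display = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE)
    for dev in subkeys(display):
        dev_key = winreg.OpenKey(display, dev)
        for inst in subkeys(dev_key):
            try:
                params = winreg.OpenKey(dev_key, inst + r"\Device Parameters")
                edid, _ = winreg.QueryValueEx(params, "EDID")
                yield dev, bytearray(edid)
            except OSError:
                pass                    # instance with no cached EDID

def preferred_mode(edid):
    """Decode the first detailed timing descriptor (bytes 54-71 of the EDID)."""
    d = edid[54:72]
    clock_khz = (d[0] | (d[1] << 8)) * 10            # stored in 10 kHz units
    if clock_khz == 0:
        return None                                  # not a timing descriptor
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    hz = (clock_khz * 1000.0) / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, hz

for dev, edid in read_edids():
    mode = preferred_mode(edid)
    if mode:
        print("%s: preferred mode %dx%d @ %.1f Hz" % ((dev,) + mode))

If 1680x1050 doesn't show up as the preferred mode there, that would line up with the bad-EDID theory in the quotes above.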
 
They might just be useful yet...

I just arrived at 'work' for the day, so I'm going to go have some fun with that system shortly. Food is the first priority, though.

The cabling I'm using is the stock DVI-to-DVI and VGA-to-VGA cables that came with the monitor. I've brought in an extra DVI cable to test out, some over-expensive 'titanium series' gold-plated cable... fingers crossed. :)
 