Monitor driver / EDID expertise needed

tabs

I hate to do this, but this problem has been with me for a few months now, and the lacklustre support I've had from LG means I'm a bit stuck; my google-fu is weak on this one.

Basically my monitor has a problem with DVI: Windows doesn't recognize that there's even a monitor there.

It does not do this on a fresh install. However, I do get prompted to reboot, presumably so Windows can load what it thinks is the correct driver for the monitor. At that point DVI stops working.

Things I've tried:
  • two different graphics cards
  • manually uninstalling the driver
  • manually installing the correct driver (lots of ways I can't even remember now)
  • monitor on a different windows pc (does the same thing)

My monitor works just fine with DVI as long as it's not being used by windows. My Raspberry Pi works with it. BIOS works with it.

I'm currently of the belief that I have somehow fried the EDID, and that this is stopping Windows from doing what it wants to do properly.

Sadly this monitor is a 120 Hz jobby, and I can't get the benefit of that via HDMI. It's also a pain, as I'd like to keep the HDMI output for my projector rather than swapping cables over constantly.

The graphics board I was using when the monitor started doing this has since had that particular output die completely, so I presume it damaged the monitor in one of its dying throes.

What I would like to try, basically, is to get Windows to not use the EDID, or the driver it wants to use, and use something else instead. Does anyone have any knowledge of this?
 
Typically you'd want to play with software such as Powerstrip. See whether you can read the EDID at all; if not, use a "fake" EDID, which can perhaps be done by cooking up a custom monitor "driver". Finding the monitor's original EDID content, or obtaining it on some web forum from someone who runs the same monitor, might help; or just tell Powerstrip/Windows/the graphics driver to ignore the EDID and send a signal that should give 1080p at 120 Hz. If that dreaded EDID doesn't want to play nice, maybe we don't need that fscking EDID :p. I'm not sure whether Windows will still think there is no monitor, or whether it can be coerced into thinking otherwise. More than expertise, this is a matter of bashing at stuff till it works, so I'm not sure the above will help.
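Since the Raspberry Pi drives the monitor fine, one hedged idea: if the Pi is feeding the DVI input (e.g. via an HDMI-to-DVI cable), a Linux kernel with KMS exposes whatever EDID it managed to read under /sys/class/drm, so you may be able to salvage a copy from there. The connector name below is a placeholder; it varies by board and driver:

$ ls /sys/class/drm/                # find the connected connector, e.g. card0-HDMI-A-1
$ cp /sys/class/drm/card0-HDMI-A-1/edid lg.edid
$ edid-decode < lg.edid             # from the edid-decode package

If edid-decode complains about a bad checksum or prints garbage, that would support the "fried EDID" theory; if it prints a sane block, you've just recovered the EDID you were hunting for.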

Here's a copy-paste of the built-in timings calculator on a Linux command line; the Modeline on the last line is, I think, the wanted result. Powerstrip should have such a calculator as well.

$ cvt 1920 1080 120
# 1920x1080 119.93 Hz (CVT) hsync: 139.12 kHz; pclk: 369.50 MHz
Modeline "1920x1080_120.00" 369.50 1920 2080 2288 2656 1080 1083 1088 1160 -hsync +vsync

A Linux Mint live USB can also be experimented with (with the "joy" of commands like xrandr, xrandr --newmode, xrandr --addmode, arandr and "sudo service mdm restart"), but it's not very good for your time or sanity.
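For the record, the xrandr sequence would look something like this (a sketch; "DVI-0" stands in for whatever output name xrandr actually reports, and the mode numbers are just the cvt output from above):

$ xrandr                            # note the name of the DVI output, e.g. DVI-0
$ xrandr --newmode "1920x1080_120.00" 369.50 1920 2080 2288 2656 1080 1083 1088 1160 -hsync +vsync
$ xrandr --addmode DVI-0 "1920x1080_120.00"
$ xrandr --output DVI-0 --mode "1920x1080_120.00"

Worst case the monitor just reports no signal or out of range, and you fall back to a mode that works.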
 
Aha. Thanks for reminding me of other things I'd forgotten to add to the above list of 'tried'.

I did have a go with Powerstrip, but it couldn't read the EDID. In fact it couldn't recognize that there was a monitor at all. Sadly Powerstrip needs a reboot to work, so I couldn't even try this on a fresh install.

I did try to 'fake' it with some numbers, from what little I recall, but I couldn't honestly say whether I did it right, as it didn't work. IIRC the problem was that Windows just refused to acknowledge the monitor even existed.

I haven't sourced the EDID; after searching the net and asking LG, I gave up on that. I also came away with the impression that my newer nVidia card is unable to write EDIDs to monitors (apparently this stopped working with the 4xx series and beyond?).

Thank you for that list of reasonable timings, though at present I'm unsure what to do with them. I simply can't get Windows to recognize that the monitor exists at all until I use HDMI with it, and at that point it may as well be a different monitor.


FWIW if anyone happens to have one... the monitor is an LG2363D and I want your EDID!
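In case a volunteer wants to oblige: Windows caches each monitor's EDID in the registry, so on a machine with the same panel something like this should dig it out (a sketch; the exact subkey under DISPLAY depends on the monitor's PnP ID):

C:\> reg query "HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY" /s /f EDID /t REG_BINARY

The 128-byte binary blob it prints (256 with extension blocks) is the EDID.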
 
Maybe you would need some resistor mod on the DVI connector to make Windows believe a monitor is connected. Whether such an idea is worth recommending, I don't know. And even if it works, maybe the monitor itself is borked in a way where dual-link or 120 Hz operation won't work.

With a simple DVI-to-HDMI adapter, maybe you could plug the projector, or the monitor in HDMI mode, into that DVI connector, and it would all be salvaged, only stuck at 60 Hz.
 
Yes. Safe mode works fine over DVI. I can boot fully into Windows and use it as intended. As an aside, Powerstrip doesn't work in safe mode (at least for me).

The exact point where a normal boot-up breaks is the login screen, which it does not render; there's just no signal from then on. Before that, though, I get all the joys of POST and watching the Windows logo undulate a bit.
 
Have you tried using the default generic Windows GPU driver in normal mode? (Just uninstall the current driver, plug in only that monitor, and reboot.)

Also try uninstalling all residual detected drivers for the GPU and monitor. (There's a setting in Device Manager to show unplugged devices and allow you to uninstall them.)
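For reference, the old trick for showing those phantom devices (hedged, from memory of the Win 7 era) is to launch Device Manager from a prompt with an extra environment variable set, then enable View > Show hidden devices:

C:\> set devmgr_show_nonpresent_devices=1
C:\> start devmgmt.msc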

Edit: I just remembered, on very old CCC, back when it was still branded ATI, you could force-enable a display output even if it was unplugged. It's hidden behind the (+) button.

I was using that for my dual-screen setup, where my VGA monitor had a slightly broken cable, so that my desktop didn't change mode automatically when it disconnected.
 
Yes, the generic driver exhibits the issue, as does LG's own driver.

I've in fact done a complete reinstall (format), so there were no dedicated drivers hanging around. No joy there. On the first boot it loads into Windows beautifully, at which point it says in no uncertain terms that a reboot is needed. Once that's done, the issue is there.
 
I'd love to try, but I cannot set resolutions for a monitor whose existence Windows refuses to acknowledge.

It's not an out-of-range error AFAIK; my monitor is pretty good about telling me when those happen.
 
Basically, you set the res while it's connected alongside another monitor. I think you can even set it while connected to the monitor over HDMI and then switch the connection to DVI.
Edit: BTW, what is the spec of your PC? I assume you're using Windows 8.1 with an AMD GPU?
 
Well... it's impossible to set 640x480; I don't think Windows has supported that res since XP.

I am using Win 7 and an nVidia GPU. :D

Changing to 800x600 and unplugging HDMI...

I am now typing this edit using the DVI connection! It's detected as a generic monitor. I suspect when I reboot it will not work again.. but here goes..
 
Right... so rebooting was weird. I rebooted with it still set to 800x600.

It reboots... I can actually see to type my password in... it gets into Windows... I see my desktop... then, moments later, a blank screen.

This gives me some hope that I can do something more useful with Powerstrip now. Hopefully I can boot with HDMI, go 800x600, switch to DVI, go to Powerstrip, and do... something with it.
 
I think you might have an HDCP problem. Maybe you can check for HDCP with something like CyberLink's BD HD Advisor.
 
Can you use another PC to remote into the machine? I had a similar problem where a TV wouldn't turn on after POST when it was connected through a receiver with a DVI-to-HDMI cable from the PC.

Using VNC with the TV "unplugged" (according to Windows) exposed some options to force settings for the primary display that weren't available when an actual active one was hooked up. That brought it back to life, and it has been working OK since.
 
I think you might have an HDCP problem. Maybe you can check for HDCP with something like CyberLink's BD HD Advisor.

HDCP wouldn't prevent the monitor from being recognized; it would only prevent the playing of HDCP-protected material.

It's most likely a problem with either the graphics card driver being unable to read the monitor's EDID info, or the EDID being corrupted in such a way as to prevent the graphics driver from interfacing with the monitor properly.

Back to the OP.

Right... so rebooting was weird. I rebooted with it still set to 800x600.

It reboots... I can actually see to type my password in... it gets into Windows... I see my desktop... then, moments later, a blank screen.

This gives me some hope that I can do something more useful with Powerstrip now. Hopefully I can boot with HDMI, go 800x600, switch to DVI, go to Powerstrip, and do... something with it.

Windows 7 and 8 (and I believe Vista) include default drivers for pretty much any Nvidia or AMD video card released up until the time they went to duplication. So just removing the video card driver and rebooting will only result in the appropriate Nvidia/AMD-provided default driver loading. Hence why safe mode works: it uses the generic Windows driver.

You can try forcing the use of the basic display adapter by going into Device Manager: Update driver - Browse my computer for driver - Let me pick... - Microsoft Basic Display Adapter. (It's unclear whether this is how you enabled the generic adapter or not.)

Alternatively, you can try using MSConfig: go to the Boot tab and select the option for Base video. (This should be the same video as provided in safe mode.)

Either of those should bypass the vendor-provided driver for the video card (similar to safe mode) and allow you to troubleshoot your monitor without being in safe mode.
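For what it's worth, that Base video checkbox just flips a documented flag in the boot configuration, so if the GUI is hard to reach on a blank screen it can also be toggled from an admin command prompt (a sketch using bcdedit's "vga" boot option):

C:\> bcdedit /set {current} vga on       # force the basic VGA display driver on next boot
C:\> shutdown /r /t 0
C:\> bcdedit /deletevalue {current} vga  # run later to undo it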

Good luck.

Regards,
SB
 
Actually, my PC can't play nice with Samsung TVs. I can only connect my PC if I go into safe mode or use the lowest resolution. When connected in safe mode or at a low resolution and running the CyberLink advisor, I get something like "fail HDCP". So no, failed HDCP can indeed prevent a device from being connected. FYI, I have tried two Samsung TVs; both failed. Connected to a Toshiba TV, it works normally. As a bonus bit of info, it only fails when I use the HDMI out; using the DVI out (with a DVI-to-HDMI adapter), it connects fine to the Samsung TVs. I've read that DVI and HDMI have their own separate HDCP arrangements (as in, they don't share the HDCP key or something), and yes, it passes the HDCP test on the DVI out.

Now, it might just be a handshake problem, or the PC totally rejecting the TV, but the fact is I can't use the HDMI out to a Samsung TV. Since this problem occurs with various GPUs, I assume it's an MS (Windows) fault.

Another fun fact: it used to work! HDMI output was working with my Samsung TV, but it suddenly stopped. It might have to do with a Windows update, because even going back to an older GPU driver doesn't fix it. I finally gave up after trying various methods (short of reinstalling Windows) and use DVI instead.

His problem might be different from mine, but if I can connect only at the lowest resolution and fail the HDCP check, then it should be somehow related to HDCP.
 

That's because HDMI requires HDCP handshaking in order to work, at least in pretty much all HDMI devices that I know of. No HDCP = no image.

DVI and DP don't require HDCP in order to work; HDCP only determines whether you can play HDCP-protected content or not.

In the case of your HDMI no longer working, it could also be that the HDMI cable is damaged.

Regards,
SB
 
Since the OP's monitor is detected as a generic monitor, there's a high probability that the problem is in the monitor's EDID itself (maybe it has become corrupted?). Each monitor input has its own EDID.
 
I remember that when I wanted to use 3D in Windows 8, I first had to allow unsigned drivers via a special shutdown boot menu. Only then could I make it take a different monitor driver (in my case, an Acer driver for my LG passive-3D TV).
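From memory (hedged, but the switch itself is documented), the route to that menu on Windows 8 is:

C:\> shutdown /r /o /t 0                 # reboot straight into the advanced startup menu

then Troubleshoot > Advanced options > Startup Settings > Restart, and press 7 for "Disable driver signature enforcement".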
 