This has nothing to do with "Linux". The monitor is exposing a bad mode in its EDID block. The monitor says "I support 140 Hz at 4K", so Linux (strictly, the kernel display driver in use) says "OK then, that's the best one, give me that". And it doesn't work, because the monitor lied.
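For the curious, the mechanism here is the EDID's first detailed timing descriptor, which holds the preferred mode. A minimal sketch of decoding it (not the kernel's actual code; the connector path is a made-up example, and real EDIDs can have extension blocks this ignores):

    import struct

    EDID_PATH = "/sys/class/drm/card0-DP-1/edid"  # example connector; names vary per machine

    def preferred_mode(edid: bytes):
        # The base EDID block is 128 bytes; the first 18-byte detailed timing
        # descriptor starts at byte 54 and is the preferred mode when the
        # EDID's preferred-timing feature bit is set.
        d = edid[54:72]
        pclk_hz = struct.unpack("<H", d[0:2])[0] * 10_000  # stored in 10 kHz units
        h_active = d[2] | ((d[4] & 0xF0) << 4)
        h_blank  = d[3] | ((d[4] & 0x0F) << 8)
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        v_blank  = d[6] | ((d[7] & 0x0F) << 8)
        refresh = pclk_hz / ((h_active + h_blank) * (v_active + v_blank))
        return h_active, v_active, refresh

    with open(EDID_PATH, "rb") as f:
        w, h, hz = preferred_mode(f.read(128))
    print(f"preferred mode: {w}x{h} @ {hz:.3f} Hz")

If the monitor put garbage in that descriptor, every consumer that trusts it will happily request an impossible mode.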

And the reason it works on Windows or (sometimes) macOS is just that those systems make arbitrarily different default choices (e.g. "see if it has 60 Hz and use that"). And it happens to work with Windows because the monitor manufacturer bothered to test with Windows (and, sometimes, macOS), where they didn't with Linux.
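To make "arbitrarily different default choices" concrete, here's an illustrative sketch of two mode-selection policies looking at the same advertised list. This is not any OS's actual logic, just the shape of the difference:

    Mode = tuple[int, int, float]  # (width, height, refresh_hz)

    def pick_aggressive(modes: list[Mode]) -> Mode:
        # Trust the EDID: largest resolution, then highest refresh.
        return max(modes, key=lambda m: (m[0] * m[1], m[2]))

    def pick_conservative(modes: list[Mode]) -> Mode:
        # Prefer a mode near 60 Hz, which almost every panel really drives.
        safe = [m for m in modes if abs(m[2] - 60.0) < 1.0]
        return max(safe, key=lambda m: m[0] * m[1]) if safe else pick_aggressive(modes)

    advertised = [(3840, 2160, 140.0), (3840, 2160, 60.0), (1920, 1080, 60.0)]
    print(pick_aggressive(advertised))    # (3840, 2160, 140.0) -- may be a lie
    print(pick_conservative(advertised))  # (3840, 2160, 60.0)

Neither policy is "correct"; they just fail differently when the hardware lies.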

This happens everywhere in the tech industry. No one does QA against the standards. It's 100% routine to see devices advertising PCI capabilities that don't work, phantom ACPI entries for hardware that doesn't exist, EFI BIOSes exposing function tables for capabilities that never worked and were never tested, USB devices that advertise a standard class but only work with a proprietary driver, etc...

And the only reason anything works anywhere is that, at the end of the day, they plug it in and test. And on Linux they don't.



OP is using an LG Ultragear gaming monitor which supports the preferred resolution reported in the EDID ("3440x1440 143.923 Hz"), so this definitely has something to do with Linux doing something different than Windows/macOS.


We don't know that. OP never states what monitor he has, and "LG Ultragear" isn't a model. There's a range of models under that moniker, and if you look at LG's website, the max refresh tops out at 240 Hz on some of them … and at 120 Hz on others.


I completely agree with you in principle. It's rarely an actual fault of anything in Linux. However, the net effect for me as a user is that I need to debug my monitor's EDID to get it to work correctly. Inconveniently for me (as much as I love digging into things like that), sometimes I really just want to plug a conference room projector into my laptop and have other people's workarounds already in place for me.
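When I do have to dig, the first thing I check is whether the kernel even got a sane EDID. A quick sketch (standard sysfs paths, minimal error handling) that lists each connector's status and whether its EDID base block checksums to zero, as the spec requires:

    import glob, os

    for conn in sorted(glob.glob("/sys/class/drm/card*-*")):
        status_path = os.path.join(conn, "status")
        if not os.path.exists(status_path):
            continue
        status = open(status_path).read().strip()
        edid = open(os.path.join(conn, "edid"), "rb").read()
        if len(edid) >= 128:
            # All 128 bytes of the base block must sum to 0 mod 256.
            ok = sum(edid[:128]) % 256 == 0
            note = "EDID checksum OK" if ok else "EDID checksum BAD"
        else:
            note = "no EDID"
        print(f"{os.path.basename(conn)}: {status}, {note}")

A bad checksum points at the cable or the monitor; a clean one means the fight is over what the EDID claims, not whether it arrived intact.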


> sometimes I really just want to plug a conference room projector into my laptop and have other people's workarounds already in place for me.

Then you should definitely use Linux more, and buy hardware that explicitly supports Linux.


Ever had to use hardware that you didn't buy yourself? I don't get to choose Linux-friendly projectors at conferences. I still want the same experience macOS users have: plug a cable into my computer and have everything just work.


> Ever had to use hardware that you didn't buy yourself?

Yes, of course. The point wasn't that I would buy a device for my own use (though that is kind of appealing these days), but rather that your choices influence what the market provides.

If you want display devices to support Linux, you need to preferentially buy devices that explicitly support Linux, and make that fact known to vendors as well as you can.



