
As a long-time Linux user who fairly recently dropped the Windows partition entirely, I do think the remaining chafing points are these:

* UI framework balkanization has always been, and remains a hideous mess. And now you don't just have different versions of GTK vs QT to keep track of, but also X vs Wayland, and their various compatibility layers.

* Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.

* Anything to do with configuring webcams feels like you're suddenly thrown back 20 years into the past. It'll probably work fine out of the box, but if it doesn't. Hoo boy.

* Audio filtering is a pain to set up.





> UI framework balkanization has always been, and remains a hideous mess

I thought you were talking about Windows there. There are 4 (5?) different UI paradigms within Windows, and doing one thing sometimes requires you to interact with each of them.

At least on Linux, with GTK/KDE, you can pick a camp and have a somewhat consistent experience, with a few outliers. Plus many apps now just use CSD and fully integrate their designs into the window, so it's hopeless to expect every window's styling to be consistent.

I never had to mind X vs Wayland when starting user applications tho.


If we're talking about mass adoption of Linux then there really has to be no concept of even "picking a camp". The vast majority of users - even techy people - will not understand what a window manager is, never mind be capable of choosing one.

Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.


Hard disagree. You can run the same programs on any DE or Window Manager or even without one (on pure X11 for example). That's not a hurdle, it's a feature.

Users who don't know about the feature can just use a pre-configured system like Mint Cinnamon and never know about any of these things.


Yeah I wanted to say, for people who don't care, there's Linux Mint. (I used to spend all my time tinkering with the DE, now I prefer to spend zero!)

Except even with Linux Mint you have to choose which one ;)


Nope.

Linux user for decades, but headless since the early aughts. Decided to dip my toes back into the desktop space with Mint Cinnamon.

I can mirror or run lots of phone apps on Windows or macOS, but ironically, not Linux. I decide to run an Android emulator so I can use some phone-only apps.

I read up on reviews, then download and install Waydroid as the top contender.

Does Waydroid work? No. It fails silently when launched from the shortcut after the install. Run it from the command line, and, nope, it's a window manager issue. Mint Cinnamon uses X11, not Wayland, and Waydroid apparently needs... Wayland support.

OK, I log out, log into Mint with Wayland support, then re-launch Waydroid. My screen goes into a fugue state where it randomly alternates between black and the desktop. Try a variety of things, and I guess this is just how it is. Google and try any number of fixes, end up giving up.

Yes, that's my old pal Linux on the Desktop. Older, faster and wiser, but still flaky in precisely the same ways.


You can't run X11 programs on Wayland without Xwayland.

Likewise you cannot run Wayland programs on X11 without a Wayland compositor like Cage (a Wayland kiosk) or Weston. Both run as a window on X11, inside which Waydroid works just fine.
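For anyone who wants to try it, a rough sketch of the Weston route (the socket name is arbitrary, and whether your Waydroid version picks up WAYLAND_DISPLAY from the environment like this is an assumption on my part; the container must already have been set up with `waydroid init`):

  # Nested Wayland compositor, shown as an ordinary X11 window
  weston --socket=wayland-nested &

  # Point Waydroid at that compositor's socket and bring up the UI
  export WAYLAND_DISPLAY=wayland-nested
  waydroid session start &
  waydroid show-full-ui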

It's an odd complaint that incompatible software is incompatible.


You read the parent I was responding to, no? You're reinforcing my point.

"Users who don't know about the feature can just use a pre-configured system like Mint Cinnamon and never know about any of these things."


I did. I agree it's not obvious. But you cannot run OpenGL, Vulkan, Glide or DirectX on Windows either without having the proper hardware and software installed. So yeah. Waydroid needs wayland. Anbox runs on X11.

> OpenGL, Vulkan, Glide or DirectX on Windows either without having the proper hardware and software installed

Windows will run at least basic OpenGL and DirectX in software if you don't have hardware to accelerate that, and those software renderers are included as part of the OS. It'll run like garbage, but it will run.

https://learn.microsoft.com/en-us/windows/win32/direct3darti...


I think that type of user wouldn't go out looking for an Android compatibility layer.

Bluestacks works fine for this on PC and Mac, and I've seen casuals use that because they want to play their gacha game on a bigger screen.

Waydroid completely fails in comparison, while giving you no pointers on what the problem might be or how to solve it unless you're already a Linux power user.


Headless daily driver? Hardcore. What do you use for a browser?

I've tried it as a challenge for a couple of days (lynx, mutt, some other TUI stuff) and it made some things like Vim stick (although that may have as much to do with that challenge as Tridactyl did). But I couldn't last longer than a week. It does free you from the burden of system requirements. CPU: Optional.


w3m can even display images in a Linux console if you have the proper drivers or use KMSCON. It's unwieldy but surprisingly usable. And my laptop battery runs for 8 hours, which is quite amazing for a Zen 1.

> And my laptop battery runs for 8 hours

I imagine your display is almost entirely black for the majority of the time, with your (most probably) LCD backlight blasting away, trying its hardest to push a few thousandths of its light output through the few pixels on the screen where it can escape! XD


Brightness down, LAN card disabled (the media sense on RTL cards sucks up about 1.5W with no cable plugged in, wtf? That's more than the Wifi needs)

And powertop (great piece of software, thanks Intel) tuned to the max + powersave scheduler. All that on Windows or KDE results in about 4-5h of battery though. So fbdev must be somehow really frugal.
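For anyone wanting to replicate that, the usual knobs look something like this (the interface name is a placeholder, and you'd want these persisted via your distro's own mechanism rather than typed each boot):

  # Take the wired NIC down entirely so its media sense stops drawing power
  sudo ip link set enp1s0 down        # replace enp1s0 with your interface

  # Apply every tunable powertop suggests in one go
  sudo powertop --auto-tune

  # Switch the CPU frequency governor to powersave
  sudo cpupower frequency-set -g powersave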


That's cause you're using a distro like mint which is using older builds of stuff.

Get yourself a recent Plasma 6 Wayland setup with PipeWire for audio. It even has an RDP server now.

What's most likely happening is your user space app wants the newer API but you're running old builds from two years ago.

It will continue to degrade for you unless you fully switch to a Wayland DM.

Anything built on X11 is basically deprecated now and no one is building on it anymore.


> That's cause you're using a distro like mint which is using older builds of stuff.

The context here is that I was commenting on the parent's assertion that one "can just use a pre-configured system like Mint Cinnamon and never know about any of these things." Nope!

> It will continue to degrade for you unless you fully switch to a Wayland DM. Anything built on X11 is basically deprecated now and no one is building on it anymore.

That's my impression as well, and again, with the 2nd most popular Linux distro using X11 by default and with "experimental" Wayland support, that only reinforces my rebuttal of parent's claim.


I don't recommend Mint for this reason. SteamOS or Nobara for the white glove premium experience.

> You can run the same programs on any DE or Window Manager or even without one (on pure X11 for example)

The Flatpaks I've tried don't seem to understand this, however.


The same is true of Linux - GTK3 apps run just fine on Plasma, and so do GTK4, Qt 5, Qt 6, and X11 apps, and so on.

Sure they all look slightly different, but it's definitely worse on Windows in that regard.


> Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.

Just like Linux. You can run most if not all apps in any DE. Yes, GNOME will look ugly, but that's GNOME's way of doing things. If you pick a decent DE, you will have most basic apps using the same styling, and the rest have CSD anyway.

Each GUI toolkit has its own specialties, but you'll use at most two of them, and they will be kept in separate apps. (Apart from flatpak portals which use gtk instead of the system's).

Windows has 3-5 different UI/UX layers within the same application ... And the rest have CSD anyway, so they look the same no matter the OS.


> Yes, there are many UI implementations in Windows but they are almost totally transparent to the user (no pun intended), and they can all run on the same system at once.

I mean this is a solved problem on linux using modern distributions like NixOS or even 'normal' distros with flatpak, appimage, etc. I haven't had to deal with anything like this in years.

The Windows UIs are way more divergent than Linux's ever were. There was a time in the 90s when UIs were expected to follow platform specifics. These days, most UIs don't, and they're almost kind of like the branding. Thus, this is not as big a deal as you're making it out to be. If anything, things like the GNOME apps and GTK4 are more consistent than any Windows app.


No, it's not about users picking a camp, it's about developers.

It's been a long, long time since I've seen an application utterly fail to load because it uses a GTK/QT/etc framework running under a totally different DE.

Gnome apps look ugly as hell under KDE[0], but they still work. As a user, you don't need to know or care in any way. It'll run on your machine.

[0]I don't know if they're ugly because of incompatibility or if that's just How Gnome Is. I suspect the latter


>many apps now just use CSD

If there's something I hate about Linux, it's CSD (Client-Side Decorations, in case people don't know what it is).

If I wanted all my apps to look different from each other, I'd use macOS. I want a clean desktop environment, with predictable window frames that are customizable and they all look the same. CSD destroys that.


Having no CSD at all is unacceptable on small screens IMHO: far too much real estate is taken up by a title bar. You can be competitive with SSD by making the bars really thin, but then they are harder to click on and impossible with touch input. At the moment I have Firefox set up with CSD and vertical tabs, and only 7% of my vertical real estate is taken up by bars (inc. Gnome), which is pretty good for something that supports this many niceties.

Conversely, I don't want all of my apps to look identical to each other. I want to be able to tell with a submoment of a glance what app I am working on or looking for without having to cognitively engage to locate it, breaking my state of flow in the process.

You're defeating your own point. CSD in practice breaks a basic feature of the desktop: knowing at first glance which of the windows will receive whatever you type on the keyboard.

For eons the standard was: the focused window is the only one with the title bar showing the theme accent color. That is consistent, predictable, keeps the user in the flow.

With CSD each app does whatever inconsistent thing they can fancy. You type and oops deleted something in the wrong window.

Alas, now even many default SSD setups fail at this (selected and non-selected windows look pretty much the same), and keyboard-first workflows are much hindered.


That's what the title bar is for?

Linux doesn't mean GNOME.

KDE favors server-side decorations.


I know, in fact I'm using Linux Mint.

The problem is that sometimes you need to use a GNOME app that looks completely out of place.


I mean, most apps I use daily, no matter the OS, have CSD. Teams, Spotify, Slack, Firefox, Postman, IntelliJ etc...

Doesn't matter which OS they all have different styles. I can understand it's not liked by everyone, but that ship has sailed and no "big" app will use SSD anymore.


> UI framework balkanization has always been, and remains a hideous mess.

At least things look more or less the same over time. With commercial offerings one day you open your laptop and suddenly everything looks different and all the functions are in a different submenu because some designer thought it was cool or some manager needed a raise.

> It'll probably work fine out of the box, but if it doesn't. Hoo boy.

LLMs are actually very useful for Linux configuration problems. They might even be the reason so many users made the switch recently.


Pair-programming Nix with Gemini has taught me a lot about the assistive power of LLMs.

They're still slow and annoying at languages I'm good at. But it's really handy to be able to take one I'm not (like Nix or even C++) and say "write me a patch that does …" Applying a lot of the same thinking/structuring skills, but not tripping on the syntax.


They're pretty good for most things, yes... but man was it rough figuring out how to get my IP allocation routing right on my Proxmox server. The system is issued a primary IP, and I need to route my subnet through that to my VMs... wasn't too bad once I got it working... I'd also wanted a dnat for "internal" services, and that's where it got tricky.

I need to refresh myself as I'm wanting to move from a /29 to a /28 ... mostly been lazy about getting it done, but actually making progress on some hobby stuff with Claude Code... definitely a force multiplier, but I'm not quite at a "vibe code" level of trust, so it's still a bit of a slog.


You could just let the VMs have normal IPs on the network....

Where would those IPs route to/from if it didn't have a configured default gateway exactly?

The machine got a single IP, I had to route the CIDR block using that IP as the gateway in the host OS. The VMs wouldn't just get assigned additional real IPs.
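For reference, the routed setup boils down to roughly this on the host (addresses are placeholders from the documentation range; on Proxmox you'd normally express the same thing persistently in /etc/network/interfaces as a vmbr bridge):

  # Let the host forward traffic for the routed block
  sysctl -w net.ipv4.ip_forward=1

  # Bridge for the VMs, holding the first usable address of the routed /28
  ip link add vmbr1 type bridge
  ip link set vmbr1 up
  ip addr add 203.0.113.1/28 dev vmbr1

  # VMs then use 203.0.113.2-14 with 203.0.113.1 as their gateway,
  # while the provider routes the whole /28 to the host's single primary IP.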


KDE & Gnome are both guilty of the same.

> Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.

I think Windows is the only other one which really does this properly, macOS also does the hack where they simulate fractional scales by rendering with an integer scale at a non-native resolution then scaling it down.


> I think Windows is the only other one which really does this properly

Windows is the only one that does this properly.

Windows handles high pixel density on a per-application, per-display basis. This is the most fine-grained. It's pretty easy to opt in on reasonably modern frameworks, too; just add in the necessary key in the resource manifest; done. [1]

Linux + Xorg has a global pixel density scale factor. KDE/Qt handles this OK; GNOME/GTK break when the scaling factor is not an integer multiple of 96 and cause raster scaling.

Linux + Wayland has per-display scaling factors, but Chromium, GNOME, and GTK break the same way as the Xorg setup. KDE/Qt are a bit better, but I'm quite certain the taskbar icons are sharper on Xorg than they are on Wayland. I think this boils down to subpixel rendering not being enabled.

And of course, every application on Linux in theory can handle high pixel density, but there is a zoo of environment variables and command-line arguments that need to be passed for the ideal result.
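To give a flavour of that zoo, here are some of the common ones (values are illustrative; which of these a given app actually honours depends on the toolkit and version):

  # Qt 5.14+ / Qt 6: per-screen scaling, or force a global factor
  export QT_ENABLE_HIGHDPI_SCALING=1
  export QT_SCALE_FACTOR=1.5

  # GTK: integer scale only, with a fractional font/DPI correction on top
  export GDK_SCALE=2
  export GDK_DPI_SCALE=0.75

  # Chromium/Electron apps usually want a command-line flag instead
  chromium --force-device-scale-factor=1.5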

On macOS, if the pixel density of the target display is at least some Apple-blessed number that they consider 'Retina', then the 'Retina' resolutions are enabled. At resolutions that are not integer multiples of the physical resolution, the framebuffer is four times the resolution of the displayed values (twice in each dimension), and then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.

On non-retina resolutions, there is zero concept of 'scaling factor' whatsoever; you can choose another resolution, but it will be raster-scaled (usually up) with some bi/trilinear filtering, and the entire screen is blurry. The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.

[1]: https://learn.microsoft.com/en-gb/windows/win32/hidpi/settin...


> Windows is the only one that does this properly. Windows handles high pixel density on a per-application, per-display basis.

This is not our [0] experience. macOS handles things on a per-section-of-window, per-application, per-display basis. You can split a window across two monitors at two different DPIs, and it will display perfectly. This does not happen on Windows, or we have not found the right way to make it work thus far.

[0] ardour.org


> macOS handles things on a per-section-of-window, per-application, per-display basis.

No, it does not. If you have two displays with different physical pixel densities, and especially if they are sufficiently different that Apple will consider one 'Retina' and 'not Retina' (this is usually the case if, for instance, you have your MacBook's display—which probably is 'Retina'—beneath a 2560 × 1440, 336 × 597 mm monitor, which is 'not Retina'), then the part of the window on the non-Retina display will be raster-scaled to account for the difference. This is how KDE Plasma on Wayland handles it, too.

In my opinion, any raster-scaling of vector/text UI is a deal-breaker.


I think the only case where raster scaling is not a deal breaker is a window spanning high and low DPI displays. That is unless the app delegates compositing to the OS compositor which could then raster the contents at the different scales correctly. Not all content can be delegated to the OS - video games for example.

I think there’s one group of people who consider preserving the physical dimensions important that like the macOS approach. For me, if a window is across multiple displays then it’s already broken up and I’m not too bothered about that. What I care about is getting application UI to a reasonable size without blurring. MacOS doesn’t do that.

Actually, the default in MacOS is that the window is only on one monitor, and it's the monitor where the cursor was when you last moved the window, so you might have a window appearing invisible because you dragged it near the corner and some sliver ended up on another monitor.

Look at this complicated tinkering MacOS makes you do for something as simple as spanning windows across monitors! https://www.arzopa.com/blogs/guide/how-to-make-a-window-span... (OK, this last part is slightly facetious, but Linux gets dinged for having to go into menus because the writer wants something to work the way it does on other operating systems the whole time)


> then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.

I don't think this is true. I use non-integer scaling on my Mac since I like the UX to be just a little bit bigger, and have never observed any kind of ringing or any specific artifacts at all around text, nor have I ever heard this as a complaint before. I assume it's just bilinear or bicubic unless you have evidence otherwise? The only complaint people tend to make is ever-so-slight additional blurriness, which barely matters at Retina resolution.


Indeed, these artifacts sound like they're coming from Display Stream Compression [1] rather than scaling. I've had Macs occasionally use DSC when it wasn't necessary; power-cycling the display and/or changing the port it's plugged into usually fixed it. If it's consistently happening, though, it's probably because the display, the cable, the port, and/or the GPU can't handle the resolution and refresh rate at full bandwidth.

[1]: https://en.wikipedia.org/wiki/Display_Stream_Compression


> ringing or any specific artifacts at all around text

There are a few Reddit threads that crop up when one searches for 'macOS ringing artifacts scaling'. For instance, these ones:

https://www.reddit.com/r/macbookpro/comments/1252ml8/strange...

https://www.reddit.com/r/MacOS/comments/1ki58zk/fractional_s...

https://www.reddit.com/r/MacOS/comments/l8oadr/macos_fringin...

All are ringing artifacts, typical of downscaling. I no longer have a Mac (chose one for work to try it out, saw this issue, returned it immediately), but I assure you this is what happens.

> The only complaint people tend to make is ever-so-slight additional blurriness

At no scale factor should there be any blurriness unless a framebuffer resolution is explicitly set. The 'scale factor' should be entirely independent of the physical resolution, which macOS simply does not do.

Apple's understanding and implementation of 'Retina' comes from a singular source: the straightforward doubling in each dimension of the display resolution of the iPhone 4 compared to the iPhone 3GS. It has not changed since, and Apple has applied this algorithm throughout its OS stack.


All of these involve external monitors as far as I can tell, so it seems more likely it's the Display Stream Compression mentioned by the sibling to your comment that is the culprit.

Like I said, absolutely nothing like that happens on my display. I see the ringing in the first link. That doesn't happen to me. Not even a hint of it.

I get you don't like the scaling, but like I said, the very slight blurriness just isn't really noticeable in practice, especially given how Macs antialias text to begin with. Of all my complaints about Macs, this particular one is close to the bottom.


I gotta say, as the guy who brought up DSC, that last Reddit post especially had me doubting. That is not what DSC artifacts look like. DSC subsamples the chroma, which causes distinct color bleeding issues. That is luma bloom, which doesn't happen with DSC.

So I took my Mac Mini, hooked up to a 4K monitor, verified there were no DSC artifacts at native resolution, set it to "2560x1440" and sure enough the same artifacts appeared for me too, but still no telltale signs of DSC. So yeah, I gotta say, this is on Apple. Between this and dropping subpixel antialiasing support for text, it's pretty clear that their only properly supported configuration is 2x scaling on high-DPI displays.


Huh, very interesting.

OK, I just grabbed my loupe to make sure I'm not missing anything, and pulled up an app in dark mode (so ringing should be more visible) on my MBA M4. I'm using its built-in display. I've cycled through all 4 available resolution settings in Display, and absolutely zero artifacts or ringing. Then tried connecting to my LG UltraFine 4K which connects over Thunderbolt, that gives 5 resolution settings instead of 4, and zero artifacts/ringing on any of those either.

So I have no idea what's going on. I don't doubt that you're seeing it, and it's there in that Reddit photo. But maybe it's something specific to external monitors over a certain connection type or something? Seems very strange that Apple would use a different downsampling algorithm under different circumstances though.

I'd normally assume the most likely culprit would be some kind of sharpening setting on a monitor, as that can absolutely cause the type of ringing seen in that Reddit photo. But on the other hand, if you're testing it right now and not seeing it at native 2x, then that would seem to be ruled out, at least in your case. Maybe it's some kind of resolution negotiation mismatch where it's actually the monitor applying a second scaling that has ringing, since monitors can accept signals that don't match their native hardware resolution?


I can get a mild form of it on my M4 MBP's built-in display at "1800x1125"* but it's not nearly as noticeable as it was on the 4K external display at "2560x1440" and honestly I needed my cell phone camera zoomed in to definitively identify it, so that was more of a fishing expedition than a real problem. However, I have tried 2 different Macs, 2 different 4K monitors (both LG UltraFine also, though they differ in firmware version and color reproduction because of course they do), and 2 different interfaces (HDMI, Thunderbolt), and I can reliably replicate it under all of those combinations. I think that exact scaling factor probably has a bad interaction with the scaling algorithm. I do agree that a lot of other scalings do not produce the ringing/halo/bloom effect.

* = You have to go click "Advanced...", enable "Show resolutions as list", then when back on the main Displays page, enable "Show all resolutions", to get this and many other options -- but this is only necessary on the internal display, the external display offers "2560x1440" as a non-advanced choice


> All of these involve external monitors as far as I can tell

This happens on the native displays of MacBooks and iMacs, too. Try any of the 'looks larger'/'looks smaller' settings and it'll show up.


I've repeatedly explained that's not the case. Not on any Mac I've ever owned, and I already explained I thoroughly investigated my current M4 MBA. It's not showing.

That just means you can't perceive it. Sorry, but raster scaling is how Apple's algorithm works, and just because you can't see it doesn't mean it is not the case.

For the record, this isn't just visual. Rasterising to a framebuffer that is considerably larger than the physical resolution and then scaling produces a tangible effect on battery life. Not that it matters much with the impressive efficiency of the M-series SoCs, but it is there nonetheless.


I work a lot with Photoshop. I've studied digital signal processing. Believe me, I will perceive ringing when it's there.

You seem to be fundamentally misunderstanding. Yes, raster scaling is how it works. I haven't disputed that anywhere.

I'm saying the ringing artifact specifically that you're complaining about is not happening on my setup, nor does it seem to be widespread. You were complaining about the specific Lanczos algorithm due to its noticeable ringing, I'm saying that therefore doesn't seem to be the algorithm being used on mine, nor is there any documentation that's the algorithm Apple uses. Your criticism seems to be based on partially wrong information, even if something like it seems to happen on certain external displays -- whatever it is, it's not a universal problem.

If you somehow missed my other comment, please read it:

https://news.ycombinator.com/item?id=46801536


ChromeOS also does fractional scaling properly because Chrome does it properly. The scaling factor is propagated through the rendering stack so that content is rastered at the correct scale from the beginning instead of using an integer scaling factor and then downscaling later. And it takes subpixel rendering into account too, which affects things like what elements can be squashed into layers backed by GPU textures.

I think Android does it properly too because they have to handle an entire zoo of screen sizes and resolutions there. Although they don't have the issue of dealing with subpixel rendering.


This is a surprising opinion to encounter, given my experience with scaling on Windows, where simple things like taking my laptop off its dock (going from desktop monitors to laptop screen) causes applications to become blurry, and they stay blurry even when I've returned the laptop to the dock. Or how scaling causes some maximized window edges to show up on the adjacent screen. Or all manner of subtle positioning and size bugs crop up.

Is this more of an aspirational thing, like Windows supports "doing it right", and with time and effort by the right people, more and more applications may be able to be drawn correctly?

[edit] I guess so, I see your comment about setting registry keys to make stuff work in Microsoft's own programs. That aligns more closely with my experience.


Not sure about the underlying reason, but I use Windows for work and the only program I've encountered in the past two years with this behavior is the Eclipse IDE. Everything else deals very well with rescaling and docking / undocking to 4k displays.

> Windows is the only one that does this properly.

How can you say this when applications render either minuscule or gigantic, either way with contents totally out of proportion, seemingly at random?

I don’t have to pull out a magnifying glass to notice those issues.


These were probably written against the old-school Win32. It's pretty easy to fix.

  Right-click on the `.exe`
  Properties
  Compatibility tab
  Change settings for all users
  Change high DPI settings
  Under 'High DPI scaling override' section, tick box for 'Override high DPI scaling behaviour. Scaling performed by'
  In the drop-down box below, select 'Application'
Done.

For MMC snap-ins like `diskmgmt.msc`, `services.msc`, or `devmgmt.msc`, there's a Registry key you can set. See this ServerFault question: https://serverfault.com/q/570785/535358


The 'doing it right' part is from how it should be done, but it still needs application support.

The thing is X11/Xorg can also theoretically do the same thing (and most likely Wayland too) but it needs, you guessed it, application (and window manager / compositor) support.


> The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.

To be fair, UXGA was a thing 20 years ago. I don't think it makes sense for Apple to care all that much about low DPI monitors. They don't sell any, and they wouldn't be acceptable to most Apple people, who have had crisp displays available for > 10 years now. I wouldn't be surprised if the number of Apple users on low DPI is a single-digit percentage.


That's roughly what I did for my ANSI console/viewer... I started with EGA resolution, and each EGA pixel renders 3x4 in its buffer, then a minor blur, then it's scaled to fit the render area. The effect is really good down to about 960px wide, which is a bit bigger in terms of real pixels than the original... at 640px wide, it's a little hard to make out the actual pixels... but it's the best way I could think of to handle the non-square pixels of original EGA or VGA... I went with EGA because the ratio is slightly cleaner IMO. It's also what OG RIPterm used.

I have precisely one Windows thing I use regularly, and it has a giant window that needs lots of pixels, and I use it over Remote Desktop. The results are erratic and frequently awful.

> * UI framework balkanization has always been, and remains a hideous mess.

I'd take balkanization over the "we force-migrate everyone to the hot new thing where nothing works".

> It'll probably work fine out of the box, but if it doesn't.

Drivers are a pain point and will probably stay so until the market share is too large for the hardware vendors to ignore. Which probably isn't happening any time soon, sadly.


This is not a driver issue I'm talking about. It's a "best way to adjust the white balance is with this GTK+-2.0 app that hasn't seen maintenance since the Bush administration" issue.

Yes, this one is quite a problem as well.

> I'd take balkanization over the "we force-migrate everyone to the hot new thing where nothing works".

The UI framework for macOS has not changed in any substantial design-update-requiring ways since OS X was first released. They did add stuff (animations as a core concept, most notably).

The UI framework for Windows has changed even less, though it's more of a mess because there are several different ones, with an unclear relationship to each other. win32 won't hurt though, and it hasn't changed in any significant ways since dinosaurs roamed the silicon savannahs.

The UI framework for Linux ... oh wait, there isn't one.


I had to dump a perfectly fine c.2012 workstation recently because of video driver limitations. Could no longer stay current on my flavor of Linux (OpenSUSE) and have better than hideous display resolution limited to just one monitor. NVIDIA’s proprietary drivers are great, but the limited support lifecycle plus poor open source coverage is actually making Linux turn fine systems into trash just the way Windows used to do.

>poor open source coverage is actually making Linux turn fine systems into trash just the way Windows used to do.

I'd blame Linux for a very small percentage of the problem here. This is on NVIDIA ensuring their hardware doesn't last too long and forcing you to throw it away eventually. Open source drivers can make the monitor 'work' but really aren't efficient, and really can never be efficient, because NVIDIA doesn't release the needed information and directly competes with their proprietary driver.


Couldn't you swap in a now lower-end AMD GPU? An RX 6600 should be under $200 and likely at least as good as what you were running... unless you were doing specific CUDA workloads. Even on PCIe 2/3, it should be fine.

On UI frameworks... mostly agree, and I say this as a COSMIC user even... so many apps still don't show up right in the tray, but it's getting a bit better. I always found KDE to be noisy, and I don't like how overtly political the Gnome guys are. So far Wayland hasn't been bad, X apps pretty much just work, even if they don't scale right.

I'm on a very large OLED 3440x1440 display and haven't had too many issues... some apps seem to just drop out, I'm not sure if they are on a different workspace or something as I tend to just stick to single screen, single display. I need to take the time to tweak my hotkeys for window pinning. I'll usually have my browser to half the screen and my editor and terminal on the other half... sometimes stretching the editor to 2/3 covering part of the browser. I'm usually zoomed in 25-30% in my editor and browser... I'd scale the UI 25% directly, like on windows or mac, but you're right it's worse.

For webcams, I don't use anything too advanced, but the Nexigo cams I've been using lately have been working very well... they're the least painful part of my setup, and even though I tend to use a BT headset, I use the webcam mic as switching in and out of stereo/mono mode for the headset mic doesn't always work right in Linux.

On audio filtering, I can only imagine... though would assume it's finally starting to get better with whatever the current standard is (pipewire?), which from what I understand is closer to what mac's interfaces are. I know a few audio guys and they hate Windows and mostly are shy to even consider Linux.


I'm using KDE with Wayland and 2 non-standard DPI monitors (one at 100%, the other at 150% scale). No workarounds needed, nothing is blurry. I think your experience comes from GNOME, which lags behind in this regard.

FWIW, I can do the same with KDE on Xorg with Gentoo Linux.

Since the introduction of the XSETTINGS protocol in like 2003 or 2005 or so to provide a common cross-toolkit mechanism to communicate system settings, the absence of "non-integer" scaling support has always been the fault of the GUI toolkits.

> I think your experience comes from GNOME, which lags behind in this regard.

When doesn't GNOME lag behind? Honestly, most of Wayland's problems have come from the fact that the project expects protocol implementers and extenders to cooperate in order to make it work, while knowing all along that GNOME would be one of the parties whose cooperation was required.


Mint/cinnamon here at 150%, X11, not blurry. It’s FUD.

The issue with X11 is that it's not dynamic. Think using a laptop, which you sometimes connect to a screen on which you require a different scale. X11 won't handle different scales, and it also won't switch from one to the other without restarting it.

> The issue with X11 is that it's not dynamic.

No, it is. Maybe you're using an ancient (or misconfigured) Xorg? Or maybe you've never used a GTK program? One prereq is that you have a daemon running that speaks the ~20 year old XSETTINGS protocol (such as 'xsettingsd'). Another prereq is that you have a DE and GUI toolkit new enough to know how to react to scaling changes. [0]
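As a concrete example, a minimal xsettingsd.conf looks something like this (the Xft/DPI and Gdk/UnscaledDPI values are the desired DPI multiplied by 1024; the numbers here are just an example for a 2x setup):

  # ~/.config/xsettingsd/xsettingsd.conf
  Gdk/WindowScalingFactor 2
  Gdk/UnscaledDPI 98304
  Xft/DPI 196608

After editing, `killall -HUP xsettingsd` makes the daemon re-read the file and broadcast the new values to running clients.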

Also, for some damn reason, QT and FLTK programs need to be restarted in order to render with the new screen scaling ratio, but GTK programs pick up the changes immediately. Based on my investigation, this is a deficiency in how QT and FLTK react to the information they're being provided with.

At least on my system, the KDE settings dialog that lets you adjust screen scaling only exposes a single slider that applies to the entire screen. However, I've bothered to look at (and play with) what's actually going on under the hood, and the underlying systems totally expose per-display scaling factors... but for some reason the KDE control widget doesn't bother to let you use them. Go figure.

[0] I don't know where the cutoff point is, but I know folks have reported to me that their Debian-delivered Xorg installs totally failed to do "non-integer" scaling (dynamic or otherwise), but I've been able to do this on my Gentoo Linux machines for quite some time.


I use whatever package is shipped by arch, so I think I'm fairly up to date.

I did look a bit into this at one point, but I've found that it's mostly QT apps which work fine with different scaling (telegram comes to mind). GTK apps never did, but I admit I never went too deep in the rabbit hole. Didn't know there was supposed to be some kind of daemon handling this. I do run xsettingsd, but for unrelated reasons. I'll have a look if it can update things.

In any case, except for work, I always used everything at 100% and just scaled the text as needed, which worked well enough.

> I've bothered to look at (and play with) what's actually going on under the hood, and the underlying systems totally expose per-display scaling factors...

Would you care to go into some details? What systems are those and how do you notify them there's been a change?


Maybe I got lucky but my external screen is at the same scale. 4k monitors are not expensive any longer, mine is perhaps ten years old.

That's also my setup, the external 4k monitor is roughly the same scale as my FHD laptop. I did this on purpose, to avoid having to mess around with scaling and just run everything at 100%.

But more and more laptops start having higher resolutions nowadays, and using my external screen at a scale of more than 100% would be a waste. But a 3-4k 14" laptop at 100% would be unusable.


the scaling and UI framework issues are by far my biggest pain point. I will inevitably end up with an app with tiny and/or blurry UI elements every few weeks and have to spend a ton of time figuring out the correct incantation to make it better.

This is on a pretty clean/fresh install of current ubuntu desktop


I'm a lifelong Mac user, but a gaming handheld has gotten me into some of these topics. I dual-boot SteamOS and Windows.

On SteamOS, my 5.1 stereo just works.

On Windows, apparently there was some software package called DTS Live (and/or Dolby Live) needed to wrap the audio stream in a container that the stereo understands. There was a time when there was a patent pool on the AC-3 codec (or something like that - I'm handwaving because I don't know all the details). So Microsoft stopped licensing the patent, and now you just can't use AC-3 on Windows. I spent an evening installing something called Virtual CABLE and trying to use it to juryrig my own Dolby Live encoder with ffmpeg… Never got it to work.

It's easy to fall deep into the tinkerhole on Linux, which has kept me away for a long time, but as mainstream platforms get more locked down, or stop supporting things they decide should be obsolete, it's nice to have a refuge where you're still in control, and things still work.

(Insert meme about the Windows API in Proton being a more stable target than actual Windows.)


I use Linux as my daily driver, with a Mac laptop. I only use Windows when I absolutely have to (i.e., testing), and usually through a VM.

Some other rough edges in Linux I've encountered:

- a/v support in various apps. We use Slack for everything (I can't just use something else) and its a/v support is pretty bad, to the point where my video frame rate is now ~1Hz and screen share shows a black rectangle. I think that's mostly Slack's fault, as Google Hangouts works fine, but it's probably low on their priority list.

- sleep / hibernation is still sometimes flaky. Occasionally it won't wake up after hibernating overnight, and I have to hard reboot (losing any open files, though that's not an issue)

- power management on laptops (and therefore battery life) is still worse than Windows, and way worse than Mac. I tried Framework + Linux for a while and really wanted to love it, but switched to a Mac and am not going back (still run Linux on desktop). There is nothing out there that compares to the M-series MacBooks.

- occasional X/Wayland issues, as mentioned


Hardware support for esoteric things such as the new generation of Wacom EMR is still awkward --- I was able to get the previous gen working on a ThinkPad X61T using Lubuntu --- wish that there was such an easy way to try out Linux on my Samsung Galaxy Book 3 Pro 360....

> * Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.

This sounds like you're using some old software. GNOME and sway have clean fractional scaling without blurring, though that hasn't always been the case (it used to be terrible).


> Audio filtering is a pain to set up.

Like noise filtering for your microphone? It was pretty trivial to set up: https://github.com/werman/noise-suppression-for-voice
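In case it saves someone a search, the wiring goes roughly like this through the PulseAudio module interface, which PipeWire's pulse compatibility layer also accepts (the plugin and label names are from memory of that repo's README and may differ on your system; the mic source name is a placeholder):

  # Null sink whose monitor becomes the "clean" microphone
  pactl load-module module-null-sink sink_name=mic_denoised_out

  # LADSPA sink running the RNNoise plugin, feeding into the null sink
  pactl load-module module-ladspa-sink sink_name=mic_raw_in \
      sink_master=mic_denoised_out plugin=librnnoise_ladspa \
      label=noise_suppressor_mono control=50

  # Loop the physical mic into the plugin chain
  pactl load-module module-loopback source=<your_mic_source> sink=mic_raw_in channels=1

Apps then record from "Monitor of mic_denoised_out".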


Fractional scaling on Wayland is only blurry for X apps, and even then, most apps have Wayland support at this point, so for the remaining apps, just turn off Xwayland scaling and use native scaling through env vars and flags, and there's no more blurriness.
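For reference, the usual switches for the common stragglers look like this (exact names depend on the app and version):

  # Firefox: prefer the native Wayland backend
  export MOZ_ENABLE_WAYLAND=1

  # Qt and GTK apps: ask for the Wayland platform plugin/backend
  export QT_QPA_PLATFORM=wayland
  export GDK_BACKEND=wayland,x11

  # Chromium/Electron apps: select Wayland via Ozone
  chromium --ozone-platform=wayland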

- Yes. I think big players in Linux should start supporting core functionalities in GNOME and KDE, and make them polished for laptops and desktops; that would be very cool. For a long time, KDE had the problem of having too many things under its umbrella. Now, with the separation of Plasma Desktop and Applications, focusing on Plasma Desktop and KDE PIM should be a good step.

- Kind of ties to the old point: KDE on Wayland does this extremely well.

- You're thrown back 20 years because the problems are exactly the ones from 20 years ago: vendors refusing to support Linux with drivers.

- Audio filtering? Interesting. I know people who use PipeWire + JACK quite reasonably. But maybe you have a use case I am not aware of? I would be happy to hear about it.


It would help if Gnome wasn't so hostile towards proper cross-DE interop. A famous quote by a Gnome dev goes, "I guess you have to decide if you are a GNOME app, an Ubuntu app, or an Xfce app unfortunately"

They seem to genuinely believe that their way is the right way and everyone else is "holding it wrong" so there's no need for things that would make cross-DE apps easier (or even possible).


> Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry

Not blurry for me on KDE and I wouldn't tolerate blurry, I'd prefer the imperfect solution of using bigger fonts.


KDE Plasma 6 might be the only desktop that does this right on Linux.

I've also been running fractional scaling on Sway for many years now and native wayland applications are not blurry. X11 apps run through XWayland will be blurry, but I don't have any legacy X11 apps remaining on my system.
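For anyone curious, per-output fractional scaling in Sway is a single config line (the output name here is just an example; `swaymsg -t get_outputs` lists yours):

  # ~/.config/sway/config
  output DP-1 scale 1.5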

> UI framework balkanization has always been, and remains a hideous mess.

Amen.

But which OS doesn't have this problem? I'm currently running Windows on a work laptop and even freaking first-party apps have a different look and behave differently from one another. Teams can't even be assed to use standard Windows notifications! And don't get me started on Electron apps, which most apps are nowadays, each coming with their own look and feel.

Also, have you tried switching from light to dark mode, say at night? The task manager changes only partially. The explorer copy info window doesn't even have a dark mode! On outlook the window controls don't change colour, so you end up with black on black or white on white. You can't possibly hold up windows as a model of uniform UI.

So while I agree that this situation is terrible, I wouldn't pin it on the linux ecosystem(s).

> Every other major OS can deal with [high dpi].

Don't know about mac os, but on Windows it's a shitshow. We use some very high DPI displays at work which I have to run at 200%, every other screen I use is 100%. Even the freaking start menu is blurry! It only works well if I boot the machine with the high-dpi display attached. If I plug it in after a while (think going to work with the laptop asleep), the thing's blurry! Some taskbar icons don't adapt, so I sometimes have tiny icons, or huge cropped ones if I unplug the external monitor. Plasma doesn't do this.

IME KDE/Plasma 6 works perfectly with mixed DPI (but I admit I haven't tried "fractional" scales). The only app which doesn't play ball 100% is IntelliJ (scaling works, it's sharp, but the mouse cursor is the wrong size).

> Audio filtering is a pain to set up.

What do you mean? I've been using easyeffects for more than five years now to apply either a parametric EQ to my speakers or a convolver to my headphones. Works perfectly for all the apps, or I can choose which apps it should alter. The PEQ adds a bit of latency, but applications seem to be aware of it, so if I play videos (even YouTube on Firefox with GPU decoding!) it stays in sync. It detects the output and loads presets accordingly.

I also don't have to reboot when I connect some new audio device, like BT headphones (well, technically, on Windows I don't anymore either, since for some reason it can't connect to either of my headphones at all). I would love to have something similar on Windows, but the best I found isn't as polished. It also doesn't support dark mode, so it burns my eyes at night.


macOS and Windows have a much smaller set of variants, and tend to ship a single UI with everything included with OS. Even the best single desktop Linux distros will ship divergent KDE and Gnome apps.

If you want essentially perfect high-DPI support out of the box and can afford higher end displays, use macOS. It just works. I see the comments above about scaling, and to that, I say: most people will never notice. However, a Win32 app being the wrong scale? They'll notice that.

But the real display weak point of Linux right now vs Windows is HDR for gaming. That's a real shitshow and it tends to just work on Windows.



