It still applies to what got merged; the past month was spent on clean-ups, fixes, and review feedback, but there are no major changes to the approach. This initial merge was quite complex as it has to touch a number of subsystems and core kernel code in order to support the M1's quirks, and it was CCed to ~20 people. Now that it's done we can more efficiently work on individual drivers and subsystems, so I expect the pace to pick up.
I also streamed initial development on YouTube and will be resuming streams next week (patch feedback and git rebasing and munging don't make for very interesting streams, but now I'm back to coding). Coming up I have some driver support other people have been working on, and then I'll be writing a minimal hypervisor that can run macOS as a guest, to help reverse engineer the hardware - this will be important for reverse engineering more complex drivers cleanly, especially the GPU kernel side.
Until now we've used a serial cable to debug/load kernels (needs a DIY Arduino thing, a proper design I'm working on which is still vapourware, or another M1 box), but Sven added support for the USB device controller and I'll merge it into our bootloader soon, so from this point on anyone will be able to do quick kernel iteration, debugging, and hardware exploration with just a standard USB cable and any other host machine.
(Side note: I'm happy to see that shilling your patreon is becoming a socially accepted norm. Best of luck with your work!)
One thing I was surprised by: LuaJIT doesn't really work on an M1. LuaJIT 2.1.0 technically compiles, but it core dumps whenever you have a syntax error, which makes it quite a bit less useful.
A reason for this might be the lack of __threadlocal support on M1s (apparently, or so says clang). Is there some way this could be addressed in the Linux work?
I'm not sure I have a coherent question here, but my goal is mainly (a) to express surprise that LuaJIT, of all things, doesn't run on an M1, and (b) to maybe suggest that it would be cool as heck to have it running in the Linux M1 work as a benchmark, if possible. I'd hate to see such a wonderful tool get left behind by the new arch.
That's not a lack of support, it's just LLD, the LLVM linker, being broken. I hear LLD for Mach-O is being worked on again, after a long period of stagnation where it was sort of but not fully working. But the default linker on Apple platforms is still ld64.
Well, in theory, Zig doesn't need to care about the specific approach. In the LLVM IR that Zig produces, thread-local variables are still handled as an abstract "thread_local" flag on global variables. The platform-specific parts are handled by a combination of the IR-to-native-object-file translation done by LLVM, and the linking step. If both of those worked correctly, thread locals would "just work" on M1. The problem is that (a) Zig uses LLD for linking and (b) LLD is buggy.
I'm not surprised that Zig is using LLD. Zig wants to be fully self-contained, able to cross-compile for any supported platform without requiring the user to install a toolchain for that platform. And they already use LLVM for code generation, so LLD for linking is the natural choice.
It's a beautiful vision - really. Toolchain installation sucks. And the stuff they're doing with processing and bundling OS C headers [1] is quite innovative. It's just that using LLVM for everything is not a reality for Apple platforms yet. Hopefully it will be soon, once Apple whips LLD into shape.
Edit: mimalloc is different; it reimplements thread-local storage by hand, using `asm` statements, rather than using native `thread_local` variables, presumably because the latter can be slow.
I think the parent poster is saying that LLD needs to take a different approach, not Zig. But it wasn't previously known what approach that would need to be, since M1 does it differently than other ARM CPUs. Therefore there's no expectation that LLD could have gotten it right.
LLVM/Clang were essentially born and stewarded by Apple; it's somewhat odd, to say the least, that they didn't patch LLD on day 1 with the Xcode release that supports the M1.
This assumes that LLD has been around much longer than it has.
Apple first sponsored development of LLVM, before clang or LLD existed. Then they sponsored clang development.
LLD came much later. I believe Rui created LLD at Google with an intention of linking Chrome on Windows. That was a very different use case than what Apple had in mind.
Eventually LLD abstracted the target object file format, could link ELF executables, and had a good design for furthering Mach-O support. Facebook is developing that in LLD, surely to link their iOS app much faster. Apple remains uninterested, from what I can tell.
Thank you so much for the work put in to documentation; written in kernel or blogged, and streamed. I've watched hours of your streams. I've been working on a bootloader and porting Linux to the Nintendo 3DS; I don't really know what I'm doing, but your streams clarified many questions I had about porting Linux to a new device.
It's not my project in the sense that I didn't start it, or do most of the work; I just enjoy contributing some fixes here and there and it's my first experience working on a bootloader. There's a ton of great documentation (from the community, and ARM) at https://www.3dbrew.org/wiki/Main_Page as well as existing code to reference from others in the community. So I'm still in awe of how marcan goes about finding out how the hardware works without documentation from Apple (which IIUC he has done before with the Wii and BootMii).
I'm hoping to put out a release soon, and maybe stream some development on https://www.twitch.tv/ndesaulniers, then pursue upstreaming what we have.
Hey Marcan, really impressed with the progress; it's been 100% worth the Patreon sponsorship. I first ran into you back in the Wii hacking days and it's awesome to see you're still doing cool reverse engineering stuff after all this time :).
How are things looking on the "primary job goal" accounting for GitHub sponsors as well? Think you'll hit it this summer?
After reading articles last year on Linus insinuating it would be "unlikely" that Linux would appear on the M1 [1] [2], I drank the kool-aid and took that at face value.
I guess I should have known better, just like nature, open source "finds a way".
> After reading articles last year on Linus insinuating it would be "unlikely" that Linux would appear on the M1 [...] just like nature, open source "finds a way".
Not necessarily.
Reverse engineering the unknown architecture and porting the Linux kernel to the new machine is impressive work that we should support and celebrate.
But making it a usable desktop or laptop computer (e.g. with functional power management, 3D acceleration and other drivers) is another matter. See Nvidia graphics: even after 10 years, some cards are still not fully supported by Nouveau due to the lack of documentation. I'm afraid that the Apple M1 is going to be stuck in a similar state for a long time - technically, Linux boots, but it's not a machine you'd want to use to browse the web, write code, or carry around.
There was a time when saying you didn't know how to do x on Linux forums got insults and RTFM replies, but saying that Linux was incapable of doing x got helpful instructions given in exquisite detail.
It did sound a bit like a taunt to enlist the community in figuring it out.
The late Pieter Hintjens of ZeroMQ (and more) openly admitted to using this hack if he needed to have work done. State that something is impossible and watch developers tumble over each other to get it done and prove you wrong.
> I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones. (...)
> Linus
> PS. Yes – it’s free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that’s all I have :-(.
it wasn't really kool-aid - while this is amazing progress, I think it's still a pretty long way to having M1 with Linux usable as a laptop. The GPU seems like it will be a big lift without documentation or support from Apple.
This is somewhat of a nit-pick, but that's a very incorrect way to use the idiom "drank the kool-aid".
You're using it here as a substitute for "I believed what he said", when in fact "Drank the kool-aid" means you are admitting you were brainwashed into believing a doomed idea to the point you're practically willing to die for it.
The followers of Elizabeth Holmes or Adam Neumann can be said to have "drank the kool-aid".
If you're not aware, it's a direct reference to the Jonestown mass suicide, so to me it comes off pretty jarring to see it used to refer to something as noncommittal and inconsequential as "I didn't disbelieve his statement".
There are dozens of ARM SoCs that have support merged into the kernel, but that does not mean that a barebones kernel will run well or with full hardware support, or that you can just boot a generic ARM ISO, either.
There's mainline support for ODROID boards, for example, but you're still dependent on the manufacturer for making sure new kernels actually work with the hardware and for new images. Once that manufacturer stops maintaining the kernel and images for their old hardware, either the community has to step up or you'll be running an increasingly out of date kernel and distro until you buy new hardware.
Will this have an impact on tools that depend on Linux, like Docker on Mac? Maybe Android emulation? If so, then I'm super excited for it.
I'm using an m1 macbook air right now and it's the first compelling mac laptop for me since 2015. It's absolutely amazing. But for any workload that uses docker, my linux desktop is streets ahead.
Last time I checked, the Android Emulator running on Apple Silicon still has issues and isn't functionally as good as running it on an Intel Mac.
There is no point in getting the M1 MacBook (which is last year's model) since they will either announce an M1X or even an M2 MacBook this year which will most certainly be more performant and optimised than the last generation.
This is one of the several reasons why I stayed away from the first generation of Apple Silicon Macs; even now, the software ecosystem is still unoptimised for it.
> There is no point in getting the M1 MacBook since they will either announce an M1X or even an M2 MacBook this year
I've been hearing something similar when I got a PS4 in summer 2019. PS5 is still barely available, and I've had almost 2 years of gaming fun. To be fair, the PS4 didn't have as many rough edges back then. But "no point" is a generalization that isn't true for many people.
Except that the difference is that I explicitly stated 'first generation of Apple Silicon Macs'.
Apple is known to update their MacBook lineup every year, so I am in no hurry to upgrade my Intel MacBook until they eventually 'start' the process of removing Intel support, by which point a third- or fourth-generation Apple Silicon Mac will be available.
Plenty of time to wait and use what already works, rather than jumping on early tech and then finding out that the software support is still WIP or in preview state.
> A tool that exists is far more useful than a tool that does not exist.
A tool that already works (an Intel Mac) is far more useful and much better than buying a tool that exists but doesn't work with your existing setup, or requires more tweaks and more hacks for it to work (an M1 Mac).
Or as they used to say in the olden days: If it ain't broke, don't fix it.
(In the case for Apple which release new product generations every year, always skip the first generation)
Downvoters: So this is not true? Time to cut through the hype.
From [0] (From Preview)

Known issues:

- Webview doesn't work
- No sound
- No device skins
- Video codecs not working
- 32 bit ARM apps won't work
- Graphical glitches in some Vulkan apps
- Popup on startup about not being able to find the ADB path (ADB will still notice the emulator if you have it installed though)
- When building, it may be faster to start then cancel the Python triggered build and then reissue ninja -C objs install/strip versus letting the Python triggered build finish.
This is not even available in the stable release of Android Studio and is still not production ready. What is the point of using this when it clearly says it has missing functionality and is less functional than the Android emulator on Intel Macs?
Once again, another thing the M1 MacBook early-adopter hype squad omitted: they are still suffering from sunk costs, while I'll get a second- or third-generation Apple Silicon MacBook which will be much better optimised, and by then the ecosystem will be better supported out of the box.
Saying "there is no point in getting an M1 MacBook" is probably what's garnering you the downvotes.
It's rough around the edges and whether or not your use case is successful is definitely very specific, but for my personal hobby projects (ECU reverse engineering), it's been an excellent tool. Ghidra worked out-of-the-box in Rosetta and natively with a few simple build system tweaks, tricore-qemu was trivial to build, Mono is working, VSCode is nicer than ever, and CrossOver Wine (in Rosetta) even works great for running Windows-only toolchains. The battery life and performance-to-price ratio on the Air have been amazing. And, some things like GMP are _ridiculously_ fast - a brute-force RNG -> RSA attack I developed is actually faster on the M1 MacBook Air than on my not-that-old Ryzen desktop.
I agree that the hype is tiring, and yes, some things like Android development don't work correctly yet - but not everyone is an Android developer, and dismissing the first-gen Apple Silicon products in a blanket way goes against the experience of many people. Thus, downvotes.
The secret to the lightweight nature of Docker is that it just uses various namespacing, cgroup etc functionality provided by Linux with no virtualized OS kernel or hw virtualization mechanisms involved.
(Yep you need ARM Docker images when running on ARM of course)
Do we really expect to see many Docker images published that are compiled for ARM? Sure, I expect to see base OS images, but I'd find it hard to believe that many people are going to take the time to cross-build and publish an arm64 version of their container image alongside their x86_64 version.
I guess to some extent that might be generally useful, what with things like AWS's Graviton ARM processors, but I expect it'll take a while.
Most people build their own images using CI/CD to their own registries when using containers in production [1]. You don't really otherwise know what is in them. And for on-laptop hacking it's just a docker build command.
[1] Especially now that the public Docker Hub is IP rate limited to uselessness.
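For what it's worth, the cross-build mentioned above is typically a single buildx invocation these days (a sketch, not a recommendation; the image tag is a placeholder, and it assumes buildx with QEMU binfmt emulation is set up):

```shell
# Build one multi-arch manifest and push both images under a single tag
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t example/app:latest \
  --push .
```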
Anecdotally, we already do see a substantial number. Far from 100%, but many of the most popular images on Docker Hub are already multi-arch. The most notable exception is the Docker-maintained MySQL image, but Oracle maintains another which works fine on ARM.
Wouldn't Docker still need a VM for Linux containers on any non-Linux OS? My understanding is that it uses the host's kernel, so running it on macOS would always require a VM.
Some of the Apple M1 changes went into the tty/serial driver tree that GregKH maintains. The actual platform code requires those changes, so we based the branch on top of that, and the top-level merge commit contains all of these.
This is a 60-commit branch, which had to decouple a lot of different bits of ARM-platform code. Specifically, they use a totally different boot method, have a weird and special power-management system, and other stuff I don't know. See this article: https://asahilinux.org/2021/03/progress-report-january-febru...
I noticed that too. If you scroll through the entire diff, it looks like most of it is the removal of those seemingly unrelated things.
I also see some changes to what looks like common arm64 code, which isn't guarded to check for Apple or non-Apple; it seemed strange to me that those changes would be ok.
I wonder if both of those things are just changes from an unrelated merge that are showing up in this diff for no reason.
Yeah, all this is good, not bad. This is an extremely high quality port in that general-purpose yak shaves were taken as necessary.
This is the exact opposite of your e.g. android phone "fork and abandon" where someone does the absolute minimal and ugliest hacks that upstream would never accept.
Actually, it looks as though those changes were unrelated! They were merged in from the tty/tty-next tree, which included some changes required to get the M1 UART working.
It would be wonderful if someday some Pine64 device could take advantage of some of the stuff here. They will never get Apple chips of course, but I mean some of the general-purpose cleanups.
Probably irrelevant, since the kernel already has support for aarch64 for a long time already, and Apple is unlikely to license or sell their SoC to other OEMs.
I sincerely hope that one day, Linux GUI land will receive similar amounts of work as the kernel.
While it's great that we can boot it on pretty much every device out there, I would much prefer if it was a viable Windows alternative on a few devices instead.
Just yesterday, I plugged an external screen into my Ubuntu laptop. And then I had to duck-duck for like an hour to figure out how to make the fonts large enough to be readable again.
And why doesn't the file browser reliably show me all the files in a folder? Sometimes, the first file in alphabetical order is just omitted.
I sincerely want to like Linux for Desktop as much as I love it on servers. But the fact is, OSX and Windows just make you so much more productive, because things just work.
Anecdotal I know, but I very recently got a macbook air after years of practically exclusively using Linux and the opinionated user interface drives me crazy.
Here's a few pain points:
- Scroll direction can only be set globally so either touchpad or mouse scrolling feels awkward for me
- You need an extra click on another window before you can interact with it
- No focus-follows-mouse (there used to be 3rd party plugins but those seem to be gone now)
- While basic split screen tiling works and you can set up a global kb shortcut to move windows to left/right of the screen, Firefox seems to completely ignore those
- If you're running multiple displays macos for some reason always picks the most awkward display to open new windows in
- I really miss being able to grab and resize windows using a modifier key like in XFCE
I'm sure I'd get used to many of those, but the lack of focus-follows-mouse and easy window manipulation are the big ones for me, and I don't think that'll change much. Actually I think those were already pain points when I last tried out Mac OS in the late PPC days.
There's of course plenty of things that I like, especially the tight integration with my iphone and trackpad gestures (although I'm not sure how ergonomic using the trackpad all day is).
I was hoping it could become my main driver at home, but now I ordered a Ryzen mini PC for that.
edit: Oh, I completely forgot the biggest deal breaker for me at the moment. As of now the GUI version of Emacs (either downloaded through brew or from https://emacsformacosx.com/) doesn't work. MacOS simply refuses to start the binary.
> - You need an extra click on another window before you can interact with it
> - No focus-follows-mouse (there used to be 3rd party plugins but those seem to be gone now)
> - While basic split screen tiling works and you can set up a global kb shortcut to move windows to left/right of the screen, Firefox seems to completely ignore those
> - If you're running multiple displays macos for some reason always picks the most awkward display to open new windows in
> - I really miss being able to grab and resize windows using a modifier key like in XFCE
I don't think I do, I'm a fairly recent convert to macOS. I'm running Big Sur and I can interact with any window the mouse cursor is over. I can scroll any window I'm hovering over, and with a single click can perform an action.
No, but I understand the rationale. A button click is an input much like typing. Plus, when I want to give a window focus, I'd rather not look for a non-button to click.
But, having clicks work on hover instead of after focus is fine, and simply has a different set of trade offs.
Not Option, but Command. If you hold down Option and click in another window, the current window hides. You can also hold down Command and drag a window that doesn't have focus.
You can also switch focus to a window without raising it by holding down Control+Option and clicking the window.
At any rate, a lot of these things can be chalked up to different philosophies, I think. Personally, I really don't like focus follows mouse; when I was using FreeBSD and Linux many years ago, I would try it and then end up disabling it after a short while.
It is worth noting that Terminal has a hidden preference for focus-follows-mouse:
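The key usually cited for this is the following (an assumption on my part that it still works on current macOS; Terminal needs a restart afterwards):

```shell
defaults write com.apple.Terminal FocusFollowsMouse -bool true
```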
If a Linux user suggested that people download external 3rd party software to make their desktop experience nicer, they'd get absolutely blasted and be told that users shouldn't be expected to tweak stuff and download 3rd party tools. Why is this acceptable for macOS users?
This makes no sense. The entire Linux ecosystem is nearly all 3rd party software. And people routinely recommend not only installing a tool, but installing entirely different window managers.
That's a red herring. These 3rd party tweaks don't make the Mac desktop experience objectively "nicer", they are tweaks to make it more familiar to people used to other, non-Mac desktop environments.
Personally I consider the stock macOS desktop experience to be perfectly fine.
There's truth to this. After years of using macOS as my primary OS, both Windows and Linux w/various DEs become progressively more irritating the more windows are open because they both expect micromanagement, be it from the user, or automated as with a tiling WM.
Under macOS, windows just kind of exist as they want to be and pile up until you intentionally change them, and the environment doesn't push you to manage it, so you end up not putting much thought into anything but the 1-3 windows you're using at that exact moment.
The mac approach taps a bit into spatial memory too, which makes auto-tiling like with i3 irritating because then windows no longer have an inherent size or position — it all continually changes as windows open and close unless I start maintaining window sets, so at the end of the day I'm still putting more thought into management than I would've under macOS.
Because of this it's long been a desire of mine to have a Linux DE that closely mimics macOS. Some will point to GNOME, but it's more like iPadOS if it were adapted to the desktop. Elementary gets closer but still misses the mark. It would be a neat hobby project to take up but figuring out how to get started is rough, especially with all Wayland example WMs and tutorials being for tiling instead of traditional floating.
Gnome is still by some distance the most popular user interface on Linux, and is also very strongly opinionated and appears to lack customization options [1]. The project's "our way or the highway" approach is frequently vocally criticized and compared to Apple's approach, but as a whole seems to be appreciated well enough. I wouldn't say the Linux community as a whole are assholes about this.
The fact that many online Linux users take offense at a lack of customization is not really something to worry about in my opinion. I used to customize Linux installs endlessly, but only truly found zen when I stopped worrying and committed to sticking with Fedora's default install.
[1]: Besides the point but Gnome is arguably far and away the most customizable desktop shell on Linux, as its extension "mechanism" consists of literally overwriting the JS code that powers its shell interface. For better and worse.
And to be even fairer, user experience consistency isn't one of Linux's selling points. While there is some limited consistency within individual graphical shells, the Linux ecosystem naturally favours a diversity of opinions instead.
Because Apple interfaces are highly opinionated and generally fairly consistent, learning to do things the Apple way is a valid life choice for many people.
macOS does have mouse-following interaction in some situations. If you move the mouse over a background window, you can scroll it with the trackpad/wheel. You can't click on things or type in it because that's dangerous and would make people afraid to accidentally touch the mouse in case they move it.
Of all the things, accusing GNU/Linux of an opinionated GUI just doesn't make sense at all to me. There's like 30+ different desktop environments you can use. So many that there's a need for articles like "10 best Linux DEs of 2021", and those don't even scratch the surface anyway.
I really think we're getting there. Wayland is a much saner, more consistent experience than X11 ever was. I use Fedora and with Gnome and Mutter you get a solid experience. Not everyone likes Gnome but I find it to be generally pretty good and productive to work in.
There are some gaps in Wayland - screen sharing being the obvious one. But these gaps are in the process of being closed (pipewire being part of the picture). Give it another year and I believe that Wayland will be a solid solution for all the major use cases (multi-monitor, high-dpi, screen sharing).
My other issue with Linux as a workstation solution is OOM handling, and even that is being worked on!
Nvidia really needs to get on Wayland already. I tried running KDE on Wayland+Nvidia a few months ago and it was literally unusable; I could not put a panel down without bugs. I'm running mixed DPI with the Nvidia driver and it's just easiest to make separate X screens, despite all the downsides that brings :/.
I'm going to be sad that a lot of little WMs are probably never going to make the jump. I'm not sure Wayland is ever going to get that many.
There are lots of things you've learnt for Windows and Mac that make you productive. I can say this with certainty as someone who hasn't used Windows for almost 15 years and has never used MacOS. When I come into contact with those systems I have no idea what's going on. Windows has a million buttons all over the place and MacOS has really weird, unintuitive behaviour and "gestures" that you need to just know, somehow.
The best way is just to take the plunge. Immerse yourself by removing other OSes from your own PC. In a few weeks you'll be flying.
I've written this before, but it's not going to happen without a large number of apps that people actually want and can use. And that's not going to happen unless there's all the necessary tooling, especially for UI. mac/iOS and Windows have great support for building user-oriented applications. And then it still needs a centralized way to monetize the development work. And to get all that, you need a ton of paying users, not a bunch of freeloading tinkerers. It's simply not going to happen any time soon.
There's already GTK and Qt. What more do you want?
> And then it still needs a centralized way to monetize the development work.
I disagree: once you put money into the equation, the quality of packages will go down as programmers try to shake as much money out of users as they can. Compare programs like MATLAB vs GNU Octave, or Photoshop vs GIMP.
> And to get all that, you need a ton of paying users, not a bunch of freeloading tinkerers.
Do you really? I mean you are basically saying that the only way linux can compete is if it sacrifices most of what makes it good in the first place. I really doubt that users think "hmm today i'm going to pay for a bunch of crappy apps that each do some trivial thing for me while uploading my data to the cloud, then watch some ads, and then wade through a bunch of graphical menus to disable this one stupid feature I don't like, i love using iWinux this is so much better than other OS's".
> There's already GTK and Qt. What more do you want?
There's GTK2, GTK3, vanilla Qt, KDE's Qt with additional tooling... The problem on Linux is the lack of integration between these toolkits. The fragmentation of the GNU/Linux ecosystem precludes a unified API like HaikuOS's kits. (https://www.haiku-os.org/docs/api/)
You have a skewed view of what it’s like on Windows or macOS. I certainly haven’t paid for a bunch of crappy, trivial apps — the utilities I use are all free (and don’t upload my data to the cloud, nor make me watch ads).
And are you saying that gimp is a superior product to photoshop? It’s not like you can’t get gimp or octave on non-free OSes, but for some (good) reasons illustrators and scientists and engineers choose to pay for photoshop and matlab.
Hmm, I’ve had the exact opposite experience. I recently logged on to an (admittedly older) Mac machine and it took roughly 3 minutes to fully log in. Then I spent some significant time fumbling around a clumsy GUI like a pleb.

After that, cd’ing around my minimal - yet blazing fast - ThinkPad w/ Void Linux felt nearly exhilarating. I didn’t have to touch the mouse for many minutes at a time.
One of the issues is that most of the money comes from big corporate sponsors that are nearly exclusively going to be running Linux on a server with no GUI.
These businesses have no interest in trying to develop a desktop UI when Mac OS and Windows work fine already and someone else is doing all the work for it already.
It's not just corporate sponsors though. Many Linux developers and most hardcore Linux nerds have an irrational disdain for GUIs. They think that Vim is the best thing since sliced bread and if you don't want to use the terminal to set up WiFi then you're a noob that belongs on Windows.
Not everyone is like that but that's definitely the prevailing attitude. A good example is the regular hate that SystemD receives. Nobody wants a modern system that supports hotplugging and GUI configuration. They think you should edit init.rc files by hand and reboot your desktop (laptop? what's that?) whenever you plug something in.
The reason people like Vim isn't because they hate GUI's. I use a Mac because I like the GUI, but I still do most of my development in Vim because I'm more productive in it. Most of these hardcore Vim (or Emacs) users stay that way because they've built up years of muscle memory and customizations that makes it the most productive editor for them. I also use VS Code and for some stuff I work faster in VS Code. But there's a good argument that a keyboard driven UX is faster than a mouse driven one since you can input things so much faster than with a mouse.
I personally hate SystemD also, but that's more because it makes it such a pain to do stuff like look at logs. I never remember the correct journalctl command. Just let me cat/tail a file like everything else. I don't know anyone who hates SystemD because it allows GUI configuration. It's just that SystemD made some other changes that people don't like.
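For the record, the handful of journalctl invocations that roughly replace the old cat/tail habits (the unit name here is just a placeholder):

```shell
journalctl -u nginx.service              # roughly: cat the log for one unit
journalctl -u nginx.service -f           # roughly: tail -f it
journalctl --since "1 hour ago" -p err   # time- and priority-filtered view
```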
Yeah, I wanted to replace my aging MacBook Air with a Linux box, but each and every PC laptop trackpad was such a step down that I got hesitant. Additionally, PC manufacturers have been slow to adopt modern IO like Thunderbolt (Dell, for example, had exactly one half-speed TB port for years on their flagship line where Apple had 4 full-speed).
Now I've capitulated and am just waiting for the next iteration of the Apple silicon Macbooks.
It is sad, but due to not generating sales at the same scale as MS or Apple, Linux lacks the development power to keep up.
They've more than redeemed themselves for me with their 2020 keyboards though. In direct comparison between my current 2020 Intel 13", my previous late 2013 13" and a work-issued 2018-ish 13" butterfly MBP, the 2020 wins hands down. Feels mostly like the late 2013, but slightly less mushy, not that I'd ever considered the late 2013 one mushy, but in direct comparison, it's slightly mushier, if that makes sense.
That said, I know at least one person who genuinely prefers the butterfly mechanism, there must be dozens such people out there.
Edit: The late 2013 one got a new keyboard in early 2020 as part of an inflated-battery exchange, so it's not mushy due to wear.
It's really a shame. I'm in the same boat as you. I'd buy their machines instantly if I knew I could install my distro of choice. Last I read, Windows-via-Bootcamp was basically dead, and despite the efforts of e.g. the t2linux.org folks, it seems it's quite difficult to get Linux up and running 100%.
Yep, I've been looking for a linux laptop for a while that is actually compelling compared to my mac book pro. It seems just about everything is a downgrade in terms of shitty trackpads, lower resolution screens, memory, etc. Linux UI is not a problem for me; I can deal with that. The hardware is a problem though.
Here's what I want:
- Something that is faster than the last Apple laptop I loved, which I bought in 2012. I currently have a 2017 model which is barely faster and has the famous cluster fuck of a keyboard that I loathe with a passion. The 2012 model would have been a MBP 15" with a quad core i7, 16GB, and 256GB. Anything less than those specs is pathetically, woefully inadequate. Anything not at least 2-8x better than that is still unimpressive given that we've had 4 iterations of Moore's law since then. Right now, I'd probably settle for 64GB RAM and some nice 8-16 core CPU. I'd take more if I could. Those AMD chips look pretty nice. Core i9 also not too bad.
- 13-15" HiDPI, non-negotiable. I will never downgrade to anything less than a 4K screen. I'm writing this on an iMac with a 5K screen. It's from 2014. My current laptop has a retina screen. It's four years old. It's 2021. I don't want/need a screen that was outdated and underwhelming 6 years ago.
- A trackpad that doesn't suck. I don't want/need physical mouse buttons. I'm not ever going to use an external mouse. I want something I can use that doesn't actually suck. Apple nailed that more than a decade ago. Some manufacturers come close to this platonic ideal of a trackpad. But most don't even pretend to bother. Also, Linux support for this seems to be a perpetual work in progress.
- A keyboard that is comfortable to use. Seems obvious but again hard to get. Apple messed this up and is in the process of correcting that mistake. It matters. I use laptops, a lot. My wrists matter to me. So do ergonomics. Or having an actual escape key (seriously, WTF Apple).
- Modern ports (USB-C, thunderbolt), bluetooth and wifi that work and with recent specs. I don't actually care about legacy ports that much and certainly not about the extra space and weight that seems to imply. But I do care that my headphones connect and that I can plug in a fancy screen with high resolution and its own assortment of ports. I.e. a thunderbolt screen that I can daisy chain to other screens and my laptop.
- A sensible design and weight. I don't want a ten kilo brick in my backpack.
- Lap friendly cooling. I actually use my laptop on my lap. I don't want it leaving scorch marks on my pants. Nor do I want it to sound like a vacuum cleaner.
What I described is a pro laptop. High end requirements. I know of a handful of models that come close to this. All of these have issues with Linux support. Key components that don't work or are not fully supported or require lots of hacks to get working.
Really, 4K on a 13" screen? 64GB LP-DDR in a small and light laptop? 16 cores again in a small and light form factor?
That's going to be a long wait. Most consumers don't need more than 4 cores and 16GB and that's where the mass market is. Prosumer models don't reach that high and pro workstations are big and clunky because companies would rather buy a stationary workstation and a light laptop instead of paying $6k or whatever such a beast would cost. With ubiquitous networking, the trend plays against you, I think.
I had to buy a laptop due to covid and splashed out on a Dell Precision 7750, 4K screen, WWAN, aftermarket 128GB RAM and SSDs. It's big but I wouldn't describe it as clunky.
Having previously daily driven an XPS, it's not even close.
It's hard to emphasize this enough but the MacBook trackpad is so good that even when I have a mouse attached to the laptop, I use the trackpad.
The gestures are indispensable for my workflow. Excluding the touchbar, the entire workflow is far more ergonomic than anything I have seen on the market.
I sincerely wish someone would exactly clone the MacBook pro 13 and leave it as an open platform for whatever OS you want to run.
Something like a Thinkpad X1 Extreme? Or maybe have a look at the P series - I vaguely remember something like the P51s might fit your bill - or its modern equivalent. They have a better keyboard than any previous macbook, up to a 4K screen, and up to 64GB RAM. Shame it doesn't have Ryzen like my 8-core X13 has. Should have all the ports. Thinkpads don't have quite the fit and finish that macbooks do - but they come from a different background/purpose - toughness rather than look/feel. However, more recent Thinkpads are looking smarter these days compared to the bricks of yore (which I liked BTW). My X13 has a magnesium under-chassis and a glass fibre lid.
Would someone familiar with the porting effort tell us what the major obstacles and breakthroughs were? It would also be interesting to hear what methods were used in the process and who the people behind the effort are.
Thank you in advance. You will find an appreciative audience here on HN for such engineering porn.
Macbooks make some of the good Linux laptops (the ones that have support, like older retina Macbooks). And the truth is the Macbook Air is one of the handful of ARM consumer laptops you can easily buy right now, even though ARM is the superior everyday computing platform to x86. What a weird time in personal computers.
Things have been spotty since well before the touchbar. I doubt anyone is running Linux reliably on any Macbook built after like 2014. There are still open issues with keyboard, touchpad, display drivers, brightness, sleep, nvme and lots more.
Yep. In 2018 I tried to run Linux on a 2016 MacBook Pro, and it was a pain. The keyboard/touchpad weren't supported without an out-of-tree driver, and audio and suspend-to-RAM didn't work at all. I hear things are better with that model now (I gave up in 2019 and got a Dell XPS13): keyboard/touchpad driver has been upstreamed and I believe there are patches to get audio working, but I don't know about suspend.
Yes. Tons. For a while you couldn't even install Linux on the internal SSD of T2 chip models. Keyboards and Trackpads don't work out of the box. Suspend and sound issues. Touch bar problems.
Yes, true, but Intel's integrated graphics has gotten a lot better in the last 5 years, and many laptops from that era end up with ~8GB of RAM (non-upgradeable in an Apple laptop), which is just not enough for what I do. (My current laptop has 16GB, and my next one will definitely have 32GB minimum.)
I have 2016 and 2018 Intel laptops (non-Apple, but similar in "class"), and it's surprising how much better the graphics is on the 2018 model.
I would love to try a laptop with AMD graphics, but the laptops I find with AMD kit in them generally don't meet my requirements for other reasons. And I'm forever scarred by mid-'00s experiences with Nvidia on Linux, so I refuse to buy Nvidia hardware.
I do some light gaming (not enough to justify a desktop/gaming PC), and even with 2010-era games I can't run them on 2018 Intel graphics with the quality settings above "medium", so I'll take the fastest Intel graphics I can get.
I have nVidia GPU in my current laptop - I agree this is a nightmare. I wanted to use it for some AI stuff, but it's pointless - too unstable so I am on an Intel GPU.
For a new laptop I am going to go full AMD, but nothing is in stock.
I am most worried about the fact that I start to rationalise getting a Mac Pro with M1. I don't do any gaming, and would use it solely for development.
If only Apple was more ethical then that would be a no brainer, but I don't want to fund a company that is anti-right to repair.
I very much experience a difference using Xcode, between my 4 core 2016 MBP and my 8 core 2019 MBP. Since both are base models, I think I disagree with your statement.
Is that true generally? I know my old (2008?) Macbook Pro supported Linux well, but I think that took a few years to get things like good trackpad, bluetooth, and wifi support. I remember getting a new one and it couldn't boot into the installer for most distros and still had driver issues (that might have been resolved).
Macs seem to support Linux a few years after release. They were usually a great choice when modern macOS got too slow for that model.
I also never found a good fit on the Windows side for a Linux laptop. I always ran Linux on a desktop. So maybe I would choose a Mac laptop to run Linux...
Linux support on consumer hardware generally depends on community contribution, so it will take time to get good support. However, in 2021 it's no longer necessary to buy a good laptop and struggle to put Linux on it. There are some great Linux options like the XPS Developer Edition or System76, and I know Thinkpads also have a Linux lineup now. These generally should have good Linux drivers for the hardware (though for certain OEMs it's not necessarily the case that the drivers will be open source). System76 is probably the go-to Linux laptop right now.
They didn't, and I disagree they should pay to get Linux porting effort going. Instead, they should publish info about M1 SoC that would make writing kernel extension or drivers for it much easier. That would make adapting any OS for M1 possible.
Because they save money - as they will not pay salaries and taxes on this effort and then when it matures they'll likely appropriate the solution and boast how M1 is great for developers. This should be illegal. In my country a person cannot work for a company for free, they need to receive at least a minimum wage. This port is essentially unpaid and untaxed work for Apple.
By that logic, any enthusiast who has, over the last ~30 years, reverse engineered and written a Linux driver for a random piece of hardware, should get paid by the manufacturer. That's just absurd.
I really doubt that Apple will latch onto the Linux porting effort at all. Their goal is to get people to buy their laptops and run macOS on them, and to lock developers in via Xcode and the Mac (and iOS) App Stores. Linux would be a distraction, at best.
This is very anecdotal but every single laptop I have bought in the last 10 years (4) worked perfectly out-of-the box with either Linux Mint or Ubuntu (which are the same on the inside). I only vaguely remember once I had to change some conf file for my trackpad to work after a dist update but that was it.
Not sure why everyone in this thread is so hesitant about Linux on a laptop? It literally just works (tm) for me and I haven’t used windows for 10 years and will never have a personal Mac.
Every single laptop my company bought me in the last 10 years (3+) had kernel/driver issues, especially with Ubuntu, since the kernel version was usually lower than that from other distros, like Fedora.
This includes mostly graphics driver issues (either simply Nvidia, or hybrids, or some exotic protocols such as DisplayLink), but also touchpad, power/suspend issues (e.g. wake-up failure), lack of fingerprint or smartcard reader drivers... Without mentioning certain manufacturers forcing you to use Windows simply to update docking station firmware drivers (like Dell does).
I still favor Linux whenever I can, but there is a real issue in several models, especially bleeding-edge powerful laptops from brands favored by businesses. Linux certification does improve things, but it often omits "optional" components such as fingerprint readers.
Overall, about 1-2 years after the new laptop model has been released, almost everything works fine, but when you're among the first users, some amount of time will be spent in forums trying to find solutions and workarounds for such issues.
On XPS 13 (2017) I watched this break due to a firmware update from Dell (something to do with sleep states)
Before the update:
Windows sleep was strange, it would sleep but drain battery like crazy. Linux sleep was perfect, I could have the laptop sleeping for days.
After the update:
Linux sleep started showing the exact problem as Windows now.
I did not dive much further because I switched jobs and got a ThinkPad at the new one which works perfect.
Semi on topic: I don't have one either but I would really like to, simply because then I could have a stronger account password.
Having to frequently use sudo restricts the length/complexity of the password since a longer one would simply be a pain in the ass to use. With a fingerprint reader this problem wouldn't exist.
I found it much more difficult. I had to move my swap partition into the LUKS partition so it was encrypted, resize it to be much bigger than the default, then figure out how to have the RESUME=disk-uuid boot flag enabled for EFI boot. I got it working but it was quite time consuming.
We see that kind of issue frequently and in the majority of cases it is not a Linux fault, but manufacturer firmware.
It is such a frequent issue I wrote an article 10 years ago that is still extremely relevant with simple instructions to enable a kernel work-around [0].
Specifically the problem is the ACPI DSDT (Differentiated System Description Table) - which is actually byte-code installed by the device manufacturer (but executed by the host OS) that handles device power states and enablement.
The problem is almost always DSDT methods are written to conditionally configure system hardware optimally ONLY when the host OS is a version of Windows as declared by the host OS's ACPI OSI (Operating System Identification).
When Linux boots only a minimal configuration is applied which very often has never been tested and thus devices fail to work reliably, especially for suspend/resume, in interesting ways.
Fortunately, Linux has a WORKAROUND that allows passing a 'fake' OSI on the kernel command-line in the form:
acpi_osi=! "acpi_osi=Windows XXXX"
The first instance (=!) clears all built-in kernel OSIs to avoid confusion. The second string sets the 'best' and only OSI value which has been found in the DSDT of that PC.
E.g: on my Lenovo E495 (AMD Ryzen 7 3700U) I have:
Microsoft Windows NT
Windows 2001
Windows 2001.1
Windows 2001 SP1
Windows 2001 SP2
Windows 2001 SP3
Windows 2006
Windows 2006 SP1
Windows 2009
Windows 2009
Windows 2012
Windows 2013
Windows 2015
Windows 2015
Windows 2016
So, on the basis that "Windows 2016" is the 'latest' and likely the OSI expected to enable all features optimally I use that. On Debian/Ubuntu I therefore have:
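As a sketch of what that drop-in looks like (the file name is an assumption; Debian/Ubuntu's grub-mkconfig sources *.cfg files from this directory):

```shell
# /etc/default/grub.d/acpi_osi.cfg
GRUB_CMDLINE_LINUX_DEFAULT="$GRUB_CMDLINE_LINUX_DEFAULT acpi_osi=! \"acpi_osi=Windows 2016\""
```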
Notice the escaped quote-marks inside the shell string since the argument contains a space.
(some folks might prefer to edit the package-shipped /etc/default/grub but I prefer to leave that virgin to avoid package-upgrade prompts when that file is replaced.)
After running sudo update-grub, /boot/grub/grub.cfg will have the acpi_osi= entries added to all the "linux ..." command lines.
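To see which OSI strings your own machine's DSDT checks for, the ACPICA tools can be used (package names vary by distro, e.g. acpica-tools, which is an assumption worth verifying):

```shell
sudo acpidump -b                                 # dump ACPI tables as binary .dat files
iasl -d dsdt.dat                                 # disassemble the DSDT into dsdt.dsl
grep -Eo '_OSI ?\("[^"]*"\)' dsdt.dsl | sort -u  # list the OSI strings it tests
```

The "Windows 2016"-style strings from such a listing are what you then pass via acpi_osi=.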
most features of my laptop don't work reliably (MSI GS65) if I don't use the acpi_osi line, why would that be in that case ? I'm on archlinux with pretty much always the latest kernel
What argument are you providing? In some cases if you pretend to be an older version of Windows then things will work better, since it's possible for the kernel to end up pretending to be a new version of Windows without all the relevant semantic changes in the drivers having been made.
Ok. It should make literally zero difference in that case. What problems do you see without it, and could you mail me dmesg for both cases? (mjgarrett59@googlemail.com)
That sounds out of date. Linux has reported itself as Windows via ACPI OSI and actually reports FALSE for "Linux" OSI query for well over a decade now.
Just tried this on my AMD laptop with an American Megatrends BIOS and it won't boot at all - it just freezes while initializing the EFI framebuffer device.
Clearly in some cases the manufacturer's overrides for Linux are required...
I'd recommend buying one of the new Laptops with actual, official LVFS support by the vendor. My new (Tiger Lake) Dell XPS 13 for example even has support for the fingerprint reader, face unlock etc. by default, and the new ThinkPads and many other High-End models come with Linux support and Linux pre-installed nowadays. They also have a proper hibernation toggle (Linux/Windows mode) in the BIOS so it "just works".
Are you somehow explicitly testing S3? S3 is obsolete on new hardware — it was replaced by S0ix. The kernel and userspace are supposed to be able to handle this.
>about 1-2 years after the new laptop model has been released, almost everything works fine
It depends on the category of device and what feature you want. I recently got an X1 Tablet (the 2-in-1 version of the X1 Carbon), and it barely works on Linux; there's a veritable zoo of inscrutable bugs that render the device very annoying to use (for example, Plasma's night mode disables when you re-attach the keyboard). It's clear that the FOSS ecosystem as a whole places little priority on touchscreeny things, either on the device end or the software end; the overlap between people hackish enough to fix weird bugs and people who find 2-in-1 tablets compelling enough to hack on is evidently smaller than with normal laptops.
This matches my experience with a 2-in-1 Thinkpad. Obscure bugs (WiFi driver periodically crashing when Thunderbolt is connected) still there after 2 years now.
I think the best bet is going with as much of a run off the mill device as possible. In my experience a separate tablet works much better than a 2-in-1 compromise anyway. When your sole input device is a touchscreen, UX is (unsurprisingly) massively better with a touch-first UI compared to a mouse-first UI.
The problem isn't the "compromise" - the problem is there is no touch-first UI on Linux. That's the point. I can detach the screen and use it as a "separate tablet", but it still sucks.
Android, on the other hand, is strictly improved by the addition of a hardware keyboard.
If you’re still looking, you could check out JingOS. I haven’t tried it personally but it seems to be basically an iPadOS desktop environment for Linux.
Also very anecdotal, but I've been running Linux on laptops for ~15 years, and have had many issues and difficulties. My first mistake was trying to run Linux on Mac laptops for the first 10 of those years.
(To be fair, my first Mac+Linux experience was on a PowerBook, which surprisingly worked rather well, though power management was terrible so battery life suffered. Once the Intel train started, it was... not great.)
In 2016 I bought a Razer Blade Stealth, and Linux ran perfectly on it; there was even 3rd-party support for custom things like the tweakable RGB keyboard backlight.
Like a fool, I didn't learn from my mistakes, and in 2018 I installed Linux on an employer-provided 2016 MacBook Pro. At the time, the keyboard and trackpad required an out-of-tree kernel driver (so I needed a USB keyboard and mouse to install), and audio and suspend-to-RAM didn't work (well, suspend was fine, but the disk drive wouldn't wake up on resume). My timing here was actually good, as not long before this, the NVMe drive itself didn't even work.
In 2019 I bought a Dell XPS 13, and aside from the fingerprint sensor, Linux again runs perfectly on it.
So, if you get yourself a laptop that is either specifically made to support Linux, or you've determined via research works well with Linux on it, then you should be fine. Otherwise, you'll probably have issues, unless you're lucky.
(Avoiding laptops with certain things, like Nvidia video and Broadcom WiFi, helps a lot.)
- Support for Optimus laptops with on-demand rendering offload, if one does the research and uses the correct env variables or tries the "Launch using Dedicated Graphics Card" menu item in GNOME.
- Some whispers of Nvidia finally using dmabuf in the next version of its drivers. With appropriate code/kernel module restructuring for GPL compatibility, I guess.
I feel like the pattern is that when you start to dig into people talking about their great experience with Linux on a laptop, you end up eventually getting them to admit that there are a lot of annoyances they’ve just resigned to living with
On some fronts it even feels like things have regressed. Trying to resize an encrypted partition is way, way too difficult. And it seems like GParted doesn't handle LUKS so you're back to manually typing block offsets on the command line.
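As a sketch of how manual that still is (device names, the partition number, and the ext4 filesystem are all assumptions), growing a LUKS volume after enlarging its partition takes three separate steps:

```shell
sudo parted /dev/nvme0n1 resizepart 3 100%  # grow the underlying partition
sudo cryptsetup resize cryptroot            # extend the mapped LUKS device to fill it
sudo resize2fs /dev/mapper/cryptroot        # finally grow the filesystem inside
```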
If you wanted to use a TPM to store the FDE passphrase, well, you have the patience of a saint.
Compare this to Mac or Windows where you click a single button and it's all setup, including TPM!
As much as I love linux for server environments, it remains a failure when it comes to modern desktop environments.
>As much as I love linux for server environments, it remains a failure when it comes to modern desktop environments.
On top of what you mentioned, software distribution on Linux is still a nightmare for normal users. Yea apt-get is great for us nerds, but Bob the Accountant has no desire to use a CLI. Linux will not be viable on the modern desktop until the user never has to see a CLI.
From a cursory read of the documentation on the OpenBSD site, it doesn't seem to support TPMs. Setting up encryption of a disk partition does seem easier though.
I feel the same way. It was the annoyances with Windows that I couldn't resign myself to just live with that pushed me to Linux on my desktop full time. For those unable to make the switch due to some critical application in their workflow, it's easy to see how they'd feel they have resigned themselves to dealing with Windows' annoyances.
OTOH, my work bought me a top of the line Macbook in 2020, and there's been many annoyances I've had to live with too, including strange bugs that Apple never seems to fix.
I also have an older Macbook Air, and when I tried to auto-update it to the latest MacOS, the update process corrupted my partition and lost all my files (thankfully backed up in the cloud).
Also had issues with Windows Update being broken out of the box when I tried to set up a PC for my mom with Windows 8. Seriously, I needed to manually download an update on a thumb drive to fix windows update. Like, how did this OS release ever pass QA?
I would honestly say Linux on the desktop and on laptops is pretty good in my experience, and if it had just a little more support from hardware vendors it would be near-perfect. I would definitely say that compared to Windows, the Linux experience is a lot better.
As opposed to basic things (like display drivers, hibernation, bluetooth, fingerprint scanners, etc.) working fine on both.
Every OS will add annoyances and bugs on top. But it's on Linux where people usually have to fight/configure/trial-and-error to get basic things working more often than Windows and macOS.
And this is not some Linux hater notion. It has been complained about by people like ESR, Linus, and Miguel de Icaza. If the original "bazaar" proponent, the founder of the OS, and the original creator of the still major desktop environment have issues with ease-of-use and just-works, who are we to say otherwise?
> As opposed to basic things (like display drivers, hibernation, bluetooth, fingerprint scanners, etc.) working fine on both.
I have a nice list of issues that contradicts that statement. For example, M1 macs like to kernel panic for some reason.
> But it's on Linux where people usually have to fight/configure/trial-and-error to get basic things working more often than Windows and macOS.
Depends what you are trying to achieve. If you were trying to hackintosh some random ASUS+AMD laptop, your experience would not be nice either. Similarly, when you are trying to use random crap (that's a technical term) where nobody ever did any integration with Linux, the experience will be bad (that someone doing the integration is going to be you). If you use either a supported machine model (where the machine vendor did all the work), or a machine that's very near to reference designs (where the chip vendors did the work), your experience will be much better.
Seconded, but I believe there's more to it than just linux being "bad". I believe the greater problems are:
1. Poor discoverability and ease of use - if you have an issue or a need, it's often really hard to find out how to solve it. Even if you do manage to find a solution, often it's not quite plug-and-play and you have to mess around in configs or compile something yourself in order to get it to work. Fun for college students, but a massive pain for everyone else.
2. Hardware drivers: 50% of my issues with linux come from nvidia being an ass. Fuck you, nvidia. Likewise with other devices that often only get third-party drivers which aren't good enough or require more wrangling, see problem 1.
A2DP doesn't work simultaneously with HFP (i.e. "with the (bluetooth) mic on"). That's not a Linux limitation, that's a Bluetooth limitation, and other OSes - Android, MacOS, iOS, Windows - behave exactly the same.
Sharing single displays or specific windows on Wayland is easy - Chrome and Firefox, or app like OBS can do it; if your conference app cannot do it, ask the vendor why.
I have a fine experience on my Dell XPS laptop, it doesn't come with an nVidia card but know it would be a PITA if it did.
Better than my colleagues who use a Macbook and can't use DisplayPort correctly, can't plug a cable into one side without it ramping up the CPU, can't use their keyboards after a while, etc.
So we don't exclude them to make a point, that's also true for Mac and PC laptops.
My XPS 15 wifi won't wake automatically after sleep on Windows. Wifi works perfectly on Linux, but there are things that don't work with the touchscreen in tablet mode on Fedora that I've been told work on Ubuntu.
My T2 MacBooks are little nightmares sometimes, fans screaming during video meetings and crashing and draining battery instead of sleeping, and I still haven't seen a compelling reason to give up my 2015 13". If the second gen M-series solves the weirder problems, maybe.
> So we don't exclude them to make a point, that's also true for Mac and PC laptops.
The 2016-2019 MBPs have been ripped on by everyone, Mac fan or not. I'm not sure if they will go down as the worst Macs of all time, but it might be the longest stretch of bad Macs in the history of the Mac.
With the M1 that time appears to be over - finally.
Not my experience at all. Thinkpads are the ones with the best support, but there are always problems with high-DPI screens and plugging in external monitors, suspending sometimes works and sometimes doesn't, the battery drains really quickly, and the trackpad just barely works (gestures hardly work).
I feel the only people that say Linux works great on a laptop say so because 1) it manages to boot and 2) never used a MacBook Pro or similar so they don't know what something working well means.
I'd love to use a linux laptop and I've used it many years in the past, but not going back this year. It is not ready yet for my expectations.
I've used Linux mostly on IBM/Lenovo ThinkPads and Dell Latitudes. Overall I've had great success. Drivers work, suspend and hibernation work, plugging in monitors just works, wifi works, and battery life is decent enough.
My current benchmark for laptop integration is a Dell XPS 13 2-in-1 that my previous employer bought me with Windows 10. It had a high-DPI screen and there were so many bugs it wasn't even funny. The touchpad was horrible to use. Connecting and disconnecting external monitors was a fun adventure not unlike playing roulette to see if the big presentation you're about to make will ever show on the projector.
Overall, I'd say my Windows and Linux experiences on laptops have been about equal in terms of frustration and annoyance. I haven't had a Mac since like 2007, when I got rid of my G4 PowerBook, which was probably the most polished laptop I've ever used in terms of hardware/software just-working-ness.
Windows 10 is a mess. It makes Linux on a laptop much easier to bear :)
I think the real advantage of Apple is that they're building both the hardware and the software.
I'm very skeptical we will ever see such good integration in the Linux or Windows world, because there isn't any single entity doing both.
My ideal situation would be a Macbook Pro running linux to the perfection, or just having MacOS open source. I don't think I'm going to see any of this in my lifetime.
>Thinkpads are the one with the best support, but there are always problems with high DPI screens
Had no problems plugging in my 4K LG CX at 1440p@120Hz, which is kinda more than I managed to do with a Mac.
>Never used a MacBook Pro or similar so they don't know what something working well means
Nope. Used three different Macs over the course of 4 years because three employers provided them. I hated those machines, especially the constant overheating issues when using external monitors. Maybe the M1 fixed that, but they still won't offer it in the 16" ones.
Also hated the UI with burning passion. Only thing that made it usable is the fact that maximizing window switches the window to a different "workspace". If only those were numbered, persistently assigned to monitors and windows could be configured to only spawn on particular workspaces it would be pretty nice though.
One thing I really liked though is retina display on two later ones.
> 2) Never used a MacBook Pro or similar so they don't know what something working well means.
On the contrary; if you use all three systems, you realize that all of them have quirks. You just got used to them, so you don't get disturbed by them, but the quirks of the system you don't use regularly will get noticed quickly.
For example, only with the above-mentioned MacBook Pro have I experienced a situation where I'm connected to the wifi, until suddenly I'm not, all network packets go to null, and to get connected again I have to reboot the machine. Or the machine randomly not waking up from sleep when connected to a TB dock. Or - new on the M1 - the machine kernel panicking, the menu bar getting non-responsive, or not recognizing the ethernet adapter in the TB dock at all.
The experience has dramatically improved since I first started using Linux with Ubuntu Dapper. I have had very good success as well. Having said that, I have run into hibernation and trackpad issues with Asus devices. Optimus is also a right pain in the ass.
Your best bet is a Dell laptop with an integrated Intel gpu chip.
Can confirm. Linux "just works" now to the extent that it's boring. I remember when I was a teen the struggle of getting things like WiFi to work and the joy when you finally did it. For years now I've been running Gentoo and it works better than most distros did back then!
The manufacturers of the laptops you mention (HP, Lenovo, Dell, Acer, ASUS etc.) all directly support Linux for their hardware by publishing SDKs, developing drivers, pushing updates to the kernel and more. Some of these companies even sell Linux laptops themselves. Apple is a very different company.
The short answer is that this is a different beast. Much of the hardware in Macs before the M1 was commodity hardware. Now it is all bespoke and undocumented. This is a big undertaking.
I mean, it's really just a case of being back to the good ol' days when Macs had PowerPC (or before then, m68k) processors. So if anything, I'd expect more success with this in the long run than with commodity x86 hardware, since it's a more-or-less new architecture rather than having to adapt an existing architecture to Apple-specific oddities.
But yeah, short-term there's a bit of a barrier to entry. Once it's overcome, though, this might be what pushes me to voluntarily buy any piece of Apple hardware made after 2007.
I don’t think so. The CPU itself isn’t the big problem here. You can see basic support is already there. The challenge is all of the surrounding hardware.
I am not sure why you would expect better success with this processor than we have had with x86? Most PC hardware is directly supported by the OEM on Linux.
He's not asking why it's hard to port Linux to the M1 (and it's never been easy, even before the M1), but why people in this thread have hesitations about Linux on laptops in general.
Not hesitant, just personal preference. Probably like that for a lot of people here. I used Linux on desktop (later, laptop) for a long time, since the late 90s - went through so many phases, "not enough RAM to run X", barebones X, "emacs for everything", enlightenment, gnome desktop, etc.
At some point the churn got to me and I stopped customising my computer environments. Just use the defaults. By this time OSX was a "real UNIX" and I hopped over. No regrets. It does what I need it to do, which is still mostly to run a terminal and a browser.
I will say that reliably turning on and off when I close the lid is a great quality of life improvement.
The Ubuntu bluetooth stack has been misconfigured for a long time, and it still is (specifically, the pulseaudio bluetooth module is misnamed; there is an open bug). I use Ubuntu MATE, but I think this applies to the other Ubuntu distros as well.
It also had other misconfigurations in the past, and anyway, the bluetooth stack has always been poor quality (nowadays it's much better, though).
Even the Dell XPS, which is supposed to be perfectly compatible, has minor niggles (in other areas).
So, Ubuntu/Linux works acceptably... yes, but perfectly, definitely not.
I was recently investigating which laptop I could buy to be able to use Linux without any issues. I checked the Lenovo Legion 5 with an AMD processor. It looks like a good machine for the price, but the touchpad won't work out of the box on the latest stable Ubuntu. A kernel upgrade is needed (or the Pop!_OS distro, which already has the needed patch).
For me a kernel update doesn't sound like a big deal - at least I know that something like a kernel exists in Linux. The point is that the average computer user these days doesn't have such knowledge.
That was just one example. Many people can give more and more examples of various issues. Some edge cases with routers provided by ISPs. With wifi repeaters/boosters. With printers. DLNA equipment connectivity (I had this one too - again, after googling I installed some software and it started to work, but the average user... and so on).
Typically older equipment and standard peripherals work ok. But people use all kind of stuff.
If you want to use Linux on laptops, the best option is Fedora. Of all the mainstream dependable Linux distros it has the most up to date Kernel and other bits (like Gnome). This means that its support for hardware is usually the best. On Lenovo laptops with Intel graphics everything works perfectly out of the box with no issues (all the way down to fingerprint readers etc.).
I also find it to be stable and a generally good distro, but of course that's more subjective.
It helps that some ThinkPads do ship with Fedora out of the box. Lenovo worked with Red Hat to have the support for these models upstreamed to the respective projects.
If you don't need discrete GPU, check out Intel Dell with Linux support. I recently bought 3410 with pre-installed Ubuntu and it works flawlessly, both with Ubuntu and with latest Fedora.
The touchpad is not MacBook-level, though. But maybe the XPS is better.
I have a System76 laptop that comes preinstalled with Linux.
Everything graphics-related is crap compared to Windows and macOS: from Optimus not working, to all kinds of high-DPI issues, to how easy it is to completely break things with a bad driver update.
It's not really Linux's fault, but it affects the user experience for sure.
For example, the microphone of the X1 Carbon 7th generation still doesn't work out of the box. You need to blacklist a handful of kernel modules and edit your PulseAudio configuration to force the loading of an ALSA source hooked to the correct device ID.
The Arch wiki has an article cataloguing issues and workarounds (with various degrees of success) for every generation of the X1.
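For anyone hitting this, the workaround looks roughly like the following. This is a sketch: the exact module names and the ALSA device index are assumptions that vary by generation, so check the Arch wiki article and `arecord -l` for your machine before copying anything:

```conf
# /etc/modprobe.d/x1-mic.conf -- sketch; module names differ per model
blacklist snd_soc_skl
blacklist snd_soc_dmic

# /etc/pulse/default.pa -- force-load an ALSA source on the right device;
# hw:1,6 is a placeholder index, find yours with `arecord -l`
load-module module-alsa-source device=hw:1,6 source_name=internal_mic
```

After a reboot (so the blacklist takes effect) the forced source should show up as an input device in the PulseAudio mixer.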
It really depends on what you buy / how recent it is. If you buy a just-released new model (that's not just a trivial bump in specs from the previous version), you will likely run into issues.
But you're right - the experience is way better than a decade ago and these days I'd expect anything that's been on the shelf of a local office store to work just fine.
I had one of the first Nvidia Optimus laptops. It was a nightmare on Linux for a while; even just turning the Nvidia card off (no BIOS option) so it wouldn't drain the battery was difficult, let alone using it. I don't think mine was even officially supported when Nvidia added Optimus support, because the generation after changed how Optimus worked?
My laptop after that at least had a BIOS option to disable the Nvidia graphics. I only bought an Nvidia model again because the OEM made a mistake and sold it for hundreds less than the Intel one on their website. Every distro needed me to do something to get the screen brightness keys to work.
WiFi chipsets could also be a hassle though they have gotten much better.
My current laptop, a 2018 MacBook Pro, is nearly unusable as a Linux laptop. By default the trackpad and keyboard don't work, and you need someone's kernel patch to use them. You can't currently have both sleep and sound. Plus a bunch of other little gremlins.
Got a Dell with an integrated GPU as well as a GTX1660. The laptop's display is hooked to the igpu, the HDMI to the GTX. Problems that have happened with Linux, in no particular order:
* Bumblebee/Optimus is a pain in the ass to install
* Performance on the second screen with Optimus is dreadful (sub 30Hz refresh rate)
* Being an NVidia card, I have the choice between Nouveau which can only put it into its low power state (which fucks me over when I need to use the Android emulator), or the proprietary nvidia ones which are, well, shit.
* Putting the laptop to sleep with an external display connected is a sure way to have it crash when it starts back up.
* Just kidding it doesn't actually go to sleep, the GPU stays on and runs like crazy.
I would _love_ to be running Debian on this machine. Sadly, the state of some things on Linux makes it impossible for me.
My emergency laptop, an 8 year old low end HP with a broken keyboard is still happily running along with Debian though.
I have a very similar Dell. I'm not sure it fits your use case, but I just run on the Intel graphics with the nvidia one powered down and no nvidia drivers installed. The Intel card runs multi-monitor 4K desktop stuff smooth as silk (I use Wayland, but I checked X too).
Whenever I need to run CUDA stuff (my only use case for the nvidia GPU), I power it up, and pass it through to a VM that also runs Linux, but with the horrid nvidia stuff installed. Works like a charm.
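For context, the pass-through side of that setup is the standard vfio-pci arrangement. The fragments below are a sketch from memory, not my exact files - the PCI vendor:device ID is a placeholder (find yours with `lspci -nn`), and the `softdep` line only matters if the proprietary driver is installed on the host at all:

```conf
# /etc/modprobe.d/vfio.conf -- claim the Nvidia GPU for vfio-pci at boot
# 10de:1f91 is an example vendor:device ID, not necessarily yours
options vfio-pci ids=10de:1f91
softdep nvidia pre: vfio-pci

# kernel command line (e.g. GRUB_CMDLINE_LINUX in /etc/default/grub),
# needed so the device can be handed to the VM:
#   intel_iommu=on iommu=pt
```

With the card bound to vfio-pci, the VM manager (libvirt/QEMU) can attach it as a host PCI device, and the horrid proprietary driver lives only inside the guest.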
I'd be curious to see your setup, purely because I have never found a way to have the iGPU handle the HDMI output. It seems it's directly soldered to the Nvidia card.
Strange. I thought that solution (output soldered onto the GeForce) was discontinued a long time ago for this range of machines. My setup works really well both with the HDMI output, with the Thunderbolt output, and of course with the laptop display.
I have this going on two very similar machines: A circa 2017 XPS 9560, and a circa 2020 XPS 7590. Both are 15". I don't have the 9560 near me at the moment, but the 7590 has
01:00.0 3D controller: NVIDIA Corporation TU117M [GeForce GTX 1650 Mobile / Max-Q] (rev ff)
I'm happy to help you debug if you want. What happens if you remove all nvidia drivers and plug something into the HDMI port? Do you notice any response from the external screen if you hit for example control f2?
I think it's like most things - people with bad experiences talk about them a lot. If it works perfectly, why would you post about it?
I've also had good experiences with linux laptops. Maybe not perfect, but more like easily resolved in 20 minutes of googling, which is totally reasonable to me.
I have the complete opposite experience, including laptops that were initially shipped with Linux (yes, Dell). Lack of support for hibernate, lack of support for docking station, etc. And things also break after updates. Of course "this is not Linux fault, this is lack of vendor support". But that doesn't change anything for me: It's either I'm willing to fiddle with the system or I have to buy something else.
My single experience with trying to use Linux daily on the laptop was 10 years ago. It had an "optimus" configuration, a combination of integrated/discrete GPU. I had to find instructions and resources online to get it to run for my specific model.
Eventually whoever was posting these probably moved on to a different model, which meant mine would stop working correctly when I'd upgrade Ubuntu.
When I tried again, ONE change made all the difference: reconfiguring the trackpad settings. I think before, there were confusing click regions for the different buttons. I changed to one-finger/two-finger tap and it was like a different machine.
Same for me, however you can be sure I do my research before buying to make sure it will be compatible. You can't just blindly buy a Windows laptop and expect it to work flawlessly on Linux.
Even on a Dell XPS Developer Edition (that comes with Linux) I can't get the webcam to use the full resolution or get bluetooth audio to work reliably.
I used Ubuntu on my sole/main machine for a month up until recently: i9 10900K, 32GB, 1TB NVMe SSD, GTX1080, MSI Z-490-A Pro, 3 x screens.
In my day job I'm a .NET developer so I created a Windows 10 VM for VS 2019 and allocated 4 cores and 12GB RAM. I used VMWare Workstation 16. It was smooth as hell.
I developed PHP/Laravel/Wordpress stuff in Linux using PHP Storm and VSCode. Docker on Linux natively is faster than in WSL2 (I know it's to do with accessing the file system but still...).
I learned a whole bunch of commands in this time and learned to live inside the terminal for a chunk of my work. It was fantastic. I loved it.
I also loved being able to configure Ubuntu and make it look like I wanted with new icons and whatnot: I didn't go mental, but it looked cool when I was finished.
All in all, as a dev machine (and for browsing 'n' stuff) it worked... quite OK, sort of: it wasn't all plain sailing. Here are the issues I had:
1. The Realtek 2.5GB NIC would not work, so I bought a 1G USB NIC and it ran fine without any tweaking (yes, I tried a bunch of stuff I found online but I couldn't get the Realtek NIC to work: something to do with the older kernel)
2a. When waking from sleep it sometimes lost network so had to reboot
2b. When waking from sleep AND with NordVPN on in the background it always failed to find a network card... reboot fixed it
3. Printer (HP MFP M477DW) would disappear frequently. Ubuntu would find it again when I asked but it wouldn't print to it. This required wizardry to get it running... more miss than hit so I ended up only using it through the Windows 10 VM
4. It would hang sometimes. Music would still be playing in the background but would become unresponsive. Happened maybe 5 times in a month. Reboot to fix.
5. VMWare workstation wouldn't work properly if I had display scaling turned on inside Ubuntu: it had no idea what resolution to show so I had to turn it off making things smaller than I'd like on the Ubuntu DESKTOP
6. Sound would fail regularly: usually crackled when I opened a web page. So I either close every browser window or stop all sound playing for a minute or so. I use a soundblaster something-or-other card.
7. OnlyOffice would open off-screen every single time. I had to do Alt+Space every time. Works perfectly on Windows 10. I switched to Libreoffice
All in all, it has come a long way since I last used it a few years ago but I NEVER have any of these issues in Windows 10. Ever. And, whether we like it or not, we tend to remember the bad stuff and I can't get away from the fact that W10 is just more stable.... for me, on my hardware, at least.
But I should caveat that by saying that I've disabled literally EVERYTHING in W10 using O&O ShutUp10, including updates (I've also deleted the Windows Update Medic Service DLL so it doesn't turn updates back on). I hate notifications and I want to pick when I run updates myself - given the spotty nature of the updates I usually wait a while, like until I've finished some client work and can afford a bit of downtime if the update fails.
In Linux' defense, VMware workstation 16 running Windows 10 is smoother than a Windows 10 VM on VMware workstation on Windows. No idea why, it just runs smoother. If Linux stability was fixed (waking from sleep especially) I would switch immediately but all the tiny, annoying niggles that I just don't get with W10 are enough to keep me on W10... W10 is a tool and, for me, it's a better one for now.
> In Linux' defense, VMware workstation 16 running Windows 10 is smoother than a Windows 10 VM on VMware workstation on Windows.
Reminded me about a time years ago I had to do some c# work in Windows 7. At the time, I found the best Windows laptop experience was running it in a VM on a MBP.
I will say that running natively on Windows is still faster but the VM of Windows 10 in VMWare Workstation on Linux is noticeably faster and smoother than running that same Windows 10 VM on VMWare Workstation on Windows 10... it just defies logic!
It's not like it's a Type 1 hypervisor in Linux and Type 2 in Windows (well, I assume it isn't!).
Yep, honestly, I've been running Arch Linux + KDE on my Acer TravelMate X for approximately 4 years and I've never had hardware/driver-related issues.
This laptop is not an ARM machine, of course, but it works wonderfully well.
BTW it also came with a fingerprint reader but I never used it.
Maybe I am being overly cynical, but isn't this a huge waste of time given that Apple can "pull the plug" at any time? They want to control everything that runs on their chips, so why bother?
Apple has publicly stated their intent to permit users to run other operating systems on Apple Silicon, and has provided various mechanisms for them to do so in a first-class way (still supporting secure boot, for instance). Of course, they haven't gone as far as many of us would like - marcan has had to do a lot of reverse engineering of stuff they could have just released the docs for, and it's always possible they'll change their mind and reverse course.
But those of us who daily-drive macs but need to use Linux occasionally are very much rooting for this to be a going concern.
> As for Windows running natively on the machine, "that's really up to Microsoft," he said. "We have the core technologies for them to do that, to run their ARM version of Windows, which in turn of course supports x86 user mode applications. But that's a decision Microsoft has to make, to bring to license that technology for users to run on these Macs. But the Macs are certainly very capable of it."
It's a pretty disingenuous way to blame Microsoft. After all, there are two ways that Windows 10 on the M1 Macs could have happened: Microsoft could relax the licensing so that end users could buy/install Windows 10 onto an M1 Mac, or Apple could approach Microsoft like any other OEM and license/preinstall Windows 10.
I don’t see what’s disingenuous about it? Apple are basically saying that they haven’t put in any technical barriers to stop Microsoft from porting Windows to the M1, and that Microsoft are free to do so if they want.
It's never a waste of time to try and assert more control over the hardware you own. Apple kept the iPhone fully locked down since day 1, but that didn't deter a massive jailbreaking community. Heck there were app stores on iPhone even before Apple could launch its own.
The PC has always been an open platform. Documentation is readily available and the intention is for users to run whatever they want on the hardware. Completely different.
Intel and Microsoft control Secure Boot. If Linux hadn't already been so popular on servers when SB came about, it's very possible that the PC platform would today be locked down to run MS-only operating systems.
Right, but that happened 10 years after Linus started writing his OS. If Microsoft had had their way then the PC would have also been a closed platform. This doesn't contradict my point in any way.
It doesn't change the way things stand today but still I think there's a marked difference between openness as a commitment and openness by lucky alignment of interests so far.
They can certainly lock down future models if they want, but the current models for sale will always be able to boot Linux now. That's a good thing, and even if Apple ships a new laptop next year that only accepts signed OS images from Apple, that doesn't detract from the ability to run Linux on this year's model.
I think they've started taking steps to allow other OS more easily than before? Someone will tell me if I'm wrong or right, I don't remember the specifics.
Which makes it more unlikely that they will pull the plug. Of course, it's always a possibility.
The way they've implemented it suggests that it's intended for there to be support to boot other things.
Going back on it and locking it down could end in another lawsuit like the OtherOS removal from the PS3 (Which ended in $65 settlement payments it seems)
See, even linux has these corporate branch squashers, mixing completely unrelated patches into their feature branch. Here the m1-support feature included the unrelated removal of Cyclades, ISIcom and RocketPort/RocketModem.
That's not the case. What happened is that the Samsung UART patches to support the M1 got merged into tty-next first, and it is preferable to have this series based on those - otherwise, the devicetree would fail validation due to the missing bindings, and there would be no serial to debug or do anything with. It helps keep things ordered for bisection.
So we decided to merge a specific point from tty-next into this branch first, before the rest of the patches. Those patches are already in the pipeline for upstream inclusion. So it pulled in everything in tty-next that was before the Samsung changes, and it shows up in the diffstat, but those aren't really part of this series, it's just caused by the merge from tty-next.
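For anyone unfamiliar with the workflow being described: merging a branch "up to a specific point" just means merging a particular commit rather than the branch tip. A toy sketch (the repo layout, branch names, and file names here are made up purely for illustration):

```shell
#!/bin/sh
# Toy illustration of merging a branch only up to a specific commit,
# rather than its tip. All names are invented for the example.
set -e
rm -rf /tmp/merge-demo && mkdir -p /tmp/merge-demo && cd /tmp/merge-demo
git init -q
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "base"

# "tty-next" gets the UART patch we need, plus unrelated later work
git checkout -q -b tty-next
echo "samsung uart bindings" > uart.txt
git add uart.txt
git commit -q -m "uart: add Samsung bindings"
uart_commit=$(git rev-parse HEAD)   # the specific point we want
echo "unrelated later work" > later.txt
git add later.txt
git commit -q -m "unrelated later work"

# Back on our own branch, merge only up to the UART commit, not the tip
git checkout -q -
git merge -q --no-edit "$uart_commit"

ls   # uart.txt is here; later.txt is not
```

Everything reachable from the merged commit comes along (which is why the earlier tty-next history shows up in the diffstat), but commits made after that point stay out.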
The community is adding Linux boot support to Macs with the M1 chip. It requires quite a bit of work, since Apple doesn't provide any documentation for booting other operating systems, let alone drivers, so it requires kernel patches even to get it to boot. Eventually (maybe 2-3 years down the road) we may have a very usable Linux ARM laptop.
Somewhat related: I'm super curious if M1's advantages would melt away if Intel simply fabbed their current designs on TSMC's 5nm process, to have a level playing field between Intel and Apple's chip designers.
This. Porting a design to another process is a multi-year effort. In a Reddit AMA a week and a half ago, Intel mentioned that the Rocket Lake backport (from 10nm to 14nm) took 2 years [1]. That's quite a quick schedule, and also on a very mature process.
"The project final concept closed in Q1 2019 and we are launching today, so your estimate is right. Took us 2 years for RKL"
And that parallelism in large part comes from them having a genuinely enormous transistor budget. ARM is easier to be wide on, but the way x86 cores are implemented now makes the decoding a coefficient of overhead rather than an absolute game ender.
Of course, you wouldn't fab the current design on TSMC 5nm as they'd have more transistors to play with.
It's still not a level playing field as they'd be implementing different ISAs. If Intel acquired an ARM architecture license and then designed an ARMv8 CPU with identical constraints then you'd get a valid comparison.
Although you'd likely have different IP in other areas which could skew the results.
If you're looking for a comparison of the ISAs, I think the decoding issues / execution width point to a likely ARM win still - the only question is by how much.
There is still the problem of ARM being superior. You can’t put a “low power” x86 5nm chip in a phone right now and expect it to be on par with the iPhone.
edit: also don’t forget Intel's fabs are actually better than TSMC's when the nodes are the same
> also don’t forget Intel's fabs are actually better than TSMC's when the nodes are the same
They haven’t been on the same node size for quite a while, though. I wonder if this will be true in the future. Intel has a problem because even if they catch up and TSMC is for some reason stuck at 5nm, TSMC will still have multi-year experience and will be able to keep improving the 5nm process.
Why do we still buy Macs? Apple can pull the plug anytime. Kudos to the huge undertaking, but I personally think we wouldn't need this type of gargantuan effort if we just refused to buy Macs without Linux support. Why support such a closed system? Enlighten me...
Because surprise, surprise, most Mac users don’t give a squat about running Linux on them. This leads to a world where a lot of people have macs, and if you want to “convert” them you need to support the machines they already have.
This goes for every hardware platform, doesn't it? The Raspberry Pi Foundation could decide to have Raspberry Pi systems boot only Windows 10 IoT starting tomorrow. But they don't have any interest in doing so.
Apple went out of their way to add support for third party OSes to Macs. They actively made tools to allow the Asahi Linux folks and others to boot unsigned OSes on M1 Macs. They would have had much less work - both in terms of making and maintaining the tools as well as customer support work - if they kept the boot system more similar to the iPad boot system it derives from. In general really Apple doesn't seem to be interested in closing down the Mac at this level. Hence why you can (attempt to) run third party OSes on virtually all Macintoshes going back to 1987 [1].
I for one am happy with that. If I buy an Apple Silicon Mac, I won't be running Linux on it - at least not on bare metal. But I might do that in 10 years after Apple's support has ended.
Because if you ignore the high prices those are really good machines. Unlike anything else out there. Durable as well.
I wish there were sleek, metal body lightweight laptops with Linux (w/o Windows hence minus some cost) which just worked out of the box (one that didn’t require hunting around forums to have basic functionalities), but there isn’t. And no I don’t mean that one or two exotic OEM/model. But a general availability.
So people for whom cost isn’t a concern they buy Macs.
Damn you are so right. In fact, I did realize it after writing the comment, but then I also factored in repairability and upgrade factors and repair cost as well.
In general, it's because they get out of the user's way and it becomes invisible to the user's workflow. Linux always involves tweaking the system just to get work done. The integration of the software and hardware of the Mac is why the user experience is no competition.
https://asahilinux.org/2021/03/progress-report-january-febru...
It still applies to what got merged; the past month was spent on clean-ups, fixes, and review feedback, but there are no major changes to the approach. This initial merge was quite complex as it has to touch a number of subsystems and core kernel code in order to support the M1's quirks, and it was CCed to ~20 people. Now that it's done we can more efficiently work on individual drivers and subsystems, so I expect the pace to pick up.
If you would like to support my work on this project, I do have a Patreon: https://www.patreon.com/marcan
I also streamed initial development on YouTube and will be resuming streams next week (patch feedback and git rebasing and munging doesn't make for very interesting streams, but now I'm back to coding). Coming up I have some driver support other people have been working on, and then I'll be writing a minimal hypervisor that can run macOS as a guest, to help reverse engineer the hardware - this will be important for reverse engineering more complex drivers cleanly, especially the GPU kernel side.
Until now we've used a serial cable to debug/load kernels (this needs a DIY Arduino thing, a proper design I'm working on which is still vapourware, or another M1 box), but Sven added support for the USB device controller and I'll merge it into our bootloader soon, so from this point on anyone will be able to do quick kernel iteration, debugging, and hardware exploration with just a standard USB cable and any other host machine.