I am so happy to see the quick progress of Linux on the ARM Macs. In the '80s and '90s there were several competing processor architectures on the market, but for the last 20 years PCs have been x86 only. With Apple Silicon there is now a real contender, one that actually surpasses the current x86 offerings in many respects. And that is why competition between architectures is so important. And of course it's just interesting from a software development perspective.
With Linux becoming a viable option on those machines, they become interesting to a far wider audience than just macOS users. Thanks to the great work by Alyssa, GPU acceleration should be close too.
Then let's see when Linus gets himself a Mac; he has already indicated that he would be interested in doing so, as long as he doesn't have to port Linux himself.
> With Linux becoming a viable option on those machines, they become interesting to a far wider audience than just macOS users.
While the option of running Linux on one of these M1 chips is intriguing to many of us, I have a hard time seeing that this will bring these machines to a "far wider audience" than MacOS users. It does open up some niches, and in particular it could mean that people will still be able to make use of these laptops after Apple stops supporting them. But we're pretty much a rounding error for the duopoly that owns the desktop OS market.
I do share your admiration for the accomplishments of Alyssa and all of those who are porting an open source operating system to a new hardware design with little help from the manufacturer.
The size of the audience is moot, but the benefit to me, a software developer, is sizeable.
I have remained a Mac user through the most recent several major OS X/OSX/MacOs/macOS changes with increasing reluctance as Apple increases its ownership of my hardware. I "own" an M1 Mac. I would like the freedom to run a free OS with free drivers on it. I watch Asahi Linux and the associated work closely. I donate and I hope.
It's really hard to beat Apple's laptop hardware in overall quality. Maybe Framework will be there one day, but it's difficult to see them with an ARM laptop any time soon. I know more than one person who just wants to use Apple laptops with Linux, for example.
Yup. My grandmother wanted a laptop to use, so I slapped basically Chrome OS on an old 2012 MBP and she's been using that just fine. The hardware itself is still quite snappy; the GPU is way out of date, but for Facebook and YouTube and video calls? Works great!
>I would like the freedom to run a free OS with free drivers on it.
I may be reading this wrong, but it sounds like you think this is something of a limit because of Apple. You have always been able to do this if you could find software that works. You can't "use" their OS on non-Apple hardware though.
It's not Apple's responsibility to write code for software that is not theirs. Could they release "drivers"? For what? Why would they? The hopes of selling 1k more hardware units to Linux devotees? Why would they want to incur the expense of that support when it is such a tear in the ocean level of user base?
You are right. It is not Apple's responsibility; further, I knew the score when I bought my M1 Max.
I merely wish that Apple didn't think it was their responsibility to track and monitor my use of "my" hardware. Free and open software allows me to use the hardware exclusively in accordance with my desires.
In what way are they doing this? Is there something, other than the fact that no other software works on this new hardware yet, preventing you from running other software when (not if) it does arrive?
Macs have huge penetration into the developer market. I could see a lot of devs whacking Linux on their M1 Macs to enjoy benefits like being able to run containers outside of a virtual machine.
If Apple waited 6 months and released a significantly cheaper 16-inch M1 Pro Mac with a non-XDR screen and in the old form factor (to save money on tooling), similar to how they do with the iPhone SE line, then they'd make so much damn money from devs jumping on board.
In the big developer markets, the price of the device isn't a huge factor. Today, the available stock is a bigger deal. At least for a daily driver.
I also want the highest-resolution screen with the crispest graphics possible if I'm staring at it all day (1080p IPS displays are awful for my eyes). The thing about Macs that makes them nicer than most comparable Linux machines is that the display is much better.
One of the only reasons to consider the Mac over something like an XPS today (other than that you can go out and buy it) is that it supports Adobe products, which you may need for front-end work. However, if my company is all in on Figma and I'm just working on a generic backend, you bet your butt I'm asking for an XPS.
> In the big developer markets, the price of the device isn't a huge factor.
You might suffer from SV syndrome. I live in a small town in the east of France: anything Apple is unaffordable (without major sacrifices, and with no certainty of being able to replace the machine if it breaks too soon).
Right now if I had to replace my current computer, I would have about 600€ total budget. And France isn't exactly a third world country (but we're doing our best to get there ;)
I'm also in France, in a larger city, but what you said still rings true. I prefer older ThinkPads: if something breaks I can either fix it myself or take it to just about any computer repair shop on earth and they'll have the ability to repair it. Plus they tend to run Debian with ease.
I'm not in SV, but my employers have all provided my machines. Some companies are better than others, but price usually isn't a factor at places I've worked. I also probably wouldn't work at a company that gives their engineers shitty computers, since that's like working at a restaurant that makes their cooks use dull knives (and there is a big market of places that will give you whatever set you want)
I don't know where you heard that, but even if we accept that assertion, it's not relevant to this conversation.
We're talking about running Linux on the Mac, so you won't be paying that virtualisation penalty.
There is no reason to think ARM64 containers running on an ARM64 machine would run any worse than AMD64 images running on an AMD64 machine. So given M1 Pro/Max Macs consistently rate amongst the best performing laptops on the market, you should expect a similar experience when running ARM64 containers on Linux.
As more developers pick up ARM64 Macs, more binaries and containers will be released for ARM64. There was already a massive boost in this kind of thing after the launch of the M1.
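For anyone curious how broad that ARM64 coverage already is, here's a minimal sketch of one way to check which architectures an image is published for, assuming a Docker CLI recent enough to ship `docker manifest inspect` (the image tag is just an example):

    # List the architectures a public image is published for.
    import json
    import subprocess

    def published_architectures(image):
        out = subprocess.run(
            ["docker", "manifest", "inspect", image],
            capture_output=True, text=True, check=True,
        ).stdout
        manifest = json.loads(out)
        # Multi-arch images carry a manifest list; single-arch images don't.
        entries = manifest.get("manifests", [])
        return sorted({m["platform"]["architecture"] for m in entries})

    print(published_architectures("python:3"))  # e.g. ['amd64', 'arm64', ...]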
No, they wouldn't. There really aren't enough devs around to make that statement true. Plus, if so many devs would do that, they could buy the 6-month-old refurbs at that time and do what they want. If they aren't doing that, then they wouldn't buy the machine you're imagining.
Devs may be small in number relative to the consumer market, but they're big in impact when it comes to building out an ecosystem. People complain about Linux on the desktop, but Linux on the desktop is unreasonably good considering its market share.
As devs build out that ecosystem it'll become easier to bring products to ARM. Eventually, we might even see something like the Steam Machines built on an ARM64 platform. If it ever happens, it'll happen on the backs of those trailblazing devs who built out the ecosystem, because they were able to get a pretty rockin' laptop on an ARM64 platform.
Devs also happen to be great evangelists for bringing technologies into the corporate ecosystem. This will create a demand for porting popular MDM and other corporate tooling over to ARM64.
I'm a dev and I'd love a machine like that. I actually think the older form factor is better: I carry my laptop to and from work every day, so I appreciate a lighter and slimmer laptop, and I don't think I'm alone on that account. I couldn't care less about a screen that can get up to 1000+ nits, but only when playing certain videos.
I fail to see how any of that would make Apple a bunch of money.
I know Apple needs devs to make money. Lots of them. But I don't see any lack of them at the moment. In this context it appears we are just talking about selling more machines. You'd buy one. So would a few others. Apple wouldn't make a ton of money on that machine though.
There are enough iOS developers to make some difference. And I think that if it weren't for those iOS developers, who have no choice but to run MacOS, there would be many fewer developers using Macs.
Not just iOS; it's the issue that you need macOS to make applications for the Apple ecosystem regardless. I'm sure iOS is a big part, but I also imagine all the desktop applications that want to be cross-platform, think something like Photoshop. Microsoft has to have a Mac somewhere to compile Office for macOS.
Software packages baked into images (at least the ones ultimately meant to run on servers) are actually x86 binaries, so no luck running them on ARM CPUs.
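That's easy to verify for any given binary: the ELF header says which CPU it targets. A self-contained sketch (the path is just an example; this assumes a little-endian ELF, which covers x86-64 and AArch64 Linux):

    # Read the e_machine field of an ELF header to see the target CPU.
    import struct

    ELF_MACHINES = {0x03: "x86", 0x28: "ARM (32-bit)", 0x3E: "x86-64", 0xB7: "AArch64"}

    def elf_architecture(path):
        with open(path, "rb") as f:
            header = f.read(20)
        if header[:4] != b"\x7fELF":
            raise ValueError("not an ELF binary")
        # e_machine is a 16-bit field at offset 18 (little-endian here).
        (machine,) = struct.unpack_from("<H", header, 18)
        return ELF_MACHINES.get(machine, "unknown (0x%x)" % machine)

    print(elf_architecture("/bin/ls"))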
Docker Desktop for Mac runs Docker in a VM. It has poor disk I/O compared to native, and IIRC it is not as capable as native from a networking perspective (perhaps not a big issue). Also, as of recently it costs money to use Docker Desktop for Mac for work purposes.
So the advantage is you get better container performance, all the capability you would normally get with container workloads on their native platform (Linux), and you wouldn't have to use the Docker Desktop product anymore.
In addition to performance, I also ran into weird bugs with the Docker Desktop VM. The VM would run out of disk, among other things that would not occur with native Docker.
It also has weird bouts of high CPU usage, even if running containers are not busy. I think it's due to overhead with osxfs handling local disk changes (e.g. you've mounted a volume with a large git repo, and in that repo on the host you switched to a much different revision). It's hard to troubleshoot because it's not obvious how to get into the VM to see what's going on.
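A crude way to put a number on that overhead is to time a batch of small synced writes inside the container, once on the container's own filesystem and once on a bind mount. A rough sketch; the mount path is a made-up example:

    # Time many small fsync'd writes under a given directory.
    import os, shutil, tempfile, time

    def time_small_writes(base, count=2000, size=4096):
        path = tempfile.mkdtemp(dir=base)
        payload = b"x" * size
        start = time.perf_counter()
        for i in range(count):
            with open(os.path.join(path, "f%d" % i), "wb") as f:
                f.write(payload)
                f.flush()
                os.fsync(f.fileno())
        elapsed = time.perf_counter() - start
        shutil.rmtree(path)
        return elapsed

    # Run inside the container against both locations:
    for base in ("/tmp", "/mnt/host-volume"):  # second path is hypothetical
        print(base, "%.2fs" % time_small_writes(base))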
Overall, this is an issue that I think goes beyond Docker on a Mac. We have multiple examples of development targeting one platform, or involving tools native to that platform. But because of preference, or conflict with other tools, or policy, we end up with these emulation or compatibility layers like HyperKit/Docker for Mac, or WSL/WSL2, or WINE, or frankly WebAssembly/Emscripten/whatever. And I think the result is almost always worse than if people just found the most native set of tooling for their primary use case and used that.
What is it about macOS that makes Docker Desktop for Mac worth it, if your development workflow is heavily container-based? At my workplace, it's mostly policy and preference issues making Linux unsuitable for the majority of the user base, even though that majority is working with containers constantly and targeting Linux.
I guess you can pick at this and point out that we're not writing software in assembly for a reason, but I feel like there's a line where you have too much abstraction, adaptation, or emulation, and for me workflows built around something like Docker Desktop for Mac just so we can use macOS are over that line.
Docker on the Mac is 10x slower for some of my workloads than Docker on my linux cloud machine. (e.g., npm install is literally a few seconds vs 5 minutes)
By whose definition? There have always been trade-offs to achieve that portability. Processing power has always been one of them, mainly due to electrical power demands. We're just now getting to battery tech that is impressive. It has taken this long to get to processing abilities that didn't require being attached directly to a power line just to idle.
If "supposed to be" means we all have been "hoping and wishing one day" it might be possible, then sure, "supposed to be" it is.
>Processing power has always been one of them, mainly due to electrical power demands. We're just now getting to battery tech that is impressive.
The M1 Mac laptops are using the same battery tech as everyone else.
What has changed is the ratio of performance to power draw, and leaving behind the almost immediate thermal throttling you see in x86 laptops.
>The chips here aren’t only able to outclass any competitor laptop design, but also competes against the best desktop systems out there, you’d have to bring out server-class hardware to get ahead of the M1 Max – it’s just generally absurd.
M1 Max Macs that have enough memory and processing power to be a mobile workstation can drain their battery in 1 hour if running at 100% CPU usage, so if you need all that performance you will still need to be close to a wall power outlet.
If you don't need all that performance, then there are other laptops (not Apple) that can last long enough between charges too.
The M1 Mac is currently an excellent laptop for regular users, but real power users either have to be close to the wall or use desktop CPUs to run their workloads. And no, web and software development does not require a lot of CPU power. I know; I am a developer, and I develop software for people who do need to run at 100% CPU for hours and hours, and an M1 laptop will not last them any longer than an i7.
>there are other laptops (not Apple) that can last long enough between charges too.
The reviews don't agree that the "not Apple" laptops have anywhere near the same performance and battery life.
>the new MacBook Pros with M1 Pro and M1 Max chips are incredible — the fastest laptops we’ve ever tested in some tasks, with some of the longest battery life we’ve ever seen.
>can drain their battery in 1 hour if running at 100% CPU usage
But I'm okay with that. I can be working at home/work and then move, pick up at a new location, and go right back to work without losing performance. Just need a plug.
I run a small Linux compute cluster for a research institute with a ~8 kW budget. If it ran M1s, I'd have 5x more cores to work with (and the cores would be faster). How niche is this?
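Back-of-the-envelope version of that claim, with assumed (not measured) per-node figures; the real ratio depends entirely on which x86 nodes you compare against:

    # All wattages here are assumptions for illustration only.
    BUDGET_W = 8000

    X86_NODE = {"cores": 32, "watts": 600}  # assumed dual-socket server node
    M1_NODE  = {"cores": 8,  "watts": 35}   # assumed M1 under all-core load

    def cores_in_budget(node):
        return (BUDGET_W // node["watts"]) * node["cores"]

    x86 = cores_in_budget(X86_NODE)   # 13 nodes * 32 = 416 cores
    m1  = cores_in_budget(M1_NODE)    # 228 nodes * 8 = 1824 cores
    print(x86, m1, "%.1fx" % (m1 / x86))  # ~4.4x with these assumptions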
I would love to be able to run Nuke, so definitely one of those niches. There's not much *nix-only software that can't be run on macOS, but there's definitely lots of macOS software not able to run on *nix. Being one reboot away from using whichever is needed is dream-a-little-dream territory.
I agree with this sentiment. Overall, we should be praising projects like frame.work over proprietary and hard-to-port systems like the M1. I'm hoping some day the frame.work laptops will have a mainboard available similar to the M1, but I'd rather have repairable than not.
This project would open things up if it did something novel.
IMO a Linux distribution is the perfect base for a metaverse client for the entire internet.
Ditch the window manager that only acts like a desktop metaphor and log in to a 3D-capable viewport. Toggle between 2D and 3D representations, virtually load websites. Like applying AR to cyberspace. Natively relying on that ML-friendly GPU.
Something like Godot as the window manager process (controls abstracted behind a traditional default UX or something) and hacking away at its scene tree format. Update UX state to be a 2D UI if needed.
Storing the contents of a file as a hash used to regenerate it, like procedural game engines do, would improve security if the user's login unlocks things. /home need not be a traditional filesystem at all.
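The closest existing technique to that /home idea is probably content-addressed storage, where a hash is the name of the content it can retrieve. A minimal, purely illustrative sketch (the class and names are made up, not any project's API):

    # Store blobs under the hash of their contents; /home could then hold
    # hashes, and the user's login could gate access to the store.
    import hashlib

    class ContentStore:
        def __init__(self):
            self._blobs = {}

        def put(self, data):
            digest = hashlib.sha256(data).hexdigest()
            self._blobs[digest] = data
            return digest  # the "filename" is now derived from the content

        def get(self, digest):
            return self._blobs[digest]

    store = ContentStore()
    key = store.put(b"dotfiles, documents, ...")
    assert store.get(key) == b"dotfiles, documents, ..."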
There are a lot of ideas going unexplored due to the money being thrown at business as usual problems.
20 years ago IBM PowerPC was still a contender too. With Apple no less.
The x86/64 solo reign was more like 15 years.
But I miss it too. The 90s with all its amazing architectures. SPARC, Alpha, MIPS, PA-RISC, PowerPC. I still have several of those here at home :) Computers have become boring and it's nice ARM is shaking things up.
I wonder if M1 will ever be fully supported though. With full unrestricted video, 3D and AI acceleration. There seems to be a lot of Apple secret sauce in these processors.
Every architecture has its quirks. But compared to x86 and its legacy, most RISC architectures were quite nice. And the point is, you had multiple competing architectures, so there was a chance to try out new ideas and find out which of them were actually good.
Yes, Intel's attitude is sometimes driven too much by marketing.
"Consumers don't need 64-bit" (and trying to promote Itanium)
"Consumers don't need ECC RAM"
It holds back the industry now that they are the only PC platform.
PS: I think Itanium was a really good idea, but again marketing made it unviable. They wanted to position it purely for servers, just at a time when there was a real cost focus on servers using commodity hardware (e.g. from Google).
"Avoiding needing to spill registers to the stack" is a focus of much optimisation work in lots of places - and, yeah, it's absolutely work but the performance gains can still be worth it.
Plus I'd argue that SPARC is enough simpler to work with than x86 in other ways that you could look at it as "spending your complexity budget somewhere else" more than it being more headache inducing.
(though admittedly I cut my teeth on assembly on an ARM2 so basically every modern architecture is kinda headache inducingly complicated to me ;)
Solo, yes, probably 15 years. But domination? Probably starting with the IBM PC in 1981 (ok, make that 1982 or 1983 to allow for the sales ramp up) until 2020. That's a very long run for a computer architecture.
I expect that the next architectures to rise to the top will be even more entrenched.
We're way past computing's early years, childhood and adolescence.
Yes, PCs have been around since 1981, but if you looked around, they were not that commonplace outside of businesses and quite rare in private use. There was the age of the home computers: first the 8-bit machines (Atari, Commodore, TI), then the 68k machines and the first ARMs. In parallel, all the great workstation vendors with their RISC chips.
The PC for home use really started only in the '90s; scientists would use workstations. It was only towards the end of the '90s that x86 caught up with them, and well into the 2000s when it overtook them. It was actually AMD who dealt the killing blow with the Athlon and especially the Opteron. At that time, Intel was pushing Itanium as the architecture for professional use and kept x86 at 32 bits, with more game-optimized processors (the P4).
No offense, but that is all either wrong or ahistorical. You write about personal computing history like someone who experienced it after the fact and solely through blog posts. (are you writing about the PC revolution as you experienced it in some country other than the United States?)
The OP made a number of statements about the various platforms' marketshare over time which were inaccurate if applied globally, but might have been accurate in a certain country, hence my question to them. All of this in service of an idea - that x86 hasn't been dominant for four decades - which is so absurdly counterfactual that it's silly to argue about. I mean, you guys know that these sales figures are public, right?
Regarding the assertion that PCs were ever "quite rare in private usages." There are some pretty good charts of the various home computers' marketshare available here:
It occurs to me after sharing those that one could look at those charts and assume all those IBM PCs were going to businesses (that's an AWFUL LOT of businesses!), but I don't have all day to spend on gathering this data... research game sales in the mid eighties if you have doubts about this! Publishers were servicing a dying C64 and Apple II market, and a rapidly growing x86 market. x86 did not wait for the nineties (how is this even an argument... jeez).
I still have the Athlon XP 3200+ system I built, but AMD didn't deal any kind of "killing blow" with anything. That's extremely silly. AMD did an admirable job of forcing 64-bit adoption on the PC a bit sooner than Intel would have liked (and driving multicore!), but x86's domination in the marketplace did not have to wait for that.
In terms of marketshare, ARM didn't matter much until it became an embedded standard. Regarding 68k, which the OP brought up for some reason, I offer this:
AMD did deliver the killing blow to Sun. At least at the company I worked for back then, they were all firmly Sun for the large compute servers until the Opteron arrived. PCs were nice as desktop machines, but large servers with multiple CPUs and many gigabytes of RAM were not feasible on x86 until the arrival of the Opterons. Eventually they would replace all SPARC-based compute servers.
By the time the Opteron came out in 2003, Google was already a large company. My guess is they were already the largest search engine in the world -- and they ran their web crawler and search engine entirely on PCs.
Hotmail was bought by Microsoft in 1997, then Microsoft promptly announced that the service would be switched over to Windows. Till then, according to Wikipedia [1], Hotmail ran some Sun boxes, but also a lot of FreeBSD, and I am almost certain that the FreeBSD was running on PCs. They were probably the largest email provider in the world in 1997.
The Apache web server "was the most popular web server by Spring 1996 and stayed like that until the Summer of 2014" [2]. It ran and runs almost exclusively on Linux, which in turn ran and runs almost exclusively on x86.
Indeed, Intel got caught flat-footed and was pushing for 64-bit only on Itanium, at a substantial price premium. AMD was first to market with the x86-64 instruction set and did quite well.
I wouldn't short-change AMD by saying their big success during that period was about the 32 vs 64 bit issue, or Itanium. They made the fastest 32-bit x86 chip in the world with the Athlon K7, and they did it four years before they launched their 64-bit chip.
Sure, but the Opteron doubled down on it. They added x86-64 to a server-class chip for the first time, and they moved the memory controller on-chip, which made the AMD chips scale dramatically better under a variety of workloads.
I believe a lot of people had the experience your company had at that time. But...
x86 (and Linux and Windows) started killing Unix and the other architectures a lot sooner than Opteron. At some point in the late nineties, SGI's workstation people pretty much curled up on the floor in the fetal position, moaning, "Windows NT, Windows NT." That spectacle was downright undignified, although the NT box they produced was impressive in its way. (They made some nice contributions to Linux nevertheless. Would that they had taken Linux even more seriously.) It says something about their outlook on the future of MIPS, as well as UNIX, that SGI designed their Visual Workstation using Pentium in an era when Windows NT on MIPS was still a thing.
You can't quite say that AMD64 killed Sun. Sun actually made some decent, if overpriced, Opteron stuff. Sun's demise is a fun thing to discuss because former Sun employees often have an interesting opinion about where Sun went wrong. I'm waiting for the guy who says something like "yeah, that was my department's fault. We blew it and the company failed." So far I haven't seen that.
Well, sure, I experienced it in a country other than the United States. Only 4% of the current human population lives in the United States. To be precise, I am from Germany and lived through all the times I described. While PCs appeared in the '80s, it was not until the very late '80s and early '90s that they saw significant home use. And as far as I know, Apples were very popular in the US at that time. In Europe they were not very common, due to even worse prices than today :p. But my school had some Apple IIs, which I eventually used with Pascal.
But for home use, the "home computers" were popular, as named before. Universities used Suns till the late '90s.
I don't think the US home computer market followed the same path as Germany's. 68K-based machines never became huge sellers over here, for instance, and the Macintosh was the most successful of the bunch -- whereas it's my understanding that in a lot of Europe, the Amiga and the Atari ST were serious contenders even on into the '90s. Conversely, IBM PC clones had taken the lead in US sales by the end of the '80s and just never had any serious competition by the early '90s. It's not an exaggeration to say that Radio Shack was selling more Tandy 1000s than the Amiga and the ST were selling combined over here. (There were games with specific Tandy 1000 "enhanced graphics and sound," so it was actually considered a viable market all on its own!)
> Apples were very popular in the US at that time
Well, the Apple II line was popular in the US in the late 1970s through the mid-1980s -- at one point it was the best-selling 8-bit computer, taking the throne from the Radio Shack TRS-80 -- but the Mac was absolutely not a big seller in the 1990s; Apple survived because they dominated a few vertical markets like desktop publishing. For home computing, the first really successful Mac was arguably the iMac circa 2000.
> While PCs appeared in the '80s, it was not until the very late '80s and early '90s that they saw significant home use
That might have been true in Germany. It wasn't true in the United States. IBM PC (and moreso, PC clone) adoption was quite strong in the home from the mid eighties. That's all I meant by the question about nationality - I really do think there's a difference there when it comes to market share. (I probably shouldn't beat the subject to death but it was an interesting discussion)
I was responding to the original OP who mentioned Intel only.
I doubt it'll remain that stable. The mobile OSes have already embraced a number of platforms, not even all ARM based. Android seems to be quite flexible regarding architectures with its pre-compilation. I think this is a sign of what's to come.
RISC-V is also an upcoming player and if it's successful it may spawn more fully open contenders. As we move to more AI integration there's a whole new lifecycle opening up too in terms of ML coprocessors. We're in the same situation as early computers with multiple vendor-specific solutions.
I think on the security side there'll also be more hardware signature checking, rather than the chain-based checks of Secure Boot. Rather than the OS checking if a program is legit, the CPU could do it (already done on some custom hardware, like consoles).
So I don't think computing is really mature at all. It just has had a stable phase for a while.
> With Apple Silicon there is now a real contender, one that actually surpasses the current x86 offerings in many respects
It's absolutely NOT a real contender for widespread use until you can buy a mini-ITX, microATX, or regular ATX motherboard from any one of the well-known dozen Taiwanese motherboard manufacturers, and an individual CPU to socket into it. Or at least a selection of motherboards with CPUs soldered onto them from the same vendors.
The hardware availability is basically a walled garden.
I don't think whether you can buy it in a given form factor is a good indicator of a "contender for widespread use," though. It ignores any technical merits and what the chip can actually do. Can it enable someone to check email, watch YouTube, and check social media? Yes. Can it render graphics? Yes. If you put someone in front of a MacBook with an M1, can they accomplish everything they can on an Intel machine? Yes.
Now, is it probably priced out of being a real contender for widespread use? Most likely. Is it offered in configurations that suit everyone? Maybe not. But that doesn't mean it can't accomplish the same or similar tasks. If someone can sit at a computer and accomplish all of their normal tasks, then for the most part it is a contender for widespread use; it is just a cost factor.
- Layerscape LX2160A 16-core Arm Cortex A72 (up to 2GHz)
- up to 64GB DDR4 dual channel 3200MT/s
- 4 x SATA 3.0
- 1 x PCIe x8 Gen 3.0, open slot (can support x16)
- 4 x SFP+ ports (10GbE each)
- 3 x USB 3.0 & 3 x USB 2.0
- GPIO header
- 170mm x 170mm standard Mini ITX form factor
Isn't that much harder with architectures without a BIOS equivalent? The components on the motherboard have to be told how to talk to each other. Swapping out the CPU would require some reconfiguration.
I bet Apple will sell order(s) of magnitude more MacBooks than the combined total sales of motherboards in those form factors, so your definition of widespread is surprising to me.
The big problem is that you can't buy an M1 CPU like you can, e.g., an i9.
I think Apple should be forced to open up their platform so other manufacturers could make laptops or desktops with that CPU.
It is. You can’t buy Apple Silicon without buying everything else including a laptop and macOS. It’s entirely vertically integrated from iCloud account to transistors.
A real contender would be Amazon’s Annapurna Labs with their ARM processors or something with RISC-V.