Apple GPU drivers now in Asahi Linux (asahilinux.org)
826 points by sohkamyung on Dec 7, 2022 | 331 comments



> Through Mesa and Gallium3D, we benefit from thirty years of OpenGL driver development, with common code translating OpenGL into the much simpler Gallium3D. Thanks to the incredible engineering of NIR, Mesa, and Gallium3D, our ragtag team of reverse-engineers can focus on what’s left: the Apple hardware.

The Linux graphics folks have really achieved something.


In the case of AMD, the Gallium-based open-source OpenGL drivers also manage to compete very well with AMD's proprietary OpenGL drivers, both in terms of performance and features, despite the fact that the proprietary drivers are much more mature. See: https://www.phoronix.com/review/radeon-spvp2020-linux

There's even a working Direct3D 9 driver based on Gallium3D. With an appropriately patched WINE, you can use it to run old Windows games quite nicely even on integrated graphics or with low core count CPUs. Here's some reporting: https://www.phoronix.com/news/Gallium-Nine-Better-2021

I hope that with NVIDIA's new open-source kernel drivers, Nouveau can now push forward and reach performance on par with Mesa's open-source drivers for AMD. That'd be awesome, and it could pave the way for mainlining NVIDIA's new kernel driver.


I remember reading about David Miller, I think, porting Linux to SPARC.

That was an amazing feat, this is also very impressive.

I think at some point Linux was as good as Solaris, if not better, on SPARC systems.

I really wish this were true at some point for Apple hardware.



I didn't want to be a jerk to Dave.

These days I think even Cantrill regrets being a jerk back then.


The weird thing to me is that I don't understand how anyone is expected to believe that Bryan Cantrill is somehow less of an obsessive nerd than Dave Miller or anyone else. Like what character was he playing when he wrote that line??


Maybe he’s admitting defeat with an ironic insult. I say, “shut up, nerd,” to coworkers when I’m wrong in a technical debate. We’re all nerds to some extent, so it’s funny.

On the other hand, maybe he was playing the guy who lashes out when his feelings get hurt after his work was publicly eviscerated.


For whatever it's worth, it was a Saturday Night Live reference -- and I absolutely regret it.[0]

[0] https://www.reddit.com/r/IAmA/comments/31ny87/comment/cq3e4y...


If only you'd also included one earlier line from the same skit¹:

> You've turned an enjoyable little job that I did— for a few years— into a colossal waste of time!

it would have balanced out the later one!

Thanks for making it make sense :)

--

1: https://trekmovie.com/2021/09/06/william-shatner-defends-198...


> I think at some point Linux was as good as Solaris, if not better, on SPARC systems.

That's a bit of a stretch. In fact, the last time I was in an air traffic control tower, it was Solaris, not Linux, in active use by the controllers themselves.


That would 100% be based on how those systems were acquired and where they were tested and delivered originally (e.g. through a tender and some Sun or Oracle support deal), not whether Linux is better than Solaris on that hardware on some particular metric, and even less so if 2022 Linux is better than Solaris on that hardware.

It would also have to go through many layers of bureaucracy, compliance requirements, and such to be installed in the first place. Even a point update to the next Solaris version could be a year-long process, complete with several staging systems and so on.

Last but not least, it would 100% be tied to the ATC software run there, and to the OS it was developed and tested under (which, if it was pre-2000, would more likely have been some commercial UNIX like Solaris, considered - and being - more mature and better supported for such use at the time).

An airport's management won't just go and reinstall some mission-critical software they got through a specific contract deal. And ATC software won't be just some GitHub repo you recompile and build for Linux.

But that's almost totally unrelated to how well Solaris vs Linux runs on the machine.


I don't think someone being stuck with Solaris for unknown reasons is sufficient to declare that Solaris is better than or equal to Linux. For the ATC, it could simply be a matter of "the application we run only supports Solaris".


It's aviation. Even if there was a superior solution and it was readily available, it would take decades to get through certifications and before it became widely accepted or even the new default. See GPS vs radio navigation or just analog radio communications.


Yeah, that would be on top of it all. For the ATC to use Linux on something, there would have to be a competing application that runs on Linux, which would have to be certified and then migrated to. That could take a decade or two.


See also this whole radio-altimeters vs 5G debacle that wouldn't have happened if radio altimeters used a reasonably modern design.


Ignoring the air traffic thing, I agree it's a stretch.

Back when Sun was still in business, Linux's SMP support was still in its infancy, futexes were not a thing, and the pthreads documentation was nonexistent.

On top of that, fsync was actually broken on ext2/3. Also, there was a single kernel-level lock per file, so you couldn't have two CPUs seeking in the same file at the same time (Oracle recommended using a block device instead of a filesystem, and provided a list of changes to the kernel source you needed to make manually if you wanted Oracle on Linux to be supported).

None of this mattered of course, but the Linux kernel certainly wasn't "better" than Solaris back then.


You're probably right. I worked for Lockheed in a facility that was ATC and MS2. I was on the MS2 side, but our program was ATC for NATO countries (war and peacetime operations).

All of the ATC applications were built on Solaris running on SPARC. Most of the developers were familiar with Linux at the time (this was back in the early 2000s), and even then many of the devs wanted to migrate the platform to Linux for a number of reasons. The cost and time, however, for acceptance testing on Linux would have eaten the budget alive.

So instead, pieces and parts that were readily accessible on Linux were ported to SPARC. I remember getting a new requirement for GPS time (previously the system had only used rubidium oscillators) and working with one of the devs on getting OTS hardware working with the ported code. The further along the program went and the more new features were added to the scope, the more this happened. But everything in the UI was based on CDE and some SPARC-specific libraries, and the HMI was written in Ada.

Since these systems have so many requirements just swapping out the OS would be a major overhaul and I'm not actually sure that Linux would even be the right choice.


At USENIX in 1997 I went to a talk by a young David S Miller and Miguel de Icaza on porting Linux to the Ultrasparc. That talk (regretfully unrecorded) more than anything else convinced me that I had to work on Linux.

Throughout the talk they showed Linux vs Solaris performance and talked about their optimizations. The benchmarks were lmbench (its author, Larry McVoy, was ex-Sun, which adds something), and by the end they beat Solaris on every result.

Obviously this was impressive, but it's hard for me to clearly express how much this shook my assumptions about how to build good software. A bunch of students shouldn't have been able to beat Sun on their own hardware in anything!


Anything touched by the FAA wasn't considered on technical merit; it was pure inertia and being terrified of any change, no matter how warranted. Cf. leaded gas for general aviation.


Even if it were running on Linux, it would probably be some ancient version of Linux that passed gobs of certifications at some point. There likely wouldn't be any OS upgrade until another lengthy certification was done, i.e., never.


Asahi has been my daily driver since April (https://jasoneckert.github.io/myblog/asahi-linux/).

I watched the hardware support evolve with each major update and remember when the first builds of specific software (e.g., Chromium and VS Code) that supported the 16K page size became available.

The last few months were incredibly interesting to watch - especially the live coding sessions. And throughout the whole time, Asahi was rock solid as a daily driver. I find it a stellar example of the power of open source and the people in the community that drive it.


> And throughout the whole time, Asahi was rock solid as a daily driver.

I'm also in awe of the Asahi team and respect their work, but please don't overstate things just to celebrate it. Even according to your own posts, there are huge dealbreakers like sound or external HDMI.

I bought an M1 based on HN posts like this, because I need linux and not OS X. I soon returned it when I realized how it could not come close to being my daily driver.


To be fair, 'rock solid' just means stable, not necessarily complete.


> there are huge dealbreakers like sound or external HDMI

I wouldn't need those, nor know they were broken if I had an M1. Sounds like the functionality I would need for a daily driver is there and solid though. Of course I'm capable of looking at lists of working features and roadmaps and deciding if something fits my use case independently of vague "pro" and "con" reviews on random discussion sites.


Lol, the limitations of Asahi are described very clearly on their site. It's your own fault if you spent $2000 on a computer based on random HN comments about an open source driver that's still in its alpha release.


HN is crazy bullish about some things and it can be easy to get hyped (I’m hyped for M1, but only use FOSS).

I came to this thread looking for exactly what you posted. Sure, I could check the site but I expect limitations to be discussed as well.


As someone who adores Apple hardware, I’m really looking forward to having the option of running Linux on my MacBook.

I’m also really looking forward to the changes in Asahi getting upstreamed. The Rust in Linux work has me really excited for the future of Linux.

I might even start contributing once Rust is more common!


> I might even start contributing once Rust is more common!

As someone who's recently been experimenting with Rust, why is that so? I mean, if you have something in the kernel where you could make a valuable contribution, why let the language stop you? Guess any dev who can write Rust today is also a good C dev, ain't it?


>guess any dev who can write Rust today is also a good C dev, ain't it?

Quite a number of Rust devs I've read about have never been C devs. They came from dynamic languages or Java or similar, and stuck with Rust and learned it, as a systems language that is more modern feature-wise and less error-prone than C.


I'm one such person.

I learned in C; it's the abstraction for computers that exists in my mind, but I'd never use it to actually write something, because of memory safety. I'd even been using Ada.

So, Rust was appealing in the ways C wasn't.


I would not want someone writing kernel code who can't understand what the rest of it is doing.


There is a difference between not understanding and not wanting to write C.


I'd be curious how C (and C++, for that matter) stacks up on some arbitrary "freshness" metric: some statistic indicating the average or median time it takes for engineers to reach a particular level of "done"-ness with a language.

I have no idea how you would measure it, but I'm assuming that C, VBScript, Perl, sed, AWK, bash, and Haskell would top the list (please don't hurt me; I'm thinking of the code-maintenance, time-draining nightmare that mega-scale, purely functional codebases become in the real world, as opposed to a mixed dynamic language with a purely functional core).

Curious what other languages would top the list. I'm also going to assume that some well-designed, low-barrier-to-entry scripting languages, similar to Python, and languages that save the hypothetical, proverbial broken backs of their ancestors, like Rust, would be near the bottom of the list. I'm curious what that bottom part of the list would look like too, oddly enough. :D :))))


C is an amazing language. If you want to integrate into another language via FFI you basically have 0 other options.

That being said, it's too easy to do something wrong in C. The desire to use Rust isn't because C is stale; rather, it's that C is too hard to write correctly.
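
To make that concrete, here's a minimal sketch (my own toy example, not from any particular project) of Rust calling a plain C function through the C ABI - the same boundary most languages' FFI ultimately speaks:

    // Toy example: Rust calling the C library's strlen() through the C ABI.
    // Works on common platforms because Rust's std already links libc.
    use std::ffi::CString;
    use std::os::raw::c_char;

    extern "C" {
        fn strlen(s: *const c_char) -> usize;
    }

    fn main() {
        let s = CString::new("hello").unwrap();
        // Safe code wrapping the unsafe foreign call.
        let len = unsafe { strlen(s.as_ptr()) };
        println!("strlen says {len}");
    }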


"amazing language"

ye ye, sure it is

uses physical paths instead of logical namespaces for includes

the average code base could be summed up like this: everything is fucking "int" or its cousin - such a great tool for system modeling!

basic concepts, as of $current_year, are still non-trivial in C - like strings

when opening a non-trivial codebase, my VS Code goes crazy.

Maybe I do have high standards after using C# for years, but holy shit, writing Rust is a 10 times better experience for me than using C.


My desire to write Rust is for both reasons. C tooling is awful, the language offers insufficient abstraction capabilities, and, finally, is unsafe.


Rust and C++ both have C FFI boundaries that the compiler can inline across. What more do you want?


Well for one, I don’t want to add another language to my tool chain. Many languages can compile C directly. For instance in Swift or Go you can add C source files directly to your project and have them compile as part of your Swift or Go build. You can’t do that with Rust or C++.

C is the lingua franca of the software development world.


Er....


Modern C++ now has nearly all of Rust's and other modern languages' features built in, some behind flags. At the same time it has 30 years of ecosystem, cruft, and bad habits. It's entirely possible to write performant, memory-safe C++, even in a functional style. It's a bazooka, and won't stop you from firing at your feet, but it is also the right tool for big jobs.


The main problem with C++ is that it's too vast: it does everything, and everyone has their very own choice of a subset of the language they should use.

The result is codebases that are hard to navigate and code that is hard to understand, with lots of clever code and unintuitive syntax.


Except... you know... the most important feature of Rust... the borrow checker.


The last time I checked, there were only nascent efforts to add the borrow checker to C++.

That's the main reason to implement in Rust (the others being syntactic sugar like match, ?, and, of course, the library ecosystem that learned from ~ C++17's mistakes).
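
To illustrate (a minimal sketch of my own, not tied to any codebase): the classic iterator-invalidation bug, which compiles fine in C++ and may dangle after the vector reallocates, is rejected by the borrow checker at compile time:

    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];   // shared borrow of `v`
        v.push(4);           // error[E0502]: cannot borrow `v` as mutable
                             // because it is also borrowed as immutable
        println!("{first}"); // the shared borrow is still alive here
    }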


It still lacks memory safety.


I would also really love to read a statistics-based approach to all of modern software engineering. Like, does SOLID actually increase bugs? Is Agile more likely to slow things down? Etc.

This is the closest I’ve ever found: http://www.knosof.co.uk/ESEUR/


It strongly depends on the type of software being developed.

As the abstract of that book suggests (when referring to ego and bluster), 100% of the agile proponents I have encountered are incompetent. (This especially includes organizations that claim they are "agile").


Who said those people will be writing kernel code?

Rust for the Linux kernel is like 0.000001% of real life Rust use (which is not that big to begin with).

And it wouldn't be used in the main kernel for the most part anyway; mostly (if not exclusively) kernel drivers.


This thread is about a kernel DRM driver written in Rust, and writing any DRM driver requires a lot of knowledge about how the rest of the DRM subsystem and the kernel works.


The subthread though is about whether "any dev who can write Rust today is also a good C dev, ain't it?", not about kernel drivers...


I understand C just fine, hacked on kernels before, could easily write “safe C”.

1. I just don’t enjoy C. At this point after writing so much code “for profit” it’s no longer a hobby in itself. If I’m writing code “for purpose” I need to enjoy it!

2. While I trust myself just enough to write some C, I don’t trust everyone else to retain a strong foundation for whatever I’m building on top.

3. C has carried the Linux kernel for its first 30 years. Having used Rust, it's clear the future (while maybe not Rust) is definitely not C. I want to make a contribution that will endure the next 40 years of Linux.


That's a pretty naive statement. No kernel dev knows what the rest of it is doing, because the rest is a code base of tens of millions of lines.


>guess any dev who can write Rust today is also a good C dev,

I'm sure they have the potential, but they might not want to learn/use C.

I'd certainly be much more likely to look at kernel dev if I can use rust than if I have to use C.


My point was more that, given Rust is so young, people proficient in it probably also have experience in more classic low-level languages.


But that doesn't mean they want to deal with C code in their free time.


I used to write bare-metal C for my day job. It's pretty hard to get everything right when the language has so many sharp edges. I wouldn't want to use C for a bigger project (e.g. the kernel, drivers, a GUI app), but I'd quite happily use Rust because the compiler will yell at me if I do something stupid.


Not OP but I'm waiting for Rust support to be finished before I look at kernel development. Don't have any interest in writing C.


Honestly this is a bit of a fantasy. If you love Rust and hate C, and haven't looked into kernel development before ... it's a whole different ball-game than user-space Rust. With a variety of real-world hardware, and the complexity of modern CPUs, and demands of a variety of complex user-space software, you just don't have the "guarantees" and conveniences you want, at this layer. The hardware does what it does and you just have to deal with it. User-space C code can be way easier to reason about. For a small taste see https://lkml.org/lkml/2022/9/19/1105


Yes, but dealing with low-level stuff in Rust is going to be easier than doing that in C. See for example the blog post by Asahi Lina talking about how it was easier to program the M1 GPU driver in Rust than it would have been in C.
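
As a rough illustration of the kind of pattern that gets cheaper (a toy sketch of my own, not the actual Asahi driver code): ownership lets you make rules like "a command buffer is submitted exactly once" a compile-time property instead of a convention enforced by review.

    // Hypothetical names; nothing here is from the real driver.
    struct CommandBuffer {
        words: Vec<u32>,
    }

    // Taking `self` by value means the buffer cannot be touched
    // (or re-submitted) after it has been handed to the "hardware".
    fn submit(buf: CommandBuffer) {
        println!("submitting {} words", buf.words.len());
    }

    fn main() {
        let buf = CommandBuffer { words: vec![0x1234_5678] };
        submit(buf);
        // submit(buf);        // error[E0382]: use of moved value: `buf`
        // buf.words.clear();  // same error: the buffer is gone after submit
    }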


Even in the best-case scenario, Rust code will be confined to individual subsystems and drivers for a very long time. And even in those subsystems you will probably still have to interact with a lot of C code through shared data structures in other subsystems.


Most of Linux kernel is drivers, so if new drivers can be written in Rust, it's a huge win in safety for the whole world.


No, it's just not true. Just look at Google's recent Android statistics as a validation.


I'm not paid to contribute to the Linux kernel, though you may get some reputation benefits out of it. It's mostly a hobby: if it sounds fun, I might do it - otherwise no chance in hell.

Contributing to a C kernel sounds like a thankless job with crappy tooling and high potential for errors - and frankly just brings back horrible memories (ahhh, the segfaults).

Developing in Rust, on the contrary, is quite enjoyable, and I'm definitely more inclined to contribute something.


Not really. C is a simple but dangerous language where the compiler doesn't do much for you; Rust is the opposite. It's like chainsaw juggling vs. bridge.


Life is too short to desire writing in C and/or C++.


I've been running Linux Mint on my MacBook Pro (2011 model) for the last year relatively issue-free.

For a couple of weeks I had DNS issues with the /etc/resolv.conf file, but I added a rule to NetworkManager to not touch that file upon reboot, so everything works correctly now (except pinch-to-zoom on the touchpad).
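
For anyone hitting the same thing: assuming NetworkManager is what keeps rewriting the file, one common way to stop it is a small override in /etc/NetworkManager/NetworkManager.conf, then restarting the service:

    [main]
    # stop NetworkManager from generating /etc/resolv.conf
    dns=none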


Apple hardware is terrible.

I wish someone could somehow make a laptop with the internals of a MacBook (mainly the CPU), but with the externals of a ThinkPad, including the far-superior ThinkPad keyboard and far better aesthetics. And while they're at it, make it easy to pop the back cover off and replace components (esp. the HD) as needed.


I have a Mac M1 Pro at work and, if you set aside extensibility, it's hands down the best laptop in terms of hardware I've ever had. By a wide margin.

It makes no noise at all, to the extent that I even checked a disassembly video to see whether it had fans or just passive cooling. The display is the best I've ever seen in any laptop. The trackpad is excellent, the keyboard I think is very good, and the speakers, microphone, and webcam are also very good.

I've previously owned a Lenovo T530, T460s, and T580, and quite frankly, if we're talking about just hardware, this is even better, and my last experience with the T580 was just bad. My perception of the Lenovo brand had been that they made the best laptops until then. In fact, I asked to get a MacBook because the T580 was so, so problematic, and not just for me.

I have to say I personally dislike macOS. In fact, because I'm now working on 100% FOSS software, I'm only using the laptop for corporate calls or situations where I have to deal with customer data; the rest I do on my personal computer with Ubuntu.


I can never understand how other people stand glossy displays, but to each his own.


I'm the complete opposite. I hate how hazy and dull "anti-reflective" displays look. I understand being opposed to reflection, but I'd much rather have a display that's as sharp as humanly possible.

To me it's kind of like washing your windows. Sure, it's fine if you don't do it. You can still see outside. The window works. You may even prefer that less light comes in if you're opposed to light for some reason. But once it's clean, you're really getting the full effect.


Matte or glossy, the sharpness of the display (all other things being equal) will be exactly the same. Glossy displays just look "nicer" because we associate glossy surfaces with higher value: glossy magazines, brochures, photos etc.


Matte ends up with lower contrast because it diffuses light and washes things out, especially in bright light. I think LTT did a video a while ago showing that it was actually worse in bright light than a glossy screen.


Matte surfaces prevent reflections by diffusing the light rays that hit them. They have the same effect on light rays that pass through them.

There is a reason your house and car windows are polished to a reflective surface. It provides more acuity.


I wonder if there's a way we could engineer a nanotech material that passes light through a screen in one direction without diffusion, but in the opposite direction diffuses the light to prevent reflections.


This isn't quite right, because the coating used on matte displays definitely makes them a little bit less sharp, as you can see by comparing the close-up: https://youtu.be/jFdtJzAgPtA?t=228


Sharpness aside, glossy screens provide a wider color range, aka gamut, just like glossy photo paper.


It’s worth mentioning that MacBook glossy displays are noticeably better than many (not all, presumably) glossy displays out there, probably because of a decent anti-reflective coating. Some of them on the market look about as reflective as a pane of glass placed over the panel even when on, while in contrast I rarely if ever notice any reflections on my MBP and still have the advantages in image quality.

I far prefer this anti-reflective glossy setup to my previous ThinkPad which of course had a matte display.


> while in contrast I rarely if ever notice any reflections on my MBP

Really? I see my own reflection even in indirect sunlight and it's causing significant eye strain.


It might also depend on the model. The one in question is a 2021 14” MBP with the Mini LED display, and the only other MBP I owned had a matte display.


Yeah the Apple displays are almost like a black hole. I stuck a glass protector over my ipad and noticed the screen became massively more reflective.


Perhaps they've mastered the skill of not pointing the display at a light source...

I'd take a glossy display any time of day, and have done so since 2005 or so, including for external displays. Better color saturation, no artificial fuzziness (which is exactly what the "matte" is in matte displays; they literally reduce glare through a mesh that also kills contrast and clarity).


I bought two matte monitors: 4K 27", IPS, sRGB. My biggest complaint is how hazy/non-clear they look. I personally started to think glossy displays are superior, but I understand others have their own preferences.


I typically work on a 30" Ultrasharp 2560x1600. It's pretty great for media at a distance and physical screen space, but it's so refreshing when I sit down with any of my MacBooks and use theirs for a smaller task. I'd really like a good 5k 30"+ IPS (not 27) but that's just a dream


For me the secret has been in managing the level of the backlight. If the screen is at least as bright as the ambient light level, reflections will not be noticeable. Kind of like how house windows seem so much more reflective at night.

In very bright conditions (like bright sunlight), the backlight can't be bright enough. But in those conditions, matte screens suck too: they look super washed out, like there is a thick fog in front of the screen. (Optically it is the same phenomenon: diffused light rays start to overpower direct light rays.)


Are you exclusively working with the sun to your back?


It depends, but I'd really hate for my laptop use to be limited by the lighting of the room I'm in.


Let's not act like most of the other matte displays on the market get bright enough to be useful with the sun hitting your screen anyway. Most aren't bright enough even for sitting outside at noon.


My Fujitsu-Siemens laptop from 2005 was great in the sun.

The backlight obviously couldn't compete with sunlight, but the LCD behaved transflectively under enough light, so my Emacs session out in a meadow on a bright, sunny day was perfectly readable. That was unexpected. It wasn't an advertised feature. I don't even know if it was deliberate, but it worked great.


Why, do you often work in random unfamiliar rooms, or cafes that impose a specific seat on their patrons?


> best display.

Well, 6th place

https://www.notebookcheck.net/The-Best-Notebooks-with-the-Be...

But yeah it is pretty good.


I said it's the best *I've* ever seen. I never claimed it was the best in the market or that I had seen every other laptop in the market.

I don't know the criteria for rating display quality, but scoring within 2% of the best one is still arguably a very good rating.


The latest M1 MacBooks frankly felt like iPads with a permanently attached keyboard. They offer none of the convenience of a typical Wintel laptop, namely getting software running out of the box, and the mouse and touchpad lag is atrocious. I don't know how Mac users put up with it.

Though the only experience I've had with Macs is at Apple stores.


> namely getting software running out of the box

What do you mean? The lack of a package manager?

> and the mouse and touchpad lag is atrocious.

I don't know which timeline you're commenting from, but as someone who works with both Lenovo and Apple laptops, there is simply no comparison. The Lenovo's trackpad borders on unusable. In fact it is unusable when booting/waking from 'sleep'. It literally has to warm up! I use an MX Master 3S with both, and it works fantastically well connected via Bluetooth on both platforms - surprisingly so for the Lenovo.


These would be great arguments against MacBooks, if any of them were even slightly true. Like "touchpad lag is atrocious" - wtf? This is utterly false. And "getting software running out of the box", what are you even talking about?


None of this comment makes any sense at all. MacBooks can be bought with tons of software preinstalled (or did you mean it's missing apt-get? Try brew). And the mouse/touchpad are as good as it gets.


MacPorts is a better alternative to Homebrew. More packages, better design, and created by an ex-Apple engineer who was also behind FreeBSD’s ports system.


Parallels runs Windows 11 & Debian wonderfully in my experience for running anything I can't in MacOS. If someone could just get a good port of Android running well on an M1 it would be the ideal solution for me.


> Android running

Sounds like you're a consumer type of user.

I dabble in embedded development, and more often than not OEMs release drivers and toolchains for Linux and Windows, and those drivers are too low-level to be emulated properly, if at all.


It depends on the domain. Most scientific software, on the other hand, works out of the box on Linux and Mac, and if you want to get it to build and run on Windows, well, you're most likely going to be the first one, so good luck if there are any issues, as the authors won't bother with non-unixy compatibility.


You could try running Waydroid in Asahi - that's theoretically possible, at least.


Are you thinking about the 2016-2020 Mac hardware? The loud and hot machines with broken keyboards and a useless TouchBar and no IO can be described as terrible. But their recent machines? Powerful, always cold, never makes a noise, pretty good keyboard, industry-leading touch pad, decent IO? I don't think you can defend calling those machines terrible, even if you prefer ThinkPads.


Yes, I can defend calling them terrible: they're terrible because they're ugly and the keyboards suck. The internals seem fantastic, but the outsides are awful (except the touchpads, and also the screens are reportedly excellent but I'm not sure if that qualifies as "internals" or "external").

What we need is a computer with the internals of a modern M1 Macbook, but the aesthetics and keyboard of a Thinkpad of yesteryear, plus a magnesium chassis.


I personally think they're good-looking, they diverge from Apple's traditional "ultra-sleek form over function" design style and enter the area of a somewhat industrial look IMO. They can be described as boring, but they're not exactly eye-sores. But this is clearly subjective. Personally, I don't buy computers based on looks, as long as they don't look like those gnarly "gaming"-branded products.

The keyboards though? I've had a range of laptops: some Dells, some HPs, a 2021 MacBook Pro, and a 2011 MacBook Pro. I would simply describe the keyboard as "meh". Most of the laptops I've had have been slightly less comfortable to type on, but they've all essentially done the job (with the exception of one Dell which had a truly terrible keyboard).

No offence, but it looks like you feel the need to describe everything as either "terrible" or "amazing", the keyboards can't just be "not as good as they should", they have to "suck", the externals can't just be "boring", they have to be "awful". I think there's a version of what you're saying which people would agree with (or at least find unobjectionable), but as it is, nobody will agree with you that an objectively average laptop keyboard "sucks" or that an aesthetic lots of (most?) people like is "ugly".


You are entitled to your opinion, but many people like how the MacBooks look. They also like the keyboard.

Many people also dislike how ThinkPads look, and dislike how the keyboard works.


I hate Lenovo keyboards. Too spongy, and key presses often don't register.


Was that a Thinkpad keyboard, though? It's a different category. Though they have declined over the years as they went to shallower keys.


P14s and X1, sorry, should have included that.


“Ugly” and “suck” are highly subjective and depend on people’s personal tastes. You could say that MacBooks are terrible for upgradeability but “keyboards suck” is pretty generic.

Older MacBooks did have issues but Apple generally delivered on what most mac users wanted.


I guess ThinkPads are pretty in the same way an old Quattro or Golf are, generously, but to me it's a real reach to call them prettier.


With the exception of the notch, I think they are at the top end of good-looking laptops. The ThinkPad Z13 beats it in terms of aesthetics. But still. The new style is miles and miles superior to the all-too-common wedge shape.


What do you mean “reportedly”?


Do they still ground through the user like the intel models?


This is fixed by using a power cable on the charger block instead of the bunny-ears adapter. Apple removing the power cord in favour of only including the bunny-ears adapter was such a mistake.


It doesn't help that the power cord extension they used to ship (and can still buy for $20) flat out refused to be coiled in any sort of way. Even a large loop would spring back straight immediately. Absolutely awful experience. I kept my old power cord extension from my MagSafe 1 charger from 2009 for this very reason.


Apparently they still do [1], even though I have an M1 and never experienced any of the "micro vibrations" that I had on the old Intel Macs.

[1]: https://forums.macrumors.com/threads/macbook-pro-m1-max-almo...


Oh, I finally have something of value to add to a HN thread!

I was always bothered by this and managed to fix it a few months ago. I had grounded my power-plug-box to the central heating system with some copper (don’t ask), but was wondering why I still got those vibrations whenever I was charging the MacBook.

Turns out, charging through just the monitor (via Thunderbolt) solved this: the monitor was grounded. The default MacBook charger (EU) plug has just two prongs; a third one for grounding exists, but has to be attached separately.

Edit: indeed, this was on a 14” 2021 MBP. They definitely still get the vibrations when connected to a power source without grounding.


Yup, and you feel static charge when it's plugged in, too.


Actually my 2019 13" is still pretty good, but the new 16" M1 pro, though huge, puts it to shame in all of those dimensions.


"Apple hardware is terrible"

Interesting take. As an example I have an Apple laptop with a trackpad that is quite useable, something pretty much unheard of on anything else.

The keyboard feels much better than my old Thinkpad, but this might be subjective. I've not been impressed with the recent models (although admittedly haven't spent long with them).

I'm with you on the upgradeability/repairability/maintainability, but maybe a Framework is a better fit in that case.

My Air outperforms my desktop, has close to 20 hour battery life in my normal use, and is completely fanless. It's not perfect, but I don't see anything else that comes close (if I did I'd probably switch.. macOS is driving me more crazy with every release).


>My Air outperforms my desktop, has close to 20 hour battery life in my normal use, and is completely fanless. It's not perfect, but I don't see anything else that comes close (if I did I'd probably switch.. macOS is driving me more crazy with every release).

I'm on my first MacBook - an M1 Air - and now I finally see what the hype around Apple hardware is. I love ThinkPads, but between the battery life, the speakers, the display, just how light it is, and how it never runs hot no matter what I do to it... I can't believe I went this long without trying one out.

The hardware plus improving linux compatibility means I know this machine is worth it


I've been a diehard ThinkPad user for over a decade now (an X201 was my first, back in 2010). The Apple M2 Air is on another level despite not having a TrackPoint. Just to add to your list: the aluminum shell feels nice too. Paired with this CPU it just never heats up, whereas the shell acting as a big heat sink used to be a drawback.


I love thinkpads, but after getting used to a mac trackpad I see why people always raved about it. That being said I do miss the trackpoint, and I like the old school t420 keyboard.


I'm really rooting for Asahi, but I'm not an advocate for desktop Linux. And I love Linux.

macOS really nails as much of a consumer-facing operating system as I feel it's possible to get right without taking something else away. XProtect, Gatekeeper, SIP, and the Secure Enclave just feel like the optimal combination of methods to secure the machine while not overwhelming the user.

I wish it was possible as a power user to get more information from logs in the background to ensure everything is working as it should, but Eclectic Light Company have that covered quite well (https://eclecticlight.co/downloads/ if you want to check out their tools).

It's also kind of love/hate. There's a lot of locking down that happens - some it for genuine user protection (cool), some of which is to do with market competition (the whole Epic debacle).

Sometimes they change things that shouldn't be changed and I don't know why (the Settings screen on Ventura is horrid and makes it harder to see permissions settings). Then they announce end-to-end encryption for photos and iCloud and I'm back onboard.

It's also why I don't understand the Windows 11 hate. I run it on another machine and I'm loving it as a direction for Windows - more of the security apparatus runs in the background like macOS, but the information is still available to you as a power user. Mandating TPM2 is absolutely the right call (they have workarounds for non-TPM2 machines), and being able to have cryptographic trust roots down to the boot image is a massive user-friendly feature.

I wish there was a means by which Linux could have software certification so that malicious applications could have their certificates invalidated, but I get that as the FOSS community is decentralised there's a who-watches-the-watchmen question. Having a mechanism baked in at the kernel level could be useful to solve this if it allows me as a user to "subscribe" to an authority if I choose (i.e. if I'm using Ubuntu, I could choose to use a Canonical or Google whitelist for signed apps, or none at all, sort of analogous to how the web has had multiple certificate authorities for many years, except that I could freely choose my own authority for whitelisting if I wanted to).
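
To make the idea concrete, here's a toy sketch (entirely hypothetical - a real system would use signatures and certificate chains, not bare hashes) of checking a binary against an allowlist published by whichever authority the user subscribed to; it assumes the third-party sha2 crate:

    use sha2::{Digest, Sha256};

    // Stand-in for an allowlist fetched from the user's chosen authority.
    const TRUSTED_HASHES: &[&str] = &[
        // hex SHA-256 of the empty file, used here only as a placeholder entry
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    ];

    fn is_trusted(binary: &[u8]) -> bool {
        let digest = Sha256::digest(binary);
        let hex: String = digest.iter().map(|b| format!("{b:02x}")).collect();
        TRUSTED_HASHES.contains(&hex.as_str())
    }

    fn main() {
        println!("trusted: {}", is_trusted(b""));
    }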


I can't really compare anything to windows because (luckily) I don't have to use it.

I just wanted to try one of the new M1 machines, and was aware that MacOS has some unix/bsd roots that would make it feel a little familiar. After using it, it feels like a team of people made unix/bsd into a user friendly OS, even if it is more locked down than a linux distro.


The words you are using are sadly not true of modern ThinkPads.

Apple's screen, power management, chassis, and touchpad are all dramatically superior.

The stereo on the Macbook Pro 16 inch is absolutely absurd. It's the stereo for my HOUSE.

Apple's got a killer supply chain that delivers more for less.

Modern Thinkpads have been ruined by the pivot to the ultrabook-esque approach.

ThinkPads are like models.

There's simply no reason for them to be that skinny.


Yeah, and that's kinda sad. However, old ThinkPads are still totally usable. I was using a T420 until a year ago, and the only thing I hated about it was the absolutely awful screen. It was 1600×900 with absolutely horrible colors. There are a few ways to install a better display, but it might be tricky [1].

Otherwise it was a decent "daily driver" laptop, in many aspects superior even to current-gen MacBooks, particularly in repairability, ports, and keyboard quality. I'm using a 2020 MacBook Air now, which I bought to try out the M1, and some aspects where it excels are the screen (obviously!) and speakers, but I think those are comparable to slightly more modern ThinkPads (xx40 maybe?).

[1]: https://www.thinkwiki.org/wiki/Replacing_T430_screen_with_a_... (ThinkWiki got some terrible ads since I last checked it, get your ad blocker ready)


I have a T420 (and a T430s) as well, but I think at this point it no longer qualifies as a daily driver. Even with 16GB of RAM there are times when the machine is just slow. Even with a replacement battery I only really get a couple of hours. Performance is just not there for anything serious. I've used Linux and BSDs on it for about 5 years. It's true you can repair it fairly easily, but after getting my hands on a refurbished M1 MBA, there is a huge difference in quality, performance, battery life, etc.


Yeah the speakers in the newer macbook pro models (even the late intel ones) are really impressive. I almost never use them and when I happen to play audio through them I nearly always catch myself thinking "wow, those are really good".


I'm sorry, but you're letting your ThinkPad-snob side show. I understand where you're coming from because I thought the same thing. I used ThinkPads for a long time (by choice), but the build is shit compared to what Apple is putting out. Maybe they were better in the past, but they need to keep up.

I moved from ThinkPads to the 14" MBP and I can't think of a reason I'd want to go back to a ThinkPad other than being able to run Linux.

I’ll leave the “far superior” keyboard aspect aside because it’s personal preference. I got used to the Apple one and I type much faster than on my x220.

But the screen? Come on. Mini LED, high brightness - it just looks absolutely great. ThinkPads usually have shit brightness and basic coatings. I've even gotten panels from China to make custom replacements for some of mine because of the garbage panels they came with.

Speakers are vibrating turds on ThinkPads. One day my gf brought her laptop to bed to watch a movie and I thought, "there is no way this is going to be enjoyable." Then I heard it and I was blown away. I then ripped the speakers out of mine and made a custom housing for some speakers out of a MacBook Air. They could have sounded even better if the whole chassis didn't rattle with the added bass and if the speakers fired upward.

The soft-touch coating on the shell that peels off, leaving it half shiny and half matte - classic, no?

They are tied on the hinges; both are solid. ThinkPads may have more ports, but honestly I don't go around plugging my laptop into too many things.


As a Linux user I've had many ThinkPads. The X1 Carbon does have a great keyboard. But to my knowledge there's no other laptop with a display as good as a MacBook's Retina. Look at any ThinkPad next to a Retina display; it's night and day. That's why I keep using a 2015 MBP with Linux as my daily driver. Looks like I might be able to upgrade soon thanks to Asahi.


The Dell XPS has a fantastic OLED screen that is better than anything I've ever used before. I think the Apple laptops have slightly higher pixel densities, but it's all the same when they're as dense as they are.


Thanks, I'll take a look at the XPS.


So, your "Apple hardware is terrible" comment wants the ThinkPad "keyboard and far better aesthetics", but still prefers Apple internals, especially "the CPU".

In other words, you claim "the hardware is terrible", but your actual case is "I prefer the external design of the Thinkpad".

I guess I'll give you the keyboard. Which is more than compensated for by the far greater touchpad, incredible speed, battery life, and coolness while doing all of the above (on the M1).


I vastly prefer the Thinkpad touchpad over the Mac one and do not get what people see in the Mac's touchpad. For me Macs have better cooling, better battery life, better CPU but worse input devices (keyboard and touchpad). Macs have better glossy screens but since I prefer matte screens I prefer the screen of the Thinkpad.


IMHO the general design of ThinkPad touchpads, with the additional buttons at the top (if it's a ThinkPad that has them), is much, much better even if you don't use the trackpad.

But if you don't use those buttons, Apple touchpads tend to be better in my opinion.


IMO, the Apple touchpad really shines when you incorporate gestures and turn on "tap to click".


I'm still annoyed they moved three finger drag into the accessibility options. It's one of the defining features of the trackpad in my opinion.


Wow. I had no idea this was possible. Clicking and dragging (ie holding the click) on my Mac trackpad always felt cumbersome - this is a lot better.


Aesthetics lie in the eye of the beholder.

But it is my understanding that the touchpad on MacBooks is far superior to most other touchpads. I don't know about the keyboard, though.


MacBook touchpads are quite famous for being better than most of the competition.

ThinkPad keyboards are indeed much better, but only if you don't care about small water spills rendering your keyboard unusable.


How does a water spill render your Thinkpad unusable? Mine has drain holes to handle this situation, my work Mac can't handle crumbs. I don't want to think about how it would cope with liquid.


There are some models where the keyboard itself isn't fully water-safe, so there is a chance your keyboard dies but the rest of your system is fine. This is especially true if you spill not water but some other, harder-to-handle liquid.

There are also some products sold as ThinkPads which aren't really ThinkPads: some (many? all?) of the Yoga ThinkPads don't have any of the properties normally associated with ThinkPads. They are much less robust in pretty much every category and much harder to repair, even if it's just the keyboard.


Wouldn't that be true of Macbooks as well?


Yes, but with a MacBook the keyboard is screwed directly into the chassis, which is not threaded for them. It's literally designed never to be taken apart.

To replace the keyboards, Apple internally used to replace the whole top case. You, the business user, are likely to just be handed a whole new unit. This happened a lot for years and years, as we all remember.


So, if I spill water on my Thinkpad, I can drain it and probably replace the keyboard. Either myself or using the Lenovo support who will bring me a keyboard.

If I do the same on my MacBook, Apple has to replace it, as everything is fixed in place, and it's gone for a week. I only say that as I've never had AppleCare able to repair anything same or next day.

I still don't get the original point about Lenovo keyboards and spills because a spill would render a Mac unusable.


Me neither. I think everything after that (bogus) original point has just been others highlighting that some ThinkPads don't have spill trays as awesome as those that are demonstrated at 28:30 or so: https://www.youtube.com/watch?v=ig3xI8dUdm0

I was just agreeing with you that the situation with spilling water on a macOS keyboard still definitely seems worse.


Right, okay it sounded like you were defending the original point! Which really confused me.


> Thinkpad keyboards are indeed much better, but only if you don't care about small water spills rendering your keyboard unusable.

This kind of generalization is, as far as I can tell, the source of a lot of pointless discussions around ThinkPads.

The truth is the quality varies, a lot. Sure, with respect to most aspects the quality is at least good, but whether it's better or worse than a Mac, especially with respect to the keyboard, is quite device-dependent.

For example, the one I have has a better keyboard than any Mac keyboard I've tried; it's quite nice to type on and quite robust, much more so than any Mac keyboard. There are some models where you can continuously pour water on the keyboard and they will be just fine. But then there are some ThinkPad keyboards which aren't, and Apple also had keyboards for a while which died from a bit of dust.

So in the end, general statements like "ThinkPads have the best keyboards" or "the most robust ones" or "Apple products do" are all kinda pointless. From both companies you can pick models to get whatever result you want, especially if you include some "fake ThinkPads" (sold under the ThinkPad brand but not really ThinkPads with respect to robustness, repairability, etc.).

My main point for favoring ThinkPads is that you can easily remove the keyboard and use an external keyboard until a replacement part arrives.

(Just to be clear: I'm not speaking about the very old ThinkPads; I only mean ThinkPad-ish ThinkPads, e.g. mainly the T and P series but not e.g. the Yoga ThinkPads; and I mean water, not liquids with higher acidity.)


IMHO keyboard quality in ThinkPads dropped like a stone after the 230s. It's like Lenovo lost the plans for how to build those.

The keys in the recent ones are incredibly mushy.


not if you have a model with a Chicony keyboard.


You are being downvoted into oblivion for that first line but I support you. Apple hardware look and feel is optimized for perception, not for practicality. I'll always prefer a laptop that's field maintainable even if it is a few mm thicker.


Frame.work? I'd rather have an ARM processor for the battery life, but aside from that it's decent. Just the Linux support is lagging. I wish people would focus on supporting that kind of far more open and repairable hardware instead of closed hardware. That said, my MacBook Air M1 is the best laptop I've ever had because of the battery life and solid hardware (and price... for the €999 I paid, there is nothing close even 2 years later). A Frame.work with AMD + a discrete Nvidia GPU I would buy, a Frame.work with ARM I would buy, but the better part of the dream is that someone can make these boards and I can shove them into my existing Frame.work. That's where we should be going.


I love my Framework, and would buy it again. But man, I'm not sold that the 12th-gen Intel chip is ideal. My battery life is pretty awful, and it doesn't take much to start the fan / generate heat. Even doing something simple like watching YouTube :/


“Better aesthetics” is an entirely subjective judgement.

Agree about being able to swap components though.


The issue with thinkpads has always been the externals! I've only enjoyed them for their budget internals and their openness via extensive reverse-engineering. The lack of a metal body of any sort more or less sentences them to an eventual crumbling, and iFixit rates most Thinkpad T420 body maintenance as "Moderate" to "Hard", so good luck repairing it when something on that plastic case cracks since it will happen eventually, and there's no Apple Store for thinkpad repairs.


> there's no Apple Store for thinkpad repairs

I pay around 50USD/year for an enterprise grade Thinkpad support plan with Lenovo, where they send an engineer to either my home or office the _next day_, complete with any spare parts needed to fix whatever might have gone wrong, whether it's accidental damage or a hardware defect. I've only needed that support maybe four or five times over the last decade, but each time it's been stellar: new screens, mainboards, keyboards, broken case parts, etc. No caveats or gotchas or 'ooh that voids your warranty' to worry about, ever. It gave me full confidence to run my company and equip all of my devs with Thinkpads that run on Fedora - so much so, that when we were acquired a couple of years ago, my only negotiating condition that caused a stir was the requirement that me and my team get to keep our Linux+ThinkPad stack.

What I just described is the polar opposite of every experience I've ever had with anything to do with Apple, ranging from the genius bar arguments to the six-week waits to fix our designer's spacebar that stopped working because someone dropped a breadcrumb in there. It just doesn't compare.

Side notes relevant to your comment:

- the T420 that you mention is now an 11 year-old piece of hardware, I don't understand why you're referencing it

- even so, plastic gets brittle over time. I don't know anyone with a 10+ year old MacBook that still runs

- iFixit are heavily biased, or at least they were when the T420 came out (it's in the iName)

- with all that said I still can't wait to be able to use a fanless desktop M2 as my daily driver (@LinaAsahi you're awesome)


Barely relevant anecdotes:

My mom daily drives my old 2012 retina Macbook Pro. Neither of us have ever had any problems with it. So it's possible for macs to hit the 10+ year mark!

I also still use my X200 tablet (not as a primary machine anymore, but it decoratively runs Creatures Docking Station 24/7). No crumbling or even any signs of aging plastic. That thing is still a tank.


I owned a 2012 rMBP (the 15 with the proper quad core) and it was an absolutely excellent laptop. I had it as a secondary machine for design/music work but used it quite a lot, and sold it to a mate of mine who used it every day until it died around 2019 I think.

Is your tablet the one with the 400nit outdoor screen?


Metal dents. Metal scratches. And if you get any of the colors other than boring bright grey, then it's really easy to scratch. And at least things like the keyboard and SSD are easy to replace on ThinkPads.

Nothing's cracked on my W520 yet, and I don't have to worry about plugging in the power adapter damaging the finish like I did with my Space Grey MBP.


I dropped my aluminum-body Dell XPS 13 on a tile floor (it was in my messenger bag, but I don't think that helped much). It hit the rear corner and mushroomed the metal quite a bit, but nothing broke. I'm reasonably sure that if it had a plastic body it would have broken.


Most Thinkpads are not normal plastic. The W520 is carbon fiber and glass reinforced plastic over an alloy frame.


You can just order body parts from Lenovo (true, probably not for t420 anymore) or third party resellers, or just take them from another one.

And there are a lot of ThinkPad service partner stores, mainly for business customers; I bring around 1-3 ThinkPads a month there for repair (we have ours close by).


> and there's no Apple Store for thinkpad repairs.

But there is a widespread network of service companies that will happily fix your devices, in most places way denser than the network of Apple stores. (Not to mention providing on-site warranty service.)


You don't need an Apple Store when the parts are sold everywhere by everyone, not only in a hipster store.


10 year cycle perhaps?

I had a 2012 MacBook and at the time it was the best laptop I'd ever used in terms of screen, keyboard, even performance.

I upgraded last year to an M1 Pro, and once again, it's the best laptop I've used, and I've had a lot of company laptops (Surface, Lenovo, Dell, HP) in that time.

I completely missed the keyboard debacle, the Touch Bar, and all the other drama of the late 10s.


As an ex-IBM guy who spent a decade on Thinkpads I disagree.

> Thinkpad keyboard and far better aesthetics

Thinkpads are great for other reasons, partially that almost everything was replaceable as you pointed out.

In the case of aesthetics, Apple is the clear winner.


The last time ThinkPads had decent keyboards was 15 years ago. Nowadays the Mac keyboard is probably the best laptop keyboard on the market.


The current Mac is leagues beyond any mobile device on the market, and it's not even remotely close. I am forced to use a Mac because of it.


Hmm, it seems you are talking about the Framework laptop?


Isn't it a shame (and waste of human life really) that everything has to be reverse engineered?


I don't want to put down their achievements but I somewhat agree with you. And doing free work for a corporation the size of apple that could've just supported Linux from the start. I don't think it's a waste of life necessarily, but it is a shame.


How would Apple corporate priorities shift from their current strategy of removing all GPL code? https://news.ycombinator.com/item?id=3559990


I dunno, put their money where their mouth is and license it as BSD instead? They seem awfully fond of the license, what with how much BSD code appears in MacOS...


Not really. Wouldn't it be a shame if nothing could be released without thorough documentation, to avoid the need for someone, sometime, to maybe have to reverse engineer it?


I don't think OP is saying that all hardware must have thorough documentation before it can legally be released. Instead, they're claiming Apple already has some documentation internally, and it costs them nothing to release it.


They should do it, but I doubt it would cost them nothing. That internal documentation is almost certainly not fit for external consumption and would need to be reviewed and rewritten in places.


> They should do it, but I doubt it would cost them nothing.

Let's be real here, they just don't want to do it.

The cost of the documentation review would be peanuts.

Apple has just broken capitalism; they're probably 5 years away from rivaling Saudi Aramco in profits:

https://companiesmarketcap.com/most-profitable-companies/

Financial excuses for Apple strategy decisions don't really hold up to scrutiny ;-)


You have to consider the opportunity cost: those people's engineering time is probably better spent on M3 GPU drivers. It takes really exceptional and kind people to achieve what they did AND prepare public documentation along the way for the benefit of the open source community. It sometimes happens, and we should celebrate it more, but it's not the norm.


Or, you know, they could hire good technical writers for a fraction of the cost and have them work together with developers, just like other companies do?

Devs would have to write just the hairiest parts.


Yep, that sounds reasonable. No idea why Apple doesn't do that; my wild guess is that there aren't enough technical writers who can understand low-level code, or it's just a company/team culture that doesn't value it.


You speak so blithely, I can see you've never gone through an Apple documentation review...


Speak less blithely yourself, then ;-)


Perhaps, but releasing it even in an unfinished state would massively reduce the duplication-of-effort which is needed for reverse engineering. There are often clues which would take a long time to find when working blind which you can glean even from a few sentences of concrete inside knowledge.


Honest question because I'm totally ignorant here: can anyone shed some light on the typical roadblocks for large corporations to release documentation like this one, other than "it's just easier not to do it"?


I think a big part of it would broadly fall under "legal and privacy concerns", especially since Apple Silicon is proprietary hardware. They need to work out which details about the hardware they might want to keep secret, and make sure the public documentation only contains things they're happy to share to the public. They then need to make sure it's up to their standards for externally published documents, it probably has to go through a review process, be published under a certain license which needs to be written or selected, and so on and so on.


Documentation for internal consumption often makes assumptions that no longer hold outside of the company. It might reference internal machines or drives that will not be available. It might take shortcuts that will be confusing enough that it will be worse than having no documentation at all. It might just be a few bullet points that refer to asking the right person internally.

If you've ever taken over a department at a company, and inherited the documentation, you've experienced this. It's sometimes better to just reverse-engineer the current state of things than to try to use outdated or misleading or very sparse documentation. Documenting the current state cleanly takes time, effort, and capability. It's an ongoing effort that requires budget and capacity.

And, most of all, it requires that the company see the need of having a good knowledge base, despite such a thing being a short-term cost that only pays out in the long term, and then, in ways that you won't be able to ascribe to one department's budget or another. Corporate structures can get in the way of cross-cutting/long-term benefits.


References to internal concepts (acronyms), hyperlinks to internal knowledge bases, insight into individual contributors and team/org composition, and mention of undisclosed efforts/strategy. And undisclosed failures, at whatever level, that stakeholders maybe should not see.


Once documented, anything becomes a public API that customers will expect to work forever with no regressions, no matter how many caveats and warnings you put on it.


This is always going to be the biggest barrier, even if there were no concerns around e.g. third-party IP, sanitisation, etc. Apple don't want to end up implicitly committing to supporting every element of the current M1 hardware forever.


A lot of surface area for security and legal attacks.


Not being required to do it sounds like a good enough reason for not doing something. Large corporation or not.


a) commercial-in-confidence, and b) non-disclosure agreements with partners.

Internal documentation may also need business/developer processes sanitised.


It's one thing to write for your team:

> This is a huge clusterfuck that requires A to be set before you can use B because of no good reason. :poo:

And another to publicly publish that an API is a turd because you outsourced it for time to market reasons.


Apple is a company with ~$200 billion in liquid cash. If they can't spend $20,000 reworking some spec sheets for the community, what the hell are we paying them for?


FOSS is like the exact opposite of apple. Anyone who has been paying Apple with the expectation that they're suddenly going to play well with others instead of locking everything down as much as they can is delusional.


A pretty big guess that it wouldn't cost them anything. The process of it all plus the secrets their competition could easily learn must both be quite valuable.


That is not really how opposites work.

The opposite of "none" is "some" not "thorough".


...No, that sounds wonderful. I want to move to that reality.


Or you could just open source it.


Welcome to medical software and ISO 62304 compliance


Yeah, it is. In the 90s everything had a manual. Does anybody remember how good the documentation was for new hardware? I was looking for an example, but Google became garbage and I can't find a good one.


A sheet containing all the specs, wiring, resistances etc was tacked to the inside of my 80s speaker set. Fun discovery when I revamped them!


Yep, that is exactly what I am talking about. Maybe it was more of an 80s thing.


I feel like this is overstating how good things were.

Yes, my ZX Spectrum and Amiga both came with some very nice documentation to at least the block level on how everything worked - but at the same time, every programming manual for the Amiga was an expensive Addison-Wesley tome. Every programming language beyond BASIC was an expensive proprietary product. It wasn't the paradise people seem to (mis-)remember.


Reverse engineering can be a fun adventure.


I'm guessing the number of people who just want to use the hardware massively outweighs the number of people who want to reverse engineer it

Even if the software were completely open source from day 0, a reverse engineer could still not look at the source and RE the hardware.


Their work is phenomenal but I also kind of wish we would all stop supporting such a company.


The people working on Asahi seem to have a lot of fun; I am really envious of the work they are doing!


That's a super cynical way of looking at it. What I see is a triumph in the very hacker ethos of being able to reverse engineer these drivers. The team is even fixing bugs and will likely get better performance out of the hardware than can be found on MacOS when it's all said and done. And all of this is done without specs.


What's cynical about it?

Apple is benefitting from the availability of Linux on these devices. That they do nothing to help this effort is disappointing.

Yes, it is a triumph in reverse engineering. I am very impressed. But why reward Apple despite not helping in this effort?


What's more cynical is that these hackers are helping Apple to eliminate the competition, and when that happens they'll put another layer of crypto on everything and the hackers and other developers have no platform left to work on without paying 90% Apple taxes.


> What's more cynical is that these hackers are helping Apple to eliminate the competition [...]

Nonsense premise.

> [...] they'll put another layer of crypto on everything [...]

That's just unsubstantiated FUD.


> > What's more cynical is that these hackers are helping Apple to eliminate the competition [...]

> Nonsense premise.

Take Brew and any container runtime away from all developers' laptops. See how useful they are for development when compared to the more open competition.


This is completely nonsensical. Take away your linux package manager and compiler and you'll get no work done either!


Err, the premise is that Linux tech and effort is improving the appeal of Apple products, which hurts the more open, Linux-friendly competition. Look at the rest of this thread comparing MacBooks to other options. My point is that a MacBook without the free work by these hackers/tech would be a paperweight for development and the Linux-friendly options would get more business. Your retort of "take Linux tech away from Linux" doesn't make any sense.

Apple is entirely comfortable with using crypto to lock down its platforms when the competition is dead and users are left with no other choice.


My package manager and compiler are part of my OS. Brew and container runtimes are not part of MacOS.


> That's just unsubstantiated FUD.

It's hypothetical. The problem is that the reverse argument is also unsubstantiated.


What are you on about?

Apple locking down their platform is their prerogative. If it comes to that and the Asahi team cannot continue their reverse-engineering efforts then they'll stop.

Consider why they began in the first place. The M1 and the subsequent M2 and future chips are amazing. The M1 was such a huge leap in performance per watt that it wowed everyone. In fact it's a huge testament to the hardware team at Apple, for creating such an amazing bit of kit that a team of Linux hackers wanted to work on porting Linux to it. This rebellious striving for freedom is refreshing and amazing. They're going to get the M1/M2/... hardware working under Linux, and it'll be even more performant than under macOS. That's huge.

But now folks are saying why Apple? Because nobody has a chip that rivals the M1. Why would you settle for worse performance? Why would you settle for build quality from a lesser hardware manufacturer? Qualcomm and others don't have chips that are as performant. They might in the future but by then the M3 or M4 will likely be out.

Why are we punishing hackers -- in the purest sense of the word -- for opening up a platform that is superior (hardware-wise) to any of the other offerings from all the other deep-pocketed ARM laptop/desktop manufacturers? Oh, right, because of tribal hate of Apple. Smh.

If the other manufacturers get off their butts and pour billions into chip design and process and can get laptops out with similar or better performance characteristics then perhaps other teams will attempt what Asahi is doing, and if these same manufacturers wanted to release the specs or work with the upstream Linux community and release drivers themselves that'd be even better. Until then I will continue to support the Asahi team and champion their efforts in every ear that will hear me because I am just so astounded by what they've accomplished so far.

I say all of this as a former Apple-stan. I had macbooks every year since college until now. That's some 15 years. I've since gone full time on Fedora and Thinkpads (currently a P14s but maybe an X1 Carbon in the future).

Edit: I don't think there will be a huge wave of sales because of this for Apple but it will mean that i can get a used M1 and run my favorite distro of Linux because the Asahi team is working with upstream to get their changes up-streamed -- they're amazing like that. It really is how open source work thrives.


Apple produces consumer hardware. You can't build anything hardware-wise using the M1 or M2 processors. There are hackers and startups who love to build new hardware. They now see Apple buying entire supply chains and dominating the market. If this continues, then after a while these hackers will not be able to fully depend on technological progress because it will all be locked up.


What? None of that makes any sense.

There’s been no chatter about Apple buying up or hogging wafers. Nobody is preventing others from building ARM based machines. The M1 and M2 chips are proprietary to Apple and so be it; the Asahi folks are allowing us to run Linux on them at full acceleration. What’s not to love?


> There’s been no chatter about Apple buying up or hogging wafers.

https://www.extremetech.com/computing/315186-apple-books-tsm...

It already happened for TSMC's first-gen 5nm node.

> Nobody is preventing others from building ARM based machines.

There is somebody, namely the ARM corporation that Apple owns a controlling stake in. So yes, Apple does prevent people from doing what they please with the ISA.

> What’s not to love?

You sound like the people preaching the Nouveau drivers right now. "we reverse-engineered this proprietary GPU and got it working at 50% speed and 3x power consumption, what's not to love?"

Nvidia's first-party drivers are far-and-away the more popular (and faster, more power-efficient, better-supported, etc.) option. What's "not to love" is the fact that we're cheering for someone doing thankless and redundant work that wouldn't exist if the multi-billion dollar corporation dedicated a couple of engineers to Linux support. You can't even consistently control the brightness on these machines more than a year after they've launched; it's obvious that there are significant problems WRT reverse-engineering the hardware.


“> There’s been no chatter about Apple buying up or hogging wafers.

https://www.extremetech.com/computing/315186-apple-books-tsm...

It already happened for TSMC's first-gen 5nm node.

> Nobody is preventing others from building ARM based machines.

There is somebody, namely the ARM corporation that Apple owns a controlling stake in. So yes, Apple does prevent people from doing what they please with the ISA.”

AMD and Nvidia and Apple are going to be buying from the new plant.

Again show me where ARM is preventing folks from licensing it? It’s antithetical to the whole of the company.

Given that what is possible is what you can control, and what you can control is what you do, in that light the Asahi devs took it upon themselves to reverse engineer hardware that they knew would not be opened. What's easier? Getting Apple to make like Intel and have a Linux division? Haha. So they took it upon themselves, and that effort is laudable -- nay, worthy of lots of praise. Heaps of it.

The Nouveau argument is a false flag. Think where they could get if they could get proper firmware. And you can get that on the Apple side.

I just don’t understand what people want? Awesome smart folks are working to open a platform that would be closed. And yet they get shit on. Instead people would rather whine and moan or write apology pieces about hardware that sucks in comparison.

Nobody is porting Linux to ARM Surface hardware because it sucks in comparison. Give me a 14-inch MacBook Pro with 32GB of RAM and Asahi Linux any day of the week.


> Again show me where ARM is preventing folks from licensing it? It’s antithetical to the whole of the company.

ARM is a proprietary ISA. To use it, you have to pay ARM money. It's literally their entire business model, I'm not sure how you could miss it.

> so in that light they the Asahi devs took it upon themselves to reverse engineer hardware that they knew would not be opened

Yep. It's a damn shame too, that's what everyone is saying in this thread. Apple has billions of dollars and they're letting volunteers do their work for them. It's a depressing waste of human effort, considering how Apple has the proper implementation specs available internally. It's undeniable that Asahi's development pace would be faster if they had rudimentary help from Apple engineers.

> Think where they could get if they could get proper firmware. And you can get that on the Apple side.

That's also a false-flag since Apple's firmware interface is undocumented. Plus it's also fairly outdated because Nvidia's GPUs have been shipping with firmware interfaces for years (since RTX 20-series). Think where they could get if they had open source kernel modules. And you can have that, on any recent Nvidia card.

> Awesome smart folks are working to open a platform that would be closed. And yet they get shit on.

They get shit on because they're wasting their time. It's been 2 years and you still can't adjust the brightness on these machines, not because they're incapable of it but because Apple never documented the control interface for each model. Apple has this info, they just withhold it from the community because of how horribly sensitive it is. Real security issue, yunno.

It's really tragic to consider all the engineering hours lost trying to figure out how Apple's hardware works. It's been 2 years since the M1 was released and it still doesn't have the same level of Linux support as a HP or Lenovo machine would have on Day-1.

> I just don’t understand what people want

A Macbook with Linux on it? Preferably one that doesn't suck.

> Nobody is porting Linux to arm surface hardware because it sucks in comparison.

And nobody ported Linux to the previous MacBooks because they also sucked. It's entirely beside the point, though.


"> Again show me where ARM is preventing folks from licensing it? It’s antithetical to the whole of the company. ARM is a proprietary ISA. To use it, you have to pay ARM money. It's literally their entire business model, I'm not sure how you could miss it."

What did I miss? Show up, pay the license, build chips. That's the business model. Are you upset about how business models work? Are you advocating all hardware ISAs be open a la RISC-V? That's insane. ARM's whole model is to make money, and to do so they'd welcome licensees.

"> so in that light they the Asahi devs took it upon themselves to reverse engineer hardware that they knew would not be opened

Yep. It's a damn shame too, that's what everyone is saying in this thread. Apple has billions of dollars and they're letting volunteers do their work for them. It's a depressing waste of human effort, considering how Apple has the proper implementation specs available internally. It's undeniable that Asahi's development pace would be faster if they had rudimentary help from Apple engineers."

Do you have this same take on the Homebrew project and its many competitors? One could make the same argument that Apple should run their own package manager. Why allow some third party project to add value to the system by allowing end users to be able to run open source software easily on Apple hardware and software? I find this line of reasoning nonsensical.

"> Think where they could get if they could get proper firmware. And you can get that on the Apple side.

That's also a false-flag since Apple's firmware interface is undocumented. Plus it's also fairly outdated because Nvidia's GPUs have been shipping with firmware interfaces for years (since RTX 20-series). Think where they could get if they had open source kernel modules. And you can have that, on any recent Nvidia card."

Nvidia is finally working on first party open source-ish drivers. So that's a win I guess. But that's only because the IP owner -- Nvidia -- deemed it necessary to do so. I am not sure what army of Stallman-stans you command but I am not sure Nvidia or Apple or any other enterprise is going to bend to some FOSS ideal. So given that very real reality intrepid hackers like the Asahi folks took it upon themselves to reverse engineer the hardware and it has been a win for Linux/BSD enthusiasts the world over, how is this bad?

"> Awesome smart folks are working to open a platform that would be closed. And yet they get shit on.

They get shit on because they're wasting their time. It's been 2 years and you still can't adjust the brightness on these machines, not because they're incapable of it but because Apple never documented the control interface for each model. Apple has this info, they just withhold it from the community because of how horribly sensitive it is. Real security issue, yunno."

Smart hackers -- again, in the truest sense of the word -- chose to spend their time doing this. In fact Hector Martin when he embarked on this asked for donations and plenty of folks are donating with their cash to fund this effort. There's clearly a market for this. It's not the fault of Martin or his friends in the Asahi world that Apple doesn't see this. And Apple may never see it. So what? The Asahi team will have brought the ability to run Linux to the M1 and increased the choice amongst Linux enthusiasts, it's a huge win.

"> I just don’t understand what people want

A Macbook with Linux on it? Preferably one that doesn't suck."

If Apple isn't going to give that to you, as we just settled above (unless you want to buy a few board seats, or march on Cupertino with some sort of army...), then how else is that going to get accomplished if not by the Asahi team?

"> Nobody is porting Linux to arm surface hardware because it sucks in comparison.

And nobody ported Linux to the previous Macbooks because they also sucked. It's entirely besides the point, though."

While not 100% easy people have been running Linux on x86 Macbooks for a long time. Not sure what you're getting at here.


> While not 100% easy people have been running Linux on x86 Macbooks for a long time. Not sure what you're getting at here.

Linux support on x86 MacBooks was pretty bad for recent models, and moved along really slowly compared to Asahi. There was definitely less interest in those porting efforts.


Come on, you have to admit that competition is lacking in the CPU space, and Apple being closed about everything isn't helping. If you want to see open, have a look at Microsoft Research. Apple is nowhere near that. FOSS people have no reason to like Apple, let alone to support them.


“ Come on, you have to admit that competition is lacking in the CPU space,”

Competition is lacking because Intel fucked up. Apple bet big on power sipping performance and it’s paid off. Why should they “help” the industry out when they’re so far ahead?

They also have no incentive to spend billions on R&D only to open it up to competition. That makes no sense. They’re not a platform like Microsoft is. MS wants an open platform hardware wise so they can sell more licenses of Windows. They’re different business models. Surely you see that?

“ FOSS people have no reason to like Apple, let alone to support them.”

Of course. And nowhere was I saying "FOSS" folks should support them -- just celebrate the work of your fellow hackers doing the equivalent of reverse engineering some Empire tech for the Rebellion. Does that metaphor work for you?


> Competition is lacking because Intel fucked up.

And now it's Apple's turn to fuck up. The M2 isn't even 20% faster than the M1, it's like a Skylake situation all over again.

> Apple bet big on power sipping performance and it’s paid off.

Apple bet big on the 5nm node (bought the exclusive rights to use it) and it paid off. Your marketing copywriting doesn't mean anything if you don't back it up with evidence.

> They’re different business models. Surely you see that?

They both make hardware. Shouldn't they both get held to the same standards, to encourage healthy competition? They certainly have the financial means to do it.

I think your technical perspective on this situation is horribly maligned, you should spend more time researching the technologies Apple used rather than repeating the words from their announcement event.


> Apple is benefitting from the availability of Linux on these devices.

How do we know this? I think there are downsides too.


And those would be...? Keep in mind absolutely everything in life has downsides, so listing only minor things is for all intents and purposes, worthless.


I would absolutely run a cloud Linux VM on M1 compared to any other hardware. Apple can create a PaaS that can rival AWS


They could... but doing so would likely cost far too many billions, even for Apple. Maybe they could justify it by saying that instead of paying AWS and GCP and Azure for services they could just do it all in-house on Apple hardware -- and boy howdy, that'd be really cool -- but I don't think they REALLY believe that's a useful use of their time and resources, and they'd rather focus on devices and services.


Have you checked AWS' Graviton?


Someone working on the RE effort mentioned how Apple runs their own testing on those machines on Linux while macOS support is being developed. I don't know where they got the information from. (It was posted on Twitter, but I can't dig out the link now.)


We know this because people right here are saying they find the Apple hardware more attractive now that it runs Linux.


Maybe Linux on M1 helps Linux at the expense of macOS.


I don't think the grandparent is saying that people are wasting their time; they're saying that it's a shame that they even have to do this, and that public hardware documentation isn't the norm.


They could spend their valuable time somewhere else instead of reinventing the wheel.


Is anyone using this as a daily driver? If so, what is your experience? I'm a Linux user looking for a new laptop. My preference would be a Thinkpad, but the Apple machine looks way superior. Migrating to Apple's OS is a no-go for me. My main use will be internet browsing, JS development with Vim, and running Docker containers.


About 20 days ago I asked the same question ("How ready for daily driving is Asahi Linux?") as a Ask HN, resulting in ~100 comments about it. https://news.ycombinator.com/item?id=33607994

Conclusion seems: depends on how ready you are to live with the various drawbacks. Personally, I wasn't, but I'm hopeful the day will come soon as I like Apple hardware in general, but can't stand Apple software.


Functional GPU drivers were a big missing piece! Even if they just do 2d bit-blitting for the desktop (and they seem much more functional than that), it saves the CPU a ton of work.


I vastly prefer my Thinkpad X1 Nano to my work MacBook Pro. I pair it with a desktop, so keep that in mind…

The nano is very, very lightweight, which makes it an amazing portable device for packing up and carrying around. The display is matte, which reduces glare when working outside. The keyboard feels significantly better to type on. This is the biggest pro for me. The camera has a privacy shutter, which gives me a greater peace of mind. And of course, it works well with Fedora Linux. I also optioned mine to have a 5G modem, which is convenient, although I rarely use it due to costly data plans. I have only managed to get the modem to work on the Windows side, but I’m optimistic it will have better Linux support one day.

The MacBook Pro is an impressive piece of hardware. The M1 chip is powerful, the battery life is amazing, and the build quality is high. However, I find it to be a much better experience exclusively using it at home docked in my setup due to its weight and glossy screen. At home, I can use my own mechanical keyboard when it’s docked to get around its mediocre keyboard. At that point, I’d rather just use my desktop. But if you’re only getting one device and are fine with MacOS, it is a good option. I prefer the more flexible desktop + lightweight laptop setup personally.

A minor thing I’ll note in favor of Apple is that the MacBook Pro is capable of driving my nicer Sennheiser headphones with ease. It’s something most people wouldn’t care about, but Apple excels in the audio department and deserves praise.


How many hours of battery can you get from the X1?


I’ve honestly never measured it, but it lasts me most of the day (e.g. ~8 hours) with the i5 chip I optioned it with last year. I wouldn’t classify battery life as a strength, but I wouldn’t classify it as a weakness either. On a normal day, I’m never worried about the battery dying. If you want more battery, the bigger sized X1 offers more than the Nano.

My normal workflow consists of Firefox playing music on YouTube, VS Code, and the terminal.


In another year or two you'll be able to buy an M1 Air refurb on eBay for maybe $400, and once this thing is stable, that would probably be the best bang-for-your-buck Linux laptop you can buy.


You can already find some people selling 2020 MBAs on Facebook Marketplace for $400-500.


I work at a network VAR and use it regularly, though not exclusively. The big remaining limitation at the moment is that the speakers are still disabled while the Asahi team works on volume safety. Other than that it's reasonably stable for non-critical use, but not something to be relied upon to work right by any means. I'd say give it another year, unless "I want it to tinker with" is higher on the list of reasons for getting a laptop than "I want to do work". If you need a laptop sooner than that, one of the x86 laptops commonly recommended by HN'ers would probably be the way to go.


Donate.

As a reminder, especially during this holiday season - to donate to your favorite OSS project.

https://asahilinux.org/support/


[flagged]


What have they done that's toxic? Total outsider to Asahi, btw.


What echo chamber, and what toxicity, are you referring to?


From the code of conduct:

[...] Be kind to others. Do not insult or put down other participants [do not make] Personal insults, especially those using racist, sexist, or otherwise discriminatory terms [do not] Deliberately referring to others by names or pronouns counter to their identity.

The "echo chamber" of "not being shitheads to people". Seems like a pretty neat echo chamber to me. I guess the parent poster finds it impossible to interact with others unless they can use slurs, threats, or what have you.


Surely the actual conduct of a community is always the same as its stated code of conduct, especially outside of it :^)

Also a nice jump to conclusions (and an ad hominem attack); you already know which ideological line you are pushing, regardless of whether it runs contrary to what takes place in actuality.


Truly incredible: in less than two years, a group of talented engineers managed to release the world's first GPU driver written in Rust by reverse engineering Apple hardware known for its opacity. Oh, and they also happened to have the free time to port Linux to Apple hardware.

Now I just have one question: when will this wonderful work be merged into the mainline kernel?

(PLEASE -- no one cares who Lina is, we've been there many times, let's not do it again here in this thread. Thanks!)


> no one cares who is Lina

I couldn't care less who she is, but I'm pleased they're giving more status updates in written form.


> when this wonderful work will be merged into the mainline kernel

Last year, they already merged part of their work: https://www.theregister.com/2021/04/09/asahi_linux_merged/

But I haven't seen anything since. It's the stated goal, though.


Provided Linus Torvalds has been test driving it, I'd expect Asahi to be merged in sooner than later...


Presumably GP means the GPU driver specifically.

There's a table here that shows not only what's supported vs. not, but also which minimum kernel version or linux-asahi/asahi-edge release it's in: https://github.com/AsahiLinux/docs/wiki/Feature-Support


Who is Lina? :-)

Edit: honest question...


I'm not sure if this is an elephant-in-the-room sort of thing or if people legitimately haven't picked up on it, but if you listen to the speech patterns and accent that Lina presents, (s)he speaks exactly like marcan. You can listen for yourself and form your own opinions, but I am firmly in the camp of "Lina is marcan".


There are only a handful of people in the world who can do this type of work, and the odds of both of them being into live streaming and anime are exactly zero. I have no idea why marcan decided to make his reversing magic unwatchable, but I hope it stops soon. That said, he doesn't try to keep it a secret: https://www.youtube.com/watch?v=effHrj0qmwk https://twitter.com/marcan42/status/1509926572488556546


look at the date


So you actually think marcan gave Lina his streaming API key to pull off this April fools prank? Do you also think they conspired to use the same DE, editor, and tooling from the start to fool us all?


Honestly? I don't care, you're probably right but if he wants to keep this double personality that's cool, who cares


I only care because I enjoyed watching him reverse engineer in his easy to understand normal unmodified voice.


Lina also lives in Tokyo, Japan; curiously just like Marcan.

What are the odds of two of the brightest kernel developers living in the exact same city? ;)


???? What are the odds that a tech metropolis with a population of more than 10 million people has 2 people who are proficient enough to contribute to the Linux kernel at a high level? I would say very high. I don't have an opinion on whether they are the same person or not, but your logic is hilarious.


Well, it is the most populous city in the world.


If Lina is an Apple employee or otherwise has access to Apple trade secrets, it would be a huge issue.

Merging code from an unverified source that is purportedly submitting a clean room implementation would be quite irresponsible.


Marcan has been explicit in that he knows who Lina is and has indicated that they have been in the same room together. This scenario is not a risk.

Others have discussed in here the speculation around Lina's identity if you care about the specifics for whatever reason. Personally, I don't think it's particularly important.


See if "Marcan said it's fine" flies with legal.


The Linux kernel doesn't allow for anonymous or pseudonymous contributions. It does allow for organizations to contribute on behalf of people working for or through them.

If Asahi wants to submit this upstream, then they and Marcan can put their name to it. Hector Martin or another person at Asahi would be the name on the git commit, and they would almost certainly be taken at their word that there is no concern with the merge around Lina's identity.

It's not like other commits which are done from clean room re-implementations are requiring a background check on the person submitting it.


Silly question.. is there a way to run Asahi Linux on a M1Pro MacBook in a live mode without installing it?

I’d love to try it for fun on a computer that is not mine…


Can I install to an external/USB disk?

Apple Silicon machines cannot boot from external storage. While it may look like they do when you choose an external macOS volume, behind the scenes parts of its boot components are being copied to the internal drive to make this work. It’s unclear whether this mechanism will ever be usable by third party OSes, for technical reasons.

Instead, we recommend using the UEFI environment only installer option to install only a UEFI bootstrap to your internal drive. This only requires around 3GB of disk space, and it will then automatically boot from any connected USB drive with a UEFI bootloader. Note: installing the Asahi Linux desktop images to a USB drive automatically isn’t supported right now, though if you’re adventurous enough it’s not terribly hard to do manually :-)

> https://asahilinux.org/2022/03/asahi-linux-alpha-release/
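
For reference, the alpha installer itself is kicked off from macOS's Terminal with a one-liner (per the Asahi Linux release notes; exact prompts and options vary by release):

    curl https://alx.sh | sh

It then walks you through shrinking the macOS volume and choosing either a full desktop image or the UEFI-environment-only option mentioned above.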


Sadly no, and it's unlikely for any OS to be live-booted on Apple ARM systems (effortlessly, like on x86 BIOS/UEFI). macOS does this cheat of copying the boot files it needs onto internal storage, and it's likely that Asahi (and any other OS) will need to set up a permanent partition just so it can pretend to boot from external drives.


M1 boot sequence requires some code to be installed on the system. Once the execution jumps to the bootloader the rest can be run from anywhere, such as USB stick.

It still requires adding a new partition with bootloader and some other files required by Apple for successfully verifying the signatures on everything, so it's not a "live mode".


Not as far as I know, but the installation process is the easiest time I've ever had installing any Linux.


Whenever there is an Asahi thread on HN, I like to ask people who are daily driving it - how is your experience?


I've been daily driving Asahi on M2 for months and it's awesome, but not all drivers are done; sound in particular and sleep modes seem incomplete. Neither matters to me personally for daily driving.


How is the battery-life? Is it comparable to MacOS? What kind of issues about sound and in general are the most annoying from the end-user perspective?


It's long, but not as long as it would be if the drivers were further along for screen brightness and processor/OS sleep states, I think.

I can work many, many hours not plugged in. I haven't measured how long.


Given this feat, it’s quite incredible that Nvidia is incapable of shipping stable Linux drivers for the RTX 30xx series.


Apple Hardware is starting to look attractive to me, as a diehard Linux user. But not sure if I'd want to do this to myself at this stage of development.


I will say do it if you're comfortable with something like Arch or Gentoo, but with less documentation on what to do when something goes wrong. Right now it's a little bit raw; don't get me wrong, the progress so far is amazing, but the task itself is gigantic too.


Yeah, I use Arch (btw). I'd just be worried that I would run into stability issues more frequently while having to be productive. Being a tester can be frustrating at times.


Congrats! There's not a whole lot left, is there? Audio via the speakers I think was the last one I was concerned about. Maybe brightness?


USB4, Thunderbolt, audio over Thunderbolt, DisplayPort, Touch ID. I'm not sure whether the mic and camera work.

Also various accelerators: video decoder, video encoder, neural engine.

Unless I'm mistaken all of it is in progress, but not yet ready.


> audio over Thunderbolt

I'm not sure why this rates a separate mention, or even what exactly you're referring to with this one. Did you mean audio over DisplayPort or HDMI? I don't think there's any standard for audio over Thunderbolt like there is for audio over USB, and if there was then it would automatically start working when Thunderbolt itself is supported.


Yes, I mixed it up, sorry.


Deep sleep; currently they have an s2idle state that eats a bit too much battery.


This is a big one — when you close the lid on your laptop, the laptop stays on. When you open the lid again, the battery will be drained, because it stayed on the entire time.


Depends on how low your battery was. Here's a quote from their November update:

"CPU frequency scaling, device runtime PM (for select devices), hardware auto-PM… even prior to this release, users could already get 10+ hours of idle runtime. With DCP and proper display DPMS, that now goes as high as 30+ hours (powered on, screen off)!"

[...]

"While s2idle does work, it’s in its infancy and we haven’t debugged all driver issues yet. Here’s what works:

    NVMe is shutdown
    WiFi goes into S3 mode
    Display (DCP) goes into DPMS (backlight & screen fully off)
    DARTs power gate & restore state on resume
    CPUs stay in shallow idle
    Some misc devices (i2c/spi/etc) power off
    Wakeup via power button or lid open
"

So they report 30 hours of battery life with display off (not sure if it was in s2idle or just normal screen off operation). So if you close the lid overnight, it should eat <30% of the battery.
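
If you want to check which suspend flavor the kernel is actually using, a quick sketch (standard Linux sysfs, nothing Asahi-specific):

    # the bracketed entry is the active mode, e.g. "[s2idle] deep"
    cat /sys/power/mem_sleep

On these machines only s2idle is on offer for now, which is why the screen-off idle numbers above are what matter in practice.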


Display controller stays on. It's battery-hungry. This should change, now that DCP driver is available.

s2idle will be pretty resource-light once all peripherals can be put into sleep. CPU by itself is not consuming that much power when idle.


Unfortunately, that's also a problem with like 90% of new computers out there.


This has been a problem as long as laptops have existed. I had a Dell in like 2006 that I would close and throw in my backpack at the end of the day. When I got home there was a 50:50 chance it would be about 5000 degrees with the fans blasting because it never actually went to sleep.

The fact that this still routinely happens 15 years later despite the insane progress of technology is kind of hilarious.


This is a widespread recent problem due to Intel pushing for S2 sleep.


The remaining 10% are the mac, I suppose? It's one of the better things about macs, and has been working mostly flawlessly for at least 15 years, strange that the other platforms still struggle with it.


There are a few non-Mac vendors that still support S3 sleep mode. I think Thinkpads still work. The problem is that it requires vendors to support it in the BIOS. Windows no longer supports S3 sleep mode, so vendors aren't willing to add that feature just for Linux users.

Previous discussion:

https://news.ycombinator.com/item?id=33846437
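
A quick, non-authoritative way to check whether a given x86 laptop's firmware still advertises S3 is the ACPI line in the kernel log:

    # e.g. "ACPI: PM: (supports S0 S3 S4 S5)" -- no S3 means no classic suspend-to-RAM
    sudo dmesg | grep -i "ACPI.*supports"

If S3 isn't listed there, you're stuck with s2idle ("modern standby") no matter which OS you run.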


My 2019 Intel MacBook Pro is really flaky about actually going to sleep when I close the lid. It seems to require me to reset the PRAM occasionally to get it to work again.

I'm glad they moved to their own chips, because these Intel Macs are the worst hardware I have ever owned.

My Thinkpad works fine under Linux; I didn't need to do anything to get great battery life and perfect sleep.


For what it's worth, my MBP is also a coin toss whether it manages to get to sleep or not. Oftentimes even if I explicitly click Apple -> Sleep, it just flashes and stays awake.


A very, very, very, long list of features.

There's actually a humongous list of things to support: peripherals, power states, and the speakers, which don't work at all.


Brightness is working already, as well as suspend. But there is a _lot_ of stuff to work out


Only on M1 devices, excluding the Mini & Studio: https://github.com/AsahiLinux/docs/wiki/Feature-Support



The important parts are working.


There might be different definitions of "important stuff" between you and others...


The base components are working, that's what I meant.


That definition still varies between you and others. I personally consider sleep to be a "base component."


A base component is one that's needed by the other components, such as the GPU, or a driver for the screen, and similar. A sleep function isn't a base component; it's rather a feature.


That definition is obviously subjective, and depends on the user.

Some users (like myself) don't need a GPU for anything, and thus a GPU is not "base." It's a frivolous addition for playing games. You can run a whole desktop environment on CPU rendering with no problem.

But going to sleep is necessary for unplugging a laptop and taking it on a trip, which is a basic feature for many users, like myself, and a basic component of having a "laptop."

Sleep is also necessary for the "battery component" to hold a charge while it's unplugged and unused.


How's M2 looking? Noticed this in release notes from a few months back

> Only the M2 MacBook Pro 13” is tested. We’ve added completely untested M2 MacBook Air support (because we can), but none of us have one yet! If you do, only try it if you’re feeling very adventurous (and don’t blame us if things go wrong).

I think it's time to give it a go on my M2 air :)


It has been fascinating to watch the progress being made on this in the past few months. Makes me wish I had continued studying Computer Science at uni...


If there's a way to get support up to OpenGL 4.6, then almost all Android and iOS apps can run natively (ARM) on that setup! Very cool.
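
If you're curious what the driver advertises at any given point, glxinfo (from mesa-utils) is the usual quick check; what it reports will track wherever the Asahi Mesa driver happens to be:

    glxinfo | grep -E "OpenGL (ES profile )?version"

(The announcement this thread is about describes OpenGL 2.1 / OpenGL ES 2.0 support, so 4.6 is still a long way off.)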


Does this enable Mac mini dual screen on Linux? Really been looking for that before switching!


GPU and display controller are two different pieces of hardware on M1 (and on most of computers out there except PCs), so GPU driver won't change anything.


Most "computers" out there are PCs!


There are more mobile phones alone than PCs, and phones are computers with screens.


Aren't M1s PCs too? I wonder why Macs are often not counted as Personal Computers.


> The designation "PC", as used in much of personal computer history, has not meant "personal computer" generally, but rather an x86 computer capable of running the same software that a contemporary IBM PC could. The term was initially in contrast to the variety of home computer systems available in the early 1980s, such as the Apple II, TRS-80, and Commodore 64. Later, the term was primarily used in contrast to Apple's Macintosh computers.

https://en.wikipedia.org/wiki/IBM_PC_compatible


> fast enough to run all of the above at 60 frames per second at 4K.

Does anyone know if it supports variable refresh rate? Apple's marketing term for it is "ProMotion", which is hard to search for.


Wow, that’s really exciting. I’m looking forward to having the option to run linux. I’ve been eyeing elementaryOS for years, but it will be hard to leave some macOS-only apps behind.


I wonder if it can run Solvespace (CAD). It's fairly simple but IIRC requires ES 3 or similar. I'm kind of embarrassed not knowing our GL requirement myself ;-)


Could this pave the way to being able to use Mac GPUs in Docker (Linux VM), running on macOS?


So, finally, native Docker on Macs? Is that an attraction for anyone?


Congrats!

On a side note, I wish more DEs and Wayland compositors would move to Vulkan.


What’s the game demonstrated in the screenshots?


Looks like Quake 3 and Super Tux Kart


It should be ready to be Fandaniel Linux now.


It's really sad you have to resort to reverse engineered drivers which will always be behind because Apple won't support Linux officially. You'd think the most profitable company in the world could do better. Really sad.

Not to dismiss these efforts, incredible engineering. But I won't buy Apple hardware unless Apple officially supports Linux. I honestly don't know why they wouldn't. More developers using their hardware is a good thing imho.


It’s sad you think this work is “half-baked.”

The reality is Apple will probably never release drivers for Linux or Windows (the latter of which ever again).

This work is the premier effort in its space and the reverse engineering skills required to accomplish it are exceptionally uncommon.

I doubt you could hire for this type of position remotely easily if you wanted to find the skillset for a corporate environment.


I wasn't trying to downplay the engineering. It's going to be half-baked because the developers don't have access to hardware documentation. When the next Apple chip comes out, it's back to square one. It's entirely Apple's fault.

Plus while they may be able to achieve good results, we'll never know if they took full advantage of the hardware or realized it to the full potential. Because it's proprietary.


> When the next Apple chip comes out, it's back to square one. It's entirely Apple's fault.

That's simply not true.

Source: wrote a couple of drivers for M1 (now upstreamed in mainline Linux) that work without a single change on M1 Pro and M2


Cool I'm glad you enjoy the Apple status quo and don't want official Linux support. But it's not for me.



