An app for M1 Macs that plays the sound of a fan as CPU usage goes up (rambo.codes)
843 points by spideymans on July 18, 2021 | 446 comments



Pre-M1, I always preferred a PC.

When I'm using a laptop, especially for real work, it heats up, the fans go crazy, the machine gets hot to the touch, and everything slows down.

It almost makes you anxious. You're wondering if it's about to crash or catch fire.

I've always kept a desktop PC around for this reason; a well-cooled desktop just doesn't have these issues. With a laptop, I always feel like I'm compromising a bit.

I got an M1 MacBook Pro recently and it just doesn't have these issues. I fire up my whole dev environment and get busy, and it's still quiet and (mostly) cool, and I can't notice anything slowing down.

I don't care so much about the architecture differences, x64 vs ARM etc, but the fact that I can finally use a laptop like a desktop is massive.


Same here, it’s almost a freaky experience. I run mostly from the browser but have a couple of things going at the same time. One thing I noticed is the blazing-fast typing on the Mac. There is an ever-so-small lag on Windows that’s barely perceptible while you’re using it, but when you type on the Mac you notice its absence, and the brain just feels good using it. I have no idea if this is a real thing, but it’s similar with scrolling: you touch the trackpad and things move, independent of the load I’m running. Now that many applications are becoming M1 compatible, I’m thinking of replacing my Windows box with a Mac Mini plus two external hard drives. I can’t imagine what will happen when they release a new processor. This one does more than enough.


One of the reasons iOS feels smoother than Android is that the render loop of the OS is decoupled from the app. The apps aren’t allowed to introduce jank, so if you’re scrolling a webpage and stuff is loading simultaneously, iOS will be way smoother. I think this is also why they can have such low latency on inputs, for example with the Apple Pencil, which is much lower latency than the Surface Pen or Android styluses. I had a 120Hz Android phone for over a year, and while the frame rate when scrolling is slightly worse on iOS, overall the OS feels more fluid to me. On a 120Hz iPad it’s no comparison.

I am speculating here as I don’t know for sure, but I remember iOS started as a derivative of OS X, so this may be the case for macOS as well. So I think it’s not your imagination; it’s a different input and render architecture than Windows or Android.


Android has had a separate "render thread" for ages, though it's per app and runs in the app process. Some animations run on it, but many do not. You can't access it directly from within the app.

The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.
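For what it's worth, the knob Apple exposes for this is the quality-of-service (QoS) class on threads and dispatch queues; on Apple silicon the scheduler strongly biases high-QoS work toward the performance cores and .background/.utility work toward the efficiency cores. A rough sketch of what that looks like from an app's side (the busyWork function is just mine for illustration, not from macOS):

    import Foundation

    func busyWork(_ label: String) {
        // Burn a little CPU so the block actually occupies a core for a moment.
        var x = 0.0
        for i in 1...5_000_000 { x += Double(i).squareRoot() }
        print("\(label) done (\(x))")
    }

    // UI-adjacent work: .userInteractive QoS is strongly biased toward the
    // performance cores, keeping input handling and rendering snappy.
    DispatchQueue.global(qos: .userInteractive).async { busyWork("interactive") }

    // Deferrable work: .background QoS is eligible for the efficiency cores,
    // so it stays out of the UI's way.
    DispatchQueue.global(qos: .background).async { busyWork("background") }

    // Keep this little script alive long enough for both blocks to finish.
    Thread.sleep(forTimeInterval: 5)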


> The thing macOS on M1 does very cleverly is scheduling threads on cores. IIRC all UI threads run on high-power cores, while all background tasks run on low-power ones. So they never interfere with each other. iOS probably does the same; Android probably does none of this.

This feature has been in the Linux kernel for ages[1]. Android and ChromeOS are based on the Linux kernel, and have had this feature for quite some time. This is nothing new.

[1] https://community.arm.com/developer/ip-products/processors/b...


So why does Linux UX feel slower?


Because in many cases, it is. Unfortunately, there isn't as much communication between the GUI people and the kernel people in the Linux community as there is between those same groups at Apple Inc. Not to mention, there are multiple competing groups of GUI people in the Linux community making coordination across these levels difficult. Also, there are many competing interests working on the kernel who might oppose kernel-level optimizations which favor desktop usage at the cost of, for example, server usage. As a result of these and many other factors, Linux's desktop UX remains far less optimized when compared to macOS's desktop UX.

As with most of Linux's rough edges, however, this is trivially fixable if you're technical enough. Of course, that's exactly macOS's advantage. If you want the best experience on macOS, you don't need to be technical.

Personally, I run Linux with a tiling window manager and lots of custom keybindings and little scripts. Having used the default macOS experience a few times at work (on machines far more powerful than my dinky laptop), I can assure you that my highly customized setup feels far more responsive. On the flip side, it required a lot of up-front investment to get it to this point.


I never felt the UI responsiveness difference between Linux and macOS, but Windows (including freshly installed on powerful many-core machines) is a different story. The number one reason I ever switched from Windows to Linux is the latter always feeling way more swift - UI responsiveness always remaining perfect and some background tasks also working much faster. And I never actually used lightweight WMs - only KDE, GNOME and XFCE. The first time I've noticed some slowness in Linux was on a Raspberry Pi (4, with the default LXDE).


This.

I think the main advantage with macOS is that it's above all else a system designed to be used interactively, as opposed to a server, so they don't have to put out a configuration "good enough for most things".

I also run Linux with a tiling window manager on an old machine (3rd gen i7), and it flies. One thing that made a huge difference in perceived latency for me was switching the generic kernel with one having Con Kolivas' patch set [0].

I'm using Arch and sign my own EFI binaries, so I don't care about out of tree patches, but for Ubuntu users and similar who don't want to mess with this, there's an official `linux-lowlatency` package which helps [1].

---

[0] https://wiki.archlinux.org/title/Linux-ck

[1] https://packages.ubuntu.com/search?keywords=linux-lowlatency...


X-Windows: …A mistake carried out to perfection. X-Windows: …Dissatisfaction guaranteed. X-Windows: …Don’t get frustrated without it. X-Windows: …Even your dog won’t like it. X-Windows: …Flaky and built to stay that way. X-Windows: …Complex non-solutions to simple non-problems. X-Windows: …Flawed beyond belief. X-Windows: …Form follows malfunction. X-Windows: …Garbage at your fingertips. X-Windows: …Ignorance is our most important resource. X-Windows: …It could be worse, but it’ll take time. X-Windows: …It could happen to you. X-Windows: …Japan’s secret weapon. X-Windows: …Let it get in your way. X-Windows: …Live the nightmare. X-Windows: …More than enough rope. X-Windows: …Never had it, never will. X-Windows: …No hardware is safe. X-Windows: …Power tools for power fools. X-Windows: …Putting new limits on productivity. X-Windows: …Simplicity made complex. X-Windows: …The cutting edge of obsolescence. X-Windows: …The art of incompetence. X-Windows: …The defacto substandard. X-Windows: …The first fully modular software disaster. X-Windows: …The joke that kills. X-Windows: …The problem for your problem. X-Windows: …There’s got to be a better way. X-Windows: …Warn your friends about it. X-Windows: …You’d better sit down. X-Windows: …You’ll envy the dead.

https://donhopkins.medium.com/the-x-windows-disaster-128d398...


> like Sun’s Open Look clock tool, which gobbles up 1.4 megabytes of real memory!

It's funny to read this in an era when smartphones come with 6 GB of RAM to compensate for developers' laziness and unprofessionalism.


Almost all major distributions use Wayland as of today.


Move on, it's 2021...


I use Plasma Desktop and it's been more responsive than macOS was, so I don't know.


On the exact same hardware, Linux always felt much faster to me. I remember doing stuff like resizing Finder windows vs KDE's Dolphin: Finder would be all janky and laggy, and KDE wouldn't miss a frame.



Yeah, my go-to for speeding up Macs is to throw Linux on them.


>the render loop of the OS is decoupled from the app

Can you elaborate on this? If you do too much work on the main thread in iOS, it's going to hang the UI. Isn't the main thread the "render thread"? Do scroll views have some kind of special escape hatch to get off the main thread for continuing to scroll if the main thread is blocked with loading the content?


I believe the point is that the "main thread" for your iOS application, is not the main thread of the OS. They're totally decoupled.


Err, same with Android? And every OS ever. That's just standard process isolation. Or am I misunderstanding something?


There are some deep-dive articles on the way the input loop works. But OP is correct, and that's the reason iOS feels smoother. Android has a lot more UI lag.


I don't think Apple has any special tricks for input loop.

Some Android phones really have input lag, but it is not caused by CPU load. For example, on my phone, there is approximately 100-150 ms of lag between tapping the screen and the touch registering. The lag is not caused by CPU load, but by a slow touch sensor.

I don't think Apple has any smart code optimization tricks. Either they have a faster touch sensor or just some people believe that if something is made by Apple then it is of a better quality.

Here is a comparison of input lag in games on iOS and Android [1] and it shows similar results: 80 ms between a tap and the reaction on screen.

[1] https://blog.gamebench.net/touch-latency-benchmarks-iphone-x...


They do! See https://devstreaming-cdn.apple.com/videos/wwdc/2015/233l9q8h... and the corresponding WWDC session on them minimizing the input-to-display time.


This is off-topic, but I love that the title of the slides is "233_Advanced Touch Input on_iOS_04_Final_Final_VERY_Final_D_DF.key".


I wonder why somebody working at Apple wouldn't just use git for that?


<laughs in Windows 3.1>


UIView animations do not run on the main thread, and will not be blocked if the main thread is blocked. This does help a bit with keeping the OS feeling smooth, but it is far from the only reason.
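Concretely, here is a minimal UIKit sketch of my own (view controller, sizes and timings all purely illustrative): once UIView.animate commits the animation to the render server at the end of the run-loop pass, blocking the main thread afterwards doesn't stop it.

    import UIKit

    class DemoViewController: UIViewController {
        private let box = UIView(frame: CGRect(x: 20, y: 80, width: 60, height: 60))

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            box.backgroundColor = .systemBlue
            view.addSubview(box)

            // The animation is committed to the system render server at the end
            // of this run-loop pass; after that, the app no longer drives it.
            UIView.animate(withDuration: 3.0) {
                self.box.frame.origin.x = 250
            }

            // Block the main thread in a *later* run-loop pass. The already
            // committed animation keeps playing smoothly while the app's own UI
            // is otherwise frozen (buttons won't respond, etc.).
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                Thread.sleep(forTimeInterval: 2.0)
            }
        }
    }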


This is not entirely accurate. iOS indeed uses a client-server UI model, kind of similar to X11. Along with submitting “widget” hierarchy updates, it also supports submitting “animations”. The downside is that the animation states are not truly accessible by the actual app after it submits them.

The scrolling animation is 99.9% of the time implemented as a client-side animation timer submitting non-animated hierarchy updates to the server. It’s common to have janky scrolling.


> Along with submitting “widget” hierarchy updates, it also supports submitting “animations”.

Is that how all these third-party iOS apps have completely consistent “pop” animations on long press?


No, that would be OS-provided GUI components (or sometimes manual reimplementation), similar to how most win32 apps had the same right-click menu behavior.


Off-topic: I see the lack of standardized OS-provided GUI components in Unix/Linux distros as the main root-cause for low Linux adoption. I'm assuming there isn't such a thing since I haven't been able to notice any consistency ever in the GUI of any Linux distro and/or any GUI app that runs on Linux :P But I may be totally wrong.

On-topic: they should build an M1 app that simulates loud coil whine at low-CPU usage, so you can feel like you're using a Dell XPS.


Well, there are common components, it's just that there are multiple standards (Qt, GTK, wx, etc)...

I'm using a tiling window manager with mostly GTK apps, so pretty much all menus and such look the same. The worst offenders are Firefox and IntelliJ, although they have improved a bit lately.

However, I'm not sure that this is the reason for lack of adoption. Windows has always been a patchwork of interface design, every other app having their own special window decorations and behavior, including MS' own apps (thinking of Office in particular). Also, seemingly similar elements, such as text inputs, behave differently in different places. For example ctrl-backspace sometimes deletes the previous word, sometimes it introduces random characters.


The unique thing about this is that it takes a widget from the app, and expands it out while blurring the rest of the screen. So it’s not just an OS gui component.


Blurring happens inside the UI server process. Here is a related technique in macOS: https://avaidyam.github.io/2018/02/17/CAPluginLayer_CABackdr...

Basically, it's like an iframe — you declare what contents this portion of screen should have, and the render server composes it onto the screen. The render server is shared between all apps (just like in X11), so it can compose different apps with dependent effects (like blur). Apps don't have direct access to the render server's context, just like a webpage doesn't have any way to extract data from a foreign iframe.


I wonder what made Apple give the iPad Pro such an awful screen though, considering all the software optimisations they do. I got the M1 iPad a month ago, and the screen has absolutely horrendous ghosting issues. Like, what is this? An LCD from 2001? Just open the Settings app and quickly scroll the options up and down - white text on a black background leaves such a visible smudge. It bothers me massively when scrolling through websites or apps in dark mode; it honestly doesn't feel like an Apple product to me. I honestly haven't seen this issue on any screen in the past 10 years, and here this brand-new (and very expensive) iPad with a 120Hz screen has something that looks like 50-100ms gray-to-gray refresh time.


This is a very interesting anecdote, considering the M1 iPad Pro is supposed to have one of the best screens available on any device of that form factor, with the same XDR display technology as their $5000+ monsters. Every reviewer has mentioned the display as the selling point. I have looked at them in person and have been debating buying one, but next time I’m at an Apple Store I’ll want to see if I can replicate what you’re seeing.

You might be experiencing the mini-LED effect where the backlighting is regionalized, which isn’t ghosting but can be noticeable.


It's on the 11" model, so definitely not a mini-LED issue.


Edit: the parent commenter does not have a miniLED iPad.

If you have the 12.9” M1 iPad, the ghosting is likely due to the new miniLED backlight which was introduced with that model.

This backlight is not static, but tracks to content in order to increase effective contrast (similar to FALD backlights on higher end LCD TVs). If the backlight does not track the LCD content fast enough, there can be ghosting.

In addition, since the LEDs are large compared to pixels, you can sometimes see haloing, particularly in small bits of white on a black background.

Overall, while the display is great for video, especially HDR video, it has some problems with UI and (white on black) text.


It's on the 11" model, so the regular old LCD, not the new fancy mini-LED.


No issues on my iPad; you might want to take yours to Apple.


I’m guessing it must be an issue with those new miniLED screens, or some other significant generational problem. I have two older 12.9” iPad Pros, one 1st generation and one 2nd generation (the first with the 120hz display). They are both excellent displays with no such issues with motion or ghosting.


I’m worried the upcoming upgraded MBP will only have this option. Although I read they released an update that should minimize this issue. Have you tried that?


To this note, I've noticed huge discrepancies in display quality in iPads. They were passed out in my last year of high school, and each one even appeared to have slightly different color grading. The whole thing was pretty funny to me, especially since I have no idea how artists would be able to trust a display like that.


The display on the mini-LED iPad Pro is Apple's first more-or-less in-house display. The panel itself is LG's, I believe.


Recently I visited the Apple store and compared the latest 11 inch iPad Air (A14 processor) and iPad Pro (M1 processor) models side-by-side. I loaded up Apple’s website in Safari and scrolled. The Pro is noticeably butter-smooth, while the Air stutters. My iPhone also stutters in the same way, but it’s never bothered me before. It’s only after looking at the performance of the M1-driven iPad Pro that I knew to look for it. And I previously had a similar experience noticing how much smoother my iPhone was than my old Android phones!

I don’t know for sure the processor is the difference, this is just a report of my observations.


That's not because of the processor. It's because the newest iPad Pro has a 120hz display and the other devices that you were comparing to have a 60hz display.


While 60Hz screen refreshes are less smooth than 120Hz, the small (but noticeable) difference due to refresh rates wouldn't be correctly described as "stutter".

The M1 processor makes a real difference. I have both the M1 and the most recent non-M1 iPad Pros.


Yeah you really don’t notice 60Hz scrolling until you try 120Hz scrolling. It’s a bit like you didn’t notice Retina displays until you tried one then looked at a pre-retina display. It’s crazy how you adapt to perceive things once you see something better/different.


Maybe I should hold off on the iPad Pro until I can get 120Hz on all my devices.


This is not really true, at least in any sense that really differs from other OSes. And, if you watch closely, you will notice that poorly-behaving apps running in the background can and will introduce jank in the foreground process. Since iOS lacks good performance monitoring, this (along with battery consumption) has historically been the easiest way to figure out if an app is spinning when it shouldn't be.


I'm sorry, I have a Samsung Galaxy S20 Ultra with a 120Hz screen. Using my girlfriend's iPhone 12 makes me dizzy.

120Hz is something you don't notice much when you enable it, but you definitely notice when it's gone.


Typing lag is such a sad result of all our modern computing abstractions.

https://www.extremetech.com/computing/261148-modern-computer...


Hm. I've always thought it was more of a result of our current display technology? Digital displays buffer an entire frame before they display it. Sometimes several frames. And the refresh rate is usually 60 Hz so each buffered frame adds a delay of 16 ms. CRTs on the other hand have basically zero latency because the signal coming in directly controls the intensity of the beam as it draws the picture.

Anyway, is it any better on displays that have a higher refresh rate? I feel like it should make a substantial difference.


CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

All CRT displays attached to computers in the last 40 years were driven from memory buffers just like LCDs, and those buffers were typically only allowed to change while the electron beam was "off", i.e. moving from the bottom of the screen back to the top. Letting the buffer change while the beam is writing results in "tearing" the image, which was usually considered a bad thing.


> CRTs are potentially worse.

Video game aficionados would like to have a word with you:

https://www.wired.com/story/crt-tube-tv-hot-gaming-tech-retr...

To be fair, much of this is the color and shape rendering, where pixel art had been tailored for CRTs.

Twitchy gamers do swear by “zero input lag” but are perhaps just nostalgic; the difference is likely to be 8ms vs. 10ms:

“Using the industry-standard definition of input lag, 60Hz CRTs don't have 0ms input lag. 60Hz CRTs have 8.3ms of input lag…”

https://www.resetera.com/threads/crts-have-8-3ms-of-input-la...


As you said, that article seemed to be more about the appearance of objects on a CRT than lag, and I kind of agree with the nostalgia crowd in that respect. But [raster] CRT lag is always going to be 16ms (worst case) and will never be better, while LCDs can in theory run much faster as technology improves.

If we shift the discussion to vector CRTs (which have no pixels) such as the one the old Tempest [0] game used, the CRT has a major advantage over an LCD and the lag can in principle be whatever the application programmer wants it to be. I miss vector games and there's really no way to duplicate their "feel" with LCDs.

[0] https://en.m.wikipedia.org/wiki/Tempest_(video_game)


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen.

Back when I had CRTs, 60Hz displays were the older, less-common, cheapo option. I'm having a hard time remembering a CRT display that wasn't at least 75Hz (I believe this was the VESA standard for the minimum to be flicker-free), but most of the monitors I used had refresh rates in the 80-90Hz range. I remember a beautiful higher-end CRT that had a refresh rate around 110Hz.

85Hz gives you a frame time of 11ms, which doesn't sound much better, but is a 30% improvement over 16ms.
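For reference, frame time is just the reciprocal of the refresh rate, so the numbers work out as:

    1 / 60 Hz  ≈ 16.7 ms per frame
    1 / 85 Hz  ≈ 11.8 ms per frame
    1 / 110 Hz ≈  9.1 ms per frame

    (16.7 - 11.8) / 16.7 ≈ 30% shorter worst-case wait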


Before multi-sync CRTs and SVGA, 60Hz was not the "cheapo" option.


I don't think you can get a display slower than a TV, and they do in fact update at ~60Hz (or 50Hz, depending on region). Of course you're probably only getting VGA, 240p, or less in terms of pixels.


> CRTs are potentially worse. It takes the electron beam 16 ms to paint the screen. If the electron beam is halfway down the screen and you change a pixel right above where the beam just painted, you'll have to wait 16 ms before you see anything change.

This is exactly the same as LCDs though, no? LCDs are also drawing an entire frame at a time, they're not "random access" for lack of a better term. There's just typically no image processing going on with a CRT* though, so there's no inherent latency beyond the speed of the electron gun and the speed of light.

*I'm aware there were some later "digital" CRT models that did analog->digital conversion, followed by some digital signal processing on the image, then digital->analog conversion to drive the gun.


I don't think that LCDs buffer anything. I've experienced screen tearing, which should not happen with buffering. Most applications implement some kind of vsync, which introduces buffering and related delays indeed.

The best option is to use adaptive sync and get rid of vsync. But support for this technology is surprisingly immature; it works mostly in games.


Screen tearing happens when the software swaps buffers on the GPU while the GPU is in the middle of reading out a buffer to send to the monitor. That tearing has nothing at all to do with whatever buffering is or isn't happening in the display itself, because the discontinuity is actually present in the data stream sent to the display.


See also this talk by John Carmack about buffering problems in input and output stacks.

https://www.youtube.com/watch?v=lHLpKzUxjGk


Sounds broken? Return it.


You have what I'd call latency sensitivity, or jank sensitivity. The most important thing in these environments is actually not how fast they are (or how low the latency is), but that the latency is consistent. With Windows and Android there is micro-jank everywhere. I know because, while most people don't notice, I feel a small pin-prick of pain at the right back of my head every time it happens.


Depending on what you're coming from, it could just be you haven't been using modern hardware. I agree some is faster than others. I notice a difference going from my Ryzen 5900x to my i9 11900K. Typing and everything is definitely smoother on my latter machine. Part of it is just software advantages. Apple designed the compiler and built the system for the M1. On Windows and Linux, Intel has a boatload of engineers working on their behalf.

The single core performance of my 11900K is really hard to beat at 5.3GHz, and with 7 of 8 cores hitting that frequency simultaneously (which it hits all the time in desktop use) it's just buttery smooth. My guess is that you just came from an older system. The advances in CPUs (along with good compiler programmers working on that architecture's behalf) in my experience are larger than reviews and colloquial recommendations would suggest.


Just FYI from a person who's used Macs since December 1984 and Windows since DOS, the instant UI responsiveness of Macs (when using any input device), regardless of load, was ALWAYS a feature, even going back to the very first Mac. I always noticed the lag on Windows and couldn't believe it didn't bother anyone... which of course it didn't, because it's all most people knew!


As a hardware nerd I always kind of like that moment when the CPU starts really working and you hear fans spin way up. The auditory feedback is like a throwback to a more mechanical age, like a manual transmission vehicle where you can judge when to shift by engine speed.


When I replaced all my spinning disks with SSDs and fans with ultra quiet ones I realized just how much I relied on fan noise to signal things. E.g. if my computer isn’t visibly on, the fan/disk noise tells me whether I need to power it on or just wake it. I’ve had to start changing my habits as a result.


And that dates back to 1988 for me. I’d always rely on the nice scratching sound of the HDD and the LED to know whether my computer had crashed or was just loading. There was no need for a spinning wheel when you could hear the data getting loaded ;)


I could tell what boot phase my 386 + 486 desktops were at by listening to the ticking of the HDD heads. When I installed Win95 on the 486 (the poor thing), the time until the o/s was finished with the drive quadrupled.


At the same time, we are losing the disk access LED.


... and then you get an electric car, and the only time you have any auditory feedback is when it's very hot out and you supercharge.


I like it if the laptop only starts the fan up when the device actually is working hard, not constantly turning it on and off during light web browsing.

My 2018 MBP 13” and the Surface Laptop I’ve tried did that well. An XPS 15 I’m currently using does not.


Web browsing isn't a light activity anymore. Web pages consume more memory and more CPU than my text editor


Has there been a time in history where web browsing was lighter than text editing, ever?


No, but there is now, "thanks" to Electron :P

More specifically, HN would use less RAM than VSC would to open a .txt file.

Eh, not bad, not good; we don't know what the new goalpost positions mean yet.


That’s why I specified “light web browsing”.


I don't understand why they sell the Pro. The Touch Bar is rubbish and the Air has 99% of the performance. Yeah, the Pro can go faster for longer without throttling, but it's heavier and doesn't have those physical F-keys.

It's more expensive too… not sure why anyone would buy it over the Air. Two more USB-C ports, but you can only hook in one external monitor! Also 200 more nits on the screen, but… is it really that important?

I can't wait for the 14/16 with the M1X. These are already barn burners.


Obviously, to people who choose to pay for it. I produce video news and the Touch Bar is integral to my workflow. Just because something is not useful to you doesn’t mean it’s useless.


The Touch Bar should have been in addition to the function row, really. On the 16" Mac there is plenty of space for both. Few people hate the Touch Bar itself; they hate the lack of function keys.


I hate it because it’s too easy to accidentally press a “key” and trigger some potentially large scale event.


This has always been my issue with it. For some reason it doesn't have pressure sensitivity like the trackpad (which also has no physical button). Lightly brushing the touchbar is enough to activate the controls on it.


I hate it because it takes three taps and a delay to change the volume or brightness, 99% of my use case.


Yep we share the same unpopular opinion. I really like the Touch Bar.


I do like the idea - and I saw a Windows laptop with a bigger 'touch bar' that I think is even more practical - but I just wished they offered a version without it. I heard the value of just the touch bar is in the region of $600 (random guess), because it adds some hardware from the Apple Watch to the Mac. That's a lot of money for a feature like that.


I'm curious, how is the touch bar used in your workflow?


Final Cut Pro most likely.


Correct.

The Touch Bar is also well-integrated into QuickTime Player as well as Quicklook, making video navigation much more efficient.

I only wish VLC would use it for timeline navigation.


And the rumors say that Apple will be removing it. Let's wait and see for the upcoming rumored 14/16 Macbooks.


> Two more usb-c but you can only hook in one external monitor!

Actually you still only get two USB-C ports on the M1 MacBook Pro. The four-port model is still Intel.

That lower-spec Pro has never made much sense. The last time an entry level MacBook Pro made sense was 2015, where you got a noticeably higher TDP chip (28W vs 15W in the Air iirc—could be mistaken), more IO (two Thunderbolt 2 ports vs the Air's 1, and a full-size HDMI port), and of course an IPS retina screen.

I think the low-end Pro came to existence because they intended it to replace the Air, but the tiny 12" MacBook had already claimed the unadorned "MacBook" name, which would have made a lot more sense.

But then they ended up refreshing the Air as people just never stopped buying the old one, despite its ageing internals and garbage 2010-era screen (possibly because it was the only computer Apple were selling at the time with a keyboard that worked).

I have no idea why it still exists though. It's filling a niche that isn't there. I think the intent is for it to be basically an "upmarket" alternative to the Air, but the only thing they have to offer right now is a Touch Bar. Which for most people, including myself, is more of a reason not to buy it.


I used Macbook Airs for 10 years. Had a week with a loaner machine which happened to be an MBP with TouchBar and I found it neat, and so ended up getting an M1 with TouchBar. I like it. I so rarely use the F keys that it's kind of nice to have sliders, emojis, mute button, et cetera up there. I almost certainly could live without it, but I used to hate on it and now at least opted for it.


I kinda like it too, but I resent it a lot because they displaced the function keys. Had they kept both, I think we'd be seeing more Touch Bar fans.

A lot of the time it’s really nice to have a context sensitive/modal form of input.


It actually has the same number of USB-C ports, the M1 MacBook Pros only have 2. But, from the perspective of someone who bought one, I got it for the sustained performance provided via the active cooling and for the extra battery life. Would love the physical keys instead of the touch bar, but I've used a touch bar for the past 5 years so it's whatever.


Having enjoyed using an Elgato StreamDeck on a windows box for a while, my next MacBook Pro upgrade cycle definitely includes a touchbar on the wish list.


>The Touch Bar is rubbish and the Air has 99% of the performance.

It only has 99% of the performance under specific loads. The Air is ~12W TDP, the MBP ~20W TDP. That is quite a difference (~70% more) when you absolutely push it to the limit on both CPU and GPU.

That is why half of the "it doesn't throttle" comments in this thread don't make much sense; they only consider the CPU usage pattern.


The point is that there are very few real world workloads that get it to throttle, and even more rarely will it throttle enough to matter. 99% of use cases are not affected.


The MacBook Pro, by Apple's definition of "pro", aims at creative professionals, who tend to use the GPU a lot more. That means they are much more likely to be throttled by the TDP limitation of a MacBook Air.

There are also a lot of other use cases where GPU or NPU are important.


This.

As I wrote in another comment, I produce video news.

My M1 MacBook Pro is a beast of a machine, but I work it to the point of full-blast fans every day.


> you can only hook in one external monitor

You can run multiple monitors on an m1 with a DisplayPort hub. (not relevant to the point about the number of usb-c ports of course, since you're plugging into the hub anyways)


Yep, the touchbar is pretty horrid. It's my personal decision point for the question "linux or mac?" If Apple fails to provide a pro model without the touchbar, my next laptop will be Linux.


The differences are larger battery, active cooling, touch bar, brighter screen, better speakers, and the base ram/storage model has the 8 core GPU.


The Touch Bar makes me so angry. Why? Because it's a less reliable laptop with a worse keyboard that exists for no other reason than to raise the ASP (Average Selling Price) of Macs. That's it.

Go back 5+ years and you had the (2010+) MacBook Air, which was fantastic. A good compromise of size and power at a really low price. Like, competitors just couldn't compete. The only problem? By 2015 or so the display was pretty outdated. It needed a thinner bezel and a retina display and it would've been perfect.

But no, Jony Ive came along with his war of thinness and foisted the butterfly keyboard on us and added the Touch Bar to increase the price. There's no other reason for it to exist.


My old clevo laptop has an i7 in it with fans that sound like hair dryers. It never throttles.

I would never buy a laptop where throttling was a presumed solution to thermal issues, because I want to get work done. Pre-m1 macbooks were awful. Even worse than top of the line dell laptops.

I have always felt like a minority for expecting a $2k laptop to actually be able to get work done.

If I could ever run Linux as a first-class citizen on an M1 (not going to happen) I would buy one in a flash. I have been playing with my mother-in-law's MacBook Air and the thing is pretty darn amazing.


If you're looking for a PC that's not a headache to put together, is semi-mobile, powerful and quiet have a look at AMD-based gaming PCs.

I got a Lenovo Legion with 4800H(8/16)/32GB/2T/GTX1650/144Hz a couple of months ago. That's a really sufficiently cooled system. It almost never spins up the fans, it's powerful, has a lot of space, no RGBs and has a super boring, tasteless, anti-theft design that never goes out of style (but the last one is more of a personal preference ;) It's been great.


One eternal problem with gaming laptops and Linux remains the dual-GPU architecture, which is always kind-of-but-not-quite supported. Most often it is easier to forgo either the discrete GPU or the integrated graphics, leaving the overall experience frustrating.

The fact that most (all?) gaming laptops come with NVidia GPUs (requiring closed drivers, unless Nouveau got much better since last time I tried it) doesn't help.


While I understand that this sucks, I've been running Windows as a primary dev machine for a long time now, so this particular problem doesn't affect me personally.


> It never throttles.

Just in case, are you actually monitoring temperatures?

Regardless, buy some proper thermal paste and pads and replace them on everything, it will quiet down significantly while also running cooler.

The difference between factory Thermal Interface Material and custom is often ridiculous.


I do sometimes when I do heavy things on it (long and heavy compiles or when I'm rebuilding some heavy datasets). I never tried with any synthetic benchmarks. The Mac throttles whenever I look at it the wrong way.

Edit: and I don't mind the noise. I will maybe do something if it starts throttling, but I have had 0 issues for as long as I can remember.


FYI, the one M1 MacBook which throttles under heavy load is the MacBook Air, which is actually a 1k laptop ($999).

The MacBook Pro has fans and basically never throttles as I understand from others.


Absolutely untrue. My M1 Air does not skip a beat at all even under sustained load. Barely gets warm.


It’s not untrue. It has been demonstrated in dozens of reviews that the M1 Air throttles under heavy load. Just because it hasn’t done so for your workload, doesn’t mean it isn’t happening for others.

Besides, when it throttles, it’s not a huge slowdown, so most people wouldn’t notice it “skipping a beat” anyway.


An MS Teams video call with 4+ people will get my M1 Air very toasty (because Teams is shit).

However, it's not as hot as my old 16" MBP got (which would get too hot to have on your lap), it doesn't noticeably get slower, and it's of course totally silent.


I know. I have an M1 and it hardly ever throttles. I was making a point to the OP though.


> a well cooled desktop just doesn't have these issues

Kind of depends on the steady state and your target turbo frequency. I have my fans basically idle when temperatures permit it, but aggressively ramp them to full speed to keep the CPU as cool as possible so as to allow the frequency to scale up beyond spec. This results in a similar effect as a laptop fan -- under heavy load, the machine is loud. I don't really have a problem with this; at idle my computer is using ~100W, at full load, it's nearly 1000W. That's a lot of heat to get rid of, and it has to go somewhere. Having a large water cooling loop can help with the noise, but since I really only use the CPU that heavily for bursts measured in minutes at worst, I think it's fine.

It doesn't make me anxious but I do kind of like the feedback that things are under load. (I have the ramp-up time set pretty long, so something like "go build" that only uses all 32 CPUs for a second doesn't really trigger anything. There's enough inertia to prevent that from raising temps much.)


Must be nice, using a fresh machine that's so silent. I recently switched from a powerful desktop PC with noisy fans to a passively cooled RPi4 running Ubuntu MATE. I just feel my productivity and focus going up now that I don't hear the equivalent of a small fighter jet taking off. The RPi4 is still a fighter though, very quick and powerful for its size, price and specs.


I like my 15” MBP, but the fans are extremely annoying. I had to switch IntelliJ to power save mode so I can hear myself think.

It’s quite surreal that the big Xeon server (an actual server, with optional rackmounts) with lots of spinning rust under the desk is still much less distracting.


You might want to look into Turbo Boost Switcher (http://tbswitcher.rugarciap.com), which in my experience does a pretty good job of keeping the CPU cool enough that the fans stay pretty quiet, with a minor impact on performance. (It's easy to turn it on and off from the menu bar, in case you need to turn the knob up to 11 and are willing to hear the fans again.)


This worked well for me also.


Thanks for the tip, folks. I passed it to my IT department for checking (PII, sensitive client data, etc) and, hopefully, approval.


Is this a "hardware has gotten fast enough" situation?

My latest laptop (Ryzen 7) takes a lot of beating before spinning the fans up to an audible speed. I've never seen this in laptops before either. (But well, my last one didn't slow down either; it just started making some noise.)


My Core i9 laptop has fans on just about as soon as it boots.


I just looked up some benchmarks. Apparently an M1 is 2-4x faster at many tasks than my wife’s gaming desktop from 5 years ago. It’s comparable in many benchmarks to my Ryzen CPU as well?

What’s the catch?


The only real catch is that you're currently limited to only 16GB of RAM.


I switched from a 16" Pro with 32GB of RAM to a 16GB M1 Air - I really didn't notice the difference, even though I frequently run some memory hogs (Electron apps, Chrome, multiple Docker containers)


And Big Sur and up... unfortunately the quality of the OS is not keeping up with the quality of the hardware.


How so?


let's not go there :} life is too short to compile bug lists for macos.


I feel like the critical bug list being short is basically THE reason power users would go for Apple laptops.


The best OS in existence.


Surely you're joking here.


No.


I haven't found that to be a catch actually.

I went from a mid-2015 15" MBP to 13" M1 with 8GB. I made the impulsive purchase to get 8GB ASAP rather than wait for 16GB (which was backordered). My primary apps are DaVinci Resolve, Lightroom and Photoshop. On the 15", the fans would run on full constantly. I would have to create proxy video files to work with.

On the M1, I don't create any proxies at all and the video files I work with just play in real-time.

Main catch for me is the two USB ports and one external screen limitation. I have a hub (card readers, USB, HDMI, etc) and a 10-port USB hub coming off that (with 7 external drives connected). As soon as there is a 15/16" available that handles two external displays, I will get that as well.

I dislike the TouchBar and the lone external display, but everything else about the M1 has been excellent for me.


I had assumed that Photoshop and the like would be GPU-bound, not CPU-bound.


Most likely her gaming PC's GPU is on par with or exceeds the M1's GPU. That would be the biggest downside in your specific comparison.


You can only use one external monitor


And that's only on the laptops. Two monitors are supported on the Mac Mini.


That is nevertheless only two screens.


>only two screens

Honestly I feel like multi monitor setups are actually worse than single monitor. Having to bend your neck to use some of your windows left me with neck pain. Now I only use one large monitor and use the workspace switching shortcuts to move everything around so I only look forwards.

My productivity is identical and my neck feels better. I'd only say it's worth it when you really do need to see a huge amount at once, like watching the feeds of 30 security cameras. Not for general work/programming.

It seems like multi-monitor setups have become more of a looks thing, because they make people feel like they are sitting in the FBI command center rather than at a generic office worker's desk.


Multi-monitor setup has been a giant boost to my programming productivity.

I can simply keep the specification/documentation/Stack Overflow alongside the editor/IDE window alongside the debugger. If I had to put the benefit into words: I can offload the state of my programming session onto the screen instead of holding it in my head, freeing up space where more of the abstract logic and the relational graph can reside before it gets turned into code.

On the other hand, it gets way easier to get distracted because you can keep a chat window on one side, constantly getting pinged.


I used to do the same thing but now I put my IDE on the middle workspace with the left and right workspaces holding my IMs/music/browser and terminal. Now instead of moving my head to see other windows I press ctrl+alt left/right and the windows move in to the space I am already looking. Effective use of workspaces feels just as productive as multiple monitors did but my neck is always facing forward.


That's a problem with two monitors, specifically if you try to place them evenly. Neither monitor is centered.

Designate one of them as the 'main' monitor, and put anything you need to work with for long periods of time on that. Add an office chair, so you can swivel to look straight at the other(s) instead of twisting your neck. And you'll find there's no such issue.


I just wish I could find a good series of monitors with matching vertical size and dot pitch but different aspect ratios. My best use case would be a 27" 16:9 primary monitor directly in front of me with a 4:3 secondary on the right or left. It would be fantastic to put a 1920x1440 next to my 2560x1440 center display with them being the same physical size.


I only really use 2 monitors in Unreal Engine. The UI is just too cramped for a single monitor; you really need both.

Outside of Unreal, I usually find workspace switching to be a better workflow than alt-tabbing my way to a window on a second screen.


The HDMI port doesn't work on many M1 Mac Minis. Is this counting the two USB-C ports, or HDMI plus one?


That's not too bad if you're using an ultrawide.


You can use a DisplayPort hub to run multiple monitors.


The catch is that you are using untested hardware. You have no idea if this will last or who will support what, and you potentially risk money and your data. The thing is, the upside for us as consumers is insane. Tbh, buying a MacBook is 20-30% more expensive than equivalent specs on Windows, but in the case of the Air that's 200-300 dollars. Just to try and test this new hardware as your main device, that's a lot of risk.

I feel confident running the system for a while now, and I'm seeing more software come on board. Even for the extra money, I say go. I'm a dev/product designer; most of my work is coding, product development, product management and graphic design. It does everything I need faster, no lag.


Same.

The problem is now my work desktop is a 32-core Threadripper w/ huge GPU and way too much ram.

It’s kinda nice knowing that if my computer ever slows down it’s because of bad programming and not having a too slow computer.

I want to like laptops. I really really do. But they’re slower, have bad keyboards, and have tiny screens. Or I’m docked and the fact that it’s a laptop isn’t meaningful.

That said, I can’t fricken wait to see Apple build a beast of a desktop with their silicon. I just really really hope the GPU doesn’t suck. Lack of good GPU on Mac is a huge problem for certain lines of work.


> Or I’m docked and the fact that it’s a laptop isn’t meaningful.

You might be underestimating the value in that. I’ve been working off of MacBooks since 2011, and for almost all of that time I’ve had them docked and hooked up to multiple monitors, and working just like a desktop.

But if I ever need to go anywhere, meet a client, travel, go to another room/office, I just unplug and go. If power goes, I don’t lose anything, the laptop is still running. Whatever happens, it’s all there with me, always. It’s not as comfortable when it’s on laptop mode but it’s better to have a less comfortable experience than not having it available at all.

It’s a portable desktop. And I love it. But yeah if you’re looking for workstation-level specs, then yeah, a laptop is never going to be enough.


My primary home computer for several years was a docked laptop. Overall I was disappointed and went back to a full desktop.

> If power goes, I don’t lose anything, the laptop is still running.

What kind of monster doesn’t use a UPS!? =P

I mostly work in games and VR. Maybe someday there will be a laptop that doesn’t suck for game development. Sadly that day has not yet come.


I have a desktop dev machine that I ssh into via ZeroTier and it has been a fantastic dev experience. As a result my entire world needs to be in the terminal, which for me was pretty easy (tmux+neovim). `tmux a` and I'm right back where I left off, and it doesn't matter what front-end computer I use. I can now use my iPad as my front end, which makes for a great in-bed machine.

The only catch is I need to make sure my dev machine is online.

Now, with Neovim 0.5 supporting LSP and Tree-sitter, Neovim is on par with Visual Studio Code.


>It’s kinda nice knowing that if my computer ever slows down it’s because of bad programming and not having a too slow computer.

And then someone with 4 cores tries to run the code and it's unusable, because it only runs well with 32 cores.


Yes. Software developers need to make sure their software runs well on their user’s machines. This is equally true for people developing software on brand new laptops and it running poorly on older laptops.

A great pet peeve of mine is that designers are notorious for only testing their designs on high-resolution MacBooks. A lot of tools look like crap on 1080p Windows displays.

The benefits of 32-cores mostly comes with compile times and build processes. Where, depending on your project, it can make a HUGE difference. One of my C++ projects went from 15 minutes to under 3.


I've been on a ThinkPad X1 Extreme for almost 2 years (gen 2 with the i9) and never had any noise or throttling issues. I sit next to my partner (also a dev) and their i9 MacBook Pro regularly spins up, then throttles the CPU a ton. We've seen it regularly throttle down by 70-82% (measured with pmset -g thermlog).

Just saying, I definitely get the feeling Apple values form over function. His biggest issues happen when he's using Docker (the annoying Docker for Desktop version), but I also use Docker all day and don't have any issues...


Well, it depends of course on your Windows laptop. I had a ThinkPad X1 Carbon with no heat or noise issues at all. Now a Surface Pro 7, also with no heat issues. It's not the fastest around, but for a tablet it's really impressive (typical dev setup running Postgres in Docker, GoLand, WebStorm with no problems).

Still, I'll likely buy an M1 Air soon. Not because of the M1 CPU, but because it's almost half the price of a comparable ThinkPad. And I hate the bloatware shipped with basically every non-Microsoft Windows laptop.


"the fans go crazy, the laptop is hot to the touch and everything slows down."

So you installed Gentoo, what's your problem? 8)


A bit of a tangent but I am quite impressed how Apple's marketing campaign was so effective that they've managed to redefine what PC (personal computer) means


By being actual PC compatible hardware for about a decade.


What is your definition of PC? There are about 20 definitions in regular usage.


PC originally meant "IBM PC desktop clone". Apple was different because its hardware wasn't compatible with PC software (for mainstream users).


"PC" was "personal computer" years before the IBM PC. Just look at old computer magazines.

The whole reason the phrase "IBM PC" existed was to differentiate it from the other PCs that already existed. "IBM" was the adjective. "PC" was the noun.

Because of its success in offices, "IBM PC" became just "PC" the same way other words like "omnibus" became just "bus" because it's simpler to say.


This is the correct answer - and "PC", or "personal computer", was originally more of a distinction from old-school mainframe computers that lived in a lab or wherever and took up entire rooms.


As I recall, the most widely used term early on was "microcomputer", to distinguish the small home computers from the larger "minicomputers" (e.g. the DEC PDP-11 of blessed memory). The term "personal computer" was also in use, but the use of "PC" was not common until after the introduction of the IBM model 5150 (whose actual product name was the "IBM Personal Computer").

Since then, you might refer to any of a variety of machines as "personal computers", but "PCs" only meant "IBM PCs" (or later "IBM PC-compatible machines"). In other words, I would argue that the term "PC" derives specifically from the "IBM Personal Computer", and not generically from "personal computer".

Source: I was there. :-) I haven't done the research, but I bet if you did search through Byte and similar magazines of the time, you'd find plenty of supporting evidence. (I do have a memo cube from the early '80s with the slogan "Apple II -- The Personal Computer", but I suspect that was Apple Marketing trying to fight an ultimately losing battle.)



Sure; I don't disagree (and perhaps my previous post was less clear about this) that "personal computer" was definitely in common use, particularly in advertising aimed at "regular consumers" rather than hobbyists, and well before the IBM PC.

I do disagree with the suggestion that the initialism "PC" was understood to mean "personal computer" in general before the release of the IBM PC. If that's overly pedantic, well, I'm a computer nerd; what can I say...


Oh, OK, I think you’re right there.


I was there too, if you need anecdotal evidence: my dad worked in marketing for Apple, notably as the ad manager for the 1984 ad campaign, and you're wrong. I mean, that ad was an attack on IBM PCs. The answer is on Wikipedia too: https://en.wikipedia.org/wiki/Personal_computer


Wasn’t ‘home computer’ the generally used term before the IBM 5150 ?


I don't think so. Comparing use of these terms in books:

https://books.archivelab.org/dateviz/?q=home+computer

https://books.archivelab.org/dateviz/?q=personal+computer

Personal computer was used a lot more when books were eventually written about the era.


As far as I'm aware, the UK the phrase "microcomputer" (or simply "micro") was used before IBM-compatible machines.


Wasn’t ‘home computer’ the generally used term before the IBM 5150 ?

"Home computer" was things like the VIC-20 and the TI-99 4/A. "Personal Computer" was machines that you had in your home or office that you didn't have to share, or timeshare with someone else. Think Cromemco, Kaypro, PET, and SuperBrain.


The only reason I mentioned it is because OP's usage caught me off guard. Most of the time in colloquial usage I've seen PC used as an acronym to mean "computer with Windows installed as the OS", especially when being compared to Apple's products.

But in this case it was used as "desktop computer", which sounded strange to me as I consider laptops, however portable, to be personal computers.


I agree with all of that -- and regret the recent trend of using "PC" to mean desktop as opposed to laptop -- but what does any of that have to do with Apple's marketing?

You don't think Apple's marketing is behind the aforementioned recent trend; do you?


My understanding was that Apple's marketing created the initial deviation from the original meaning, which led to the current one. But yes, seeing what other people are commenting now, it's not that simple and I was wrong in my assumption.


> PC originally meant "IBM PC desktop clone".

You're misremembering history. It eventually evolved into that, yes.

Apple didn't really buy into that until the "I'm a Mac/I'm a PC" ads.


And even then they still used the PowerPC chips and branding.


Yep! Never thought about it that way, but indeed, PowerPC was a smaller, non-server chip intended for personal computers.


The hardware is beautiful! As soon as it becomes trivial to delete macOS and install Linux (with reasonable support), I'm buying an M1 Macbook Pro. Unfortunately, I'm in too deep with the Linux ecosystem to switch over before then. Fortunately, Asahi Linux appears to be making fast progress, and there are even (spurious) rumors of official support on behalf of Apple Inc.


My experience is just the opposite: when I run even two projects in IntelliJ on an 8GB M1 Mac, the whole system just starts lagging to the point that I can't even scroll the code.

This has not been a good experience for me.


That's probably because 8GB is not enough to run IntelliJ comfortably, and the system starts swapping things in and out of memory constantly. Another thing that happens when IntelliJ uses up its heap is garbage collection, including frequent stop-the-world collections. The solution is to configure it to have more heap. Either way, that's a memory problem and not a CPU problem.

8GB for a development laptop is not a great idea in 2021. I've been using 16GB since 2013. I'll get more with my next laptop, which may or may not end up being the 16" M2?
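If you want to try the heap route: recent IntelliJ builds have a Help | Change Memory Settings dialog, or you can add a custom vmoptions file via Help | Edit Custom VM Options. The flags below are only an example of the kind of thing to set, not a recommendation for any particular machine:

    -Xms1024m
    -Xmx4096m
    -XX:ReservedCodeCacheSize=512m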


There was a bug with Android Studio (and I think IntelliJ as well) on M1, not sure if there still is.

You can fix it by setting 'Prefer Tabs' to never - https://www.reddit.com/r/androiddev/comments/jtbl4m/has_anyo...


The uncomfortable heat/fan issue is what ultimately pushed me over the edge to become familiar with Kubernetes. Now I just offload the compute heavy tasks to a desktop in a different room.


I think this is an Intel "feature". They can't upsell you faster CPUs without trading off heat and noise. Glad to see Apple is following a much more sensible approach.


The only thing I've been able to do to get an M1 to spin up the fan noticeably and get warm is to max out both the CPU and the GPU by playing Minecraft Java in a huge world or running other 3D-heavy things.

The same things on an Intel laptop will turn it into what my friend comically calls a "weenie roaster" (for what it does when it's on your lap).

Things like JetBrains CLion don't even make the M1 warm. That will make any Intel laptop start cookin'.


Sitting in front of a laptop, the fan coming on at max speed makes me feel like I am stressing the machine and need to be gentle with it. When using the same machine through VNC I feel completely insulated from that effect. It’s a more pleasant feeling. I’ve considered getting a desktop and just using my Macs as remote terminals.


I'm eager to see what can be done with the desktop tower workstation post-M1. I don't think anyone actually wants to have a slim laptop and use "the cloud" to "download more memory" but here we are in 2021 where people are saying 16gb max ram is good.


>but here we are in 2021 where people are saying 16gb max ram is good.

No amount of memory will be good. The problem is not memory but inefficient programming. Desktop apps consume as much memory as possible. The only way to have a good experience is to have more memory than the average user. But then the average user catches up and you need even more memory.

Right now I have ms teams running in the background purely to receive notifications and it is using over 1GB of memory and 8% of my cpu to show me a notification window when I get a message. MS teams on my 3gb ram iphone does exactly the same job and uses no memory and no power because it doesn't even have to run to show me notifications.

I have 32GB on my laptop and I am only just above the average now but in a few years electron will have ballooned so far that 64 is the new minimum. While my 3gb iphone and 8gb ipad feel more than enough.


Tell that to my friends whose main software is Chrome, Office and VLC.


There is actually one reason to do software development on a laptop: not having a good enough place to work. A laptop is essentially a set of compromises: between computing power and battery life, between screen size and weight, between key travel and thinness, between using a mouse and forgetting to take one on the go.


The M1 Mac is also my first Apple product. And now I bought a Magic Mouse 2 and, fuck Apple, it is laggy.


The default mouse settings are always too slow for my liking; I always speed up the tracking speed. I haven't used a Magic Mouse on a regular basis for a while (had both the v1 and v2), but once I updated the setting, it was as snappy as I wanted.


I already set the tracking speed to the highest, but this is really one of the worst mice I have ever used. The only reason I am still using it is that my other Bluetooth mouse is seriously laggy when I use it with the M1 Mac.


This is funny. For the last ten years I've used Mac laptops as my primary machine. Of course the fan noise was a great auditory clue to tell me to check out Activity Monitor/top for rogue processes and help me optimize which tasks took too many resources.

I didn't realize how much I relied on that cue until I temporarily switched to a Mac desktop machine during the pandemic when I didn't need to travel. The desktop fans pretty much stay at 1200rpm all the time. I think I've seen them spike once. I'd forgotten how much better desktops can be for some things.

Now I'm looking for a better remote access solution so I can consider sticking with a desktop at home and then maybe just an iPad for travel with good remote access and file sync.


I remember using hard disk drive sounds for disk usage and/or capacitor noises to debug issues. But I really like the MacBook Air M1 because it's so quiet: it helps me concentrate much better, and if you are talking to others or recording a talk it is silent. The other thing is that it has no moving parts, so it should be very durable.


I used to use capacitor noises too when I was younger. I could tell other devs when builds were finished by hearing the high-pitched noise stop. At the time I thought I could hear the CPU, but recently learned it was the capacitors lol


Inductor coils? I haven't heard of capacitors creating noise before.


I use one too and there are still some moving parts. The keyboard keys, the speakers, the display hinge and the Taptic Engine. Those are the parts I'm looking at intently as the ones that will fail on me, especially the hinge.


The most likely thing to break on a computer is the fan itself. Apple has revised the display hinge and the keys have been redesigned. I don't know of any reports of the speakers or the Taptic Engine breaking (other than for people using Boot Camp on Intel MacBook Pros, but that isn't an option anymore). Since there is no long-term data on all of those components, the only things I would question are the redesigned hinge and the keyboard.


Anecdata: I've never had a desktop or laptop fan fail on me, and I'm the kind of person who keeps them for a decade.

Spinning HDDs (Toshiba bearings), keyboards and hinges, on the other hand, have all failed on me multiple times in Macs, but the fans kept going. The keyboards were the last thing to fail in the 2009 design; they last about 10 years.

Also, while we're at it, mechanical failure isn't always the biggest concern these days. Apple has had Nvidia GPU issues (solder and fab issues) in the past, where they ended up underclocking them in a firmware patch in order to push failures out of warranty.


I bought a cheap M1 Air model (only the ram is upgraded to 16gb) at $1030 Apple-refurbished. My intent is to try and sell it a bit before the 1 year mark.

For some of the reasons you mentioned (especially the last paragraph), but also the battery and the general non-repairability. It almost feels like a consumable item compared to my previous ThinkPad.

Here's the thing: all the other laptops I was looking at cost at least twice the price. High-end Thinkpads, my second choice, almost 3 times.

All of a sudden Apple became the budget choice for my use case. Hopefully in a year, when I sell this one, there will be competitors to the M1 chip (or maybe the M1X?), and I'll have more options.


> Here's the thing: all the other laptops I was looking at cost at least twice the price. High-end Thinkpads, my second choice, almost 3 times.

It shouldn't be this way, but most of the big enterprise retailers have ridiculous sticker prices. Unlike Apple, volume buyers get steep discounts, and individuals are expected to wait for a discount.

Ask around online and you'll find it's fairly simple to get fancy Thinkpads for half-off.


It's highly dependent on the country you're in. I see people on r/thinkpad get new high-end ones discounted, the kind of discounts that you can't get in France.

What you can find in France is heavily discounted ones from people that resell their company-provided ones. I bought 3 T460 like that in the past, for me and my family. But no high-end models, nor much choice in the customization.


It might be worth buying the discounted thinkpad in US (I assume that's where most heavily discounted ones show up) and then ship it to France?


I looked into that too ahah. There are "forwarding" services that do that for you, if you don't have a friend there.

But I don't like thinkpads to the point of doing that rodeo. For now, Apple won my money.


The M1 is a complete SoC, and Nvidia's discrete graphics were so bad that Apple switched to AMD permanently.


That was just one example, one that affected me and millions of others. There are also plenty of other chip-level failures in newer Apple hardware, as quite well documented by Louis Rossmann. They're not all GPU/CPU issues either; there are many other chips that go wrong due to cheap component choices.

The GPU issues were both Nvidia's _and_ Apple's fault, due to assembly with low-quality unleaded solder. In either case their attitude to customers was unforgivable: they replaced old broken boards with new broken boards until people just gave up or were pushed out of warranty... and the masses that couldn't be bothered to go through the pain of browbeating an Apple Store employee into submission got firmware patches to push it out of warranty.

Component failure, solid state or otherwise, is not unique to Apple and is an inevitability; it will happen again. Complete disrespect for their users by continually lying to their faces, on the other hand... that is something Apple is uniquely skilled at.


Fan life really depends on use case. The bearings roughly have a certain number of cycles in them, and most consumers just don't push them that hard.

For example, a tiny 10,000 RPM server fan running 24/7 rotates about 5 billion times per year. My desktop spends most of its time off, but if I game on it 10% of the time its big fans might be at 1000 RPM. That is only about 50 million cycles per year.
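Spelling out the arithmetic behind those two figures:

  10,000 RPM x 60 x 24 x 365       = ~5.3 billion rotations/year
   1,000 RPM x 60 x 24 x 365 x 0.1 = ~53 million rotations/year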


I'm also trying to figure out a solution for remote access/file sync. I've been using resilio sync to sync my files between my laptop and desktop. It works great, but it destroys battery life on my laptop, so I've been looking for solutions.

Syncthing is next on my to-try list, but it's very similar to Resilio Sync, so I don't have high hopes. iCloud Drive/Dropbox/etc. have issues with syncing git repos, so I'm not sure about those either.


MEGA has been working pretty nicely for me also with Git. I use it on a macOS desktop and a Linux laptop. Only thing is the mobile app isn’t the most polished compared to Apple’s Files for example, but I don’t need it often.


Syncthing is great, however the biggest issue I've found with the Mac version is that you explicitly need to end the task in the menu bar at the top or it will just drain your battery. Also, if you sync a Mac with anything besides, be prepared to see a .nomedia file in EVERY.SINGLE.DIRECTORY (I can't recall what exactly it is). It's very annoying to say the least.

That being said, Syncthing works great for a wholly cross platform sync tool. I have used it with a PC, Macbook, Android setup. Never have I not been able to get it to work.


To my knowledge, .nomedia files are created on Android to prevent the folder from being scanned for media. I don't think it's due to "anything besides", but Android in particular. I haven't tried configuring the Android Syncthing app to exclude .nomedia files, but it may be possible.
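If it is possible, it would presumably be via Syncthing's ignore patterns. A sketch of what the folder's .stignore might contain (syntax from memory, so double-check against the Syncthing docs):

  // keep Android's media-scanner markers and macOS Finder metadata out of the sync
  .nomedia
  (?d).DS_Store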


Then it's not .nomedia but some Mac specific file. I can't recall what it is since I stopped using syncthing on my Mac haha. I rarely use it unless I kinda have to.


IIRC, macOS is well known for leaving .DS_Store droppings absolutely everywhere. If you ever shared a pen drive with a macOS user, it would come back full of .DS_Store junk all over it. That might be the one you're recalling.


  defaults write com.apple.desktopservices DSDontWriteNetworkStores true
  defaults write com.apple.desktopservices DSDontWriteUSBStores true

Run these both in Terminal, and either restart Finder or just reboot. Should solve that problem.

The whole point of those files was to store Mac-related metadata (such as icon positions and other stuff) that the filesystem in question did not have the capability to store, to preserve Mac users' expectations.
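To clean up droppings that are already on a drive, something like this should do it (substitute your own volume's path for the example one):

  # delete existing .DS_Store files from a mounted drive
  find /Volumes/MyUSB -name ".DS_Store" -type f -delete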


I have been using Syncthing across Mac, ARM Linux, Linux, Windows and Android. Occasionally an Android release turns into a battery drain, but I haven't had an issue with the Mac (my Mac shares are about 70G). Could it need a db wipe and reinitializing? I think their file watcher is native.


> Now I'm looking for a better remote access solution so I can consider sticking with a desktop at home and then maybe just an iPad for travel with good remote access and file sync.

This is my solution. I don't really need to do any file syncing as rsync or scp works fine for my purposes. It has been a fantastic dev experience. I remote in, bring up my last tmux session and pick up exactly where I left off, regardless of the front-end machine. I use an ipad pro for a ton of development now and it really is fantastic for it.

I've got a overkill desktop dev machine (previous gaming rig) that can make all the noise it wants in my basement and my front-end shell remains quiet.

I set up ZeroTier so I have constant access to it whether I'm on my local network or remote.


> I've got a overkill desktop dev machine (previous gaming rig) that can make all the noise it wants in my basement and my front-end shell remains quiet.

I wanted the best of both worlds and a nice uncluttered desk, so I ended up buying an Intel NUC 9 module plus an RTX 3060 that I keep hidden in a closet and game remotely using Moonlight and GameStream. Everything works just fine on a gigabit connection and plays well with the Xbox controller on the M1, as if it were local. I can play everything at 1440p@60fps, which is more than enough for me.


Can you tell me more about your setup? What apps do you use on your iPad for remote access? Is it just remote shell or do you do Remote Desktop too? What do you use for Remote Desktop?


Not the original poster, but I use my iPad Air for a similar setup. The development environment runs under Docker Compose/WSL2 on my desktop, which I can start/stop using Chrome Remote Desktop. I've got code-server and Portainer in that compose file, accessible from my iPad using Tailscale. It makes for a passable development environment if I want/need to work away from the desktop for whatever reason.


Can you not use rsync on an iPad?


is it better to have rogue processes silently wasting energy and creating heat?


I think this is a case where the meaning of the parent can be taken multiple ways, and one of the HN rules is to take the more generous case. My interpretation is that they are saying they see the purpose of needing some type of cue when CPU usage is high, but didn’t realize that they were using the fan speed as a proxy for CPU utilization until this was posted, at which time, they put 2 and 2 together. That’s not saying that the product is bad or not, just an anecdote.


That's pretty hilarious. Until recently I didn't think of the CPU fan as an auditory cue for how hard your system is working. But now I can remember times that excessive fan noise has prompted me to investigate the cause of excessive usage.


I had a habit of touching the strip of metal above the keys on my old macbook air, since that heats up before the fan becomes audible.

The sensible approach, of course, is to add a load monitor to the menu bar; now with my new m1 macbook, I can simply make the appropriate noises myself, as necessary. This is the UNIX way.


"MenuBar Stats" has a versatile collection of widgets to add to the menu bar. I'd prefer something open source though, if anyone has recommendations, please share.

Once you're running a tool like that, it is interesting to see the efficiency cores often saturated and the performance cores usually sleeping.


I've been using this one: https://github.com/yujitach/MenuMeters


In addition to that I also use

https://github.com/exelban/stats


You can do this with xbar [0] (which is also just a super cool app). Looks like someone even made one specifically for checking the CPU throttling speed [1]

[0] - https://xbarapp.com/docs/plugins/System.html

[1] - https://xbarapp.com/docs/plugins/System/cpu-thermal-throttle...


[1] seems to use 'pmset -g therm' behind the scenes and it seems to not work on my M1 Air.
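On Apple Silicon, powermetrics might be a better source for thermal state, though I haven't checked whether that plugin could be pointed at it; something along the lines of:

  # thermal pressure sampler (needs root); Ctrl-C to stop
  sudo powermetrics --samplers thermal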


What I really want is a menu bar widget that detects apps using 100% CPU for more than a minute and offers to `kill -9` them. Has anyone made that?
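Not that I know of, but the detection half is only a few lines of shell. A hypothetical sketch (the threshold, sample interval, and whether to actually kill are all placeholder choices):

  #!/bin/sh
  # Flag any PID that stays above 90% CPU on two samples taken a minute apart.
  while true; do
    first=$(ps -Ao pid,pcpu | awk '$2 > 90 {print $1}')
    sleep 60
    second=$(ps -Ao pid,pcpu | awk '$2 > 90 {print $1}')
    for pid in $first; do
      if echo "$second" | grep -qx "$pid"; then
        echo "PID $pid ($(ps -p "$pid" -o comm=)) pegged for ~1 minute"
        # kill -9 "$pid"   # uncomment once you trust it
      fi
    done
  done

Wrapping that in an actual menu bar widget is left as an exercise; xbar, mentioned elsewhere in this thread, would be one way.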


Hearing the disk going used to be an auditory cue too, which is now gone thanks to solid state drives.


Yes, I miss that calming crackle telling me that my PC didn't lock up and it's just chugging along. And the floppy boot sound. Objectively worse but still nostalgic.


Like the demonic scream of a dot matrix printer


Occasionally with catastrophic results https://www.extremetech.com/computing/239268-spotify-may-kil...

That said, I don’t miss the noise or slowness.


Oh, yes. This has been an ongoing problem with "sleep mode", which users think means "off", but isn't really. Some Windows docs for hardware makers indicated that "sleep mode" should stop the fans, to maintain the illusion that the machine is "off". Some machines do that, and some stick to temperature-based fan control, so the fans continue to turn if the machine is warm. Some users then complain that sleep mode doesn't work because the fans are still turning.


Wait, why would the device get warm during suspend? Or do you mean a different kind of sleep mode? I'm thinking of the state where it just refreshes DRAM to keep its contents (this does not produce much heat at all) and everything else is off: that definitely does not require the cpu fan to run because the cpu is not processing any instructions during that state.



I assume it doesn't get warm, it's still warm from running beforehand.


Hmm, if that were necessary, then all power states should keep the fan on for a while after running; a device that was shut down would also keep the fans on, and there would be no need to put in a requirements document that the fans must turn off to give the illusion of having shut down. To my understanding, though, if no new heat is being created (usually mainly by processors like the CPU or GPU), keeping the fans on is not necessary in regular laptops or desktops.


And this is how I ended up buying a second Nintendo Wii. They turned off the fans and fried themselves.


I bought a Lenovo P1 Gen 3 a couple of months ago and the fans spin while the machine is sleeping.

I'd love to know if it's crappy thermal design from Lenovo, crappy CPUs from Intel, or Windows sleep mode being too demanding. Maybe it's all three?


It might also be naive firmware that expects the same thermal/cooling constraints when everything else is otherwise going to cool passively. Or even a design choice to actively cool so a quick wake will be in a better thermal state.


Depends on which sleep stage it is: S1, S2, S3, etc. And laptops differ by manufacturer, type and model.


Lenovo supports S0, S4, or S5 in their newest machines. I changed my settings to use hibernate (S4) rather than S0 (modern sleep).


That would have been useful for me today. Systemd has again gone into one of those infinite loops, consuming 100% CPU in one of its daemons that try to replace what already worked before (systemd-resolved), while I was sitting in a train without AC.

I only realized something was wrong when the pinebook got way too hot.


It makes me a bit sad that for ten years I took it for granted that Moore's law is dead and notebooks are just constantly-overheating, incapable computers, only to learn that the whole issue was Intel messing up their process nodes.


It's both. It's harder and harder to grind out progress (and if you compare the M1 to where the '90s-to-early-2000s exponential progression would have put us, we're not so close).

As it gets harder, it becomes more capital intensive and the probability of success on any given process node decreases. The marginal utility of a doubling of computing is likely decreasing with time, too.

At some point, these factors are going to compound to slow progress. Or: exponential progressions never continue forever.


I don’t think the marginal utility of doubling computing will diminish any time soon. As we gain more computing power, we just start doing more and more computation heavy tasks that weren’t possible before.

For example, VR games are notoriously demanding on today’s hardware, especially if you want to put that hardware in the headset.

And many applications get slower over time, not faster, as new features are added. This is especially true in an era of cross platform frameworks that are convenient for developers, but are much slower than native.


Also, old programs had to hyperoptimize, because it was either that or being too slow.

Today the speed of the computer makes it incredibly forgiving.


… Electron


Marginal utility of anything decreases as you have more. And it obviously has in computing.

E.g. 20 years ago, you could buy a P3-1GHz for $200 or P3-750MHz for $100. So you'd pay about $100 more (2001) for 850 more Dhrystone MIPS.

Vs. modern processors where for a few hundred bucks you get >500,000 DMIPS in a package. You would not choose to pay $150 (2021) more for 850 more DMIPS (< 0.2% more performance).

I think this is true even if you consider things on a log scale; a doubling is worth less now than it was 20 years ago. Twenty years ago nearly anyone could tell the difference and would pay a bunch more for twice the performance, and now a big subset of the computing market wouldn't.


Doubling performance might not be directly noticeable for the average smartphone user as an example, but it is definitely noticeable to the average app developer, who can spend more time on user experience and less time on optimization.

Better performance also enables new forms of technology, such as AR, VR, ray tracing, advanced image processing for smartphone cameras, etc. These are visible and sought after by regular users.

Users can tell the difference, it’s just that they might not realize that the new features are enabled by higher performance of the hardware.


Look, I think you're missing the point:

- First, the marginal value of 1MIP has plummeted in the past 40 years to near nothing. When my dad programmed on the 1401, another .01DMIPS would be worth ~$100,000 (1960s dollars, or $86 million 2021 dollars/DMIP). Now another 100000DMIPS is worth almost nothing (Maybe $0.006/DMIP today). So there's really no debate about declining marginal utility of computation vs. time.

- Second, developer productivity is great and all, but only a tiny proportion of computing units go to developers. Therefore, we can't assume that increased developer productivity will necessarily pay for new process nodes (with increasing capital costs and risks and decreasing benefits for end-users).

- Third, YES, finding new apps is the only thing that can keep things moving. But my point is: the proportion of the marketplace that is seeking above-baseline features and are willing to pay a premium for them is an ever shrinking portion. Replacement cycles for computers are, overall, lengthening, and the proportion of the market that buys the high end is steadily decreasing. This makes paying for ever-more-expensive process nodes more difficult.

- Fourth, when we were really enjoying Moore's law and areal density was improving significantly, better process nodes were clearly worth it: they lowered the cost to manufacture existing parts by yielding more parts per wafer. That is, the fabs wanted to shrink ASAP even without new use cases because of the improving economic picture. This is no longer the case.

- Fifth, a doubling eventually costs more than the entire world economic output, as doubling gets harder and harder. No matter how many killer apps you find in #3, you can't keep this going forever.


I am definitely missing your point. I was originally responding to your comment that the marginal utility of doubling performance is diminishing. I don’t believe it is. I never said the marginal utility of a DMIP or whatever isn’t decreasing, which I don’t think is relevant, since processor performance increases exponentially, not linearly.

I also never said that it will be possible to keep on improving performance at the same rate forever. The rate of improvement is what is diminishing, not the value of the improvement. If we could keep on improving performance at the same rate forever, that’d be great, but we probably can’t.

Fabs are still motivated by demand from their customers and competition. If a fab stops improving, they will either be surpassed (see Intel) or consumers will see less of a reason to upgrade their hardware, and the entire market will decline. Both are bad outcomes for the fab. So fabs will continue to push the envelope as long as it is physically possible and economically feasible.


> I was originally responding to your comment that the marginal utility of doubling performance is diminishing. I

The ambiguity in your responses is why I've been addressing both the marginal value of a DMIP and the marginal value of log(DMIPS).

> since processor performance increases exponentially, not linearly.

It doesn't look very exponential right now. Looking at consumer-focused processors, we've gone from 176k DMIPS in a package in 2011, to 298k DMIPS in 2014, 320k DMIPS in 2016, 412k in 2018, and now we've finally got a substantive boost again to about 850k in 2021, though these are golden parts that aren't really available.

> The rate of improvement is what is diminishing, not the value of the improvement.

It's both --- the improvement is getting much more expensive, and the value is going down.

> Fabs are still motivated by demand from their customers and competition.

The vast majority of fabs have stopped being anywhere near the leading edge, because costs are too high for the benefits. At >100nm, there were about 30 manufacturers close to the leading edge; then at 45nm we had about 15 manufacturers who were close to the leading edge. Now it's Samsung, TSMC, and Intel -- 3, that's it. And even Intel is starting to question whether it's worth it versus pooling effort together with others to amortize the massive capital costs. Most parts and applications are staying further and further from the leading edge, because the areal density benefits to cost are smaller and the tape-out costs are ever higher, and because the vast majority of applications and semiconductor units do not need the performance.

Whereas 30 years ago, almost anything that could shrink, did, because of areal density benefits and because almost all applications desired improved performance.


The MacBook line's thermal design has been considered bad compared to other laptops for a long time. That accidentally helped a bit in making people think the M1 is great and Intel is bad (though that's mostly true). I don't believe it was intentional.


Apple put M1 in the exact same bad enclosures they were using before, so the Apples-to-Apples comparison is still valid :)


Yes, that's the point. Apple enclosure + Intel CPU was the worst combination.


That's not entirely true: Intel's year-over-year gains weren't particularly great, but that's because they tended to use more refined manufacturing processes on older, cheaper nodes. Lo and behold, when Apple decides to go all-in on the 5nm process, the entire world's semiconductor distribution network screeches to a halt, and we get the worst silicon drought in decades.

It's a bit of a moot point. I prefer AMD processors where I can get them, but Intel's position in the market is far from as dire as people make it out to be. Until Apple can find a way to challenge their single-core performance, Intel will still be in a class of their own.


> Until Apple can find a way to challenge their single-core performance, Intel will still be leading a class of their own.

M1's single-core performance beat out all but the highest-end AMD and Intel desktop chips at the time of its release.


Funny. I spent a summer in 2009 in Vietnam, working on one of those old unbeatable Powerbook silver models, in an apartment without air conditioning. The fan would run at full power constantly, and finally gave up the ghost. There was no way to repair it where I was. So what I used to do was keep a bunch of Wired Magazines in the freezer, and put them under the laptop, changing them out every 20 minutes or so. The machine still works... but yeah, that fan noise is really a great subconscious reminder that you're using it too hard. Also a good way to know if your code just entered an infinite loop, long before any other warning comes up... and occasionally that clue gives you enough time to cancel a process before the input freezes.

[edit] Just had a cool idea, since this discussion is heading towards how subtly useful auditory cues have been in our workflow. What if different processes played fans at different pitches, so the total volume still added up? That could actually be really useful out-of-band state information.


With the introduction of the M1, it's hard to find any use for paper magazines these days.


If I wait long enough, maybe I can sell my collection of 90s Wired for more than that stupid Mario 64 cart ;D


About the "sonification" of computer processes there is this experiment, recently shared on HN:

What Does the Event Loop Sound Like? https://medium.com/att-israel/what-does-the-event-loop-sound...


SE Asian humidity was brutal on those older Intel-based MBPs. Even with A/C, it's still so damned hot and humid that you don't have to push the rig too hard before the fans start spinning. And heat is the enemy of Li-ion batteries.


The humidity in the region was also a problem for earlier iPhones, the indicator stickers used to check if the phone had been dropped in water could change colour just from the ambient humidity https://www.cultofmac.com/13571/the-tropics-may-be-too-humid...


I did this in the Sonoran desert with my 15" MBP when the temperature was pushing 50C. Had a stack of those giant ice packs from some meal delivery service. Worked with the laptop sitting on the ice pack and a towel until it was completely thawed after 4-5 hours.


Wondering if it's just me: I bought an M1 MacBook Air with 8GB of RAM, based on reports saying that RAM management is much more effective than with Intel CPUs. Ever since, I'm really struggling with load management. I typically have Chrome open with 20-30 tabs, plus a couple of Electron apps (Notion, Slack, Google Calendar), and my MacBook frequently slows down to a crawl or gets stuck entirely. Simple commands - "close all tabs" in Chrome - can take 50 seconds to execute. Is anyone else experiencing the same?


Anecdotally, the M1 feels very performant with 8gb (I have MBP and Mini 8gb variants) compared to my 16gb Intel MBP and my 32gb Hackintosh.

That said - it’s not magic. 30 tabs in Chrome is a lot if you’re not using a tab suspender extension. But one thing I do notice is that when a Rosetta 2 app is running, the entire machine seems to take a hit. I would check the architecture of your apps and see if you have any x86 apps running.
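A quick way to spot-check (Activity Monitor's "Kind" column shows Apple vs. Intel too; the Slack path below is just an example of where an app's binary lives):

  # prints the architectures in the binary; x86_64-only apps run under Rosetta 2
  file /Applications/Slack.app/Contents/MacOS/Slack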


Lol, 30 tabs is a lot? I'm running 32 GB of RAM on a Windows i7-6700 3.4 GHz mini ATX build and currently have 883 tabs open, using marvelous suspender and a few other tweaks, but still have 95 windows open (typically at least 1 tab alive in each window, excluding my suspension blacklist...). I also have Photoshop, Notion, and a few other apps going and am running smoothly at 40-70% CPU utilization and 85% memory...


Wow. That would literally send my OCD over the edge. I have a thing about closing all my tabs, reading all my mail, and having nothing on my desktop at the end of the day.


> using marvelous suspender

Did you even read my comment?


GP does say that the amount of tabs actually running still exceeds the number quoted by GGP


They’re not comparable numbers. 800 tabs with 799 suspended is a different beast than 30 active tabs.


I think the "if you're not using a tab background suspender" part was important there.

Not sure if Chrome is still this way, but it would allow background tabs to still run things like JS even after hours of not being focused.


> 883 tabs open

There cannot be a good reason to do this, I start to get anxious at 20-30


High tab counts are from people who don't care to close tabs after they're done, or who use them as a TODO list they never get to.

Browsers should default to closing unused tabs (after 30 days?) and I’m glad that’s possible now. If you haven’t seen a tab in a month chances are you never will.


I'll have you know that I kept a tab open for 3 months, and only then I ordered the item displayed in that tab.

Yes, tabs as a todo list work. With a tab suspender extension.


> Browsers should default to closing unused tabs

Hey, what if I finally get to whatever interesting thing that tab had after 30 years of ignoring it?!

But yeah, I just "close other tabs" every few weeks because it's 99% guaranteed I won't need them.


> marvelous suspender

I totally missed that fork of TGS. Glad I saw this comment, as I hated getting rid of TGS.


why


Seconded. I have 30+ tabs open (though in Firefox), alongside VS Code, Obsidian (a notes app, but with web views), Spotify (web views), and I don't frequently run into serious performance issues.

I do try to avoid Rosetta quite a bit though (right now I have no Intel processes in Activity Monitor).

Also check Messages.app hasn't run away with the CPU. Sometimes it seems to do that :(


> plus a couple of Electron apps

I’m afraid that might be your problem right there :(


I can confirm. Embedded Node instances each take a few hundred megabytes. It looked cool when we had 2 Electron apps out there; now, with so many of them, goodbye RAM. Not to mention apps developed in a few months rather than years, packed with features, become more memory hungry than the OS. MS Teams is a particular offender.


Evergreen webviews baked into the OS are the next frontier of desktop app development. They can share a renderer and have much lower memory usage per app as a result - see the changes in Windows 11. Couple that with a lighter weight desktop compatibility shim to break out of the sandbox conditionally (like Tauri does in Rust) and this architecture can be totally fine. The issue isn’t the concept of using web technologies, it’s just the current implementation that was needed to make everything work on our existing platforms.

Mac will be a holdout because they heart native but hopefully will concede at some point because more and more people are going to deploy apps this way from here on out. It’s not worth it to employ native app developers for each platform except for the largest of the large companies.


This will never be the same as Electron due to differences in the various rendering engines (e.g, Webkit2/WKWebView/WebView2 all have subtle differences), and this isn't accounting for version differences.

The reason people ship Electron is because Electron is literally the same thing wherever you shove it. You can search on this very forum for comments from the dev who migrated Slack from per-platform-WebViews to Electron.


That's what I meant by evergreen webviews. I think you missed the point of what I was saying. There are no evergreen webviews currently; they only update webviews on major OS releases. With Windows 11 they will ship an evergreen webview for use by Teams and will abandon Trident as well. Teams is currently a React/Electron app that will run in the OS webview on Windows instead of in Electron. This is precisely the reason MS is doing this. Slack was OS-specific before evergreen webviews were a thing.


>I think you missed the point of what I was saying.

I don't believe I have, and I believe you're missing the point - much like everyone who says that platform-specific WebViews are the solution to Electron.

Go actually read the comments from the devs who migrated away from WKWebView. What you are describing will not solve what they sought to get away from.


>It’s not worth it to employ native app developers for each platform except for the largest of the large companies.

Programmers have ported Emacs to the native GUIs for MacOS, Windows and Linux (and 80% of respondents to a recent Emacs survey prefer to use one of those GUIs rather than Emacs's TTY interface), so "largest of the large companies" is going a little too far.


> Programmers have ported Emacs to the native GUIs for MacOS, Windows and Linux

That doesn't mean it would be commercially worthwhile to employ them to do so; volunteers do lots of things that wouldn't be worthwhile to employ people to do. OTOH, wanting volunteers doesn't make it happen.


Teams is a horrible abomination on so many levels: fragmented apps, auth, performance, it's all a mess. Recently they took away the ability to do :emojis:

I believe it just exists to make people suffer


Want to use "apps" that run in a browser. Then just open them in a browser.

That's what I do with Teams, Discord and so on.

Much better than opening many isolated browser runtimes for what's essentially badly optimized webpages.


And if you want the separate window so you can pretend they’re apps, use the “create shortcut” option in the menu.


Electron has higher permissions so you can do things like select your audio inputs in discord which is nicer than the browser selection.


Alternatively, you can always use a VoIP client which doesn't need half your system's resources. Picking audio channels was a normal feature decades ago.


>Alternatively you can always use

No I can't. The IM platform is decided for me already so I can not pick something else. And the company will pick the platform with the most features / integrations which is currently Teams.

macOS drawing a line in the RAM sand at 16GB, with most machines on 8GB, will likely encourage programs to be more efficient rather than be a burden to the user. As we have seen with mobile platforms, programs will use as many resources as they are allowed to. Adding more resources does not fix the problem unless the programs are already quite efficient.


> The IM platform is decided for me already so I can not pick something else

Then I am not sorry for you.

> And the company will pick the platform with the most features / integrations which is currently Teams.

The company will pick whatever is best suited for the task at hand. My employer in Europe chose Mattermost on company premises. I can use it with the Matterhorn ncurses client, and it's a joy to watch it consume 0% CPU time and a negligible amount of memory.

Besides, this was originally about Discord, which hopefully no company uses to manage internal communications(?). When not working, you are totally free to choose whatever communication infrastructure you want and are not bound to whatever your employer thinks is best.


This is beside the point anyway. The average user does not need more than 8GB of memory to do their average tasks. Everything they do is possible in 8GB, which is proven by the fact that mobile OSes and iPadOS do it just fine. The only reason for the average user to have 32GB of RAM is that most desktop apps are insanely bloated. If Apple caps the platform at 16GB but most users are on 8GB, it means developers will make their apps efficient enough to run well on 8GB, since all users and the developers have that amount of RAM.

If MS Teams does not work properly on the 16GB MacBook, then you report a bug and they will fix it; but if it doesn't work on a regular PC, it is more likely they will tell you to just get one with more RAM.


Yes you can, discord client: https://cancel.fm/ripcord/

And I also use Mumble from time to time.


Are they any better than the PWAs?


They have far more extensive permissions like direct filesystem access. It can make some interactions much more fluid or even just possible at all. You couldn't run VS code very well in a browser for example.


"Discord Helper (Renderer) - Not Responding".


Can anyone explain to me the psychology of having 30 tabs open? Why not just bookmark the URLs to an ephemeral folder or something? Having more than 10 tabs open is beyond stressful for me, and honestly I can’t see how having more open would make anyone more productive. Really curious to learn about other workflows and excessive tab use is one I’m morbidly curious about.


How can you do research without going beyond 10 tabs? I typically go horizontal. If I'm trying to solve a problem, I will middle-click on the top 5 results etc. and go through them until the problem is solved. Then there is research on various topics that requires deep dives into papers; those papers then have citations that require other papers or open up other queries. Then going through email generates various links that I need to see, via Google Alerts, groups etc., and that doesn't even get started on links generated out of HN/Reddit etc. How do you get away with fewer than 10 tabs? I also use multiple tab-managing plugins, not just for memory, but for searching between open tabs, and then to manage and collapse entire windows etc...


My workflow for research is like this:

- open link, determine if it’s relevant or not. If it’s relevant, but not relevant right now, I’ll dump it into my notes.

- then I close the tab. Pretty simple.

- when the time comes where that link is relevant again, I’ll find it in my notes and open it.

- I really don’t find any value in having tabs open that aren’t immediately relevant to what I’m doing. 10 seems like the upper limit of focus for me.


> I really don’t find any value in having tabs open that aren’t immediately relevant to what I’m doing. 10 seems like the upper limit of focus for me.

It's clear that you do understand the value of preserving tabs that aren't immediately relevant to what you're doing. What's the difference in value between writing tabs in your notes and simply not closing them?

I'd understand if there weren't tree-style tabs - you can organize things better in your notes. There are tree-style tabs, though, so you can organize your tabs as you need them, and dropping into a root node is just like dropping into an old thought process.


I'm not the OP, but I almost never have a lot of tabs open, because to me "keep multiple windows each with dozens of tabs open" isn't organization any more than "keep dozens of icons on your desktop" is organization. Some people love that, but I can't find anything that way. And every implementation of tree-style tabs I've seen is -- again, to me, personal preference, YMMV, fill in your favorite disclaimer -- a hot mess. More to the point, it's still "keep multiple windows each with dozens of tabs open wait don't close the window reflexively OH NO YOU CLOSED IT FLAIL FLAIL UNDO HIT THE HISTORY UN-ERASER BUTTON WHEW IT'S BACK". Jesus. No. OMG stop.

Seriously, though, it's just a different way of working. If I want to save a link because I'm genuinely going to need it later, I save the link. More often than not, it just goes in the drafts or annotations for the article that I'm working on at that moment. If not, I save it in GoodLinks, where I get a title and a summary and tagging and syncing across my laptop and desktop and iPad.

I get that I'm an anomaly these days, and that "if you have less than 50 tabs open across three windows you're an amateur" is the norm among technonerds. (That is an actual quote from a friend.) But I am pretty sure the Venn diagram of the all-the-tabs-all-the-time folks I know and the "which tab is it? nope, nope, nope, I'm sure it's here somewhere" folks I know is essentially a perfect circle.


> What's the difference in value between writing tabs in your notes and simply not closing them?

Well, one is indexed, searchable, tagged, and available whenever and wherever I want.

The other is ephemeral and maybe, hopefully I can find it and maybe hopefully I didn’t close it out, and maybe hopefully I remember the name so I can find it in my search history.

Managing tabs with useful info instead of writing them down sounds like a living nightmare.


I drag excess tabs into their own windows to group them by subtopic. Too many tabs makes it too hard to track what all the tabs are. When there are more than 2 windows of 10 tabs I’ll move over to collecting and grouping annotated links in notes.


Bookmarks are like a bin things get thrown into and completely forgotten about. I open a bookmark maybe once a week, and then it's usually for something like accessing the wifi router, or a bookmarklet for adapting a web site.

Tabs are like a TODO list. If something needs to have attention paid to it, and then dismissed, I open a tab for it.

(Of course, I use Firefox & Tree Style Tabs. I find Chrome almost unusable due to its tabbing idiom.)


Interesting.

I’d never be able to use Tabs as todos because I personally use tabs very ephemerally. Closing out of chrome with 3 profiles and 10 tabs each open is nothing to me because I don’t care what tabs are open. If they’re important links then I’ve already written them down in Roam and/or bookmarked them.

question - how do you find anything? Maybe it’s the fact that I’ve never used tree tabs, but finding the link you used last week or last month to complete a task sounds like a nightmare without bookmarking it. Contrast this with my workflow, which doesn’t rely on tabs, and I can easily find it in my notes in less than 10 seconds.


Firefox suggests open tabs by default when typing in the address bar.


Tabs are like temporary bookmarks. I use 1 window per task and usually open links in new tabs. I don't need to do anything to pause a task: simply minimize the window and the tab unloader extension does the work of automatically keeping my RAM from overflowing.

When returning to the task, I just focus the window and instantly have a picture of my previous progression. In case of a computer crash or restart, it's been years since the Firefox session restore function ever failed me. That step in booting up the computer is quite painful though, the browser can take a good 60 seconds to restore the previous session.

When I'm done with a task for good, I just close the window. Workspaces help in keeping my Taskbar and Alt+Tab workflow clean as well. I often have 1000+ tabs "open".


Personally, on my phone and work computer I keep a legion of tabs open. On my phone, it's because using bookmarks is slower than opening the tab menu. Plus the OS will swap out tabs that haven't been used in a while.

With a work perspective in mind I have one window set up per task, I use tree style tabs on Firefox, and at a glance I can see the complete context of a task I’m researching, just from the decision tree the tab branches show. Every potentially interesting link that’ll help me with my task gets opened on a branch below the parent, then reviewed and filtered. This is tremendously useful when it comes to writing up documentation or updating tickets and the like.


30 tabs is like 30 apps. One might be a word processor, another texting, another an image editor, etc...


Someone else in this thread was bragging about having more than 800 tabs open


Hmm. I’ve never ran into that myself and I’ve used a lot of swap on my 8gb M1 Air, so you may be running into another issue, but one thing I have noticed is that performance is far better with native Apple apps. I switched to using Slack in the browser and Safari as my main browser and battery life and performance have been amazing.


Nope. Got the 8gb too. I have xcode, firefox, safari, mail and a lot more. No issues, apart from xcode because it sucks, and has sucked for years.

Chrome is your problem.


Are you using an ad blocker? If not, maybe that's why Chrome is struggling. All these adverts eat resources like crazy.


> "Simple commands - "close all tabs" in Chrome - can take 50 seconds to execute."

Open Activity Monitor and keep an eye on the memory pressure graph, as well as checking for individual apps that are running away with huge amounts of memory. Also, make sure your SSD isn't too full, as pressure on the NVM will reduce swap performance!

M1 Macs certainly seem to "do more with less" than Intel Macs, but as another poster said, they're not magic. Paging in and out vast amounts of swap will still eventually cause performance issues.
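If you'd rather check from the terminal, the built-in tools give the same picture (output formats vary a bit between macOS versions):

  # current swap usage
  sysctl vm.swapusage
  # rough system-wide memory pressure summary
  memory_pressure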


I don't have an M1, but on my computers I find Chrome to be memory heavy and rather slow.

It's worth trying out Firefox and seeing if that works better for you.


I have an 8GB M1 MBP. Photoshop, Lightroom, Resolve, Sublime, Chrome (50+ tabs), Transmit, etc and it's almost always very fast. Only slows down if importing to Lightroom and generating previews, or rendering video from Resolve.


Well, it definitely is more efficient, but RAM is still RAM… Swapping a whole 8 gigs will make your system sluggish no matter what. I feel like 8GB is fine if your usage is really light, but swapping more than 70% will slow it down extensively. On the other hand, there is a whole camp of people who run heavy dev suites on 8GB and don't have these issues, so yours might really be a faulty unit. It's the first version, in the end.


The 8GB M1 I bought was constantly fighting the limited RAM, so I exchanged it for a 16GB model, which has been perfect. For those looking, the MicroCenter near me has 16GB MacBook Air models available at a discount, other MicroCenter locations might have the same.


Same here (although it’s more like 200 open tabs across 10 virtual desktops). It still feels slower than my cheap Atom netbook, but the battery life and hardware quality more than makes up for it.

This might be an opportunity to tweak my browsing habits.


How do you even split windows/tabs across that many desktops? I mean “logically”. Are you working on 10 different things in any given day/week?


A laptop display is too small to see more than a single window. I always make my windows full screen.

For example right now I have the following windows/desktops:

0. Google Calendar + Gmail

1. Facebook Messenger

2. YouTube (music/podcast)

3. Terminal

4. Visual Studio Code

5. Documentation (5-10 tabs)

6. Google Colab

7. Google Cloud Console

8. Google Sheets + Docs

9. Wikipedia (20+ tabs)

10. Google Search (30+ tabs)

11. Hacker News (20+ tabs)


I feel like I am the only one who doesn't hoard browser tabs like everyone else I read here. Why 20-30? Why not close the unnecessary stuff, or bookmark it if it's important?


Memory management certainly feels pretty sluggish on my similarly configured Air. I'm holding out hope for a 64-gig model if Apple really wants to challenge my desktop.


Yes. The M1 is not magic. Or at least it has limits.


This is funny, but I prefer my computers silent.

Semi-related: I can hear the inductors on my NVIDIA GPU, which is somewhat useful. When I am training deep learning models, the coil whine follows very regular patterns. Also, the pattern changes between training and validation. So, I can literally hear when an epoch is done.

(Of course, TensorBoard for the real monitoring.)


I can relate 100%

I could literally hear the minibatches running, they sound more or less evenly spaced treek..treek.. and when it's at the end of the epoch, the larger batch for validation begins, it's a much longer treeeek


When I'm training a model on an RTX 2070, I have a 'ticking' coil whine that ticks every second... kinda funny that it only happens when I do that.


Do you do anything special for cooling the GPUs? I mounted a giant CPU heatsink to my gtx 1080, and it still has to turn on the fans when running ML workloads.


If you have something that produces ~300 W of heat and you want to keep it from becoming hot, you're going to need either fans or liquid cooling (which also has fans, on the radiator).


Do you also hear this when your speakers/audio system are off?


I'm not GP, but yes: coil whine is audible as a high frequency whine/buzz, caused by power supply components (inductors) physically vibrating at high frequency.


Certainly, I was just curious if they were hearing coil whine.


Yes, there are no speakers connected to the system.


Haha! I sort of need this. It's amazing how the only time I realize I have a runaway process is if my laptop heats up.

It's truly incredible how much serious heavy work I do on a "little" M1 Air.

MySQL and Postgres, Redis, Sidekiqs, Mailcatcher (small, but still), a couple of Rails apps, Spotify, Spark mail app, Firefox with dozens of tabs, Safari, Calendar, Photos in the background hopefully identifying faces, etc. etc.

I do three times as much on this machine as I could do on my 32GB i7 Dell. There's some magic in here.

I still cannot build some older versions of libraries. So I do a little of my work on an old Intel Mac. But that's not the architecture problem, that's a problem of fragile builds.


Makes me want to replace my i7 MBP sooner than I was planning!


I thought this Air would be a placeholder until the M1 MBP, but unless you need a real GPU, I find this more than good enough. I won’t need to upgrade.


No troll, the fan shutting off was how I knew my app had compiled and I could start debugging.

With the M1, I now have to check the cpu usage history occasionally.


Honestly I believe that as developers we don't use audible feedback enough in our tooling.

Imagine diagnosing your app health like you could diagnose your connection speed of a dial-up modem.


> Honestly I believe that as developers we don't use audible feedback enough in our tooling.

Allow me to point out a very cool project I saw in the early 2000s, which unfortunately seems to have been abandoned and forgotten: http://peep.sourceforge.net/intro.html (Peep (The Network Auralizer): Monitoring your network with sound).


Agreed. I usually laugh at TV shows and movies that use a bunch of beeps and boops in their computer scenes.[1]

But sometimes I actually want that! Like, if there was a switch I could flip to turn on audio cues every time a window's contents are updated, or new lines appear in a terminal window. And every window update would be tagged with a different sound for error, warning, info, success, etc.

I know you can configure your terminal to play a BEL sound, but I'm looking for something system-wide. I want it like the movies, so I can pay attention to something else and not have to check in on the progress of my compilation or transcoding or long download.
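In the meantime, a narrow per-command version is easy to fake in the shell. A rough sketch (the sound files are whatever ships in /System/Library/Sounds, and the string matching is obviously crude):

  # pipe a long-running command through a filter that plays a sound per line type
  make 2>&1 | while IFS= read -r line; do
    printf '%s\n' "$line"
    case "$line" in
      *[Ee]rror*)   afplay /System/Library/Sounds/Basso.aiff & ;;
      *[Ww]arning*) afplay /System/Library/Sounds/Funk.aiff & ;;
    esac
  done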


I make extensive use of “say”, the built-in macos program that can say out loud anything you give it.


remember ping?

     -a      Audible.  Include a bell (ASCII 0x07) character in the output
             when any packet is received.  This option is ignored if other
             format options are present.


I've been feeling just the opposite - I've been using an IDE from the mid-2000s for a legacy project, and it's amazing how things have changed. The search function dings a bell if nothing is found, and it does it in real-time as you type... so it gets annoying really quick (no auto-complete, and not even ctrl-v to paste in to the search query)


The transition from search dialog to search bar is also a huge improvement.


i stick the OS's equivalent of `bell` at the end of long-running scripts :)


  && say 'All done, come get some'


For the Linux users

    && espeak "job's done"


Some ideas for playing a notification sound in a script runtime:

Shell

  # system beep; also flashes the screen if you have that enabled in accessibility
  osascript -e 'beep'
  
  # play the sound file given by the file system path
  afplay /System/Library/Sounds/Hero.aiff
AppleScript

  beep
  
  do shell script "afplay '/System/Library/Sounds/Hero.aiff'"
  
JXA + Cocoa API

  $.NSBeep()
  
  // search for and play a sound file; the lookup path is defined by macOS
  $.NSSound.soundNamed('Glass').play


Also shell (edit: as noted elsewhere):

$ say "Your build is done."


If you compile on the command line, modify your compiler, such as mvn.sh, to end with: ‘say “Done”’


command && say done


[flagged]


The fan _was_ their notification. Why would they have wasted time adding another?


Because the fan will continue for some time after the compile finishes, so you can do work before the fan stops.


Depending on the cooling solution, the temperature can drop like a brick. On my desktop I can go from 75 degrees to dead silent in about three seconds.


And laptop heatsinks have less thermal mass


A copper MacBook when? It would also be nice for the antibacterial properties.


  ./build && say "done"


You wouldn't be able to hear that when the fan on my MBP is blowing at its peak :-)


This reminds me of the old Nullsoft Beep. Some of it was just fun, but the hum based on cpu usage was unexpectedly useful. Archived here: https://www.1014.org/code/nullsoft/nbeep/


Hm. How about a nice colored Touch Bar widget that would change depending on temperature and load? At least it would make the bar useful.


I don't get the hate for the Touch Bar. I have the one with the physical Esc key and it has proved useful a number of times. Fast-forwarding ads on YouTube was nice until I discovered a good ad blocker. Screenshot shortcut always ready. Emoji ditto. Shortcut to format my code or comment out a line of code in Xcode? Yup, thank you.


That was the promise and I was bullish on it. Sadly Apple did nothing to improve on it. A Taptic Engine underneath would go a long way for example. Another thing would be an external keyboard with it.

As for the shortcuts, turns out that keyboard ones work as well. And are easier to hit while looking at the screen. Personally I spend a lot of time in vim, web based IDEs, qtcreator and orthodox file managers. All of these use F-keys most of the time. Cmd-/ is quite universal for comments, and I’ve remapped ctrl/cmd/shift+prtscr to various print screen shortcuts.


I believe that they want to include a Taptic Engine. Maybe the redesigned Pro will have it. Either that or they will kill it.


Guilherme Rambo's famed skills at discovering unannounced features and products may be a thorn in Apple‘s side, but this is inadvertently the best ad he’s ever made for them.


Just like the turn signal clicking for cars


Or more like fake engine sounds for electric cars.


This is useful! Every so often Docker acts up, or a YouTube video hidden in a tab somewhere or some silly mistake opens tons of Postgres connections on my M1. Previously the only cue was hitting critically low power (10%?) at 6 hours instead of 12. Now there's this app. Thank you!


Reminds me: 10 years back, battery cars suddenly seemed so much quieter that they added engine noise played over a speaker to alert pedestrians.


In both the USA and the EU, that's more or less required by law (it doesn't have to be engine sounds, I think, but some sound is required). See https://en.wikipedia.org/wiki/Electric_vehicle_warning_sound....


"suddenly" was because that's when they started existing in noticeable numbers.


No, it's because vehicles before then were not required to meet minimum sound requirements by law. Check the spec sheets of HEV/PHEV/EV vehicles made before then -- they don't have acoustic alerting systems. Now they all do (in the jurisdictions that require it).


The law didn’t exist until there were enough electric cars to care about this. The cause and effect starts with the number of quiet cars.


Yes, but it wasn’t their mere existence that caused automakers to add noise. People who buy these vehicles like that they are quiet. The noise generators were added only because regulatory action was taken.


One of my fav macOS apps is iStat. It lets you keep track of all types of stuff from the menu bar...

https://bjango.com/mac/istatmenus/


Or for the free open source equivalent, I like and use Stats:

https://github.com/exelban/stats

(Though I really wish it had a more Googleable name...)


Fan noise and hard drive clicks were very useful indicators of processes doing things they shouldn't.


An app that makes the sound of a hard disk seeking during disk IO would also be quite useful.
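A crude shell approximation seems possible; a sketch, assuming a short ~/click.aiff recording exists (hypothetical) and relying on macOS iostat's KB/t / tps / MB/s columns:

  # poll disk0 throughput once a second and play a click while the disk is busy
  while true; do
    mbs=$(iostat -d -w 1 -c 2 disk0 | tail -1 | awk '{print $3}')
    # third column of the latest sample is MB/s over the last second
    if [ "$(echo "$mbs > 1" | bc)" -eq 1 ]; then
      afplay ~/click.aiff &
    fi
  done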


This is awesome! I built a similar open source version a few years ago for a friend with a fanless Air, but it foolishly used system notifications and a menu bar icon instead of sound:

https://github.com/caseymrm/notafan


Back when I was a solder monkey at the Media Lab we had some literal fans overhead in our workspace that would turn on to reflect network load.

https://tangible.media.mit.edu/project/pinwheels


Many people are mentioning how the lack of a fan is so strange and feels to them like something's wrong. I'd also like to point out that the introduction of the fan in the first place was probably the thing that should have seemed very strange. Just pause for a moment and think how odd it is that there is a very high correlation between heavily using one's computer and a noise being emitted from inside it. It seems pretty arbitrary that a fan spinning faster and faster should be the thing that happens when I start up Photoshop. But then again, how many things like this do we take for granted, just the way they are until they're gone, simply because we have no other reference point to compare them to when they're first introduced?


Outside of work-provided computers, I hadn't been a Mac user for many years, but the M1 MacBook Air brought me back and it's been great. It's basically the laptop I've been waiting years for: silent, with no moving parts and very little in the way of compromises (I personally have no need for super long-running CPU-intensive tasks). It's weird to see it compile code or encode video without fans ramping up, and even if it does start to throttle toward the end of longer tasks, it still beats my fairly recent i7 work MacBook Pro and my Ryzen 7 4700U Windows laptop most of the time.


I bought an iMac Pro in 2019 merely because it had the most advanced cooling system in the Appleverse - i. e. compared to a high end maxed out regular iMac I probably spent 1000 EUR just to not be bothered by fan noises when doing audio work. Totally worth it after getting frustrated to no end when using the atrocious 2018 Macbook Pro. While the timing of my purchase probably wasn't particularly "smart", I applaud Apple for having understood and fixed this problem in their new line up. Definitely looking forward to the more fleshed out M1 laptops next year.


The temperature of the laptop is an accurate enough replacement for fan noise (if you use it normally, not in clamshell mode). I notice that I've become far more sensitive to subtle temperature changes in my M1 than I used to be in prior fan-cooled laptops. Probably because the fan tends to come on before the machine heats up to the touch, so temperature was a weak, secondary cue.


Excellent idea. Adding audible feedback to the runtime is an effortless way to know what’s going on under the hood. I submitted a post about this concept last week:

https://news.ycombinator.com/item?id=27808113

In it I add MIDI sounds to the system handlers from Node.js's event loop.


I love apps like this that sonify different data about your computer and your usage. I wrote something similar for Windows. It lets you listen to the different parameters like CPU usage, disk usage and RAM usage in realtime. [0]

[0] https://www.iamtalon.me/charm


This is like fake noise for electric cars.[1]

[1]: https://www.theverge.com/2019/7/1/20676854/electric-cars-art...


This would be funny for remote dev. I’d gotten used to the fan spinning up when building on my MacBook Pro (Intel), and after switching to building code on remote servers I realized I used the fan sound as an indicator that stuff was working.

Playing the fan sound based on remote server cpu would be a fun hack.
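A rough sketch of that hack, assuming an ssh alias called devbox, a Linux remote (for /proc/loadavg), and a local fan.wav recording:

  # play a fan recording locally whenever the remote build box looks busy
  while true; do
    load=$(ssh devbox "cut -d' ' -f1 /proc/loadavg")
    # one loop of the recording if the 1-minute load average is above 2
    if [ "$(echo "$load > 2" | bc)" -eq 1 ]; then
      afplay ~/fan.wav
    else
      sleep 5
    fi
  done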


I prefer to read a graph, also another one for memory, another one for GPU, another one for network out, and another one for network in. This is in Ubuntu 20.04 with Gnome whatever.

Honestly, this fan sound seems to give much less information than what I am used to.


What's funny is that I had an app crash in the background and take up 80% of the CPU on my M1 MacBook Pro, and I only noticed because I happened to launch Activity Monitor for something unrelated. That couldn't have gone unnoticed on any other laptop.
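For anyone who prefers the terminal, a quick way to spot a runaway process on macOS:

  # list the ten biggest CPU consumers, sorted by current CPU usage
  ps -Ao pid,pcpu,comm -r | head -10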


I've long thought an auditory cue would be helpful when my computer is chewing on something. Maybe it could hum tunelessly, or breathe harder when it was working hard. Or get excited and sigh when it finished. Something human.


Not quite as useful as this, but https://github.com/rbanffy/selectric-mode also makes your computer sound more serious.



My car has fake engine noises piped in over the speakers (this is actually pretty common now), so maybe you're on to something. Though I doubt I'd miss fan noise myself.


Currently sat in my home-office, mid British heatwave, and my trashcan Mac Pro has been the primary noise for days; I wish I needed this.


This means I can finally upgrade, thanks! Not having the fans to scream against during my Teams calls was a real dealbreaker for me ;)


How did I know this would be full of "My new M1 with everything freshly installed is better than my cheaper 5-year-old PC"?


Is there a version with engine sound instead? :)


This is a joke, right? I've had my M1 for almost a month, and I have zero complaints about the LACK of sound from the fans.


Is there a similar app that mimics the disk noise of yore? That was a great feature for telling you how busy the I/O subsystem was.


I didn’t think a discussion about a toy app would be so interesting, but it turned out to bring a lot of great anecdotes.


This reminds me of the artificial noise they create on some quiet electric motorbikes as a safety measure.


Oh, okay, so this is just an app that makes it sound like the fan in your Mac is running? ... Cool, but why?


Borderline off-topic: Sometimes I wish electric stove tops made the sound of gas burners.


Do we have the same thing but for Tesla & Co.?

Cycling side by side with silent EVs is quite dangerous.


EVs make artificial sound outside at low speed for that reason. I'm not sure but I would assume this is government mandated.

This is different though, this here is for the user to be able to judge how much the cpu/motor is working. And yes, not only do some EVs offer you fake indoor noises, a lot of gas cars also do it as engines get more silent/sound worse to meet modern regulation. https://www.youtube.com/watch?v=4lQPc9VXBzk


In 10 years' time we're gonna get an app that simulates the battery going off.


This is like rumbling, muscle car, sound effects for your electric vehicle?


I don't want to download it, is there a clip of the sound anywhere?


> I don't want to download it

In case what you mean is that you don’t want to run it, an alternative: right-click the app, followed by “Show Package Contents”, and navigate to Contents/Resources. You’ll find a couple of WAV files.

Don't expect pleasant whoosh noises; they sound like rough recordings of computer fans.
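The same can be done from the shell; a sketch, with the bundle path as a placeholder for wherever the app actually lives:

  # list and preview the bundled recordings (the app path here is hypothetical)
  ls "/Applications/FanSound.app/Contents/Resources/"*.wav
  for f in "/Applications/FanSound.app/Contents/Resources/"*.wav; do afplay "$f"; done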


And maybe an app with 56k modem sounds when you use the web browser?


Nice idea. You can also have a CPU usage monitor in your app bar.


I mean, how else are we supposed to know the CPU is busy?


There is no sound without heat.


Does it have an option to permanently scramble the video a month after the warranty runs out?


lol i wish this wasn't so inevitably the best


Just say no.


But... why?


I keep having some rogue apps that are burning CPU for (seemingly) no reason. When I feel some warmness, I check `htop` and there it is, IINA (a media player) is pegging the CPU again.


It's part of my workflow



Came to the comments to ensure this had been posted!


It's a joke that's meant to ironically allude to the fact that M1 computers don't generate fan noise.


It was an April Fools' joke app, but it actually works, because that's much funnier:

https://9to5mac.com/2021/04/01/april-fools-day-2021-cybermou...


Pretty sure this is a response to a joke on Twitter in the Apple dev community.


hype


how about an actual fan to improve performance? :)


The latest Macs already have this, at least some of them do. The audio output gets distorted the more CPU load is going on. Really fun on video calls.


It's like when a person drives a Tesla: it's fast, but without the ugly vibration and noise that most people used to associate with a performance car. But some people crave those sounds. I saw some mods doing that lol :) Kind of a cargo cult: they chase the uncomfortable attributes rather than the actual speed.


Not sure about that. The ride I took in a model X was pretty uncomfortable for a $100k car. Turned me off Tesla as an option.


I'll wait for the Pro… But honestly, the last Airs and Pros are a joke, especially when using a second monitor. The Pro has a known bug and can get pretty slow. The Air is unresponsive as hell if a second monitor is used and you are in a virtual conference with screencast and cam on. We must always look at the price tag, and on that point Apple has really disappointed recently. Don't get me wrong… I am a dev and I appreciate the same things as you do. But I am not hyped by the M1… I just think that at this price point Apple should finally get things sorted out, and if the M1 works out… well, then thanks, Apple, and thanks for letting me buy all this expensive crap upfront ;)


I’ve been using the Mini for 7 months without a single hiccup – daily routine requires several Adobe apps running simultaneously and a couple of dev node servers plus browsers, mail and the like, on a 4k/5k double monitor setup.


Strange... I frequently dock my M1 Air to a 3840x2160 display and it can easily run my 30+ track Logic Pro session without any perceivable throttling.


I believe he's referring to the previous generation of Airs, not the M1 version.


The M1 line up seems great but... I'm very surprised by all the comments from people discovering what a quiet computer is like in 2021.

I've been running PCs so quiet they're inaudible for, what, nearly two decades now!? (since basically when the first consumer SSDs started hitting the market: I don't remember when that was, but it was a long time ago)

An ultra quiet PSU, a gigantic CPU heatsink, a huge CPU fan running so slow you can read what's written on the blades, a passively cooled GPU (ofc not working if you do GPU heavy stuff) and that's it.

Now it's great if, at last, it's coming to beefy laptops too. But a quiet computer ain't anything new.


Computers like the "HUAWEI MateBook X" are fanless with a powerful CPU (10th Gen Intel® Core™ i5-10210U Processor, 1.6 GHz 4 cores, Turbo up to 4.2 GHz). https://consumer.huawei.com/en/laptops/matebook-x-2020/specs...

I have also used a Surface Pro for a long time, and while not as modern as this new Apple laptop, it is a full-fledged Windows 10 machine that can run any programming IDE, virtual machines, etc., completely silently.

Can someone explain the down-votes for TacticalCoder for just stating facts? I really don't get it, unless it is pure Apple fandom, and that would just be silly.


> the "HUAWEI MateBook X" are fanless with a powerful CPU

Please note that a MateBook with an i5-10210U gets around 1100/2500 in Geekbench, while the also-fanless M1 Air gets 1700/7400.

Silent computers are nothing new under the sun, but silent ultraportable laptops as powerful as some people's desktops haven't been around until last year.


I didn't downvote. But I assume it's because he's focusing on heat mitigations, which are old, while the point is that the M1 produces less heat in the first place.

There are some obvious advantages to not creating the heat at all, advantages that mitigation alone does not realize.


Quiet computers are not new; quiet high-performance laptops, however...


So you're comparing what sounds like a hand-rolled custom build with a no-brainer laptop that's a third of an inch thick.



