2018 Mac Mini Review (marco.org)
455 points by janvdberg on Nov 6, 2018 | 639 comments



>Number one — and this is a big one these days, especially for this product — is that it’s not any less useful or versatile than the outgoing Mac Mini, including the generous assortment of ports. If the previous one served a role for you, the new one can probably do it just as well, and probably better and faster, with minimal donglage.

Wow. This is what it's come to. "It hasn't gotten worse! Yayy!"


The quote is a bit out of context. The sentence right above it:

>It makes almost nothing worse and almost everything better, finally bringing the Mac Mini into the modern age.

So, no, it's not "yayy" because it "hasn't gotten worse", it's "yayy", because "[it makes] almost everything better".


Anyone who owns a MacBook Pro with Touch Bar will definitely be saying "yay it hasn't gotten worse!" about these.


And now the top thread is dominated by an argument about laptop keyboards, for a device that isn't a laptop and doesn't even come with a keyboard.


But to be fair, the device is made by Apple and the discussion speaks to how Apple treats its consumers.


I'm pretty happy with mine, but thanks for speaking for us all with your sweeping generalization.


I work at a company where most of us use Macs. Everyone that has upgraded from a 2013-2015 MBP to the touchbar model absolutely hates it, including me. There's gotta be someone around here that actually likes it, but I haven't met them yet.

It's OK if you like it. I'm sure lots of people do, but you know damn well significantly more people have had issues with this laptop than normal, so I'm not sure why you're implying otherwise.

The keyboard is by far the worst I've ever used, and I've owned a variety of $200 Wal-Mart Black Friday laptops and a couple of netbooks, in addition to some high end stuff. I had a friend with an older MBP attempt to show me something using my computer and he could barely type on it without frequent mistakes. Within a minute he was getting frustrated. It just feels terrible. I never thought anything would be worse than typing on a touch-screen but Apple's engineers have accomplished a horrifying miracle. It's like they intentionally tried to design something that's as loud as a mechanical keyboard while still having worse tactile feedback than a $5.00 rubber dome keyboard.

On top of that, it's not noticeably faster; after 4.5 years it still maxed out at 16GB of RAM (they fixed this in 2018, but it's too late), which is not enough for my use case, and it died after 3 months (not the keyboard, it was a power issue).

This is both the most expensive and the worst computer I've ever owned. The 2013 MBP I'm using now as a loaner while I get my new one fixed is, to quote Steve Jobs, "like getting a glass of ice water in hell." It just works.


The touchbar made me realize that I have a habit of resting my fingers up there. I came to this realization because when I didn't think I was typing anything, stuff would happen. Eventually I realized I was touching virtual buttons on the touchbar.

I don't hate it, but I wish more of the tools I use took advantage of it.

I think it could be massively improved by shortening it a little on the left to make room for a physical escape key. That's about 95% of my problem with it.


Consider mapping CapsLock to Esc (and Ctrl when long-pressed) with Karabiner. I haven't used my physical Esc key since 2014 or something. Might help with your issue.


You don't need a third-party tool; System Preferences does this for you already. (Maybe not the long-press, but I'm a vim person, not emacs.)
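For the curious, the remap can also be scripted rather than clicked through; a sketch using Apple's hidutil (available since macOS Sierra, I believe; the mapping resets on reboot). 0x700000039 is Caps Lock and 0x700000029 is Escape in the USB HID usage table:

    hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,"HIDKeyboardModifierMappingDst":0x700000029}]}'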


If you remap CapsLock to Esc, what key do you use for Control for doing Control A, Control E, Control K, Control Y, Control N, and Control P in every application?


I tried to address this in my parentheses, but Karabiner specifically lets you bind CapsLock to both: Esc when tapped alone, and Ctrl when held down in combination with other keys.

And your post also answers your sibling comment: the Ctrl modifier is useful outside of just Emacs.

Try it out. I believe the API changed back around macOS Sierra, so they relaunched Karabiner under the name Karabiner Elements. This option is under the "Complex Modifications" tab.
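For reference, the rule ends up looking roughly like this in Karabiner's JSON (a sketch; the actual config lives in ~/.config/karabiner/karabiner.json):

    {
      "description": "CapsLock alone sends Esc; CapsLock held acts as Ctrl",
      "manipulators": [{
        "type": "basic",
        "from": { "key_code": "caps_lock", "modifiers": { "optional": ["any"] } },
        "to": [{ "key_code": "left_control" }],
        "to_if_alone": [{ "key_code": "escape" }]
      }]
    }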

If I was writing a list of tricks for macOS power users to try, this would be my number one. Up there with binding a global show/hide hotkey for iTerm (I use Ctrl-Space, thus CapsLock-Space).


I prefer the Magic Trackpad 2 (or the native MBP 2015 trackpad) to any other way of navigating, but Vi keybinds (if available) work as well.

I currently use Karabiner for this as well, but a slightly different configuration.

The two rules I use are:

* R-Cmd + hjkl are arrows (this works great with an HHKB, but even on the native MBP keyboard it requires less hand movement from the trackpad or typing position than the arrow keys)

* Caps solo is Esc while Caps with another key equals Ctrl.

Is there a way to do this in Linux as well? I currently have to use Linux regularly, and I rebind Caps to Ctrl; however, for Vim it isn't ideal. So I'd like to have the same functionality I have with Karabiner on Linux (Xorg / console).


You can use xcape on Linux to do this. I use it to map Caps to Ctrl/Escape, and right Shift to right Shift/Compose.
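Roughly, for X11 (a sketch of that setup; xcape emits the second key when the first is tapped on its own):

    # Make Caps Lock act as Ctrl at the keymap level...
    setxkbmap -option ctrl:nocaps
    # ...then emit Escape on a solo tap of (what is now) Left Ctrl,
    # and Compose (Multi_key) on a solo tap of Right Shift.
    xcape -e 'Control_L=Escape;Shift_R=Multi_key'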


I use Karabiner Elements to bind backtick (grave accent) to Esc, Command-backtick to Command-backtick as normal (so you can switch between windows of the same app), and Option-backtick to backtick.

It's not optimal, but I do it because backtick is about the same position as a hardware Escape key. It makes using vim feel ok again.

This option is available from the KE Complex Modifications tab.


Note that this means you lose the ability to use Option-backtick for diacritic marks. Normally Option-backtick followed by "a" makes "à", Option-backtick followed by "e" makes "è", etc.


This isn't working for me in Mojave; you haven't had a problem?


In this mode... if you hold down CapsLock as Ctrl and then don't decide to press another modifier... does it still fire CapsLock as Esc?


I already have a Ctrl key, though. Whatever works for you; personally, I prefer not installing extra third-party system-level software just to make Ctrl marginally more convenient to press.


Are you denoting control+shift+a as control-A?


No, I was just typing on a phone. I meant control+a (start of line), control+e (end of line), etc.


Capslock is ctrl for most developers I know (because it's by far the most used key in terminal/Vim/emacs/etc).


Yes. Although on a Kinesis Advantage keyboard (the "official" Emacs keyboard), because Control is under the thumb, it is useful to remap CapsLock to Esc.


You should snag an app called BetterTouchTool. Makes the TouchBar an amazing addition to the laptop. It's just a shame that Apple didn't build in a tool like it but I hope the developer gets lots of love.


I have BTT for my machine at home, but the IT security department won't approve it for the work machines.


Doesn’t BTT require enabling unsigned kernel extensions?


Yes! I am constantly holding down the escape key without realizing it. Not a problem very often but occasionally causes some really confusing things to happen.


Doesn't it also get very distracting with how often it flashes into different states? I've heard this drives people crazy.


When you remap it it doesn't flash at all.


The touchbar? I find it more amusing than distracting.


> It's OK if you like it. I'm sure lots of people do, but you know damn well significantly more people have had issues with this laptop than normal, so I'm not sure why you're implying otherwise.

Yup, that's exactly what the HN groupthink would like to believe. What is by definition a small user demographic complains repeatedly/loudly that their problems are the most important, indicative of "everyone", and Apple is doomed because so-and-so bought a Surface Pro/Dell/System76. The hyperbole is kicked up a notch here, as clearly any $200 Walmart laptop keyboard is better.

As is typical, there's never any data to support the claims and a significant number of counter anecdotes are dismissed absent critical reasoning ("you know damn well...!"). Instead, the echo chamber resonates unabated by logic.

Throw in an out-of-context Steve quote and you can identify this drivel pretty uniformly. It's usually best to ignore, though at times a response is merited when it's completely off topic and unhelpful (as it is here; the new Mac mini seems awesome regardless of a hater's two-year-old take on the tbMBP).


I like the feel of the keyboard. I did not like the trip to the Mac store at 8 months whereby they very gingerly lifted the spacebar and removed whatever crumb was under there making it mushy.

I also like the touchbar. The touch-slide volume is a nifty improvement... I use the touchbar for the occasional screengrab... and that's about it. As someone mentioned above, I too would really like a physical Escape key.


If you go back and read my comment, I didn't say that out of Apple's entire customer base, more people dislike the keyboard than like it. I said that more people are having issues with this computer than a normal Apple MBP.

Just like your comment, this is an anecdotal opinion on the Internet. I didn't read your response and come away with the conclusion that you were attempting to represent it as a peer reviewed white paper, so I'm not sure why you're holding my random Internet comment to the same standard.

My anecdotal evidence is that all of the typical places people go to talk about technology on the Internet (reddit, hacker news, blogs, etc.) seem to have more complaints about the keyboard on the new MBP than I recall seeing about the old model, which was almost universally hailed as the best laptop on the market.

Included with that anecdotal evidence is my own experience in a company with hundreds of people that use MBPs. There might be other companies where everyone loves them.

It would be really helpful if we could trust Apple to publish accurate, relevant data on the new MBP vs the old one, but they have a history of hiding, denying and/or lying about issues with Apple products.

>The hyperbole is kicked up a notch here, as clearly any $200 Walmart laptop keyboard is better.

This isn't hyperbole. It's just my opinion and it was presented as such. I've been using computers since the early 90s. I've never used a keyboard that felt worse to me than the MBP. When I say that, I don't mean that it's one of the worst keyboards I've ever used, I mean that it's THE worst keyboard that I have ever used, which is obviously just my opinion.

>Throw in an out-of-context Steve quote and you can identify this drivel pretty uniformly

This isn't a case where reusing a quote changes the context of what he meant. I wasn't saying that Steve Jobs agrees with my opinion of the new MBP. This should be pretty obvious. The original context was that Steve Jobs felt one product was so superior to another that getting the former was like getting a glass of ice water in hell. It's a perfectly relevant quote used in the same way that he did. I just happen to be comparing different products.

>It's usually best to ignore, though at times a response is merited when it's completely off topic and unhelpful (as it is here; the new Mac mini seems awesome regardless of a hater's two-year-old take on the tbMBP).

1. It's perfectly on-topic to discuss the quality of Apple products in a post about an Apple product. The reputation of their products is pretty relevant when people are deciding whether to buy a newer product.

2. I didn't bring up the MBP, someone else did and I responded.

3. Not liking a single Apple product doesn't make me a "hater". I loved my 2013 MBP, I loved all of my iPhones/iPads, and I love MacOS. Apple released a product that I have an issue with. Lots of other people are having the same issue. I'm not sure why you have to react to that like a personal attack.

There's this phenomenon where if lots of people have an issue with something and then a random person buys the product and has a good experience, that person decides to dismiss everyone that has had an issue, pretend it's impossible the issue existed, and then act like there must be something wrong with the people that had the issue. I don't really understand that, because with any product that sells thousands or millions, it's typical for some people to have issues even if most like it. In this case, it just happens to be a product where a slightly larger percentage than normal is having an issue.


I don't understand all the hate for the touchbar. I can never remember the functions performed by the function keys, so having icons indicating which functions are available seems really useful.


The problem is that most serious computer users use a laptop docked, so it’s useless 70-90% of the time, which means the software can’t rely on it, which means it’s an afterthought in all apps.


>most serious computer users

Ahh... the good ol' Scotsman shows his face yet again! Right on time!


A Scotsman would be saying “No true computer user...”; this was a generalisation. An apt one, IMHO.


No it wasn't. Define "serious". What makes one computer user more "serious" than another? The point of the no true Scotsman is that the term is never defined by the statement, and so it's completely subjective.


That’s not how it works.


Actually it is: https://en.wikipedia.org/wiki/No_true_Scotsman

Refuting the initial generalization by saying that no "serious" computer users use their Touch Bar because their computers are closed is nearly a textbook example of the fallacy.


that's not what he (or she) was doing.


I don't see how much more clearly I could break down what happened. Their entire argument was predicated on the claim that most "serious" (replace "true") computer users don't use the Touch Bar because their lids are closed.

I'll compromise. We'll call it the "no serious computer users" fallacy instead of the "no true Scotsman" fallacy.


Why close it when 'docked'? Mouse on right, keyboard in middle, touchpad and touchbar on left.


Do you have to look down to use those icons on the Touchbar? This is something I don’t have to do as a touch-typist. Repeatedly craning your neck to look down will result in RSI. I spend a little effort to learn new hotkeys every so often and the benefit far outweighs the cost: I’m much faster and keep my good ergonomics.


I don't use the function keys much, so yes, I do look at the keyboard when using them.


Comparing the keyboard with a $200 Walmart netbook is a joke. It might not be to everybody's taste, but it's far from terrible. I actually like it, and at my company, where 95% of engineers have one, there are not more complaints than usual about the new model. People that really can't stand the keyboard will usually use external peripherals at their desk.


I share in the hate. It’s the worst keyboard I’ve used in memory. The trackpad is first rate however.

The touchbar is so much less functional than the keys it replaces. Some key combinations now require serious acrobatics, and you can’t touch type as easily.

We’re not buying any more macs until it improves.


>the touchbar model absolutely hates it

What kind of work do you / your teammates do? I walked into an Apple store last week prepared to hate the keyboards and touch bars, but was very pleasantly surprised. My ladyfriend wanted to see the Air, but was so taken with the Touch Bar for photo work that she will be buying an MBP instead. While it might not be everyone's cup of tea, I think Apple may have figured its demographics correctly on this one.


I don't have a problem with the touchbar; it's the keyboard that they introduced on the same model.


I also think the new MacBook Pro is the worst laptop I've ever had. I hated it so much I switched to a Thinkpad. Thankfully these are company issued laptops and I could do this pretty easily. Had I bought a personal one I'd be quite distraught.


Anyone who hates the touchbar has never chased a chat window for the mute button (it's there). Also, no need to leave the keyboard to click on dialog box buttons; those show up there too.

It could be better: it could require a bit more pressure to press buttons and could have haptic feedback like the touchpad, but I bet someone is working on that, even if it means extending haptic feedback to the whole chassis (which is not a bad idea anyway).


I have a 2013 15" MBP, a 2015 13" MBP, a 2016 15" MBP w/ Touchbar, and a 2017 15" MBP w/ Touchbar. I vastly prefer the 2013 and 2015, even though they are bare-bones specced and the newer MBPs are top-of-the-line. The touchbar MBPs are provided to me by my employer. I just bought my 2015 13" this year, after considering getting a newer model, and I'm planning for it to be my main personal laptop for years to come.

I don't like the touchbar, I don't like the new keyboard, and I don't like the new ports.

I think there is something wrong if, now two years later, there are still people like me who not only don't see a clear benefit to upgrading, but see it as a net-negative.

Ideally, there should be nearly no one (if anyone) who prefers the previous iteration.


It makes things universally worse imo. Everything was by touch for me previously, including adjusting sound. I had the keyboard memorized. Now I constantly have to look down. A keyboard shouldn't require me to change my focus, that defeats the entire purpose.


Same. I bought a 2018 MacBook Pro with the Touch Bar this July. Before that, I owned a 2014 MacBook Pro, the last one with the mechanical trackpad. My new machine is absolutely better than my old one.


In what ways is your new machine better, apart from being thinner and just a bit lighter?


The dimensions aren't really a selling point for me at all. Both laptops are thin enough and light enough for me to carry them to and from work comfortably.

Performance is better across the board (6-core i9 vs 4-core i7, 32GB vs 16GB, faster SSD). It has a larger, pressure-sensitive track pad. The display is better.

I have a slight preference for the old keyboard, but I don't dislike the new one. I could take or leave the TouchBar. I was never a big user of the F keys and IntelliJ, where I spend a lot of my time, has good TouchBar support. I like having Touch ID. I'd prefer to have a real escape key, but the button is still in the same place so I haven't had to retrain my fingers to hit it.

USB-C with a Thunderbolt 3 hub is marginally more convenient than a Thunderbolt 2 hub plus a MagSafe power cord. I do miss MagSafe though.


These are almost exactly my thoughts as well coming from a 2015 MBP. I would only add that for me, I actually like the new keyboard. The low/firm travel of the keys is actually preferable for me. I felt like the old keyboards were "mushy". That's obviously a personal preference, however.


drcongo is making a statement about the population, and readers can be trusted to not be so confused as to think that this is the official speech of the people.


> Anyone who owns a MacBook Pro with Touch Bar will definitely be saying "yay it hasn't gotten worse!" about these.

If drcongo doesn't mean to speak for "Anyone who owns a MacBook Pro with Touch Bar", drcongo should refrain from using those exact words. And you should probably refrain from speaking for drcongo.


It's pretty common to make sweeping generalizations in casual conversation, which this is. I find it really annoying when someone says something like "everyone likes cats!" and then someone, inevitably, will hop in and reply with "ACTUALLY, I hate cats." This isn't a mathematics proof.


Sure, it's not necessary to be completely accurate in casual conversation, but you can say "heaps of people like cats!" or "most people like cats!" or "almost everyone likes cats!" and make your point just as effectively, and without trying to invalidate or dismiss the dislike of cats.


http://surveys.ap.org/data%5CGfK%5CAP-GfK%20Petside%20Like-D... "Almost everyone likes cats!" would definitely be false :-)


> It's pretty common to make sweeping generalizations in casual conversation, which this is

It's interesting that you chose the term "sweeping generalization". It almost always carries a negative connotation. In fact, it's considered to be a logical fallacy (see https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFalla...).

> I find it really annoying when someone says something like "everyone likes cats!" and then someone, inevitably, will hop in and reply with "ACTUALLY, I hate cats."

I'm sorry I annoyed you. I find it annoying when someone speaks for me and gets my opinion wrong.

edit: Is it really "inevitable" though? It happens every single time? (just kidding)

> This isn't a mathematics proof.

No, but communicating clearly and honestly is important, even in casual conversations.


[flagged]


You know what? I love hash browns too. One of the best uses of a potato imo.

https://www.youtube.com/watch?v=BPX-wuplDvc


I'm terribly unhappy with mine and so is everyone else I know, so I'm pretty pleased with it.


I like the Touchbar more than most of my other colleagues like the Touchbar. And I hate the Touchbar.


The new MacBook Pro keyboard is flimsy and easily broken. I bought my own MacBook Pro; the delete key stuck.

Got a new one from my new job, keyboard stuck.

My brother bought a new one, keyboard stuck.


Clearly, you are holding it wrong.


You are liking it wrong :)


HN: This is the worst Apple product ever and Apple has abandoned satisfying their customer with the things they actually want.

Reality: Apple product breaks sales records.


McDonald's is also making record sales in Japan right now. So by your logic, that must be the best hamburger on the market.


Would you suppose that price and convenience might be factors aside from food quality?

This is the thing: narrow, single-issue analysis that doesn't even acknowledge that other factors of consideration exist will inevitably result in conclusions with very limited applicability.


I don't mind the touchbar. It has its moments. I just wish they had included it in addition to the normal keyboard, rather than replacing the top row.


I think the goal really was replacing the top row in the first place; the function keys are generally useless except for ancient legacy compatibility, advanced users' macros, and media control.

But the function row is flanked by the Escape and power keys, so the row itself couldn't go away.

I wish the Touch Bar was farther offset, 50-100% taller for a better display area, and that the escape and touch id/power buttons were distinct. I also wish there was a standard way for applications to advertise functions for it (I have some really useful tools that are third party, such as mic mute).


I don't dislike the touchbar, I'm just annoyed at having to pay so much for something I'm ambivalent about.


I don't understand why it isn't just an option. If you want it, you can pay for it.


Hate it or not, Apple has to make it relatively commonplace if they want macOS apps to bother developing for it instead of ignoring it. Their short-term goal is 100% of MBP users, and surely, eventually 100% of all macOS laptop users.

Keeping it optional indefinitely, then, defeats this goal.


Then is it going to come to desktop machines too, where people often use battery-powered Bluetooth keyboards, or peripherals not made by Apple? What about the tiny 12 inch Macbook?

I don't see how this thing is practical across the entire Mac line.


The 12-inch Macbook has an F-key row which is what gets replaced with the touch bar.

I only estimated that they were going for 100% laptop coverage. But I can imagine a future where Apple keyboards have a touch bar option. Though that's less important, because MacBook touch bar penetration alone can drive developers to integrate with it.

At which point it's not much different than gestures when it comes to answering your questions. What happens when you use a Logitech mouse on your macOS desktop instead of a Magic Trackpad?

Note that touch bar integration cannot have unique features, so it's never required. The challenge is to get developers to care about it which is the prerequisite for users to care about it.


From Apple's perspective, extending this to desktops kills two birds with one stone.

    1. Widespread touchbar adoption
    2. Increased sales of Apple keyboards


I don't see how you can add the Touchbar to desktops without creating a wired keyboard or adding an expensive big battery to it and jacking the price to $200.

Actually, I do: They'll raise the price of the cheapest iMac configuration and include the keyboard in the box. After seeing the starting Mac Mini prices, it makes total sense.


Probably because it can't be made to be "just an option".

Having seen teardowns of a MacBook, I'm pretty sure that one without a touchbar would be more-or-less a completely different computer - different keyboard, yes. To accomplish that, though, you'd need to also make a different housing to accommodate it, and a different motherboard, too, because this stuff's all soldered together as a single unit these days.

And for all that, people would still be griping about the keyboard and the monoport.


They sell a version without a touchbar.

https://www.apple.com/shop/buy-mac/macbook-pro


A low end 2-port version, yes.

They ignored those for the 2018 MBP refresh, probably because the new Macbook Air was coming. I'd bet on the full keyboard MBP getting discontinued now that it's out, or best case having the 2017 hang around while the touchbar models continue to see updates.


I doubt the "MacBook Escape" (13" non-touchbar) will be revved, because it will either be met by the air or an eventual revved MacBook focusing on ultraportable demographics.

The Escape has higher powered CPUs than the Air or MacBook, but I don't think anyone is buying from those three based on horsepower.


Right. But, echoing what I said, it's not "just an option" - it's a whole different model that they happen to be selling under the same name. Different CPU options, different monitors, different port configuration, etc. They don't even have the same number of microphones.


Exactly!


I don't have a problem with the Touch Bar. It's kind of handy for occasional things that are context specific and normally involve using the mouse.


I think the thing people have a problem with is the atrocious new keyboard.


It's taking me some time to get used to, I'll grant you that.

I'll be getting a 2018 Mac Mini once they're available for order here, so this machine will see much less use then, just basically if I need to be able to do something while travelling (which isn't that frequent).


Chiming in to agree that it wasn't a problem here. Though I also never understood why someone would be crazy enough to leave ESC as the default keybinding anyhow.


You'd think that Courage Incorporated would have had the gumption to officially move Escape to where Caps Lock is.


Some people use both ESC and Caps Lock all the time.

How else would you bind ESC?


Well, tbh, if it's simply a choice of which key to map to the current Caps Lock position, then Escape should be the winner, even if exiling Caps Lock to the Touch Bar would cause significant annoyance to some heavy users.

However, while users are mostly just faced with perhaps-hard choices about how to remap the given keys, the keyboard manufacturer and integrator Apple had many other options. Caps Lock is currently 2U wide on both ISO and ANSI layouts https://i.ytimg.com/vi/3tJagPz-xIw/maxresdefault.jpg , so it could fairly comfortably be split into a 1U Esc and Caps Lock. You could even give Escape the outside position: that's not a big reach, just the mirror-image of the ISO English # key and more convenient than Esc's existing position in both ANSI and ISO, and there's an arguable case for not putting Esc on too much of a hair-trigger position near the home row anyway. Alternatively you could carve a 1U key out from the right of the right Shift, which is pretty uselessly overlong on both ISO and ANSI.


Caps-Lock is a waste of a key. NeXT had it right by using Command-Shift (without any other keys) to turn on/off Caps-Lock.


I learned high-speed touch typing in the eighties, which included heavy use of the caps-lock.

There is nothing wasteful about it when you've been trained that way.


I don't think that really changed the implication. Regardless of whether he's speaking specifically about the Mac Mini, about Apple products in general, or about the entire industry; he's still saying that it's bucking the trend of "being a series of compromises and disappointments sold as innovation" and that almost everything is genuinely an improvement.


I'm going to hold on to my Mid-2015 MacBook Pro for dear life.

I'm hoping it will be good for 3 more years.


You might be alright - I'm still on a 2009 MBP 13" (4GB, SSD) and a 2010 17" (8GB, SSD); hands down the best money I ever spent.


I'm using a 17" 2010 at work with an i7, SSD upgrade, and 16Gb of RAM and its plenty fast still. At home I've got a 2010 Air that begs for more RAM yet still runs as well as when I bought it including 3+ hours of battery life depending on if I am surfing or writing. 2010-2012 was a high point in Mac laptops.


You're not alone. I actually bought one refurb from Apple this year and I plan on extending the AppleCare to the full 3 years.


My mid-2012 MacBook Pro works like a dream and has never had a problem, FWIW.


Given the use of the word "donglage" I suspect it was a somewhat snarky reference to the iPhone and the disappearing headphone jack.


It's actually a reference to the USB-C ecosystem for the laptop line; Marco and his cohosts on the Accidental Tech Podcast have burned a lot of time griping about the state of connecting things to a laptop. Not so much time on the headphone jack, which was mostly a no-op for them (since it was paired with the AirPods).


No, that quote is not out of context at all. The expectation that Apple would in some way ruin the Mac Mini in the next update is the topic of several of the first paragraphs in the article. The gist is indeed "It hasn't gotten worse! Yayy!"

Granted, that's not all of it. He is celebrating that it's gotten better, but he is also specifically celebrating that it hasn't lost many advantages.


> "Unless you used optical audio, audio input, or the SD-card reader"

Craftily hidden in a footnote for that paragraph.


They probably figured that people who want these are acceptably served by USB.


Yeah. For SD cards in particular, having the reader slot on the back has always made it useless for me.


External card readers are more flexible. You can get more options with multiple slots, or card sizes, or whatever.

Similarly with audio... I don't know if this is true with Macs too, but on PC laptops I've often found the mic quality sucks or has interference, and I started using USB inputs.


Ha. I completely forgot about the SD card reader. Probably because I plugged it in and never looked at the back again. Having an SD card reader on the back of a device just isn't practical. I've just been using a USB hub with an external reader.


The SD card reader was the first thing to break as well. And the cost of replacing it was way more than an SD-to-USB dongle.


I missed that! So not only is 'not any less useful or versatile than the outgoing Mac Mini, including the generous assortment of ports' a shockingly low bar, it's also what might charitably be termed 'rather a bit of a fib'.


It also doesn't look like it has a DisplayPort. If I wanted to hook it into my KVM, I'd need a USB-C-to-DisplayPort dongle.


It has four Thunderbolt 3 connectors, and Thunderbolt 3 carries DisplayPort 1.2.


With the old one, you would probably need a mini DP to DP dongle anyway (I did for my displayport KVM on my 2012 Mini), so this isn't really worse, but it might mean you now need a different dongle.


I'm really hoping there's a good/reliable solution for 30" Apple Cinema Display > Dual Link DVI > Mini Display Port > [INSERT PRODUCT HERE] > USB-C


From what I've heard, you'll want a USB-C to DisplayPort+USB-A adapter, and an Active DVI (with dual link) to DisplayPort adapter.


I use optical audio in a mac mini as a home theater TV driver. No thanks.


These days optical audio is dying when it comes to home theater systems, primarily due to its lack of codec support. You can't passthrough DTS-HD over optical, you can over HDMI.


I used to use optical audio but it was superseded by HDMI for home theater and USB for hifi.

edit: I'd be curious to see what people are using optical for today (I'm under the impression audio pros mainly use USB interfaces as well)


Pro audio for recording has been shifting to USB over the past few years. They were heavily on Firewire or PCI card formats before, but USB has finally gotten fast enough to do the throughput, and of course it's much cheaper and more standard.


USB has had the throughput to handle multichannel audio for decades; USB still has issues regarding device priority when it comes to the nature of its interaction with the OS.


And thus throughput is not enough. In pro audio, we can't afford dropouts or lost data, even when the CPU is under heavy load.


I'm not a pro audio guy, but I think if you want to work with any external audio device and want that connection to be noise-free, your safest bet is still optical.

I have an external DAC/amp, and I had been using optical until I recently upgraded my workstation. Now I am using USB, because the new motherboard doesn't have optical; I assumed USB would be fine, so the lack of optical wasn't something I considered with the purchase.

Unfortunately there's an awful amount of hiss with USB on this DAC. I'm not knowledgeable enough to pinpoint what's causing it, but standard remedies I've found (eliminate ground loop, try ferrite cores, get a usb filter) have not made a difference. It could be the USB interface on the DAC side, I have no idea.

While I don't _need_ an external DAC, I like using one for a few reasons: mine has a low output impedance which is good for some headphones, a convenient switch to toggle between headphone and speaker output, and a big volume knob with really smooth action. I like it. The alternative is to use the motherboard's ports; this motherboard does jack sensing such that the rear line out gets disabled if something is plugged into the front, so switching between headphones and speakers involves plugging/unplugging the headphones all the time. I also don't like adjusting volume through a tray applet.


Since it’s USB, it has to be the DAC generating the hiss. No amount of electrical interference can do anything to the digital audio going Ofer USB.


The electrical interference can't do anything to the digital audio, but interference on the USB cable can potentially be picked up by the analog amplifier circuitry in the DAC. I had a particular combination of headphone, DAC and amplifier years ago that I could hear electrical noise on when no music was playing.


Absolutely. I work as a sound engineer and have used lots of things of this type, and it's particularly easy to understand how a USB connection to the computer would wreck the sound of a cheap (sub-$1000, not professional) DAC.

Optical digital means perfect isolation from the ground plane of the computer. It's that simple. The DAC can do whatever it needs to manage its own noise levels, but you're pretty much guaranteed a huge difference from entirely decoupling the DAC from the computer's ground plane. That electrical interference can do a surprisingly enormous amount of damage to the analog circuitry of the cheap DAC, which itself is probably not very resilient at rejecting any sort of electrical interference.


But it sounds like it's the same DAC, which was hiss-free with optical. So it's not the DAC "generating" the hiss; rather, electrical "interference" is making it over to the analog side from the digital side.

So it's the DAC's fault, but maybe it's fair to say that it's harder to make a good DAC when USB is how the data is being delivered. I'm sure a high-end USB DAC doesn't have this issue, but I'm pretty impressed with my cheap off-brand optical DAC that was $11. I'm guessing I would not be so impressed with the $11 USB version.


A higher-quality USB DAC has to do quite a bit of signal clean-up, and may use its own power supply rather than trying to clean up the USB power from the computer.

Optical has the benefit of always being electrically isolated.


I thought so too, but regardless, with this same device there is no hiss when connected via optical. So if I had the option, I'd be using optical vs. USB today, which was what the GP asked.


Logically, it seems like if you had a USB-optical dongle you would have no further trouble. I don't know if there is such a thing, but I do know that in that situation all the 'it's digital, so it should be perfect' talk becomes somewhat true.

You'd be converting from USB to optical, at which point you'd break the ground connection which would be where your hiss is coming from (assuming it's still noiseless when still being used with optical). Then, your concern would be jitter and whether the added conversion is adding lots of jitter to the equation. Your DAC might (or might not) be good at rejecting jitter noise. I've got a Lavry DA10 that's exceptionally good at rejecting jitter (in crystal mode), but that's mastering grade and maybe overkill for you.

It wouldn't add literal noise, but it's also possible for the USB connection to be more jittery than a different computer making the optical connection. That's partly hardware and partly software design (controlling how the data stream is buffered, and associated things that might slightly modulate the audio data clocking). So a change in computer feeding the DAC could also substantially affect the 'sound' of the DAC, as well as the noise issue you observed.


I agree, that should work. I haven't seen USB-optical devices that weren't straight media converters though. Everything that converts USB to spdif is advertised as a DAC, so I'd have two DACs.

This isn't a huge issue for me, it's just an example of where I'd prefer to use optical. If it ends up bothering me such that I need to fix something, I'll 'fix' the motherboard.


> Unfortunately there's an awful amount of hiss with USB on this DAC.

If you used optical before, it can be reasonable to use USB->optical. A lot of USB audio interfaces already have this. I'm not making use of it, but the USB interface I have on my Mini has an optical out.


I bought a soundbar for my TV a couple of months ago. Recent Samsung model. Has optical. I believe the only other option was a 3.5mm jack.


Interesting. I bought a pretty cheap soundbar this year (from Yamaha) and I think it had optical as an option (along with 3.5mm and Bluetooth) but the main interface was HDMI-ARC.

And it works great over ARC! The TV remote's volume buttons get passed through automatically, and it turns itself off and on along with the TV. I don't even know where the soundbar's own remote is anymore.


The easy answer is: same thing I was using it for 10 and 20 years ago. Audio equipment doesn't age nearly as quickly as personal computers. I'm not going to replace my entertainment system just because there's a new Mac.

Also, I've had compatibility problems with HDMI between my old Mac Mini and my receiver. I don't know how to troubleshoot those sorts of issues. I do know that optical audio always works perfectly with every device I've ever used it with, though.


Shockingly, I need optical for a $300 set of gaming headphones I purchased within the last year (Astro A50). Maybe it’s the only way they can do pass through, dunno.


Headphones are stereo (okay, there are some exceptions), and optical S/PDIF can do stereo PCM, so there's no quality downside. But for home theater audio, you can't do any lossless multi-channel formats over optical S/PDIF, so the vast majority of enthusiasts will use HDMI.


It’s (likely) because they don’t want to add a secondary hdmi output for audio like a lot of other high end gear uses.


Never would have guessed that for headphones! I guess they have some kind of a dock?


Wireless (non-Bluetooth) Sennheiser headphones also take optical input (as well as 3.5mm). Since they're only doing stereo, and require a digital-to-analog conversion on the headphone side, makes sense to avoid an additional analog-to-digital on the transmitter side.


I'm probably in a tiny niche but I use it so that I can have optical out to my speakers and analogue out to an extension cord with a plug next to my keyboard that I plug my headphones into. This means I can swap between audio devices without touching cables or plugging things in to go headphones -> speakers depending on whether I want to annoy my wife with my music or not.


I have the same setup actually! But my speakers only take analog in so I use a cheap USB sound card.


Sonos, which might be one of the most hyped average-user audio things of the last few years, only offers optical inputs on their soundbars. IMHO it's a pretty huge limitation on Sonos's side, but maybe they did it because HDMI ARC is still flaky in many setups and they prefer ease of use. Or simply because of cost savings.


I've got a pre-ARC TV, so use optical for my home theatre. Am hell-bent against smart TVs, so my next display will probably be either a digital signage display or a projector still without ARC.


I want to smash a Sony engineer with a hammer because my goddamned $5000 smart TV keeps insisting on switching from tv audio to external audio to tv audio to external audio to tv audio to external audio.

It doesn't matter that I have my audio preference set to "prefer external audio" because the effing thing decides to switch back and forth between 2 and 30 times on startup, and maybe 80% of the time it settles (properly) on the receiver, and 20% of the time it tries to use the TV speakers, which are off, and of that 20% of the time, at least half of the time I HAVE to power everything down and start all over again. The other half of the time I can go through the menu with the slow-ass on-screen menu, and manually toggle back to receiver.

I just want an effing option that says "never in hell try to use the fucking TV speakers, for the love of god".


My (very low-end) Samsung soundbar only has optical and 3.5mm in. I think a lot of cheap equipment still uses optical.


Still pretty popular in the 2-channel audio world... audiophiles, in other words.


Somehow HDMI audio is still less convenient for me.

I send HDMI to my TV which sends optical to my sound system, and it still works pretty well. Earlier this year I bought a nice sound system with a receiver for the reasons you mentioned, but it added extra lag to my video games, so I returned it. My current sound system also supports ARC but it causes most of the same problems. Decoding those fancy codecs creates lag on every system I've tried, so sticking to LPCM over optical seems to be the only way to enjoy games still.


says the person who keeps buying products without optical audio ports.


Isn't that little headphone port a combo optical/electrical port that you can plug either headphones or a mini-TOSLINK cable into?

Isn't that what it was before? I swear I had previous Mac minis with a combo port like that... why would they not continue that?


Yes, that is precisely the thing that used to be in all Macs, and that they discontinued.


Now that I think about it, axing the optical audio ports was probably a deliberate move by Apple to nudge home TV users of mac minis into using Apple TVs, where they have way more control over the interface and content.

edit: apparently Apple TV doesn't have optical audio either.


I doubt it -- the number of users who use a Mac Mini as an entertainment center is probably so small Apple barely thinks about them. More likely is that HDMI has superseded the optical audio port for most users, so they save money by removing a port few use these days.


It's still a little weird. I think a lot of people would have a separate A/V receiver and speakers; maybe this is changing as old equipment gets replaced, but I suspect there are still far more homes with optical-capable A/V receivers than HDMI-capable ones.

This is just a standard annoying thing with Apple though. Their designs are very 'forward looking' in that they don't consider what potential customers already have so much as they do what their own future peripherals are going to need.


At this point HDMI is not a particularly new thing; my Marantz receiver from ~2007 had HDMI connections on it. Audio-only digital connections aren't necessarily going the way of the dodo yet, but they're becoming progressively more niche. So are home theater PCs, of course.

Anecdotally: I do have a Mac mini with my receiver, which I replaced this year. But the new receiver not only has USB input, it has wifi and built-in clients for Spotify, Tidal, Pandora, TuneIn, Roon, whatever protocol Windows uses for media sharing whose name I'm utterly blanking on right now, and a dozen or so other services, with Apple Airplay 2 theoretically coming in an update.


The AppleTV doesn't have optical audio either.


Ok, well, I guess that's even dumber than I thought. I hope Apple takes that $2-per-device savings on optical audio parts laughing all the way to the bank.


Don't judge, maybe the next version of the mac mini will be waterproof!


lol


Marco gotta keep those free gifts from Apple coming in


You obviously haven’t seen Marco complain about the MacBook, Apple Watch, the existing Mac Pro, the 2014 Mac Mini upgrade, etc.

Marco made a few million when Tumblr was sold, has a profitable podcast app and is on either the most popular or second most popular Apple related podcast.

I doubt that he’s hurting so much for money that he has to lie to get a loaner Mac Mini.


Alternatively, lower down the article:

> I Can’t Believe The Mac Mini Is This Awesome, I Can’t Even Say “Again” Because It Never Was

Sounds like his conclusion is that it's better than it used to be.


That's a reasonable expectation from Apple these days. Their "War on Ports" and "War on Repairability" has gotten to groan inducing levels.


I would phrase it more like "Yay it has made improvements without compromises"


The Mac Mini is a near-perfect computer in its category; the only thing was that it wasn't updated for years. Now it is.


The combination of price and the absence of a user-upgradable M.2 SSD makes this product unviable in my opinion. The current price would be fine if I could upgrade the drive later. A non-upgradeable SSD would be acceptable if the price for larger drives were more reasonable.

If the SSD could be upgraded, buying an i5/i7 would have offered amazing upgrade paths down the road. This machine misses the mark IMHO, but not by much.


You can upgrade the drive externally through TB3. There will be TB3 enclosures that exactly match the Mini's size (historically there always have been) and it won't even look weird. I did that on a previous Mini (FireWire).

Having said that, I'm still going to get one with the 512GB SSD, because I just want everything neatly in one box.


I believe there was speculation they would stop the Mac Mini line altogether, so perhaps it's more of a comment on the fact that they built a new one.


I was doing some research a year ago or so on which Mac Mini to buy. I'm currently using one from 2006 or so, which mostly can't handle HTTPS sites, but is otherwise great.

Everything I read said to get the 2012 model, not the 2014 one, which was slower, had less memory, less everything; way worse. So the last one before this new one definitely did get a lot worse.


I have a 2006 Mac Mini that refuses to die. I ended up putting Windows 7 on it and giving it to my mom. She still uses it as a secondary computer when she is tutoring.


And it has gotten a bit worse. Just not as much worse as we'd expect from Apple.


Well, it's quite a bit faster, in ways that are relevant to my interests. It stands to reason that designing a thing like this onto a bespoke board allows for some optimizations: it's hilarious that it's apparently faster than any other Mac anywhere ever, for single-core things that suit it. Including Mac Pros and iMac Pros… I just think that's amusing.

I'm now wondering if VCV Rack is sufficiently multicore that it will perform better on more expensive Mac boxes, or whether this little thing has now set the bar for ability to run demanding modular synth software live. That would be really convenient.

Building a PC mothership is certainly cool as hell and I'm not knocking it, but there's something to be said for 'this is my live performance rig, it's stable as a rock and it fits in my pocket. And it's cheap enough that if I'm headlining Coachella with my modular jams, I'll buy a second one and clone it so I have a backup, there onstage ready to be plugged in if there's a problem'.

I'm with Marco on this one. Looks nice. I know what I'm getting with a Mac of this type, and this definitely looks nice to me.


And it still has real USB-A ports, so you don't have to hop your audio interface through a C-to-A dongle thing. Though I don't know if a C-to-A adapter just reroutes wires or has circuits with their own latency; either way, it's a bit of a mess you don't need with one of these.

(Assuming you use a typical USB A audio interface rather than something more exotic)

FWIW, I also do electronic music, though just as a hobby.


Nah, my audio interface is more exotic. Thunderbolt MOTU 16 channel in and out, 192k/24. So I love that it has so many thunderbolt ports, because they're spoken for. I think you might well be able to run two MOTU 16As and have 32 in, 32 out. You'd start to have issues with drive space but the computer can probably handle it.


"more exotic" :-)

OK, hardcore studio level stuff.

Yeah, that's a bit beyond my hobby "Behringer" type stuff. Recently upgraded from 2-in/4-out to 4-in/4-out. Still plenty of bandwidth on USB 2 for that.


Worse how? There are a couple of directions you can go in, just curious (price/storage)?


Probably price. $800 is pretty steep for the entry version, when you can buy a better-equipped Lenovo ThinkCentre for $200 less.


A Lenovo ThinkCentre is like 10x bigger than a Mac Mini. I wouldn't want one of those sitting in my living room. And my wife would absolutely forbid it.

The only real competition I see for the Mac Mini is the little computers from System76. But they're about the same price.


My Lenovo ThinkCentre M710q is almost identical in size to my old (very) Mac Mini.


I just customized a Lenovo ThinkCentre with a Core i7 (I'm assuming it's the six-core), with a 128GB SSD, 8GB of RAM, Windows Pro, and a WiFi card, and the price was $979. The equivalent Mac Mini is $1099.

Are the USB C ports Thunderbolt?


Does that Lenovo have Mac OS?


Obviously it does not. And that's a pity. Whether or not that justifies a more than $200 difference is up to whoever wants to use it.


This is going to be the best outcome as Moore's Law ends.


People are missing the point about the integrated graphics. For me this is a feature, not a bug, because it has USB-C instead.

That means that instead of hopelessly compromising the thermals and power supply with some power-hungry yet limited GPU (limited because of the small form factor), you simply buy the CPU and memory that you need. It simplifies the job of keeping it cool.

CPUs are evolving a lot slower these days. This one should last you many years before it becomes a problem. Having upgradeable memory all the way to 64GB means that too is not going to be a problem any time soon. It means that a mac mini should have a serviceable life of 3-5 years or more if you are less of a power user.

Additionally, you plug in an eGPU of your choice and additional storage via USB-C. Better, when improved eGPUs become available, you can sell your old one and buy a new one without having to tear the machine apart. They also don't overtax or compromise your power supply or cooling.

I currently have an iMac 5K that is nearly 5 years old now. I maxed it out at the time with all the bells and whistles and it has served me well. So, money well spent despite the shocking price initially.

I'd totally consider spending 3-4K on a setup with a Mac mini, a decent eGPU + monitor, and external SSD storage (I actually have a Samsung 2TB T5 already). The new iMac Pro would cost more and deliver less value. What I like about this setup is that it is completely modular, and I can replace individual components without having to worry about breaking the others.

I imagine the Mac Pro next year will also emphasize expansion and upgrades through USB-C rather than internals. Basically, the old model without a dedicated GPU and with upgradable Xeon CPUs + memory would be exactly the right product right now. eGPUs can be replaced easily, and with a solid base configuration, a pro machine should have a long productive life.


That sounds like a horrible mess of wires and power strips to me. If only they could make a single case to fit all these various computer components in, and one cord to supply power to all of them.


That doesn't sound like something with "mini" in the name. Maybe an alternative product with "pro" in the name?


Sure, except the 'trashcan' Pro doesn't have room in it for anything either, it was also designed to daisychain drives etc. via TB. My very elderly 'cheesegrater' Pro of course does have lots of room for 3.5in drives and PCIe cards, but I have wrung about as much as I can out of it, so the new Mini has some appeal. I am skeptical that the new-new Pro of 2019 (assuming it materializes) will be designed with extensibility in mind...


It was a sarcastic attempt to point out that scattering the components of a computer into 3 or 4 units with separate power cords is an inferior design to the standard desktops we're used to. I want a Mac desktop for under $3,000 that doesn't require multiple accessories plugged in to behave like a desktop.


> If only they could make a single case to fit all these various computer components in

They do. It's called an iMac and comes free with an excellent monitor.

Now, seriously, unless you plan to do a lot of GPU computing (I don't) the internal Intel graphics are pretty reasonable. I can't see how a 64 gig mini would not be able to be my main machine for 5 years or more. With an updated SSD, my previous mini is a pretty good general purpose computer.


It's not so much that I want a gigantic GPU, but I'd like something that could just run two external 4K monitors without stuttering. That would make this an excellent general-purpose desktop machine. The 15" 2017 MacBook Pro does this fine, so it's hard to accept that it would be difficult to engineer something into place. Unfortunately, an eGPU is a gigantic external box costing a large chunk of change, so it's not really a good option.


With a laptop, an external GPU is indeed a big issue in terms of form factor. However, in your case, your big 4K monitors are comparatively large as well, so having an eGPU that you can also use as a docking station is not necessarily the end of the world. Likewise, with a Mac mini this should be less of an issue. I'm also guessing this market will develop over the next few years with more attractively priced options aimed at people who just want a middle-of-the-road setup that works reasonably well for games and VR. Right now all the available options are targeted at people who in any case spend too much money on HW.

Besides, with Apple you are paying a premium for a dedicated GPU in a MacBook, which makes the price of an eGPU less of an issue. Because if the choice becomes paying 500 euro extra for a Mac mini with a last-generation laptop-grade GPU, or spending the same on an eGPU with modest specs (or a bit more on something fancier), that's a lot easier to defend. Especially if you can swap it out once every other year for a new one.

I'm guessing that with the mac pro they may offer some options for a dedicated, non upgradable GPU as well as eGPU options. Most professionals would probably prefer investing in the latter because it provides them performance without much compromise and an easy way to upgrade. E.g. I'm still waiting for Apple to make a move with AR/VR, which they sort of support but which they don't really actively promote currently.

Anyway, the Mac mini is not necessarily that interesting for people looking for cheap options. The starting configurations are nice if you don't need much, but most more serious users are going to want bigger SSDs, more CPU, and more memory. That creeps into the Mac Pro's use case already. I'm guessing that in terms of price segment this thing is designed to fill the gap up to the Mac Pro, which will likely have a starting price of around 5K, just like the last one.


What's the need for an eGPU if there's enough space for a GPU in the unit?

I long for a return to the Powermac G5 'basic tower' form factor.


The MBP 15" mobile Radeon Pro is decent and does 4K well. It would only cost Apple 1mm of extra thickness and what, $60? There is no excuse.


I wish someone made MacMini-sized eGPUs and external disk drives so they could be stacked on top of each other. That'd look very good aesthetically with minimal or no mess of wires.

An ITX format video card (~7") should fit into MacMini-sized case (7.7"), not sure about PSU and heat dissipation though.


If it was the same shape as the MacMini but just taller then it might still look cool stacked.


What would you recommend for a Mac-compatible eGPU these days?


The Blackmagic eGPU is designed to be plug-and-play compatible with Macs. You can even buy it directly from Apple:

https://store.apple.com/xc/product/HM8Y2VC/A

Also, one nice thing about this eGPU is that it has two Thunderbolt ports, so you can plug a Thunderbolt display like the LG UltraFine 5K directly into it.


Well, THAT's interesting. I've been looking at some Blackmagic stuff for streaming and camera work, that would take burden off my desktop computer. It looks like the new Mini plus this plus outboard x264 encoding (also includes ability to run a good dynamic mic into the encoding) would add up to an insanely flexible livestreaming setup that could do both GPU-needing things and camera-based things without loading the computer, and be expandable later with SDI camera inputs and the ability to do production video switching.

It… didn't occur to me that I might be thinking about running such a rig off a Mac Mini. But then it didn't occur to me that a Mac Mini would come out and be in some conditions faster than any other Mac currently made. Interesting times…


I have an HP Omen Accelerator with an RX580 and a 2016 MBPr and it's plug and play in macOS 10.14. Last I checked, you could get most Nvidia GPUs working, but it required some spelunking. It may be different now.

I've been very happy with it overall. I have a 4k 28inch monitor and a 30 inch Cinema Display plugged into the GPU, and the eGPU enclosure has a sata port + ethernet port. There are some gotchas, eGPU.io is a great resource.

Also, bootcamp with windows 10 works with my eGPU, but required some work. If you want more info, let me know.


I've been using the Razer Core X for a few months now and it's worked flawlessly.


Have you tried swapping it between machines? Could I use an eGPU with say a NUC and the Mac Mini?


There are small form factor PCs with GPUs, but the exterior won't be as pretty, because there's much more thermal load to deal with.


> But the exterior won't be as pretty

There are actually plenty of nice-looking small form factor PC cases that can house a full GPU:

- http://www.louqe.com/

- https://www.dan-cases.com/dana4.php

I'd actually say this is one area of the custom PC market that's well served by nice design.


Those are 7 liters or more, or the volume of five Mac minis. So, a different class of “small”.


They are also far from nice looking.


I like that idea, but I also need a laptop, and investing in both seems overkill for my needs. But I guess with a current 6-core MacBook with 32GB + an eGPU, I could kind of do the same thing.


a) You assume that at least some GPU power in an SFF machine wouldn't

- be possible (while SFF devices like Hades Canyon et al., with e.g. Kaby Lake-G, manage to do just fine without breaking thermals or noise)

- still be immensely useful for many, or at least better than the current integrated Intel graphics (which are a significant step backwards with the dropping of Iris)

b) You assume it has to be USB-C/TB3 instead, when there's no argument against having both.

And for anyone really not needing it who argues about the unnecessary cost: just restrict it to the more expensive SKU, done.


The pricing is not really justifiable, especially given that the most common use cases for the Mac mini (home theater, NAS, backup, etc.) are better handled by other OSes.

The NUC8 with comparable specs (sans the 6-core CPU) comes in at under half the price! If you need the 6 cores, sure, but most workloads other than encoding don't, so it's questionable. Even then, the 6-core processor doesn't cost enough to make up for the difference.


What OSes are better for a "Home Theater" TV computer?? That's a serious question; I recently replaced a creaking old Mac Mini (that couldn't do 4K) with a new Dell, after going a little bicurious with respect to Windows 10 (for software development work)... and, surprisingly, it's a complete and utter shitshow.

I was astonished, actually; I had assumed Windows would be better than macOS as a TV computer, other than integration with Apple services (Apple Music, my kids photos as screensaver).

Nothing could be further from the truth. It's a shitshow. 100% of Windows media players are garbage (VLC included, and there is no Movist). They can't play high-bitrate video without stuttering (on way better hardware), they show some ungodly mishmash of scaled UI and tiny unreadable UI on a 4K TV, for each player you install (about 10, so far, for me) you have to google for an hour to make sure they aren't malware (and of course almost all of them nominally are, trying to install all sorts of insane adware shit during the install phase, although that is par for the course on Windows)...

It's been 2 months, and when I saw the new Mac Mini, despite the 200%+ markup on storage, I couldn't help but think Hmmm....

My TV also has at its disposal Xbox (OK but not great), PS4 (pretty shit), iOS (Apple TV, pretty shit), and Nintendo Switch (has no TV computer features at all, basically).

So what OSes are you talking about? Linux?? Android?


> What OSes are better for a "Home Theater" TV computer

I use an Apple TV 4K. For non-netflix/hulu/primevideo stuff, it streams off a Synology NAS (using Plex or VLC for Apple TV)

edit: it's funny how your experience pretty much mirrors mine almost 10 years ago. A Mac guy outfitting a machine for a home media center. I was used to Mac media stuff always being behind on codec support and performance so I bought some HP PC. It never worked right, always had to mess around with audio outputs (HDMI audio output worked maybe once) and codec stuff kept breaking since either Windows or the Nvidia drivers hated trying to do hardware acceleration and you just got a green screen. We mostly ended up plugging in our MacBooks...


I used to use Plex, but went back to AppleTV/iTunes.

For some reason Plex wouldn't see half of my video library. I eventually tracked it down to Plex not liking the filenames of certain TV shows.

My TV shows are named Show Name - Episode Name.

Plex wanted them to be ShowName.s0x.e0x.

I'm not going to go through and rename all of my files. Plex should just work from the metadata like iTunes does. That's what metadata is for.
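
(If anyone else is stuck here and doesn't want to rename by hand: a sketch like this can build Plex-style names from the iTunes metadata already embedded in the files. It assumes iTunes-tagged .m4v files and the third-party mutagen library; the folder name is a placeholder.)

    # Rename iTunes-tagged episodes to "Show - sXXeYY" using embedded metadata.
    # tvsh/tvsn/tves are the standard iTunes show/season/episode atoms.
    import pathlib
    from mutagen.mp4 import MP4

    for f in pathlib.Path("TV Shows").rglob("*.m4v"):
        tags = MP4(f).tags or {}
        show = tags.get("tvsh", [None])[0]
        season = tags.get("tvsn", [None])[0]
        episode = tags.get("tves", [None])[0]
        if show and season and episode:
            f.rename(f.with_name(f"{show} - s{season:02d}e{episode:02d}{f.suffix}"))
        else:
            print(f"no usable metadata, skipping: {f}")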


I think it's a philosophical difference. iTunes is more database driven (which I really liked at the time for music). Let it control your filesystem, drop in your files and it'll load from metadata. Any changes will only modify iTunes' DB. Use iTunes to decide what to sync to devices.

A lot of people were super annoyed coming from WinAmp. They organized their library by filename and the metadata was often a mess. They also wanted to treat iPods as USB drive and drop files on there.

Plex is even different in that it only looks at the filename and pulls metadata from the Internet.


Not sure if you are aware of this, but you have to have your TV shows separate from your movies in Plex. When Plex scans files, it uses different logic for TV show and movie file name. I had a similar problem where I imported my TV shows as movies... some worked, but most didn't. After moving my shows out of my movies directory and re-importing them specifically as tv shows, everything detected perfectly.


> Not sure if you are aware of this, but you have to have your TV shows separate from your movies in Plex.

Nope. Had no idea. And no idea why I should, since iTunes has no problem with it. I'm not going to spend six hours sorting 4GB of files because Plex can't read metadata that iTunes can.


We've moved over to Plex from iTunes. The iTunes Media library folder is already sorted into separate Movies and TV Shows directories, so just added those to Plex separately.

We're running the Plex app on an AppleTV (4th gen, not 4K), and it all works really well. The only issue is that it loses connection to the server maybe once every month or two, but I think that's related to my really crappy modem.

edit: Plex server is running on the 2012 Mac mini "Server" edition. Quad-core, baby!


I'll have to try this again. I put VLC on my AppleTV4, and I don't remember having luck playing things off my Synology. I'm using Kodi on my Android TV-equipped Sony to stream from the NAS, which works, but the UI is a disaster for adding sources. I finally got it to work, but I never want to have to change anything again.

For me, using Plex is an utter and complete nightmare. Anything with transcoding and library reindexing is a no-go. I've tried to be happy with it at least 6 times, and each try was an unmitigated disaster. I just want a thing to play files from storage.


Plex sucks on the Apple TV (and likely most other devices) because unless the video fits in a very narrow band of codecs (AVC, maybe HEVC with DD audio) it's going to insist on transcoding the video.

The absolute best setup I've found is Infuse (https://firecore.com/infuse) with Plex as the backend. Infuse will ALWAYS request the raw video/audio stream and does all the decoding on the Apple TV, so your Plex machine can be very modest (it's never transcoding, after all).

You can also forego Plex and just have Infuse index all the content, but I've found that it's nice to have a single Plex backend that can be shared amongst multiple Apple TVs, iPads, etc.


Which codecs do you mean? Guess I'm weird; I have only ever used h264 and now hevc, any others are only ancient lower-than-SD quality files which transcode without any issues. (Many files are .mkv; they still work without a transcode.)


Well DVDs are MPEG2, and some really old Blu-Ray releases are as well. You also see VC-1 in older releases. Probably should have been a little more specific, Plex will insist on transcoding if the client can't handle the codec natively. Generally speaking that means AVC or HEVC (on newer devices) only.
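
(If you're unsure what a given file actually carries, ffprobe will print the video codec, which tells you whether Plex will direct-play or transcode for a given client; the file name is a placeholder:)

    ffprobe -v error -select_streams v:0 \
        -show_entries stream=codec_name \
        -of default=noprint_wrappers=1:nokey=1 movie.mkv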


True, thanks. There's AV1, VP9 and related codecs too, but I've not seen many sources that only provide those and not one of the MPEG codecs as well. I've moved towards HEVC for anything I want to hang on to long-term, now that all my devices can handle it.
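
(For the curious, the re-encode itself is a one-liner with ffmpeg, assuming a build with libx265; the CRF value is an arbitrary quality/size choice, and the audio is passed through untouched:)

    ffmpeg -i input.mkv -c:v libx265 -crf 22 -c:a copy output.mkv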


My Apple TV 4 natively plays 1080p H.265 with 5.1 audio via Infuse 5 Pro, served over LAN by my Pentium Plex server with 8 gigs of RAM. Strangely, I'm sure an old version of the Plex app did the same. I think it's more an issue of how the videos are muxed and which audio codec is used.


Plex is the least nightmarish of the video library apps I've used in the last few years (Plex and Kodi are the only ones I've ended up using extensively, though I've tried out several others).

Transcoding is obviously a last resort, but I've used it on rare occasions for streaming live and recorded TV from home to the Plex app on my iPhone and it works reasonably well. For my home theater stuff I absolutely don't want any transcoding at all, and Plex is generally smart enough to figure out what formats my playback device is able to decode on its own. When I first got a Chromecast Ultra, I had to fiddle with some XML config files in Plex because Plex was only aware of the non-4k Chromecast and thus would try to transcode 4k content, but I believe that Plex resolved that issue fairly quickly.


The issues I had were that, first of all, Plex server wouldn't run on my ARM-based Synology, and then later, I think there was a build for it, but it didn't have the horsepower to transcode, and it seems like it ALWAYS wanted to transcode things.

I've also tried running Plex server on my Win7 HTPC with the volume mounted over CIFS, but I had various issues with that; for one, having to reindex the volume means I have to either wait 15 minutes to watch something I just procured, or manually login and force a refresh which is annoying when I just want to browse files. It's been awhile, but I have tried a number of approaches (including Plex server on OS X) and none of them were satisfactory. Furthermore, I could never figure out why any of it was preferable to simply having a media player read files from a network store and play them. I don't care about "album art" type stuff; I don't download/store/play hollywood movies or music, so those features would be irrelevant even if I cared about them in general.

The one nice thing about Plex is I now have a few friends sharing their libraries with me over the internet. It's a cool feature, but I'm only using it from the client side, and I typically forget to even consult their libraries before paying to rent content.


Yeah, I just have mine running on my primary desktop (an iMac), but I've done a bit of research on consumer NAS products and the Plex support definitely seems spotty.

As for Plex wanting to transcode things, I mentioned having some issues with my Chromecast Ultra, but even that wasn't too bad. There should definitely be an option to not transcode video at all and let my playback device tell me that if it really can't decode the file. As far as I could tell, that option doesn't exist.

I haven't had any issues with library refreshing with Plex on my iMac and movies shared over SMB from my Linux file server. Plex seems to watch the shares and do small incremental refreshes for new content very quickly.

> I don't care about "album art" type stuff; I don't download/store/play hollywood movies or music, so those features would be irrelevant even if I cared about them in general.

That's definitely valid. Wouldn't basically any playback device that supports DLNA (and can decode your files) work fine for that? For me, the library features are pretty important. I had done a lot of work getting Kodi on my Raspberry Pi setup how I liked it, then I switched to Plex which is just as good or better but with a lot less work and maintenance.


I haven't used any Apple stuff, but I've gone through many iterations of this over the past 15-ish years, going from MythTV to SageTV to Plex+Netflix, and using various iterations of Linux, Windows 7, 8, and 10 on everything from leftover PC part builds to the SageTV appliance to dedicated Mini-ITX HTPC systems to Raspberry Pis.

My main criteria is being able to use everything from the couch, using a single remote, and no keyboards. I was close to that with the SageTV boxes, but Google bought SageTV and it started becoming obsolete, and there was never really a good Netflix integration anyway.

I find the trouble with all PC-based stuff is switching between things and inconsistency between apps. Netflix on PC doesn't have a good way to do keyboard-based control, so remotes don't work well -- you pretty much need a mouse-style pointer. Plex works GREAT but the keyboard works differently than it does when you're using YouTube, and there's not a great way to switch. There's also all kinds of dumb problems getting multi-channel audio working with Netflix since they require some type of certified system or something.

A couple years ago I gave up on PCs and bought an Nvidia Shield, and it's basically the perfect device for me. AndroidTV is a great launcher, it supports everything I use (Plex, Netflix, Youtube, Weather, Baby cam monitor) and everything works with a single remote (I use a Logitech Harmony to control it + the TV + audio). No compatibility, rights or audio issues. And bonus: it is a Google Cast device, and also works with an OTA tuner (Silicondust HDHomerun -- though I never actually use that anymore). My TV never changes inputs: it's basically just a dumb display for the Shield, which now does 100% of content playback in my house.

It costs maybe a bit more than a nice MiniITX HTPC build, but just works (whereas an HTPC will take hours of time to get it working semi-properly).


> I was close to that with the SageTV boxes, but Google bought SageTV and it started becoming obsolete, and there was never really a good Netflix integration anyway.

Man I was pissed when Google bought and killed SageTV. It was probably the best DVR out there. Flexible in all the ways you'd want it to be. High Wife Approval Factor (MythTV was a disaster)

I guess they open sourced it a few years ago, but our family has moved on to just renting comcasts DVR. It's actually not that bad.

... That being said, the way we consume media has changed considerably since the days of SageTV. Youtube has almost completely taken over my TV watching time.


Yeah, similar feelings. MythTV and SageTV both got me entirely off non-DVR broadcast TV. The idea of 'show is on at x time' or 'show is interrupted by commercial breaks' is completely foreign to us at this point. Also, even the MythTV interface of a decade ago is superior to all the DVRs I've seen at my family's and friends' places locally: they all seem to do "live TV" first, and the DVR bit is an afterthought in a menu. To me, live TV is the exception that I rarely use, and even then, I want to go to the guide first, not just start "channel surfing".

I did spend some time trying to get the SageTV backend working with Plex, but scheduling required an awkward web app (not wife friendly) and I could never get it naming everything in a way that made it easy to navigate with Plex. Around this time, my cable company also turned off their unencrypted QAM channels so I could no longer tune them, and that was really the nail in the coffin for me. I don't pick up enough OTA channels to make it worthwhile.

When Plex released their PVR thing, since I had the gear I set it up, but honestly, I just never watch it.


Kodi is recommended if you want a media pc connected directly to your tv, but personally I prefer Plex with a Roku and the media streamed to my tv. Only negative is that it's not useful for anything besides video


It's good for music as well, and I even watch Twitch on my Roku with a custom PLEX channel. It's pretty flexible.


Plex also does Photos and Podcasts. Not that I've ever used them.


Buy a cheap NUC for the server (run Ubuntu) and use an Nvidia Shield for the front end. Works flawlessly.


Seconding the Nvidia Shield. My original Shield is still kicking all these years later, and they give frequent-ish updates.

Although recently I’ve been using my Apple TV more, I like the interface better. Still use the shield for anything I want in proper 4K with original audio.


The only downside to the NUC is that it maxes out at 32GB of RAM. I have the fastest i7 model ($1,600 USD) and it's an absolute beast. I offload my compiling and services to it and it handles everything like a champ, GPU stuff notwithstanding. I do hit the wall with some stuff at 32GB, so a 64GB option would be ideal. I would never buy a Mac to run servers, and I would never buy a NUC to replace my Mac workhorse.


> 100% of Windows media players are garbage (VLC included, and there is no Movist). They can't play high-bitrate video without stuttering

Eh? I use VLC and Media Player Classic HD on a few different machines, including a rather old i5 that only had spinning disks, and I've never had any issue with high bitrate video.

I wonder if it's something specific to your setup?


I wonder if they weren't using VLC's hardware acceleration option or something? Not that I blame them; the options screen is kind of a nightmare.


I personally use Xbox and Roku along with Plex running on a windows desktop computer which is powerful enough to do transcoding.

All of the video files reside on a local Windows machine and a cheap storage VPS in the cloud (located in NL). Both machines run Plex server and auto-transcode all content to an MP4 container (HEVC for 4K content). I force Plex on the Xbox to direct-play all content and it usually works pretty well.


>I had assumed Windows would be better than macOS as a TV computer,

You assumed correctly circa ten years ago. Unfortunately the best solution bar none was killed by Microsoft: Windows Media Center Edition. It had a killer ten foot interface and actually supported CableCard digital TV tuners. I can only assume it wasn't a big revenue maker and that the support calls were pretty high. Not to mention it never had more than lukewarm reception from cable companies which didn't appreciate the competition for set top tuner boxes.

These days if you really want to roll your own you can buy an HD Home Run [1] and let it stream to you. It doesn't install any malware but I haven't seen the UI so no clue if it's beautiful or absolute crap. Since it can stream via DLNA that probably depends on the device you stream to as much as anything. I punted and used Youtube TV which I'm about to cancel since hurricane season is over. Like the old song says "500 channels and nothing's on."

[1] https://www.silicondust.com/


I previously used a Mac Mini as an HTPC, but have switched to a separate NAS running Plex server plus a streaming box or stick as the client (currently a Synology, but anything with Docker and ideally hardware-accelerated transcoding should be good; my client is an Nvidia Shield, but the 4K Apple TV and Roku are supposedly OK too).

I like the flexibility of having them separate (upgrade separately, keep in separate rooms, etc), and having a unified “10 foot” UI I can control with a remote to access Plex, Netflix, etc.

You could easily get a 2 or 4 bay NAS + HDDs + Shield/Roku/AppleTV for less than the $800 base price of the new Mac Mini. Unless there’s a reason you need Mac OS I wouldn’t get the Mac Mini.

If you’ve already got the Dell I’d consider just using it as a server and getting a Nvidia Shield as the client.


Windows 10 with MadVR for video rendering with GPU upscaling, and a 3D LUT for colour correction.


Yeah, I regret living without Windows for one thing only: good and accurate video playback. Got close with LibreELEC, can't get anything right with macOS.


Have you tried Kodi (kodi.tv)?


Kodi is good, but I got a lot happier when I switched to Plex, and put the backend on my home server, and just used a Roku device for playback.


No! Looks very interesting, thanks! (^_^)

Will give it a shot.


Bung Kodi on a Raspberry Pi 3B+ - You'll be pleasantly surprised I think.

Downside is... 99% of raspberry pi cases are really ugly :(


It may be better with the 3, but with a Raspberry Pi 2, XBMC/Kodi is quite slow and a bit laggy.

Personally, I switched to an Intel NUC with a fanless Akasa case, and aside from the occasional Kodi crash it works relatively well.


It's not particularly great with the Pi 3 either. I have one, and I find it underpowered for the "media server" role, especially if more than one person connects. I like it for lots of other things, but this isn't one of them.


Ah, I use mine as a playback front end only, pulling media from a NAS. I guess that's a different use case really.


Just stick the Pi to the back of your TV with some self-adhesive PCB risers. No case required.


That's what I do.

Am I the only one, though, who uses the Pi with a (Synology) NAS, with MySQL (MariaDB) storing the Kodi data and NFS shares holding the content? You can pick up where you left off from your Pi, your laptop, or your phone.

I know...it's like Plex in architecture.
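
(For anyone wanting to replicate it: the shared library is just an advancedsettings.xml on each Kodi client pointing at the MySQL box. The host and credentials below are placeholders.)

    <advancedsettings>
      <videodatabase>
        <type>mysql</type>
        <host>192.168.1.10</host>
        <port>3306</port>
        <user>kodi</user>
        <pass>kodi</pass>
      </videodatabase>
    </advancedsettings>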

As for the remote, CEC will let you control Kodi from your TV's remote. Or if you have separate audio, like a soundbar or AV receiver, a Harmony remote will do.


What do you use as a remote? Also does RP+Kodi do Netflix/Amazon/HBO etc. streaming in a nice way?


Some options besides CEC over HDMI:

- Amazon FireTV remote (connects via Bluetooth)

- Logitech F710 (if you also run RetroPie, connects via USB)

https://retropie.org.uk/ is a great emulation system for several vintage consoles (80's-early 2000's).


With an RPi over HDMI you can use your TV's own remote on most TVs, if they support the CEC standard.


No, and that's the issue. It works great for your local/NAS media, and there are lots of plugins for all sorts of things, except Netflix, Amazon, XM. I still use the TV apps for that.


There’s a Netflix plugin that uses the directory structure on Kodi for v18.


Thanks, I have seen and tried it. It’s still not GA and quite buggy. Not at a level I can hand over to my family for use.


Wife and I use the Kore app. Less likely to forget where my phone is than a dedicated remote. Handy to have the youtube plugin and share a video to kodi from that app too.


Kodi supports network remotes among other types. You can just install a kodi remote app on your phone.


> 100% of Windows media players are garbage.

You might want to take a look at Daum Pot Player.

I can't bear to use VLC anymore. It is just that much better.

I haven't tried it for your use case though.


Since the GP also mentioned NAS and backup, I would presume they're definitely talking about some *nix, yes.

I'm not sure if Linux would make for a feature-for-feature better home theatre OS than macOS, but it would certainly be better than Windows, and better value than a Mac.


I used Linux Mint on a Lenovo Thinkcentre micro-PC for a while, at 2x scaling on a 1080p TV. Now I've changed over to KDE Neon (because the KDE UI is more flexible) at around 2.5x scaling, and it's a bit more comfortable. The main input device is a Logitech wireless keyboard with a touchpad. I mostly play videos through SMPlayer/VLC, so it doesn't need anything fancy.

A dedicated 10-foot UI would maybe work better, but as a low-cost low-effort solution, this works great. I got the PC for free (somebody was throwing it out at work) and the keyboard plus a DP to HDMI cable was $35 total. It plays 1080p x265 just fine.


Have you considered any of the ELEC releases? These are Kodi on minimal Linux with a diverse plugin ecosystem, with Docker support. I am using LibreELEC. There is a live option in the installer to give it a whirl.


I have looked at them, but not seriously. Maybe it's time to give it a whirl, thanks :-)


I think the GP uses the Mac Mini as a "hack" to be able to back up the NAS cheaply with Backblaze's unlimited personal desktop backup offering (there's no official Linux client, as far as I know):

"Get peace of mind knowing your files are backed up securely in the cloud. Backup your Mac or PC just $5/month."

And then uses their GUI program on macOS to sync that NAS storage folder.


What is your beef with VLC? It’s the only player that has yet to fail me.


Comparing vlc to a home theater setup is laughable at best.

Can vlc scrape Metadata and download art? Can I browse my catalog through vlc? Does it work with my remote? How hard is it to get to the next episode when watching a season?

Vlc is great, but it's just a player.


> Can vlc scrape Metadata and download art?

yes

> Can I browse my catalog through vlc?

yes

> Does it work with my remote?

If you've got a remote that connects to your PC, yes. There are quite a few of these around, especially if you buy a microserver or similar.

> How hard is it to get to the next episode when watching a season?

Pretty easy... actually, it's IMO way easier than anywhere else, as you can always use a wireless keyboard/mouse and just jump around.

VLC can even cast to a Chromecast device and serve its content to TVs if you want that.

Most of that needs to be enabled through the settings, though.


VLC has a lot of features, but it doesn't offer a 10 foot interface at all. Expecting to use the desktop interface in a home theater setup is ridiculous.


I think Kodi is closer to what you're looking for


I also use a Synology. And a Raspberry Pi as the client.


Xbox, PS4, Android with an Nvidia Shield.


Why do you need a PC at all? A TV can play movies directly from a NAS.


A couple of years ago I would have argued with you, but we recently added a WebOS TV to the household: 4K HDR+ Netflix and Amazon Prime, perfect playback of seemingly everything on the NAS box (which is just a hard drive hanging off the router), all effortlessly and with zero mental effort or maintenance.

To reiterate what I said elsewhere, this Mac Mini is meant to be a desktop, not a media device (or some sort of perverse NAS). It's being pushed into the home theater niche by people who probably have a big trashcan Mac Pro at their desk and try to justify this lesser device. But it's an extremely competent legitimate PC for most people.


I don’t trust my TV enough to be on the network.


Yes. I recall reading the user agreement for our smart TV. It recommended we not say sensitive things in front of it. We haven’t even upgraded the firmware.

I maintain that if LG or Sony offered a “Pro” version of their OLED that had no apps whatsoever and no internet connectivity I would pay a premium.


Yes, this precisely. Smart TVs of all flavors as well as Roku boxes voraciously mine your viewing habits, connected devices, network layout, mic input, and practically anything else they can get their dirty little hands on. They’re usually running out of date OSes and are full of holes to boot, making you a potential vector for smart device botnets.

Smart TVs and Roku boxes simply cannot be trusted. I have a brand new, relatively expensive Sony smart TV and while its Android TV support is more than adequate I don’t use it at all and keep it off the internet at all times. Instead, I use an Apple TV 4K.


> Instead, I use an Apple TV 4K

Which is what I'd do, but unfortunately Google insists on using its own video codec for 4K instead of H.265. Meaning your 4K Apple TV can't play YouTube videos at 4K. My smart TV can, though... We use the Apple TV for everything else.


Yeah, I thought Media PCs were a thing in the 00's. But it's 2018 now. Why would you have a Media PC?


I personally use Chromecasts, but HTPCs are still a thing, and for multiple reasons:

- gaming (console "alternative", emulators, etc.)

- streaming a personal library of media files (e.g. through Plex)

- it's way more versatile than the usually pretty terrible SmartTV onboard software in terms of what content you can access - like Amazon Video, which only streams to their own crap stick

Media PCs never stopped working, and they're still better at some things. They're just less "convenient" in some ways, and require more investment, both in time and money, which is why some people switched to streaming boxes or simply to their TV's software.


>it's way more versatile than the usually pretty terrible SmartTV onboard software in terms of what content you can access - like Amazon Video that only streams to their own crap stick.

The sad thing is: this has reversed. I gave up trying to watch Netflix and Amazon Video from my HTPC. The former at least has an app and now even supports all formats (Dolby Vision 4K, Atmos), but the controls are terrible with a remote. The latter doesn't even have an app, and all you get is HD + stereo. And don't get me started on Blu-ray.


Because it's 2018 and media format support in smart TVs is still garbage. I have relatively new Samsung and Sony TVs, and they are both really finicky about what they support - various combinations of container format, video codec, audio codec and bitrates work on only one or the other, and plenty works on neither (including 10-bit video).


Same thing with a media PC. You could get some transcoder on your NAS (Plex?). Newer TVs support newer formats. Last year's LG OLED supports everything I could throw at it.


> A TV can play movies directly from a NAS.

An iPad can't, though. And NASes don't generally support transcoding.

A DLNA compatible media server can stream content to any device on the network.


Many NAS have that built in. My Synology can stream to an iPad with VLC just fine.


I went through this when I built a Win7 HTPC perhaps 5 or 6 years ago. I was so happy to no longer have to use Connect360 on my xbox 360 to stream transcoded video from my Mac Pro (that I had to power on to run the whole thing). Finally I'd be able to just play files directly from my NAS into the endless world of Windows media support. No more transcoding from whatever stupid codec/container/disaster the 'scene' is using these days. It will Just Work because all of these people use Windows and it plays everything, right?

Nope. Unmitigated disaster. Same thing, navigating endless spy/mal/adware laden sites to download CCCP codec packages and plug things together, every time I installed one thing I found another thing that didn't work right. I have some weird Jockersoft thing to force some other application to keep running and restart if it fails, and even though I no longer use whatever that thing was, I can't uninstall the goddamned jockersoft thing to save my life.

Windows Media Center itself was actually a joy, though. I really enjoyed how well it worked for the things it could do out of the box, which were far fewer than I had been led to believe.

I also realized that a decade+ of using OS X had left me unprepared to navigate the latest online threats in the world of not-entirely-legitimate download sites for Windows utilities. It got really bad while I was away.


I would say most uses of the mac mini are simply as a desktop computer. It's what I use as my primary OSX workstation -- I have an old late 2012 model, upgraded with 16GB and an external SSD that I made the primary -- and it works fantastic. I plan on getting the new model.

Two monitors. Monitors that are exactly what I want.


Does the 2012 model work for iOS programming? I'm writing in React Native; hopefully I only ever need to hit compile and otherwise never have to use an Apple product.


The i7 (3615QM) in it holds up extremely well. The storage was catastrophically slow but the system allowed me to effortlessly migrate over to an external SSD (then using the internal storage just for time machine backups).

Finally considering upgrading as it has been a while.

The low power consumption of the thing is one thing that always just blows me away. No noise, at all. My UPS seems to last forever when the power does go out.


I have the "Server" model which came with 2x1TB 2.5" drives. Swapped one of them out for a SSD: ample storage without having to add anything externally.


Yes, you can compile once, but as soon as you add new native modules, you have to recompile the whole app. Otherwise you can basically use any other OS to write RN code. If you don't plan to add new native modules, you can compile/deploy your app once from a Mac, then use CodePush to remotely update the JS, which can be done from any other OS.
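
(The release step itself is roughly one standard code-push CLI call that runs on any OS; the app name below is a placeholder:)

    code-push release-react MyApp-iOS ios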


Yes but it’s the oldest model supported by Mojave, so consider getting a used 2014 maybe if you want it to be relevant for a few more OS releases.


An Audi SUV's price isn't justifiable given almost every use case for a consumer automobile, but a lot of people enjoy the experience and find it worth paying tens of thousands of dollars more than a Camry. By comparison, if you work on a computer for several hours every day and prefer the ergonomics of a Mac, spending an extra $1,000 is a pretty minuscule premium.

And I say that as someone that happily switched from a Mac to a PC I built a few years ago - I like getting the maximum grunt for my money.


The NUC8, like pretty much every small form factor PC, uses a mobile CPU, whereas the new mini uses desktop CPUs. You have to look at the top-model NUC8 with the i7 to outperform the entry-level Mac mini with the i3, and then the price advantage disappears.

That CPU power comes in handy for most of the mini's use cases: transcoding for home theater / NAS, or build times for a build server.


>The NUC8 with comparable specs (sans 6 core CPU) comes at under half the price!

I'm actually very interested in these - could you point me at some? I've been interested in getting a NUC8 machine, but the cheapest I can find one for (granted, in the UK) is £750 before RAM & SSD... and I have been seriously considering the new Mac Mini as a result because the final price doesn't seem like it'll be much different.

edit: found some on SimplyNUC.co.uk starting at £650 (inc VAT), 8GB of RAM and 128GB SSD, awesome! https://simplynuc.co.uk/8i7beh-full/


I have a NUC7 (NUC7i5BNH) in a silent Akasa Newton case. It's great, I love having a completely silent computer.

I tested the original case, and it was fairly noisy, like a busy laptop. I don't recommend it.

You may find it slightly cheaper if you install the RAM and SSD yourself. This would be easy with the normal case. Installing the silent case was more tricky, since it requires using thermal paste to connect the case to the CPU.


That's very good to know, thanks!


Doesn't matter to everyone, but something to keep in mind is that Mac Minis have their power supply inside the unit itself, while NUCs need those awful wall warts.


You might want to look at some Chinese mini/industrial PCs

Same 8GB/128 GB, but probably lower CPU for £307: https://www.amazon.co.uk/dp/B077SLPZ63

There are many different configurations on Amazon/Ebay

http://www.inctel.com.cn/product/MiniPC.html


Yeah, the model you linked has a i3-5005U - a very large step down from a i7-8559U. Still, could be sufficient depending on what you need it for.


Bear in mind the i7 model in the article is $2,499!


In case it wasn't clear, user 21 was responding to user garblegarble with alternatives to the Intel NUC8i7BEH (~$484 in the US), not the Mac Mini.

But yes, the decked-out i7 Mac Mini is really expensive! As noted elsewhere in the comments (and in the article), a big chunk of that cost is in Apple's 1TB SSD pricing - ticking that option adds $800 over the 128GB SSD for some reason.


I have the 5i5 (Skylake) NUC running as a HTPC, and it's pretty nice. At the time I think I paid about $600 with the RAM and an SSD. I use it for the typical things, playing music and movies, but also for some lean-back computing on the couch when I don't want to sit at my desk to do email or whatever.

A quick glance at Amazon (US) says that a 7i5 NUC with 8GB of RAM and a 250GB SSD is $540.


If you’re bought in to the Apple ecosystem, you can get a good HTPC by buying a 5+ year old Mac Mini from Craigslist. I’ve seen em for around $100, and unlike an AppleTV, you can run whatever additional Mac software you want. My 2010 is running great, recently upgraded to a SSD, and thanks to a kind soul here on HN pointing me to a link, can even run Mojave even though it’s technically unsupported.


If you think the hardware and cost is overkill for "most use cases" perhaps the problem is your assumption of what "most use cases" are.


There is a very specific use case for this: Apple ecosystem integration.

- Home theater -> iTunes movies play

- NAS -> iCloud clone, with eventual automation running

- Backup -> iCloud / Dropbox / Google Drive backup

- bonus: Siri, Xcode builds, local + third-party remote iOS backup

I have a Raspberry Pi setup handling the above for Google + Dropbox. The day I want to switch to an Apple-centric system, I'd move to the Mac mini in a heartbeat.


> The pricing is not really justifiable especially given the most use cases of the Mac mini (Home Theater, NAS, Backup etc.) are better handled by other OSes.

I don't know about "most", but pros who want quiet, small, and fast (read: musicians, audio engineers) will be all over this Mini.


And hopefully it is not just me, but it feels so much better to have a solution based on "open" hardware (i.e. PC-based) and software (e.g. Linux), rather than supporting Apple's lock-in model.


It’s hilarious how often this exact comment gets made in every single new apple product thread everywhere.


This exact comment is specific to this reviewer's use case for his Mac Mini; it can't be made in every Apple product thread, since other Apple products are used for things other than home media centres.


And it's pointing out the truly obvious as well...


Well, the reviewer seemed to go out of his way to claim the price was justified. I was pointing out that it isn't, not that the price is high, which it undeniably is.


Meh… the things I'm looking to do require some fancy outboard and don't ask all that much of the host computer. I've been looking at getting a fairly spiffy newer iMac or medium-old Mac Pro to hook the fancy outboard to (things like video production gear that'll take SDI inputs, external hardware x264 encoding, my MOTU 16A multichannel interface).

It's looking very much like all that can be done just as well off the back of this tiny computer that's less than $1000. I was expecting the computer side to have to be more than $2000 to properly handle the fancy outboard. What with the Mini doing certain single-core processing tasks faster than ANY other Mac, and even having some RAM upgradeability (?!?) there is no question that it'd be able to work with the gear I mean.

I don't need my host computer to be a quarter the cost of all the outboard gear that plugs into it. It'll be roughly the same cost as each of the pieces of outboard gear. That doesn't seem too expensive at all.


I really don't like when reviewers test the high end configuration only. This makes the review pretty useless for most of us who may be considering a lower spec.


I agree with you, but somehow i feel like Marco gets a pass - he's not really trying to run a tech review site to give this thing an objective score, it's his personal blog and his opinion and this is the model he finds interesting.

I know it's probably not fair to give him a pass on reviewing the top-end, because really it is a review and should be held to the same standard as other reviews, but something about his tone makes me not care so much.


He said specifically that it's the model Apple decided to send him for review, so he had no choice.


Well, it was an early Apple-provided review unit.

But more than that, Marco Arment actually uses high-end rigs in his daily work and describes pushing them to the limit. It is highly appropriate for Marco to review a Mac Mini with high-end specs, and I find it informative.


Here’s a review of the entry-level model: https://iphone.appleinsider.com/articles/18/11/06/mac-mini-2...

It outperforms every previous Mac mini, and doesn't benchmark that far below the 2018 i7 MacBook Pro. This is a desktop i3; the desktop-class CPUs are in a different performance class from the mobile line.


Car manufacturers rarely send their low-end models out for review, either.


and Ferrari sends tuned versions of the car


Some car companies not only send out tuned versions of the cars, but a team to tune the car during the review.

You see this on The Grand Tour all the time.


Basically all performance car manufacturers send out cars to journalists with a higher power setting, top tier tires, optimized suspension settings, etc. But journalists know that.


He did test one thing to match a lower end configuration:

>Interestingly, I disabled Turbo Boost to simulate the base i3 model’s thermals, and couldn’t get the fan to spin up audibly, no matter what I did. Those who prioritize silence under heavy loads should probably stick with the i3.


This is a pretty questionable methodology. I don't think it teaches us much about lower end configs.


What would you want to see tested in the i3 8GB model?


Test how it performs? Maybe some benchmarks? Or even compare it to the high end model?


Macs aside - this video clearly shows the wonderful experience you can get when an audio/video producer cares about audio quality. Spoken word recorded with clear audio is so lovely.

I listen to podcasts and audiobooks on a daily basis and I really really appreciate someone putting the effort into this. Thanks!


I had a hard time watching him talk because the over dubbed audio is completely out of sync. It's really bad right around the 1 minute mark. I ended up minimizing the window and just listening.

I honestly don't care for this audio, it's crystal clear and perfect for a podcast but the video is of him sitting in front of a window with a dog and the absence of any environmental noise is odd.


"Yeah, sorry about the audio sync. I was fighting it for the entire edit, and ran out of time to really nail it.

Next time, I’ll pipe the audio directly into the camera instead of relying on manual syncing during post, which had tons of drift."

https://twitter.com/marcoarment/status/1059869643928670208


Thank you for confirming that I'm not insane. :)


Agreed. It's drifting around and rarely in sync.


Watching on iOS I had no sync problems. Maybe it’s a transcoding problem caused by YouTube? At 1:25 you can clearly hear how when he turns away from the camera the audio changes as well. I’d say the audio is recorded live, not dubbed.


I don't think it's dubbed. Just out of sync.


He makes Overcast, the best iOS Podcasting app. Many good features to handle all the fiddly edge cases. Almost no complaints[0] and he keeps making it better.

[0] The one thing I wish he'd add would be the ability to limit concurrent downloads so at least one of them finishes first. I routinely get bad wifi, and seeing 8 downloads at 55% is maddening.


The only complaint I have is that the UI is ugly, ugly, ugly, and has so much wasted space. The previous version had a much better UI.


I do agree with the wasted-space part, but why would you call it ugly? Also, I don't find the UI intuitive at all; many actions, like going to the list of episodes of a podcast, make you kind of start from the beginning. But ugly? No. In fact I find it pretty clean.


It's not an eye sore, but I say "ugly" because I don't like the organization, everything is misplaced in my opinion. Also thick lines and rounded corners galore.


Anyone looking for crystal clear vocals should do exactly what NPR does: Stick a Neumann U87 six inches from the mouth with the bass rolled off and use a P-pop and Bob's your uncle.

Money will be an issue with the U87, so you can look for good clones; there are a few on the market.


Boy oh boy, I can't stand that sound. Must be years of being trapped in other kids' parents' cars, who were educated and listened to NPR. I just wanted to listen to Big Douche and the Boys talk over the intro to Enter Sandman, like my high school dropout parents would.


What does it mean to have the bass "rolled off", and what is a p-pop?


The mic has a switch on it that imposes a hardware high-pass filter to counterbalance the proximity effect you get from being close to it. Since the mic is a cardioid mic (directional sound pickup) it would normally pick up a great deal of extra bass due to the geometry of the cardioid capsule (radio announcer sound, big and deep and bassy). The switch lets you take away that bassiness and the pop filter prevents the mic from going 'THUMP' when you say 'P' into it, and the combination of these things gives you that super-present, 'NPR interview' sound.


Most studio mics have a setting to taper off the bass end of the spectrum. A pop filter is a grate/net that sits between the mic and mouth to stop letters like ‘p’ sounding like a blast due to the air flow.


I'd guess he means a pop filter.


There's a running joke on ATP that Marco buys just tons of mics. Not to mention he made Overcast, so he knows audio pretty well.


I think he just buys tons of everything.


And then he sells them


A $2500 desktop computer with integrated graphics?

Honestly, it feels like a bit of a joke.


It starts at $800, and you can instantly save $600 even for this top-of-the-line model by installing RAM yourself (it's user-upgradable again).

The model & price reviewed here is the "let's not stop people from throwing money at us" option.


The RAM is upgradeable on a Mac? Who put his neck on the line to get that option into the final product? That man is a hero.


Wow, didn't see that coming.


You can't upgrade the i3, or the measly 128GB SSD they bundle with it, though. For $800, nearly every other option is better.


It has TB3. If you need more storage you can add it externally.


Why can't you upgrade the i3?


It's soldered to the motherboard.


How much graphics processing power does your text editor demand? How about your build server?

Apple made it pretty clear in their keynote that they're focusing this product on the 'pro' market, which gives zero fucks about a whiz-bang graphics card.


My text editor needs the processing power of my $100 ThinkPad, and that's where I'll be using my text editor.

What is it that justifies the $2500 price?


6-core latest CPU, 32GB RAM, 1TB SSD, plus “apple tax” (design, thermals, build/hw quality, OS, support).

You can build a similarly spec’d PC for maybe $1500, with a better GPU, but 20x as large, not silent, higher power consumption, etc. Many trade-offs, not for everyone.


Or it'll be right around $1250, the same size or smaller, and still silent:

https://www.amazon.com/NUC7i7BNH-Dual-Core-i7-7567U-Bluetoot...


That's a laptop CPU. The top-end Mac Mini he's using is a 65W TDP i7-8700. Pretty large difference in performance.


With 1/2 the max RAM, 1/3 the CPU cores (and from a previous generation), 1/4 the TB3 ports, no NVMe, no 10GbE option. But sure. "The same".


It would be $3500 with GPU, so I'm glad.

Don't need GPU for typical coding.


A huge chunk of that is the SSD. It's actually very reasonably priced, especially considering that you get a Mac ecosystem along with the computer.


Reasonably priced because of the SSD? You know a 1TB NVMe PCIe M.2 drive can be had for under $250, right? How Apple justifies $800 for this single part is beyond baffling.

https://www.amazon.com/dp/B07BN217QG/?coliid=I2D5ZX13M1AMZ7&...


Yeah, and that's one of the best consumer models available! I believe the WD Black 1TB is in the same league for performance and price as well.


The prices for those SSD upgrades are more than double those of current retail pro-level SSDs of the same capacity and similar speed. And that's even though Apple saves on the SSD controller, gets better NAND pricing, and pays no retail or distributor margins. I expected the Apple SSD upgrades to be expensive, but these price levels are simply ridiculous.

The DRAM is overpriced too, but at least I can upgrade that myself to 64GB.


I bought a 1 TB Samsung EVO 860 one month ago for a little more than 170 Euro. The SSD could be a huge chunk of the manufacturing cost but that means that Apple enjoys a huge margin on this computer.


Have you looked at SSD prices lately!?


I'm with you. A very high-end 1TB M.2 SSD can be had for under $400, with good ones being under $200.


The pretty-high-end 1TB Samsung 970 EVO has dropped as low as $228 recently from multiple vendors.

https://slickdeals.net/f/12192367-1tb-samsung-970-evo-nvme-m...

If you really need the very-high-end, then yeah, the 970 PRO is just shy of $400.

For cheaper stuff, ADATA's 960GB was recently down around $180.

SSD prices have really dropped the last few months.


TBH - Do you even need 1TB of storage on your desktop? If your image/music library is that large, you'd have it on a NAS. Maybe if you've got a big batch of VMs for some reason...


> considering that you get a Mac ecosystem

People consider this a good thing?

EDIT: To clarify, the cost of their proprietary devices means spending a bit extra. Every other company uses USB-C now.


I have a Windows 10 computer for gaming and oh my god am I glad I don't have to use that for anything productive.

But hey, at least it's the year of desktop linux!


With Steam Proton you could call this the year of desktop linux gaming!


Desktop linux is actually coming now, who'da thunk it would be as a trojan horse riding in via Windows.


As an aside: I do not like desktop Linux or anyone who defends this awful status quo.

It's a desktop; the user should not have to type anything, ever, ever, ever. The community is toxic and harmful to the future of Linux by telling everyone that the terminal is okay for the average user.

I love my Ubuntu server; it's the greatest thing I've ever used. No problems, ever.


Yeah, turns out OS X is popular. Who knew?


Get the 3.2GHz 6-core 8th-generation Intel Core i7 model with 8GB, sell the 8GB sticks, get 32GB for $270 and an external 1TB SSD for $160, and you'll have a very decent dev box for $1,730 minus whatever the 8GB goes for.


You could add an eGPU over TB3.


Lack of GPU wasn't OP's gripe.


How else would you interpret the comment? The only thing they mention is the lack of GPU, so I'm curious why you would say that.

In any case the price is due to the SSD and RAM upgrades. You can get the i7 with 8gb RAM and 256gb SSD for $1300, which is still pricey but what do you expect from Apple...


An eGPU is going to add $300 for an enclosure and $500 or whatever for the GPU itself, so even with that it gives you a pro desktop option priced cheaper than a Mac Pro.

Base price on the Mac Pro has been $3000 even before the trashcan version. With the pattern in all of Apple's recent products, the next one might be more.


You would interpret it as saying that it is overpriced. A high-end GPU at the same price might make sense, but paying that much for RAM and an SSD is nonsense.


I interpreted it as saying it's overpriced, with the proviso that it might not be if there were a (higher-end) GPU inside.


With the availability of eGPUs, and Apple even officially supporting and selling them, it's like complaining about a barebone being a barebone.


Ugh, all the comments about how it's overpriced in this thread.

"This is crazy! You can just build your own for half the price!"

Snooze. You could have made this comment about most of Apple's products for decades. Some people apparently just don't get it, so here it is from my perspective:

As long as Apple does what I need, I'm not going to buy anything else. I'm definitely not going to build my own PC to save a few bucks. I recognize that not everyone is in this position, but my time is valuable and the difference between "click buy" and "research and buy a ton of parts and assemble them and install software and drivers and blah blah blah" is worth thousands of dollars to me.

I rely on a Macbook Pro for work every day. I generally keep them for 2-3 years, and given my preferences, work setup, software, and how much money I make with these things, I'd pay triple (or more) what Apple currently charges for them vs. getting some shitty Dell or Lenovo and hassling with Windows or Linux. That's not to say everyone should feel that way, but that's my situation, and I'm not alone.


100% this, but for people who don't make money from using the computer I can see how the high prices are a real problem


But Apple's strategy has never been to go after the price-sensitive market. Complaining that their prices are way higher than commodity parts you can piece together yourself makes zero sense. That's like complaining that a Ferrari is way more expensive than a Honda.


Yep, good point. I read someone online commenting that Apple's "price-sensitive" option is second-hand devices, as they (historically) remain updated and work well for a long time. Hopefully the new 3rd-generation keyboard will stand up to years of use - I'm not at all confident that my 2017 will have much resale value due to the keyboard issues.


Resale value on my 6 month-old MBP 2017 was pretty rough.


The review doesn't tell you that you cannot upgrade or replace your SSD.


Or boot Linux, because the T2 chip prevents it[1].

[1] https://www.phoronix.com/scan.php?page=news_item&px=Apple-T2...


Why would you buy this to run Linux?


Well, I certainly wouldn't, but in four to five years if I need a general purpose server, it'd be nice to install Linux.


That turned out to be rubbish. You can easily disable secure boot.


From the link:

> Update 2: It looks like even if disabling the Secure Boot functionality, the T2 chip is reportedly still blocking operating systems aside from macOS and Windows 10.

That seems to contradict your claim, and I haven't seen that refuted elsewhere.


Yes, you can disable Secure Boot, but Linux still isn't allowed to access the NVMe device.


I will never again buy a laptop or computer where I can’t replace the SSD myself.

Last time I had a problem with my MacBook Pro and took it in to get fixed. They wanted me to write down my login password before they would accept it!

I had to reschedule and wiped my disk before bringing it back again.


Another way of looking at that is that Macs are so secure even Apple can't get into them without your password.


So if the motherboard dies, all your data is gone?


You should use backups anyway.

But if the computer dies and the motherboard survives, you can connect to some thingy on the board to get the data.


You can, or Apple can for a fee?


Apple products are anti-consumer and anti-developer.

I consider using Apple products on par with using Keurig; it's bad capitalism.


Which also means the entire thing has to be thrown out when the SSD eventually dies. Sure it's going to take a while, but it will happen. Yay consumerism, I guess.


In late 2014 I replaced the disks in my 2008 Mac Pro at work and my 2009 Mac Pro at home with Samsung EVO 840 and 850 SSD disks. I kept track of accumulated writes.

Both machines were heavily used for development and consumer type stuff. No big data stuff or big media stuff.

Samsung rates these things at 150 TB write endurance for the 850 and something around 120 TB for the 840.

My projection based on that usage is that it will take over 35 years to reach 100 TB.

Samsung's ratings are quite conservative. Reviewers that have put these things through write torture tests to the point of error have typically gotten several times Samsung's rating - around 170 years' worth of writes at my usage rate.
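As a back-of-the-envelope sketch of that projection (in Python; the ~2.9 TB/year write rate is inferred from the 35-years-to-100-TB figure above, so treat both numbers as rough assumptions):

    # Years until an SSD reaches its rated write endurance, at a steady write rate
    rated_endurance_tb = 150        # Samsung's rating for the 850 EVO
    writes_per_year_tb = 100 / 35   # ~2.9 TB/year, inferred from the projection above

    years = rated_endurance_tb / writes_per_year_tb
    print(f"~{years:.0f} years to hit the rated {rated_endurance_tb} TB")  # ~52 years

Torture-test results of several times the rating push that out correspondingly - hence the ~170-year figure.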

I haven't seen numbers for Apple's durability, but if it is within even distant sight of Samsung's EVO performance you can expect to have retired the computer for other reasons long before the SSD dies.

I'd expect that the only reason the SSD might lead to getting a new computer is that you want a bigger SSD, but even that might not be an issue because Thunderbolt 3 gives pretty good performance for external disks.

My Mac Pros have both been retired, replaced with a 2017 iMac (one iMac could replace both office and work computer because we switched to working at home). My Samsung EVOs are now on the iMac in an AKiTiO Thunder3 Quad Mini enclosure.

The internal SSD in the iMac gets just short of 2000 MB/s for both read and write, according to Disk Speed Test from Blackmagic Design. The Samsung EVOs get around 465 MB/s write, 520 MB/s read, so 1/4 the internal disk but still fast enough for most purposes.

Note that the Samsungs are SATA drives. Something that made more direct use of TB3 could probably go much faster.


I bought a 1TB Samsung 840 Evo in September 2013, and it just died last month. So it was in operation for around 5 years.

I used it for development type stuff. It outlasted the 2011 MBP I bought it for. In fact, I ran it on 3 different computers during its life span.

Overall, I'm pretty happy that it got the 5 years. With spindles, I prefer to replace them every 2 years to avoid headaches.


Technically not. You can boot from an external thunderbolt SSD, which is capable of the same speeds as an internal drive. It’s mildly ugly, but not a showstopper.

But I thought that modern SSDs were supposed to last decades, even under heavy use. Is this actually a real problem?


Does anyone know if TRIM is supported using an external SSD over thunderbolt?

I remember reading it’s not supported over USB3.


My wife is using her iMac with an external SSD via USB3 since the HDD is too slow. She can even boot her MacBook Air from the macOS installation on that same SSD and keep working.


This article makes me happy - it shows that Apple is listening. I'm super excited for the next MacBook Pro, and I really hope we get something that is better, more reliable, and more practical. I'll be thrilled if they can find their groove again for the MacBook Pro.


I'm in the market for a new machine, but unfortunately that GPU is too anaemic to make sense. It's a shame, because otherwise it looks like a pretty great proposition. An eGPU is an option, but at £500+ and twice the size of the machine, not a very attractive one.

I guess the options are throwing down the cash for a 15" MacBook Pro, or waiting to see what they come up with for the Mac Pro next year…


There are some "puck-sized" eGPUs, somewhere between the size of an Apple TV and a Mac Mini, with chips comparable to the high-end MacBook Pro options.


I have the same issue, I don't understand why they are able to fit a Vega in the 15" MBP but not in the larger Mac Mini.

15" MBP is not an option for me because of the touchbar and keyboard, so I am basically out of options.


The mini has a 65w tdp cpu, vs the mbp’s 45w cpu. That means more heat to dump from a smaller surface area. I’m not surprised they couldn’t fit a warmer gpu in the thermal profile.


Yea, it could at least have been an option. But my guess is they didn't want too many SKUs, and wanted to hit a price-point of less than $1000.


Also they need to preserve a market for the "modular" Mac Pro next year


In a world of 5K displays and Apple's insistence on smoothness, this GPU just won't work.


There isn't a "world of 5k displays" unfortunately, only Apple's high-end all-in-ones offer this option.


Supposedly Apple is planning an 8k display to go with the modular Mac Pro.

I suspect if they release a new display it will somehow be beholden to graphics capability of new Apple machines, possibly requiring A chips or their ilk. https://daringfireball.net/2017/04/the_mac_pro_lives


Philips 275P4VYKEB, but yeah, just one monitor.

Edit: Remembered the Iiyama ProLite XB2779QQS, but it uses 6-bit color.


Appears to be discontinued along with the Dell 5k. Only current option is the LG.


There's also an LG one.


Is that really an issue with Intel's integrated GPUs? Honestly asking.

I'm considering a Mac Mini, but there's no way I'm going back to low PPI displays and the lack of affordable 5K / 27" options concerns me (in my opinion 4K/UHD at 27" is a borderline unusable combination with its effective resolution of 1080p - feel free to convince me otherwise).

If those few 5K options can't be driven smoothly by an Intel GPU, it's settled and I'm waiting for an iMac update instead. Ideally I'd like a multi monitor setup though.


I have an iMac 5K and I need to scale things up a little bit for my (tired) eyes.

For me personally 4K 27'' would be better. At 163 PPI it would be worse than the 218 PPI of the 5K display, but it would not be a deal breaker if the color quality was the same.
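For reference, pixel density follows directly from resolution and diagonal size; a quick Python check of the figures above (assuming square pixels):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # pixels per inch along the diagonal, assuming square pixels
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2160, 27)))  # 4K/UHD at 27in -> ~163
    print(round(ppi(5120, 2880, 27)))  # 5K at 27in     -> ~218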


Looking to buy a 4k 27-inch external monitor for my 2018 MacBook. What's the problem with it, and why is it not better than 1080p?


It's better than 1080p as it is much sharper and crisper, but 1080p at 27 inches results in rather large UI elements. An effective resolution of 2560x1440px (as seen in any iMac 27) seems like the proper resolution for a display of that size.

Read more about it here: https://bjango.com/articles/macexternaldisplays/


I'm curious about this too - does the Mac Mini offer "scaled" options, like MacBooks do with their built-in display, when connecting to an external 4k monitor? I think it does some supersampling, resulting in a higher effective resolution.


Sure, it does offer scaling, but you really want 2x scaling - I tried non-integer values on a 5k display, just to get a sense for it, and it was as disappointing as expected. Slightly blurry text, rather obvious that this wasn't the recommended native setting.

In my opinion 2x scaling is the only option, and that leaves you with an effective resolution of 1920x1080 points - sure, it's sharp, but far from true 5k, and more importantly it's the same screen real estate we used to fit into 20-24 inch displays for years. It's the wrong resolution for 27".


Yes, it does. Though at that point it's effectively rendering at 5k anyway, so any performance problems you have with a real 5k you'll also have with that setup.


I found I preferred 4k at 27" to 24", because the larger sizes help with eye strain. Would definitely recommend trying both before buying if possible.


My Thinkpad with Intel integrated graphics is unable to power a 4k display as smoothly as my desktop with a 1070. It definitely makes a difference.


The 545/645 found in the 13" MBPs can drive a 5k display adequately. Not sure about this 630, though.


I bought a 2016 13" MBP to use it on an external display. It only works well on 2.5K. Always consider the GPU nowadays.


That's strange. I use an early 2015 13" model which happily drives two external 4k displays simultaneously. I'm not running them high-DPI though – the performance there is noticeably worse.


I have a 2017 13" and while it was working with a single 4k screen at 60 Hz, it simply isn't fun working on it since everything is laggy and stuttering.


I guess that's high DPI then as well?

I found that wasn't really nice to use because of the stuttering, but native 4K was much better. You need good eyesight though :)


Scaled down to 1440p, yep. Native 4K is not an option :D


On scaling other than 2x, it's unusable, while the MBP mobile Radeon Pro is good enough. They should have included that at least.


Yeah, I was doing 2x4k plus the internal screen off of the MacBook with a 555, and it was totally fluid. I would have happily forked out for the Mini if it had that GPU; it seems like a bit of a strange decision for such an otherwise powerful machine!


Is anyone worried that Apple's transition to their own CPUs will render these x86 machines obsolete very fast? How long did it take for the pre-x86 Macs to fall behind when the software started focusing on x86 instead?


Apple waited for 5 years to release a version of OSX that couldn’t run PPC apps on x86 Macs.

Last PPC Mac: 2005

First x86 Macs: 2006, PPC apps work on x86 Macs

Adobe CS3 (first native x86 version): 2007

OSX 10.5, last OSX version released for PPC Macs: 2007

--

OSX 10.6 (x86 only, PPC apps still work): 2009

Adobe CS5 (x86 only): 2010

OSX 10.7, PPC apps no longer work: 2011


I remember when they went to x86, but I think the landscape is different now. I was working in graphic design at the time, and I didn't know a single person outside of that field who used a Mac. Now, I'm surprised when someone's laptop isn't a Mac. So, I imagine there's a lot more critical software that needs to be ported.

One area I can speak to now, which I'm heavily invested in, is audio applications. It's the main reason I still run a Mac. Upgrades are usually slow and cautious in this domain. I'm still on 10.12.6, and have no plan on upgrading anytime soon.

I, for one, am quite concerned. It used to be about performance, but now it's all the applications and plugins that will need to be ported. There's a very real chance I'll go to Windows if it looks like a nightmare. Would love to see an Ubuntu variant come out of the ashes of such a thing, but that's probably a pipe dream.


Yep, it's time for me to change my laptop at work. I currently have an old i5 MacBook Pro and it has held up for 5+ years. Even my personal MacBook Air is from late 2010 (a reddit + Spotify machine). I doubt that any Mac today would hold up for another 6 or 7 years.


> Unfortunately, we still don’t have any great standalone 5K displays. (The LG UltraFine isn’t.)

What’s wrong with the LG display? It gets a lot of criticism, but I find it hard to understand why. There was an interference shielding issue, but it was fixed more than a year ago.

For anyone familiar with the matter, what is it that makes the LG not great?


My personal issue with it is the lack of standard inputs, which make it impossible to use with an eGPU setup (excluding the Blackmagic, but that's a whole other kind of subpar). The data ports being exclusively USB-C is probably an issue for some as well.


If your complaint about the Blackmagic eGPU being “subpar” was the power of the Radeon Pro 580, you’ll be happy to hear that one with a Vega 56 is now available:

https://store.apple.com/xc/product/HMQT2VC/A


It was partially the 580, partially the inability to swap out the card. It's nice that they're offering Vega options now, but for that model I'd be paying a $500 premium over a comparable setup, with TB3 out being the only benefit. I'll stick with my Dell P2415Q + Akitio Node for now.


I have two of them and they are both excellent. I can see no discernible difference between them and my 5k iMac. They are expensive, though.


I found that the LG UltraFine is by far the best monitor for my MacBook Pro. Nothing else comes close.


I bought a Mac Mini ten years ago (for about €400?). It still lives on today as my parents' desktop. I passed it on to them many years ago and there's been not one problem. Fantastic machine.

However, I cannot justify the prices they're asking for these ones. I'm looking for a new machine for myself now, but no way will these be considered.


Let me get this straight: you would not consider paying a premium for a machine even though your experience tells you that it has a useful life of 10+ years?


All of my experiences with Mac have been like this.

Got a MBP for college in 2009. Still running as my mom's laptop in 2018. Meanwhile my dad's 2-year-old Dell has endless problems; it feels incredibly cheap, despite having better internal hardware. It's slower, needed tons of configuration out of the box, and Windows 10 is just awful. My dad's laptop constantly has issues. My mom? Never once. It just works.

I will gladly pay for a product that is like that. I don't care what an SSD costs on Amazon. I don't want to spend time looking up components and motherboards and bargain hunting. I don't care about gaming and therefore the GPU. People on here look at numbers and costs but they never consider the customer experience.

It's far more damning of Apple to have the bad keyboard on the new MBPs than have some overpriced hardware. Ask 99% of people on the street if they even know what an i7 is.


I used to be as enthusiastic as you.

I've bought a number of Macs since I switched in 2007 and I've had a number of tragic stories.

My wife's previous MacBook Air died during its second year. It was working fine and one day it didn't turn on. Apple Mexico asked for close to $1000 at the time to replace the logic board, which was simply ridiculous.

My top of the line 2011 MBP died when it was 2.5 years old because of a known GPU defect. Apple fixed it over a year later but it was too late. I already had a new machine and the second hand value of the 2011 plummeted. I ended up giving it away to a junior dev in my team a couple of years later.

My current laptop is a 2014 13'' rMBP. I wanted to change the battery and Apple Mexico asked for close to $400 since it argued the complete top panel had to be replaced. I ended up doing it myself for less than $100.

I still prefer Macs for working because of macOS, but I don't know what I will do when my current laptop dies.


I'm sure lots of people have stories like ours, one way or another. Things break. Not everything is perfect, but sometimes it is. How often are Macs really failing? We hear stories because Apple is hated and YouTubers love getting scenarios where they can make an attack-Apple video. But without actual numbers we have no idea if this is a trend or not. Consumer Reports regularly ranks Apple as having among the lowest failure rates.

For a long time smart car buyers never bought redesigned vehicles. Why? Their reliability is unknown. As device gains in hardware continue to diminish, perhaps it'd be wise for us to take this stance with electronics - and wait a few years.

> I still prefer Macs for working because of macOS, but I don't know what I will do when my current laptop dies.

Sadly it's not much better on the other side. Premium Windows machines still can't get basic things like the touchpad right. Windows 10 is pretty bad. I'd gladly get another machine - but nothing offers what I like about my Macs. The Linux people aren't worth bothering with. Most people don't want to deal with the limits - and there aren't a lot of manufacturers.


My biggest gripe is not that things fail (that's completely expected) but how Apple reacts to that.

For example my 2007 MBP suffered from Nvidiagate. The GPU died during its third year, many months after the warranty had ended. Apple fixed it, no questions asked.

The 2011 Radeongate affair was ridiculous. There were thousands and thousands of users complaining online. It took Apple 2 years from the first machines failing to start a repair program AFTER a couple of class action lawsuits. It was a massive fuckup.

I haven't bought any of the redesigned MBPs with the butterfly keyboard, but again it took a couple of years to get a repair program after a couple of class action lawsuits. Also, in the US Apple is all fine and dandy, but in Mexico I've personally witnessed cases of Apple refusing to repair the keyboard because apparently they couldn't reproduce the issue.

> But without actual numbers we have no idea if this is a trend or not

Yeah, Apple is as opaque as things can be. Even more now that they will not even share the number of units sold in future reports.


I mean, if we're giving anecdotes, I bought a Dell laptop in 2006, refurbished for something like $600. I was able to upgrade it over the years all the way to Windows 10 (I think I paid $40 for the Windows 7 upgrade at one point). The only thing that failed for me on that machine was the built-in wifi. I only ended up getting rid of it two years ago because I really had no use for it, and had long since replaced both my laptop and my desktop machines by then.

Topically: at one point I was driving my television off this old E1505, then got a 2010 Mac Mini as a Christmas present and hooked that up instead. Netflix and Hulu chugged on the Mac Mini, which also locked up for no reason from time to time. I literally installed nothing but Flash, for playing back videos.

Hulu and Netflix ran without hiccups or lag on the 2006 Dell laptop, so we put the laptop back.


Around 2010, Flash on anything other than Windows was notoriously craptastic -- to the point that Apple refused to support Flash in their mobile browser.

Nowadays, that same 2010 Mini could probably stream Netflix reliably, since it'll be decoding HTML5+DRM instead of Flash.


Yup, I have a 2009 core 2 duo mac mini and it does netflix just fine thanks to html5 video. It still does everything I need reliably (web browsing, non-vm web dev, ms office), but I will upgrade to the new mini and hand this one off to the kids.


meh. I had a 2007 mini that crapped out one month after the guarantee expired. Took it in for support and they told me to buy a new one - they weren't interested in taking it on. To be fair, this was a specialist Apple dealer; at the time there was no local Apple store.

My general experience with Apple products has been 50:50 some good, some bad, Apple is nothing special when it comes to quality. Better than some is about all I can say.


1 - it wouldn't be suitable for me now (nor for the past few years).

2 - I remember there being a big differential in equivalently priced machines back then. One of my current work Dell workstations is a much cheaper option than this Mac, much better specced and I've had no issues with it for over two years.

3 - It's £800 for the thing, seems way out of line with inflation compared to what I bought then.


> seems way out of line with inflation compared to what I bought then

Can't speak for European pricing, but the base model Mac Mini 10 years ago cost $599, and adjusted for inflation it would cost $750 today ($50 less than the current base model).
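Sketching that arithmetic out (the ~25% cumulative US CPI factor for 2008-2018 is an approximation; exact CPI data would shift this slightly):

    price_2008 = 599      # base Mac Mini, 2008
    cpi_factor = 1.25     # rough cumulative US CPI inflation, 2008 -> 2018
    print(round(price_2008 * cpi_factor))  # ~749, vs. the $799 2018 base model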


No, I wouldn't consider buying a Toyota Corolla for 60K.


Looks promising. I've ordered one to replace my ancient cheese grater Mac Pro, and it should be quite an update.

I've always been fond of the old Mac Pros (still one of most beautiful machines from Apple), but the extensibility through TB3 has rendered many of the advantages moot (and my Mac Pro is stuffed to the gills).


Side discussion: What are everyone's thoughts when it comes to external HiDPI monitor options to use with this Mac Mini?

Ideally I'd want 5K displays (I'm used to the iMac 5K) but it seems that LG's UltraFine is the only real option and the rest of the industry has settled on 4K/UHD at 27 inches, which results in a) limited screen real estate due to an effective resolution of 1080p at 2x scaling and b) slightly less pixel density.

I may be fine with b), but with both issues considered I'm seriously wondering why UHD displays at 27" are so popular - it seems like a subpar and regrettable combination. Are my worries unwarranted?


I've been using Dell P2415Qs. I run 3 at the HiDPI "looks like 2560x1440". It's not perfect 2x scaling, but it is close enough (185 dpi), and at $300-400 each the price can't be beat. I've considered switching to the 21.5-inch LG UltraFines (220 dpi), but they are $700 and have very limited port choices compared to the Dells. I use the Dell displays with multiple other machines that don't have USB-C graphics out, and I want to be able to use the displays with eGPUs that won't have USB-C out.

I just hope the default GPU of the mini can drive 3 UHD displays without choking on the dock animations :|


Well, at least at 24" UHD is more reasonable, giving 1080p at 2x scaling. However, I currently use a non-retina iMac 27 at home (which I'm looking to replace) and an iMac 5K at work - I really want both: screen real estate and retina-level pixel density.

Concerning the UltraFine: yes, it's too expensive, and since I'd need to connect it to another PC without Thunderbolt it's not an option for me.

edit: There's also this option: https://iiyama.com/gl_en/products/prolite-xb2779qqs-s1/ - it's sold for under €700 here in Europe, but the 6-bit panel is concerning.


I love my P2415Q, will probably get a second one as well. My only gripe with it are the rather large bezels, compared to some of Dell's other offerings. It really is a shame that basically no one is focusing on HiDPI monitors, especially when 4K laptop displays are all the rage now.


I too am disappointed by the lack of 5k options. The iMac 5k display is truly outstanding and I wish there were other options. The 1080p effective resolution is not great in my opinion.

Something you might want to consider is this 32" 4k monitor: https://www.benq.com/en/monitor/designer/pd3200u.html

It has great reviews, and while it's a bit different from what you're looking for, you might be able to run it at a higher resolution and get more screen real estate. Food for thought...


That's about 138 ppi, which is rather close to non-HiDPI displays and should be noticeably less crisp than an iMac 5K (at 217 ppi).


I have a 31.5" 4K display I'm using at native resolution with my MacBook Pro and honestly I can't wait for 8K... I wouldn't want to trade off any of this screen space but every day the low ppi bothers me.


Nobody's forcing you to use 2x scaling. I use two 27" UHD displays with Debian/GNOME. The system-level scaling factor is 1, and I adjust application zoom levels as needed.

I typically divide each display in half, and each half comfortably fits one browser window, mail client, terminal, Spotify, etc.


1x scaling will be too small for UHD at 27".

macOS only works properly with integer-based scaling - 1x, 2x, theoretically 3x. Everything in between will lead to blurry text/assets and decreased performance, since the output as a whole will be scaled and downsampled by the GPU.

I work a lot with text and a sharp/crisp display is important to me. 2x scaling is the only proper option in my opinion.
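A rough sketch of why non-integer modes cost both sharpness and performance (the render-at-2x-then-downsample behaviour is per the discussion in this thread; the exact internals are Apple's):

    # macOS renders a non-integer "looks like" mode at 2x the requested size,
    # then the GPU downsamples that buffer to the panel's native resolution.
    def scaled_mode(looks_like, panel):
        render = (looks_like[0] * 2, looks_like[1] * 2)
        downsample = render[0] / panel[0]  # 1.0 would be a clean integer fit
        return render, downsample

    # "looks like 2560x1440" on a 4K/UHD panel:
    print(scaled_mode((2560, 1440), (3840, 2160)))
    # ((5120, 2880), 1.33...) - a full 5K frame rendered, then shrunk by ~1.33x

That is why a 4K panel run at "looks like 1440p" stresses the GPU roughly like a true 5K display would.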


Everything in between will lead to blurry text/assets and decreased performance, since the output as a whole will be scaled and downsampled by the GPU.

I run 2 27" 4k displays off a MacBook with integrated graphics, both scaled at 1.5x to "look like" 2560x1440. It works fine; text is not quite as sharp as it would be at 2x, but it's far better than it would be on an actual 1440p display. I haven't noticed decreased performance, but I'm not doing heavy graphics work.

Agreed that display manufacturers are messing this up. 4k displays should be either 20-24" for 2x scaling or 40+ inches for unscaled; 27" displays should be 5k.


text is not quite as sharp as it would be at 2x

I think that would bother me.


Note that Apple uses non-integer scaling _on their own computers_ these days. Default for the 15" MBP, which has a native resolution of 2880x1800, is fake 1680x1050.


PDFs look blurry on non-Retina displays, even those that are 130-140 dpi.


> 4K/UHD at 27 inches, which results in a) limited screen real estate due to an effective resolution of 1080p at 2x scaling

MacOS's non-integer scaling works surprisingly well on these; I use one with an apparent resolution of 1440p with no issues.


There is also the Philips 275P4VYKEB, but you need two DisplayPort connections to power it. IIRC only some newer and not-so-widely-adopted (by both displays and GPUs) DisplayPort standards support 5K over a single cable. The Apple iMac 5K uses the same two-DP setup internally to power its display.


I'm with you.

Not high enough DPI for retina, not big enough to avoid scaling. The monitor market seems to be messed up at the moment; there are LG monitor models which were shown at CES 2017 that are barely appearing in some countries today.


Why does Marco call the LG 5K “not great”?


The Mac user consensus seems to be that the panel is great, but everything around it is flaky (at least compared to previous Apple displays) - plugging and unplugging doesn't work 100% of the time, stuff like OS integration of brightness and volume keys isn't seamless, etc. Looking at user ratings, it seems like reliability isn't great either, but that could be skewed.


Anecdotally... I bought an LG Ultrafine 5K when they came out - I believe I've got one of the initially faulty ones that can't go too close to a WiFi access point due to lack of shielding.

It was flaky as hell when it first came out, but within about a month of receiving it the problems largely stopped occurring.

I typically unplug and plug it in once per day when I take my Macbook Pro away to sit somewhere else in the evening. But when I'm working during the day I use it as my primary display. For my own experience the integration of brightness and volume is seamless. The speakers are good (although I use some old Genelec monitors instead for this), and the quality of the display is absolutely fantastic as you said.

About one in fifty times plugging it in to the laptop doesn't immediately work, and then I plug it into a different USB-C port on the laptop, and then it does.

Despite this I'm very happy with it.


Just one data point: I have two of those screens connected to an iMac Pro. This is a setup officially supported by Apple, but sometimes when waking up the Mac one of the displays doesn't turn on and the only option to get it back to work properly is disconnecting either the power or the USB cable.

This is reproducible by connecting both screens to a 15" MBP 2016.

If the screens would at least have an ordinary power switch - so I'd just have to crawl over my desk once every two weeks - my life would be a lot better...


It is somewhat surprising they put so much into the USB-C push, but then failed to deliver updated monitors that work as flawlessly as the Cinema Display.

They were like: LG does this now.


There is zero justification for the pricing. It needs to start around $399. Apple is still working the privileged pricing model, considering it has a year-old processor and 8GB RAM. For $800, an i5 and 16GB should be standard. I can forgive the 128GB SSD, but not the CPU or RAM.

Side note: why is a computer only for games, streaming, and mining bitcoin? Is that what the end user experience is limited to?


>> There is zero justification for the pricing.

There is a justification for the pricing, but you probably won't like it: More than enough people will buy the product at a high margin price to make up for the price-sensitive people who won't buy it.


Apple uses value-based pricing, not cost-based pricing.


Fair enough, but it "values" the Mac Mini at $799 with a year-old i3 and 8GB of RAM? It valued the last one at $499 and it was 5 years old. I guess what the market will bear is really true...for Apple.


It's not about how much Apple values their products. Value-based pricing is based on how much value the customer gets out of the product.


The customer will not get $799 of value out of a Mac Mini. It will be made obsolete by Apple before that happens. Especially if Apple takes another 4 years to release the next one.


Keep in mind that the old Mini, priced at $499, only had a dual-core laptop i5 CPU. The new ones all have desktop-grade CPUs with four or more cores. This isn't a totally justifiable reason for the price increase, but it is seemingly a large performance increase.


>> I can forgive that 128GB SSD but not the CPU or RAM.

I had the curse of having to use a 2017 MacBook Pro 13" with a 128GB SSD... it's not forgivable, it's not usable.

By the time I put Xcode on there, as well as Office, et cetera, and project files, I was left with 20-30GB free, and anytime I dipped below... 20? it bombarded me, literally, continually, with a message: "you are low on disk space."

You know what is never forgivable? Soldering a hard drive to the motherboard on a $1500 device.


I agree that Apple has gone completely crazy with the markup, but who is saying those are the only applications for computers???


Too expensive. If you just want a redundant NAS, everything else is cheaper. If you want a non-redundant NAS, a single-disk NAS enclosure costs 54 bucks: https://www.hardkernel.com/main/products/prdt_info.php?g_cod...


I'm fairly new to this NAS stuff. Is there a reason to have one hard drive per server, or is that just a limitation of this one in particular? I'm sure that RAID over a network of hard drives is a solved problem, but I wouldn't know where to start.


It's a limitation of that ODROID enclosure. You could do RAID over iSCSI (but I wouldn't). Buffalo do a 2-drive enclosure very cheap, but I'd go for a Synology or QNAP enclosure for a bit more money.


This is a really weird device. The Mac mini has long been used as the home theater PC of the Apple ecosystem, due to its form factor and being the one Apple device that didn't come with its own 20-inch screen or 3-foot-tall tower. Apple TV, for all its shortcomings, has sort of supplanted that, and you're much better off sticking that under your TV and running your Plex server somewhere else. So now we have a Mac mini which is too expensive to be a home theater PC or to be stuck in a closet serving media, and is comparably priced with Macs that have actual screens. So it's good for someone who for some reason doesn't want to carry a MacBook around but wants a semi-portable workstation? Or someone who wants an iMac with a bigger screen? It's not useful as a Mac mini, and it's pretty redundant thanks to every use case being covered by an existing Mac. Who is this device for? Who is buying this over a MacBook or iMac?


I've bought two Minis as dev machines in the past. Portability was part of it. I also saved a bunch of money because I already had a full set of peripherals. Edit: Upgradable RAM and the ability to install a second drive was also great in retrospect, I wonder if the latter is still possible with the 2018 model.

And I'm not sure whether glossy Retina screens are really the best bang for the buck for developers. I often wish I had dual 1440p or a ginormous 21:9 screen when I'm dealing with complex projects.


Weirdly, it has a lot of enterprise-scale applications and implications with its ability to accept 64GB of RAM.

We're replacing a ton of those "garbage-can" Mac Pro models with them. Lord knows we don't need kick-ass graphics cards.

In terms of consumer-level applications, it definitely seems like the selling point used to be the pleasant $500 price point.

The G4 Mac Mini, at $499, was my very first entry into the Mac world. I certainly couldn't afford $1,100 for an iBook G4 in high school.


Many people. This is targeted at professionals who need a Mac with good performance, e.g. software developers. I can imagine this being a popular device.


I really wish people would more clearly disclose when they have been provided a product to review by the manufacturer.


Can it really be more clear than this:

Apple lent me a high-end configuration for review — 6-core i7, 32 GB RAM, 1 TB SSD — which would cost $2499 (much of which is the SSD)


How about blazing red letters at the top of the page?

How about putting "paid review" in the title of the HN submit?


I don't think it's fair to suspect Marco being paid by Apple, he's far from a shill. See for example [1][2].

[1] https://marco.org/2017/11/14/best-laptop-ever

[2] https://marco.org/2017/11/24/fixing-the-macbook-pro


Is it "paid" if you have to return it after? Presuming "lent" is being used in its ordinary meaning...


But is this a paid review? He was lent the hardware, which means he had to return it after he had made his review. While there’s certainly some conflict of interest there, I don’t know anyone that would consider that a payment.


It's common practice for journalists to get review units, which they later return. Not just Apple - everybody does it.

Also, in case you're not familiar with Marco: he was a co-founder of Tumblr, created Instapaper, and is the developer of Overcast (a podcast app for iOS). Not sure he needs the money.


Most reviews are done on devices _loaned_ by the vendor; Consumer Reports and a couple of others buy their own stuff but they're very much the exception. A paid review is where the vendor pays for the review; that's very different. Basically any professional review of a computer you see will be on a review unit loaned by the vendor.


Yeah, how about putting that disclaimer first paragraph, first sentence.

I think (and I have no relation to this person) http://lon.tv is an example of an exemplary and ethical reviewer in this regard.


He puts it at the very first mention of the hardware he's reviewing. I think it's about as clear as you could ask for.


> much of which is the SSD

A 1 TB SSD is on the order of $150.


Not saying the price is not outrageous, but I found it ironic that you spec maniacs can't tell the difference between a SATA SSD and a PCIe SSD. You will be hard pressed to find a retail SSD that's remotely close to the performance of the ones in these new Macs.


For users that care, a 1TB PCIe SSD currently costs a mere $230 (for the Samsung 970 EVO on Newegg, which has pretty impressive performance). Sure, that's slightly more expensive, but it's nowhere near the premium Apple are charging. On the other hand, users who don't need the extra performance still have to pay through the nose for it - and arguably the difference between SSD and HDD is more important from a user experience perspective than the difference between SATA and PCIe.


You can find PCIe SSDs for about $150. Like this one:

https://www.amazon.com/Samsung-860-SATA-Internal-MZ-N6E1T0BW...

That isn't a performance monster. But you could get something really top-of-the-line for, say, $400.


I clicked on your link and it says "SATA 6 Gb/s Interface". is the Amazon description incorrect or did you paste the wrong link?


So your last sentence contradicts your first. Why would you want the slow PCIe SSD? You would want a better-performing one for the technology being offered.

That said, Apple should not be soldering these SSDs to the motherboard.


How is this statement accurate?

> which would cost $2499 (much of which is the SSD)

A 1TB SSD is maybe $200 retail?


If this was a PC then a 1TB SSD would be about $200, but the 2018 Mac Mini has a non-upgradable soldered-on SSD on the motherboard so you have to pay Apple's build-to-order prices if you want one. (The RAM is technically upgradable but requires a security Torx bit to access just to be annoying.)


It's accurate maybe not according to the market in general, but it's how the pricing on the BTO works. I tried building a similarly specced Mac Mini just now and upgrading to the 1TB SSD is +$600, whereas 2TB would be +$1,400.


A 1TB SSD is maybe $200 retail?

For the cheap ones. The faster/better quality ones from Samsung and Intel, which are more comparable to what Apple is using, are $400-500


The 970 EVO performs noticeably better in almost all benchmark categories (latency, sequential read/write, 4K read/write) than Apple's SSDs. And the 970 EVO uses 3-bit MLC (TLC).

The 970 EVO is $230.

Come again?


970 Evo 1TB (3500MB/sec): $228

970 Pro (effectively the same speed, but 2-bit MLC versus the Evo's 3-bit): $380


That should really be the very first line of the review instead of being buried at the 10th paragraph...


> Apple lent me a high-end configuration for review — 6-core i7, 32 GB RAM, 1 TB SSD

The problem is that the i7 has hyper-threading, which creates a giant security hole in your system. I'd be much more interested in the i5 benchmarks, since the i5 (supposedly) does not have hyper-threading.


The "giant security hole" I assume you're referring to isn't particularly worrisome for your own hardware running trusted software. It's more of a problem in a shared environment where you don't know who else is running code.


> The "giant security hole" I assume you're referring to isn't particularly worrisome for your own hardware running trusted software.

My concern is mainly javascript or webasm running in the browser. Even if there aren't currently any unpatched exploits of this nature, that's no guarantee that there won't be in the future.


First, that means it isn't a "giant security hole"; it's a potential security hole.

Second, if an exploit is discovered just disable the extra hardware threads until it gets patched.

As an aside, I don't know what your "supposedly" about the i5 is supposed to mean. The i5 has only one hardware thread per core.


> As an aside, I don't know what your "supposedly" about the i5 is supposed to mean.

The tech specs page on the Apple site doesn't give the exact part number, so there's always the chance it could be some previously unreleased chip.


I've been an avid Apple user and supporter for many years, and have preached about the stability and the superior (tongue in cheek) hardware. I was waiting for a decent replacement for an ageing mini, but seriously, Apple, at these prices?? It's insane. I made up my mind after watching the Apple event and reading/following the news of them hiding their sales numbers from now on [0].

I've switched over to Linux as my main desktop completely. While I've been using Windows for years, Windows 10 is just not doing it for me; even with WSL, it just feels limited and unpolished. Anyway, rant over. [0] https://www.marketwatch.com/story/when-the-going-gets-tough-...


Slightly off topic, but is there a cloud Mac/Xcode service that I can subscribe to for my occasional app development?


There probably is, but it will be built on top of Mac mini or Mac Pro farms, as Apple does not allow virtualization of macOS on any hardware not carrying an Apple logo (I believe that is how it is actually stated in the terms, even). So if it is available, it might not be a cheaper option.

I believe Travis CI does have macOS workers available for free, but they might be scarce in available time slots.


I've been using MacInCloud at work for the past 2 years or so, renting a single server for use as a VSTS/Azure DevOps build agent, for iOS builds. I'm very happy with it - the server we have is fast enough, both CPU and disk, and it's been rock solid.

I've only contacted their support team about billing stuff, but they were responsive and helpful.

Disclaimer: no affiliation, just a happy user.


Checkout Macstadium. https://www.macstadium.com



Tag for interest, I have no interest in using Apple for anything, but I need to compile my RN app.


hostmyapple, macincloud


"...cost $2499 (much of which is the SSD)"

The cheapest SSD I can find goes for €129.47, and it doesn't seem to get up to much more for consumer SSDs. Unless this SSD has some crazy specs that I'm missing, it seems seriously overpriced to me.


Little of both. The SSD is a very fast 1TB model running on Apple’s custom controller (so you can’t just compare to a random one on Amazon). It’s also an obscenely high priced upgrade.

Probably more column B (expensive) than A (good)


It's the fastest SSD on the consumer market by a good margin. My main gripe with it is that Apple doesn't provide any affordable option for large, slow storage. Of course you can always go external, but then it won't be so much mini anymore.


Except... it's not. We know that larger SSDs are faster... but if we compare a 1TB Apple SSD in the iMac Pro to a 500GB 970 EVO (around $120)... the 970 EVO is at least equal, if not faster in performance.


Apple charges +$600 (USD) for a 0.25 → 1TB SSD upgrade.

I heard some complaints, but was all like "Yeah, but modern Apple SSDs are 3GB/sec, that is pretty good, bros..."

My coworker clued me in, though:

    1TB samsung 970Pro M.2 (3GB/s) 
    was as low as AUD$304 here just
    yesterday.   Thats ~USD$215.  I
    don't usually mind paying the 
    _AppleTax_ but that is ridiculously 
    overpriced.
I buy all sorts of overpriced Apple shit, but that is so overpriced it's kinda offensive.


I don't know anything about SSDs (hint) but that SSD looks like it's on sale at Amazon right now for $393. I'm more than happy to pay Apple a premium of $200 or whatever to just get what I want out of the box and not have to think about this, research options and compatibility, worry about drivers or who knows what else, etc.


...which is why Apple is more than happy to charge you said premium. As long as there are enough consumers like you who prefer to pay more there will be suppliers who will charge more.

Those who prefer to pay less, have more freedom and get a higher-performance system which is more tailored to their needs will, for the (small) price of some more up-front thought, choose other suppliers. They will have the added advantage of being able to partially upgrade their system by swapping the SSD while those who choose the Apple option will have to wait for the next iteration of this product.


There are no compatibility issues, drivers or who knows what else with PCIe m.2 SSDs.


Yeah, I have no idea what PCIe m.2 is, and I don't really care to find out. That's my point.


I don't believe you:

If you put anything inside your PC, you'll have to look up how it connects :)

Even outside - whether it's USB-C or thunderbolt actually matters, HDMI vs Displayport etc.


You could get the 970 EVO as well. IIRC it's 3-bit MLC (TLC).


There is a hefty premium on top of any upgrades from the base model, but Apple SSD with the T2 chip is insanely fast - check out the benchmark for the iMac Pro and MacBook Pro 2018.


Plus I don't really see why a crazy-spec SSD is really key here. It's not going to be the bottleneck for most use cases in a Mac Mini that has an integrated graphics card (at $2.5k...) A regular SSD would do.


Everyone's use cases will be different. I have absolutely no use for dedicated GPU, but require fast storage. Of course that doesn't mean that Apple's SSD upgrade options aren't overpriced.


>it seems seriously overpriced to me.

I haven't bought Apple products, but the obvious anti-consumer and anti-developer practices make avoiding them a no-brainer.


This sounds like it'd make a decent development machine - better in many ways than an MBP.


> ... not a low end product anymore...

Apple is clearly pushing everything towards a more luxury high end. More than ever, they're pushing the market to see how far they can go.

They're pricing me out. I can afford their stuff, I just find it hard to justify it.


Having a user-serviceable M.2 slot would have made this the perfect Mac for me. Let's say I have AppleCare for the first 3 years and the SSD dies in the 4th year - what are my options? Replace the entire logic board with a new CPU & SSD?


I'm impressed, but I am also not sure how accurate the benchmarks he's using are for replicating real-world load.

I actually think it's pretty cost effective if you go for 32GB RAM instead of 64GB, and keep the baseline 256GB SSD storage. It comes out to $1,899. With 64GB RAM and 2TB storage, it comes out to $4,099, which is a huge markup for things that can be self-upgraded on a PC.

Even $2,699 for the 64GB version doesn't seem that bad for effectively the best MacOS running computer you can get currently.


How hard is it to upgrade yourself?


According to the article, you just need a T6.


Can't replace the SSD? Pass.


I can upgrade the ram, the most useful piece of information in the review. If I could upgrade the SSD too, I'd be tempted.


How does it compare to Intel's Hades Canyon? I'm deciding between those two mini-sized computers.


I have the Hades Canyon's predecessor; one advantage of this line is a pair of M.2 slots in RAID that will later hold 4TB, 8TB, etc. NVMe sticks, allowing massively more capacity at similar speed to the Mini's storage.

The AMD graphics in the Hades Canyon is vastly better, but still significantly inferior to anything you will connect via eGPU. A lot of casual usage will not require the Hades' GPU, but you'll always pay a power/thermal price for it. As resolutions increase, or depending on your requirements now, you'll probably be forced to pair it with an eGPU, because it really only excels at 1080p for 3D stuff AFAIK.

One nice thing about the Hades' predecessor is fanless cases; I don't know if they've emerged yet for the Hades Canyon, but going fanless is really amazing. Prior to that it could be loud, and the Hades added a second loud fan for the GPU.


It doesn't seem Apple is interested in using AMD CPUs/APUs in their products; otherwise they would have a decent iGPU to go along with the CPU.

I was expecting the Intel + AMD GPU combo to end up in this Mac mini, but it turns out not.


I’m pretty sure I would rather buy an iMac than a Mac mini. An iMac at least comes with a display, and the base 4K version is competitive in price and features as far as I can tell.


I bought an iMac last year and regretted it. The performance was fine, but I ended up disliking the fact that it was an all-in-one. The fact that you can get iMac performance without a display attached is a big feature of the new Mac mini for me.


Yeah. I guess if portability is a plus, that would be better than an iMac. I move once a year so that wouldn’t really be a problem for me, but if you need a Mac for portability, why not just get a MacBook?

I think the Mac mini is at an awkward price point. You need a monitor, keyboard and mouse which cost you extra. At that point you can get a base MacBook Pro.

I think it would only really make sense for businesses


Can anyone explain the market for this to me? In Ireland:

Mac Mini: 8th Gen i5 / 8GB RAM / 256GB SSD - €1269

Dell XPS 13: 8th Gen i5 / 8GB RAM / 256GB SSD - €1189


Unless the XPS can run macOS the comparison will be useless to the target audience of the Mac Mini. Furthermore both are very different product categories.


I bought my first-ever Mac just over three weeks ago, a refurbished Mac Mini (2010 version, with Core 2 Duo 2.4GHz, upgraded by seller to 8GB and 240GB SSD and High Sierra).

Pretty happy so far.


I would have liked to see performance compared to the 2018 15” MacBook Pro, since they are around the same cost. The 13-inch really got its butt kicked.


I'm just waiting for the same article but for Macbooks.

I can't wait to throw my money at Apple when they bring back MagSafe and scissor switches.

Not holding my breath though.


>Apple lent me a high-end configuration for review

Is this common, for a company to lend a product for review, instead of just giving it to the reviewer?


The new mac mini seems like a straightforward improvement from the other versions with few things getting worse. I love that.


I'm looking forward to seeing server blades made out of those Macs. It's just satisfying to see how they are built.


Best display to buy with the new mini is probably: Eizo Flexscan ev2785 4k 27"


I am eagerly awaiting iFixit's teardown of the new mac mini.


I wonder how it'll perform as a "super" Apple TV.


I wonder how long the Mac mini will stick around


> $2499 (much of which is the SSD)

Where 'much' is approximately 5-10%.


In terms of the retail price. More like 30-35%.

The 1TB SSD is an $800 option on the 128GB base model, and a $600 option on the 256GB model.

The 2TB SSD is an even more eye-watering $1600 option from the base model.
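For concreteness, using the BTO prices above against the $2,499 review configuration (a rough share, since the option price isn't a strict component cost):

    review_total = 2499
    ssd_option = 800   # 1TB upgrade price from the 128GB base model
    print(f"{ssd_option / review_total:.0%}")  # ~32% of the sticker price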


Configuring a mac mini on Apple.com now: +$600 for the 1TB, so, 40%. I'd say it qualifies as "much" :)


I think it refers to the fact that if you configure it on Apple.com, a 2TB SSD adds +$1,600 to the price tag.


Stop using Geekbench for reviews :(


What would you recommend instead?


FCPX BruceX


Why?


Apple’s new strategy seems to be, instead of having products that sell themselves for their clear benefits, just flood either YouTube casuals or Twitter “celebrities” (depending on target segment) with the highest end config machine and expect praise.


Please point me to a successful hardware company that has ever let their products sell themselves. I simply don't understand what your gripe is. Providing review units for members of the press is completely standard. Every company does it, and Apple always has. The only difference is that it is now bloggers and YouTubers that customers look to for buying recommendations, instead of the Walt Mossbergs of yore.


True, I used to go out of my way to find unbiased and honest reviews from "real" people that weren't "seeded" by manufacturers.

The problem is that many others did the same, so these honest reviewers gained followers; that gave them exposure, and eventually they got approached by PR departments that would like to "lend" them top-end review samples for an undisclosed amount of time. Well before the press NDA, of course. Lately I find myself buying paper magazines again.

That being said the new Mac Mini does seem like a great product.


You realise that every single thing reviewed in a paper magazine is a loaner review unit, unless it's Consumer Reports, Which?, or one of about three others? It's an industry standard.


Sure, but the authors of larger magazines typically do not get to keep (or even use) the product for their own private purposes, so they're not nearly as invested and have far less reason to be biased.

I've heard many companies "forget" to ask for their seeded samples back, particularly those sent to so-called influencers.


That's why I subscribe to Consumer Reports even though it sucks in many ways.


Pretty much every computer manufacturer provides review units, and always has. In fact, if anything, Apple's known for being a little on the stingy side here; for some of their product releases only the biggest publications have gotten early review units, whereas some vendors hand them out far more freely.


Can't say there was too much criticism on this review.


In Marco's defence he has slated Apple for what they are doing to the laptop line for the last few years. If he didn't like it I am certain he would have given a bad review.

ATP and Overcast (not to mention Tumblr) are successful enough that he doesn't need to be on Apple's good list to make ends meet.


I wasn't looking to start a fight with specific people. I just dislike the general direction Apple Mac products are heading in, as well as their marketing efforts.


Ah, fair enough


Well, you know, we waited years for a new Mac mini replacement, and we got one that is very disappointing on the low end, very expensive on the high end, with no clear future support and missing features from previous iterations - but hey, it's an amazing machine, because Apple finally graced us with it and I got it for free, so let me overlook that $2,500 price tag.


Those units are not given out for free. Apple get them back after the review period. It would be a massive conflict of interest otherwise.


There is still massive conflict of interest. They want to continue getting these review units, and Apple is known for "being offended" and blacklisting people from media events and review units.

None of the reviews I've seen from "general public" "reviewers", in the year or so Apple has been doing this, have provided real criticism. It's always "fine", always "great"; big issues are glossed over.

At the very least, Apple picks their target "reviewers" very well based on their bias for Apple products. That's fine in and of itself, but it creates a bad image when these people get their review units before even the media reviewers.


If you listened to Marco's podcasts, you'd know that he has been very critical of most of the Macs that have come out in recent years. He remains critical of most of their laptops. That doesn't mean there is not still a conflict of interest in principle. But in practice, I would say he has a track record that proves that he is not afraid to bite the hand that feeds him.[1] With or without conflicts of interest, there's really no substitute for getting to know a reviewer to find out if you generally share that person's sense of what is valuable and not.

[1] Maybe it really isn't the hand that feeds him. Marco's primary gig is his podcast player for iOS.


I really wish that devs out there, including the HN community, would give PCs and Windows another chance.

The PC specs are amazing, value for your money is excellent, the OS is beautiful - the biggest obstacle now is that so much open source documentation explains how to install or build on a Linux-based machine, ignoring the Windows users and making them feel like sh*t for working on a Windows box.

If more devs offered docs around building/compiling on Windows, and Windows support, that would be excellent...


Pity that it's so hard to source W10 Enterprise LTSC (previously LTSB); you can bypass pretty much all Windows Update woes that way, miss out completely on Cortana and Edge, and avoid having "features" no one asked for rammed down your throat every few months.

Despite Microsoft's FUD[0] ("The Long Term Servicing Channel, which is designed to be used only for specialized devices (which typically don't run Office) such as those that control medical equipment or ATM machines"), it also appears to run Just Fine[tm] on the latest desktop hardware[1].

It's the only version of W10 I'll go near.

[0] https://docs.microsoft.com/en-us/windows/deployment/update/w... [1] I've had W10 LTSB 2016 running for months on two home-built Ryzen boxes (1x 2700X + 1x 2400G)


I do prefer the macOS ecosystem overall and distinctly dislike the UX mess that is Windows 10. I have been using Windows for decades now and feel very comfortable with it, and I still maintain a desktop tower for gaming and intensive work, but if I had the option to switch to a macOS machine I would. There just isn't Mac hardware out there that fits my needs. But each year I feel Apple is making the Mac ecosystem worse and worse. At some point the software advantage will not be worth it.

That being said, if someone likes Windows 10 now, there really is little reason to remain on Mac hardware. Practically, most creative software exists on Windows.


It's my impression that most Mac (and Linux) users have given Windows a chance. Maybe not the very latest versions, but most Mac users I know (including myself) have switched from Windows at some point in the past. I don't doubt that Windows is less infuriating than it was when I left in 1998, but nothing in its more modern incarnations appeals to me in a way that would even make me consider switching back. "Not so infuriating anymore" is a weak value proposition. There is also an element of "fool me once, shame on you; fool me twice, shame on me".

At the same time, I am perfectly happy with my Macs; I still absolutely love macOS, my work 5k iMac is a delight to use, and my private 2010 MacBook Pro is still going strong (although as of Mojave, it can no longer run the latest OS). My 2008 Mac Mini is nearing end-of-life, but only because I can't justify upgrading its internal storage to an SSD when the computer is stuck on Mountain Lion. I worry about the price hikes, but OTOH, as long as I can expect getting 8-10 years useful life out of a system, I don't mind paying the Apple Tax.


I find WSL can really help in this regard. It's being actively developed by Microsoft and they're making the interop between WSL and windows better over time (a good example of this is https://blogs.msdn.microsoft.com/commandline/2018/11/05/what... on what's new in October)

So what I tend to do is use linux install paths and do my scripting work in WSL, with my more GUI/Corp apps in windows.

All they need to do now is get raw network sockets working and finish off Linux Docker engine support in WSL and I'll be sorted!


Cons: Windows update.

I'll pass.


Windows makes sense for game devs, Android devs, and Windows app devs. It’s my impression that HN devs are mostly web devs, with some mobile devs. Windows is not best for those types of devs.

iOS devs need to run Xcode.

Server devs can and usually should use remote servers for development. So their local machine’s OS doesn’t matter. It could be Windows, but they are mostly ssh-ing to Linux boxes or VMs for work.
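
To illustrate that workflow (a generic sketch, not anyone's actual setup; the hostname and command below are made up): with something like the third-party paramiko library, the same few lines drive a remote Linux box identically whether the local machine runs Windows, macOS, or Linux.

    # pip install paramiko -- works the same from Windows, macOS, or Linux,
    # which is the point: the local OS stops mattering.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("dev-box.example.com", username="me")  # hypothetical host

    # Run the actual work on the remote Linux machine; stream the output back.
    stdin, stdout, stderr = client.exec_command("cd ~/project && make test")
    print(stdout.read().decode())
    client.close()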


I like PCs.

The PC specs are amazing, value for your money is excellent, and there is now so much open source documentation explaining how to install or build on a Linux-based machine.

And outside of laptops, a custom PC can still be built with whatever specs you need.


What advantages does Windows offer over Linux other than gaming?


Software and drivers that consistently work without the need for fiddling. Every year I give Linux a chance, and every year I quickly sober up.


I think it must strongly depend on your use case and exact hardware, because those exact reasons are, in my case, an argument against Windows. I have to download and install what in order to get driver support? How many tray icons can one person possibly need?

It's worst in my experience with printers and scanners: "please install this 300MB package which will constantly run in the background and annoy you at the least convenient time to replace your ink - oh, and in three months we're going to completely change the interface and replace it with something that doesn't even support the feature you're using". Or, hear me out, I could install CUPS and xsane (or another SANE frontend) and be done.

Driver support is indeed hit-or-miss, but when it hits there's absolutely no work at all. (Exception: if your printer isn't already supported, you can often find and download a single small file to add support; CUPS is beautiful.)
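
To make the CUPS point concrete, a minimal sketch using the third-party pycups bindings; the queue name and file below are hypothetical:

    # pip install pycups -- thin Python bindings over the local CUPS daemon.
    import cups

    conn = cups.Connection()

    # List every configured queue; no vendor tray icon required.
    for name, attrs in conn.getPrinters().items():
        print(name, "->", attrs["device-uri"])

    # Send a file to a queue (queue and file names are hypothetical).
    conn.printFile("home-laserjet", "/tmp/test.pdf", "test page", {})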


I agree with you that a lot of vendor software is garbage. If the hardware you use is supported by your Linux distro of choice, I'd agree it could be a better experience than installing any driver. But on Windows, at least, there is always a driver. That has not been the case for the desktops and laptops I have had in recent years: always something missing across the plethora of Linux distributions, and the ways to solve those issues are not straightforward even for a software engineer with 10 years of experience. That is not something I want to think about when buying new hardware.


I have dual booted Ubuntu for ~3 years on my MacBook Pro (13” late 2013 model) and used it as my main OS.

I had to install a webcam driver but other than that everything just worked(TM)


Yes, especially on a Mac, high DPI on Linux is still garbage to this day. It's not really good on Windows either, but at least with Windows 10 it is somewhat serviceable. On Linux, support is so abysmal it is really comical in 2018. Moreover, it seems the dev community still hasn't "seen the light" on high DPI displays, and will often dismiss or backlog the changes required for support. In 2018.


HiDPI worked / works great in UbuntuGNOME. I spend most of my time in a text editor, terminal or browser, but even things like games for my kid (Tux and Tux Kart) supported HiDPI without requiring any magic tricks.

/edit: setting a different scale factor on my external (non-HiDPI) monitor was frustrating in that it worked... sometimes


For me, setting up Linux got easier than Windows a few years ago. There are other problems that stop me using the platform, but drivers aren't one of them anymore.


That's if you are on supported hardware. If your sound/wifi/disk storage/modern GPU lacks support, you are SOL for a long time and have to resort to experimental drivers that may or may not be stable and may or may not work as expected. Meanwhile, practically all hardware has Windows drivers.


Tired meme. Windows drivers are the only ones you have to "fiddle" with out of band; I can't even think of another OS where going to a third-party website to download an executable is how you install drivers for the machine (not OSX, not BSD, not Linux).

Buy any desktop in any supermarket in the world and it's going to work out of the box with the latest Ubuntu. This has been the case for 5 years now.

If you have some weird hardware that needs a kernel module that isn't enabled by default or packaged as a DKMS module for your distro, then sure, I can definitely see some awkwardness there, but I haven't had so much as a wifi problem in 10 years, for instance.

Definitely a stark contrast to Windows, where you have to go to the manufacturer's website for each component of your machine if you're not using the bloated OS that comes pre-installed.


What meme? I was speaking about personal experience.

> but for instance I

"I" being the key word there. On the other hand, I have had issues with Linux drivers every year.


What distro, what type of machine? Because I’ve been prolific in my purchase of brand new hardware and I have had only the most minute of issues.


I think the advantage is largely in a few select sectors right now, and mainly due to ecosystem.

Music production is a big one from my perspective -- currently, if you are looking at the tools and plugins that are most popular in the professional world, your practical choice is basically between Macintosh and Windows.

The majority of DAW plugin synths/effects are not currently compiled for Linux, and I do not believe many popular DAWs (stuff like Ableton Live, Logic, Cubase, etc.) are supported yet either. I have heard that if you keep it lightweight, some audio plugins will run just fine under something like WINE. But for a heavy-hitting plugin like, say, Spectrasonics Omnisphere, I have heard that emulation is too slow to be practical.

There are certainly native Linux DAWs and plugins out there; you can probably go quite far with Ardour or the Linux version of Reaper, and there are a few plugins too (u-He has native Linux builds of their excellent synthesizer plugins, for instance). It's just that the native ecosystem is quite a bit smaller, unfortunately.


That has always been Apple's MO, ever since their "Think Different" marketing campaign, when they realized they could convince the sheep to follow if they associated the brand with the brightest minds and celebrities (e.g., put an Apple logo next to Einstein, next to an astronaut, etc.) without actually offering a better product.

It's always been a fashion statement to own an Apple product...

What bothers me is how the technical community, both software engineers and academics, have fallen into this trap.

A new Win10 PC is a much better development machine. Sorry, but I prefer to Think Different and don't care about what is fashionable; I make my choices based on specs, utility, and value.


Since I am part of this aforementioned technical community, I will spend a few extra dollars in order to get: 1) a high resolution screen, 2) a large trackpad that works well, 3) a native "unix like" shell, 4) sleep that works 100% reliably when I close the lid.

I don't consider it a fashion statement; I just want it to work. After 20+ years I don't like to tinker with my desktop anymore. I would go back to GNU/Linux for the desktop, but finding the right hardware with the above specs is a challenge if not impossible (I have yet to find a trackpad driver that works as well as Apple's).

I have not reevaluated in 2018. If there is anything better, I would like to hear specific examples of hardware and OS combinations.


> native "unix like" shell

Funnily enough, IIRC Darwin isn't Unix-like - it's an actual certified UNIX™ :) Although in the modern ecosystem, perhaps it's more appropriate and useful to say that it's a Linux-like :-)


Most people I've heard this rant from, when further questioned, have barely touched a Mac and don't know anything about macOS to be able to substantively trash it.

Approximately how many cumulative weeks have you spent in OS X / macOS?


I unfortunately took a job at a company that thought they were being fashionable and progressive by forcing each employee to use a Mac. They saw it as a benefit. It was idiotic.

95% of their workforce was using Excel for reporting tasks. But, of course, this was Mac Office 20xx (08, 12?), when the rest of the world was on Office 2016+. Half the features weren't available, or required people to hold down 8 keys simultaneously to work.


Right. I guess, for me at least, it might have existed to an extent before, but since the machines were good value and it was celebs doing the marketing, it didn't bother me much. Now it seems like they are desperately trying to push those things onto crowds I do not want to see corrupted like this, namely the technical community and engineers.


Win10 is known for auto-upgrades now. That's a separate issue from the telemetry.


they can convince the sheep to follow

Baaa


As a side note, it is a good thing that Apple is supporting 16Gbit DDR4 on time and on schedule (this was not always true in the past).


TLDR - Overpriced. Buy something else.


The pricing is a joke.

A refurbished Mac Pro with dual GPUs and SSDs is much better value.


Off topic, but Marco has lost a bunch of weight! Not sure I love the beard though.


A $2500 Windows desktop will give you:

1. A much better discrete GPU which can run titles in 4k.

2. Ability to run games, because Windows has DirectX.

3. Ability to run Linux natively using WSL.

4. Expandability - You can add components like more RAM or an additional GPU.


> 3. Ability to run Linux natively using WSL.

But what are you doing in WSL that you can't do on a Mac (or, of course, a Linux machine) without a 'subsystem'?

Sincere question, because the 'Mac is not Linux' things that wind me up tend to require X11 to fix, and WSL would be no different.


Mac ships with its own POSIX subsystem; WSL allows you to install a bunch of popular flavours of Linux on top of Windows. I personally run Arch Linux and have access to a lot more bleeding-edge packages.


To clarify, Mac ships with a native Unix [1], derived from BSD, not a POSIX subsystem. It has its own init (launchd) and its own GUI, but most of it is a Unix.

Using MacPorts or Homebrew gets you most of the packages that are available for the various Linuxes, GNU utilities, etc.

[1] https://en.wikipedia.org/wiki/Darwin_%28operating_system%29


> WSL allows you to install a bunch of popular flavours of linux on top of Windows. I personally run Arch Linux

Oh, I didn't know that, that does make it a bit more interesting to me!

I still think I'd be frustrated (as I am on Mac) with the lack of a configurable (or even of a choice of) window manager though.


Yet it doesn't have the one thing someone would care about: macOS.


The review was good but I'm honestly surprised that Marco, of all people, would have such a badly dubbed video. Get Casey Liss to help you, his audio in videos isn't as good as your audio but it matches his voice at least. #AccidentalBadVideoDubbing



