If this thing can run even just the Xcode command-line utilities, it'd be a huge asset for cross-platform projects, build automation, etc. Definitely keeping an eye on this!
Little-known fact, but you can build most of the utilities on Linux, and if you use a Mac to prepare some of the bits, you can end up with a complete build environment. Not Xcode itself, but enough to be able to build from source, afaik.
If Apple isn't getting their developer license fees and hardware bought, they will find a way to make it impossible.
Apple needs to force developers to use their hardware or it won't have a native/refined feeling and developers will stick to Windows/Linux without the extra step of Apple.
>Apple needs to force developers to use their hardware or it won't have a native/refined feeling
Or Apple stops making as much money in hardware sales and might actually have to lower their prices?
>developers will stick to Windows/Linux without the extra step of Apple.
Doesn't that mean Apple is already superfluous and only exists because of arbitrary restrictions created by Apple, making the entire ecosystem artificial?
>Or Apple stops making as much money in hardware sales and might actually have to lower their prices?
That was already a threat; in 2018 they stopped reporting total unit sales for a reason: the number was falling. Now they only report total revenue, which they boost by cutting corners where they can.
And they are still tightening the walled garden to further cement their position.
Blocking Xcode from running in Darling is just a question of when.
Running Darwin on Linux - Dar-ling. Plus, nobody dreams of being the sidekick. But yeah, it would have been clever - I'm still disappointed that PyQt isn't called QtPy.
Pronouncing Qt as "cute" is like pronouncing gif as "jif". Maybe someone 'officially' says it's correct, but basically no one does it, and if they do, everyone secretly judges them.
And yet Americans pronounce Audi as ow-dee. Brits pronounce Nissan as niss-an (should be knee-sawn), etc. Americans pronounce La Croix brand water as La Croy (should be ~lakwa; it's a pun on l'aqua), etc. etc. Let's not worry and just enjoy life. :)
The creator of x decided to spell it as x. One could call it entitlement for the creator to try to demand that everyone follow some unintuitive pronunciation rule (or, in some cases, a rule that made sense only in another language).
I don't trust people who do not pronounce it as jif. If you can't be bothered to learn the correct pronunciation, how can you be trusted to do other things correctly?
I assume you also spell "colour", "naïvety" and "aluminium" and don't trust anyone who doesn't? Though you're clearly not even spelling "pronunciacioun" correctly so maybe you shouldn't be trusted?
Just to drive the point home: Language is fluid, English has no organisation setting any rules and we the people get to decide how we pronounce things. And we've pretty clearly decided on what way we want to pronounce gif.
Language evolves. How the creator of a word pronounced a thing is irrelevant. Witness essentially every word in the English dictionary: none of them (possibly literally none) are pronounced as they originally were (and frequently not spelled as they originally were either) unless they are _very_ new.
the "correct" pronunciation is whatever the community decides, despite what we may individually think of it.
As such "jif" is decidedly _not_ the "correct" pronunciation because essentially the entire community says "gif"
"jif" would be the "archaic" pronunciation. Technically valid, but not used by anyone except for hold-outs and people who refuse to accept modern pronunciation standards.
You remind me of a history teacher I had in high school that lectured me in front of the entire class on how my name was being mispronounced based on the way it was spelled. At the time, I just thought to myself "what an idiot" and carried on with my life.
In the UK, the shop Lidl was supposed to be pronounced Lie-del and I think that is how it is pronounced in Europe.
But, everyone pronounced it with a short 'i' and now they've adopted that in adverts - 'Every little [lidl] helps' - doesn't work if you pronounce it Lie-del!
If they wanted it pronounced Lie-del - they should have spelt it like that - Lidle!
Horrible decision imo, since ping can easily be confused with the networking action and there’s no way an “i” could be easily assumed from just seeing “png”.
But in practice I think it is pronounced cutie, because pretty much nobody cares enough to learn enough Qt lore to know better, and cutie apparently is the most intuitive choice. At any rate, I've never heard anyone pronounce it cute, including people who develop with Qt.
According to me, and according to my professional coder friends who work with the thing daily, it's cue tee.
Same way Tcl isn't tickle, PNG isn't ping, and LaTeX only grudgingly, and with quite some self-discipline on our side, manages to remain la-TECH. These jokes were marginally entertaining 30 years ago, tired by the turn of the century, and way past expiry today.
I wasn't born when people started pronouncing Houston St in New York as "House-ton" or Rodeo Dr as "RoDayO", yet that's the accepted pronunciation so that's what gets used.
Well, Houston St in New York is actually named after a person who pronounced it "House-ton", not after the same Sam Houston as Houston, TX. The street's namesake used an alternate spelling of the last name (spelling was in general less standardized back then).
Note for Brits and Canadians: "house-ton" is not how you pronounce Houston St in New York, it's "how-stun" (IPA /ˈhaʊstən/). In contrast, Houston, TX sounds like "hew-stun" run together (IPA /ˈhjuːstən/).
I checked out their repos on GitHub; at first glance it seemed somewhat low-activity (but not dead), but there's actually a lot of activity on the ARM branch at least.
This is somewhat of a philosophical question, a Ship of Theseus problem: you are only licensed to run macOS on Apple hardware, but what if I had, say, an Xserve and replaced a hard drive? What about a motherboard? Would a Franken-Xserve with an AMD Threadripper still be 'Apple hardware'?
Supposing you could adapt the ROM/EFI/SMC to work with the new motherboard (I'm not claiming practicality here, just possibility), perhaps by desoldering and resoldering the respective chips onto a new daughterboard or even a PCI card.
Does that make it a Mac still? You could literally replace almost every component in a Mac except for the motherboard and have it still be a Mac, but shouldn't the converse be true as well? If I used the original case, hard drive/SSD, video card, keyboard & mouse, even the CPU, but replaced just the logic board (swapping the ROM/EFI/SMC somehow), it seems to me it would still be considered a Mac. Particularly if in the process the original logic board was rendered inoperative to head off any claims of duplicate license use.
No, it's not related to Darling, but it did prompt the question for me of whether it is [legally] possible to 'upgrade' an original Mac to modern components and still have it be considered a Mac. Clearly a hard drive, RAM, or video card replacement is acceptable; the CPU, if it happens to be replaceable; keyboard, mouse, monitor. Really the only thing that might remain is the motherboard. But if you kept that and replaced anything else, shouldn't that count as still being a Mac?
Windows 7 (and 8?) had an activation procedure that would be blocked if the hardware had changed since the last activation (I can't remember exactly which hardware they hashed). They have since gone back on the restriction, and Windows 10 has more forgiving activation rules.
I remember hitting that on Vista when I swapped out a motherboard that died, circa 2007. It asked me to call a phone number. So I did. I explained to the phone rep what I did to cause the error. They asked me where I got my copy of Windows. I told them. They unblocked me.
To me it was amusing that they put up this barrier, but hired a staff of humans to make sure you could bypass it.
I suppose that could be a gate to prevent people from installing on more machines than the license allows, if done at scale, while not locking out legitimate reinstalls by paying customers. At a cost to MS, of course: employing that staff.
I believe the scale part is correct. They don't want anyone sharing the software, but they especially want to stop people from selling illegal copies and to stop businesses from using a single license across a hundred machines.
Most of the Xcode build system is non-GUI. Having things like xcodebuild, codesign, altool, etc. running on a non-Mac would be great for continuous integration purposes. (If it didn't violate some arbitrary Apple rule.)
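As a sketch of what that would enable, here's roughly what such a CI job might look like on a Linux runner (GitHub Actions syntax; the project name, scheme, and signing identity are placeholders, and this assumes the Xcode command-line tools were somehow usable there, e.g. via Darling):

```yaml
# Hypothetical workflow fragment -- "MyApp" and the signing identity
# are made-up names for illustration only.
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build
        run: xcodebuild -project MyApp.xcodeproj -scheme MyApp build
      - name: Sign
        run: codesign --sign "Developer ID Application: Example Corp" build/MyApp.app
```

Today the `runs-on` line has to name a macOS runner instead, which is exactly the constraint being discussed.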
I have strong doubts about the legal enforceability of those requirements in the EULA in many jurisdictions. I strongly doubt such a requirement would fly in many European countries, for example.
Statically linking the library might be preventable because it involves redistributing Apple's code, but if you run a legally obtained program on your own computer to generate code that doesn't contain any of those libraries, I doubt there's anything in the EULA that can prevent you from developing that way.
I agree; this provision is probably only there to knock down people trying to run macOS and related software on cloud services as a managed offering (think MacStadium, but not using Apple hardware). Using it just for your org or for personal use isn't going to cause Apple to send you a C&D letter or file suit.
Some places still observe the "first sale doctrine" which means that if you legally got a copy of a piece of software, you have a right to use it whichever way you please -- so, in such a jurisdiction, as long as you obtained it legally, you have the right to use it.
(Analogy: you buy a music CD. You do not need a specific license to listen to it, and you're not violating anything even if it has a sticker that says "by opening this sticker, you hereby agree never to play this using a Sony CD player, and never to play it at faster than 1.25x on any non-Sony CD player".)
My unit tests and some integration tests. Yes, it's not as good as running them natively, but running Linux natively, and Darwin and Windows under Darling and Wine, is a very nice 80/20 for some of my work.
I've recently been developing a cross-platform (GUI) app in Python + Qt. I develop on a Windows machine, but I want to run PyInstaller on a Mac to make a Mac package available. Unfortunately, not having a Mac, I can't do this simple thing. I played around briefly trying to (illegally) get a macOS VM running, but gave up. This would solve my problem (well, apart from testing that it actually works as expected on a Mac, but it's much better than nothing).
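For context, the macOS-specific part of a PyInstaller build is the BUNDLE step in the .spec file, and PyInstaller only performs that step when the build itself runs on macOS, which is exactly the sticking point. A sketch of such a spec (a config fragment, not standalone Python; 'main.py', 'MyApp', and the bundle identifier are placeholder names):

```python
# myapp.spec -- PyInstaller spec file (evaluated by PyInstaller,
# not runnable as plain Python). All names here are illustrative.
a = Analysis(['main.py'])
pyz = PYZ(a.pure)
exe = EXE(pyz, a.scripts, name='MyApp', console=False)
# The BUNDLE step wraps the executable into a .app; it only happens
# when this spec is processed on a Mac.
app = BUNDLE(exe, name='MyApp.app', bundle_identifier='com.example.myapp')
```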
> trying to (illegally) get a MacOS VM running, but gave up.
I've tried this myself and failed.
But I genuinely wonder why you think this is illegal. I mean, do you think that Apple's T&Cs are laws? I can't think of anything else that activity would be in violation of.
They better figure this out and release some good VM support because I have very little interest in buying all of their stuff just so I can run my build.
I did get the VM to work eventually and have been experimenting with builds on that. The VM is slow so I'm starting to lose interest in doing any more grunt work for an extra 10% of revenue. Ah, who am I kidding. I'll do it because I'm just as greedy as they are.
Apart from breach of contract (which is already mentioned), there’s also the practical issue of acquiring a license. Apple no longer sells licenses for macOS separately. This means that getting ahold of a macOS license without resorting to copyright infringement will be difficult.
I've never had problems with getting a macOS VM working (using VirtualBox) --- all the stories about Hackintosh made me think it was going to be very difficult --- and even Apple seems to freely let you download from their servers, without any authentication, all the files that can be made into an installation ISO (search keyword: InstallESD.dmg), so I was indeed quite surprised the first time it booted and installed successfully. I didn't even have to do any specific driver-hacking or similar. I suspect Hackintoshing on real hardware is a lot harder.
Doesn't that have the same problem? You have to get hold of a CI server running OSX, either physical or in the cloud, and neither is easy since they're expensive and non-rackmountable.
You don't need a dedicated server running macOS, which would be more problematic, as you say. There are plenty of CI services supporting macOS. Looking at the pricing sheets of CircleCI, GitHub Actions, and TravisCI: CircleCI was the only one listing a lower maximum concurrency and varying cost based on machine size and type; the others only listed total minutes without differentiating between machines. As long as you are using it only for making builds and daily testing, rather than anything compute-intensive that takes a long time, the prices should be reasonable. There are also free tiers for open source projects.
In my experience developing cross-platform open source software, doing builds isn't the biggest problem. The situations where you need to debug platform-specific behaviour, check integration with the OS, and verify that your software is properly packaged for macOS are when you might want direct access to a real machine.
I'm guessing that is because they need dedicated workers for macOS, while Linux/Windows jobs can just run on any node. Macs also lack data-center features like IPMI, so running them in a headless setup is not as easy.
Basically, Apple forces everyone to run macOS only on its own hardware, and its own hardware is unsuitable for the use case.
Sure, it's totally reasonable. Just pointing it out as the parent comment said that they didn't differentiate (and IIRC it's not too obvious on the pricing page).
After searching for a while I was able to find it, but it wasn't easy. It seems scammy to claim that you get x minutes on their main pricing page without even an asterisk. Imagine going to a store to buy 1 kg of fish and, after you've paid, being given only 100 g and the suggestion that you can have 1 kg of potatoes instead. Others at least call them execution credits, making it clear there is some non-obvious mapping involved.
The driver comes with a command-line program that processes the input PostScript(?) data into a proprietary image format the printer understands. This program is a native Mach-O executable that cannot run under Linux (without Darling or the like). (It's called "RicohAficioSPC231SFFilter".)
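For anyone curious how such a vendor filter gets invoked: CUPS wires filters into the print pipeline via a cupsFilter line in the printer's PPD, something like the fragment below (the MIME type and cost value here are illustrative assumptions, not taken from the actual Ricoh PPD):

```
*% Illustrative PPD fragment. CUPS converts the job to the stated
*% input format, then runs the named filter to produce the printer's
*% proprietary format -- which is why the filter binary must run.
*cupsFilter: "application/vnd.cups-raster 100 RicohAficioSPC231SFFilter"
```

So even with open-source CUPS on Linux, the pipeline dead-ends at that one Mach-O binary.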
Those two aren't particularly interesting outside macOS/iOS -- they're mostly just front-ends. The real implementation is in diskarbitrationd and the kernel.
I wish they took more of a ReactOS kind of approach. Namely, use all the open-source parts that go into macOS, like the kernel and CoreFoundation, and write the proprietary ones from scratch. Then you'd finally have a desktop OS with no compromises and a huge existing app ecosystem (and a consistent GUI).
That was tried with OpenDarwin, but Apple was not a fan of the project IIRC.
More generally, it doesn't offer much of practical value to use the Darwin kernel which is at this point pretty tied to Apple hardware that comes with OS X anyway. You can still Hackintosh but it just adds a lot of additional complexity that you won't have to deal with on Linux.
The point is that desktop Linux is still pretty much a bunch of standalone components loosely duct taped together. Sometimes they fall apart, but even when they don't, it still shows. Linux GUIs still feel like an afterthought.
upd: googling "OpenDarwin" brings up, as the first two results, news of it closing, in 2006
And working with the Darwin kernel would offer zero benefit in fixing this. The GUI coming out the other side would still rely on all the same technology as Linux desktops do, just as is the case on BSDs. Using a different UNIX kernel doesn't change any of that, and any effort made on the Darwin side would be easily ported to Linux. The GUI components of Mac OS are not open source.
> The GUI coming out the other side would still rely on all the same technology as Linux desktops do
Why? Where does this logic come from? And apparently it's common, because PureDarwin mentioned in another comment does just that for some reason.
IMO, the right thing to do would be to replicate Apple's proprietary stack as closely as possible. Like, literally take the real macOS and start replacing essential proprietary frameworks like Cocoa and Quartz and core apps like Finder and Dock with fully API-compatible open-source ones. You'll eventually end up with a fully open-source operating system capable of running macOS apps.
That would have to be done from scratch as literally none of that is open source. While it could be done, the same amount of effort spent building a better Linux desktop would be more efficiently spent if the goal was to just get a high-quality open source desktop offering.
There's also some hairiness legally about working from within the official Mac OS system outwards AFAIK.
> That would have to be done from scratch as literally none of that is open source.
That's the point. ReactOS doesn't use any of the Linux parts, and it's reimplementing everything from scratch because there are no open-source components in Windows afaik.
It would offer access to the mac software ecosystem, particularly the (imho) cohesive set of underlying frameworks. Just like linux is "just" a kernel but is in reality a center of floss development, actively organizing around bringing binary compatibility can bring incidental benefits.
That said at this point I don't see anything inherently easier or beneficial about this than creating some other project around an alternative paradigm than PC derived desktop systems on top of some existing better supported kernel.
That said contrary to what the previous poster implied I don't think either OpenDarwin or PureDarwin aimed at reproducing the proprietary bits—they were/are essentially quirky, buggy, poorly documented BSDs just as reliant on the floss ecosystem as much better supported stacks.
> The point is that desktop Linux is still pretty much a bunch of standalone components loosely duct taped together.
To some degree it is. On the other hand this is also why I prefer to use Ubuntu or Debian over any other distribution. They are highly integrated and opinionated in the sense that they keep the packages as unmodified as possible.
Linux GUIs are extremely inconsistent and just generally feel like an afterthought. Out of all GUI desktop operating systems I've used, nothing comes close to macOS as far as consistency goes.
Besides that, Linux desktop is this loosely stacked pile of different components that sometimes falls apart. No one wants to spend 50% of their time writing config files and setting up drivers instead of actually doing something useful.
I don't understand how you can make sweeping generalizations about every desktop environment that runs on Linux. Every DE is an afterthought? Just the fact you call them "Linux GUIs" makes me question how much you have even used any DE with Linux.
I spend no time writing system config files. Drivers are part of the kernel, so I also spend zero time on that. Brand-new laptop hardware sometimes takes a while to get support, but it is really easy to check before you buy.
It is perfectly OK to prefer a padded, walled garden. Just like it is OK to prefer a modular OSS Linux environment where everything is knowable. Nobody needs to be "right", and nobody needs to bash the other with unsubstantiated claims.
I do, of course, know that there are many DEs and window managers. I generalize them quite purposefully. The point I'm trying to make is that on Linux, there are hardly any well-defined standards for anything beyond system calls. There also are no UI guidelines and there's no single accepted UI toolkit (some apps use Qt, some apps use GTK, some use something else). This leads to overall subpar user experience. Many apps have UIs that make you think before you do something with it. A good UI doesn't make you think, it feels like a continuation of your thoughts right when you see it for the first time. Consistency helps a lot here. (FWIW, the way GUI apps feel and behave doesn't depend much on your choice of desktop environment.)
I don't like walled gardens — I despise Apple's service ecosystem and the app store. I just prefer things to work sensibly out of the box. The only reason I made an Apple ID is because it won't let me download Xcode without one.
That really depends on the distro. It sounds like you're also talking about how Linux was 10 years ago. I've run Mint for the past decade and I personally consider it a fully featured OS which does everything I need it to do. I admit there is some inconsistency in GUIs depending on the software which you use, though that is simply the nature of the beast. The reason Mac is so consistent is simply due to the fact that you're in a walled garden which Apple fully controls.
Why bother following the clearly failed model? ReactOS has been around for more than 20 years; how many people are actually running ReactOS in production?
ReactOS got too ambitious IMO. If they had stopped chasing versions at say XP/2003 then it may have matured more quickly. That said they are getting closer to useable with every release. Like FreeDOS before it I suspect it will fill a niche where Microsoft binaries are too expensive yet compatibility is important.
Not targeting versions beyond XP is what ReactOS does, and pretty much always has done, to my knowledge. It's explicitly mentioned in its doco that it targets Windows NT 5.2. Indeed, this is the first thing dealt with in the ReactOS FAQ on its wiki.
I found this disappointing at the time that ReactOS was announced, as I recall, because there were a few things that were architected better in Windows NT 6.
If they stopped at 2003/XP then no modern web browser would run on it and the system would be even more useless than it is today. If you want any decent usage out of the system, you need to support a whole bunch of APIs from at least Windows 7 (probably even Windows 8, now that 7 is out of support) because people stopped compiling for XP ages ago.
Looks like they are open to implementing newer APIs in the future. So if the version reported can be spoofed or modern browsers only feature sniff then there is hope.
No, iMessage is heavily controlled by Apple to the point where even Hackintosh computers running the entire Mac operating system can't always get it working. Without a real Apple device (or: a real Apple serial number cloned to your device), iMessage has little possibility of ever working, even if this project evolves to GUI applications.
Yeah, this is one of those myths that persists because it has some truth—if you aren't careful to generate a serial number that logically matches other (generated) parameters about the machine—model number, board id, etc—then iMessage won't work. But it's very much possible to do!
If Zeus and other gods were able to exist under certain conditions that were reliably reproduced multiple times, they wouldn't be called myths anymore but real.
It would be really nice to be able to debug Safari from Linux, actually. The other day I had to fix a Safari bug by relying on friends to test things for me, because I didn't have access to a recent enough Apple device.
yeah, my plan b would have been buying some time on lambdatest.com, which is the same idea, but it was just a small thing on a personal project, so I was trying to get around having to pay money. thanks for the tip, though, I hadn't heard of that one
Sketch. At my company we use a mix of PC running Linux and Mac. Being able to run Sketch on Linux would open perspectives for designer types. Although the switch to Linux would meet opposition for other reasons. :)
The home page of Darling specifies "experimental support for running simple" GUI. Does anyone have any idea whether Metal support (which Sketch uses) is out of the question?
That leaves me, as a Linux user, out of our pair-programming sessions (Jitsi and Google Meet with screen sharing do solve some of it). Not sure if this is a blessing or if I'm really missing out.
iTerm2 is markedly different from the terminal emulators available for Linux. There's a whole bunch of stuff in it that, amongst other things, tries to couple it to the operation of a shell. macOS users asking how to do what they do with iTerm2 in other terminal emulators is a fairly common thing.
1. Apple hardware
2. Bare-metal linux OS
3. Emulation/shims to run MacOS software on top
Sure, it seems like a horrific Frankenstein's Monster of a setup, but since I can't have the best commercial applications, running on the (IMHO) best OS, running on (IMHO) the best hardware, this is a pill I would be willing to swallow.
> A: Almost! This took us a lot of time and effort, but we finally have basic experimental support for running simple graphical applications. It requires some special setup for now though, so do not expect it to work out of the box just yet.
GUI support seems not to be ready yet, but when it is, it looks like they're using cocotron, so it'll look something like any of the examples you see on this page: http://www.cocotron.org/Examples/
That should be possible without this approach. I found this project (which seems dead), but iCloud Drive itself should be fully possible to implement without a full translation layer.
> We aim to fully integrate apps running under Darling into the Linux desktop experience by making them look, feel and behave just like native Linux apps.
If this goes beyond just the "window dressing", it may cause UI issues
Could Apple do that? The project is distributed under GPL-3.0, which allows commercial use, but apart from that they're not using it in any way that could be detrimental to Apple's business.
I think the insinuation is that in order to run iOS apps, one would necessarily need to infringe on Apple's copyrights.
But that's not necessary. There is open source code at the base of macOS and iOS, and it's only that being used currently. Even to go further and create a full clone of iOS would be fine, if (and only if) a "clean room" implementation were made.
As I understand it, it's not a binding precedent yet. The Federal Circuit ruled in Oracle's favor on that question and the Supreme Court denied certiorari, but the Federal Circuit wasn't really supposed to be hearing a copyright question to begin with. They're the appeals court for patents, and on other issues tied up in a patent case they are only supposed to follow local circuit precedent (Ninth Circuit, in this case). So until the Supreme Court grants certiorari on this case or another case with the same question comes before a local circuit, there can't be any binding precedent on this matter that would apply to new cases outside Oracle v Google.
They denied the original appeal over copyrightable APIs. There was a second trial over Google's fair use argument and the jury ruled for Google. Oracle appealed and the Federal Circuit ruled against Google again. That decision was appealed to SCOTUS on January 24, 2019. Google's petition challenged both decisions by the Federal Circuit: APIs are copyrightable and Google's use of the Java API wasn't fair use. Certiorari was granted November 15, 2019 and arguments were planned for October 7, 2020.
But that doesn't indicate automatic infringement if used. The current decision in Oracle vs. Google says Google's implementation of Oracle's copyrighted Java APIs is fair use. Oracle is appealing, so maybe that will change, but the point is "copyrightable" still comes with "fair use."
Corellium is a low-level iPhone emulator, which is furthermore only provided as a cloud service with Apple's code built-in. They're infringing on Apple's code in so many places that I'm pretty much convinced no lawyers were in the room when they were developing this.
Traditionally emulators are distributed without any infringing code; it's up to the user to extract what they want to emulate from devices they own. For example, 3DS emulators don't ship with a copy of Nintendo's system software, they instead require you to buy a 3DS, hack it, and use some GodMode9 scripts to extract the relevant files. Similarly, one could have developed an iPhone emulator that requires the user buy and jailbreak a phone, and dump the relevant OS files (or extract it from an IPSW available from Apple, etc). This would have most likely been legal.
Projects like Darling are a step removed from even that. Instead of providing a tool that lets the user run Apple's OS in emulation, you instead write your own OS that provides all of the relevant APIs/ABIs necessary to allow software dependent on Apple's OS to run. This tool can then be distributed entirely freely with zero Apple ownership involved. This approach is entirely legal, at least for another week or two, depending on if the Supreme Court understands what an "API" is better than the Federal Circuit.