IIRC, at least a few years ago, Mumfrey had a tendency to develop Mixin in private and only push the commit backlog for releases, leading to periods of time where no activity was publicly visible. (Also IIRC, this is part of the reason why the FabricMC fork exists.)

Does it literally check for the specific solution?

Writing

    to = append(to, ctrl)
which is functionally equivalent and, in my personal opinion, clearer in intent (ctrl = 0 at that point in the code), returns "incorrect".

In fact, it seems that any placeholder value should work - as it is always overwritten by the final value of ctrl for a given set of bytes at the end; however, the checker rejects this.


Your fix is wrong, so the site rejects it.

Go doesn't do integer type conversions implicitly, to avoid the implicit-conversion bugs endemic to C: it (thankfully) won't convert an int to a byte on its own. You must write the conversion explicitly.

So you would have to say

  to = append(to, byte(ctrl))
and that's correct.

The site builds and executes the code you submit, and any correct solution is accepted.
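
To make that concrete, here is a minimal, self-contained Go sketch (not the site's actual exercise - the ctrl-then-patch flow, the ctrlIndex variable, and the 0x42 value are illustrative assumptions based on the snippets above):

    package main

    import "fmt"

    func main() {
        var to []byte
        ctrl := 0 // control value accumulated elsewhere; an int, as in the snippet above

        // to = append(to, ctrl)    // rejected by the compiler: cannot use ctrl (type int) as type byte
        to = append(to, byte(ctrl)) // explicit conversion compiles and appends a placeholder byte
        ctrlIndex := len(to) - 1    // remember where the placeholder went

        // ...once the final control value is known, overwrite the placeholder in place
        ctrl = 0x42
        to[ctrlIndex] = byte(ctrl)

        fmt.Println(to) // prints [66]
    }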


The Linux kernel did not even compile on most non-GCC compilers (like LLVM) as recently as a few years ago: https://www.phoronix.com/news/Clang-Kernel-2018

Therefore, for the Linux kernel specifically, I think the only concern is whether GCC remains supported in addition to LLVM - as GCC and Clang are, as far as I know, the only two compilers which can actually be used to build the kernel right now. There's work being done on both gcc-rs and rustc_codegen_gcc to allow using GCC as a Rust backend, meaning that all platforms currently supported by the Linux kernel should eventually be supportable without porting a compiler backend.


> The Linux kernel did not even compile on most non-GCC compilers

I believe that was a feature.

non-GCC compilers, like Clang+LLVM, are (were?) considered to be open source but not Free Software (TM)


The kernel community has never been particularly interested in the whole free vs open source software debate. The leadership is mostly pragmatic people.

It’s just that Linux used GCC extensions and no one was interested in doing the work necessary to have it compile on a non-GCC compiler.


That's plainly not true: there are plenty of intra-kernel interfaces that are tagged as GPL only, and if your kernel module isn't GPL-licensed, you can't use said interfaces. For example, they added the GPL tag to the floating point context switch functions, which broke ZFS at the time.


I don’t understand your comment. The kernel cares about its license which is indeed the GPL, yes.

But Linux didn’t go out of its way to be incompatible with ZFS out of ideology. Sun intentionally picked a license which would not be compatible. That’s why ZFS lives outside of the main tree.

The ideas driving Linux kernel development (“shared source code leads to better code”, “userland should never be broken”) are very different from the ones that led to GCC gimping itself. Generally speaking, the kernel community is very technically oriented and doesn’t go out of its way to prevent things for social reasons.


The kernel devs went out of their way to label a whole bunch of kernel functions as GPL only. It's not about mainlining third party kernel modules, it's about not letting non-GPLed modules use certain functions in the kernel. This includes the fully open source OpenZFS project. It's not about CDDL / GPL incompatibilities. For OpenZFS it was some FPU context switching functions that had to be worked around when the kernel team labelled those functions as GPL only.

See: https://github.com/openzfs/zfs/issues/13042


You seem confused. There is no mysterious GPL only label in the kernel. The whole thing is licensed under the GPL v2, period. It’s just that some ABIs were broken for an unrelated reason, as Linux doesn’t guarantee ABI stability, and OpenZFS can’t find an alternative which satisfies their dependency needs. The kernel team doesn’t care about out-of-tree code when making changes. It has always been the rule.


> There is no mysterious GPL only label in the kernel

EXPORT_SYMBOL_GPL


Yes, you are right. I didn’t know the API was tagged. My take seems indeed a bit too extreme.

The kernel does care about enforcing the GPL explicitly on some of the interfaces it presents to modules to ensure the openness of the code, which is indeed a form of statement in favour of open code.

I don’t think the situation was the same regarding compiling only with GCC - after all, Clang is free software - and I think the heart of my argument still holds: the kernel community makes decisions mostly for reasons related to the kernel - even there, they just want to force code to be mainlined - rather than for a movement like the FSF.


> The kernel community has never been particularly interested in the whole free vs open source software debate

that's total BS

are you telling me that Alan Cox had no involvement in the Free Software movement?

https://en.wikipedia.org/wiki/Alan_Cox_(computer_programmer)

> It’s just that Linux used GCC extensions and no one was interested in doing the work necessary to have it compile on a non-GCC compiler.

the GCC extensions were essential to enforce the GCC supremacy because no other non-free compiler could implement them


> are you telling me that Alan Cox had no involvement in the Free Software movement?

I don’t see how your statement contradicts or is even linked to mine.

The kernel community as a whole has very little interest in the philosophical arguments surrounding open source. Apart from being convinced that sharing code is the best way to develop a kernel, they have next to no active involvement in the whole charade.

See for example: keeping GPL v2, not opposing Tivoization, and disapproving of proprietary drivers and binary blobs on technical merits while still allowing them.

> the GCC extensions were essential to enforce the GCC supremacy because no other non-free compiler could implement them

Linux uses GCC extensions because they are handy and GCC was the compiler everyone used to compile C projects for a long time. It’s not intentionally done to promote GCC on ideological grounds, something pretty much no one cares about in the kernel community.


> The kernel community as a whole has very little interest in the philosophical arguments surrounding open source

because they didn't have to.

Someone had already established that free software was the foundation: people like Linus Torvalds, Alan Cox, Maddog Hall and many (not too many, actually) others.

The "community" for the longest time has been a bunch of people

"is a small and well-defined group: Linus, Maddog Hall, Alan Cox, and somewhere between 6 and 12 others (varying at times)." (Steven Suson, 1999)

"Watch the linux-kernel mailing list. The "Inner Circle" becomes very obvious. People go in and out of the Circle, so a list probably isn't possible [...] I would say it includes may be 2 dozen people." (Eric Princen, 1999)

> It’s not intentionally done to promote GCC on ideological grounds, something pretty much no one cares about in the kernel community.

again: you're talking about the present; I am talking about the first two decades

you may have forgotten about it, I did not.


Torvalds was never a free software zealot in the way the FSF views the movement. He seems to believe open code leads to better code but I don’t think he is against the idea of closed source. He has worked on closed source software himself if I’m not mistaken.


> Torvalds was never a free software zealot in the way the FSF views the movement

Who said anything about zealots?

please, stop putting words in someone else's mouth.

Linus was a big supporter of free software, and the fact that the Linux kernel was free software is what compelled many developers to donate their work for free.

They didn't do it to improve NT Kernel or Solaris kernel or... you know it.

---

Software is like sex: it's better when it's free.

-- Linus Torvalds


Of course LLVM is Free Software. It’s available in the mainline Guix and Debian repositories, and its Apache license is recognized as Free by the FSF.


If it was a feature, what's the benefit?


benefit != convenience

for some at least, and it was a goal in the first two decades of the Linux kernel community

TL;DR: LLVM and Clang allow non-free (proprietary) extensions

The Clang and LLVM developers reach different conclusions from ours because they do not share our values and goals. They object to the measures we have taken to defend freedom because they see the inconvenience of them and do not recognize (or don't care about) the need for them. I would guess they describe their work as "open source" and do not talk about freedom. They have been supported by Apple, the company which hates our freedom so much that its app store for the ithings _requires_ all apps to be nonfree. (*)

The nonfree compilers that are now based on LLVM prove that I was right -- that the danger was real. If I had "opened" up GCC code for use in nonfree combinations, that would not have prevented a defeat; rather, it would have caused that defeat to occur very soon.

For GCC to be replaced by another technically superior compiler that defended freedom equally well would cause me some personal regret, but I would rejoice for the community's advance. The existence of LLVM is a terrible setback for our community precisely because it is not copylefted and can be used as the basis for nonfree compilers -- so that all contribution to LLVM directly helps proprietary software as much as it helps us.

The cause of the setback is the existence of a non-copylefted compiler that therefore becomes the base for nonfree compilers. The identity of that compiler -- whether it be LLVM, GCC, or something else -- is a secondary detail. To make GCC available for such use would be throwing in the towel. If that enables GCC to "win", the victory would be hollow, because it would not be a victory for what really matters: users' freedom.


> it was a goal in the first two decades of the Linux kernel community

do you have any evidence of that? and why are you quoting RMS while talking about development of the linux kernel?


> do you have any evidence of that?

Yeah, the fact that only a few years ago Linux was finally buildable with Clang.

If you have any proof that there were other reasons besides "nobody was working on that and the Kernel was using GCC extensions not found anywhere else" I'll be willing to look at them

These debates have been going on since the mid 90s; I don't know why people still ask questions about them. It's ancient history: GCC was the de facto compiler for Linux because Linus decided so, and everybody else followed.

If there was no other reason than a philosophical one (the technical one is the GCC extensions) you could have built Linux with MSVC in 1997.

Reminder: the first two decades ended 11 (ELEVEN) years ago.

We are in the fourth decade right now.

> and why are you quoting RMS while talking about development of the linux kernel?

Because Stallman was in charge of GCC, and the histories of GCC and Linux are almost inextricable from 1991 onwards.


> If you have any proof that there were other reasons besides "nobody was working on that and the Kernel was using GCC extensions not found anywhere else" I'll be willing to look at them

I can't cite sources, but AFAIK Linux uses GNU compiler extensions. The reason is not to lock other compilers out, it's just that some of those extensions are genuinely useful.


> I can't cite sources, but AFAIK Linux uses GNU compiler extensions. The reason is not to lock other compilers out, it's just that some of those extensions are genuinely useful.

Of course.

They are genuinely useful to Linux.

Others did not use those extensions because they did not want to use GCC - not because of its technical merits, but because GCC is free software.

If they could have copied them, they would have done it.


For something to be considered a feature, it has to be beneficial to someone. I'm not arguing that the benefit should manifest in convenience. Who benefits from the Linux kernel not being able to be compiled by a certain compiler, and how?


> For something to be considered a feature, it has to be beneficial to someone

It was beneficial to Linux, the open source and free software (as in GPL-licensed) kernel.

It was beneficial to its users.

It was beneficial to the free software movement.

It wasn't beneficial to corporations, probably.

Things in fact changed when corporate-sponsored interests rose around Linux as a money-making platform, and when Apple started its war against GPLv3 (suddenly GCC was not good for them anymore).

> Who benefits from the Linux kernel not being able to be compiled by a certain compiler, and how?

Again: the fact that Linux could not be compiled by a certain compiler doesn't seem to me to have hindered its ability to become the most used platform in the server space (or one of the most used).

Any counter proof?

-----

As already stated: beneficial != convenient

Going to the doctor for regular checks is not convenient, but it's beneficial for your health.


Linux remained GPL, compiled with clang or not. It didn't become possible to use Linux in a non-free way, did it?


> Linux remained GPL

Apple started to use Clang because GPLv3 was stricter than GPLv2.

That's why you can compile Linux with Clang, because Apple did not want to use GCC anymore!

That was a feature back in the day.

It forced corporations to adapt to Linux, instead of the opposite.


And apple "is" vanguard/blackrock, which is msft/google/starbuck!/etc, keep that in mind.

Apple doing open source feels more like PR than anything else (they do "maintain"/"employ the main dev of" CUPS, if I am not mistaken).

And yes, open source is not enough anymore: we need "lean" open source, and that includes the SDK (which de facto excludes the ultra-complex C++ and similar).

The really hard part, once a piece of lean open source is mostly "done", is to resist planned obsolescence.

That said, I am an "everything in RISC-V assembly" (with interpreters of high-level languages written in assembly) kind of guy.


The key words are "optimized" and "OpenGL 4.6 with Zink". "Functional" and "OpenGL 2.1" is a different story, and the same trustworthy source said in https://rosenzweig.io/blog/asahi-gpu-part-6.html that:

> thanks to the tremendous shared code in Mesa, a basic OpenGL driver is doable by a single person. I’m optimistic that we’ll have native OpenGL 2.1 in Asahi Linux by the end of the year.

It's likely that even a bare-bones OpenGL driver will run better than llvmpipe, which is especially important in a laptop context due to the resulting power use improvements.


Ah, so it's a "90% of the iceberg" situation. Great info, thanks!


Yes, the Apple laptops need a whole host of proprietary blobs for bringup and firmware.


For me personally, there's a few reasons:

1. It's a kind of code golf - or rather, it's about seeing how much one can squeeze out of a highly limited and fixed platform. Writing this, I think of things like Mahoney's Cubase64 ( https://www.youtube.com/watch?v=PTGkf21UpJ8 ), which uses a combination of skilled coding and creative (mis)use of hardware functionality to perform real-time effects on sampled audio data on the Commodore 64. As the demonstration itself states: "You need 32-bit, 2000MHz, 1GB at least; what if... 8-bit, 1MHz, 64KB is enough?".

2. Creating a game which can hold its own against the platform's contemporaries on an 8-bit ecosystem can reasonably be achieved by a bedroom coder in their spare time. To do so for modern consoles or machines, one typically needs a higher budget and multiple people.

For this point alone, though, there are arguably easier ways to accomplish this. PICO-8, the herald of the "fantasy computer" phenomenon, created a system inspired by 8-bit limitations in graphics and sound, but offering a modified Lua interpreter in place of coding assembly by hand, in an attempt to balance the stack more towards creativity and away from complexity. (And, of course, nobody is stopping anyone from simply making a retro-styled game with modern tools. Many indie games do exactly that. There are two separate "notable" Game Boy jams on Itch.io - one enforces games actually targeting the real platform, the other cares more about matching the look/feel.)

3. The ability to reason about every component of the system down to bare metal is a welcome retreat from writing complex, object-oriented code running on top of stacks on top of libraries on top of runtimes...

In the end, it's a novelty which tickles a particular part of my brain, I suppose. So long as I enjoy it...


For me at least, an AMD/Intel desktop box plus its multi-year power bill still is likely to come out cheaper than a comparable Apple desktop or high-end laptop plus its multi-year power bill. The lower-end Apple laptops are not an option for me due to the "one external display" limitation. Competing ARM desktop hardware is not an option for me due to insufficient performance - I might as well not upgrade at all.


FWIW, I was a long-time dual monitor user who migrated down to one due to MacBook support. I still have dual monitors at the office, and find my home setup broadly comparable - I use a 34” ultrawide with my MacBook mounted on a monitor arm next to it and serving as a second monitor. (Obviously doesn’t scale to >2 monitors.)


The only world in which fair product reviews lead to no product review units whatsoever is one in which no company ever actually considers their own product to be of quality.

Ultimately, follow-ups are rare because they don't sell - the benchmarks which get referenced are the ones which spread the widest, which are the ones closest to release, which is when the most people are usually interested in a given product.


3DMM was a lucky one, because Microsoft and Foone managed to get sign-offs from the rightsholders of the third-party code it utilized.

Not even Windows 1.0x, 2.0x, or 3.x got open-sourced. In terms of complete operating systems, we got source code dumps of MS-DOS 1.25 and 2.0, and that was that.

I don't think it's going to happen.


If $30k is at stake, it might be better to spend a little bit more money and ask a professional lawyer.

