The difference is that no one in their right mind is thinking of rewriting a browser in Java to also make it faster, while that's exactly what Servo/Stylo etc. are all about.
For what it's worth, Python was also considered at some point for use in the Firefox codebase. I don't remember the rationale for not adopting it, but I think the idea was "we all like Python, but we already have one messy language (JavaScript), let's not make it two".
I work with it daily in a bank, and I couldn't find a better way to express it. Many colleagues throw their keyboards in despair at this stupid, impossible-to-remember syntax.
There are a lot of things in various programming languages which are hard to remember, but k and array languages have such a small surface area that not being able to remember it while working with it daily amounts to learned helplessness.
(source: mostly amateur k programmer, also worked with it in a bank, find it vastly easier to read/write/remember than most mainstream languages)
Not that it's impossible to remember, but it's definitely contrary to most traditional use of the symbols employed in it, though not without logic. My favorite is the functions from the io package, called 0, 1, and 2 (yes, numbers), which handle interaction with stdin, stdout, and stderr respectively. In dyadic form they at least have a colon, but in monadic form they look like plain numbers: 1 "Hello world".
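For anyone curious what that looks like on screen, here is a rough sketch in q/k4 style (exact spelling varies between k dialects, and the out.txt path is just a made-up example, so treat this as illustrative rather than canonical):

    1 "Hello world\n"              / monadic 1: write the string to stdout
    2 "something went wrong\n"     / monadic 2: write the string to stderr
    `:out.txt 0: ("first";"second")  / dyadic 0: write a list of strings to a text file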
I suspect that to study k (and use kdb) efficiently, you need to actively forget what you knew about the syntax of other languages, and study k as a language from Mars that happens to map to ASCII characters somehow.
It is really easy to remember; it is so small that remembering is the least of the issue. The rest is just using it a lot; I find it readable and nice to work with. Unlike some other languages that get shoved down our throats.
"Debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?"
— Kernighan, Brian. The Elements of Programming Style (2e). McGraw-Hill, 1978.
I prefer a different approach: "smart is good; clever isn't smart". If you have to express something in a clever way, that is, a highly contrived but working way, it means you lack the right way to express it, and maybe your mental model of the problem isn't that good. The theory of epicycles is clever; Kepler's laws are smart.
This is probably true for some people, but still won't work for many others. One probable outcome of a frustrated debugging session is "let's rewrite/refactor it to make it easier to debug next time", and not self-enlightenment.
That is not possible, because the environment can (and at some point always will) change in ways that weren't planned for, due to a lack of working crystal balls. Data, user behavior, the network, and the system(s) the software runs on can all change over time.
Also, it is way too expensive to try to cover every single conceivable possibility, so we deliberately leave holes.
For non-trivial things we often prefer to wait to see what problems actually come up during use, and then fix exactly those, but not the many more problems that could come up but are too unlikely and/or too costly to guard against.
In a living environment the software lives too, and keeps changing and adapting.
You might've missed the quip, since this whole thread is about a quote, which I'm countering with an alternative quote from Hoare:
> There are two methods in software design. One is to make the program so simple, there are obviously no errors. The other is to make it so complicated, there are no obvious errors.
> That is not possible, because the environment can (and at some point always will) change in ways that weren't planned for, due to a lack of working crystal balls. Data, user behavior, the network, and the system(s) the software runs on can all change over time.
It sounds to me like you are describing a change of problem, not bugs in the solution. If in the future someone redefines the concept of a Sudoku puzzle such that this solution is no longer applicable, or tries to use the solution verbatim in a language which is different from K and therefore yields different results, it's not a bug in the original code that it's not a solution to that new problem. It's still a solution to the same problem it was always a solution to.
I can see what you mean in a practical sense, but also consider (practically) that a lot of problems can be broken down into smaller, well-defined problems which are themselves practically immutable. You can throw the solutions to such problems out when you no longer need them, and come up with solutions to whatever new problems replaced the original problems.
In my experience, the vast majority of problems are insufficiently specified. No matter how well you solve the current problem, there are bound to be certain assumptions you've made about the requirements. And when those assumptions don't hold true, your solution may no longer work.
> What do you mean the input file can't be ISO-2WTF encoded?
Debugging problems is part of maintenance, but a small part. Extensibility is probably a much larger part, and what I think of first when someone says "maintenance".
The average bank/company would rather have an average solution maintained by 10 easily replaceable average developers than a nutty, smart solution only understood by 1 highly talented developer.
You could also say that the average bank/company should have learned from previous mistakes doing exactly that for many decades. Select a language that is well tested, understood and supported. Set a limit on cleverness and instead focus on maintainability and simplicity.
If only. In my experience, banks end up building a solution that is maintained by 100 mediocre developers that a reasonably smart developer can't make sense of when it behaves erratically or has extremely poor performance.
Which was precisely my point (and I agree with all the responses in this thread), though my wording and light sarcasm seem to have been a bit too dry and didn't quite come across as intended.
Is it your first job? If it is, don't worry, it's way worse everywhere else. Sometimes you have committees eating many man-hours, every day, to green-light releases, with non-technical people having the last word, asking no questions, and always, always approving.
When I do a release as a dev, I don't do it myself: someone in another country presses the buttons I ask them to press, types the Linux commands I ask them to type, and accepts my answer when I say it looks good. Because I am, and all my colleagues are, considered a security risk, and it's better we dictate everything to someone who has no idea what we're releasing, for security reasons. We call that segregation of duties, instead of "complete waste of time".
You are wrong to think public debates are here to reach an understanding between the debaters. Whining that nobody submits to you in a fight is ridiculous: fight, show us your genius, and don't come crying to mommy if we fight back.
Your goal is to convince the audience, not your opponents.
Or hey, the courage to even act upon the most basic of conclusions. Being useful is very different from being aware or intelligent. It's sometimes more useful to be ignorant and brave than scholarly and asleep high in an ivory tower.
Knowledge, or its synthesis into derivative knowledge, has nothing to do with utility, I think? It's sometimes pointless to acquire knowledge, and useful to act upon the world. So I don't really understand what you meant by "useful" yourself there.
There's interior applicability to knowledge: it feeds thought and the inductive reasoning process. If you have never been exposed to TV, being exposed to TV changes how you think.
There's exterior applicability to knowledge: feed a man a fish vs teach a man to fish.
I see both as "utility" although I suppose achieving nirvana is seen as utility in other people's domains. It's subjective.
Reasoning by analogy is one of the most powerful tools that humanity has. Using a rod to get a lure where you want it might cross over later into getting a rope across a ravine.
I think all understanding and therefore all learning is hermeneutic.
I was persuaded by Hans-Georg Gadamer’s magnum opus, Truth and Method.
I think his theory, which he called “philosophical hermeneutics,” is a skeleton key for understanding our understanding.
Right around the time he published the book, there was a large shift in academic fashion towards critical theory and deconstructionism, neither of which describes Gadamer.
The result is that Gadamer and his work took a backseat to critical theory.
We lost something important in that shift.
Others recognized the importance of Gadamer’s work but diluted it by trying to merge it with their pet theories (Ricoeur with critical theory, Rorty with American-style pragmatism).
——————————
That said, Gadamer’s work is dense but not impenetrable. And it is beautiful. And profound. Go forth and read!
That sounds like the sort of response I'd expect from a mystery religion or cult. As opposed to say expending part of the contained information into a form that's comprehensible to non-initiates.
Yeah but they say "pre-purchase conversations", could be some BS about how they call consumers sometimes, from call centers... it doesn't say that they eavesdrop.