ccdan's comments | Hacker News

Good luck porting a professional grade tool like Eclipse. :) But I'm afraid we'd have to wait until tablets have the same specs as today's average laptops. Although I guess stuff like HTML and JavaScript could be done on current tablets.


Bases and military relationships = wars? LOL!

That site is ridiculous.


Not much of a surprise. There's way, way too much research that is not only useless but outright wrong (bad data, close to zero use of the scientific method, reliance on fallacies like correlation = causation, and so on). Rigor in science must increase, and funds should be directed to serious, experimental research based on the scientific method.


Haskell can offer absolutely nothing that Java can't offer, while Haskell can't offer even 0.001% of what Java offers. You see, there's a very, very good reason why you won't see any (even remotely) serious and useful software made in Haskell and why you won't see it used by companies (which means that jobs are literally extremely close to zero). Haskell is nothing more than an exercise in constructing a language based on certain mathematical concepts, mainly abstract stuff like category theory. As a result it is as cryptic, hard to understand and useless in practice as those theoretical concepts (though the theoretical stuff might serve as a basis for some other stuff that might be useful - not so with functional languages and especially Haskell).


Huge mistake on their part, typical of math people. High-level programming languages have to be as close as possible to human languages (English), not to some obscure mathematical concepts and notation that most people don't care about. Actually, math notation itself (cryptic, inconsistent, ambiguous) is a horrible result of math people's communication handicap and ineptitude.


No. The math notation that is used across all branches of mathematics is consistent and unambiguous (the notation of formal logic, naive set theory, etc). The only ambiguity typically comes from traversing different branches of the discipline, which is inevitable considering how many branches there are and how deep they go.

Furthermore, mathematics is all about communication. The language of mathematics exists to codify concepts so that they can be talked about concisely. Being able to say 'group' instead of 'set with an associative binary operation with identities' is essential if you want to be able to build on top of that concept without taking an hour to read one theorem.

The handicap in communication isn't on the mathematics end, it's on your end. You seem to expect them to be able to explain structures to you that took years of work to build by using the same language that you use to talk about sports or social events. The reality is that you are not the target audience of their communication, and they are okay with that. You should be too.

The weirdest assertion that you made is that high-level programming languages ought to be as close as possible to human languages. The two categories of languages exist to communicate fundamentally and widely different groups of concepts. Words represent categories of analogous concepts, and the relevant categories in human life are nothing like the relevant categories in programming. In Haskell, 'functor', 'applicative functor', and 'monad' are highly relevant categories. They pop up everywhere and can be leveraged with great benefit. In human life these concepts are far less common, and thus do not merit words in the common vernacular. Were we to use a programming language modeled on English, we would miss the benefit of these abstractions, trading them for categories like 'dog' and 'car' which have very little practical use in typical programming.
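
To make the functor point concrete, here is a minimal sketch using nothing beyond the standard Prelude; bumpAll is an invented name for illustration, but the single fmap behind it works unchanged across structures that otherwise have nothing in common:

    -- Minimal sketch, standard Prelude only. 'bumpAll' is an invented name;
    -- the point is that one 'fmap' is reused over unrelated structures.
    bumpAll :: Functor f => f Int -> f Int
    bumpAll = fmap (+ 1)

    main :: IO ()
    main = do
      print (bumpAll [1, 2, 3])                       -- [2,3,4]
      print (bumpAll (Just 41))                       -- Just 42
      print (bumpAll (Right 9 :: Either String Int))  -- Right 10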


>>No. The math notation that is used across all branches of mathematics is consistent and unambiguous (the notation of formal logic, naive set theory, etc).<< Nonsense. Actually, ambiguity starts with basic arithmetic. Take multiplication, for example: we have several kinds of notation for it, which is inconsistent. In the case of juxtaposition, it's ambiguous because two or more juxtaposed letters don't necessarily imply multiplication (xy usually reads as a product, while in f(x+1) the juxtaposition means function application). And I'm talking about arithmetic only. Then the ambiguity only builds up: cross product, dot product & crap.

>>>Being able to say 'group' instead of 'set with an associative binary operation with identities' is essential <<< OK, but the word "group" should be used for no other meaning...

>>>The reality is that you are not the target audience of their communication, and they are okay with that. You should be too.<<< As you can see, pretty much anyone is the audience of some math, and of its inconsistency and ambiguity; only the level and the amount of it vary.

>>>The weirdest assertion that you made is that high-level programming languages ought to be as close as possible to human languages. The two categories of languages exist to communicate fundamentally and widely different groups of concepts. Words represent categories of analogous concepts, and the relevant categories in human life are nothing like the relevant categories in programming. In Haskell, 'functor', 'applicative functor', and 'monad' are highly relevant categories. They pop up everywhere and can be leveraged with great benefit.<<<

False. Computers and software are mainly used to emulate some real-world stuff (objects, actions, etc.) and to help people with real-world stuff in a more automated way. They aren't used much to prove theorems or other math stuff. And pretty much no one cares about proving the so-called "mathematical correctness" of a program - a concept that doesn't even make sense in most cases. This is an old misconception among FP advocates; even Dijkstra himself admitted that he was kind of wrong about how computers would evolve and what they'd be used for. But the associated misconceptions live on. A language close to human language also helps avoid errors. That's why you won't see functional languages in critical systems, but rather languages like Ada, which is probably the closest programming language to human language. The claims of clarity of FP languages are pretty much at odds with the evidence the real world provides.


You are simultaneously arguing that we should be using a natural language and that the language of mathematics is too ambiguous. I think you have not looked very closely at natural languages.


A programming language can be created such that it resembles a natural language but avoids the ambiguities of that natural language. Anyway, the main idea is that FP languages are way too cryptic and ambiguous: they use too many symbols with multiple meanings which don't make any logical sense at first glance. If you have to go to great lengths to explain the meaning of a simple symbol, then its use is wrong in a language that is claimed to be general purpose, clear, easy to read and so on. Either that, or the language is not general purpose and/or doesn't have those claimed qualities (clear, etc.) in a general sense.


Or, you simply have not learned enough yet to understand the language. Programming languages are not designed to be easy to pick up with no prior knowledge; they're designed to be powerful when used by professionals who actually know what they are doing.

If you're going to make the claim that functional languages use ambiguous symbols, you're going to need to back that up with some examples. I find it exceedingly hard to believe that there is any ambiguity in the operators of a statically and strongly typed language like Haskell.


Or, maybe the language is very poorly designed. There are many programming languages, from Basic to LISP to C to Java to Haskell to... Brainfuck. Some of them are used in the software industry and some are not. There are many claims about many languages: language X is good because [insert some random ramblings], language Y is good because [...] However, no language is adopted by the industry solely based on claims (and btw, I have seen some utterly ridiculous claims made by those who try to promote Haskell). Every once in a while, some companies try out new languages. Very few such languages get adopted, and as you can see, functional languages are almost completely absent from the industry. And there's a very good reason for it: they're simply not suitable for producing professional grade commercial software. If it had been otherwise, someone would have figured it out. The funny thing is that the start-ups that try to use them (usually founded by FP advocates themselves) also fail one after another. But some people never learn. Furthermore, many companies forbid the use of functional style or functional features implemented in certain imperative languages. The code of good, proper languages for general purpose software engineering is almost self-describing! What is unclear should be sorted out quite easily using the documentation.

Those "professionals who actually know what they are doing" don't seem to exist when it comes to functional languages. The evidence is the very fact there's not a single piece of important commercial software written in such a language. The question is rather: can such specialists exist? Because I'm afraid they can't exist because the functional approach is fundamentally wrong.

Examples of ambiguity in FP? What is the following line supposed to mean, and what part of it suggests anything about that:

a b c

How is ~ an intuitive replacement for minus? How is (* 5 5) supposed to be as clear as 5 * 5?
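
(For reference, a minimal sketch of how a Haskell compiler actually reads those two shapes of expression; a, b and c are invented placeholders, (*) 5 5 is Haskell's own prefix form standing in for the Lisp-style (* 5 5), and the ~ spelling of unary minus belongs to Standard ML rather than Haskell:)

    -- Minimal sketch; 'a', 'b' and 'c' are invented names used only to show
    -- how the expression parses, not functions from any real library.
    a :: Int -> Int -> Int
    a x y = x + y

    b, c :: Int
    b = 2
    c = 3

    -- 'a b c' is plain function application and always parses as '(a b) c':
    -- apply a to b, then apply the result to c. There is only one reading.
    abc :: Int
    abc = a b c              -- 5

    -- Wrapping an operator in parentheses gives its prefix form, so
    -- '(*) 5 5' and '5 * 5' denote exactly the same value.
    prefixProduct :: Int
    prefixProduct = (*) 5 5  -- 25

    main :: IO ()
    main = print (abc, prefixProduct)   -- prints (5,25)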

ps. dynamic typing and type inference are two awfully bad things and either of them can lead to trouble in large programs


Honestly, I'm chalking this up to Poe's Law at this point. Take care.


LOL... that's all you had to reply? Typical of FP advocates..


I don't think it's ever a mistake for something to push standards higher, even if those standards are so high that it might turn off some talented people.

I also disagree overall with your assessment of mathematical notation. But it doesn't really matter because Haskell doesn't actually use any math notation. It does use some math terminology, though, and overall mathematicians are absurdly pedantic about terminology. Mathematicians may not be great at telling jokes at parties (though you may be surprised!), but it's their job to make sure they are writing in consistent and unambiguous ways.


This depends on the point of view. I expect to find rather higher standards in Haskell programs than in, say, Ruby, for exactly the same reasons you've mentioned, and I think it's an awesome feature, quite the opposite of a mistake.

And yes, I am a "real programmer" and don't use Haskell at work myself (for run-time performance reasons).


> High level programming languages have to be as close as possible to human languages (English) not to some obscure mathematical concepts and notation that most people don't care about.

Haskell wasn't created for "most people". It's not some replacement for Python or Ruby, it was designed for formality.


I don't know who it was created for, but I know lots of FP advocates (Haskellers included) whining that FP languages aren't used (almost at all) in the software industry.


Another stupid scare... most idiots will blame the entire set of Java products, even though applets are just a very minor feature, an almost deprecated one... and the real threat is greatly exaggerated... there was never a major security incident in the entire history of Java...


Unfortunately, the only thing standard users are going to remember is "Java is bad". I've already had friends asking me about this and saying, "isn't it going to ruin my entire online experience to disable Java?". "No, that's JavaScript, different thing entirely."

And that's from people who work with computers all day as their main job. Looks like it may be difficult/impossible to get people to install Java client apps in future.


Back in the early applet days a lot of people were complaining on Usenet's comp.lang.java.programmer that applets were both a toy and a disaster waiting to happen that would never bring anything good to Java.

Yet countless Java programmers fought tooth and nail to defend Applets, saying how great the technology was and how it was going to revolutionize the Web, etc.

They are the very programmers who, today, say that Java is good but that Java applets maybe weren't that great after all. It's a bit too easy.

Now I don't know what you call a "major security incident", but in 2011 you could DoS any Java webserver by crafting a URL (the hashtable exploit: a request whose many parameter names all hash to the same bucket makes inserting them into the server's hashtable degenerate into quadratic work). A single consumer Internet connection was sufficient to take down entire Java server farms.

Granted, a DoS is not remote root, but still...

Then, still in 2011, there was the 12-year-old floating-point parsing bug where you could DoS any Java web server, again, by sending a thread into an infinite loop: the JVM's floating-point parser never terminated on one particular borderline value. All that was needed was one parameter set to that value on the client side in your HTTP GET, and that's it: one thread stuck in an infinite loop.

Repeat a few times and you had DoSed any Java webserver.

It's still not remote root, but that's not exactly minor either...


Java Applets had great promise. Early interest was high. For the life of me, I can't fathom how Sun managed to biff this one. I'm among the biggest Java fans, so my disappointment is acute.

#1 - Netscape. Their Java support always sucked. Broken thread implementation. The joke was "write once, debug everywhere". The early troubles soured most people, and Sun lost precious mindshare. Relying on a third party for the success of Java was a huge mistake.

#2 - Sun killed their HotJava web browser, written in Java. It was the ideal applet platform. Ran great. Had a great UI for the time. Imagine if they'd kept that going.

#3 - Sun waited until Java 6, a full decade, to revamp their Java plugin. Way too late to make a difference.

#4 - AWT controls looked terrible and were too minimal. But Swing was just too heavy. The design was great for its time, being the logical successor to the NextStep -> Cocoa -> Netscape IFC development line. But Sun never put it on a diet, either the API or the payload.

(Thanks for the floating point parse bug tip. Writing a web server using Netty, that's a good one to know.)


Nonsense... I have never heard of anyone being attacked in any way through java... it's just "security" firms that come up with all kinds of obscure things and try to scare people for pretty much nothing...


Where is functional "good"? Other than in abstract math or FP advocates' heads? :D If it had been any good, it would have been used on a pretty large scale... we live in a pretty large world and someone would have figured it out... and then the others would have followed suit...


Nice troll.


"- Most blue collar developers don't understand FP concepts"

HAHAHA! The vast majority of FP advocates are either unemployed or work as math teachers. Like it or hate it, FP is absolutely nothing more than a pseudo-programming paradigm (largely emulating some concepts from abstract math and using notation somewhat similar to math notation) that attracts people who can't wrap their heads around OOP, rich frameworks and other associated stuff. Sorry folks, computers are neither abstract nor stateless. And the same holds true for software, which often deals with real world stuff, which again, is neither abstract nor stateless. Virtually everything that can be done in a functional language can also be done in a procedural or OOP language. The opposite, on the other hand, is totally untrue. It's really funny to see how FP advocates struggle even with some extremely basic things. Using languages/platforms like C/C++, Java, .Net, there's always an increase in performance compared to any functional language (yeah, including Scala, F#, Clojure, OCaml and so on). The "elegant code" argument is one of the most ridiculous things FP advocates come up with, since it's almost always synonymous with crappy, cryptic code that no one wants to read besides its authors (maybe not even them after a few weeks or months) :D So I'm afraid that the FP advocates are far worse than real blue collar workers.


"Sorry folks, computers are neither abstract nor stateless."

It's true. But you know what computers are first and foremost? They're deterministic.

And you know what's one of the biggest problems programmers face in the Real-World [TM] when the shit hits the fan (and most devs' job is to fix shit that just hit the fan)? It's being able to recreate the state and then to be able to deterministically reproduce the shit that did hit the fan, so as to prevent it from hitting the fan again.

Why do we see market makers using 90 OCaml programmers and raving about it?

Why do we see investment banks moving from Java to Clojure and slashing their codebase by a factor of ten in the process? And then explaining how much easier their life became in the face of changing requirements (e.g. new laws/regulations coming in)?

Do you really think that a codebase ten times smaller is "harder to read"? Do you really think that making it easier to reproduce the state is not a goal worthy to achieve?

I realize you feel insecure in your Java/C# + ORM + XML + SQL hell but don't worry: there's always going to be lots of real-world jobs for code monkeys like you ; )


"It's true. But you know what computers are first and foremost? They're deterministic."

That's like saying that computers have mass and are made of matter.

"And you know what's one the biggest problem programmers do face in the Real-World [TM] when the shit hits the fan (and most devs' jobs is to fix shit that just hit the fan)? It's being able to recreate the state and to then be able to deterministically reproduce the shit that did hit the fan. As to prevent it from hitting the fan again."

That's false. Programmers don't have to recreate the exact same state; in many cases it's not necessary to reproduce the error at all. There are more tools than you can imagine for identifying errors, from logging to memory dumps and analyzers/profilers...

"Why do we seen market makers using 90 Ocaml programmers and raving about it? Why do we see investment banks moving from Java to Clojure and slashing their codebase by doing so by a factor of ten?"

Well, I'm afraid that happens in your imagination only. I also happen to be a trader. Almost NO ONE uses functional languages (fewer than 0.01 %) for financial trading. The main languages are C/C++ (especially for high frequency trading) and, of course, Java and also .Net.

"I realize you feel insecure in your Java/C# + ORM + XML + SQL hell but don't worry: there's always going to be lots of real-world jobs for code monkeys like you ; )"

You're pretty delusional about how secure or insecure I feel (haha!) and how much of a "codemonkey" I am. LOL! You don't even know me, but you already pretend that you know me. Unfortunately for you (and all those like you), this is a typical characteristic of FP advocates: you live in an illusory world and have a totally distorted view of software engineering and, of course, of the people who do make real-world software. Anyway, it's always funny to see the reactions of FP advocates when they're left without any objective, verifiable, real-world arguments. :D

