As someone who rather dislikes Java, I'd still tell others to learn it (but certainly not as their first or even second language - it teaches too many bad habits).
The main reason would simply be that a lot of interesting problems require tools/libs written in Java (Hadoop and family, Lucene, etc) if you want to get off the ground quickly and efficiently. Eventually, you're going to need to tweak your tools - so learn Java.
Furthermore, a lot of far better languages run on the JVM and have Java interop (Clojure, Scala, etc). Strictly speaking you don't have to know Java to use Clojure. But in practice, it's going to make your life miserable if you don't know the Java libs reasonably well.
- Too much global state. You call it a Singleton. I call it a global variable with a fancy name and a lot more code.
- You lose the ability to think functionally and dynamically. You can see the core of the architecture much more clearly if your default thought is 'I want to pass a function' or 'I want to call a fn based on runtime types.' Then you convert that to what's implementable in language X (the strategy pattern and the visitor pattern, respectively, in Java). That way you don't lose sight of the forest for the trees (and can implement what you're trying to do without cruft in a good language).
- Verbosity. It's bad enough that passing a small fn (1 line of code in a decent language) requires a separate class and 20 lines of code in Java. On top of that, the type system is too weak to actually stop most interesting errors, but just strong enough to be annoying. And for some idiotic reason, they won't even implement compile-time type inference, which would save lots of verbosity and maintain code/bytecode compatibility.
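To make the verbosity bullet concrete, here's a minimal sketch (my own illustration, in the pre-Java-8 style this comment is about) of what passing a one-line "compare by length" function costs:

```java
import java.util.*;

public class VerbosityDemo {
    public static void main(String[] args) {
        List<String> words =
                new ArrayList<String>(Arrays.asList("banana", "apple", "cherry"));

        // In a language with first-class functions this is one line.
        // In pre-closures Java, the "function" is a full anonymous class:
        Collections.sort(words, new Comparator<String>() {
            public int compare(String a, String b) {
                return a.length() - b.length();
            }
        });

        System.out.println(words);  // [apple, banana, cherry]
    }
}
```

Strip the boilerplate and only the `a.length() - b.length()` line carries any meaning.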
I agree with you about the verbosity of Java; it's driving me nuts. Also, the overuse of XML in Java-related frameworks, platforms, etc. I mean, when it gets to the point where you have to use complex logic to perform a task, you have to start wondering whether XML is the way to go...
As for overuse of patterns - well, to be honest, I sometimes actually wish that my colleagues would learn some (at least basic) patterns. Quite a large number of them wouldn't know how to implement Strategy, for example.
As for your other points.. I don't know. I think this is debatable. What does too much global state mean? Is having singletons in your application inherently wrong, because they introduce global state? I don't think so. Overuse of singletons (and overuse of any pattern for that matter) is certainly a code smell, but there are simply some situations where singletons are appropriate. Or would you argue that this is true for Java simply because of bad design of the language itself?
What I, personally, dislike about Java is its really bad handling of lists. You see, as somebody working on Java enterprise projects, I constantly have to do something with lists: I fetch them, sort them, filter them, perform operations on certain elements, etc. And it's incredible how verbose and ugly and littered with for-ifs the code gets. Yes, members of the JCP EC, we need those closures, and we need them now.
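A hedged sketch of the for-if pattern being complained about (the names here are hypothetical, not from any real project): filter and transform a list, all by hand.

```java
import java.util.*;

public class ListChores {
    // Hypothetical chore: collect the upper-cased names of active users.
    // Without closures, every filter/map is a loop, an if, and a
    // temporary list -- repeated for every such chore in the codebase.
    static List<String> activeNamesUpper(List<String> names, Set<String> active) {
        List<String> result = new ArrayList<String>();
        for (String name : names) {
            if (active.contains(name)) {
                result.add(name.toUpperCase());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> names = Arrays.asList("ann", "bob", "cid");
        Set<String> active = new HashSet<String>(Arrays.asList("ann", "cid"));
        System.out.println(activeNamesUpper(names, active));  // [ANN, CID]
    }
}
```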
> What does too much global state mean? Is having singletons in your application inherently wrong, because they introduce global state? I don't think so.
I do. If you have an hour, the following talk by Rich Hickey (of Clojure) covers the philosophy behind global state being the problem. It's not a code talk, and he takes his time getting into it, so it's not the easiest talk to get into, but it does explain the mindset.
I believe Java's lack of functions/closures leads to excessive factory use; factories tend to be singleton-ish, and so you get the set of dependency injection/IoC frameworks that simply don't exist in other languages but exist in Java because of too much global state. There are other symptoms, but this is the first one that came to mind.
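A minimal sketch of why a Singleton is a global variable with extra ceremony (the Config class here is hypothetical, my illustration, not from any framework):

```java
// Hypothetical example: a Singleton holding mutable state.
// Structurally this is a global variable -- any code anywhere can
// read and mutate it, which is exactly the coupling that makes global
// state hard to reason about (and spawns DI/IoC frameworks to manage it).
public class Config {
    private static final Config INSTANCE = new Config();
    private String dbUrl = "jdbc:default";  // hidden mutable global

    private Config() {}  // nobody else may construct one

    public static Config getInstance() { return INSTANCE; }

    public String getDbUrl() { return dbUrl; }
    public void setDbUrl(String url) { dbUrl = url; }

    public static void main(String[] args) {
        Config.getInstance().setDbUrl("jdbc:prod");
        // A completely unrelated call site sees the mutation:
        System.out.println(Config.getInstance().getDbUrl());  // jdbc:prod
    }
}
```

Any two modules that call `Config.getInstance()` are now coupled through hidden shared state, just as with a plain global.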
I'm personally mad at Doug Cutting for perpetuating Java.
Java is a virus. If Lucene and Hadoop were written in C/C++ you'd be able to easily use them from basically any language. Because they're written in Java, it forces the rest of your code to live in Java land as well.
Yeah, but because it's not the main project, the CLucene port is perpetually the red-headed stepchild. It runs several versions behind the Lucene trunk, and therefore, you have to choose to ignore its disadvantages.
...which is a shame, because I agree with the OP that the choice of Java was a mistake for a large, high-performance system.
Think of it from his point of view: if you've already got a large chunk of code and working systems in Java-land that you want to strap a text indexer to, what language would you pick?
> It's bad enough that passing a small fn (1 line of code in a decent language), requires a separate class and 20 lines of code in Java.
No need to exaggerate, it's only a few extra lines
functionTakingCallback(new Callback() {
    public void fn() {
        ...
    }
});
I agree it's verbose enough to be annoying, but hardly a deal-breaker imho.
EDIT: It just occurred to me that you might be looking at it from the other side, where you would have to define the callback class. Still, defining a static nested class with a single function is only 3 lines of code.
Saying that you can't think functionally in Java is not a "bad habit" - that's a style choice. Likewise, being verbose is not a bad habit - it's an issue of style.
As for global state - using singletons is a design decision. If you don't like them, don't use them!
So many people on HN hate on Java just because they've made the switch to functional programming. All power to you, but none of what's good about FP makes Java bad - it's just different.
I'd agree that there are some inherent design issues with Java - I've never liked primitives and I don't like how generics were implemented - but other than that, it's a really excellent tool for a number of tasks. Just because it doesn't look like what you like doesn't make it bad. (This reminds me of when Java first came out and all the C++ programmers were saying it was useless because it didn't have operator overloading. Not bad, just different!)
Inheritance instead of composition. Too much worry about security theater public/protected/default/private instead of actually reusing code. The inability to fake stuff for tests without designing ability to fake things into every class. Basically, you write the code, and that's what you have. If anything changes, it's basically a rewrite, even with Eclipse's "refactoring" features. Type erasure.
When all you have is Java-style OO, everything looks like an AbstractHammerFactoryInterface.
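The "can't fake stuff for tests" point above can be sketched like this (hypothetical Clock/Session names, my own illustration): unless an interface seam is designed into the class up front, there is nowhere to inject a fake.

```java
// The seam has to be designed in: Session takes a Clock interface
// instead of calling System.currentTimeMillis() directly.
interface Clock {
    long now();
}

class SystemClock implements Clock {
    public long now() { return System.currentTimeMillis(); }
}

class Session {
    private final Clock clock;
    private final long startedAt;

    Session(Clock clock) {
        this.clock = clock;
        this.startedAt = clock.now();
    }

    boolean olderThan(long millis) {
        return clock.now() - startedAt > millis;
    }
}

// Only because of that up-front decision can a test control time:
class FakeClock implements Clock {
    long t = 0;
    public long now() { return t; }
}

public class FakeDemo {
    public static void main(String[] args) {
        FakeClock clock = new FakeClock();
        Session session = new Session(clock);
        clock.t = 5000;  // advance time without sleeping
        System.out.println(session.olderThan(1000));  // true
    }
}
```

Had Session called System.currentTimeMillis() directly, a test would have no way in -- which is the complaint above.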
Yes! It's amazingly hard to do inheritance right. As a rule of thumb, if you can't prove that the inheritance doesn't violate the Liskov substitution principle, you shouldn't be allowed to do it. (It amazes me how many Java 'programmers' have never heard of LSP or a refinement proof.)
Ironically, Liskov is the exact opposite of how people do OOP. Most people subclass to add restrictions, where Liskov suggests that subtypes should relax restrictions.
This means your programs work better, but you have to plan ahead.
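The textbook illustration of that point (the standard Square/Rectangle example, not from this thread): a subtype that adds a restriction breaks clients written against the supertype's contract.

```java
// Square "is-a" Rectangle adds a restriction (width must equal height),
// which violates the contract that Rectangle's clients rely on.
class Rectangle {
    protected int w, h;
    void setWidth(int w)  { this.w = w; }
    void setHeight(int h) { this.h = h; }
    int area() { return w * h; }
}

class Square extends Rectangle {
    // Restriction added by the subtype: sides stay equal.
    void setWidth(int w)  { this.w = w; this.h = w; }
    void setHeight(int h) { this.w = h; this.h = h; }
}

public class LspDemo {
    // Written against Rectangle's contract: after the two setters,
    // the area must be 2 * 5 = 10.
    static int clientCode(Rectangle r) {
        r.setWidth(2);
        r.setHeight(5);
        return r.area();
    }

    public static void main(String[] args) {
        System.out.println(clientCode(new Rectangle()));  // 10
        System.out.println(clientCode(new Square()));     // 25 -- LSP violated
    }
}
```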
I know I'm not an average Java programmer (I hate IDEs and patterns, for example), but I've never seen anything like an AbstractHammerFactoryInterface.
Seems like you're confusing Java the language, with how some people choose to use Java.
If you are going to use Java contrary to 90% of Java programmers, what's the point? You can't use any libraries and they won't understand your code. So you might as well use a good language instead.
I do not use languages just based on what other people use it for.
I completely agree, most people abuse Java using it in stupid ways, writing absolutely ridiculously verbose and complex needless code, using XML for everything etc etc. But that's not Java's fault. It's simply the fact that it looked attractive to Enterprise and they threw their crapstorm at it.
If <insert hip language here> suddenly looked attractive to corporate world, it'd get thrown a ton of crap as well and suddenly fall out of fashion with the people who choose languages based on fashion.
If you pick languages based on community / how most people use it, I think you're missing the point of languages.
The reason why you want popularity is because they write libraries for you. That means you can worry about something more interesting than an HTTP parser. Or, it means that if you want to write an HTTP parser, lots of people will help you.
If you are the only person in the world and refuse to collaborate with anyone, then it doesn't really matter what language you use. Except, of course, the compiler and runtime are libraries, and you had better be prepared to maintain those, too.
> If you are the only person in the world and refuse to collaborate with anyone, then it doesn't really matter what language you use.
Yup! And FWIW, I wrote an HTTP parser. And webserver. And DNS server. Amongst other bits and pieces I needed for Mibbit.
Programming isn't rocket science, I think more people should write everything themselves rather than rely on 3rd party libraries personally.
Yes, it's an investment to write your own libraries, but you end up knowing more, being able to maintain/fix bugs far easier, and often ending up with a far better solution.
Is it common for java developers to use a "dumb" editor like textmate / jedit / nano? I've heard that a good IDE is a requirement for writing significant amounts of Java. To be fair, I've mostly been interested in Scala; but it seems like you have to know Java to get the most out of Scala.
It's a myth. People who don't like Java moan that it's so verbose and that the only way to get anything done is by using an IDE.
Do people use IDEs to write assembly language? I'd say the majority do not. You just learn to organize your code well and to be concise and tidy. A good skill to learn.
My current project (Mibbit), is about 25k loc, in around 200 files. I've never had any issue working with it.
I think the difference lies in if you write 'Enterprise' java, use XML for everything, use patterns, tons of threads etc or if you just write tight, tidy, concise speedy java.
> I think the difference lies in if you write 'Enterprise' java, use XML for everything, use patterns, tons of threads etc or if you just write tight, tidy, concise speedy java.
There's a saying: "one of the problems that lawyers have is that 95% of those in the profession give all the rest a bad name". That's also true for how people view Java.
notch's screencasts show that there is an alternative, and I'll take your word for it that you're in that class as well. But you're part of the 5%; maybe the 1%.
- It is just barely object-oriented (and Alan Kay agrees with me on this). Check out CLOS and the MOP, or Smalltalk, sometime for a good object system.
- No. Java has a complex syntax. Take a look at the grammar sometime. It's tens of times longer than Scheme's (see: http://java.sun.com/docs/books/jls/second_edition/html/synta...). You just like it because it's almost, kind of, a little, like C and C++.
- It is, in large part, pointless verbosity. Explain to me how Map<String, Pair<Integer, Set<String>>> data = new HashMap<String, Pair<Integer, Set<String>>>(); is useful. Explain how 20 lines for a strategy pattern or a visitor pattern is better than just having built-in first-class functions and multiple dispatch.
- The type system is abysmal (and not sound). Runtime Erasure of generic types? Casts? Nulls?
- I do agree it is better than C++ for most purposes though ;-)
Explain to me how: Map<String, Pair<Integer, Set<String>>> data = new HashMap<String, Pair<Integer, Set<String>>>(); is useful.
Firstly, in Java 7 or in Java 6 in a modern IDE, you only have to type:
Map<String, Pair<Integer, Set<String>>> data = new HashMap<>();
So let's break this down: if I was using another language, I could just do:
var data = {};
But consider what you've now lost:
1. If your code was just a simple associative array, not much. But if you have a complex nested data structure, you need to now put a comment somewhere so you don't forget what you're storing there. You can get into serious problems when multiple programmers are working on the same project if you don't clearly describe your data structures, and so you're not really saving on typing by writing Map<String, Pair<Integer, Set<String>>> because you'd have to document that in a comment instead. Since people are lazy when it comes to commenting code, Java therefore forces a degree of self documentation into data structures.
2. I notice that you're storing a Set of Strings (Set<String>). As I'm sure you know, this means you can throw a bunch of strings into the set without worrying about duplicates. In Javascript, this would be messy. Since there is no 'Set' data type in Javascript you'd have to have an associative array where you store the string as the key and 'undefined' as the value.
3. HashMap isn't the only kind of map. In Java, you could use a TreeMap so that content is sorted in key order. This saves you code later by not having to take the associative array, converting it into a list, and then sorting the list.
4. In addition to TreeMap, another extremely useful kind of map is the ConcurrentHashMap, which allows you to manipulate the Map simultaneously from multiple threads (e.g. multiple simultaneous web request threads) without getting concurrent access problems or having to write your own synchronization code.
5. Tons of other really cool Java data structures are available to you in the Java libs. E.g. WeakHashMap, which won't prevent its keys from being garbage collected once they're no longer referenced elsewhere, and LinkedHashMap, which iterates in predictable (insertion) order - re-putting an existing key doesn't change its position.
So although I agree that var data = {} can be more convenient in simple cases, that's not to say Java's approach has no upside.
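A small sketch of points 3-5 above: swapping the implementation behind the Map interface changes the iteration behavior without touching the rest of the code.

```java
import java.util.*;

public class MapFlavors {
    public static void main(String[] args) {
        String[] keys = {"pear", "apple", "fig"};

        Map<String, Integer> tree = new TreeMap<String, Integer>();
        Map<String, Integer> linked = new LinkedHashMap<String, Integer>();
        for (String k : keys) {
            tree.put(k, k.length());
            linked.put(k, k.length());
        }

        // TreeMap iterates in sorted key order -- no separate sort step.
        System.out.println(tree.keySet());    // [apple, fig, pear]
        // LinkedHashMap preserves insertion order.
        System.out.println(linked.keySet());  // [pear, apple, fig]
    }
}
```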
Yes, Java 7 will finally fix this bug (which is 7 years old!). The fact that good IDEs can work around it doesn't make Java suck any less for it, though. I have similar issues with the syntax for anonymous classes (which you need a lot of to even poorly simulate FP), but we'll save that for another day.
I don't want var data = {} in a statically typed language. I want something like Map<Int, [Int, Set<String>]> data = new HashMap(); to use the exact same example as above. This way you get all the benefits of what Java currently does, with literally half the characters and more readability.
- True; I was just considering the mainstream languages. Yes, if you include Smalltalk, LISP, I'm sure there are better alternatives to learn specific concepts. But Java is used a lot in practice, and looks a lot like other languages used in practice (C++, ObjC, C#, and friends), which in my experience makes students more enthusiastic.
- I agree the "generics" are complex and overly verbose. They almost look like C++ templated types, which I wouldn't touch with a 10-foot pole if I wanted to teach someone programming... IMO, pre-1.5 Java was best for learning.
- Yes, a lot of the verbosity is pointless, but I think it helps for learning to specifically mention everything and be a bit repetitive. Other languages luckily allow leaving the obvious stuff out.
Ruby and Python are also mainstream, and both provide saner OOP.
C++ templates at least have a purpose and provide extra benefits (e.g. performance, compile-time computations for pre-caching). In Java generics are practically implicit type-casts and nothing else.
About verbosity ... clear conventions are a lot more effective for readability and learning. In a dynamic language like Ruby/Python, if you don't know what an object's type is or what it does, the fix is as easy as ...

import pdb; pdb.set_trace()
That opens up a Python debugging console at the current point of execution, and you can inspect "obj" for its type / members and also modify its state. Also, if you're using "ipdb" instead of pdb then you've also got auto-completion on <TAB> (e.g. Intellisense). You can do something similar in Ruby, and in Smalltalk the whole application is active while you're typing in the IDE.
The problem isn't solved. You now know what the type is for that particular moment. What you don't know are the invariants of the don_t_know_the_return_type() function.
What types/subtypes it can return, whether it can return None, what exceptions it can throw, and whether those will change in the future. That information can only come from a type system and/or documentation. Simply reading the implementation only tells you the current state of the system, not the rules that will govern future iterations of it.
The more invariants that can be expressed concisely by the language itself and enforced by the compiler, the less work is left to the user of the function to review documentation/implementation.
This is one large reason why well-designed advanced type systems are so valuable -- you can express complex invariants using them, and then let the compiler enforce those invariants.
First, even in Java you don't know the return type / invariants ...
int n = func_that_returns_positive_even_number();
What you need is to document the thing:
def func_that_returns_positive_even_number():
    """Returns positive even number."""
This comment will be available when typing "help(func_that_returns_positive_even_number)" in the Python console btw.
Or if you're paranoid and that function can totally break your code:
n = func_that_returns_positive_even_number()
assert isinstance(n, int) and n % 2 == 0 and n >= 0
Or to make extra sure this will hold in the future (i.e. protecting from code-changes done by other people) ...
import unittest

class TestMyFunc(unittest.TestCase):
    def test_is_positive_and_even(self):
        n = func_that_returns_positive_even_number()
        self.assertTrue(n % 2 == 0 and n >= 0)
First, even in Java you don't know the return type / invariants ...
This is a classic type system straw man. The language doesn't support encoding integer ranges in the type system, ergo, the type system is not ever a significant advantage and all invariants must be documented. You fool!
What you need is to document the thing:
Some invariants require further documentation. The more you can express concisely in code via the type system, the more time you and your API clients save in both development and maintenance.
More succinctly: By expressing them in code you let the compiler automate the work of enforcing them.
Or if you're paranoid and that function can totally break your code:
An assert doesn't "protect" your code from future changes (better phrasing would be: make your code adaptable to change, loosely coupled with its dependencies, so as to allow your code and its dependencies to iterate independently).
An assert simply causes your code to fail in an obvious way. It's still up to you to track them down (at runtime) and figure out where you went wrong.
"""The more you can express concisely in code via the type system, the more time you and your API clients save in both development and maintenance"""
That's not necessarily true ... the more complex the type-system, the more time you lose feeding it.
"""It's still up to you to track them down (at runtime) and figure out where you went wrong"""
A language with runtime-dispatching and/or where NULL is allowed will have the same problems. I mean, personally I had more null-pointer-exceptions than all the other kinds of errors combined and multiplied by a dozen.
We are talking about Python versus Java here ... Haskell's type system is smarter and can detect lots of errors for you, but then again we were also talking about beginner-friendliness.
> Ruby and Python are also mainstream, and both provide saner OOP.
For various other reasons I prefer Python to Java (or Ruby), but there is one thing that Java got right and Python totally messed up: avoiding the disaster that is multiple inheritance.
Of course a much better option is to completely do away with inheritance like Go has done (plus Go combines some of the advantages of Python's 'duck typing' with the static type checking of Java's interfaces).
From where did you get the idea that multiple-inheritance (in general) is a disaster? Multiple-inheritance is only a disaster when the rules aren't clear on ...
(a) what you're inheriting
(b) what you're overriding
(c) what happens when you try calling super::
Otherwise, inheritance is really useful, and Python scores pretty well on all the above points.
Also, Go doesn't have "duck typing". That's called "structural typing", and it is a lot more limited than dynamic typing. Go is not really object-oriented either.
How do you define (and enforce) those rules? My general position when writing library code is that none of my classes should be subclassed.
The very few classes that may be subclassed are documented as such, and the methods that may be overridden are explicitly documented, as well as what behavior is required from the subclass when overriding those methods.
The invariants of complex inheritance hierarchies are very hard to understand. What happens if the superclass method isn't called? What happens if one of those methods is called, but another isn't, and the object is placed into an indeterminate state? What enforces that your subclass -- and all other subclasses -- will conform to these often complex and difficult to describe invariants?
This is very similar to multi-threading with mutable vs. immutable data. By making your data immutable, you grossly simplify the understanding of your system's behavior.
Why do you need to enforce rules, other than documenting the classes/methods defined?
What happens if the superclass method isn't called
Shit doesn't work or it breaks, then the person who sub-classed needs to fix it. Sometimes it also means your class is leaking encapsulation.
This also happens with plain aggregation/composition btw. It also happens with data-immutability (which has nothing to do with inheritance, as your object can be immutable and extend a dozen classes).
Why do you need to enforce rules, other than documenting the classes/methods defined?
The less repetitive work we delegate to human fallibility and instead delegate to a machine, the more time we have for human ingenuity.
Shit doesn't work or it breaks, then the person who sub-classed needs to fix it.
It's not that simple. The more difficult it is to understand the rules of behavior before changing the code, the more difficult it is to change the behavior. It's not just a question of expressing valuable -- but simple -- kindergarten requirements (this value may not be NULL), but also higher-level requirements (this method must be called in the context of a READ-COMMITTED transaction).
The more you can express concisely, the easier it is to mutate the system over time. It's not a question of breaking code -- or noticing when it breaks -- but having the language assist in simply not breaking it at all.
This also happens with plain aggregation/composition btw.
Composition makes invariants easier to understand. If you then design your classes so poorly as to fail to enforce correct behavior through their API insofar as it is possible to do so, that is the programmer's failure.
It also happens with data-immutability (which has nothing to do with inheritance, as your object can be immutable and extend a dozen classes).
Data immutability is related to the avoidance of inheritance insofar as they both very significantly facilitate the full and easy comprehension of an implementation's invariants.
this method must be called in the context of a READ-COMMITTED transaction
I get what you're saying, but I like conventions and clear APIs with proper encapsulation.
Here's a sample from Python/Django ...
@transaction.commit_on_success
def do_stuff_with_the_db():
    db.execute("insert into tmp values (1)")
    raise Exception
Or if you need to supply the DB queries yourself, you can implement your own context manager and use it with a with block ...
with stuff.inside_transaction() as obj:
    obj.execute("query")
No need to extend a class that represents a transaction or some other shit like that.
having the language assist in simply not breaking it at all
You know that's a utopian goal. What I dislike most about languages that try to detect too much shit for me is that they give me a false sense of security. And the worst offender is Java: not only is its type system too weak, but because it is manifestly typed you get the false impression that it guarantees stuff for you, when it doesn't.
> I think it helps for learning to specifically mention everything and be a bit repetitive.
That's right at the bottom level of the Dreyfus model of skills acquisition. One may be able to draw inference from this to the apparent profusion of useless Java coders, but that's further than I'd really want to go. Once you've stepped off that bottom rung, however, it's all noise and just gets in the way.
- Strong push towards threading. Before NIO, there wasn't a way to avoid threads if you wanted to talk to multiple sockets. NIO has solved that for networks, but the whole java framework/idioms push you towards threads where it shouldn't.
- No way to tell how things are implemented. I had written a heapsort in Java. It ran 40x slower than the equivalent C. I was trying to figure out why -- e.g., was it inlining the things it should? If not, why not? It's like Java is a subcontractor that won't answer questions about its methods - that's a bad habit.
> Strong push towards threading. Before NIO, there wasn't a way to avoid threads if you wanted to talk to multiple sockets. NIO has solved that for networks, but the whole java framework/idioms push you towards threads where it shouldn't.
Where? I'm starting to think people dislike Java for the community / way it's used in enterprise rather than anything to do with the language.
There's no push toward threading in the language :/ And NIO has been around for years.
- No way to tell how things are implemented.
What?? Look at the source code. Decompile the class file into byte code :/
>> Strong push towards threading. Before NIO, there wasn't a way to avoid threads if you wanted to talk to multiple sockets. NIO has solved that for networks, but the whole java framework/idioms push you towards threads where it shouldn't.
> Where? I'm starting to think people dislike Java for the community / way it's used in enterprise rather than anything to do with the language. There's no push toward threading in the language :/ And NIO has been around for years.
I've used Java (and was burnt by it) before NIO made its appearance -- it was 2002 or maybe even 2003. I recently had to join a Java project that's "only" been around 2 years, so it could have benefited from the years of NIO -- but other than the base language, there is no async interface to be found. You want to talk to a database or key-value store? You have to block (meaning thread). Want to do a URL fetch? You have to block, because there are no non-blocking libraries to be found.
It's not that it is impossible to do -- it is quite possible -- but it is almost against java culture.
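For reference, the NIO style being contrasted with thread-per-connection looks roughly like this (a bare-bones sketch of the Selector idiom, not production code):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.channels.*;
import java.util.Iterator;

// One thread multiplexes many sockets with a Selector, instead of
// parking one blocked thread per connection.
public class NioSketch {
    public static void main(String[] args) throws IOException {
        Selector selector = Selector.open();

        ServerSocketChannel server = ServerSocketChannel.open();
        server.configureBlocking(false);
        server.bind(new InetSocketAddress(0));  // any free port
        server.register(selector, SelectionKey.OP_ACCEPT);

        // The event loop (one iteration shown; real code loops on select()).
        if (selector.selectNow() >= 0) {
            Iterator<SelectionKey> it = selector.selectedKeys().iterator();
            while (it.hasNext()) {
                SelectionKey key = it.next();
                it.remove();
                if (key.isAcceptable()) {
                    SocketChannel client = server.accept();
                    if (client != null) {
                        client.configureBlocking(false);
                        client.register(selector, SelectionKey.OP_READ);
                    }
                }
            }
        }

        server.close();
        selector.close();
        System.out.println("selector loop ran");
    }
}
```

One thread services every registered channel; the blocking alternative needs a thread per socket, which is the push toward threading complained about above.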
> Where? I'm starting to think people dislike Java for the community / way it's used in enterprise rather than anything to do with the language.
You can say that again!
The Java language is, I guess, ok (I don't like it, but that's mostly a matter of taste). The Java ecosystem encourages verbosity for verbosity's sake, engineering for engineering's sake, and ignoring common sense in the name of "purity".
That Java project I joined, for example, has hundreds of classes that do nothing but store a field or two, have getters and setters for them, and perhaps have one "action" in them. Each such class is only a couple of hundred lines. This is totally useless, and when optimization time comes (and it comes, because the project is running too slow because of hundreds of layers of encapsulation), it is nearly impossible to find a culprit because everything is spread thin through hundreds of useless classes.
I put up a plea for good Java code on HN a few days ago: http://news.ycombinator.com/item?id=2094274 It didn't get high enough to elicit much response (20 on the "ask hn" page; not sure how far on the main). But the responses that are there did not cause me to re-evaluate.
> - No way to tell how things are implemented. What?? Look at the source code. Decompile the class file into byte code :/
I have a merge sort that takes 40x as long as it does in C. The reason I needed to implement it myself is that for the 400MB of data I'm looking at, the library tried to allocate over 2GB (which is all the deployment machine has) -- and that's just my test data. When I try to sort 2GB of data, it would require ~10GB.
I carefully wrote it so that no object gets allocated during the sort. 40x slower than C. WTF? Perhaps the MappedByteBuffer I am using does not inline its .get() method -- that would explain 5x, I think, not 40x. But I have no frigging way of checking.
I am doing high-performance code, which I agree is not common, but my experience with Hadoop (which is becoming ever more popular) is that the design decisions are such that the performance cost is 10x for just using Hadoop compared to writing straightforward code -- meaning you would need 10 nodes just to break even.
I've been burnt by Java. I came back only after I could treat it with an open mind. I've been burnt again. I hate Java.
My main problem with Java isn't the language itself. I think it's a pretty ok language as far as statically typed ones go.
My problem with Java is the way it's currently being used. E.g. using annotations to fix things that the language won't let you do elegantly.
There's a lot of horrible overengineering for relatively simple problems (you don't HAVE to use GoF patterns; they're just a suggested solution. If you don't have the problem, don't use the solution!).
While there are a few nice libraries (e.g. http://jsoup.org/ ), even those usually don't have examples. Especially when you don't know a library yet, it's wonderful to get a few minimalist examples of how to use it. Just compare:
My main problem with Java isn't the language itself. [...] My problem with Java is the way it's currently being used.
Absolutely agreed. Java has some weak spots, but the really bad things (and the most criticized things) are not in the language per se.
E.g. the complete over- and abuse of XML for anything. Complex Ant build scripts. The whole EJB disaster (the older versions, that is). Bad APIs (though in all honesty, most of the really bad ones are 3rd party and not from Sun). The over-use of dependency injection, beans, etc. without a justifiable reason.
Many of these things are just stuff we've learned to do better by now -- for example, JSON and/or YAML over XML configuration, and convention over configuration. Lower overhead and higher understandability are more important than some would-be architect's dreams. On some things the jury is still out (e.g. more or less functionality in basic classes like String, List, ...).
Overall, I think some people are overlooking the many benefits Java has made available to the mainstream. It's entirely possible to write beautiful Java code; you just have to go the extra mile of departing from convention sometimes.
> My main problem with Java isn't the language itself. I think it's a pretty ok language as far as statically typed ones go.
Really? Which statically typed languages have you written in? I think Java (together with C and C++ - let's not let Java take all the blame here) is pretty much the bottom of the barrel as far as statically typed languages go.
Compare some of the type examples in the threads here with Haskell's type system...
(Is this post meaningless functional programming propaganda? Probably. But for the love of god stop the myth that Java/C/C++ are good examples of static typing...)
"My problem with Java is the way it's currently being used"
That is a good point. However, comparing the way things are done in Java and the way things are done in Ruby is a really bad comparison. Those languages live in different problem domains and were grown to solve totally different problems.
As the article points out, Java itself isn't most popular in the niches it was supposed to fill. I don't think it's really appropriate to talk of what they were grown to solve. What they are actually used for is far more illuminating, and while Ruby is encroaching more and more on Java's "home turf," especially now that JRuby has the momentum and capability it has, you can't really say the same the other way.
It says something when Ruby's notoriously shonky documentation is the paragon in an example like this. The comparison with, say, Python, is even more illuminating.
Learning Java itself is simple. As a language it doesn't offer much of anything that isn't already found in other languages, assuming you know other languages.
The biggest effort would be to learn the libraries, and you basically can't do that unless you're regularly writing code that interfaces with them. But that you can do from Clojure or Scala or whatever language that runs on the JVM.
For someone who didn't grow up with Java, what is hard is finding your way around the ecosystem.
If you're a newcomer, and you're going to do a website in Python, you soon find out about Django, which recommends, let's say, Apache and PostgreSQL. You feel good because you immediately got a sense of what you need to know and install to run a website.
Compare that with Java...
I'm using Clojure with Jetty now, I haven't launched yet, and I'm afraid of things like application servers and doing everything with enormous amounts of XML.
I think a great way to learn programming is to start with the simplest languages.
Start with Scheme. It doesn't even have proper objects and classes, so you'll be forced (guided, actually, if you go the SICP route) to implement them yourself. This teaches you in an elegant way about essential concepts in programming, but manages to keep your head clear and unbiased about the many different ways to do things, whereas a language with a strong fundamental paradigm (e.g. Java) may bias you forever.
Then learn C. It couldn't be more different from Scheme, but that's a good thing. It forces you to think about things you thought you didn't really have to think about - such as memory allocation.
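To make that concrete, here is a minimal sketch of the bookkeeping C makes explicit (the function name is just for illustration):

```c
#include <stdlib.h>
#include <string.h>

/* In Scheme or Java the runtime reclaims memory for you; in C every
   allocation must eventually be paired with exactly one free(). */
char *duplicate(const char *s) {
    size_t n = strlen(s) + 1;       /* +1 for the terminating '\0' */
    char *copy = malloc(n);
    if (copy == NULL)
        return NULL;                /* allocation can fail; caller must check */
    memcpy(copy, s, n);
    return copy;                    /* ownership passes to the caller */
}
```

The caller is now responsible for calling free() on the result; forgetting is a leak, doing it twice is undefined behavior. That is exactly the class of concern garbage-collected languages hide from you.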
With a solid basis in these two languages, you've covered just about every paradigm or building block of programming languages you can imagine. Learning another language then becomes trivial, and for bonus points, you'll be a much better programmer in those languages too.
Learning Java the language is not a big problem. Preventing a young Java programmer from becoming an architecture astronaut is. Most Java frameworks I have played with seem made by and for architecture astronauts (it's like they have been playing design-pattern bingo), and exposing young programmers to architecture astronautics may damage them irreversibly. By the time the conversation gets down to real-life stuff like actual method invocations, they are lost.
I can't say I disagree with the author. Even with all its troubles, there's simply too much legacy Java code to abandon. Much like Cobol and Perl, there will always be a need for Java programmers to--at the very least--maintain existing code.
If you're a complete newb, learning Java will also introduce you to the C-like family of languages. (Although I say everyone should learn C first in order to appreciate memory management)
It depends on what you think is cheap. But Java developers that know even a little are doing fairly well right now. Java developers that know a lot can demand top dollar, and there are tons more opportunities to earn that amount than with pretty much any other technology.
Java will certainly not teach you anything about C beyond how to put statements in braces and a couple of constructs with similar syntax. And if someone thinks that is C, he doesn't know C.
Although a Java translation of a C program still looks C-ish, I agree; idiomatic Java is very different from idiomatic C. (This is not surprising; Java is intended to be "C++ done right", and idiomatic C++ is rather different from idiomatic C.)
Java is part of the C family of languages; you can draw a pretty straight line between C++ and most of Java's features (or non-features, or missing features).
The basic grammar and syntax are pretty much the same, as are most of the fundamental data types and their behaviors, and many other things. There are obviously a lot of differences, but if you know one language it usually isn't hard to read well-written code in the other language.
> as are most of the fundamental data types and their behaviors
Don't ever tell that to a C novice coming from Java. Or you will have fun with interesting array constructs, functions returning pointers to variables created on the stack, etc.
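To illustrate the second pitfall (a sketch; the function names are made up): in Java, returning a freshly created array from a method is the normal idiom, but translating it literally to C hands the caller the address of a stack variable that no longer exists.

```c
#include <stdlib.h>

/* The Java habit, transplanted literally, is a bug:
 *
 *     int *broken_make_array(void) {
 *         int a[3] = {1, 2, 3};
 *         return a;   // dangling: 'a' dies when the function returns
 *     }
 *
 * 'a' lives in the function's stack frame, so dereferencing the
 * returned pointer is undefined behavior (most compilers warn). */

/* The C way: allocate on the heap and make the caller free() it. */
int *make_array(void) {
    int *a = malloc(3 * sizeof *a);
    if (a == NULL)
        return NULL;
    a[0] = 1; a[1] = 2; a[2] = 3;
    return a;
}
```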
I think he means C-style syntax rather than C family of languages. Java certainly is made to read like C and C++ more than anything else even if it is very different under the hood.
I'm formerly a COBOL developer, now working with Java. Did anyone else think that his main argument for learning Java could also work for learning COBOL? (substitute COBOL for Java in the quote below)
<quote>
It's not going to go away any time soon. There's too much momentum. There's no need to worry. Ever since the launch of Java I've heard that it's going to be gone or unusable tomorrow. History shows that just doesn't happen to popular programming languages.
If you learn Java now, you may still be using it 20 years from now. Or 30.
</quote>
Like others here have said, my main issue with Java is not the Java language itself, but rather with the culture of overcomplexity. As an example, every web framework has the issue of abstracting pieces of templates in a consistent manner. Django solved this by writing their own (simple) template language that supports template inheritance.
The Java ecosystem seems to have solved this with SiteMesh, a library that implicitly decorates XHTML for you, driven by yet another XML configuration file.
I see a few things wrong with this approach. For one, it presupposes all of this nice template work is only ever going to be needed for XML/HTML. Need to send a text email? Need to make a LaTeX document? Second, it's entirely implicit. This is a common theme with Java libraries: if the usual approach is too verbose, the solution is usually something implicit. I don't think there's a good reason Java can't be concise and explicit. It just isn't the thing to do.
Your point is absolutely correct, but I think your example is a little unfair. There are a ton of template libraries that handle inheritance or code sharing in both Python and Java, some restricted to XML, others freeform. Saying that any one on either side has "solved" it ignores the others that have approached the problem from another direction.
While I agree with the author's summation that Java will be here forever and a day, the real question one has to ask oneself is: will Java continue to receive developer mind-share in the future? Mind-share generally translates to all the cool problems being worked on in that language.
The second question is will Oracle further drive away developer mind-share?
I think the answer to both of those is self-evident, and when analyzed in that light, learning Java looks a lot like learning COBOL around the time Java hit 1.2. The cool stuff was happening over in Java, which by then was stable enough to take off.
I would say the big difference at this juncture is that there is no clearly evident replacement, as there was with Java and COBOL. But I do believe the mind-share shift has happened, and we are all looking for the next king among the court.
I would say learn Java not because there is a lot of existing Java code (using legacy code as the main argument is a bit presumptuous) but because the JVM is a versatile and powerful foundation for your code: profiling, debugging, hot class-swapping, monitoring are really well done.
Then there's the ton of libraries for everything and good tooling.
Of course, the language itself might be a bit too verbose for some people, but I'd rather take Java and the JVM than another language where I'm missing some of this.
I guess most people don't differentiate Java the language from the Java Virtual Machine, and that's a shame. I could be doing JRuby on the JVM and be happy, then drop into Java for some optimized code. Thus, Java just happens to be the "C" programming language of the JVM.
You should always understand the programming language your platform of choice is built upon. That means Java if you are going to use the JVM in some way, just like it means C if you are going to develop native applications on Windows or Linux.
I have always been astonished at VB6 programmers who didn't know what a Variant was, even though they were using that datatype all the time.
So let me say first that Java was my first language, and I don't believe I've been corrupted in any nasty ways.
I was 10 when I learned Java, and it always threw me off having to copy those first two magic lines, which I never understood. What is 'public class'? Why 'static'? What does the word 'void' even mean? 'String[]'?!
Sure, after a few months I figured out the meaning of all of those words -- but it wasn't nice having to do that "open up last file I wrote / copy magic header to new file".
I don't think it's wrong to teach Java as a first language, but I think that there are better first languages which show what real computer science is faster than Java does. Less coercion of a language to do what you want and more free-flowing ideas.
First off, even removing line breaks it is still verbose. I can't think of another language that's even close. Even C++ would give you a shorter version.
Far more important though is how much stuff is in that one line of code that needs to be understood in order for what you are typing to be more than just arcane magic words. Pedagogically speaking that is an awful way to get started, so I think the claim that that is not a great way to teach kids is pretty valid.
To be fair, I fully agree that 'Hello world' is not a good way to assess the worth of a language in general, but the ease of getting started is a valid concern when using a language to introduce programming.
class M { public static void main(String[] a) { System.out.println("Hello World"); } }
The problem with using this as learning material is that the teacher has to treat the majority of the code as "magic" until the student is ready to learn about classes, access permissions, methods, arrays, namespaces, types and so forth.
In many other languages, there's much less boilerplate to handwave away:
Your example illustrates one of the main problems with verbose languages. Imagine showing this line of code to a child! I think anyone new to programming would be completely baffled. Before they could understand even the "simplest" Java program they have to know what is meant by public, static, void, main, String, arrays and more...
"And that startup time of JVM is orders of magnitude worse than pretty much anything else."
Sure, let's throw out of the window all the optimizations the VM can do at runtime because it takes a whole second to start the VM.
And you need to actually compile the source file before executing it!
Those server applications should all be written in bash because it's instantaneous!
Kidding aside, one prof at my university once told me that she wished students still had to compile and wait for a program to start. When there is such a delay, it pays to read your code and try to understand it before running it (as opposed to run-and-see-what-happens).
> And that startup time of JVM is orders of magnitude worse than pretty much anything else.
This one never ceases to annoy me.
The JVM is optimized for LONG-RUNNING PROCESSES. It's targeted specifically toward running on servers, because that's the default use case. There are about a million option switches, GC controls, etc. if you want to run it in another way.
If you want fast startup time, try compiling with gcj or something.
In any event, I run a javac->java loop many times an hour when developing, and the startup time is unnoticeable.
Yes, I think what you allude to is that Java is just too verbose and unnecessarily complex, and I agree. That said, the tools available for Java (e.g. Eclipse) are pretty nice.
My kids will get the full C drill. Basically the same way I learned programming.
Training wheels are only necessary if you could hurt yourself by practicing. Where there is no such danger, there is no reason not to start with the "hard" stuff. It's like learning guitar or piano - you don't get a practice instrument with only two strings/5 keys that always sounds good and forgives errors.