I'm no fan of universities focusing solely on Java programming either, but the contentions Dewar makes are completely ludicrous. Either he's deliberately distorting things to make his point or he's completely out of touch with real-world engineering projects. Java is pretty much the main language for business programmers these days, and plenty of things like financial transaction systems are written in it, so saying it's only used in simple web applications is laughable. A large percentage of the people writing simple web apps have moved on to Ruby and Python anyway, leaving Java to be used primarily for back-end enterprise systems.
Similarly, quizzing candidates on whether they can diagnose and track down compiler or processor bugs is useless; there are far more critical skills to test for in an interview. If I were interviewing someone for a position at a hardware company writing software for prototype chips, then I'd probably care; for roughly 98% of the other programming jobs in the world, hardware knowledge is about 152nd on the list of things I'd care about.
It's hard to imagine how someone could take this guy as an authority on how universities should educate people and what job skills graduates will need when he's so obviously out of touch with the real world.
#!/usr/bin/env python3
####################################
# Curmudgeon_Article_Generator.py  #
####################################
import random

# Technologies the curmudgeon either scoffs at or claims to have learned on.
things = ["C", "Ada", "Assembler", "Fortran", "Lisp", "binary programming with patch cords", "time-sharing mainframes",
    "dropping off punch cards and waiting two days for the output", "butterflies", "a magnetized needle and hard disk platters",
    "C++", "C#", "Python", "Ruby", "Django", "Ruby on Rails", "Cobol on Cogs", "Scheme", "Smalltalk", "Haskell", "Erlang",
    "distributed systems", "parallel algorithms", "network programming", "Win32 APIs", ".NET", "Perl", "Unix", "Windows 95",
    "Windows Vista", "GUIs", "the VAX command line", "vacuum tubes and a soldering iron", "an abacus", "clay tablets",
    "wire, duct tape and spit", "transistors", "slide rules", "microcomputers with 5 1/4 inch floppies", "machine language", "a PDP-10"]

# Print twenty randomly generated complaints.
for _ in range(20):
    print("\nKids today don't know squat about programming.")
    print("Why, when I was that age, we didn't have %s." % random.choice(things))
    print("We learned programming the hard way, using %s!" % random.choice(things))
    print("Boy, am I ever grateful.")
I am biased, but I completely disagree with you. I had him as my professor three times while I was at NYU, and Dewar is very much in touch with real-world engineering projects. Also, hardware and OS-level knowledge is very important during performance optimization, which very often gets left out of "real-world" programming jobs due to lack of time in the project schedule and lack of knowledge on the part of the very same programmers who came out of a vocational education. That is something to lament in the industry, not a reason to say "oh well, since we don't do it and seem to get away with it, let's maintain the status quo".
That's a different argument from the one made in the article. The article's argument, that Java isn't used for complicated projects and that engineers need to be able to debug compiler or processor bugs, is an incredibly poor one.
I've done a lot of performance optimization of Java apps in the last six years, and I've never once had to look at bytecode or think about processor instructions or OS issues. I've had to think about memory allocation and consumption, synchronization, algorithmic complexity, caching, query plans, database denormalization, and so on. Obviously optimizing a C application is a different world, but given the rise of managed/VMed languages like Java, Ruby, Python, and C#, it's definitely not the case that even a majority of software engineers ever need to dive that close to the hardware these days. It's good to hire someone who can do that, or who can learn to if your application might need it, but in general it's just not as important as a lot of other skills, and I think it's pretty incorrect to say that engineers who can't do it will have no job prospects in the US.
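To make that concrete: the kind of optimization that actually moves the needle at this level usually looks less like reading disassembly and more like not doing expensive work twice. A minimal sketch (in Python for brevity; the function names and the 100 ms "database call" are made up for illustration):

from functools import lru_cache
import time

def expensive_lookup(customer_id):
    # Stand-in for a slow database query or remote call.
    time.sleep(0.1)
    return customer_id * 2

@lru_cache(maxsize=1024)  # cache results so repeated calls skip the slow path
def customer_discount(customer_id):
    return expensive_lookup(customer_id)

if __name__ == "__main__":
    start = time.time()
    for _ in range(100):
        customer_discount(42)  # only the first call pays the 100 ms cost
    print("elapsed: %.2f s" % (time.time() - start))

None of that requires knowing what the JIT emits; it requires knowing where the time goes.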
The point is not that he thinks "Google is a simple web app". His point is that the vast majority of engineers who learn Java as their first programming language, and never touch the systems-level stuff below it, are unlikely to ever create a system as complex as Google. That seems plausible, since we know how much low-level hardware work Google does to keep its systems running, from running a custom Linux kernel to writing its own in-house programming languages.
Not that I don't agree with you, but are universities meant to be that vocational? Are they there for specific job skills, or more for general knowledge rather than specific skill sets?
Universities should not be vocational and neither should high schools, IMO, but for some reason in the US the idea of purely vocational education is taboo. So the two get mashed together and you get this tension between the lofty notions of a liberal education and the need to make a living.
Separating people who want to build car engines from people who want a liberal education at the secondary level is seen as a sort of elitism.
I totally agree that they shouldn't be vocational; I was just pointing out that the guy's arguments about what's useful for the job market were totally specious. I personally think schools should teach computer science instead of just programming (or, if they want to be vocational and give people certificates in Java Programming, go for it, just don't call it a BSCS), and that they should expose you to a large variety of languages, techniques, and ways of thinking. There are plenty of things to complain about in CS coursework at various places, but saying Java is useless in the real world, or that debugging compiler or processor issues is a critical skill for every engineer, is ludicrous.
What we need are software engineers who understand how to build complex systems. By the way Java has almost no presence in such systems. At least as of a few months ago, there was not a single line of safety-critical Java flying in commercial or military aircraft.
Such a breadth of options! Surely if we teach students how to create avionics systems for commercial aircraft or military aircraft, they will never want for jobs!
If he were to point out that using any single language as a vehicle for computer science is bad because it ties the knowledge to APIs rather than underlying concepts, that would be one thing. If he were to further point out that a lot of complex systems nowadays aren't written in a single language, and that being able to quickly acquire competence in new technologies is possibly the most valuable skill a university can give to its students, that would lend even more weight to his argument.
But no, he's just whining that no one thinks Ada is relevant anymore. I'm sure he's having trouble finding fresh college grads who share his passion for safety-critical avionics systems, but I really can't muster up that much sympathy.
A long, long, long time ago (my first job out of uni) I used to do machine control systems (thankfully, not for aircraft), and there were often safety issues (so you didn't burn/crush/soak people with acid, etc.). We never used a Turing-complete programming language, as it was seen as too risky; it was always something primitive like ladder logic, which seemed intuitively easy to reason about and forced simplicity and fail-safe behaviour (a toy sketch of what I mean is below).
I am not sure what has changed, but I wouldn't want to be doing stuff like that now with a general-purpose programming language (be that Ada, Java or, for reasons I will never understand, C).
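Roughly the sort of thing I mean, as a toy sketch (in Python only because it's compact; the states, inputs and "safe" default are made up, and real controllers are far more constrained than this):

SAFE = "all_outputs_off"

# (current_state, input) -> next_state; anything not listed falls back to SAFE.
TRANSITIONS = {
    ("idle", "start_button"): "running",
    ("running", "stop_button"): "idle",
    ("running", "guard_open"): SAFE,
}

def step(state, event):
    # Unknown state/event combinations always fail safe.
    return TRANSITIONS.get((state, event), SAFE)

if __name__ == "__main__":
    state = "idle"
    for event in ["start_button", "guard_open", "start_button"]:
        state = step(state, event)
        print(event, "->", state)

The appeal is that the whole behaviour fits in a table you can reason about, and anything unexpected drives the machine to the safe state.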
I dunno, you deal with the risk of standing on thin air no matter what you use, unless you've designed your device from the silicon up.
There's not a strong argument to be made for using Java in an industrial setting, but I'm not clear that a finite-state machine that compiled down to JVM bytecodes would be inherently less safe than anything else.
Ada doesn't guarantee the correctness of your program, and C certainly doesn't. This interview reads more like an inarticulate complaint about how things keep changing than a real argument against trends in education.
Minor nit-pick: I work around (but not with) Ada code, and a common pattern is that if you can get your Ada code to compile, it's pretty likely that it's safe.
If you're just saying Ada doesn't guarantee that the program does what you really need... no language can do that.
C is the 2nd most unsafe language I've worked with, asm being the first.
This professor is also the CEO of AdaCore, which sells an Ada development environment and other Ada-related tools. This explains why he is opposed to Java and is so bothered by people who file sloppy bug reports for compilers. Also his claim that "Java has almost no presence in complex systems" is ridiculous. This article is junk.
I don't think those were his main arguments. His strongest argument is much more about outsourcing and the need for students to be able to do the hard jobs. We should also remember that the professor didn't write the article himself, so this is some journalist paraphrasing his words.
I think the bottom line is that students whose university careers consist only of programming in Java miss out on a lot of the low-level technical details of how a computer works. A good CS program will teach concepts instead of tools and languages; teaching tools and languages is what a "technology" degree is for. There is a place for Java, but teaching it exclusively at the expense of C, C++, Assembly, etc. is foolish and leads to exactly what this article says it does: people who don't really know what they are doing and are easily replaceable by cheap overseas labor.
"[...] students need to be able to do the hard jobs".
Difficulty comes in all shapes and sizes. Your next point is better: "[...] teach concepts instead of tools and languages".
That approach to education sees any language and/or run-time environment as an opportunity to work with an underlying concept.
A study of any language or run-time to the exclusion of the underlying concepts is short-sighted. Any student with a language-centric approach to learning faces many difficulties in their professional life. We must all be able to adapt to changes in our field. We must be able to separate fads from what is fundamentally different.
> You begin to suspect that a problem you are having is due to a hardware problem, where the processor is not conforming to its specification.
What is the probability of that happening?
The last time this happened in any significant way was the FDIV bug on the original Pentium. That was 1994. It was extremely rare in practice. That's a ridiculous question and completely irrelevant to 99%+ of software development.
I agree with sofal: all these "you should think and work exactly the way I do or else you're stupid" pieces get really fucking old.
Fucking around with disassemblers and assembly language, rather than just running on AMD chips instead of Intel until the problem gets resolved, is a massive waste of time.
I have worked with more than one microcontroller in the last few years that was not operating to spec. On top of that, in one case we couldn't even do a workaround due to a compiler error. I think the problem is a lot more prevalent than you think.
Was that some random product from some 3rd party outfit, or a flagship CPU from a major manufacturer that's installed on tens of millions of computers?
I think you're missing the point. He's not looking for hardware hackers. He's looking for people who know how a computer works. Good job candidates should have no trouble answering that question regardless of whether they've actually ever seen a CPU bug.
"You begin to suspect that a problem you are having is due to a hardware problem, where the processor is not conforming to its specification. How would you track this down? How would you prepare a bug report for the chip manufacturer, and how would you work around the problem?"
Actually he's looking for people who know how to track down a processor bug, report it appropriately to a manufacturer, and work around it. Would it demonstrate complete incompetence and a failure of my CS education to answer: "I would learn how to track it down, learn how to submit a report, and then figure out how to work around it."?
It's one thing to expect job candidates for AdaCore to be experienced embedded-systems developers, but it's another thing to expect university CS departments to throw their resources into that small corner of computer science so that he can hire new grads without having to bring them up to speed on embedded-systems work. I'm sure that CS departments with tons of resources can provide an embedded-systems track for undergraduates.
IMO the most pragmatic way to deal with the question as posed requires no knowledge of computer hardware.
If you want hardware knowledge, ask about pointer arithmetic and memory hierarchies, which is still mostly irrelevant to what most developers spend most of their time doing.
Before someone launches into a jihad about memcached and load balancing and super high-scale stuff, most people never have to do that.
Why can't they teach both: the high-level reality of today, but also down to the metal? (It's how I did it, although the "high level" back then was Miranda and Pascal!)
It really depends on the context of your products. If you are working on embedded systems, you have to know when your compiler and hardware have problems. And not all software products work like web-based software; web-based software has its own problem domain, like scaling and security, which is very different from other areas of software systems.
Another case is programmers working on operating systems. I remember this URL, http://marc.info/?l=openbsd-misc&m=118296441702631, being mentioned a while ago when the OpenBSD group found a lot of hardware bugs in the Core 2 Duo.
The danger is misplaced blind faith in the reliability of the systems we are using, and teaching students only Java is a sin for computer science precisely because it builds such blind faith.
I am the last one to defend using Java as a teaching language.
The professor in the OP seems to be angry that there are people who spend their time doing things other than making toys for the Pentagon, and I say I don't give a hoot about his Ada silliness.
I was in fact a student who took his compiler class at NYU 12 years ago. A lot of students didn't like his classroom presentation because he would rather talk about topics that weren't covered in the textbooks (books were for reading at home, for the homework and the project). His class was project-based, so in fact the only factor in the grade was the project. I had to write a code generator from the GCC AST to x86 assembly for the subset of Ada that I chose. Writing that backend was the exciting part, and it's beyond description unless you've done it.
He just belongs to a very old school in this discipline that holds the belief that you need to know what you are building from the lowest-level components up. You don't need to be an expert unless you work in that domain, but at least you are not clueless.
My recollection may be wrong, but I thought his Ph.D. was in fact in chemistry, and that he built and sold COBOL compilers before he worked on the SETL language and Ada.
You are correct. As I mentioned elsewhere in this comment stream, I also took his class, five years ago. I have to say that I learned a ridiculous amount of low-level systems stuff from his classes, and very much enjoyed his classroom presentation. I completely agreed with his view that a professor is there to present his expert opinion on the topic, which may very well be different from, or tangential to, the curriculum you cover by doing the reading.
I was more attracted to project-based courses when I was there in graduate school. Other professors I liked very much were Ken Perlin and Denis Zorin; it was a real learning experience to build graphics stuff from scratch.
I didn't take Dennis Shasha's classes, but his heuristic learning and distributed computing courses are also project-based.
Although he has an odd way of expressing his opinion (and mostly appears mistaken), I think his teaching would be very interesting, and I would definitely listen to it if I had the chance.
I wonder how much of this is headline-grabbing (which is fair enough; it will perhaps help enrollment, at least among the smarter students).
I don't think he is actively teaching now. He holds emeritus status, so he has no duty to teach unless he wants to. He probably spends most of his time working on AdaCore and its spinoffs. I suspect his attitude is due to his father's influence; his father was Michael J. S. Dewar, a very famous organic chemistry researcher.
In fact you can get a similar experience just by doing a project yourself. I learned most of what I know from projects; a course just provides the pressure to finish something before a deadline. PG and YC use demo day to push applicants to finish something, with a potential payoff for finishing it. (Why? Because human nature tends toward procrastination!)
So maybe you just pick a non-trivial but not-too-difficult problem (unless you're aiming for a Turing Award and want to be the first to prove or disprove P = NP), find some friends or comrades who want a challenge, set a deadline and a payoff, and go for it. You learn by doing, and the more you do it, the easier it gets.
So I guess that kinda shows that uni teaching should not be too vocational, other than for team/assignment/project work. Sounds reasonable. I pretty much had to learn C in that context of projects etc. (we weren't initially taught it; we were taught other languages, and when it came time for C they threw the K&R book at us and said "off you go").
The Pentium has 70-something errata, each of which had to be found and worked around by somebody. Now consider the range of microprocessors and microcontrollers used in handheld devices, appliances, and other electronic equipment.
I personally have run into compiler issues with Lattice FPGAs where the compiler was outputting incorrect binaries. And I couldn't just run on another chip until the problem got resolved, because the hardware was being developed in tandem with the firmware and we had already progressed beyond the point of component selection.
I understand that this is beyond your experience. These are obviously not the droids you are looking for.
Honestly, my introductory CS class was in Java, and the claim about dependence on libraries is complete BS.
We were required to write all of our own data structures/classes as we learned them.
After we proved our competence in making proper structures, our professors would then allow us to use certain classes from libraries to get to the meat of more advanced topics.
While I don't agree with using Java as a first CS class language solely based on popularity, the future of students who take one or two CS classes is greatly benefited by the word "Java" on a resume. Java isn't THAT bad. And at least the C-style syntax matches up with a lot of the most popular languages.
Using Java as the primary teaching language in a CS curriculum does not necessarily mean the students are relying on libraries. From what I can tell, it's his perception that there is more reliance on libraries in course work. He mentions no studies.
The main language we used was C++ (although for the intro course, it was the C subset of C++, but with C++ style I/O). In our classes we were expected to implement our own data structures, we weren't allowed to use the STL (except for strings). When we got up to classes where they assumed we knew how to implement the classic data structures (say, compilers), we could use the STL.
Personally, I don't think Java is a good first programming language. But it's still possible to construct good courses that use Java - it depends more on how the course is structured than what language is being used.
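The flavour of exercise I'm describing, stripped to a sketch (in Python here purely for brevity; the class and method names are made up): build the structure yourself instead of reaching for the standard containers.

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedStack:
    """A stack built on a hand-rolled singly linked list; no library containers."""
    def __init__(self):
        self.head = None

    def push(self, value):
        node = Node(value)
        node.next = self.head  # new node points at the old top
        self.head = node

    def pop(self):
        if self.head is None:
            raise IndexError("pop from empty stack")
        value = self.head.value
        self.head = self.head.next  # unlink the old top
        return value

if __name__ == "__main__":
    s = LinkedStack()
    for x in (1, 2, 3):
        s.push(x)
    print(s.pop(), s.pop(), s.pop())  # prints: 3 2 1

Whether that's done in C++ with raw pointers or in a managed language, the point of the exercise is the same: you own the links.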
'Furthermore, Java is mainly used in Web applications that are mostly fairly trivial,'
I'll add an Nth comment on how ludicrous this article is. Is building a highly scalable server really as trivial as all that?
Higher levels of abstraction are good. Using libraries and avoiding low-level recoding of basic functionality is good. Get serious; this is so obviously trash it's not worth commenting on further.
I think some of the commenters are missing the underlying message here.
The point is that if you start out by using a really high-level language like Java or Ruby or, going even higher, JavaScript, then overall you won't be a very good engineer. To give an extreme example, JavaScript is Turing-complete, so you can re-implement anything in it, but I still wouldn't hire somebody who's only ever used JavaScript, even if he wrote an "OS" in it.
The reason is simple: sometimes you have to go down to the C/C++ level, sometimes you have to go down to the TCP/IP level, and so on. If you can't do that, you're not as valuable as someone who can, and you'll be outsourced.
I can even give an example from my own life: for three years I worked as a C/C++ programmer, where the work involved lots of debugging, dealing with memory managers, all kinds of fairly low-level crap. It wasn't rocket science, but it required a different mindset than working at higher levels. I hated it, because I wanted to do more interesting, more modern stuff. So I quit. The company desperately tried to keep me on, but I wouldn't stay. I went to work for a smaller company, like a startup, where I was given a small project to design and implement on my own, since I was hired as a senior programmer. Now, I'm not a particular fan of Java, but this project just cried out for it. A year later I was basically laid off (among other reasons) because the company realized that since the work I was doing was not low-level, they could hire any "PHP monkey" for a third of what I made. I've basically been "outsourced", although not geographically.
So the lesson to engineers is: don't go work for small webby startups unless you're a co-founder, because most of the stuff they do requires moderate skills and thus they don't need to pay good engineers, plus they're always short on cash =)
How does "low-level" imply difficulty? Why are low-level programming tasks considered harder than tasks higher on the abstraction ladder? I don't think the difficulty of your job has anything to do with with the level of abstraction you're working at. If you hold job difficulty constant and vary the level of abstraction you'd probably see different salaries based on the supply and demand for programmers working at that level.
Low-level is not superior to high-level. They each have their own different challenges. I don't think it is a bad thing to focus on high-level for your career. That does not mean spitting out half-baked PHP web pages. Of course it's helpful to have been exposed to low-level concepts at some point.
This is the same tired old argument about how real engineers program in binary and you ignorant handicapped kiddies can only program in C. The only thing that changes is the level of abstraction at which the 'real engineers' work, and that just keeps going up.
The point is, low-level, as the name implies, is at a lower level than high-level. So every once in a while, you'll run into problems at the higher levels, and you have to dig into lower level stuff to solve a problem.
So what do you do if you can't, because you have no conception of the lower levels or the overall architecture? I've seen PHP programmers run into bugs, throw up their hands, and yell "It just doesn't work! But the bug can't be in my code!". Of course the bug was in their code; they just have no deep understanding of even their own environment.
'students' reliance on Java's libraries of pre-written code means they aren't developing the deep programming skills necessary to make them invaluable... they are not fully prepared to compete against what is now a global field of rigorously educated software developers'
I think it's fair to say that most professional programmers in developed countries (a) make plenty of use of existing libraries, (b) may not have 'deep programming skills', and (c) probably don't have a formal education in Computer Science. It also seems unlikely that the majority of developers in India or wherever have these either.
The main reason it's hard for a CS graduate to get a job seems to be that agencies and companies want commercial experience, which can be hard to come by.
'Dewar says that if he were interviewing applicants for a development job, he would quickly eliminate the under-trained by asking the following questions... "I am afraid I would be met by blank stares from most recent CS graduates"...'
...and by many 'professionals' too, I would imagine.
The beauty of Java is that it frees the student from low-level mistakes (pointer mis-initializations) to spend more time on design/making and fixing higher-level issues. Getting core dumps from GCC while compiling C code ain't the most stress-free experience.
spend more time on design/making and fixing higher-level issues
Of course, there are "higher-level issues" that you can't express in Java. Java convinces students (and professors, sadly) that the only kind of OO is single-inheritance with interfaces and no MOP. Java convinces students that programming is all about making the not-really-static types match. I won't even mention what a hack functional programming is in Java (and I'm sure most students never even hear how to hack in FP to Java). Checked exceptions were widely considered a huge failure. Honestly, I think most of the time you'll spend with Java is going to be time spent working around its retarded limitations. Do we really want to teach kids that programming is about dealing with limitations in your tools?
At the very least, limiting yourself to Java limits your ability to learn programming. Java isn't a very good language. It is missing features that were widely considered "a good thing" 20 years before its invention. Note that I don't care about teaching memory allocation and freeing the memory, that's monkey work, not computer science. So I'm not advocating teaching C. But languages like Lisp will let you teach all the computer science without workarounds, and still have all the "safety" of Java. (Plus, Lisp programmers get paid a lot more than Java programmers. Surely that's a selling point?)
It's also true that the failure of Java is that it cannot express a mis-initialized pointer.
A good programmer has to know both high-level and low-level stuff. And also has to be able to understand or, at least, to make a decent hypothesis on why GCC core-dumped.
I agree. I don't have a problem with using Java for an intro to data structures and algorithms course, provided that it isn't the last stop in the curriculum. I had to learn data structures using C++, which meant I had to figure out pointers and segmentation faults at the same time I was learning the algorithms behind lists, tree traversal, etc.
A curriculum shouldn't be dumbed down, for sure, but if you can make the learning curve easier, increase retention, and keep exit standards high, why not?
I'd think debugging those kinds of mistakes is part of learning. I've worked with too many programmers who can't debug because they don't really know what a pointer is. Even in Java, understanding what the JVM is doing ON TOP of the hardware helps you understand how to write good code and debug slow code.
Once qualified, sure don't reinvent the wheel, but when you are learning it's a vital experience.
> I think it's fair to say that most professional programmers in developed countries (a) make plenty of use of existing libraries, (b) may not have 'deep programming skills', and (c) probably don't have a formal education in Computer Science. It also seems unlikely that the majority of developers in India or wherever have these either.
Making good use of existing libraries is a good thing. Lacking basic skills obviously isn't, but I think the reality is that many 'professionals' do lack these skills.
As to having a formal education - I did a year of Computer Science, and I'm glad I did, but I think that programming requires so much self-learning that it probably isn't that important.
I'm tired of reading about how one person's favorite position on the abstraction continuum is the most important. Or at least I'm tired of reading that argument with nothing to back it up. Sure, CS departments may benefit from changes, but I'd like Dr. Dewar to tell us why safety-critical systems programming, compiler debugging, and hardware spec testing are what university CS departments should focus on.
To be fair, there was a brief bit along the lines of "because the easy, low-hanging fruit, web apps in Java jobs are the first ones to get outsourced". Whether that's really true or not is another matter.
I'm all for shifting CS education away from easily outsourced commodity skills, but Dr. Dewar goes beyond that and argues that safety-critical embedded systems is where a CS education should be focusing. I'm sure if you asked a CEO of another tech company with a different technology focus you'd get a completely different take on what a CS department should be instilling into its students. Dewar is just another special interest lobbying for changes that will benefit his own industry.
The reason why compilers are quite possibly _the most_ important subject in CS, and should be a mandatory course in every department, was well outlined in Steve Yegge's "Rich Programmer Food" blog post. The quick-and-dirty summary: anyone who can write compilers has the tools to manage codebases an order of magnitude larger than someone who can't. http://steve-yegge.blogspot.com/2007/06/rich-programmer-food...
The broad generalizations in Dewar's opinions are unhelpful. If you want to write software for military airplanes, then you should go ahead and learn Ada (whatever that is). If you want to build applications for the web or for "low-tech" enterprise, then you'll probably be better off learning Ruby, Python, Java et al. Both are perfectly viable ways of making a living and there is plenty of demand for both types of programmers.
In brief Ada is a safety and reliability oriented language popular in aerospace and the military. It is the official (computer) language of the US army.
"In essence, he said that today’s Java-savvy college grad is tomorrow’s pizza delivery man. Their skills are so easily outsourced that they’re heading for near-term obsolescence."
That would only be the case if there were a finite amount of programming to do. Since there isn't, there's room for both local and outsourced programmers.
I think part of his point, which he has failed to articulate, is that future programmers need to undergo the inherent rigor of low-level programming so as to build the mental muscle, so to speak, needed to solve highly complex problems.
I believe a suitable analogy is how Roman legionaries initially trained with lead swords.