I've been programming for 20+ years, and recently moved over to Python. Sure, I could code on day one and figure out how to get programs working pretty easily. But the nuances are still something I need to work on a lot. I still don't program Pythonically; I program like a C programmer writing Python. In fact, I probably program in every language the way a C programmer would, and that's not good enough, in my opinion.
I have fallen in love with Python because it's so damn easy to get productive, and I really want to be a great Python programmer. That takes a lot more time than the OP suggests, and requires you to immerse yourself in the patterns of the language and in the community, in my opinion. Not just dabble a bit and then check a box saying "I'm a polyglot!"
Those shortened links take you to Amazon (not affiliated in any way, FYI, just thought the length of the URLs was obnoxious)
Those are the definitive Python programming books I came up with, anyway.
While we're at it, I have to say that once I learned to really code Pythonically, I found that I can apply the PEP8 standards to almost any language. Admittedly, I, like John Siracusa, am a top-level-language dilettante and don't live in C or C++ production code (I sometimes use objc but swift is...easier :). I remember learning C, and thanks to Arduino I'm certainly using some variant of C/C++ more heavily there, but my coding style follows more or less the Pythonic standard (with PEP8 being the backbone of that).
Just FYI: Hacker News already shortens links, no need to pass them through a 3rd party service. I personally prefer seeing a readable domain name before clicking (as I'm sure many others do as well).
But then the same question as with other short URLs arises: does amzn.com really belong to amazon.com, or is it a third-party service that may redirect you somewhere else at will?
The point is, I don't care whether amzn.com really belongs to amazon.com or not. I want to check a domain quickly without having to research that kind of stuff.
Thanks for sharing these. I've been writing a lot of Python code solo lately and have started to worry about how to make it more Pythonic so it's maintainable and comprehensible to any future collaborators.
I think the Python community considers list comprehensions more readable than loops.
And generators let you write simple code. I don't consider generators to be fancy - just a tool for pulling complexity out into a small area and making the rest of the code readable.
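For example (a trivial sketch of my own):

# A loop:
squares = []
for n in range(10):
    if n % 2 == 0:
        squares.append(n * n)

# The comprehension most Python programmers would reach for instead:
squares = [n * n for n in range(10) if n % 2 == 0]

# And a generator, pulling the iteration logic into one small place:
def even_squares(limit):
    for n in range(limit):
        if n % 2 == 0:
            yield n * n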
It takes a while to get used to the idiomatic constructs of a language. Luckily it's much easier than with natural languages.
My first Ruby programs were very Java-like. I doubt that my first Java programs were C-like; if you use classes and methods, they just can't be. No problem with using Python and Ruby together. They are maybe like German and English: close but clearly distinguishable. And JavaScript, Perl, PHP, Elixir... too long to write about.
A consequence of multilingualism is that one starts noticing the differences in the implementation of the same features in different languages. Some are smooth, others are frustratingly hard to use or to remember. A quick test on a trivial nuisance (no Googling allowed): in Python, is it array.join(",") or ",".join(array)? And "1,2,3".split(",") or ",".split("1,2,3")? I remember only that the "," goes at opposite ends in the two expressions, and I can't understand why that should be good.
The first one is neither - it's string.join(iterable). If you keep that in mind then the design becomes clear - it would have been weirder to force every iterable to have some string-related method. Better to put it on string, where it more reasonably belongs.
The second one is the typical OO pattern where you ask an object to perform some operation on itself. This is how split works in just about every OO language - if you have trouble remembering it, it might be helpful to remember split can be called without any parameters - in that case your alternative variant won't make any sense.
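Concretely, in standard Python:

",".join(["1", "2", "3"])  # '1,2,3' - join lives on str and accepts any iterable
"1,2,3".split(",")         # ['1', '2', '3'] - the string splits itself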
It makes more sense now, thanks. I missed that the argument is an iterable. Ruby's join is a method of Array and of nothing else, so with a range you have to go through to_a first.
I've noticed my Elixir experience bleed into how I write Ruby. I'm much more likely to avoid creating objects if I can solve things with a pure function in Ruby now.
And for things like pattern matching, I've played around with doing similar things in Ruby, too. For example, in Elixir you might write a method that should only ever return `:ok`, and a pattern match like this...
# if :ok, nothing special happens
# if not :ok, "MatchError" is raised
:ok = Fooer.foo(foo)
So in Ruby, I played around with a watered-down equivalent like this:
def ok!(obj)
raise "match error" unless obj == :ok
end
# if :ok, nothing special happens
# if not :ok, "match error" is raised
ok! fooer.foo
One day I wrote a bunch of Ruby code with `ok! <arg>` all over the app, and it made getting the program to run correctly a lot easier (at a time when my brain was acclimated to Elixir). I don't know if I'd ever try getting developers on a Ruby project to embrace Erlang/Elixir's "let it crash" mentality, which might not be a great idea for a number of reasons, but it was interesting to me that I had even considered it as an option. Before writing Elixir code, writing a program that crashed on purpose was an alien concept to me.
What I love about Python is that it's so similar to how I've written pseudocode over the years. So, I often do quick prototypes of ideas in it, even if the final product isn't in Python.
Like you, I'm basically a C programmer at heart. But, to get efficient at some of the Maple code I had to write for my graduate work, I learnt a number of their functional tools so I now think quite a bit about using some of those as part of my toolset. I'm still largely a C-style programmer, though.
Working with the SymPy project helped me in my Python style.
Additionally, if you really want to get productive, you'll also need to get to know which libraries/frameworks to use, and how to use them (idiomatically). This is the most time-consuming task, I think.
I know formal education often gets a lot of criticism around HN, but I think the approach the article is talking about is heavily mirrored in most university computer science curriculums.
Universities tend to focus on paradigms and patterns, and typically force a student to learn at least 3 languages throughout their education (many more if they want to). Just in my undergrad I learned C, C++, C#, Objective C, Swift, assembler, Java, Go, Javascript, Python, and Haskell. My personal experience is that university grads are much better at adapting to new languages than someone with 4 years experience in only a single language.
That's of course not to say you can't do the exact same sort of education without going to school (and probably in less time).
Yes! My university CS program focused on teaching you to think about computation, with the particular language you'd express your thoughts in as an interchangeable detail.
Scheme for beginners, Python for web scraping and data munging, C for concurrent network and systems programming, and some small exposure to Java, Haskell, Standard ML, awk, yacc/lex, C++, and your mobile environment of choice depending on which classes you took.
Many were upset by this "very theoretical" approach, as they'd prefer to have immediately employable skills in JS-framework-of-the-week. Instead they were taught how to think independently of a particular language, and to get comfortable with learning new ones.
From a programming craft perspective, it was a little disappointing that we were never focused on advanced language features or idiomatic code, but I felt I had a solid enough base to self-teach that sort of thing.
Same with my degree program. Learning the programming language used for any particular class was an "exercise left to the student."
After I finished my undergrad degree I had an "exit interview" and I mentioned that this was one of the things I liked about the curriculum, because it helped me see that the abstract computing theory is what is important, and languages are largely a matter of syntax and convenience. The professor's response was that they so often hear the opposite, that students complain that they don't learn languages that employers want.
I agree that's what they aim for, but in my experience far too many CS grads (especially from non-top universities) have absolutely no understanding of the core paradigms. Instead, they've basically memorized "this is how I make a linked list in Java." Ask them to do the same exact task in another language and they're totally stumped.
The perfect example of this approach is Concepts, Techniques, and Models of Computer Programming. It covers a wide range of paradigms [1] using Oz/Mozart, and it also shows examples using many famous languages.
Agree so much. I had a very theoretical course and am often shocked at how reluctant my peers are to investigate a problem in a language they don't already know (normally just one or max two).
Also, if they need to quickly munge some text, they'll have a dozen Java classes and several POMs.
I think this has real practical consequences for decision making (not knowing alternatives etc).
Also agree. My programming languages course had me writing code in Racket, Julia, Erlang, Prolog, and Haskell, in addition to the languages that all students had to use for certain classes (C, C++, Java, Python, JavaScript, Assembly, and C#)
> Universities tend to focus on paradigms and patterns...
> ...That's of course not to say you can't do the exact same sort of education without going to school...
I think the personal motivation that someone brings has a lot to do with how they fare short term and long term in this regard. Yes, a good university program will teach you the concepts and underlying theory expressed by language implementations; coming out of such a program you will absolutely have the jump on someone who picked up JavaScript programming informally. You will adapt to other technologies faster, too, as you have the essentials to make that shift, whereas the JS guy may not.
Having said that, if you attended and graduated college because it was something that you were "supposed to do", or because it would get you a good job and computer science was just a way to a good job after college (with no particular interest in the field otherwise)... over time, the more personally interested developer who learned JS informally will likely catch up with you and overtake you.
My formal education is in music composition. However, I was programming assembly on the old 8-bit machines when I was a kid. 20 years on as a technology professional, I devote large amounts of time to learning concepts and theory that aren't strictly necessary for me to program something that works well enough for my customers. I learn these things because I'm genuinely interested in the underpinnings of what I do and because I want to perfect my professional skills... not for an employer, but for my own edification: I care about what I produce because it's me producing it. Many of my friends did get a formal education in computer science, and I've surpassed them in both quality of output and overall understanding. (BTW... don't get me wrong, I have colleagues that have formal education in computer science AND the same sort of professional dedication I have... a good number of them far exceed my knowledge and understanding and I very much look up to them. Still, I'm glad I don't have to feel ashamed standing by their side professionally either.)
I think that's great, and I agree completely. I added that last sentence because I think at the end of the day your education is what you make of it, regardless of whether you went to school for it or not.
I think the top-10 universities do it right and the rest don't. (This is a bit of an exaggeration but you most likely know what I mean).
Most of my classmates don't usually "learn" a language. They do an assignment by copying, hammering at the computer, or just asking for help. The basis of how university is set up is antithetical to the learning and exploration of new programming languages. This is based on one simple thing that most universities do and is very easy to fix.
Don't mandate a programming language for your assignments
If this one thing were done, we'd probably instantly see a much higher failure rate and a much higher quality of turnout. If early on in the curriculum students got a tour of every one of the big-name languages and got to just choose the best tool for the job for the rest of their time at university, there'd be a much closer approximation to how things (at least for me) work.
If one of your projects is something like "scrape a webpage for X data", you wouldn't want to write that in a bash script (which I've been made to do [0]); I'd want to do that in Python with BS4. Or if your project is to write a parallel dot product function, you wouldn't want to write that in C (which I've been made to do [1]); I'd want to write that in Julia.
Even in my class that I took for exploring programming languages we were forced to use C++. We were writing an interpreter using C++ which I'd rather have done in some Lisp-like languages.
Unfortunately, I've not been able to make the design decision to play with other languages (in school) to see how they would better impact the development of these applications. I've not had to prototype stuff to see what language it will work best in (in school). These are decisions made for me.
I've found that this isn't how things work, at least for me; I'm the one who is told "do X", and I pick a way to do it. Whether it's by setting up a spreadsheet with a macro in it that generates the data, or writing some real code, I get to choose the most elegant solution, and I suffer the consequences when I didn't choose the most elegant solution, since I have to maintain my software when it breaks 2 years down the line.
There are three problems with "Don't mandate a programming language" -- I'm a lecturer and I've done it for advanced practicals in later years.
* Students expect to be able to get help when they have problems. There is a good chance no member of staff knows Julia / Moonscript / ...
* Some languages make tasks trivial -- while this is nice when you are in the real world, if I want to test students' ability to create something, I don't want some students missing most of the work. How do I then mark it?
* Similarly, if the question was "implement a malloc-like memory manager", well, you really have to do that in C, C++, Objective-C, maybe Rust, but it makes less sense in Python.
Also, getting a "quick tour" of (say) C++ isn't really useful, students who try to pick it up by just googling are likely to write terrible code. Learning a language properly takes work.
> Students expect to be able to get help when they have problems. There is a good chance no member of staff knows Julia / Moonscript / ...
For this I have two answers. Past the first year, people shouldn't be getting help with "my code won't compile". They should be able to develop the skills needed to search for that on Google and find Stack Overflow links.
The second answer is a question: Why don't members of staff know "Julia / Moonscript / ..." and if they don't why can't they logically reason about what's going on in the language without having used it? I don't know Go, but when people have asked me to look at some Go code to see where there is a bug, I can still reason about what's happening. Isn't that what this article is about? All languages are just mixes and matches of common idioms with new ways of expressing them. If the best of the best, those who are teaching the future generations of computer scientists, can't do this, that would seem strange to me.
> Some languages make tasks trivial -- while this is nice when you are in the real world, if I want to test students' ability to create something, I don't want some students missing most of the work. How do I then mark it?
If the student understands how to use the abstraction, then they have likely learned something far more valuable. If you're assigning labs that consist of basic idioms that can be whisked away by common library functions, then you might consider changing your curriculum to focus more on solving problems rather than codifying solutions.
> Similarly, if the question was "implement a malloc-like memory manager", well, you really have to do that in C, C++, Objective-C, maybe Rust, but it makes less sense in Python.
I see no reason why you'd have to write that in C/C++/Objective-C/Rust. If you're going after the idea of writing a working memory manager, and not "write me a kernel that has a memory manager", then Python would work great for it. Here is an example:
class Allocation:
    def __init__(self, start, size):
        self.start = start
        self.size = size

    @property
    def end(self):
        # First address past the end of this allocation.
        return self.start + self.size

class FirstFit:
    def __init__(self, total_memory):
        self.allocations = []  # kept sorted by start address
        self.total_memory = total_memory

    def alloc(self, size):
        # First fit: check the gap before each allocation in turn,
        # then the gap between the last allocation and the end of memory.
        prev_end = 0
        for i, current in enumerate(self.allocations):
            # Check to see if we can fit between the previous allocation and this one
            if current.start - prev_end >= size:
                allocation = Allocation(prev_end, size)
                self.allocations.insert(i, allocation)  # Store this allocation in our allocation table
                return allocation.start
            prev_end = current.end
        # Check to see if we can fit between the last allocation and the end of memory
        if self.total_memory - prev_end >= size:
            allocation = Allocation(prev_end, size)
            self.allocations.append(allocation)
            return allocation.start
        return None  # We can't, so return NULL

    def free(self, start):
        for i, allocation in enumerate(self.allocations):
            if allocation.start == start:  # Found our pointer
                del self.allocations[i]  # Slice it out
                return True  # We made it!
        return False  # We couldn't find it! PANIC!
This is crappy code, but it can be done very elegantly, and I think this gets the theory across better than doing it in C. In this you can also experiment with far more complex data structures easily. (What if I think of memory as a tree and divide the value of my node by 2 every time its size is too big?)
> Also, getting a "quick tour" of (say) C++ isn't really useful, students who try to pick it up by just googling are likely to write terrible code. Learning a language properly takes work.
I'd say that's just because of the poor design of modern C++. You can do a quick tour of Python and easily get the basics, of C and easily get the basics, of Java and get the basics, of Common Lisp and get the basics. You don't need to master a language to see where it is applicable.
> Why don't members of staff know "Julia / Moonscript / ..." and if they don't why can't they logically reason about what's going on in the language without having used it?
If the bug is a shallow/algorithmic bug, that's reasonable. Recently I was helping someone debug some JavaScript code. Eventually, we found the problem was that, in JavaScript, [11] < [2] is true. I'm not particularly knowledgeable about JavaScript, so I didn't know it did < comparison of arrays by string comparison. That's the kind of thing that really needs language-specific knowledge.
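For contrast, Python compares lists element-wise, so the "same" expression gives the opposite answer:

>>> [11] < [2]  # compares 11 < 2
False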
> If the student understands how to use the abstraction, then they have likely learned something far more valuable. If you're assigning labs that consist of basic idioms that can be whisked away by common library functions, then you might consider changing your curriculum to focus more on solving problems rather than codifying solutions.
This I just have to disagree with. I think it's valuable for students to learn how to implement quick sort. It's useful to learn how to implement big-integer arithmetic. It's useful to learn how you can "fake" Java-style inheritance with structs and function pointers, so you really understand what's going on under the hood.
Of course long term, you wouldn't typically implement these things yourself, but understanding the fundamentals is important.
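As a sketch of that inheritance trick: the exercise is normally done in C with structs and function pointers; here, purely as illustration, Python dicts stand in for the structs and plain functions for the function pointers:

def dog_speak(animal):
    return animal["name"] + " says woof"

def cat_speak(animal):
    return animal["name"] + " says meow"

def make_animal(name, speak):
    # The dict plays the role of the struct; speak is the "function pointer".
    return {"name": name, "speak": speak}

rex = make_animal("Rex", dog_speak)
print(rex["speak"](rex))  # dispatch through the stored function, like a vtable entry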
Also, if I set a practical which involves (for example) connecting to an HTTP server, intending them to do the raw connection themselves, and they use a 3-line Python program using the standard library, have they really learnt anything at all?
There certainly is a place for giving students more freedom, particularly in later years. It's clear the modern world is moving into "slap together 50 javascript/python packages with string" type programs (and that's because it's a great way to be productive quickly), which universities don't currently teach that well. But don't throw the baby out with the bathwater!
> This I just have to disagree with. I think it's valuable for students to learn how to implement quick sort.
I'd rather the students know how to implement large software architectures, keep line counts down, abstract problems correctly, and learn how things are done in the real world.
> It's useful to learn how to implement big-integer arithmetic
Not in my opinion. Maybe, maybe, show them how emulating FP math works, but writing big-integer arithmetic functions is pretty useless for most people and is far too straightforward to require them to develop their skills at designing software architectures.
> It's useful to learn how you can "fake" Java-style inheritance with structs and function pointers
You can't force them to learn patterns; you can only give them work that is better suited to using the patterns provided. You can even hint to your students: "Hey, you can make a get_car and a get_bike and make a Drivable struct that has a Drivable->steer(), and steer can be a function pointer!" Forcing them to use a pattern isn't useful.
> so you really understand what's going on under the hood
Using function pointers isn't really an accurate picture of how Java stores class/object information; IIRC that technique only corresponds to virtual functions. When I've decompiled static bytecode you see stuff like LString(some function).
> Also, if I set a practical which involves (for example) connecting to an HTTP server, intending them to do the raw connection themselves, and they use a 3-line Python program using the standard library, have they really learnt anything at all?
You're assigning the wrong problem. Don't say "make an HTTP request", say "implement an HTTP header parser". The problem is now language- and abstraction-agnostic, and it involves a much more complex problem whose complexity lies in the realm of software organization. The HTTP parser code can be used to read and write requests, and the next lab can be to use your new library in a larger project. I think that is far more useful.
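A minimal sketch of what such an assignment might produce (my own illustration, not from the thread):

def parse_headers(raw):
    """Parse the header block of an HTTP message into a request line and a dict."""
    lines = raw.split("\r\n")
    request_line = lines[0]
    headers = {}
    for line in lines[1:]:
        if not line:  # a blank line ends the header block
            break
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return request_line, headers

parse_headers("GET / HTTP/1.1\r\nHost: example.com\r\nAccept: */*\r\n\r\n")
# ('GET / HTTP/1.1', {'host': 'example.com', 'accept': '*/*'})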
> There certainly is a place for giving students more freedom, particularly in later years. It's clear the modern world is moving into "slap together 50 javascript/python packages with string" type programs (and that's because it's a great way to be productive quickly), which universities don't currently teach that well. But don't throw the baby out with the bathwater!
There is a measurable reason for this, and it's not productivity. It's about maintainability, consistency, and bug eradication. I'd love for you to read the paper "Do Code Clones Matter?" [0] to see what they found.
I'd much rather we make it so people know how to:
* List all the features that a library will need
* Take those features and write them into a clean API
* Do it in the cleanest language-specific way (e.g. Pythonic code)
* Handle distribution methodologies
* Maintain and support these libraries
* Use these, and other, libraries in larger applications
* Do documentation & technical writing
I'd really love it if you could email me and follow up after you read that paper and tell me what you think. My school username is jk369 and my school's email server is @njit.edu. (I've split this up to avoid spam)
I will look at your paper, but I think it depends on what your target is for students.
Many of our students go on to do PhDs. For that an understanding of deep algorithmics is much more important than being able to use and distribute a library, or building larger applications. They need, in their field of study, to be able to reimplement and understand many important difficult algorithms, not (for example) put together some node.js libraries.
However, there is a place for that kind of degree. Someone once said to me something which stuck with me: "You wouldn't try to merge a maths, and accounting degree, just because they both contain numbers", yet that's still what we do in computer science.
I just want to make sure, and state for everyone, that I'm far too lazy to do work that good. That's done by people much further along in the development of their profession than I am. Not my work, just something that I think is good.
> Many of our students go on to do PhDs. For that an understanding of deep algorithmics is much more important than being able to use and distribute a library, or building larger applications
It depends on what their PhD or major is. Many CS majors here go into math and physics, and they would have done better learning how to write pythonic (or matlabic?) code and how to correctly design APIs. For instance, tonight I'm going to be rewriting a library from Python 2 to Python 3, and I have no way to tell if there's been any regression, because of a lack of testing frameworks, consistency in APIs, and modularity. It'd be basically impossible for me to mock even single parts of this.
> They need, in their field of study, to be able to reimplement and understand many important difficult algorithms, not (for example) put together some node.js libraries.
I don't think left-pad is a good characterization of my position here. I think that the hardest part of large-scale software development is the architecture portion, and that's mainly because very few people ever actually start large, complex projects on their own, and no universities offer courses that practice that.
> However, there is a place for that kind of degree. Someone once said to me something which stuck with me: "You wouldn't try to merge a maths, and accounting degree, just because they both contain numbers", yet that's still what we do in computer science.
The moment you, or one of your coworkers, creates a major like that (Software Engineering and Development or something), I'll be transferring over from my Computer Science degree. It's useless for anything but the name and the prestige that gets my foot in the door for a lot of very fun and interesting opportunities. I've had to cobble together everything I've needed to learn from a software perspective on reverse engineering, bare-metal systems development, assembly, networking, game development, operating system development, web development, and many abstractions/patterns that go along with all of those. It was very difficult, and I'd rather have given 30k/year over 4 years to a school that could teach me the "real" way from "professionals" and end up with a piece of extremely expensive framed paper that says my name and "degree" on it.
I'd also like for a school to hold my hand while exploring different paradigms.
Don't mandate a programming language for your assignments
I'd do them all in highly optimised x86 Asm, just because I can. ;-)
Not all students start at the same level; making the choice of language a free-for-all is just going to stretch that disparity even more. The ones at the top will naturally find ways to entertain themselves more, and the ones at the bottom will have no better idea of how to do things than copy-pasting code they found somewhere else (if the language doesn't matter, that becomes even easier...)
We were writing an interpreter using C++ which I'd rather have done in some Lisp-like languages.
On the other hand, I believe that doing something in an "unconventional" language for it is one way of improving your skills since you have to then apply true creativity and knowledge instead of just following an existing solution. Using a language which makes the task easy or even trivial doesn't really benefit the learning process.
> I'd do them all in highly optimised x86 Asm, just because I can. ;-)
I'd love to see someone do this, because I'd say it'll be harder to do the assignments I'd give out in x86 assembly than in, say, i686 or even ARM.
Most people would likely do everything in the language they want to learn or will need in industry. If you're just messing around then write it all in some obscure architecture like AP-101 or PDP-8 or even 4004. Maybe I just love the classics.
> Not all students start at the same level; making the choice of language a free-for-all is just going to stretch that disparity even more
That's why I said they should first be introduced to languages. That should be the first thing, day one, and continue until they get a basic idea of all the common features in programming languages.
> On the other hand, I believe that doing something in an "unconventional" language for it is one way of improving your skills since you have to then apply true creativity and knowledge instead of just following an existing solution
It may be fun but I wouldn't want to do it in production or on my grade.
That was me when I was in college. At the time my school was in the mid-teens for CS rankings, although it is worse now. While I was there I worked heavily with C, but also worked with C++, Java, Lisp, MIPS RISC assembly, ML, Prolog, Ada, Fortran, Perl, Eiffel and a few others. For all of the languages that were supposed to be showing me a different paradigm, in retrospect I didn't learn what I was supposed to.
Years later when I had both enough experience to appreciate these things as well as interest in programming languages I did start to grok all of this but it certainly wasn't due to my education.
To be fair, and I don't have your context on this, but looking at your link of the assignment for [0], it appears to me that this is more of an exercise in getting people comfortable with more advanced bash features; specifically, it looks like it relies fairly heavily on regex? I agree that BS4 is a better tool for this job, certainly, but there is perhaps a point to what is going on here too.
As far as the second one goes, I bet that has more to do with the professor's particular familiarity with C than with an understanding of what the best tool for the job is. I'll cede that point :)
>My personal experience is that university grads are much better at adapting to new languages than someone with 4 years experience in only a single language.
These benefits seem dubious to me in the long term unless you plan to work on compilers and languages which is certainly a noble goal and is very much a hot area right now. Picking up brainfuck in a short period of time is not really noteworthy IMO. I am also struggling to come up with a way to present this to other people without coming off as an annoying know-it-all. Do we want to value the ability to sprint or the ability to finish a marathon?
Plus, your memory will deteriorate over time without constant practice. Am I really going to commit time to reviewing all languages I choose to learn every year or so? The article seems like a challenge-to-take more than career advice.
If someone doesn't immediately want to work with languages, I would rather teach them what might be analogous to the lay of the land in our industry:
- What tools do you use to make a desktop application on Windows/Linux/Mac?
- How do the different browsers (Edge/Chrome/Firefox) implement HTML/CSS/JS? Can you make a consistent behaving application for all of them? And why should you run far, far away from any company asking you to support IE8 in 2017?
- How would you make a cross-platform library for Windows/Linux/Mac?
- How would you make a mobile application?
- What are the most popular IDE options available?
- What are the different database options available?
The difference in opinion is exactly teaching more engineering vs. teaching more science, but also learning about the ecosystem that drives language choice and development. This knowledge contributes as much to "knowing the right tool to solve the problem" ability as diverse language knowledge.
> Do we want to value the ability to sprint or the ability to finish a marathon?
We want to value the ability to finish the marathon, which is exactly why it's important to be able to adapt faster. The industry is constantly moving, and someone who can learn new languages and technologies easier/faster is at a huge advantage.
For a concrete example, take Objective-C and Swift. Apple has made it pretty clear that's where things are going, and a developer who has C#, Haskell, Objective-C, Python, Rust, and Ruby experience is going to make that transition much better than a developer with just Objective-C experience. This same thing even applies to frameworks within the same language (think React/Redux and functional programming experience).
> Am I really going to commit time to reviewing all languages I choose to learn every year or so?
Definitely not! You focus on the concepts in the language and don't worry about memorizing anything. Quite a few years ago I learned Go for fun. I basically ignored it after that, but when I needed it for a project recently it came back very quickly.
Those other questions you pointed out are, of course, very important. I think you learn the answer to those as you learn languages as well.
I just wish those universities actually produced programmers who understand the low level mechanics of how computers work. I.e. Why your program runs 100x slower if you're not careful about cache misses, pretty basic stuff like that. Moore's law is over, it's time to start caring about performance again.
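A toy illustration of the access-pattern effect (my own sketch, in Python with NumPy; the gap is far larger in C, where you control memory layout directly):

import timeit
import numpy as np

n = 10 ** 7
data = np.random.rand(n)
seq = np.arange(n)              # sequential indices
rnd = np.random.permutation(n)  # the same indices, shuffled

# Same elements, same arithmetic - only the memory access pattern differs.
print(timeit.timeit(lambda: data[seq].sum(), number=5))  # mostly cache hits
print(timeit.timeit(lambda: data[rnd].sum(), number=5))  # mostly cache misses, typically several times slower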
What makes you think universities produce people who don't care about performance and aren't educated about low level mechanics? That's an honest question, I'm not challenging your experience. I don't know what your experience is, of course.
In my experience, people with CS degrees do understand the low level mechanics statistically more often, on average, than people who don't have a formal education, or people who go to 2 year colleges or trade schools. There are definitely people who make it through a degree without thinking much about cache misses, but in my life I've seen far more programmers without an education fail to understand caching than programmers with an education.
I've never met anyone, educated or not, that doesn't care about performance at all. I do meet people that prioritize what they care about, and choose to weigh performance of the programmer over performance of the computer, choose to prefer working with valuable abstractions over milking every cycle out of the machine.
It depends on what you're doing. Usually, thinking about CPU cache misses in a web development environment is premature optimization and a massive waste of effort. Not always, but usually there are other more important performance considerations.
Personal experience. I've been trying to hire moderately competent, junior software engineers who would be able to write and optimize C++ without much supervision. They're now more rare than the Sasquatch. Today, "caring about performance" means picking a language that's only 5x slower than C, rather than 100x slower. Hardly anyone even knows why alignment might be desirable, or how long it takes to fetch memory after a cache miss, or how concurrency primitives actually work. For my generation, "caring about performance" meant noticing that the C compiler generated suboptimal assembly, and handcoding it because you knew better and needed that extra oomph.
> It depends on what you're doing
It sure does. You don't have to worry about the lower level stuff every day. We deal with extremely high performance code, and even _we_ don't think about it every day. But it definitely pays to know what's going on. Otherwise it's not software engineering, it's cargo cult science and rain dancing, prone to fall apart at the first sign of difficulty. It's like having a certified car mechanic who doesn't know how to open the hood. You can have a few of those to change tires and broken tail lights on the cheap, but you also need the dudes/dudettes who know what to do when "check engine" light comes on.
Optimizing C++ code is enough of a niche requirement nowadays that it is totally reasonable for you have to train junior devs in this area rather than hiring them ready to go.
If you don't want to train them, pay the price for an experienced C++ developer instead.
> junior software engineers who would be able to write and optimize C++ without much supervision
"Junior" and "optimize c++" just do not typically go together. Even more so if you're talking about deep optimizations like cache hit misses, and not just algorithm decisions or breaking down problems properly.
Ok, "optimize" might be an overstatement in this case. More like "write code that's not totally oblivious to how compiler will compile it, and how CPU will execute it". A reasonable request for a software professional with a degree, one would hope.
As an anecdotal counterpoint, I'd love to see more people hiring for positions like that. Most C++ jobs are between "we know you're clueless, but hopefully you'll learn" (rare) and "you know this stuff, how are you not in a senior/architect position?". Hacking away on moderately complex performance-conscious code without necessarily spending too much time on architect-level concerns sounds lovely.
> Personal experience. I've been trying to hire moderately competent, junior software engineers who would be able to write and optimize C++ without much supervision. They're now more rare than the Sasquatch.
Are you certain they're rare? As opposed to not responding to your hiring ads, for whatever reason?
I do think the percentage of programmers who care about CPU cycles has gone down; I would totally agree with that. Decades ago it used to be that you had to care, it wasn't an option. Today, for most work, it's simply no longer a requirement to worry about cycles. BUT, I've hired lots of kids in recent years who knew plenty about compilers and assembly and caching, so I have a general feeling that they're out there. I would speculate that a lot of the kids who can deal with C++ and performance issues are interested in games...
> ...it definitely pays to know what's going on. Otherwise it's not software engineering, it's cargo cult science and rain dancing, prone to fall apart at the first sign of difficulty. It's like having a certified car mechanic who doesn't know how to open the hood.
It does pay to know what's going on, I totally agree. In some sense literally, as the deeper you go, the more "expert" and "high paid" you will expect the technician to be.
Your analogy to car mechanics is apt in the sense that what you're talking about has already happened in both the auto industry and the tech industry due to increasing complexity of the systems over time -- the first line or two of support people don't fix the internals for you and often don't know how (yet they do "fix" most problems). Even the higher paid expert technicians at car dealers can't fix a huge portion of the problems anymore because the problems are software. Same goes for very smart and very capable QA engineers - they know how to test systems, they know how to find which system is broken, they know how to reproduce issues, but the system may still be a black box. And the engineers writing your core app code still can't generally fix issues in their libraries or operating systems. And so on. So, I personally wouldn't go so far as to say it's cargo cult science or lacking software engineering, we just have more layers than we used to. We can and do engineer at one layer on top of other layers that are black boxes, almost nobody can claim otherwise. But, as you pointed out, you do need people staffing those multiple layers.
Could be response rate, sure. Could be our proximity to Google, which manages to hire C++ devs just fine (fully a half of their codebase is C++).
We've managed to hire some folks after all, with some remedial training they're doing fine. I just wish it wasn't so hard to find them, and they wouldn't require months of close supervision after you hire them.
Nah. Google doesn't "train" per se. They expect that you already know your shit pretty well when you get hired, or else they just don't hire you. From there on out you're on your own. No one will "train" you specifically, although opportunities are sometimes available. In particular, no one will specifically train you to write tight code. Your CLs will just get rejected until you learn that on your own. Source: spent 8 years at Google.
Yes, when I graduated in the 90's, we had Pascal, C++, C, Prolog, Caml Light, Java, PL/SQL, Assembly (x86 and MIPS), Tarski's World language, Lisp, Algol, PL/I.
Those like myself doing compiler related classes got to additionally explore Forth, Oberon, Oberon-2, Active Oberon, Modula-2, Modula-3, Eiffel, Ada, Concurrent C, Objective-C, *Lisp, Sather, C+@ [1] among a few others I cannot remember now.
As Wirth puts it, it is all about algorithms and data structures, in an abstract way.
Mine was a little less about patterns in some cases (2-year school, though), and you got to take plenty of different programming classes, from Web to C to C++, scripting languages, C#, Java, and others. After it all, I try my best to be language-agnostic, with sane reasoning; I will raise my concerns if I know the faults of a language. It would be silly to just code in any language asked of you and never raise concerns or offer alternatives if they fit your workflow better.
I agree: in my French school we had to learn assembly language, C, Prolog and Lisp.
Clearly, Prolog and Lisp were taught to us for the different paradigms they introduce, not because they expected us to use Prolog and Lisp in our jobs.
I used to arrogantly think I could be productive in any language given a week or so to adapt. I often used the phrase "a good dev is a good dev in any language." That belief was rather abruptly broken when I joined a project using C++/CX; it was so far outside of my previous experience that I had a really bad time of it.
I happily picked up the actual language very quickly. It was the surrounding ecosystem of compilers, build tools, debugging tools, libraries, standard patterns and best practises that was too deep for me to become proficient in.
That said, learning every language you can is definitely beneficial. Just don't expect to hit the same level of productivity in all of them.
C++ is a special beast though. It's famously complex.
What gets my goat is the idea that some recruiters have that I - as someone with mostly C# experience - would be completely useless in a Java environment.
Java and C# are so similar it's barely worth noticing. There'd be different IDEs and libraries in play - and sure, a lot of googling to remind myself of stuff in the first week. But that pales in comparison to the amount of time I'll likely spend learning the business domain, or how the legacy code base works. That's the stuff that's truly important.
I agree. It's not always the language which makes it tough to integrate yourself into a project, sometimes it's all the frameworks, libraries, build tools, and general makeup of the project itself.
I worked for a company a few years back doing ASP.NET (my experience is mostly Linux based tech, but you can pick up a language, right?). The language was fine, the design patterns used all quite standard, but the project had an array of frameworks and libraries which made maintaining it painstaking.
I still find the "frameworks and libraries" argument to be false, but enough people say it that I believe it must be at least partially true. I'll settle for a belief that the argument is overstated.
IMO the largest issue is the breadth of one's experience, or lack thereof. And by that I don't mean simply the number of languages but how well versed one is in different languages and their surrounding ecosystems. For instance having good knowledge of both C# and Java isn't what I'm describing here.
I think it's about how much newness you have to handle at the same time. I can happily take a project using a framework or two I've never touched before. Or something in a new language but a similar domain. But when you have the perfect storm of new language, tools and framework (and maybe OS), that's when it hurts. I'm not saying I totally couldn't switch, but it made me change my attitude a bit, from '2 weeks and I'll be back at full productivity' to something more like '2 to the power of the number of new things'. The more newness that has to be handled at the same time, the longer it takes to pick up all the interconnections.
This is also true, I would say it's a combo of the two. The more new things at once the more likely it is that you'll be working with something without a good correlation to something in your previous experience. But at the same time having a broader and richer base of experience will make it more likely that you'll have an easier go of it for any of those individual things.
Let's just say we've both worked for a particular online retailer who thought having the mobile site act like an app but not be an app would be a good idea.
> I happily picked up the actual language very quickly. It was the surrounding ecosystem of compilers, build tools, debugging tools, libraries, standard patterns and best practises that was too deep for me to become proficient in.
I felt the same way trying to learn Haskell. The language itself isn't that hard to learn, but just try and go read some code for one of the large open source hs code bases (like Yesod, etc.) :)
I also remember being put on a C++ project after being away from the language for years (C++ 11 was... different from the C++ I used back in the early 2000s). I was completely lost for a while :O
On the other hand, some languages are easy to pick up quickly (like Go), and some just 'click' for some people (as Clojure did for me).
The C++ template system alone makes it very difficult. Plus, many people have a habit of writing a mess of interconnected classes that become very difficult to pull apart to understand what's happening.
During my Master's (and my Ph.D.) I was using a finite element approach developed by a former Ph.D. of my supervisor(s). So, I asked for the code. I got back a mess of C++ and it was dependent on a linear algebra C++ library that wasn't around in the same form. I spent some time trying to reason out what he was trying to do, gave up and wrote my own in Fortran in about as much time as I spent trying to understand his.
> I happily picked up the actual language very quickly. It was the surrounding ecosystem of compilers, build tools, debugging tools, libraries, standard patterns and best practises that was too deep for me to become proficient in.
Yes, those take a while. It's even harder if the new language is outside of paradigms you are familiar with. Eg Prolog or Haskell would be that for most people.
I started coding C++/CX without noticeable degradation in productivity.
But when I needed to code for e.g. Linux, because of those tools and practices you've mentioned, my productivity was much worse compared to using Visual Studio on Windows, even when writing the same standard C++.
I think nowadays people spend too much time learning new tools and too little time doing something really valuable with those tools. I wish I had only one language, so I could concentrate on more interesting things rather than learning yet another random set of operators and library function names.
While using fewer, better languages for more is fantastic, I doubt you really want a single language.
Try replacing shell, Structured Query Language, and C with the same language. I won't say that it can't be done, but I think you'll lose a lot if you succeed.
I think Smalltalk machines did it, and so maybe did Lisp machines. Intersystems is doing something similar, partially, with M (their version of Mumps).
I'd hope that with embedded DSLs we can get closer. People are already doing lots of it in Haskell, including domains you listed.
Of course there is always the risk that one invents an "inner language" with poorer semantics and tools.
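To make the embedded-DSL idea concrete, here's a toy sketch in Python (all names hypothetical) that builds SQL-ish queries as objects instead of strings:

class Column:
    def __init__(self, name):
        self.name = name

    def __eq__(self, value):
        # Instead of comparing, == builds a WHERE clause fragment.
        return "{} = {!r}".format(self.name, value)

class Table:
    def __init__(self, name, *columns):
        self.name = name
        for c in columns:
            setattr(self, c, Column(c))

    def select(self, condition):
        return "SELECT * FROM {} WHERE {}".format(self.name, condition)

users = Table("users", "age", "city")
print(users.select(users.city == "Berlin"))
# SELECT * FROM users WHERE city = 'Berlin'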
Haskell is a pretty good host for DSLs. But if you want to go lower level than Haskell, you have to essentially write compilers for your embedded DSL rather than the usual interpreters.
And of course, Haskell's type system is not endlessly flexible (yet..). Eg Haskell still struggles expressing relational programming or linear types / uniqueness.
Yes the low-level DSLs tend to become their own compilers. But the good thing is that they as a side-effect also have an API, so they can hopefully be reused for new DSLs.
Interoperability of different DSLs does not necessarily follow, though, unfortunately...
Yes. And of course, you still need to write a decent compiler to produce decent code.
The situation is similar to Lisp macros: yes, you can implement Prolog in Common Lisp in a few lines, but no, it won't be a fully featured and fast production system, unless you actually put in the work. (Paul Graham's 'On Lisp' makes these excellent points in the chapter on the Prolog interpreter.)
Of course, you might want to go all the way to dependent typing. I think one of Isabelle or Idris actually compiles to 'low-level' languages like Haskell by default?
The main benefit I would like to see in Haskell is totality / termination of programs by default, and hiding Turing completeness behind something like unsafePerformCompute. Similar, we could split IO into IOReadWrite and IOReadOnly.
The former would be the same as the old IO, the latter's actions could depend on the environment but wouldn't be allowed to influence it (or weaker: would at least require idempotence?)---thus allowing more scope for optimization and human understanding when reading code.
I think you gain more than you lose. (For myself I have replaced shell, SQL, and C when it doesn't have realtime requirements (which is most of the time) with Scala FWIW). Language boundaries are really high-overhead, and most of the time being able to use the same tools and libraries everywhere more than makes up for a language not being quite as good a fit.
I'm with you. A standard set of languages/tools would be great, so you don't have to pay a cognitive overhead cost every time before you dive in. It's fatiguing. It adds up. I much more enjoy the act of programming than learning a new tool or language....and then getting on with the real work.
Pipe dream, it would seem, but: I feel like the energy spent picking up new things could be used so much better going deeper and building more with established tools--tools hardened by collective reuse.
I am sick of having to rewrite difficult data structures. I thought if Haxe could compile libraries I might never have to. But maybe just doing everything in Rust might be a better approach?
It would be nice if you could "auto-export" from Rust. The pieces of the puzzle are something to generate a C API from a module's public API -- without `extern` declarations -- and then something to turn that API into Ruby, Python, Swift, Node, Java, Go, &c.
Clang modules are the beginning of something for the latter task. SWIG also has a lot of ideas that are usable -- and SWIG can handle the API description having classes and methods, since it handles C++. Though I would hesitate to make C++ headers the lingua-franca of code.
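For the consuming side, today you'd wire such a C API up by hand, e.g. in Python with ctypes (the library and function names below are hypothetical):

import ctypes

# The C ABI shim that would be generated from the Rust module's public API.
lib = ctypes.CDLL("./libmylib.so")
lib.mylib_add.argtypes = [ctypes.c_int, ctypes.c_int]
lib.mylib_add.restype = ctypes.c_int

print(lib.mylib_add(2, 3))  # 5 - this is the boilerplate an "auto-export" tool would emit for you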
After you have some experience with, say, 3 to 5 different languages, you can learn new ones very quickly (hours, days) with little effort because you know almost all of the core concepts (OOP, FP, mutability, generics, ...). There's no point in learning "all of the languages" for its own sake. Just learn a bunch of different ones to learn different ideas and programming styles.
What takes more time is the standard library, common libraries, and tools. A good example of that is iOS development in Swift. Learning the language is almost negligible compared to learning all of the technologies and tools. But coming from Android, I think I'm close to about 50% of full productivity after two weeks.
I never understood why there's so much focus on the "x years experience in y" in the industry. A solid developer should be able to become fluent in any language and technology within a month or two.
"I never understood why there's so much focus on the "x years experience in y" in the industry. A solid developer should be able to become fluent in any language and technology within a month or two."
I don't think any developer can become fluent in a new language in 1-2 months. I have seen so much terrible code over the years that compiles and appears to run just fine, but with massive performance issues, poor design, unreadability, security issues, memory leaks, etc.
I see learning the syntax of a language to be something like 0-5% of the effort of mastering a language, knowing how to properly use a language really does take years. Although I agree that # of years of experience is not a perfect metric.
I'm glad that the trend is slowly going towards hiring generally smart people instead of a "PHP/Rails/Python programmer". I'd say it's better to hire people that can easily adapt to something new if needed, rather than people saying "I am a PHP programmer, I don't want to do this new project in Ruby." (I met my fair share of people in previous companies thinking that way, to the extent that they would quit if asked to write in something other than what they were hired for.)
In my current company they were looking for a Go programmer. I had never written a single line of Go before, yet they hired me, and now I am writing Go. Then we had a deficit on the iOS team, so I learned Swift and now work on our app.
I have a sort of an issue with learning a lot of new languages. As much as I like to try them out, I like (and have) to learn things like big data processing or devops stuff and distributed systems and so on. And to be honest, when I get into the theory of those fields, the language does not mean so much. At the end I end up using what the targeted industry is searching for, and they do not need languages, they need solutions. And the language is rarely a solution.
In that set of things, a new language is somewhere lower on the priority list for learning new things. Sorry.
A few recommendations on this, since where I work a talk on this very topic is being given today.
1. A great book that covers multiple paradigms of programming is Roy and Haridi's "Concepts, Techniques and Models of Computer Programming" https://mitpress.mit.edu/books/concepts-techniques-and-model... . This stands hand in hand with the better known SICP.
2. Folks in the CS and programming world seem to ignore bleeding edge work being done in the arts space. To get a broader view of languages than "characters that go into a plain text file", expose yourself to the live-ness of the following -
2.ø Smalltalk - one of the first fully available language-and-runtime environments, and one that is still usable today.
2.a Max/MSP/Jitter - by David Zicarelli and Miller Puckette - a visual data-flow programming language with decades of dominance in the computer music scene.
2.b SuperCollider - for architecture lessons, as well as being another multi-paradigm language.
2.c Impromptu - a Scheme-based live coding environment for music and visuals by Andrew Sorensen. Normal REPLs will bow before most "live coding" languages used for music.
2.d Ixilang by Thor Magnusson - another live coding language, where the language is in a sense inseparable from its runtime environment. The current running behaviour of a textual program can also depend on how the program evolved.
In short, break out of normal modes of thinking and attain Turing nature, at which point you can proclaim that all languages have Turing nature and yet retain your discriminating view.
A few other data-flow programming languages: LabVIEW (which has been around for a very long time and is extremely popular for data acquisition - it's basically the go-to choice most of the time) and the Houdini computer graphics software (there are other examples in CG as well). I pick Houdini because there's a version people can download and try. It actually works pretty well for most things.
I feel we are starting to lose the plot. The problem isn't about learning another language or which to use next. Instead it is about how to solve and represent complex problems and systems in code.
Any language is a means to an end, not the end itself.
Focusing on languages and language constructs is of value to those in academia and those working solely in the domains of computing and computer science. For the rest, it is the equivalent of navel gazing, the equivalent of focusing on grammar, when the task is authoring a novel.
I agree whole-heartedly. However, as we take on projects of larger scope and complexity, reflecting on the craft in a measured and skeptical way can be powerful and productive. It's a tricky line to walk.
I'd be surprised if it wasn't. This is immediately obvious to anyone who's tried to solve practical problems in languages, rather than just learned them for resume embellishment purposes. That's also why sane companies limit the languages that can be used for development: if you let your "theoreticians" run wild, you will soon end up with code written in Brainfuck and Malbolge using libraries no one understands. I'm exaggerating, obviously, but only slightly.
I think Norvig and the author are wrong about "parallelism" requirement. I mean it's good to know what's out there, but you can't really understand it if you try to learn it from a multithreading point of view. Fundamentals for it are part of distributed systems and this is where people should get into and learn things about ordering, consensus, asynchronous and synchronous systems, etc.
Yeah, they are mixing all kinds of things together, Go with Erlang and even throwing a GPU in there. These are all very different worlds and none of them teach you fundamentals. I remember how useless the parallel programming class was, I only started to understand the underlying ideas behind semaphores, mutexes and memory barriers long after I got into distributed systems.
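For what it's worth, the mutex idea itself fits in a few lines; a toy Python sketch (my own example, not from any class):

import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:        # remove the lock and updates get lost,
            counter += 1  # because read-modify-write is not atomic

threads = [threading.Thread(target=increment, args=(100000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 with the lock; unpredictable without it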
It's far from enough to know a programming language. If someone knows how to write binary search in C++, he can rewrite it in Python, JS or C# pretty easily.
But software development is not about writing pure functions, it's about writing applications, so frameworks and good practices are more important than the language.
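E.g. the binary search itself is a few lines in Python and would translate almost mechanically:

def binary_search(items, target):
    # items must be sorted; returns an index, or None if absent
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None

binary_search([1, 3, 5, 7, 9], 7)  # 3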
What you should really do is learn the concepts of structured programming. Every language is just a notation on top of a fundamental programming construct. If you learn those instead, you'll know every language.
I recommend reading Exercises in Programming Style by Cristina Lopes as a good starting point.
Note: I didn't watch the video yet and my opinion might change by then.
C teaches you something very useful, which is how the computer works under the hood. Most things you do in C map loosely to what the computer does, and learning C helps you understand the computer.
I'm not saying that it's a good language to start with, but it's okay. Unlike C++.
IMO, C++ is the worst possible language to start with. The reason is that C++ tries to support everything, often using handy but cryptic syntax. C++ doesn't guide you to program one way or another; it simply adapts to every possible way of solving the problem. This might be good in the hands of someone who already knows how to solve the problem, but it's completely overwhelming for a beginner. It feels like you're taking the wrong turn at every step.
A much better alternative would be JavaScript. Yes, it does have flaws, but it's much simpler and has personality.
IMHO "wrong" being mostly the opinion of the evangelists who think C++ is a completely different language, and have been trying to push it in that direction, but it's still mostly C at the core with a thick layer of abstraction/indirection on top. Don't believe all the hype. "Climbing the ladder of abstraction" is the best way to learn how everything works.
It seemed like the point of the presentation was really that C and C++ should be thought of as totally separate languages.
As long as that's remembered, I think the opposite is probably true: knowing C helps a lot to learn C++ (and vice versa) as opposed to learning either from scratch.
This is true, but there's a 'surface similarity' problem. It's easy to go from writing C to writing old-fashioned 'bad' C++. Modern C++ really is a different beast.
The thing is that if you're actually teaching C++ (which I have done), there's still the C parts that you actually have to teach as well that are common between both. So, if you know C, you don't need to learn those over again. I actually took a course at one job on C++ for C programmers back in the 90s and it was very well done for the time.
True, modern C++ is quite different. But, there's nothing from preventing a C programmer from starting as a "bad" C++ programmer and learning new techniques as they go.
An excellent article, well worth reading. As it points out, implementing the idea of learning a programming language for the sake of learning requires being highly selective in deciding which languages to learn. One language I would advise against learning is C++: first, it may take too much time to learn to be worth the effort; second, the language itself is not all that interesting, and its standard library (especially the STL), while it does bring some novel ideas to the table, is so heavily shaped by the particulars of the language itself that the D language might serve the same purpose better. Also, please do yourself a service and learn C# instead of Java. (This is not to cast any doubt on the extreme usefulness of learning both C++ and/or Java for practical purposes.)
> Also, please do yourself a service and learn C# instead of Java.
Nah. Learn whatever takes your fancy. Or whatever you think will get you a job. Or learn whatever you can get tutored in because it's almost always easier to learn when you have someone you can ask questions to in person.
Learn structure and logic, but don't tell anyone they're less of a programmer because they use X instead of Y.