Quick scan reveals some GNU extensions being used, lots of tripe about pkg-config, autotools, how to write a makefile properly without autoconf (commendable), gdb tutorial (looks reasonable), short valgrind mention, unit testing, doxygen, proper assertions (woot!), short bit on VCS with git (looks reasonable), integration with Python (looks reasonable), decent discussion on aliasing and copying, some serious standard-breaking things (bad), how not to shoot yourself with macros (good), linking discussion (looks ok), an odd quote from Pussy Riot, lots of discussion of gotchas, Unicode (good!), some stuff to do with glib (yuck), OOP in C (looks ok), more glib stuff (yuck), POSIX threading (some mines in there), GSL (good), SQLite (good), libxml (yuck - personal preference though), libcurl (good).
So generally a mix of good and bad. Probably worth it though.
Thanks for posting this. I often wondered `what are the Y-combinator people going to say about this...?' while writing some opinionated section of the book, and the reaction seems pretty positive so far. Nobody has yet compared it to K&R...
There _is_ a lot of GNU in there, but that's because there's a lot of GNU in the C ecosystem. I made an effort to neither overuse nor underuse GNU tools, and this is where I wound up. Also, I made a serious effort to label anything that's GNU-specific as such, e.g., the long segment on the GNU/BSD-specific asprintf() (which I was _so_ disappointed to not see adopted in C11).
Everything tests out on clang unless otherwise stated. In fact, there are little things that work better on clang, because it uses C99 out of the box. E.g., as of this writing, you'll need clang to make the example for (C11) _Generic work. If anything breaks standards without my telling you it does, you can file a bug report at O'Reilly's site for the book (or email me), and I'll fix it in later electronic versions.
As for autotools, I had a sort of epiphany somewhere along the course of writing the book: it's not so bad. At the core, if you want to do something relatively simple, like set up a basic shared library, you can do so in just a few lines of Automake. It's filled with hacks and historical cruft, but as a user I...I've grown to actually like it. Also, as above, it's a huge part of the ecosystem and therefore worthy of discussion for that reason alone. If you don't like it, skip the Autotools chapter and stick with make. I won't know.
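To give a sense of "just a few lines": a shared library really can be this short in Automake (library and file names here are hypothetical):

```
# configure.ac (sketch)
AC_INIT([example], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
LT_INIT
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am (sketch)
lib_LTLIBRARIES = libexample.la
libexample_la_SOURCES = example.c
```

Libtool then handles the platform-specific shared-object naming and linking flags for you.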
You also acknowledge, in the text, that autotools gets some hate, and you justify its use and your inclusion of it in the book.
My sense of the book so far-- though I've only had it for a day or two-- is that it's pretty balanced in terms of avoiding evangelical fervor for any one tool or approach.
Thanks for the reply - it's nice to get an insight into your perspective. My commentary above doesn't change my overall opinion, which is basically: looks good, definitely didn't waste my cash. Inevitably there is stuff people disagree with - there isn't a worthy book out there that doesn't fit that description.
People love to gripe about autotools, but if you take the time to understand it and get comfortable with what it is and is not, it will get the job done.
There are alternatives like cmake, but they just suck differently. I use autotools to manage cross-platform (Linux/glibc, Linux/uclibc, mingw32, OS X) C library projects and their Python extension module wrappers, with heavy dependencies on system-specific functionality and symbol replacement (e.g., if strlcpy isn't in libc, then link in libbsd if you can find it there, otherwise compile our own copy of strlcpy.c). Autoconf is a wonderful tool for this once one RTFMs and gets over the fact that it's a bunch of helpers for writing a shell script, rather than a fully abstracted programming language of its own.
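The strlcpy dance I mentioned looks roughly like this in configure.ac (a sketch; the bundled-fallback wiring via AC_LIBOBJ is one of several ways to do it):

```
# Does libc have strlcpy? Defines HAVE_STRLCPY if so.
AC_CHECK_FUNCS([strlcpy])

# If not, see whether libbsd provides it.
AS_IF([test "x$ac_cv_func_strlcpy" = xno],
      [AC_CHECK_LIB([bsd], [strlcpy])])

# If neither hit, fall back to compiling our bundled copy,
# e.g. with AC_LIBOBJ([strlcpy]) or an Automake conditional.
```

It reads like exactly what it is: shell-script helpers, not a grand abstraction - and once you accept that, it's fine.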
Or in the words of Lennart Poettering, "every custom config/makefile/build system is worse for everybody than autotools is":
I wonder why the author decided integrating with python was a better idea than, say, lua. Seems to me that lua is winning that fight, but that's just my current impression - I don't really have any solid numbers to back that up.
How mature is string manipulation in Lua? Personally, I use python for anything involving string manipulation, and then use C or C++ (depending on context) to do significant data processing on the (now hashed) data.
Lua's core string manipulation has been kept (as I understand it) deliberately limited in order to keep the implementation simple. What is present - e.g., simplified pattern-matching - is cleverly designed and often useful but, by design, there's nothing to compare to Python's rich built-in set of libraries.
Well in this case, it's more integrating Python with C i.e. writing a C callable and calling it from Python.
I'm not really a Lua fan personally (the confusing index and ordinal notation scares me), but I can see why it makes sense for integration purposes. Clean, simple, to the point.
Having just completed a large project with Lua, I have to say that the index/ordinal notation thing is something that we quickly got used to. In fact, some nice succinct idioms work because elements run from 1 to arraySize (inclusive).
clang and gcc are not very different from a user's point of view—clang deliberately supports a large number of gcc extensions and command-line options. For that matter, so do other C compilers (e.g. the EDG front-end, used by the intel C compiler).
So really these days "gcc" is as much a common dialect of C as it is a particular compiler.
Just curious, what's wrong with glib? I happen to like it a lot, and it's a good option for getting access to data structures that other languages take for granted.
The complaint about typedefs and Win32 is kind of odd. These are style choices that libraries and large projects should be entitled to make.
I used to work at MS and grew very comfortable (and appreciative) of the stylistic choices in the NT kernel (distinct from the Win32 style, one key difference is much less Hungarian notation). But I can jump back into more typical-of-*nix styles with ease. Or the more Win32 style if preferred. I think it's important to see the merits and drawbacks of various style decisions in a non-judgmental fashion, and be able to switch back and forth depending on the code base you're working in. It's a mistake to confuse style and convention with "not being real C". Different doesn't mean not real.
gobject is built on top of glib, not the other way around, and it's mostly there to make it easier to create language bindings, so there's no real reason to use it (gobject) unless you're building a library like GTK. As for the typedefs, they're there for consistency more than anything. A couple of them are legitimately useful for portability reasons, but the rest (like gpointer) are there simply to keep things looking similar in the API.
Say what you will, but I see the book's apparent inclusion of glib as a plus.
I'll second that, I guess I'm biased as I wrote some early gnome stuff back in the day before I started writing graphics device drivers for Precision Insight and VA Linux systems. I rather like glib, even though the gpointer thing isn't something I like too much aesthetically. Everything obviously has its warts and trade offs, nothing is perfect after all. I think glib brings a lot to the table though and with it and a few other choice libraries I can code very complex projects just as fast or faster in C as I do in Java.
Would you care to list your other "choice libraries" ? I'm not familiar with actually programming in C (I can read C somewhat, but have never used it for a real (even small) project).
Could you elaborate on the 'horrible design'? In general, I've found GLib very useful (say, the data structures, mainloops, GError, the G_GNUC_ stuff, ...) . If the biggest problem is some typedef that you can happily ignore, it can't be that bad, or?
GObject is a bit different matter - there's quite a bit of boilerplate, which makes me think twice before adding one. But at the same time, it's quite nice to use them in libraries like GStreamer.
GLib is certainly not perfect, but I found it more pleasant to use than, say, talloc. Are there any specific advantages to APR? (I've never used it.) Does it cover the same functionality?
I haven't checked recently, but the problem with APR was for so many years it had (possibly still has) no documentation other than Doxygen. Also, the memory pool design seemed an enormous complication. Have you used it recently and have either of those factors improved much?
It arose out of a different time. Linux/Solaris/BSD are trivial, compared to all the differences of the commercial unices of ages past, with their idiosyncratic libraries, BSD vs. SysV vs. POSIX, never mind all the different compilers. People would've sacrificed their firstborn to get something running on both HP-UX and SunOS; compared to that, autotools was a blessing.
I think there are more papers on build systems than on sort algorithms.
I've been through that as well. I have in the past just provided a different Makefile for each platform and run as follows:
make -f Makefile.osf1
make -f Makefile.sunos4
make -f Makefile.sunos5
There is usually little difference between them - mainly compiler names/flags and the occasional def for code. Occasionally make is broken at which point you have to frig something.
Keep It Simple is the #1 rule. Rule #2 is that every bad thing you do will come back to bite you twice in the future.
We decided to use gyp instead of autoconf and with some tweaking shipped a massive library that builds on Linux, Solaris, FreeBSD, OS X, AIX, HP-UX, and Windows with it, in both 32- and 64-bit. Definitely not for the faint of heart, but it is possible.
Check out http://swtch.com/plan9port/, it's a huge project that runs on (to quote the site) Linux (x86, x86-64, PowerPC, and ARM), FreeBSD (x86, x86-64), Mac OS X (x86 and Power PC), NetBSD (x86 and PowerPC), OpenBSD (x86 and PowerPC), SunOS (Sparc).
It uses a shell script to do a little bit of config, then bootstraps mk (a Bell Labs successor to make) and builds everything from some very human-readable mkfiles. When you want to build, you just untar the source and run ./INSTALL
Agree with meaty: make and knowing what you're doing is good.
Occasionally, though, my needs will be more complicated than what make can handle on its own (lots of compile-time options, for example, or lots of dependency checking), and in those cases I'll use waf. Waf is lovely and hasn't let me down, and it's easy to use for people who are used to the ./configure && make && sudo make install sequence.
Sorry for the shameless plug, but to discover books like this people can use my project, http://anynewbooks.com. If you subscribe to the category Programming, either via feed or via email, you'll definitely learn about new gems coming out. For example, 21st Century C was picked as the Staff Pick for the Programming category last week.
Looks potentially interesting, though I think I'd still recommend Zed's Learn C the Hard Way instead. http://c.learncodethehardway.org/book/ Zed's book is supposed to be a "beginner" book but it sure does cover a lot...
And O'Reilly gives you free updates forever. And if you subscribe to their newsletter, you'll occasionally see 50% off promo codes. This HN posting was particularly ill-timed, they had a promo code for this particular book that expired on Nov 13th.
Amazon does provide updates as well; not sure if O'Reilly offers this for their Amazon "prints" (I'm sure they would, but I usually opt for direct so unable to confirm), but I have received mails telling me my Kindle books have been updated.
Which is why I now buy O'Reilly books direct from them. However, while digging around in my older ebooks, I discovered the O'Reilly books I bought through Apple or B&N were also drm-free. Not sure if that's now also the case with Kindle versions... but as mentioned there are other perks of buying direct from O'Reilly.
I remember hearing Amazon now allows publishers to opt-out of DRM. .mobi doesn't always cleanly convert to ePub though, so having multi-format from the source is still helpful.
I just ordered it 2 days ago directly from O'Reilly website (they currently have "buy 2 get 1 free" promotion, just use code OPC10, http://shop.oreilly.com/product/0636920025108.do). Really excited to read it when it comes. In case you're curious the other 2 books I ordered were "Python for Data Analysis" and "JavaScript Web Applications" which also look very promising.
I read that list as prefixed with "learn to...". And getting a basic grasp on autotools is certainly useful, as it enjoys a pretty large market share. Everything else would be rather opinionated. Just like Java's maven, you have to at least wade through it before picking something else, if you're able to do that at all in a given project.
Different tools for different tasks. Autotools is horrible, but there are times when it is less horrible than any other options. The same is probably true of CMake.
Personally, I stick to POSIX make for most of my code -- and handle cross-platform issues by assuming POSIX compliance and telling people to fix their damn OSes if they're not POSIX compliant.
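To show what I mean, a POSIX-only Makefile sticks to macros, suffix rules, and explicit targets - no GNU-isms like ifeq or wildcard functions (project and file names here are hypothetical):

```
# Minimal POSIX-compatible Makefile: c99 is the POSIX compiler name,
# and the built-in .c.o inference rule compiles each object file.
CC      = c99
CFLAGS  = -O2
OBJECTS = main.o util.o

program: $(OBJECTS)
	$(CC) $(CFLAGS) -o program $(OBJECTS)

clean:
	rm -f program $(OBJECTS)
```

That subset runs under GNU make, BSD make, and the various vendor makes alike, which is the whole point.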
Frankly, depending on the nature of the product, that is the best option.
Projects with small teams who have programmers as audiences can reasonably target POSIX and let their users work out their personal issues themselves. If they are using something particularly weird, that is their own damn fault and they should take responsibility for that. If not, chances are they won't have a problem.
Running HP-UX with a GNU Hurd kernel on your toaster? Fix it yourself. Sure, I'm sure autotools could cover that situation, but seriously, just fix it yourself.
It's fine to have a limit to what you'll support, but please lose the arrogance and condescension. A user's choice of OS is made based on a number of concerns, some of which they have no control over. Disparaging their choice or saying it's their own damn fault or "fix it yourself" is incredibly rude, especially when you can tactfully say "We currently support the following systems:" or "We currently only support systems that can run autotools, which includes the following popular operating systems:" and maybe even add "Code contributions to expand support are of course welcome!"
That's a damn sight nicer than "but seriously, just fix it yourself."
Sorry, the 'arrogance/condescension' comes from the remembered horror of having to deal with autotools too many times. Repeatedly among the most unpleasant experiences with technology I've ever had.
I don't think it is fair to accuse me of "blaming the user", as though that were horrific and plainly wrong, just because I want nothing to do with that nonsense again.
"Sorry, the 'arrogance/condescension' comes from the remembered horror of having to deal with autotools too many times."
That is a valid feeling, but it does not excuse arrogant or condescending behavior. It's not the user's fault that you've had these experiences; you just ended up working with a painful tool. Moving on and refusing to work with it again is perfectly fine. Blaming the users for your misery is not.
"I don't think it is fair to accuse me of "blaming the user", as though that were horrific and plainly wrong, just because I want nothing to do with that nonsense again."
I accused you of blaming the user because of comments such as "If they are using something particularly weird, that is their own damn fault"
Separate the problem from the people, because the people you're attacking are not responsible for your pain.
I apologize if my tone has particularly bothered you, but I don't think that I have done anybody any harm. I'm not going to apologize for it any more than that.
Anyway, I think we have misunderstood each other.
> "I accused you of blaming the user because of comments such as "If they are using something particularly weird, that is their own damn fault""
I agree with you there. I am blaming the user.
Where we differ is that I have, with admittedly strong words, objected to the implication that blaming the user is inherently wrong or bad to do.
There is a certain amount of work I am willing to do for the sake of the user with use-cases dissimilar to my own. If I'm on Linux, and inotify seems like the best tool for the job, I'll either abstain from using it, or use it but make an attempt to cover other common possibilities. I think it is my responsibility as a developer to think of others to this degree; if I am simultaneously not following standards and not making concessions, then I am not being fair.
Using autotools is way over that helpfulness threshold though. If that much effort is involved then I am sorry, but I expect the user to do some legwork themselves.
I'm not blaming the users -- I'm blaming the developers of non-POSIX-compliant OSes. And I want users to do the same thing, since that's the only way things are likely to ever get fixed.
Any opinions on the book? I have been away from C for about 15 years, and would like to get back into it. This sounds like it may be a good place to (re)start.
The next book to read after your intro to C. Actually explains how to write C programs after you know the grammar.
http://www.amazon.co.jp/dp/4774146129
I know what you are saying, most of these are in Japanese. But the code is not, and it's rather easy to look at the code and see what they are doing. Anyhow, live dangerously, I say. Not many books in English explain this stuff, it seems. However, if anyone has nice C-related info in any language, I would be appreciative.
I've been reading the early release. So far it's the best resource on C I've come across, although I am a C beginner. It's quite practical, focusing on basic tools at the start and then getting into syntax, code structure, and libraries.
I know C well, but check out links like these on news sites occasionally. (Just like this story)
This book reminded me of an online C tutorial that proceeded to teach C in a functional style. I don't remember if they explicitly stated it was FP when presenting the material.
The style was really clean. The author avoided assignment as much as possible. All variables were initialized as if they were 'let' statements. Variables were never re-used.
I just can't find the link anywhere.
Does anybody know that web page that I'm referring to?
Not sure if I'm thinking of the same thing, but I believe Cornell's former introduction to computer science used C and a functional style. I'm also unable to find a description online, now.
I was handed a few lecture notes by my former professor who interned there in the 90s or so.
edit: To clarify, I looked, but couldn't find the notes right now either. Not sure if I still have them.
"The C Language is Purely Functional" - a satirical, but also, thoughtful post that I don't think has been submitted to HN before (or the search system is rubbish):
So, I studied C in college and really enjoyed it. It really fit my style of thinking at the time (having a mental model of what the machine is doing is really valuable). I wrote a game in C/OpenGL, but really didn't learn much about "modern C" or much of the ecosystem other than doxygen/make/gdb.
These days I primarily program web services and iPhone apps in Ruby, occasionally using Java for performance issues. Is there any good reason for me to further my C knowledge?
Modern C++ Design (Andrei Alexandrescu)[1] is wonderful, but C++11 has "simplified" much of what he covers. If you want to be able to read the C++11 standard[2] with little effort, Alexandrescu's book is nevertheless a good primer.
First or second books? Stroustrup's Programming: Principles and Practice is quite good as a starter book. Not quite C++11, if I remember correctly, but further along than just C with Classes.
There's also Alexandrescu's "Modern C++" book, but even that is now a decade old. We'll have to see what turns up in the wake of the recent standard; so far it's mostly introductions to new features and some updated references, useful for someone getting reacquainted with modern features, but no K&R of C++11 yet.
The new edition of the C++ Primer is helping me quite a bit. It explicitly notes which things are from C++11 and which aren't, and they talk quite a bit about the new stuff.
I had tried to learn from Stroustrup before C++11 and didn't get as far.
I'm reading C++ Primer 5/e at the moment and I'm enjoying it so far. I like how it starts using the high-level containers before introducing other more low-level structs. The second part is a nice overview of the C++ library, so you won't end up reinventing the wheel when writing your first programs.
Also the Kindle edition is really well done.
The biggest flaw of this book is that it's huge.
So I'm a web developer, as I imagine a large portion of the Hacker News community is. I learned to program on C in college, but am interested in learning a more proper (in the engineering sense) way of programming C. Should I look at this book? Should I just read Zed's Learn C the Hard Way?
It's a hefty tome but that's not necessarily bad if you want to (re-)learn the language and the tricks in a proper way. Easily in my personal Top 5 books for C (The others are in no particular order: "Pointers on C", "Expert C Programming: Deep C Secrets", "The Standard C Library", "C Interfaces and Implementations".)
edit just bought it.