I started learning programming with K&R in January. Working through the exercises was an amazing foundation in the thought processes that programming requires.
The language is simple enough that you're left with nothing to get in the way of the problem in front of you, and you're forced to think of what is happening on a very low level.
That low-level understanding helped when I tried languages with more abstractions.
I've seen this mentioned several times. Is there a resource you recommend on the 'new ways'?
Asking out of curiosity: I'm about to start a side project in C, and I do have the text mentioned.
Hmm, here are some things that I do routinely that don't really show up:
- I always use enum instead of #define for numeric constants.
- I nearly always use inline instead of #define for small functions whose efficiency worries me.
- As a result of those two things, I don't use #define much.
- Everything global is static unless it's in the public API provided by the file.
- I use dynamic allocation a lot more than the C in the K&R book.
- I avoid integer function arguments and return values as much as possible, because they're basically untyped, and I benefit from the limited static type checking done by C compilers. I still use them for cases where I'm actually counting or measuring something, and unfortunately for bitfields. When I'm counting bytes, the correct type is usually size_t or ptrdiff_t, which benefits humans and LP64 platforms, but doesn't get you much static type checking benefit.
- As a result of those two things, I use arrays relatively sparingly.
- I basically never use function-static variables, because they make thread safety very difficult if not impossible.
- I use const wherever applicable. It catches some errors, and it's not nearly as draconian as in C++, so it doesn't cause as many problems.
- After some experience with C++, I usually "typedef struct foo foo", since it costs me two words in the declaration and saves me the "struct" every time I use "foo" later.
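To make a few of those concrete, here's a rough sketch of what a file in that style might look like (buffer, buf_cap, and the function names are made up for illustration):

    #include <stddef.h>   /* size_t */

    enum { buf_cap = 512 };            /* constant via enum, not #define */

    typedef struct buffer buffer;      /* saves writing "struct" at every use */
    struct buffer {
        char  *data;
        size_t len;                    /* counting bytes, so size_t */
    };

    /* static: private to this file, not part of its public API */
    static size_t buffer_len(const buffer *b)   /* const: we only read *b */
    {
        return b->len;
    }

    /* inline instead of a #define for a tiny function */
    static inline size_t clamp_to_cap(size_t n)
    {
        return n > buf_cap ? (size_t)buf_cap : n;
    }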
I don't know if these are the kinds of things people mean when they talk about new ways of coding in C, or where to find them documented.
I like that idea of using a (presumably unnamed) enum as a bag of constants that aren't necessarily related. From the small amount of Go I tried, its syntax for constants seemed a more evolved version of that.
What do you think is the minimum realistic preprocessor usage that you can get away with? #include + #pragma once?
You don't need #pragma once if you don't include .h files in other .h files, or if you only have a single .h file that includes other .h files. This may sound unrealistic but there are real projects that work that way. You probably do need #include.
I know - I was just spitballing the minimum number, since guards require conditionals too. Though, portability aside, you have to agree that it's simply nicer, since it plainly states the intent without requiring a little dance each time.
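Side by side, the dance looks like this (foo.h and foo_do_thing are placeholder names):

    /* foo.h with the traditional include guard */
    #ifndef FOO_H
    #define FOO_H

    int foo_do_thing(void);

    #endif /* FOO_H */

    /* the same header with the non-standard but widely supported pragma */
    #pragma once

    int foo_do_thing(void);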
The advantage is smallest for things like that, which you'd put in a nameless enum. To my eyes, there's still a better signal-to-noise ratio in
    max_line = 512,
than in
    #define MAXLINE 512
but I agree that the difference is small. In some cases, your code is more readable if you can define more than one constant per line, and the difference becomes larger:
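Something along these lines (constants made up for the example):

    enum { cols = 80, rows = 24, tab_stop = 8 };

versus

    #define COLS     80
    #define ROWS     24
    #define TAB_STOP 8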
Beyond that, non-anonymous enums have the additional advantages that you can define them closer to where they're used, and the debugger knows how to print them.
It's a pretty big advantage when it applies, but it doesn't apply to the MAXLINE case, and it only applies when you're using a debugger. Which, for me, is very rarely.
I use #define to do things that plain C just can't do, and I wind up wanting to do these things pretty often. Two of the most important strengths of macro functions, for me, are __VA_ARGS__ and the related ability to write a macro that counts the number of arguments it has been passed. Oh, and using container_of() to write generic data structures.
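Roughly, the argument-counting trick and container_of look like this (a bare-bones sketch; the Linux kernel's container_of is more careful about type checking):

    #include <stddef.h>   /* offsetof */

    /* count the arguments given to a variadic macro (up to 5 here;
       a known C99 wart is that zero arguments still reports 1) */
    #define COUNT_ARGS(...)  COUNT_ARGS_(__VA_ARGS__, 5, 4, 3, 2, 1)
    #define COUNT_ARGS_(a1, a2, a3, a4, a5, n, ...)  n

    /* recover the enclosing struct from a pointer to one of its members */
    #define container_of(ptr, type, member) \
        ((type *)((char *)(ptr) - offsetof(type, member)))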
Given that Kernighan and Ritchie were using C to write an OS, it's not surprising that they'd be wary of dynamic allocation.
For me, modern C means modern C features more than modern C style. I use the following "new" features all the time: compound literals, designated initializers, variadic macros, and anonymous arrays.
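For a quick taste of the first two (struct point and move_to are invented for the example):

    struct point { int x, y; };

    static void move_to(struct point p) { (void)p; }

    int main(void)
    {
        /* designated initializer: name the fields you're setting */
        struct point origin = { .y = 0, .x = 0 };

        /* compound literal: an unnamed struct built in place */
        move_to((struct point){ .x = 3, .y = 4 });

        (void)origin;
        return 0;
    }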
Dr. Dobb's did a nice series on C99 features, so I'd recommend that as a supplement to K&R.