> C... is in absolutely no sense of the term a high level language.
Of course it is. You're not worrying about how many registers your CPU has, or when you swap a register out to memory. All that has been abstracted away for you. The fact that you can access memory locations directly gives you some low-level access, but in a very real sense C is a high-level language. There are higher-level languages with more abstraction, of course. From the article: "It's not as high level as Java or C#, and certainly no where near as high level as Erlang, Python, or Javascript."
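To make that concrete, here's a minimal sketch of both halves of that claim (the variable names are just illustrative, not from the article): ordinary arithmetic with no register bookkeeping, plus the pointer-level escape hatch that gives C its low-level flavour.

    #include <stdio.h>

    int main(void)
    {
        int values[4] = {1, 2, 3, 4};
        int total = 0;

        /* No register allocation or spilling to worry about: the compiler
           decides where total and i actually live. */
        for (int i = 0; i < 4; i++)
            total += values[i];

        /* But the low-level escape hatch is still there: take an object's
           address and inspect its bytes directly. */
        unsigned char *raw = (unsigned char *)&total;
        printf("total = %d, first byte = 0x%02x\n", total, (unsigned)raw[0]);

        return 0;
    }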
The problem is that the "level" of a language is relative to other languages. If C is a high level language, that means you can group it into the same pool as Java, C#, Haskell, Python, and Ruby. Don't you see a bit of a difference here? Unless there is a significant number of actual languages, not concepts, that are below C, it logically has to be called a "low level language" because there really isn't much below it but there are TONS of languages above it.
At the very least, it's something like a medium-level language. Grouping it with much higher-abstracted languages is just wrong.
Here's the thing: "high-level", when applied to a programming language, has a historical context. It means something specific (see: http://en.wikipedia.org/wiki/High-level_programming_language), and what it means and has always meant is that the language in question abstracts away registers and allows structured programming (as in, through the use of if, while, switch, for, as opposed to using labels and branching).
It's fine if you want to call "high-level" relative. But it should be acknowledged that "high-level" is not simply a strict comparison between the features of one language and another. And it SHOULD be acknowledged that C is, by convention, a high-level language. Python is definitely a higher-level language, but C is still a high-level language.
Here's the thing: The English language is polysemous. Computer science jargon even more so.
Meaning that any particular term can often have many gradations of meaning. So "X means Y" does not necessarily imply "X does not mean Z". Especially when Y and Z are similar concepts.
I suppose we could lament the inherent ambiguity of the jargon, but the truth is that for the most part problems only result when ambiguous language is used in combination with an argumentative person armed with equal measures of pedantic zeal and failure to grasp the fundamental characteristics of natural language. Without that element, for the most part any reasonably knowledgeable person should be able to figure out which of the particular meanings is at play from context. For example, even though the term "high level language" is used in multiple ways, the statement "C is not a high level language" is not all that ambiguous, even when considered in isolation. As long as you're willing to grant that the person making the statement is not an idiot, then it's trivial to determine that they weren't using the "everything but machine and assembly language" definition.
Yes, I did read the section. And I'm not blind to the fact that languages will age, and there are certainly many, many languages more advanced and higher-level than C is now. I think that's good. It's a bit weird to me to see people who really never programmed in C, and for whom the concept of a pointer is a bit foreign, but time marches on.
I suppose my quibble is semantic at its heart. To say C is not a high-level language may be an opinion, both valid and not without reason. But it may also show a lack of perspective -- some missing history. It's a lower-level language compared to many, without question. But it's still a high-level language, and I have not only my reasons for saying so, but evidence to back it up.
Under that classification, what is a low level language then? Even Forth abstracts away registers and allows structured programming (of a sort). Is the set of low level languages that people have heard of in 2013 an empty set?
"Heard of", maybe not. But "working with" I can believe. A friend of mine was writing an assembler with a built-in arithmetic expression syntax last year.
Clearly it was low-level, because it wasn't trying to abstract away hardware - but it also wasn't just a straight-up (macro) assembly language, since it provided some specific features that needed a more sophisticated compiler to work properly.
When we speak of low-level coding, almost any abstraction can constitute a huge jump in power from the machine language.
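As a tiny sketch of what that jump looks like (the pseudo-instructions in the comment are illustrative only, not any real instruction set): even a single arithmetic expression in C folds away a handful of explicit machine-level steps.

    #include <stdio.h>

    int main(void)
    {
        int a = 2, b = 3, i = 1;
        int c[] = {10, 20, 30};

        /* One line of C... */
        int x = (a + b) * c[i];

        /* ...stands in for the kind of step-by-step bookkeeping a
           machine-level program spells out explicitly (pseudo-instructions,
           not any real ISA):
               load  r1, a
               load  r2, b
               add   r1, r1, r2
               load  r3, i
               load  r4, c[r3]
               mul   r1, r1, r4
               store x, r1 */
        printf("x = %d\n", x);
        return 0;
    }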
Yes, and refusing to update (human) language in response to changes in the computing landscape is exactly the reason C fanatics don't see the weaknesses of C.
You can do some very interesting things with function pointers, structs, unions and judicious use of void pointers. I would absolutely group it with Java. Not saying they are very very similar, but you can do similar things.
Now grouping Java with Haskell, that doesn't seem like a reasonable grouping.
Of course, it's all swings and roundabouts really. You can draw lines anywhere.
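To put something concrete behind the function-pointer point a couple of paragraphs up, here's a minimal sketch (all the names are mine, purely illustrative) of the Java-ish dynamic dispatch you can cobble together from structs of function pointers and void pointers:

    #include <stdio.h>

    /* A hypothetical "shape" interface built from a struct of function
       pointers -- the kind of object-style polymorphism being alluded to. */
    typedef struct {
        double (*area)(const void *self);
        const void *self;
    } Shape;

    typedef struct { double w, h; } Rect;
    typedef struct { double r; } Circle;

    static double rect_area(const void *self)
    {
        const Rect *r = self;
        return r->w * r->h;
    }

    static double circle_area(const void *self)
    {
        const Circle *c = self;
        return 3.141592653589793 * c->r * c->r;
    }

    int main(void)
    {
        Rect   r = { 3.0, 4.0 };
        Circle c = { 1.0 };

        /* "Objects" dispatched through function pointers and void*. */
        Shape shapes[] = {
            { rect_area,   &r },
            { circle_area, &c },
        };

        for (int i = 0; i < 2; i++)
            printf("area = %f\n", shapes[i].area(shapes[i].self));

        return 0;
    }

It's manual and unchecked compared to Java's virtual dispatch, of course, but the shape of the idea is the same.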
When you use void*s, you do so carefully. You can make mistakes in any language. You can get objects in Python that don't do what you expected because there's no type safety, leading to crashy behaviour (granted, no memory corruption). I think you might have misread my comment as "you can't make mistakes in C".
It may have been a high level language back when the choice was between ASM and C, but the definition of "high level" has moved on. C is no longer a high level language.
Just like diamonds are no longer at the top of the Mohs scale. C is no longer anywhere near the top of "high" in "high level" languages.
C is and will (probably) always be a high level language by definition. That it's categorized as a lower level high level language than say Python or Ruby does not magically make it a low level language. There are low level languages which map mnemonics to instruction sets and high level languages that provide an abstraction over having to use the former and are more akin to the written word. That hasn't changed just because we have "higher level high level languages" these days.
Language itself is a highly variable thing. People can understand something completely different from what the originator intended, but it doesn't change what said originator meant by it. People can always say "oh C is not a high level language because I think that a low level language should be X and Y" the same way they can say "oh men have to have a beard and know how to fight to be real men", but in reality men - beards or not - are still fundamentally men.
A categorization that contains all but one language (Assembler) is not a useful categorization.
Besides, why die on this hill? So everybody suddenly concedes that, fine, C is a "high level language". Is anybody's opinion going to change? No.
So, how about we stick to useful definitions, and agree that in modern times C is a low-level language, and, likewise, understand that agreeing to that isn't going to change one letter of the C specification or remove one line of C's libraries or anything else?
There's no gain to be had by anyone in this silly line of argument.
But that's exactly the point I was trying to make, albeit I wasn't very good at getting it across the internetz. Assembler is not truly "one language"; it's actually a collection of mnemonics that are all extremely similar but based on specific architectures and instruction sets.
The point I want to get across is that for you C is a low level language. To the guy programming a fancy toaster in whatever version of an Assembly language, C is a high level language. To be anally retentive and follow your comment about the ranking of languages: if I follow that logic, then the only low level languages would be C, Assembly, and perhaps a compiler compiler from the early 70s whose name I can't remember. The difference between Assembler and C is extremely jarring in comparison to the difference between C and Javascript (for example).
One thing to note is that C is used a lot for low level systems programming, which is why it's so commonly described as a low level programming language. That said, writing low level systems programs does not mean that we're exclusively using a low level language to do so.
That's being deliberately obtuse. How many languages are lower level than C, besides assembly? If your ranking has every language in the "high level" camp, then your ranking is poorly calibrated.
Assembly is not one language (even if they all look very alike). It's a collection of opcodes or mnemonics specific to a specific architecture with a specific instruction set that map to the latter. Even then, writing assembly for MASM, TASM, WASM, or WhateverASM for Linux, Windows, or Winux (or whatever) can and will probably differ in structure and in the specific mnemonics used.
There's machine code, low level languages which are basically the plethora of different Assembly languages, and high level languages which provide abstractions to either or both of the former languages and resemble the actual written language. Putting C in the low level languages rank is doing a disservice to the language and not taking into account the bunch of people that program (for example) embedded devices in whatever architecture they're using.
Hah! You deserve a billion upvotes because of the chuckles I just had remembering trying to explain the difference between one pass and two pass to a junior dev (which was completely irrelevant to our jobs at the time; I suspect he found something about it on Stack Overflow or something).
I think what constitutes a "high level language" changes with every few iterations of Moore's law, as it becomes permissible to add layers and abstractions and virtual machines on top of the old "high level".
The C basic data types have minimum defined ranges, with the actual ranges being available as compile-time constants, and that's really all you need in almost all cases.
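As a quick sketch of what that looks like in practice: the guaranteed minimum ranges come from the standard, and the implementation's actual values are exposed as compile-time constants in <limits.h>.

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* The standard guarantees minimum ranges (e.g. int must cover at
           least -32767..32767); the actual ranges on this implementation
           are available as compile-time constants. */
        printf("CHAR_BIT = %d\n", CHAR_BIT);
        printf("INT_MIN  = %d\n", INT_MIN);
        printf("INT_MAX  = %d\n", INT_MAX);
        printf("LONG_MAX = %ld\n", LONG_MAX);
        return 0;
    }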
C is not a high level language, because it does not contain some abstractions that actual high level languages contain.
Here's a non-obvious example. With C, the programmer must always be aware of evaluation order. In fact, this must be specified even in cases where it is not important. For example:
    x = a + b;
    y = c + d;
Since the values x and y don't depend on each other, they could be calculated in either order. However, with C, the programmer must explicitly decide on the temporal order in which x and y are calculated (the order they appear in the function), even though it doesn't matter. And yes, the C compiler may decide to re-order those calculations under the covers as it sees fit (for performance reasons, perhaps).
With Haskell, when you are outside the IO monad and such, you don't specify the order of calculations like the above. Those two calculations may appear in that order in the source code, but the Haskell programmer knows that doesn't mean anything. The system may calculate x first, y first, or perhaps neither if those values aren't actually used elsewhere. Temporal ordering is not necessary (or desirable) under most circumstances.
You can never forget about temporal order in C, but there are higher level languages that allow this.