These are primarily for algebra mistakes. Teaching a proof-based course, the most frustrating error I saw was people doing a fine proof but in reverse (this is related to the article's point about irreversible steps). If I wanted them to prove that the square of any number was nonnegative, they'd start by assuming that x^2 >= 0 and then derive equations from that until they got something obviously true like 0 >= 0. This was the proof written "upside down": it would have been valid if every step were reversible. They should have just taken the last step and started from there, working until they derived their first step.
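For concreteness (my wording, not a student's), the forward version of that proof is short:

```latex
% The intended direction: start from an arbitrary real x and end at the
% claim, rather than assuming the claim up front.
\begin{proof}
Let $x$ be any real number.
If $x \ge 0$, then $x^2 = x \cdot x \ge 0$, since a product of
nonnegative numbers is nonnegative.
If $x < 0$, then $-x > 0$, and $x^2 = (-x)(-x) \ge 0$ by the same rule.
In either case, $x^2 \ge 0$.
\end{proof}
```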
So how was I supposed to grade that? It showed a significant misunderstanding but also often literally had every step necessary for a correct proof. Usually I just wrote overly long notes about how they should write proofs in the future that I'm sure the students never read.
The other fun one was unnecessary proof by contradiction. They'd start by assuming not P. Then they'd prove P directly, not using that assumption. But hey, that contradicts the assumption of not P! Hence not P is false, so they conclude P. Nothing was wrong there; they just had a couple of extraneous sentences that overcomplicated things.
If each reversible step was clearly marked as such, I don't see a problem with starting at the goal and then working towards a trivially true statement. Usually the problem is that the steps are written down without explaining how they are supposed to relate to each other, so you are left to guess based on the order they are written down.
Across two math departments that I have taken courses with, this was handled by having an official document that explained how to write proofs ("Read this if you want to pass the homework requirement."), then mercilessly marking down any missing details (e.g. whether two statements are equivalent).
This was probably quite a shock for many who hadn't needed this kind of rigor before, but they did learn pretty quickly to state their proof attempts according to the guidelines. (There were still mistakes, of course, but at least they were less ambiguous.)
A proof is more than just a collection of true statements: it is a means of communication. And in a classroom it is also a demonstration of knowledge. Even if every step is technically correct, the presentation may not be sufficient for full marks. Depending on the class, we may or may not have high expectations for a proof. Also, I don't believe a student ever once used this setup while remarking that the steps were reversible, but I would give that full points with a little note that it could be better organized.
I've never encountered a formal "do these steps to make a proof" document. I'd have to see it, but it could be a good idea.
Yes, proofs by contradiction appeal to new mathematics students because one can just wander around until a contradiction pops up and then you're done! Of course, these meandering proofs are harder to understand, and it's easier to make mistakes when there are lots of unnecessary derivations along the way.
I've always found proofs in algebra and topology to be less intuitive and consequently often had to resort to trying to find a proof by contradiction and then unraveling it for its essential elements to make a better, clearer proof. Proofs in combinatorics (often by contradiction) don't even seem to have a better, clearer proof buried inside; they are just essentially inscrutable.
Short digression:
It's been interesting to read the criticisms of Uber on HN. Every single one has been accompanied by harsh criticisms of the taxi medallion system. Now, I'm quite certain that only a tiny proportion of HN has any working knowledge of that system (where working knowledge would be something like your reading, teaching, and genre literacy in proofs), but the general uninformed opinion seems to be that Uber's disruption of that industry is welcome and has no significant drawbacks to speak of.
With that in mind, what do you think is going to happen when someone in Silicon Valley disrupts higher education accreditation and/or secondary degrees? Don't get me wrong-- I think there is a lot of value in university education and university life. But if people who have never even heard of taxi medallions are quick to write off the costs of replacing them with an unregulated contractor, won't they be even more willing to say goodbye to a system whose instructors (themselves mostly independent contractors) cannot or will not accurately assess the correctness of a student's work?
I don't mean to single you out here-- I've certainly done similar. The point is everyone on HN has their own stories of experiencing this problem-- where the intellectual environment advertised by the university was violated. Coupled with the absurd debt students today carry (at least in the U.S.), that is more than enough ammunition for a disruptive company to put an enormous dent in the undergraduate degree. My fear is that this will strip the money needed for grad programs-- where the real learning and research happens-- and the modern uni will fall like a house of cards.
Have you ever graded? If you have, you would know that there is no such thing as "the correctness" in anything that isn't close to perfect. The OP had to decide how to assign a number to something that was in small parts right, but completely misguided. And then they wrote a long comment explaining the problem.
That's far more useful to the student than just numbers. Maybe the student could actually read the comment and learn from it. Any violation of the intellectual promise in OP comes from the student who doesn't put in the effort to learn.
I remember a physics academic I knew giving all his colleagues a short question and lots of sample incorrect answers, and asking them how they would grade them out of a certain mark. There was absolutely no agreement whatsoever. On the other hand, I imagine they all agreed that a correct answer would merit full marks.
Does it matter? In the UK, your final degree generally has one of only four classes and I expect most people end up roughly the right class. There are obviously borderlines but a first class student isn't going to get a lower second or vice versa because of subtleties in the marking scheme.
It's also an issue for me when too much weight is placed on coursework - work during the course should be for things like getting feedback on how to write proofs properly (if you are studying mathematics). By the time you come to sit actual exams, you really shouldn't have any excuse for not knowing how to.
In terms of disruption to accreditation, I think one of the main things that a degree of a certain standard signals is whether a person works hard in an environment less structured than school and can learn independently. That is much harder to measure than knowledge or raw IQ.
> ...whose instructors... cannot or will not accurately assess the correctness of a student's work?
Have you graded proofs? GP's approach is entirely sensible.
Proofs are written for humans. Writing a proof upside down is wrong, even if the derivation is correct.
The underlying cause of that incorrectness could be a simple matter of poor communication, or the student might have a more fundamental misconception about how implication works (and/or quantifiers). Sometimes the instructor knows enough about the student to determine which is the case. Sometimes it's impossible without talking to the student. Hence the long note and (perhaps implicit) invitation to office hours.
I can't find anything in GP's post that indicates he "cannot or will not accurately assess the correctness of a student's work", aside from the obvious impossibility of projecting a multi-objective assessment of a fundamentally human task onto an integer in [0,100].
If anything, GP provides an excellent example of why auto-grading for upper division courses -- especially courses that are trying to teach students how to write for a human audience -- will probably require fundamental scientific advances in AI in addition to innovation on top of existing tech stacks. My money's on self-driving cars happening first.
> With that in mind, what do you think is going to happen when someone in Silicon Valley disrupts higher education accreditation and/or secondary degrees?
This already happened! University of Phoenix has been around for a while.
> The point is everyone on HN has their own stories of experiencing this problem-- where the intellectual environment advertised by the university was violated
If you want to know how people really feel, look at where they send their kids rather than what they say (or say they will do) in online debates. Again, University of Phoenix et al. have been around for a while.
> My fear is that will strip the money needed for grad programs-- where the real learning and research happens-- and the modern uni will fall like a house of cards.
Masters programs are cash cows.
Top Ph.D. programs usually aren't appreciably funded through undergraduate programs. And when they are, it's almost always backed by Ph.D. student labor. Which is crazy cheap, and which startups are going to have a very hard time competing with.
> Short digression: It's been interesting to read the criticisms of Uber on HN.
> Have you graded proofs? GP's approach is entirely sensible.
GP characterized the mistake as a "significant misunderstanding" of the material. GP's use of the word "just" strongly implied that the assessment inside the long, probably unread note was not reflected in the grade. If that is indeed the case, then the student's "significant misunderstanding" of the material wasn't reflected in the calculation of their final grade, which is the textbook definition of grade inflation.
Again-- this kind of grade inflation happens often. (If the class size is greater than 75 and fulfills one of the uni's writing requirements, I'd argue it's almost a requirement that it happens.) It's a truth about an impossible situation with only bad choices, not an ad hominem.
> This already happened! University of Phoenix has been around for a while.
By disruption, I mean Uber/AirBnB style disruption. Disruption resulting in customers who leverage the technology saying things like, "Dude, it's changed the way I go out in Atlanta." I seriously doubt the existence of University of Phoenix has ever made an employer say anything close to that. (E.g., "it's changed the way we hire employees.")
I mean an accreditation service that basically tells an employer: look, do you really want to know if that employee is worth hiring?
I doubt this even requires any testing whatsoever-- just require incoming freshmen to get a Chromebook. At the end of their studies there could be a new Google "Fasttrack" app. You could make it almost like a digital scratch ticket-- grant Google the relevant permissions and you can see whether you really graduated with honors.
> If that is indeed the case then the student's "significant misunderstanding" of this material wasn't reflected in the calculation of their final grade, which is the textbook definition of grade inflation.
I don't know; it depends. I typically write my rubrics so that there are "style" points. In a really easy proof, style might be up to half the available points. In a very difficult proof perhaps only 1/10 or less.
But even that doesn't totally solve the problem. Some mistakes are on the border between style and substance. Grading and assessment aren't a perfect science.
> I seriously doubt the existence of University of Phoenix has ever made an employer say anything close to that.
Call me old-fashioned, but IMO the actual education is still the hard part of the Education Industry.
My comments have all assumed the goal is student understanding. IMO the obsession over grades (by students and by external observers), testing, and accreditation is completely misguided. Focus on exceptional student outcomes and the rest is easy.
Transforming a statement, using a series of equivalence transforms, into an always-true statement isn't a valid proof? After all, you've shown that the original statement is the same as a true statement. Isn't that a valid proof as well? It is often easier to go in that direction, since you have a clear starting point. In the other direction, you'd have to guess where to start.
1. Many, many theorems are not full equivalences. Students who write proofs bottom-up are, ime, MUCH more likely to mistake implications for equivalences when using theorems/lemmas that are not equivalence conditions.
2. Writing proofs upside-down is bad form. Sometimes it doesn't matter -- e.g., for very short proofs or for proofs that are just a lot of symbol pushing. But for proofs that benefit from intuition, writing the proof in the correct order (from assumptions to conclusion) is infinitely easier to follow.
3. Even when all the steps are honest equivalences, it's a matter of convention and familiarity. Yes, community norms are sometimes arbitrary and silly. And there are all sorts of exceptions to those norms. But I don't walk around my neighborhood naked or in a speedo, and I don't write relatively short direct derivations upside down ;-)
4. This is not unique to mathematics. Legal texts, philosophical texts, and even business plans tend to follow the same structure when presenting arguments.
> In the other direction, you'd have to guess from where to start.
But a proof is NOT a brain dump of your process! It's an exposition of your final result.
It's fine to write it that way as long as you justify that the steps are equivalent, thereby showing you know what you are doing. If you don't write something like 'by equivalence', then the assumption should be that you don't know what you are doing. Or, when a proof (or a step of one) actually involves no equivalence, mark it wrong. Don't patronize your students, but do write better problems, perhaps.
>It's fine to write it that way as long as you justify that the steps are equivalent and thereby showing you know what you are doing.
Have you heard of cargo culting? Some students will just learn that writing "by equivalence" sometimes gets them most of the points. Correct proofs don't necessarily indicate understanding. That's not patronizing, it's just true.
But that's really beside the point.
The primary problem with these proofs is not that they are logically wrong. Re-read #2 and #3 above.
These proofs are usually difficult to read and understand, in part but not exclusively because they break with traditional style. And not just mathematical style (#4).
Don't be the guy who complains that he doesn't need to follow style guides or write maintainable code because "my code doesn't have bugs".
I don't want a brain dump of the student's thought process. I want a clear exposition.
Sometimes upside down proofs provide a clear exposition; in #3 I mention these exceptions! But more often, the proof is upside down because the student is narrating their problem solving process instead of communicating a valid argument.
Unreadable code is technical debt.
Unreadable documentation will lose you customers.
Unreadable proofs are wrong.
Developers who write easy-to-understand code and clear documentation are strictly superior, all else equal, to developers who write in an idiosyncratic and difficult-to-follow style.
Math is no different, and one goal of proof-based courses is to teach students how to communicate technical content clearly.
> Don't patronize your students but do write better problems, perhaps.
At this level, the problem is usually just a bog standard statement of a theorem in established mathematical prose.
Some number of students new to proof writing will write upside down proofs regardless of what you ask them to prove.
The wording of the question does not prompt this behavior.
Tell them they're required to include equivalence markers (⇔) and implication markers (⇒) in their proofs. The arrows make it obvious when the proof is correct and when it's wrong.
I've noticed this (backward steps) with some friends. I think it's because people don't think logically with math notation--they just play with symbols until they get the sentence they want.
To get around this, I just get people to argue aloud. People wear their logic hats again when they have to talk to a human and convince them of something.
Alternately, it's a very useful technique to identify the penultimate step; it's often a useful rephrasing of the problem, and gives something easier to prove. And then just repeat...
(It's probably usually a hangover from poorly taught rote proof exercises in a lower grade, though.)
Writing the proof in the wrong direction is useful in some kinds of problems, for example proving by induction that 1^3 + 2^3 + 3^3 + 4^3 + ... + N^3 = N^2 * (N+1)^2 / 4.
My recommendation to my students is to write the "proof" in the wrong direction on scratch paper, then rewrite it carefully in the correct order on the final paper, checking that all the implications go in the right direction, and then destroy the scratch paper.
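As a sanity check (not a substitute for the induction argument), the sum-of-cubes identity can be verified numerically for small N; a quick Python sketch:

```python
# Spot-check 1^3 + 2^3 + ... + N^3 == N^2 * (N+1)^2 / 4 for small N.
# This only tests the identity the induction proof establishes.
def sum_of_cubes(n):
    return sum(k**3 for k in range(1, n + 1))

def closed_form(n):
    # n*(n+1) is always even, so integer division here is exact.
    return (n * (n + 1)) ** 2 // 4

for n in range(1, 100):
    assert sum_of_cubes(n) == closed_form(n)
```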
I think it ultimately comes down to the level of primary school teachers, and good luck reversing that. I taught at a community college some classes for future elementary school teachers. When I found things like 1/2+1/3=2/5 I tried my best to get everybody's attention as to how important fixing this misconception was, especially given they'd be teaching children in a few years... Unfortunately, I mainly received questions about how much of an impact an error like that would have on their grades. Major sigh...
This is merely a symptom of bad schooling, not of the students (well, maybe a little bit the students and the math/learning-hostile culture they grow up in).
There is no reason why kids who spent such a large part of their childhood, all day long, five days a week, in a learning institution should be unable to demonstrate basic learning-- yet many get good enough grades to get into college anyway.
Don't most college courses requiring math have an algebra recap course to start in first year? Likely as a result of this lack of basic math literacy.
I honestly don't blame a lot of kids who enter college not knowing how to add fractions. Math classes are pretty awful in most schools and they likely just memorized enough stuff before tests then subsequently forgot it.
They didn't have any deep grasp of the fundamental rules, such as finding a common denominator.
Although there is no excuse for answering that by guessing.
When I started a journey down learning advanced math I bought 'Forgotten Algebra' [1]. It's a great recap over everything taught in elementary/high schools and can be finished pretty quickly. Any college course involving math should make this required reading the summer before school or as part of first year for those who fail a basic math test.
I'm convinced this wouldn't have been nearly as necessary if we had classrooms following the Khan Academy model: homework done in class in group settings with the teacher walking around helping, and lectures watched online, with interactive teaching tools. This is particularly useful for math.
Something built on top of Sage (now called Vocal) [1] or Python notebooks would be great for this.
For instance, contrary to the belief of many students:
sin(x + y) is NOT sin(x) + sin(y)
(x+y)^2 is NOT (x^2) + (y^2)
sqrt(x+y) is NOT sqrt(x) + sqrt(y)
1/(x+y) is NOT (1/x) + (1/y)
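Each of these can be refuted with a single plugged-in value; a quick Python check (my illustration), using x = y = 1:

```python
import math

# One counterexample is enough to refute each purported "identity".
x = y = 1.0

assert math.sin(x + y) != math.sin(x) + math.sin(y)     # sin(2) vs 2*sin(1)
assert (x + y) ** 2 != x**2 + y**2                      # 4 vs 2
assert math.sqrt(x + y) != math.sqrt(x) + math.sqrt(y)  # sqrt(2) vs 2
assert 1 / (x + y) != 1 / x + 1 / y                     # 0.5 vs 2
```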
Wow, seriously? The one with the fraction is especially painful. I'm not sure university is the right place for people who still make mistakes like that at that age (after more than a decade of public school). Certainly not STEM fields.
I have a story that is too long here about my struggles with math in high school, and while I'm maybe not the model CS student for a variety of related reasons, I think that I turned out mostly fine...
None of the symbolic manipulation really made sense to me until I started learning programming (and, much later, mathematical foundations) and how to define these operations. The answer to "Why isn't sqrt(x+y) = sqrt(x) + sqrt(y)" shouldn't be "It's not in the table of 'laws' we're allowed to use". Sure, you can (easily) provide examples where that purported identity is false, but trying to understand why was really hard for me, and I wouldn't accept it until I understood why.
But it takes a lot of "playing" with math to understand the value in thinking that way. Similarly to how some people start out programming with LOGO or BASIC or bastardized javascript, people who start out in math need some extra help in order to be put in the right mental place before they can think about it "correctly."
> Sure, you can (easily) provide examples where that purported identity is false, but trying to understand why was really hard for me
What do you mean by understanding why?
You can look at a graph of sqrt(x) and it's immediately obvious that sqrt(1+3) is much larger compared to sqrt(1) than sqrt(10+3) is compared to sqrt(10), but the graph is just an aggregation of an infinite number of counterexamples. The answer to "why isn't something true?" really is just a counterexample.
What "why" were you looking for beyond "that isn't a property that sqrt(x) has"?
That's not really relevant. You only need to check if the equations hold if you don't understand the math involved. You've already lost at that point.
Checking is the smart thing to do then, but ideally what is important is the understanding rather than just the diligence to make sure a guess isn't trivially shown incorrect.
I strongly disagree. Brains are fallible; a quick check at the end lets you unit test your understanding. I find the ability to identify one's own mistakes and correct them to be a sign of an excellent student.
I agree with you in general, but not when only talking about the mathematical understanding.
If you want to add fractions by individually adding the numerators and denominators and the only reason you don't is one of these checks, you have some kind of fundamental lack of understanding about fractions. It's great that you are wise enough to verify, but that's irrelevant to your mathematical understanding.
Misunderstandings also exist, though, at many stages of the game. There's a pair of famous books-- Counterexamples in Topology and another on analysis-- dedicated entirely to counterexamples that dispel some common misunderstandings.
And these misunderstandings are not limited to undergrads; often they stand in the way of progress at the edge of research, simply because we don't actually know what's possible. Research-level math is often guided by folklore: important conjectures and shadows of what might be. And when the folklore is wrong, we go down the wrong track until someone finds a counterexample...
>That's not really relevant. You only need to check if the equations hold if you don't understand the math involved. You've already lost at that point.
You have no idea how much understanding I gained by doing such 'quick checks', sometimes even during exams. Just a little sanity check to be sure you're still on track.
Besides, inserting 1 for both x and y is an actual mathematical proof that the equation does not hold! It's always a fantastic thing to strengthen your understanding with formal proofs that you came up with by yourself.
I've always done those quick checks myself because otherwise I'm prone to making errors. They give me a quick answer of "no, you can't do this" but I never found they actually improved my understanding of anything. It just told me I can't do the thing I wanted to do and it was time to move to the next idea. It's a valid proof, but not one that really ever helped me get a grasp on what's happening.
Mostly it just means I'm in trouble and need to be really careful and do more checks because I'm working with things I don't understand fully.
For example, I've never needed to do that sort of check with adding fractions. I have enough understanding of what those numbers represent and how they work together that I've just never thought I might want to add numerators and denominators separately.
I regularly teach math to university students, and get favourable feedback.
When somebody asks why 1/(x+y) isn't equal to 1/x + 1/y, I ask why it should be. Just because something looks intuitive or pleasing does not mean it's true. While this is only a meta-level answer to the question at hand, I consider it more important to learn than the special case. The students need to learn basic scientific methodology, for then they wouldn't ask such questions: one of the basic pillars of science, scepticism, easily empowers the student to answer the question on their own. Many haven't learned the scientific method in school, since in school pupils are conditioned to say what the teachers want to hear, not what is true.
When I'm convinced that the former point has been made clear, I proceed to the next step of how to approach a question like this. First, what is the statement supposed to mean? What are x and y allowed to be? Naturals? Integers? Reals? Complex numbers? Matrices? Polynomials? All, of course, restricted to those where the expressions are defined, since worrying about the truth of an undefined statement is not sound. The context usually determines the possible range of the variables, but one needs to learn to identify that, which is akin to type inference.
Then again, maybe someone simply forgot whether that equation was one of the proved ones or not. The overwhelming majority of wrong, practically occurring mathematical statements have small counterexamples. That's why I recommend plugging in arbitrary values for the variables that seem interesting or make evaluation easy. Whereas intuition and beauty don't imply truth, they remarkably successfully guide us to where to look first. For this specific problem, x = 1 and y = 1 seem like a good idea. We immediately see that they form a counterexample inside all the mentioned possible ranges for the variables. Now we know the statement is in general wrong, and we can move on.
Depending on whether time allows, I would then show that for real x and y we have

1/(x+y) = 1/x + 1/y
  <=>  (multiply by xy(x+y))
xy = y(x+y) + x(x+y)
  <=>  (apply the distributive law and subtract xy)
0 = y^2 + xy + x^2
  <=>  (use the well-known p-q method)
y in {-x/2 - sqrt(x^2/4 - x^2), -x/2 + sqrt(x^2/4 - x^2)}
However, the expression under the root, x^2/4 - x^2, simplifies to -3/4 * x^2, which is always negative for real, nonzero x, so there is no real solution in y. We have shown that the statement is not just wrong in general: it is always false when the variables are restricted to the reals, while true cases do exist for complex variables.
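The negative-discriminant argument is easy to spot-check numerically; a small Python sketch (my addition, not part of the original comment):

```python
# For real x != 0, the quadratic y^2 + x*y + x^2 = 0 has discriminant
# x^2 - 4*x^2 = -3*x^2 < 0, so 1/(x+y) = 1/x + 1/y has no real solution.
for x in [0.5, 1.0, 2.0, -3.0]:
    disc = x**2 - 4 * x**2
    assert disc == -3 * x**2
    assert disc < 0

# Grid search over nonzero reals: the two sides never coincide.
xs = [k / 4 for k in range(-20, 21) if k != 0]
for x in xs:
    for y in xs:
        if x + y == 0:
            continue  # left-hand side undefined
        assert abs(1 / (x + y) - (1 / x + 1 / y)) > 1e-9
```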
It's a bit funny that I explain the things in that order while I despise philosophy. Learning the scientific method is not philosophy, or — to state it better — it's as much philosophy as learning mathematics as a physicist is.
Now, I don't really know anymore why exactly I wrote this down here. Probably it was just the pedagogue in me wanting to make clear that questions like these do not indicate a lack of mathematical talent, but a lack of scientific education.
Do you know a way to teach science and math in a more interactive way than blog posts or youtube videos on the internet? I would love to do that.
Similarly, very little of the trig and pre-calculus stuff that I had to learn in math class made any sense to me, until I either had to use it for drawing graphics with QBasic in my programming class, or for a plethora of things in my shop classes.
Part of the point of inventing concise notation for mathematics is so that you can "let the formalism carry you through"; a great deal of the time you can write down the next line of a proof by looking at the previous line or two and simplifying something without thinking too hard about what it means.
One of the things that the students are practising is performing a longish series of operations while thinking about the important stuff and letting the simple stuff take care of itself.
So I don't think it should be too concerning if a student starts to make mistakes like this: what's gone wrong may not be "the student doesn't understand what sin means", but "the student has gone astray in converting their understanding to an internalised rule of formal manipulation".
(It's a bit like if a younger pupil mistakenly gets it stuck in their head that 6x7 is 49: it doesn't mean they've fundamentally misunderstood what multiplication is.)
My algebra teacher in college made the first mistake. I still remember it 21 years later:
1. She was reducing some formula, and got it down to cos(a - 0), which is obviously cos(a). The problem is, her cheat sheet had the solution as sin(a).
2. She "expanded" to cos(a - 0) = cos(a) - cos(0)
3. We know cos(0), so this leads us to cos(a) - 1
4. This is where she got stuck. After thinking for a long time she wrote on the board: "(from trigonometry) cos(a) - 1 = sin(a)"
5. Final answer: sin(a)
I went up to the board and explained step 2 was wrong and how she should've done it, and I explained that step 4 was wronger, because that would mean that for some angles cos/sin would be outside the [-1, 1] range.
She said, "let's try your way then, young man", and doing the expansion on cos(a - 0) got it to cos(a). She then proceeded to say I was wrong, because the answer had to be sin(a).
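A quick numeric check (my illustration) shows exactly where each step fails, at an arbitrary angle:

```python
import math

a = 0.7  # arbitrary test angle in radians

# The step she skipped: cos(a - 0) really is just cos(a).
assert math.cos(a - 0) == math.cos(a)

# Her step 2 "expansion" subtracts cos(0) = 1, so it is off by exactly 1:
assert math.cos(a) - math.cos(0) == math.cos(a) - 1

# Her step 4, cos(a) - 1 = sin(a), fails for this (and almost every) angle:
assert not math.isclose(math.cos(a) - 1, math.sin(a))
```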
I still thank her for teaching me to question authority.
Mathematics is usually taught as rote. Many, even at higher levels of academia, never realize that mathematics is a language. Truth be told, I'm not sure most are cut out to fully grasp much beyond rote.
I hated math. I hated it with a passion. I was figuring out what the area of a right triangle was, when the teacher spoke from behind me. He said, "You know, you're just squaring the triangle and dividing that in half."
I realize how trivial and silly that is. But, it was that step that made me realize that the notation was telling me something. I realized that it was a language, one which I didn't understand and still hated.
I'd eventually get my Ph.D. in Applied Mathematics, all because I learned you could square a triangle, find the area of that, and divide it in half. I'm not sure how many folks can point to the exact moment their life changed? I can, and I am glad.
I don't know the solution, because math really can be hard. Not many folks want to learn that much. I am reluctant to say this, but maybe they don't really need to?
> And for some purposes, an ellipsis is not just a convenience, it's a necessity. For instance, "1, 2, 3, ..., n" represents all the integers from 1 to n, where n is some unspecified positive integer; there's no way to write that without an ellipsis.
This claim is wrong. Let n be a positive integer. If one wants to denote the set of integers between 1 and n, then one can do so by just formally writing what one wants: {k in Integers | 1 <= k <= n}. If one wants the (ordered) list of integers between 1 and n, then one can write (k)_{k=1}^n.
That is not necessary, at all. One can always just literally write "the list of ascendingly ordered integers that are greater or equal to 1 and smaller or equal to n". The introduction of formal notation is just a shortcut.
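Programming languages make the same point: the list can be named without an ellipsis. For example, in Python (my illustration):

```python
n = 7  # any positive integer

# The set-builder form {k in Integers | 1 <= k <= n}, as a comprehension:
ints_1_to_n = [k for k in range(1, n + 1)]

assert ints_1_to_n == [1, 2, 3, 4, 5, 6, 7]
```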
One point that wasn't noted is that division is anticommutative in the multiplicative sense: a/b is the reciprocal of b/a, just as a - b is the negative of b - a. It has always intrigued me that commutativity is seen as applicable to all operators and functions with the necessary property, while anticommutativity seemingly gets discussed only for the additive (prefix-minus) variety.
It seems unlikely that students who need this advice will be willing to read all the words in this essay. But, so long as they continue to pay tuition...