> Apparently, it's not even something students feel they need to hide.
Which is good! If a bit of work is trivially accomplished by a machine, we should take it for granted and move on to the next layer of complexity. I have always maintained that teachers complaining about students cheating at homework assignments with AI need to instead work on providing better homework.
Should we? Basic arithmetic has long since been solved, but I've met plenty of people who struggle at higher-level math because they haven't mastered enough basic arithmetic. Solving a complex problem often involves solving many much simpler problems along the way. Offloading each of those small steps to another system takes vastly more time than solving them mentally, which makes the complex problem as a whole far more expensive to solve. Eventually students will reach problems whose price they can no longer afford.
It is related to the reason we teach concepts starting with simple, small, easy-to-solve problems before building up. If I want to teach a student how to find the limit of x*sin(1/x) as x approaches 0, they need to understand quite a bit of math to even know what the problem is asking.
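(Just to make that concrete, here is the usual squeeze-theorem argument for that limit; even reading it presupposes absolute values, inequalities, the sine function, and the very definition of a limit.)

```latex
% Since |sin(1/x)| <= 1 for every x != 0:
%   0 <= |x*sin(1/x)| <= |x|, and |x| -> 0 as x -> 0,
% so the squeeze theorem gives the limit.
\[
  0 \le \left| x \sin\tfrac{1}{x} \right| \le |x|
  \quad (x \ne 0), \qquad
  \lim_{x \to 0} |x| = 0
  \;\Longrightarrow\;
  \lim_{x \to 0} x \sin\tfrac{1}{x} = 0 .
\]
```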
No, we’re supposed to believe that you don’t need to learn any of that, because some technology exists and you’ll be better off with “higher level” tasks … that’s what they tell me, anyway.
Well, as far as I know, that is a special case of an essay.
And this you can test onsite.
The kind of essays I had to write in school were more about nice-sounding words and less about content. CheatGPT can produce nice-sounding words, so I am hoping the focus will move towards rewarding content.
Knowing how to produce an essay is exactly the same as "analytical thinking, research, and argument skills" with the added challenge of making it legible to a reader — which is what makes those skills useful.
I suppose, but having written plenty of essays as an adult I can say with complete certainty that nothing I learned from my 5 paragraph days was of any use. No one, not you, not your teacher, not any real life audience for any topic you would be presenting on or publishing for, wants to read anything remotely close to what you're forced to write in school.
What you were forced to write in school. I readily admit that I had a quality of education several SDs higher than usual, but the trite "5 paragraph" nonsense is neither universal nor (more importantly) inevitable.
> By this reasoning nobody should ever learn anything, because it's all 'trivially doable' by machine.
> Like 'addition' and 'subtraction'.
A better analogy would be low-level coding. I don't know (or care) how my processor calculates `var f = 3+2` at the register level. And being able to ignore that allows me to focus on higher-level concerns.
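As a purely illustrative aside (using CPython bytecode rather than processor registers, which is a layer up from what I described): you can always peek one level down whenever you want, you just rarely need to.

```python
# Illustrative only: the layer just below "f = 3 + 2" in CPython.
# Exact output differs between Python versions.
import dis

dis.dis(compile("f = 3 + 2", "<example>", "exec"))
# Typically shows something like:
#   LOAD_CONST  (5)   <- 3 + 2 has already been folded into the constant 5
#   STORE_NAME  (f)
# You can write Python for years without ever looking at this layer.
```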
One of the required classes for my CS degree was Assembly Language. Nobody taking or teaching the class pretended there would be a great need for this language in a job setting. But that wasn't the point of the class.
I see what you mean, but it's not really a better analogy.
We need to learn how to do addition at some point, so we can't have ChatGPT do that.
We need to learn how 'registers' work, so we can't have ChatGPT do that.
We need to learn how basic algorithms work, so we can't have ChatGPT do that.
AKA: almost anything that gets assigned as homework is the 'thing to be learned', so it's ridiculous to suggest that ChatGPT do it, and doubly so to gaslight teachers about it.
The thing is, what applies to you doesn't necessarily apply to everybody. Somebody has to understand low-level coding. Somebody has to be introduced to it without necessarily knowing going in that it will be a career path. Somebody will need to write compilers, reverse engineer, and design CPUs. Just because a skill isn't valuable to you or those you know doesn't mean it isn't valuable to others, especially those who don't know enough yet to know that it might interest them.
If you don’t know that, then you have a real gap in your knowledge of computing. I don’t mean that as an attack, but I don’t think it’s a badge of honor either. You could just as easily say the person typing a Word document doesn’t know what RAM is, so we can stop teaching that foolishness and let people focus on the “complex” things.
One could also argue, and many have successfully, that this type of thinking is why software is so much slower than in the 80s and 90s.
I don’t generally like the word “gaslight”, as it is normally used by the unsavory, but in this case I think you used it perfectly. That is exactly what people are doing, except for the extremely naive ones. I don’t really know why, though.
Gaslighting is just a name for a social interaction we didn't have a word for before. That's it. Which is different from terms like 'woke', which carry their own connotations and contexts.
Or just novel applications of the things you learn in class.
"Congratulations! You leaned depth-first-search! ^award noises^ Below is the algorithm for reference because memorizing it just for this test is silly. You're working on a real time mapping application called Maply. Locations are represented as nodes and all direct routes between any two nodes are represented by directed weighted edges."
a) Write a function that takes a start node, an end node, and a maximum distance to travel, and returns the shortest path between the two (one possible sketch follows after part c).
b) Your boss said that users need to be able to add stops along their journey. Write a function that takes the final path you computed in part a and the new node for the added stop, and computes an amended path that changes as few of the original legs of the trip as possible (you don't want to disorient your users).
c) Now your boss is saying you need to handle the situation where users make mistakes. Use the function you wrote in part b to implement this feature.
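For concreteness, here is a minimal sketch of what an answer to part a might look like. The dict-of-dicts graph representation, the string node labels, and the route data below are all my own assumptions, not part of the prompt, and it assumes non-negative edge weights.

```python
import heapq


def shortest_path(graph, start, end, max_distance):
    """Dijkstra's algorithm with a distance cutoff.

    graph: dict mapping node -> {neighbor: edge_weight} (assumed representation).
    Returns the node list of the shortest start->end path whose total length
    is <= max_distance, or None if no such path exists.
    """
    # Priority queue of (distance travelled so far, node, path taken).
    queue = [(0, start, [start])]
    settled = {}  # node -> best distance already finalized

    while queue:
        dist, node, path = heapq.heappop(queue)
        if dist > max_distance:
            continue  # this route is already too long
        if node == end:
            return path  # first time the end node is popped, the path is optimal
        if node in settled and settled[node] <= dist:
            continue  # already reached this node by a shorter route
        settled[node] = dist
        for neighbor, weight in graph.get(node, {}).items():
            heapq.heappush(queue, (dist + weight, neighbor, path + [neighbor]))

    return None


# Tiny usage example with made-up route data:
routes = {"A": {"B": 2, "C": 5}, "B": {"C": 1}, "C": {}}
print(shortest_path(routes, "A", "C", max_distance=10))  # -> ['A', 'B', 'C']
```

Parts b and c are then follow-on exercises on top of whatever the student wrote here, which is exactly what makes the assignment harder to outsource wholesale.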
Novel applications? You mean incremental combinations. You realize that finding and solving nontrivial problems is a much more complicated task, right? This seems to imply that someone who cannot do simple things can do complicated things; where is the evidence of that?
Why do even the most complicated areas of mathematics start with basic principles and work up to complex problems? Why don’t they just start with the Collatz Conjecture?
How do you move on to other layers of complexity if you don’t know anything? Answers to questions about history have been trivially available in books for centuries; does that mean we should take them for “granted” and that no one actually needs to know anything about history?
Where does this idea come from that you can be a Terence Tao of mathematics without even knowing basic algebra?