Nice effort, and a good example of how to cleanly present mathematical and computational concepts in HTML with some interactivity.
The positioning (target audience) is somewhat incongruous though. Somebody who is not familiar with basic derivatives is severely lacking in mathematical background and will not be able to start from zero and understand backpropagation by the end of reading the post (nor should they try).
Wherever you go (I second the Khan Academy recommendation), you'll need to become familiar with at least basic linear algebra (vector spaces, linear independence, spanning, bases, matrices and their operations) and multi-variable calculus (partial derivatives, the chain rule [and other basic derivative rules], the gradient vector, and gradient ascent/descent for finding the local maximum/minimum of a function) before you can really understand machine learning and backprop.
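To make the gradient-descent part concrete, here's a minimal sketch (my own, not from the post; the function f(x, y) = x^2 + y^2, the starting point, and the step size are arbitrary choices):

```python
# Minimal gradient descent on f(x, y) = x**2 + y**2, whose gradient is (2x, 2y).
# The minimum is at (0, 0); each step moves against the gradient.
def grad_f(x, y):
    return 2 * x, 2 * y

x, y = 3.0, -4.0   # arbitrary starting point
lr = 0.1           # learning rate (step size), chosen by hand
for _ in range(100):
    gx, gy = grad_f(x, y)
    x, y = x - lr * gx, y - lr * gy
print(x, y)        # both values end up very close to 0
```

Backprop is essentially this loop scaled up: the chain rule computes the gradient of the loss with respect to every weight, and gradient descent takes the step.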
If you do better with an honest-to-goodness course, I'd recommend Outlier, as they have for-credit courses in introductory stats, calculus 1, college algebra, and pre-calc, depending on your previous math level: https://www.outlier.org/
Many people, however, have found more success starting with a course like fast.ai, which does not assume much mathematical maturity, before later diving into mathematics. But I'd still recommend learning the math, even if you ultimately don't fall in love with the field of statistical learning. I've met many people who took a "higher" math course for the first time as adults and discovered that they loved the subject so much more than they did in high school.
I think it's quite challenging to pursue this as self-study. It's a longish journey and it requires a lot of stamina and self-discipline.
Think of it as a pyramid, with ML at the tip as your objective and a number of mathematical topics as the prerequisite foundations. You need to build them layer by layer, so that each next step is stable and doesn't fall apart at the first difficulty.
Listicles are easy to produce: multivariable calculus, analytic geometry and linear algebra, Fourier analysis, probability and statistics, etc. But to understand these things rather than stochastically parrot them (the way an LLM would), you'd need to spend serious time studying and solving problems in each of those areas.
The good news is that all of this knowledge is more or less what we call "applied mathematics". It does not require too much mathematical abstraction (which would require even more effort to develop).
Thanks for the kind words. In this article, I covered calculus in a way that provides a solid foundation for understanding core concepts like backpropagation in machine learning. My intention was not to create a math-heavy article but rather to offer an introductory explanation of the fundamental mathematical concepts needed for neural networks.
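For readers who want a feel for what "calculus for backpropagation" means in practice, here is a toy illustration of my own (not code from the article): the derivative of a one-weight squared-error loss, computed by hand with the chain rule.

```python
# Toy backprop by hand: loss(w) = (w*x - y)**2 for a single weight w.
x, y = 2.0, 2.0
w = 0.5

# forward pass
p = w * x        # prediction
d = p - y        # error
loss = d ** 2

# backward pass via the chain rule:
# dloss/dw = dloss/dd * dd/dp * dp/dw
dloss_dd = 2 * d     # derivative of d**2
dd_dp = 1.0          # derivative of (p - y) w.r.t. p
dp_dw = x            # derivative of (w * x) w.r.t. w
dloss_dw = dloss_dd * dd_dp * dp_dw
print(dloss_dw)      # 2*(w*x - y)*x = 2*(1 - 2)*2 = -4.0
```

A real network just repeats this local-derivative bookkeeping across many layers and many weights.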
I just skimmed through the article and immediately noticed that the code was taken from Andrej Karpathy's lectures [0][1]. This is not a bad thing, since the author included the bibliography. BTW, the author should fix the misspelling "Andrej Karaphy" (it should be "Andrej Karpathy").
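For anyone who hasn't watched those lectures: the code in question builds a tiny scalar autograd engine. A stripped-down sketch of the idea (my own simplification, not the article's or Karpathy's exact code) looks roughly like this:

```python
# A micrograd-style scalar Value: each result remembers how to push its
# gradient back to its inputs, so backprop is just the chain rule over a graph.
class Value:
    def __init__(self, data):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None

    def __mul__(self, other):
        out = Value(self.data * other.data)
        def _backward():
            # chain rule: d(out)/d(self) = other.data, and vice versa
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # the real engine walks the graph in reverse topological order;
        # for this single-operation example, one local step is enough
        self.grad = 1.0
        self._backward()

a, b = Value(2.0), Value(3.0)
c = a * b
c.backward()
print(a.grad, b.grad)  # 3.0 2.0, i.e. d(ab)/da = b and d(ab)/db = a
```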