Not necessarily. Lambda calculus is Turing-complete under the Church-Turing thesis, and it has nothing to do with 0s and 1s. The binary part is just an encoding choice of the von Neumann architecture.
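To make that concrete, here's a rough sketch (not anything from the thread, just an illustration) of Church numerals written as plain Python lambdas: arithmetic carried out entirely by function application, no bits anywhere in sight.

```python
# Church numerals: numbers represented as "apply f, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Convert back to a plain int only so we can print something.
to_int = lambda n: n(lambda k: k + 1)(0)

two   = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```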
The point is: anything complex can be dismissed as "just x, y, z" if you don't appreciate the massive body of work behind it.
I made that point because OP observed that "ML is just affine transformations," or something to that effect. Yes, that's one way to frame it, if you're willing to overlook roughly 30 years of research.
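For what it's worth, here's what that reductive framing looks like when taken literally (a minimal sketch, with made-up shapes and random weights): a tiny two-layer network is nothing more than affine maps Wx + b composed with elementwise nonlinearities. The "just" hides everything about why those particular compositions, trained a particular way, actually work.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 8)), np.zeros(16)
W2, b2 = rng.normal(size=(4, 16)), np.zeros(4)

def forward(x):
    h = np.maximum(W1 @ x + b1, 0)   # affine map, then ReLU
    return W2 @ h + b2               # another affine map

print(forward(rng.normal(size=8)).shape)  # (4,)
```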