The same can be said for the Mandelbrot set or any other fractal: there is a physical bound on the precision of the values. You could, in theory, run a NN with arbitrary-precision floats...
But the image is based on the attributes of each node in the NN matrix. The Mandelbrot set can be calculated at any precision you desire and it keeps going. An NN is discrete and finite.
That's not what it shows at all. It shows how varying hyperparameters (which are floats, and can thus be set to any arbitrary precision) affects the speed at which convergence happens. So it's some function F: R^n -> Z. It has literally nothing to do with the nodes in the neural network...
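To make the F: R^n -> Z point concrete, here's a hedged toy sketch (not from the original plot) of an "escape time" picture over hyperparameter space: plain gradient descent with momentum on the 1-D quadratic f(x) = x^2, where each (learning rate, momentum) pixel gets an integer step count, exactly analogous to Mandelbrot iteration counts. The function name, grid ranges, and tolerance are all illustrative assumptions.

```python
import numpy as np

def steps_to_converge(lr, momentum, tol=1e-6, max_steps=500):
    """Toy F: (lr, momentum) in R^2 -> step count in Z.
    Gradient descent with momentum on f(x) = x^2, starting at x = 1.
    Returns the number of steps until |x| < tol, or max_steps if the
    trajectory diverges or fails to converge in time."""
    x, v = 1.0, 0.0
    for step in range(1, max_steps + 1):
        grad = 2.0 * x                 # f'(x) for f(x) = x^2
        v = momentum * v - lr * grad   # momentum update
        x = x + v
        if not np.isfinite(x):         # blew up: treat as non-convergent
            return max_steps
        if abs(x) < tol:
            return step
    return max_steps

# Sample F over a grid of hyperparameters: each pixel is an integer
# iteration count, just like a Mandelbrot escape-time image.
lrs = np.linspace(0.01, 1.2, 60)
moms = np.linspace(0.0, 0.99, 60)
image = np.array([[steps_to_converge(lr, m) for lr in lrs] for m in moms])
```

The resulting `image` depends only on the optimizer dynamics in hyperparameter space, not on any individual network weight or node, which is the distinction being argued above.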