That's not what backpropagation is. Backpropagation is best thought of as a 'cheat' (an algorithmic simplification) that lets you calculate the derivative of a feed-forward neural net efficiently. To optimize a neural net relative to some cost function you need its derivative in the first place; you could then use, for example, gradient descent. Computing that derivative naively, however, is computationally costly, and backpropagation avoids most of that cost by reusing intermediate results through the chain rule.
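As a rough illustration (this sketch is mine, not from the original post), here's a tiny two-layer feed-forward net in Python/NumPy where the gradient of a squared-error cost is computed two ways: a naive finite-difference estimate, which re-runs the whole forward pass for every single weight, and backpropagation, which gets the same numbers from one forward pass plus one backward sweep of the chain rule. The network shape and the squared-error cost are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)          # input
y = rng.normal(size=2)          # target
W1 = rng.normal(size=(4, 3))    # hidden-layer weights (assumed toy sizes)
W2 = rng.normal(size=(2, 4))    # output-layer weights

def forward(W1, W2, x):
    h = np.tanh(W1 @ x)         # hidden activations
    out = W2 @ h                # linear output
    return h, out

def cost(W1, W2, x, y):
    _, out = forward(W1, W2, x)
    return 0.5 * np.sum((out - y) ** 2)

# Naive gradient: one extra forward pass per weight -- this is the
# "computationally costly" route.
def numeric_grad(param_name, eps=1e-6):
    params = {"W1": W1.copy(), "W2": W2.copy()}
    W = params[param_name]
    g = np.zeros_like(W)
    for idx in np.ndindex(W.shape):
        orig = W[idx]
        W[idx] = orig + eps
        c_plus = cost(params["W1"], params["W2"], x, y)
        W[idx] = orig - eps
        c_minus = cost(params["W1"], params["W2"], x, y)
        W[idx] = orig
        g[idx] = (c_plus - c_minus) / (2 * eps)
    return g

# Backpropagation: one forward pass, then the chain rule applied layer by
# layer, reusing the stored intermediate values h and out.
def backprop_grads(W1, W2, x, y):
    h, out = forward(W1, W2, x)
    d_out = out - y                        # dC/d(out)
    gW2 = np.outer(d_out, h)               # dC/dW2
    d_h = (W2.T @ d_out) * (1 - h ** 2)    # chain rule through tanh
    gW1 = np.outer(d_h, x)                 # dC/dW1
    return gW1, gW2

gW1, gW2 = backprop_grads(W1, W2, x, y)
print(np.allclose(gW1, numeric_grad("W1"), atol=1e-5))  # True
print(np.allclose(gW2, numeric_grad("W2"), atol=1e-5))  # True
```

Both routes agree, but the backprop version touches each weight once instead of re-evaluating the whole net per weight, which is exactly the simplification being described.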
For some neural nets you still have a gradient, but the concept of backward or forward propagation is not definable. Based on the topology and structure of biological neural nets, which do you think is the case there?