I guess you're right. But I was trying to defend Valmar's statement, which I still think is basically right in this context. Computers can't really implement non-deterministic algorithms.
Well, they can. Quantum computers are inherently non-deterministic.
Even for a classical computer, access to a true randomness source (such as a sufficiently good hardware RNG) is enough to make its behaviour non-deterministic, and hence programs that rely on such a source are classified as non-deterministic.
In practice, we sometimes classify a program as non-deterministic even if it only has access to pseudorandomness, provided that pseudorandomness is "random enough". If a program uses a high-quality PRNG seeded with the current time, it might practically be considered non-deterministic, even though strictly speaking its behaviour is a deterministic function of the current time (and its other inputs).
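A small Python sketch of that distinction (the specific modules are just one way to illustrate it): a PRNG seeded with the current time is replayable once you know the seed, whereas `secrets` draws from the OS entropy pool, so there is no seed to replay.

```python
import random
import secrets
import time

# Deterministic in principle: the output is a function of the seed,
# so replaying with the same timestamp reproduces the exact sequence.
seed = int(time.time())
prng = random.Random(seed)
print([prng.randint(0, 9) for _ in range(5)])

replay = random.Random(seed)  # same seed -> identical output
print([replay.randint(0, 9) for _ in range(5)])

# Treated as truly non-deterministic: secrets reads from the OS
# entropy pool (and, on some machines, a hardware RNG), so there
# is no seed you could replay to reproduce the sequence.
print([secrets.randbelow(10) for _ in range(5)])
```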
I wanted to say the same thing, but you managed to find the words I couldn't.
> An algorithm, by definition, is an actual deterministic procedure
That's not necessarily true. There are non-deterministic algorithms: https://en.wikipedia.org/wiki/Nondeterministic_algorithm
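For instance, a Las Vegas algorithm such as randomized quickselect is non-deterministic in exactly the sense that article describes: its internal choices vary from run to run even though the answer doesn't. A minimal sketch in Python:

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element (0-indexed). The pivot is
    chosen at random, so the sequence of comparisons differs between
    runs on identical input, yet the result is always the same."""
    pivot = random.choice(items)
    lower = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    if k < len(lower):
        return quickselect(lower, k)
    if k < len(lower) + len(equal):
        return pivot
    return quickselect([x for x in items if x > pivot],
                       k - len(lower) - len(equal))

print(quickselect([7, 2, 9, 4, 1], 2))  # always 4, via varying paths
```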