I've always thought of "jitter" as just adding a dash of entropy to the system I'm building.
A textbook example: if I'm managing a pool of long-running threads that I want to periodically bounce, I could write explicit logic to throttle respawning them. Or I could just add a rand() to the conditional -- introduce "jitter" -- and let probability theory do the throttling for me.
I'd rather build on top of probability than statistics.
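Here's a minimal sketch of what I mean, in Python for brevity. The supervisor loop, the `maintenance_tick` helper, and the tuning constants (`RESPAWN_PROBABILITY`, `MAX_AGE_SECONDS`) are all hypothetical names I'm making up for illustration; the point is just that the "throttle" is nothing more than a random draw in the conditional.

```python
import random
import threading
import time

RESPAWN_PROBABILITY = 0.1  # hypothetical knob: ~10% of eligible workers bounce per pass
MAX_AGE_SECONDS = 60.0     # hypothetical threshold after which a worker may be bounced


class Worker:
    """A long-running thread we periodically want to bounce."""

    def __init__(self):
        self.started_at = time.monotonic()
        self._stop = threading.Event()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def _run(self):
        while not self._stop.is_set():
            time.sleep(1)  # stand-in for real work

    def stop(self):
        self._stop.set()

    def age(self):
        return time.monotonic() - self.started_at


def maintenance_tick(pool):
    """Supervisor pass over the pool, run periodically."""
    for i, worker in enumerate(pool):
        # Explicit throttling would count how many workers were already
        # bounced this pass. Jittering the conditional instead: each
        # eligible worker is bounced with a small probability, so restarts
        # spread across many passes and rarely pile up all at once.
        if worker.age() > MAX_AGE_SECONDS and random.random() < RESPAWN_PROBABILITY:
            worker.stop()
            pool[i] = Worker()
```

No counters, no rate limiter, no bookkeeping about what was bounced last pass: the expected number of restarts per pass falls out of the probability, and that's the whole trick.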