Indeed that low-level filtering exists, but what I'm supposing happens at a higher level. Say you have a button in the top-left quadrant and your mouse is near the bottom-right corner. Now you move the mouse to point at the button. Your initial motion is a fairly broad sweep towards it. From the speed and direction, the OS could infer that you want to reach the button and ever so slightly steer your movement towards it. This would be reevaluated at each 'step' along the way, giving a smooth, spline-like trajectory, and as you get closer to the button its virtual size would grow ever so slightly, so that if you rush the click and undershoot or overshoot by a few pixels, the click still gets registered. Even when there's nothing of interest on screen, this system would still be processing samples but doing a NOOP, hence the delay being preserved in a game situation.
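To make the idea concrete, here's a rough sketch of what one 'step' of such a steering pass might look like. Everything here is hypothetical (the function names, the `max_assist` weight, the growth parameters are all invented for illustration, not anything an actual OS is known to do): the delta is nudged toward the target only when the movement already points roughly at it, and the button's virtual hit radius grows as the cursor approaches.

```python
import math

def steer_delta(cursor, target, raw_dx, raw_dy, max_assist=0.15):
    """Blend the raw mouse delta slightly toward the target when the
    movement direction already lines up with it (hypothetical sketch)."""
    speed = math.hypot(raw_dx, raw_dy)
    if speed == 0:
        return raw_dx, raw_dy  # mouse idle: NOOP
    # Unit vector of the user's actual movement.
    mx, my = raw_dx / speed, raw_dy / speed
    # Unit vector from cursor to target.
    tx, ty = target[0] - cursor[0], target[1] - cursor[1]
    dist = math.hypot(tx, ty)
    if dist == 0:
        return raw_dx, raw_dy
    tx, ty = tx / dist, ty / dist
    # Dot product in [-1, 1]: how well the move points at the target.
    alignment = mx * tx + my * ty
    if alignment <= 0:
        return raw_dx, raw_dy  # moving away: no assistance
    # Steer proportionally to alignment, capped at max_assist,
    # then renormalize so the user's speed is preserved.
    w = max_assist * alignment
    sx = (1 - w) * mx + w * tx
    sy = (1 - w) * my + w * ty
    norm = math.hypot(sx, sy)
    return sx / norm * speed, sy / norm * speed

def hit_radius(base_radius, cursor, target, grow=0.1, within=200):
    """Grow the button's virtual hit area as the cursor approaches it."""
    dist = math.hypot(target[0] - cursor[0], target[1] - cursor[1])
    if dist >= within:
        return base_radius
    return base_radius * (1 + grow * (1 - dist / within))
```

Reevaluating `steer_delta` on every input sample is what would give the smooth, spline-like path: each step only bends the motion a tiny amount, so the corrections accumulate into a curve rather than a visible snap.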
Such a scheme would be impossible to implement that close to the hardware, since it would have no knowledge of the GUI. I haven't the foggiest idea whether OSX does anything even close to this, but hey, you never know. It could just as well be some eager host-side smoothing to accommodate vintage or unknown mice.