Isn't haptic feedback supposed to mean you feel something as confirmation that an action happened? If so, this is more like haptic feedforward: Apple Vision reacts because you felt something, and that sounds about as reliable as it probably is.
Apple doesn't react because you feel something. Apple estimates, based on the hand kinematics it reconstructs from its camera feed, when something happened. It is NOT waiting for the visual gap between the fingers to disappear, since that would require an exactly correct camera angle.
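To make that concrete, here's a minimal sketch of what a kinematics-based pinch detector could look like: it works on estimated 3D fingertip positions from a reconstructed hand skeleton, not on pixels, and uses a hysteresis band so the state doesn't flicker near the threshold. The class name, thresholds, and API are all illustrative assumptions, not Apple's actual implementation.

```python
import math

def _dist(a, b):
    # Euclidean distance between two 3D points (metres)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class PinchDetector:
    """Illustrative pinch detector over estimated fingertip positions.

    Thresholds are hypothetical: pinch starts when thumb and index tips
    come within `close_mm`, and only ends once they separate past
    `open_mm` (hysteresis avoids rapid start/end flicker).
    """

    def __init__(self, close_mm=12.0, open_mm=20.0):
        self.close_m = close_mm / 1000.0
        self.open_m = open_mm / 1000.0
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        """Feed one frame; returns 'pinch_start', 'pinch_end', or None."""
        d = _dist(thumb_tip, index_tip)
        if not self.pinching and d < self.close_m:
            self.pinching = True
            return "pinch_start"
        if self.pinching and d > self.open_m:
            self.pinching = False
            return "pinch_end"
        return None
```

The point is that the input is the estimated skeleton, so it degrades gracefully with camera angle: a partially occluded pinch still produces plausible joint estimates, whereas a "watch for the gap to close" approach would simply fail.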
I guess you get the natural haptics of your own fingers touching, but the feedback is visual/audio (it happens in software). In any case, the link between the haptic sensation and the visual/audio response is kinda broken on the Vision Pro.
The reason is that it's camera-based, unlike Orion. And this is why people describe Orion as magical, whereas nobody talks about the hand gestures of the Apple Vision Pro (though people do describe the AVP's eye tracking as magical).