There is an avalanche of gig economy employee management and monitoring software (Teramind, ActivTrak, Ekran System, BrowseReporter, workpuls, Cerebral, monday.com, DeskTime, Time Doctor, etc.). Many of these systems contain algorithms that surveil and manage workers. These tools eliminate much of the risk of outsourcing work to remote workers who cannot be directly supervised. They enable remote workers by making them eligible for employment and by engineering a higher degree of trust into the relationship.
I don't think it even needs to go that far. If you choose to delegate your decision-making to a machine, you should be just as responsible for whatever results from that; it is your role to ensure the machine is making the right choices.
I think it's a harmful act of technological hubris to suggest that it's possible. Employment is a human relationship. I don't see much evidence that humans are able to construct a system that can respond to all of the human interaction that goes into a human relationship. Look at the state of the art of HR software, let alone spyware.
If someone invented such a system, I wouldn't want to stop them from using it; I'd just hold them accountable. I personally wouldn't trust a computer to fire someone if I were on the hook for its choices.
I don't trust any of the feedback loops, or the good faith of the kinds of organizations that would use them. Show me a time when a company was truly accountable for a "computer mistake" and didn't just shrug it off once the damage was done. The fable of the scorpion and the frog springs to mind.
I don't think it's actually a defence now, except maybe in the court of public opinion. For example, I'm certain courts would find that it's still sexual harassment if you build a machine to go around looking up women's dresses instead of doing it yourself. The problem isn't a lack of accountability for the machine so much as it is a lack of accountability generally.