LLMs have one mode of input (or i guess two if they support images). Jailbreaking would be the equivalent of someone perfectly impersonating your boss and telling you no longer to follow their previous instructions. I could see many humans falling for that.