It's performative in exactly the same way as the evangelical extremists I grew up among. You can't reduce moral decisions and ethics to some sort of karmic account balance. Functionally, EA lets its promoters rebrand self-interest as altruism. It becomes an all-purpose end that justifies any means, a thought-terminating cliché for avoiding engagement with the actual complexity.
And yet no one ever seems able to define a better moral decision-making process, or to describe what this argument-defeating "complexity" actually is. It's just a longer way of saying "I disagree."
What is the evidence that we actually know what's most effective in a lot of situations? If "effectiveness" is just a hypothesis, then the whole approach to resource allocation becomes quite shaky.
I think the OP is being unfair to Will and EA, but your statement is pretty outrageous. No one has ever come up with a moral or political philosophy other than utilitarianism? Are you serious?