Heat pumps work by moving heat rather than creating it, and they typically move more heat energy than the energy it takes to run them.
Basically, a refrigerator is a heat pump. The input electricity runs the pump, which moves heat out of the refrigerator. Typically, for every watt of electricity consumed, more than a watt of heat is moved out of the refrigerator.
They work by compressing gases into liquids and then letting the liquid expand back into a gas. When a liquid evaporates, it absorbs heat (the latent heat of vaporization). That heat is released again when the gas is compressed to a pressure high enough that it condenses back into a liquid. Do this in a loop, and you can move heat from one place to another.
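Here's a rough energy-balance sketch of that loop in Python. The numbers are made up for illustration, not real refrigerant data; the point is just that the condenser rejects everything the refrigerant carries, which is the absorbed heat plus the compressor work:

    # Rough energy balance for one vapor-compression cycle.
    # All numbers are illustrative assumptions, not real refrigerant data.

    compressor_work_j = 100.0   # electrical energy driving the compressor (J)
    heat_absorbed_j = 250.0     # heat soaked up when the refrigerant evaporates (J)

    # Condensing the gas releases everything it carries: the absorbed
    # heat plus the work the compressor put into it.
    heat_rejected_j = heat_absorbed_j + compressor_work_j

    # Coefficient of performance: heat moved per unit of input energy.
    cop = heat_absorbed_j / compressor_work_j

    print(f"heat rejected at the condenser: {heat_rejected_j:.0f} J")
    print(f"COP (cooling): {cop:.1f}")  # 2.5 -> each joule of input moves 2.5 J of heat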
Or, for an example that might be more readily convincing, here's something a person without any domain knowledge could simply test* to prove it to themselves:
Suppose you built a shed around the outdoor unit of a central air-conditioning system and let it heat up until its temperature stopped rising. Then, right next to it, you built an identical shed containing a resistive heater that consumes exactly the same wattage as the air conditioner. The first shed will be much warmer, because the air conditioner isn't so much creating heat as moving it: its outdoor unit dumps the electrical input plus all the heat pulled from the house. You'd have to increase the resistive heater's wattage by about 2.5x before the two sheds reached about the same temperature.
* Don't actually do this and expect the system to survive.
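For what it's worth, here's the back-of-envelope arithmetic behind the shed comparison. The outdoor unit rejects the moved heat plus the electrical input, i.e. (COP + 1) times the input wattage, while the resistive heater converts its input to heat 1:1. The COP here is an assumption chosen to match the "about 2.5x" figure above; real systems vary with conditions:

    # Back-of-envelope for the two sheds. The COP is an assumed value;
    # real air conditioners vary with operating conditions.

    ac_input_w = 1000.0   # electrical draw of the air conditioner (W)
    cop = 1.5             # assumed effective COP, implied by the ~2.5x figure

    # Shed 1: the outdoor unit rejects the moved heat PLUS the electrical input.
    ac_shed_heat_w = ac_input_w * (cop + 1)

    # Shed 2: a resistive heater turns its input directly into heat, 1:1.
    heater_shed_heat_w = ac_input_w

    print(f"AC shed heat output:     {ac_shed_heat_w:.0f} W")      # 2500 W
    print(f"heater shed heat output: {heater_shed_heat_w:.0f} W")  # 1000 W

    # Matching the sheds means upsizing the heater by (COP + 1):
    print(f"heater needs ~{ac_shed_heat_w / ac_input_w:.1f}x the wattage to match")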
I understand this, but the argument I was responding to was "you're already going to use the electricity, so you might as well make a profit." In fact, you could use less energy, so it's not a given that all of that electricity was going to be used anyway. It's really a choice between profit and carbon emissions.