Something I was wondering: if faking results is so common, then surely the things being researched must never get used in any real application, right? If they were, it would quickly be found that they don't actually work...
This is exactly how it works in practice. Anyone who works at the bench quickly learns to spot the frauds and fakes and avoid them. That's the "replication" everyone talks about; no special agency to waste funds on boring stuff needed.
> If they were, it would quickly be found that it does not actually work...
Unfortunately, some of the effect sizes are so small that it's hard to tell what's working or not. The results of papers on bodybuilding, for instance, are definitely put into practice by some people. If the claim of a paper is that eating pumpkin decreases muscle recovery time by 5%, how is an individual who starts eating pumpkin supposed to notice that he's not getting any particular benefit from following its advice? Particularly if he's also following random bits of advice from a dozen other papers, half of which are valid and half of which are not?
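To make that concrete with made-up numbers: if recovery time averages around 48 hours with day-to-day swings of roughly 12 hours, a 5% effect is about 2.4 hours, well inside the noise. A rough simulation sketch (all values hypothetical) shows how invisible that would be to a single person tracking their own workouts:

    # Sketch with made-up numbers: why a 5% effect is hard for one person to notice.
    # Assumed: ~48h mean recovery time, ~12h day-to-day standard deviation.
    import random

    random.seed(0)
    BASELINE_H = 48.0   # hypothetical mean recovery time (hours)
    NOISE_SD_H = 12.0   # hypothetical day-to-day variation
    EFFECT = 0.05       # claimed 5% reduction
    N_SESSIONS = 20     # workouts an individual might informally compare

    def sample(mean):
        return [random.gauss(mean, NOISE_SD_H) for _ in range(N_SESSIONS)]

    before = sample(BASELINE_H)
    after = sample(BASELINE_H * (1 - EFFECT))

    avg = lambda xs: sum(xs) / len(xs)
    print(f"observed difference: {avg(before) - avg(after):.1f}h "
          f"(true effect: {BASELINE_H * EFFECT:.1f}h, noise sd: {NOISE_SD_H}h)")
    # With 20 sessions per condition, the standard error of the difference in
    # means is about 12 * sqrt(2/20) ~= 3.8h, larger than the 2.4h effect, so
    # the observed difference will often be near zero or even negative.

So even a real 5% effect is statistically indistinguishable from nothing at the sample sizes an individual has, let alone a fabricated one.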
One problem I've observed is that practitioners often cargo-cult "proven" results from the scientific literature that aren't actually proven. It's easier to say you're following "best practices" than to check that what you're doing actually works, unfortunately.