I don’t think it’s a skill issue. It’s that they’re really only useful with profile-guided optimization (PGO), so the hints reflect actual branch probabilities rather than guesses. But most developers don’t seem to bother with that.
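For reference, the GCC workflow is roughly the following (a sketch: -fprofile-generate and -fprofile-use are real GCC flags, but the file and function names are made up for illustration):

    // hot.c: a branch whose bias the compiler can't know statically
    int count_matches(const int *a, int n, int key) {
        int hits = 0;
        for (int i = 0; i < n; ++i)
            if (a[i] == key)   // PGO measures how often this is taken
                ++hits;
        return hits;
    }

    int main() {
        int a[8] = {1, 0, 3, 0, 0, 5, 0, 0};
        return count_matches(a, 8, 0);  // stand-in for representative input
    }

    // Typical GCC PGO steps:
    //   g++ -O2 -fprofile-generate hot.c -o hot   (instrumented build)
    //   ./hot                                     (run on representative
    //                                              input; writes .gcda data)
    //   g++ -O2 -fprofile-use hot.c -o hot        (rebuild using measured
    //                                              branch probabilities)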
I wouldn't say that PGO is necessary to predict branches correctly. Lots of branches are exceptional control flow (error checks, bounds checks, etc.) and compilers will almost always correctly statically predict not-taken for them. Any branch with a target that's postdominated by a throw or a process abort can basically be safely predicted not-taken (after all, if you got it wrong, the penalty is going to be minuscule compared to the cost of the exception).
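For example (a sketch I made up to illustrate the heuristic, not code from anywhere in particular):

    #include <cstddef>
    #include <stdexcept>
    #include <vector>

    // The taken side of the bounds check is postdominated by a throw, so
    // GCC/Clang treat it as cold: statically predicted not-taken and moved
    // out of the hot path, with no hint or profile needed.
    int get(const std::vector<int>& v, std::size_t i) {
        if (i >= v.size())
            throw std::out_of_range("index out of range");
        return v[i];
    }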
The hardware itself is already very good at predicting the direction. I think the point about PGO here is that you need it to find infrequently occurring biased-taken branches.
When you encounter a biased-taken branch that the machine isn't currently tracking, you're condemned to pay the cost of a misprediction, because the default prediction for an unknown (forward) branch is "not-taken". The hinting here is supposed to say: "when you first encounter this branch, don't predict not-taken by default."
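At the source level, this is what the hints express (a sketch using C++20's [[likely]]; GCC's __builtin_expect(cond, 1) says the same thing, and note that on most ISAs this affects code layout and static prediction rather than an actual hint bit in the instruction encoding):

    // Mark the branch as biased-taken so the compiler doesn't arrange
    // the code around the default not-taken assumption.
    void scale_nonzero(int* p, int n) {
        for (int i = 0; i < n; ++i) {
            if (p[i] != 0) [[likely]]   // usually taken in this workload
                p[i] *= 2;
        }
    }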
If memory serves, wasn't that a bit of a tooling minefield back then, too? It’s been a while, but I remember a few colleagues trying GCC’s version: it involved building a newer GCC, slowed the builds down a fair amount, and delivered only a small benefit – far less than they got from switching to the first AMD Opteron when it came out a year later.