The one case so far where I've seen better results from tricks like reciprocal multiplication and shifts was when gcc had no way to know it could throw away almost half of the bits due to input range limits. It would be kind of nice to have, say, a 19-bit integer type.