htons and friends are very 1980s, and I'll make a fairly strong claim: they should never be used in new code, with a single exception. The reason is that an int with network endianness simply should not exist. In other words, when someone sends you a four-byte big-endian integer, they sent you four bytes, not an int. You can turn those bytes into an int by shifting each one by the relevant amount and ORing the results together, and a modern compiler will generate good code for that.
The sole exception is legacy APIs like inet_aton() that actually require these nonsensical conversions.
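For illustration, here's roughly what that exception looks like in practice; the address, port, and function name below are just placeholders:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <string.h>
    #include <sys/socket.h>

    /* Sketch: the BSD socket structures hold values in network byte order,
     * so htons()/inet_aton() are unavoidable when filling them in. */
    struct sockaddr_in make_example_addr(void)
    {
        struct sockaddr_in sa;
        memset(&sa, 0, sizeof sa);
        sa.sin_family = AF_INET;
        sa.sin_port   = htons(8080);            /* sin_port is network order */
        inet_aton("192.0.2.1", &sa.sin_addr);   /* fills sin_addr in network order */
        return sa;
    }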
After re-reading your comment above, I'm actually confused. You think you should never store a big-endian int? That is ridiculous. Some architectures are big-endian. You should not be using custom byte-swapping in application code, because you cannot know, portably, what the endianness of the target architecture will be.
The ntoh* functions are the right approach, and your claim is not only strong, it's wrong. The ntoh* functions exist to transform network byte order into host byte order, and their behavior changes with your architecture's endianness: they are no-ops on big-endian hosts and byte swaps on little-endian ones.
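Concretely, something like this (the function and field are made up for illustration):

    #include <arpa/inet.h>
    #include <stdint.h>
    #include <string.h>

    /* Sketch: pull a 32-bit field out of a received buffer with ntohl.
     * memcpy sidesteps alignment and aliasing problems. */
    uint32_t read_length_field(const unsigned char *packet)
    {
        uint32_t be;                    /* still in network byte order */
        memcpy(&be, packet, sizeof be);
        return ntohl(be);               /* no-op on big-endian hosts,
                                           a byte swap on little-endian ones */
    }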
Although, IIRC, there is (or at least was) some disagreement about whether this might be UB. You could use a union to make it definitely not UB.
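Roughly like this, assuming the value in question is a 32-bit one read out of a byte buffer (the helper name is mine):

    #include <arpa/inet.h>
    #include <stdint.h>

    /* Sketch of the union approach: write the raw bytes into one member,
     * read the other, and let ntohl fix the byte order. */
    static uint32_t read_be32_via_union(const unsigned char buf[4])
    {
        union {
            unsigned char bytes[4];
            uint32_t      word;
        } u;
        u.bytes[0] = buf[0];
        u.bytes[1] = buf[1];
        u.bytes[2] = buf[2];
        u.bytes[3] = buf[3];
        return ntohl(u.word);   /* network order -> host order */
    }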
But none of these variants is sensible, and in fact they don't even translate to most languages safer than C. The correct way to write this code is:
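Something along these lines, turning four big-endian bytes into a host integer (the function name is just illustrative):

    #include <stdint.h>

    /* Shift each byte into place and OR the results together; no
     * assumption about the host's endianness is needed. */
    static uint32_t read_be32(const unsigned char *p)
    {
        return ((uint32_t)p[0] << 24) |
               ((uint32_t)p[1] << 16) |
               ((uint32_t)p[2] << 8)  |
                (uint32_t)p[3];
    }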
On any recent compiler, this will generate code that is as good or better, and it doesn't make pointless assumptions about the representation of uint32_t on the platform you're using.
So I stand by my claim: well-written modern C code should not contain any "network-order" values. It should contain bytes, vectors of bytes, and numbers.
My C code didn't mention ints, though, so I'm wondering where you got that from.
Your first example is UB and, again, is not something my example depended on.
Your final claims are overly cautious. It is perfectly fine to use uint32_t in this way. uint32_t is defined as a 32-bit unsigned integer. There is a bijection between network-order 32-bit unsigned integers and host-order ones, and ntohl is that bijection. It is no different from storing any other value. It is certainly not wrong.
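If you want to convince yourself of the round trip (the value here is arbitrary):

    #include <arpa/inet.h>
    #include <assert.h>
    #include <stdint.h>

    int main(void)
    {
        /* htonl and ntohl are inverses: converting to network order and
         * back recovers the original host-order value exactly. */
        uint32_t host_value = 0xDEADBEEFu;
        uint32_t net_value  = htonl(host_value);
        assert(ntohl(net_value) == host_value);
        return 0;
    }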