More seriously, this is one of those situations where the short-term pluses are plain to see, but the long-term risks are vague. Why not employ the precautionary principle here?
Instead of simply alluding to long-term risks, can you be more specific about what exactly you're concerned about? What kind of precautions are you advocating? At some point medical technology will give us organs that are far superior to biological organs - should we not make these available to people?
It really depends what we're talking about here. Specific to your comments I can think of a couple things:
1) In terms of organ replacement, I agree it's a great opportunity -- but we really can't know the full spectrum of risks right away. Thalidomide, for example, was (for the most part) believed to be safe in the mid-50s, and it wasn't until the early 60s that awareness of its dangers began to set in. So I am being a bit facetious when I say "you first" -- if it suits your risk tolerance or if you are in exigent circumstances, sure thing, who am I to judge? But it doesn't suit my personal risk tolerance at this point in my life. (Maybe ask me in a few years when I'm old and sick ...)
2) In terms of "transhumanism" more generally ... I see great dangers, and I fear what a society of transhumans coexisting with humans would be capable of (if we haven't started down that road already). If transhumans really become vastly superior to humans, what will the relationship between them and the old-stock humans look like? Yuval Noah Harari said something to the effect of: the only model we have for that version of the world is the current relationship between humans and animals -- and the track record there is not exactly great ...
So, it's partially the risks from the tech itself, and partially the downstream consequences for society.