I just cannot comprehend how some people fail to see the obvious moral responsibility involved in releasing something that could be used for a lot of good but also to do bad things. There's no reasonable moral theory in which you could just shrug and go, "well, somebody is going to do it anyway, so why should we even try to keep our conscience clean and avoid making it easy for them". That attitude is fundamentally amoral and antisocial.
If someone invents a lockpick capable of opening any door, they have a moral responsibility to prevent it from falling into the wrong hands, whether they want that responsibility or not. And it's absurd to complain when someone who could create a universal lockpick refuses to do so, let alone release the technology into the wild, and only agrees to sell simpler picks capable of opening simpler locks. Do these people also complain about efforts against nuclear proliferation? After all, North Korea got nukes anyway, so what's the point?
Your lockpick example only works if, say, there's only one of them and it's feasible to keep it hidden from the world.
Software doesn't work that way. I agree that you should be responsible about dangerous tech, but you also have to be realistic about what the best way to do that is, which is pretty much never "keep it hidden."
(And, of course, that's without even getting to the question that should probably come first: how dangerous is this, exactly? Given the moral panic we saw a while back about e.g. Photoshop, I'm not entirely convinced there's much to worry about here.)