I agree with you. I'm busy at the moment, but I think I'll come back later to see if it gets better. The editor may have had Hylton write a catchy but inauthentic hook to draw readership.
I went into the article thinking it was about some old software bugs that are inadvertently making it difficult for some evil organization to consolidate its plans and carry out its schemes.
I respect his accomplishments, but someone who gets thrills from life-endangering pursuits is not the person I would want releasing artificial organisms into the wild to save the world.
Upvoted for funny delivery, but I disagree with you. I, too, enjoy riding motorcycles fast, but that doesn't mean I'm not capable of rational thought when applied to my work.
The past 5 years is a pretty clear demolition of the idea that we should allow so much concentration of power by the Corporatized State (or anyone, really). Bright? Hardly. They're good at winning popularity contests, certainly. Well-trained? At what? Amassing power? The people most in charge of my life (i.e., over a third of my labor) aren't trained in jack. The guy at the top of the pile currently was a "community organizer", whatever the hell that is. The guy in charge of the pile before him was another big-government average kind of guy who had his success handed to him by his daddy.
So while I disagree with you as to the particular source of the problem for the last five years, I'm actually going to agree with you that overall human beings are bad at assessing systemic risk. While I really admire Craig Venter, I'm extremely distrustful of someone with grand ideas of releasing designer micro-organisms into the environment.
I'm all for us pursuing designer micro-organisms, but we should clamp down hard on safety protocols. I didn't read every line of the whole article (as someone else mentioned, the writing is pretty pathetic) -- but I didn't see any real text given over to the extraordinary danger of releasing organisms into the environment.
Really, I'd prefer if we restricted our organism construction for now to things that are big and can't really escape into the wild easily... say, that Woolly Mammoth project. It's hard to lose track of a Woolly Mammoth.
I don't think so. I also don't think that risk as applied to one's personal life (e.g. riding a motorcycle for fun) has much bearing on risk as applied to one's work.
Riding bikes is my main source of enjoyment in life (both racing and just going for a ride on the road), and I accept that in order to obtain that enjoyment I have to take a risk. That's OK. If I were building artificial lifeforms, sure, I'd get a kick out of it, but I would be a lot more careful than when I'm just riding a bike for fun.
I'm unclear on what you mean by "they" when you say "either they work or they don't" - his bugs? Or the way in which those bugs, once released, solve a particular problem? Or that, once released, those changes are non-harmful? My problem with risk is the attitude towards proof (or the principle of least harm) in the latter, not a quibble with the previous two. And I think we can take satisfaction of those two conditions as a necessary condition for the possibility of the third, right?
As the article states, his ideas have a history of being successful (though he almost certainly has a hell of a lot of unsuccessful ideas that we haven't heard about). I think that's what "moron" was referring to.
Good reminder about unsuccessful ideas; as I prefaced my original remark, I am impressed by his accomplishments. And I have no doubt that he (or someone else, eventually) would be able to build engineered organisms that could "work" (in the small sense), and survive and reproduce (and so work, in the larger sense). But that success is pretty distant from being able to predict and control the long-term effects once they're released. [edit: spelling/undo autocorrect]
https://www.nytimes.com/2012/06/03/magazine/craig-venters-bu...