There are about 7 billion people living on the planet. Evolution has given us large and maddeningly complex brains. Even discounting the occasionally deleterious effects of politics, religion, and culture in general on the human psyche, the sheer weight of the numbers dictates that you’ll find some number of people who are malevolently deranged. That would appear to be a fact we have to labor under at least until our understanding of the brain advances far beyond where it is today, something, I think, that is well over the horizon.
Meanwhile technology marches forward at a steady clip. And that includes destructive technology. Only a hundred years ago nuclear weaponry was but a twinkle in a theoretical physicist’s eye, with the ICBM technology to deliver multiple warheads anywhere on the globe being the stuff of science fiction. Chemical and biological weaponry advances apace.
While the Boston bombing has currently caught the media spotlight, it is the intersection of madness and gun technology that has been occupying the cultural and political realm. The Sandy Hook shooting prompted an extensive national conversation about how to keep guns out of the hands of the mentally ill. Legislatively no action has yet been taken, with a bill to expand background checks on gun purchases dying in the Senate.
So, at least so far, we seem incapable of even beginning to address the intersection of guns and madness. But what about when this intersection doesn’t mean a few dozen dead, or even a few hundred, but a few hundred thousand or a few million? What about when there is no margin for error? Today there are nine nations with nuclear weapons: five that are members of the Non-Proliferation Treaty (the US, Russia, France, the United Kingdom, and China), one that does not declare its arsenal (Israel), and three non-signatories of the NPT (India, Pakistan, and North Korea).
Looking at Pakistan, the next thing to a failed state, with radical Islam a powerful force, one has to wonder if Bronze Age eschatology can be kept from meaningfully intersecting with technology that can quickly wreak biblical levels of destruction. Can those lines be kept parallel forever? It’s an odd quirk of the human mind that a single brain can make use of the powers of reason and science but at the same time be addled by unreasonable delusions. Francis Collins, the head of the National Institutes of Health, saw a frozen waterfall and was thus convinced of the divinity of Jesus Christ. Isaac Newton, arguably the greatest genius in history, spent an absurd amount of time on alchemy and finding hidden messages in the Bible.
We tend to have an optimistic view of technology as progressive and liberating. But even beyond the potential uses of technology by governments to surveil and control citizens, the practical matter of the destructiveness of some technologies may mean an inherent incompatibility with our notions of individual liberty and democratic government.
This is not to say that all technology is destructive or by its very nature irreconcilable with liberal democracy. But what if there is no room for error? What if freedom doesn’t mean the occasional shooting spree or pressure cooker bomb, but the occasional briefcase (or pen?) nuke obliterating half a city, or the occasional outbreak of an engineered super flu? What sorts of laws would constituents demand of their legislators then?
I’m not sure these are questions that will need to be answered in our lifetime. But at some point technology will progress to the level that (even more) highly destructive technology is relatively simple, cheap, and easy to use. After all, how can the march of technology be stopped? Would we really want it to be? And so, with any margin of error diminishing, someone is going to have to figure out how to prevent that technology from intersecting with human frailty, stupidity, and malevolence. What would those laws look like? What kind of society would result? These are questions we should think about, because we might not have the luxury of handing them off to our progeny.