Just in case you don't have enough to worry about.
July 21, 2014
“The Walking Dead” is at the top of the cultural zeitgeist these days, one of the most popular television series on the air. In the show, a virus has ravaged the Earth, killing most of humanity, and the dead rise to terrorize the few remaining living souls. While enormously entertaining, it is not a likely scenario for the end of the human race. Dick Cheney notwithstanding, zombies aren’t real. The end of humanity, however, could be. While it is difficult to envision a world without “us,” there are multiple scenarios staring at us, right here, right now, none of them far-fetched, that could wipe out all or most of humanity, leaving a wasteland for Mother Nature to reclaim. Here are some of the possible ways the reign of man- and womankind might end, no zombies needed.
1. Global Climate Change
Climate change is the Big Kahuna of all scenarios in which our presence on Earth is ended. Despite what the climate change deniers would have you believe, climate change is real. It is being caused by human beings, with a little help from lots of farting cows emitting methane, plus that giant well of methane lurking under the Arctic ice. As we burn carbon and increase our meat-eating ways, more and more greenhouse gases are building up in the atmosphere. It is pretty easy to see the end game of this scenario. Grab a telescope and look at Venus, a planet with a thick, heat-trapping atmosphere and a surface temperature high enough to, well, melt lead. A few decades ago, climate scientist James Hansen studied Venus, and saw some parallels with what was happening on Earth. What he saw alarmed him, and he testified before Congress in 1988, warning our government that unless we changed our carbon-burning ways, we were on a course for disaster. Hansen got through to a single senator: Al Gore.
Meanwhile, the carbon keeps burning and the CO2 keeps rising, slowly raising Earth’s average temperature despite the occasional freezing cold winter. Earth’s temperature has been climbing steadily since the Industrial Revolution unleashed our carbon-burning frenzy, setting up a slow-moving train wreck. The hottest years in recorded history have occurred in the last decade. Author and environmental activist Bill McKibben outlines the situation:
“The Arctic ice cap is melting [releasing more greenhouse gases], the great glacier above Greenland is thinning, both with disconcerting and unexpected speed. The oceans are distinctly more acid and their level is rising…The greatest storms on our planet, hurricanes and cyclones, have become more powerful… The great rain forest of the Amazon is drying on its margins… The great boreal forest of North America is dying in a matter of years… [This] new planet looks more or less like our own but clearly isn’t."
Many environmentalists think we have already passed the point of no return. Once we pass a certain threshold, Earth will continue warming even if we do manage to cut our CO2 emissions. What we do know is that, unless we begin reducing the amount of CO2 we release into the air and at least minimize the damage, a planet-wide disaster is assured.
2. Loss of Biodiversity
If we don’t melt ourselves into extinction, another possible route to end times is partly a byproduct of climate change: loss of biodiversity. Human activity is responsible for massive extinctions of countless species on Planet Earth. Environment News Service reported as far back as 1999 that, “the current extinction rate is now approaching 1,000 times the background rate [what would be considered the normal rate of extinction] and may climb to 10,000 times the background rate during the next century, if present trends continue [resulting in] a loss that would easily equal those of past extinctions.”
The Millennium Ecosystem Assessment, a major environmental report released in 2005, reported 10-30% of mammals, birds and amphibians on the planet are in danger of extinction due to human activity, which includes deforestation (resulting in habitat destruction), CO2 emissions (resulting in acid rain), over-exploitation (such as overfishing the oceans), and invasive species introduction (like boa constrictors in the Florida Everglades). “This rapid extinction is therefore likely to precipitate collapses of ecosystems at a global scale,” said Jann Suurkula, chairman of Physicians and Scientists for Responsible Application of Science and Technology. “This is predicted to create large-scale agricultural problems, threatening food supplies to hundreds of millions of people. This ecological prediction does not take into consideration the effects of global warming which will further aggravate the situation.”
Amphibians, such as frogs and salamanders, are considered “marker species,” meaning they provide important clues to the health of the ecosystem. Right now, the frog population, as well as other amphibians, has been declining rapidly. In any ecosystem, when one species dies, it affects other species, which depended on the now-extinct species for food and perhaps other necessities. When there is a sudden mass extinction of many species, a chain reaction can cause catastrophic results. There have been five mass extinctions in the history of the Earth, and many scientists are saying we are in the midst of the sixth. “We are entering an unknown territory of marine ecosystem change, and exposing organisms to intolerable evolutionary pressure,” states the International Programme on the State of the Ocean (IPSO) in its biannual State of the Ocean report. “The next mass extinction may have already begun.” What would that be like? Well, in the worst one, 250 million years ago, 96 percent of ocean life and 70 percent of land life perished. What can we expect from mass extinction number six? We probably would prefer not to find out.
3. Bee Decline
Bees are dying—a lot of them—due to colony collapse disorder (CCD). “One of every three bites of food eaten worldwide depends on pollinators, especially bees, for a successful harvest,” says Elizabeth Grossman, author of Chasing Molecules: Poisonous Products, Human Health. Plants depend on spreading their pollen to produce food, and bees are pollinators. No bees, no food (or at least much less). As many as 50% of the hives in the United States and Europe have collapsed in the past 10 years. The prime suspect in the bee deaths is a class of chemicals called neonicotinoids, pesticides used on a massive scale in commercial farming. It is believed the chemicals impair the bees’ sense of direction, preventing them from returning to the hive.
With reduced pollen in the hive, fewer queen bees are produced, and eventually the colonies collapse. The European Commission has imposed a ban on these pesticides after the European Food Safety Authority concluded that they posed a “high acute risk” to honeybees. The United States, however, has declined to join Europe in banning neonicotinoids, citing other possible causes of CCD, including parasites. Meanwhile, as Nero fiddles, Rome burns and the bees keep disappearing. It is not hard to imagine a scenario where the resulting acute food shortages bring on mass starvation, war and human extinction.
4. Bat Decline
Bees aren’t the only pollinators dying off. Bats, too, are dropping like flies. As a result of deforestation, habitat destruction and hunting, combined with a fatal fungal disease called White Nose Syndrome spreading among the bat population, bats are disappearing at an alarming rate. Besides contributing to the pollination crisis, the dwindling bat population brings about another possible human extinction scenario. As their habitats are destroyed, bats are increasingly crossing paths with the human population, in search of food and shelter. With bats come bat viruses. "It's very easy to see how pathogens can jump from animals to humans," says Jon Epstein of the EcoHealth Alliance, a non-profit agency dedicated to conservation and biodiversity. Every year, on average, five new infectious diseases pop up, and about 75% of these new diseases come from animals. It is already suspected that human killers like Ebola emerged from the bat population. Might some new human-killing pathogen mutate from bats to humans and decimate mankind?
5. Pandemic
Which leads us to a related extinction scenario: a worldwide pandemic. New diseases emerge every year. Some have the potential to devastate the population. In 1918, a strain of influenza spread worldwide and killed between 20 and 50 million people—more than were killed in all of World War I. In the past several years, diseases like SARS have come close to igniting into worldwide pandemics, and it is not at all inconceivable that, in our airplane-riding, interconnected world, some other virus could arrive on the scene with the virulence and transmissibility to decimate, if not destroy, the human population. “It is not in the interests of a virus to kill all of its hosts, so a virus is unlikely to wipe out the human race,” says Maria Zambon, a virologist with the Health Protection Agency Influenza Laboratory. “But it could cause a serious setback for a number of years. We can never be completely prepared for what nature will do: nature is the ultimate bioterrorist."
6. Biological/Nuclear Terrorism
In the interim, there are plenty of down-and-dirty, run-of-the-mill terrorists, and the grand prize they all hope to get their hands on is a weapon of mass destruction like a nuclear bomb or a vial of smallpox virus. “Today's society is more vulnerable to terrorism because it is easier for a malevolent group to get hold of the necessary materials, technology and expertise to make weapons of mass destruction,” says Paul Wilkinson, chairman of the advisory board for the Centre for the Study of Terrorism and Political Violence at the University of St Andrews. “The most likely cause of large scale, mass-casualty terrorism right now is from a chemical or biological weapon. The large-scale release of something like anthrax, the smallpox virus, or the plague would have a huge effect, and modern communications would quickly make it become a trans-national problem. There is a very high probability that a major attack will occur somewhere in the world, within our lifetimes.”
As for the nuclear threat, with increasing numbers of unstable countries like Pakistan and North Korea in possession of atomic weapons, terrorist acquisition seems only a matter of when, not if.
7. Super-Volcanoes
There are volcanoes, and then there are super-volcanoes. “Approximately every 50,000 years the Earth experiences a super-volcano. More than 1,000 square kilometers of land can be obliterated by pyroclastic ash flows, the surrounding continent is coated in ash, and sulphur gases are injected into the atmosphere, making a thin veil of sulphuric acid all around the globe and reflecting back sunlight for years to come. Daytime becomes no brighter than a moonlit night.”
This lovely scenario is brought to us by Bill McGuire, director of the Benfield Hazard Research Center at University College London. About 74,000 years ago, the most powerful super-volcano eruption in human history occurred in Indonesia. It was close to the equator, and thus gases quickly passed into both hemispheres. Sunlight was blocked, and temperatures on Earth dropped worldwide for the next five to six years, below freezing even in the tropical regions. A super-volcano eruption is 12 times more likely than an asteroid hitting the Earth. Known super-volcanoes exist in Yellowstone National Park in the U.S. and Toba in Sumatra, Indonesia. And then there are the unknown ones….
8. Asteroid Impact
Films like Deep Impact and Armageddon have dramatized this human extinction scenario: an asteroid hitting the Earth. Hollywood is Hollywood, but in 2013, a real-life asteroid appeared without warning over Chelyabinsk, Russia. About 20 meters wide, it hurtled into the Earth’s atmosphere at over 40,000 miles per hour. Only the angle it came in at and its relatively small size prevented damage and destruction on a massive scale. But what would happen if a not-at-all-uncommon mile-wide asteroid hit the Earth at this speed? Quite probably it would wipe out the human race. The tremendous explosion upon impact would fling so much dust into the atmosphere that the sun would be completely blocked, plant life and crops would die, severe acid rain would kill ocean life, and fiery debris would cause firestorms worldwide.
This has already happened at least once. The likely reason you don’t see any dinosaurs around the neighborhood is that they were wiped out by just such an incident. Donald Yeomans of NASA: “We expect an event of this type every million years on average.”
9. Rise of the Machine
We look to Hollywood again to dramatize our next scenario. The Terminator movies entertained us with killer androids from a future where war was being waged on man by super-intelligent machines. OK, we are not there yet, but as we program more and more intelligence into our computers, exponentially increasing their capabilities every year, it is only a matter of time before they are smarter than we are. Already we entrust computers to run our stock markets, land our planes, correct our spelling, Google our trivia, and calculate our restaurant tips. In development are robots that look like us, talk like us and recognize our facial movements. How long before they are us, as we download our thoughts and memories into our hard drives, the so-called “singularity”? How long before these machines are self-aware?
Futurist and author Ray Kurzweil believes computers will be as smart as we are by 2029, and by 2045 will be billions of times smarter than us. What then? Will they decide we are superfluous? Or maybe we ourselves will decide. Sounds far-fetched, I know, but some very smart people buy into this scenario, people like genius physicist Stephen Hawking: “The danger is real that they [super-computers] could develop intelligence and take over the world.”
10. Zombie Apocalypse
I know. I said zombies aren’t real. But there is a parasite called Toxoplasma gondii. This terrifying little bug infects rats, but it can only reproduce inside the intestines of a cat, so it evolved a nifty little trick wherein it actually takes over the rat’s brain and compels it to hang out around cats. Naturally, the cat eats the rat. The cat is happy. The parasite is happy because it gets to reproduce in the cat’s intestines. The rat? Not so happy, one would suppose. Why should we care about unhappy rats? Because rats and humans are actually very similar, which is why we conduct so many medical experiments on rats. And humans are infected with the Toxoplasma gondii parasite. About half the population of the Earth, in fact. Now it so happens that Toxoplasma gondii does not affect humans the way it does rats. But what if it did? Parasites mutate. Pathogens are manipulated in bio-weapons laboratories. Suddenly half the population would have no instinct for self-preservation. Half the population unable to think in a rational manner. Half the population suddenly very much resembling zombies. Nah. Couldn’t happen. Could it?
Larry Schwartz is a Brooklyn-based freelance writer with a focus on health, science and nutrition. He works at Scholastic Inc. in the classroom magazine division on Superscience and Science World.
I think this is among the most important observations Thomas Merton ever made:
"The rush and pressure of modern life are a form, perhaps the most common form, of its innate violence. To allow oneself to be carried away by a multitude of conflicting concerns, to surrender to too many demands, to commit oneself to too many projects, to want to help everyone in everything is to succumb to violence. More than that, it is cooperation in violence. The frenzy of the activist...destroys his own inner capacity for peace. It destroys the fruitfulness of his own work, because it kills the root of inner wisdom which makes work fruitful."
To think he wrote this half a century ago. What would he say now? What would he think of computers, email, cell phones? These technologies were intended to make life easier, and can when kept in balance, but in the end they have made “the rush and pressure of modern life” worse than ever.
I have seen and spoken to many people who no longer truly take a vacation, because wherever they go they take their work with them. I have seen people use their phones to check messages in the middle of school concerts. I have seen people do it in church. I have seen so many restaurant conversations broken by texting and even phone conversations.
It is not that the technology is bad. It is the way we let it control us. Merton describes this dominant busyness of life as violence. That might sound startling. And he says that giving in to it is to cooperate in violence. That is not easy to hear. And yet that is also why this passage is so important.
We have created a society that overly values both work and entertainment, and people use technology to switch off one form of busyness and switch on the other. This is violence. It is violence first of all to the human person, because we cannot either know or become our true selves if we don't have regular periods of reflection free from distraction. Second, the busyness in both work and entertainment serves primarily material purposes, which are endlessly promoted as fulfilling hopes and dreams. The genius of the system is that, even though these hopes and dreams cannot possibly be fulfilled materialistically, the tendency of the chronically distracted is not to doubt the system, but to become willing cogs in the great machine. This is another form of violence to the human individual, and also feeds other forms of violence, such as crime, war, and environmental destruction.
Merton is especially addressing those who devote themselves to making the world a better place in some way. I think of the two groups I am most involved with--those in ministry, and those in environmental education. If the church in the developed world is failing, clearly the answer is not more programs, more bureaucracy, more meetings, more busyness. It is more holiness. And if we are using resources irresponsibly and setting the stage for massive economic and climate disruption, clearly the answers needed are unlikely to come from environmentalists who themselves are addicted to the lifestyle that caused the problems to begin with.
Merton's quote is troublesome, because I know he is right, and I also know that I, too, allow myself to get caught up in too many concerns, projects, and distractions, and that I, too, use resources at an unsustainable pace. In doing so, I am cooperating in violence. The fact that I cannot easily see this violence makes it all the more insidious. I don't say this to beat myself up. I know that I live relatively simply in comparison to the average American, but I also know that I can do still better. I reflect on Merton's quote every so often because I want to keep growing, and it is easy, in an energetically materialistic society, to lose sight of the inner journey that can lead to the fulfillment that materialism will never deliver.
I am writing because I read that you have an interest in Ayn Rand; I found this quote from her early private journals, which I thought might interest you:
"Some day I’ll find out whether I’m an unusual specimen of humanity in that my instincts and reason are so inseparably one, with the reason ruling the instincts. Am I unusual or merely normal and healthy? Am I trying to impose my own peculiarities as a philosophical system? Am I unusually intelligent or merely unusually honest? I think this last. Unless—honesty is also a form of superior intelligence."
This was written in 1934, prior to the publication of her novels, and is representative of her less respectable "Nietzschean period," characterized by an overt sense of superiority over the human majority. I'm currently reading Anne C. Heller's biography Ayn Rand and the World She Made with a desire to understand Rand's psychology in light of neurodiversity. Rand is clearly a narcissist, and while too emotional and inflexible to be a perfect psychopath herself, she shows more than a few sociopathic tendencies as well as a consistent admiration for selective psychopathic qualities.
In relation to the above quote, I'm not at all sure that her mature universalism correctly resolved the question of her relation to the rest of her species. I wonder if her intelligence, low empathy, ambitious drive, social distance, public charisma, manipulative dominance, and purely intellectual conscience place her somewhere towards the extremes of the antisocial spectrum. This is certainly not a new idea for her detractors. I can't help but calculate that if 1% (or 4%) of Americans qualify as sociopaths, then Ayn Rand must surely have been more sociopathic by degree than 99% of any population.
On the topic of whether sociopaths are born or created, I just heard about the Japanese movie Battle Royale, in which a class of high school students is sent to an island to kill or be killed until there is one left standing. According to an imdb synopsis:
At the dawn of the new millennium, Japan is in a state of near-collapse. Unemployment is at an all-time high, and violence among the nation's youth is spiraling out of control. With schoolchildren boycotting their classes and physically abusing their teachers, a beleaguered and near-defeated government decides to introduce a radical new measure: the Battle Royale Act. Overseen by their former teacher Kitano and requiring that a randomly chosen school class be taken to a deserted island and forced to fight each other to the death, the Act dictates that only one pupil is allowed to survive the punishment. He or she will return, not as the victor, but as the ultimate proof of the lengths to which the government is prepared to go to curb the tide of juvenile disobedience.
Some of the kids immediately embrace the carnage, others reluctantly join in for self-preservation, others gather together into smaller groups that war with each other, still others seduce allies in, only to kill them in short order, and still others kill themselves in refusal to participate in the violence. Problems arise even in the groups of trusted souls, as a greedy suspicion grips them all. Those who don't succumb to this violent infidelity risk falling victim to their classmates' hunts.
When I first heard about the plot of the movie, I thought the island was meant to serve as an accelerant for natural selection. Of course, if you are putting high school students on an island with weapons, the only thing you would be naturally selecting for is sociopaths. Under my revised-per-imdb understanding of the film, the island is not just selecting for sociopaths, it is actually creating them out of normal empaths. Do I think this actually happens in war and other times of exigency? Yes, I do.
The Founders never intended to create an unregulated individual right to a gun. Today, millions believe they did. Here’s how it happened.
“A fraud on the American public.” That’s how former Chief Justice Warren Burger described the idea that the Second Amendment gives an unfettered individual right to a gun. When he spoke these words to PBS in 1990, the rock-ribbed conservative appointed by Richard Nixon was expressing the longtime consensus of historians and judges across the political spectrum.
Twenty-five years later, Burger’s view seems as quaint as a powdered wig. Not only is an individual right to a firearm widely accepted, but increasingly states are also passing laws to legalize carrying weapons on streets, in parks, in bars—even in churches.
Many are startled to learn that the U.S. Supreme Court didn’t rule that the Second Amendment guarantees an individual’s right to own a gun until 2008, when District of Columbia v. Heller struck down the capital’s law effectively banning handguns in the home. In fact, every other time the court had ruled previously, it had ruled otherwise. Why such a head-snapping turnaround? Don’t look for answers in dusty law books or the arcane reaches of theory.
So how does legal change happen in America? We’ve seen some remarkably successful drives in recent years—think of the push for marriage equality, or to undo campaign finance laws. Law students might be taught that the court is moved by powerhouse legal arguments or subtle shifts in doctrine. The National Rifle Association’s long crusade to bring its interpretation of the Constitution into the mainstream teaches a different lesson:
Constitutional change is the product of public argument and political maneuvering. The pro-gun movement may have started with scholarship, but then it targeted public opinion and shifted the organs of government. By the time the issue reached the Supreme Court, the desired new doctrine fell like a ripe apple from a tree.
***
The Second Amendment consists of just one sentence: “A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.” Today, scholars debate its bizarre comma placement, trying to make sense of the various clauses, and politicians routinely declare themselves to be its “strong supporters.” But in the grand sweep of American history, this sentence has never been among the most prominent constitutional provisions. In fact, for two centuries it was largely ignored.
The amendment grew out of the political tumult surrounding the drafting of the Constitution, which was done in secret by a group of mostly young men, many of whom had served together in the Continental Army. Having seen the chaos and mob violence that followed the Revolution, these “Federalists” feared the consequences of a weak central authority. They produced a charter that shifted power—at the time in the hands of the states—to a new national government.
“Anti-Federalists” opposed this new Constitution. The foes worried, among other things, that the new government would establish a “standing army” of professional soldiers and would disarm the 13 state militias, made up of part-time citizen-soldiers and revered as bulwarks against tyranny. These militias were the product of a world of civic duty and governmental compulsion utterly alien to us today. Every white man age 16 to 60 was enrolled. He was actually required to own—and bring—a musket or other military weapon.
On June 8, 1789, James Madison—an ardent Federalist who had won election to Congress only after agreeing to push for changes to the newly ratified Constitution—proposed 17 amendments on topics ranging from the size of congressional districts to legislative pay to the right to religious freedom. One addressed the “well regulated militia” and the right “to keep and bear arms.” We don’t really know what he meant by it. At the time, Americans expected to be able to own guns, a legacy of English common law and rights. But the overwhelming use of the phrase “bear arms” in those days referred to military activities.
There is not a single word about an individual’s right to a gun for self-defense or recreation in Madison’s notes from the Constitutional Convention. Nor was it mentioned, with a few scattered exceptions, in the records of the ratification debates in the states. Nor did the U.S. House of Representatives discuss the topic as it marked up the Bill of Rights. In fact, the original version passed by the House included a conscientious objector provision. “A well regulated militia,” it explained, “composed of the body of the people, being the best security of a free state, the right of the people to keep and bear arms shall not be infringed, but no one religiously scrupulous of bearing arms, shall be compelled to render military service in person.”
Though state militias eventually dissolved, for two centuries we had guns (plenty!) and we had gun laws in towns and states, governing everything from where gunpowder could be stored to who could carry a weapon—and courts overwhelmingly upheld these restrictions. Gun rights and gun control were seen as going hand in hand. Four times between 1876 and 1939, the U.S. Supreme Court declined to rule that the Second Amendment protected individual gun ownership outside the context of a militia. As the Tennessee Supreme Court put it in 1840, “A man in the pursuit of deer, elk, and buffaloes might carry his rifle every day for forty years, and yet it would never be said of him that he had borne arms; much less could it be said that a private citizen bears arms because he has a dirk or pistol concealed under his clothes, or a spear in a cane.”
***
Cue the National Rifle Association. We all know of the organization’s considerable power over the ballot box and legislation. Bill Clinton groused in 1994 after the Democrats lost their congressional majority, “The NRA is the reason the Republicans control the House.” Just last year, it managed to foster a successful filibuster of even a modest background-check proposal in the U.S. Senate, despite 90 percent public approval of the measure.
What is less known—and perhaps more significant—is its rising sway over constitutional law.
The NRA was founded after the Civil War by a group of Union officers who, perturbed by their troops’ poor marksmanship, wanted a way to sponsor shooting training and competitions. The group testified in support of the first federal gun law in 1934, which cracked down on the machine guns beloved by Bonnie and Clyde and other bank robbers. When a lawmaker asked whether the proposal violated the Constitution, the NRA witness responded, “I have not given it any study from that point of view.” The group lobbied quietly against the most stringent regulations, but its principal focus was hunting and sportsmanship: bagging deer, not blocking laws. In the late 1950s, it opened a new headquarters to house its hundreds of employees. Metal letters on the facade spelled out its purpose: firearms safety education, marksmanship training, shooting for recreation.
Cut to 1977. Gun-group veterans still call the NRA’s annual meeting that year the “Revolt at Cincinnati.” After the organization’s leadership had decided to move its headquarters to Colorado, signaling a retreat from politics, more than a thousand angry rebels showed up at the annual convention. By four in the morning, the dissenters had voted out the organization’s leadership. Activists from the Second Amendment Foundation and the Citizens Committee for the Right to Keep and Bear Arms pushed their way into power.
The NRA’s new leadership was dramatic, dogmatic and overtly ideological. For the first time, the organization formally embraced the idea that the sacred Second Amendment was at the heart of its concerns.
The gun lobby’s lurch rightward was part of a larger conservative backlash that took place across the Republican coalition in the 1970s. One after another, once-sleepy traditional organizations galvanized as conservative activists wrested control.
Conservatives tossed around the language of insurrection with the ardor of a Berkeley Weatherman. The “Revolt at Cincinnati” was followed by the “tax revolt,” which began in California in 1978, and the “sagebrush rebellion” against Interior Department land policies. All these groups shared a deep distrust of the federal government and spoke in the language of libertarianism. They formed a potent new partisan coalition.
Politicians adjusted in turn. The 1972 Republican platform had supported gun control, with a focus on restricting the sale of “cheap handguns.” Just three years later, in 1975, Ronald Reagan, preparing to challenge Gerald R. Ford for the Republican nomination, wrote in Guns & Ammo magazine, “The Second Amendment is clear, or ought to be. It appears to leave little if any leeway for the gun control advocate.” By 1980 the GOP platform proclaimed, “We believe the right of citizens to keep and bear arms must be preserved. Accordingly, we oppose federal registration of firearms.” That year the NRA gave Reagan its first-ever presidential endorsement.
Today at the NRA’s headquarters in Fairfax, Virginia, oversized letters on the facade no longer refer to “marksmanship” and “safety.” Instead, the Second Amendment is emblazoned on a wall of the building’s lobby. Visitors might not notice that the text is incomplete. It reads:
“... the right of the people to keep and bear arms, shall not be infringed.”
The first half—the part about the well regulated militia—has been edited out.
***
From 1888, when law review articles first were indexed, through 1959, every single one on the Second Amendment concluded it did not guarantee an individual right to a gun. The first to argue otherwise, written by a William and Mary law student named Stuart R. Hays, appeared in 1960. He began by citing an article in the NRA’s American Rifleman magazine and argued that the amendment enforced a “right of revolution,” of which the Southern states availed themselves during what the author called “The War Between the States.”
At first, only a few articles echoed that view. Then, starting in the late 1970s, a squad of attorneys and professors began to churn out law review submissions, dozens of them, at a prodigious rate. Funds—many of them from the NRA—flowed freely. An essay contest, grants to write book reviews, the creation of “Academics for the Second Amendment,” all followed. In 2003, the NRA Foundation provided $1 million to endow the Patrick Henry professorship in constitutional law and the Second Amendment at George Mason University Law School.
This fusillade of scholarship and pseudo-scholarship insisted that the traditional view—shared by courts and historians—was wrong. There had been a colossal constitutional mistake. Two centuries of legal consensus, they argued, must be overturned.
If one delves into the claims these scholars were making, a startling number of them crumble. Historian Jack Rakove, whose Pulitzer Prize-winning book Original Meanings explored the founders’ myriad views, notes, “It is one thing to ransack the sources for a set of useful quotations, another to weigh their interpretive authority. … There are, in fact, only a handful of sources from the period of constitutional formation that bear directly on the questions that lie at the heart of our current controversies about the regulation of privately owned firearms. If Americans had indeed been concerned with the impact of the Constitution on this right … the proponents of individual right theory would not have to recycle the same handful of references … or to rip promising snippets of quotations from the texts and speeches in which they are embedded.”
And there were plenty of promising snippets to rip. There was the ringing declaration from Patrick Henry: “The great object is, that every man be armed.” The eloquent patriot’s declaration provided the title for the ur-text for the gun rights movement, Stephen Halbrook’s 1984 book, That Every Man Be Armed. It is cited reverentially in law review articles and scholarly texts. The Second Amendment professorship at George Mason University is named after Henry. A $10,000 gift to the NRA makes you a “Patrick Henry Member.”
The quote has been plucked from Henry’s speech at Virginia’s ratifying convention for the Constitution in 1788. But if you look at the full text, he was complaining about the cost of both the federal government and the state arming the militia. (“The great object is, that every man be armed,” he said. “At a very great cost, we shall be doubly armed.”) In other words: Sure, let every man be armed, but only once! Far from a ringing statement of individual gun-toting freedom, it was an early American example of a local politician complaining about government waste.
Thomas Jefferson offers numerous opportunities for pro-gun advocates. “Historical research demonstrates the Founders out-‘NRAing’ even the NRA,” proclaimed one prolific scholar. “‘One loves to possess arms’ wrote Thomas Jefferson, the premier intellectual of his day, to George Washington on June 19, 1796.” What a find! Oops: Jefferson was not talking about guns. He was writing to Washington asking for copies of some old letters, to have handy so he could issue a rebuttal in case he got attacked for a decision he made as secretary of state. The NRA website still includes the quote. You can go online to buy a T-shirt emblazoned with Jefferson’s mangled words.
Some of the assumptions were simply funny. In his book on judicial philosophy, Supreme Court Justice Antonin Scalia, for example, lauded Professor Joyce Lee Malcolm’s “excellent study” of English gun rights, noting sarcastically, “she is not a member of the Michigan Militia, but an Englishwoman.” But a historian fact-checked the justice: “Malcolm’s name may sound British, and Bentley College, where Malcolm teaches history, may sound like a college at Oxford, but in fact Malcolm was born and raised in Utica, New York, and Bentley is a business college in Massachusetts.”
Still, all this focus on historical research began to have an impact. And eventually these law professors, many toiling at the fringes of respectability, were joined by a few of academia’s leading lights. Sanford Levinson is a prominent liberal constitutional law professor at the University of Texas at Austin. In 1989, he published an article tweaking other progressives for ignoring “The Embarrassing Second Amendment.” “For too long,” he wrote, “most members of the legal academy have treated the Second Amendment as the equivalent of an embarrassing relative, whose mention brings a quick change of subject to other, more respectable, family members. That will no longer do.” Levinson was soon joined by Akhil Reed Amar of Yale and Harvard’s Laurence Tribe. These prominent progressives had differing opinions on the amendment and its scope. But what mattered was their political provenance—they were liberals! (One is reminded of Robert Frost’s definition of a liberal: someone so open-minded he will not take his own side in an argument.)
***
As the revisionist perspective took hold, government agencies also began to shift. In 1981, Republicans took control of the U.S. Senate for the first time in 24 years. Utah Sen. Orrin Hatch became chair of a key Judiciary Committee panel, where he commissioned a study on “The Right to Keep and Bear Arms.” In a breathless tone it announced, “What the Subcommittee on the Constitution uncovered was clear—and long lost—proof that the second amendment to our Constitution was intended as an individual right of the American citizen to keep and carry arms in a peaceful manner, for protection of himself, his family, and his freedoms.” The cryptologist discovering invisible writing on the back of the Declaration of Independence in the Disney movie National Treasure could not have said it better.
Despite Hatch’s dramatic “discovery,” a constitutional right to gun ownership was still a stretch, even for the conservatives in Reagan’s Justice Department, who were reluctant to undo the work not only of judges, but also of democratically elected legislators. When Ed Meese, Reagan’s attorney general, commissioned a comprehensive strategy for jurisprudential change in 15 areas, ranging from the “exclusionary rule” under the Fourth Amendment to public aid for private religious education, it did not include a plan for the Second Amendment.
But in time, the NRA’s power to elect presidents began to shift executive branch policies, too. In 2000, gun activists strongly backed Governor George W. Bush of Texas. After the election, Bush’s new attorney general, John Ashcroft, reversed the Justice Department’s stance. The NRA’s head lobbyist read the new policy aloud at its 2001 convention in Kansas City: “The text and original intent of the Second Amendment clearly protect the right of individuals to keep and bear firearms.”
In the meantime, the “individual right” argument was starting to win in another forum: public opinion. In 1959, according to a Gallup poll, 60 percent of Americans favored banning handguns; that dropped to 41 percent by 1975 and 24 percent in 2012. By early 2008, according to Gallup, 73 percent of Americans believed the Second Amendment “guaranteed the rights of Americans to own guns” outside the militia.
Over the past decade, the idea of a Second Amendment right has become synonymous with conservatism, even with support for the Republican Party. In 1993, for example, the New York Times mentioned “gun control” 388 times, and the Second Amendment only 16 times. By 2008, overall mentions of the issue had dropped to 160, but the Second Amendment was mentioned 59 times.
***
In the end, it was neither the NRA nor the Bush administration that pressed the Supreme Court to reverse its centuries-old approach, but a small group of libertarian lawyers who believed other gun advocates were too timid. They targeted a gun law passed by the local government in Washington, D.C., in 1976—perhaps the nation’s strictest—that barred individuals from keeping a loaded handgun at home without a trigger lock. They recruited an appealing plaintiff: Dick Heller, a security guard at the Thurgood Marshall Federal Judiciary Building, who wanted to bring his work revolver home to his high-crime neighborhood. The NRA worried it lacked the five votes necessary to win. The organization tried to sideswipe the effort, filing what Heller’s lawyers called “sham litigation” to give courts an excuse to avoid a constitutional ruling. But the momentum that the NRA itself had set in motion proved unstoppable, and the big case made its way to the Supreme Court.
The argument presented in District of Columbia v. Heller showed just how far the gun rights crusade had come. Nearly all the questions focused on arcane matters of colonial history. Few dealt with preventing gun violence, social science findings or the effectiveness of today’s gun laws—the kinds of things judges might once have considered. On June 26, 2008, the Supreme Court ruled 5-4 that the Second Amendment guarantees a right to own a weapon “in common use” to protect “hearth and home.” Scalia wrote the opinion, which he later called the “vindication” of his judicial philosophy.
After the decision was announced, Heller stood on the steps of the court for a triumphant press conference. Held aloft behind him was a poster bearing that quote from Patrick Henry, unearthed by the scholars who had proven so important for the successful drive: “Let every man be armed.”
***
In January 2014, liberal activists jammed a conference room at the Open Society Foundations in New York City. They were there to hear former NRA president David Keene. “Of course, we really just invited David to coax him into giving us the secret of the NRA’s success,” joked the moderator.
Improbably, the gun movement’s triumph has become a template for progressives, many of whom are appalled by the substance of the victories. Keene was joined by Evan Wolfson, the organizer of Freedom to Marry, whose movement has begun to win startling victories for marriage equality in courts. Once, conservatives fumed about activist courts enforcing newly articulated rights—a woman’s right to reproductive choice, equal protection for all races. But just as they learned from the left’s legal victories in those fields, today progressives are trying to re-learn from their conservative counterparts.
One lesson: patience. The fight for gun rights took decades. Another lesson, perhaps obvious: There is no substitute for political organizing. A century ago the satirical character Mr. Dooley famously said in an Irish brogue, “No matter whether th' Constitution follows th' flag or not, the Supreme Coort follows th' iliction returns.” Before social movements can win at the court they must win at the ballot box. The five justices in the Heller majority were all nominated by presidents who themselves were NRA members.
But even more important is this: Activists turned their fight over gun control into a constitutional crusade. Modern political consultants may tell clients that constitutional law and the role of the Supreme Court is too arcane for discussion at the proverbial “kitchen table.” Nonsense. Americans always have been engaged, and at times enraged, by constitutional doctrine. Deep notions of freedom and rights have retained totemic power. Today’s “Second Amendment supporters” recognize that claiming the constitutional high ground goes far toward winning an argument.
Liberal lawyers might once have rushed to court at the slightest provocation. Now, they are starting to realize that a long, full jurisprudential campaign is needed to achieve major goals. Since 2011, activists have waged a widespread public education campaign to persuade citizens that new state laws were illegitimate attempts to curb voting rights, all as a precursor to winning court victories. Now many democracy activists, mortified by recent Supreme Court rulings in campaign finance cases (all with Heller’s same 5-4 split), have begun to map out a path to overturn Citizens United and other recent cases. Years of scholarship, theorizing, amicus briefs, test cases and minority dissents await before a new majority can refashion recent constitutional doctrine.
Molding public opinion is the most important factor. Abraham Lincoln, debating slavery, said in 1858, “Public sentiment is everything. With public sentiment, nothing can fail; without it, nothing can succeed. Consequently he who molds public sentiment goes deeper than he who enacts statutes or pronounces decisions. He makes statutes and decisions possible or impossible to be executed.” The triumph of gun rights reminds us today: If you want to win in the court of law, first win in the court of public opinion.
Philosophy goes where hard science can't, or won't. Philosophers have a license to speculate about everything from metaphysics to morality, and this means they can shed light on some of the basic questions of existence. The bad news? These are questions that may always lie just beyond the limits of our comprehension.
Here are eight mysteries of philosophy that we'll probably never resolve.
1. Why is there something rather than nothing?
Our presence in the universe is something too bizarre for words. The mundaneness of our daily lives causes us to take our existence for granted — but every once in a while we're cajoled out of that complacency and enter into a profound state of existential awareness, and we ask: Why is there all this stuff in the universe, and why is it governed by such exquisitely precise laws? And why should anything exist at all? We inhabit a universe with such things as spiral galaxies, the aurora borealis, and SpongeBob Squarepants. And as Sean Carroll notes, "Nothing about modern physics explains why we have these laws rather than some totally different laws, although physicists sometimes talk that way — a mistake they might be able to avoid if they took philosophers more seriously." And as for the philosophers, the best that they can come up with is the anthropic principle — the notion that our particular universe appears the way it does by virtue of our presence as observers within it — a suggestion that has an uncomfortably tautological ring to it.
3. Do we have free will?

Also called the dilemma of determinism, this is the question of whether our actions are controlled by a causal chain of preceding events (or by some other external influence), or whether we're truly free agents making decisions of our own volition. Philosophers (and now some scientists) have been debating this for millennia, with no apparent end in sight. If our decision making is caught in an endless chain of causality, then determinism is true and we don't have free will. But if the opposite is true, what's called indeterminism, then our actions must be random — which some argue is still not free will. Compatibilists, meanwhile, make the case that free will is logically compatible with deterministic views of the universe, while libertarians (no, not political libertarians, those are other people) argue that we are genuinely free agents and that determinism is false. Compounding the problem are advances in neuroscience showing that our brains make decisions before we're even conscious of them. But if we don't have free will, then why did we evolve consciousness instead of zombie-minds? Quantum mechanics makes this problem even more complicated by suggesting that we live in a universe of probability, and that determinism of any sort is impossible. And as Linas Vepstas has said, "Consciousness seems to be intimately and inescapably tied to the perception of the passage of time, and indeed, the idea that the past is fixed and perfectly deterministic, and that the future is unknowable. This fits well, because if the future were predetermined, then there'd be no free will, and no point in the participation of the passage of time."
4. Does God exist?
Simply put, we cannot know if God exists or not. Both the atheists and believers are wrong in their proclamations, and the agnostics are right. True agnostics are simply being Cartesian about it, recognizing the epistemological issues involved and the limitations of human inquiry. We do not know enough about the inner workings of the universe to make any sort of grand claim about the nature of reality and whether or not a Prime Mover exists somewhere in the background. Many people defer to naturalism — the suggestion that the universe runs according to autonomous processes — but that doesn't preclude the existence of a grand designer who set the whole thing in motion (what's called deism). And as mentioned earlier, we may live in a simulation where the hacker gods control all the variables. Or perhaps the gnostics are right and powerful beings exist in some deeper reality that we're unaware of. These aren't necessarily the omniscient, omnipotent gods of the Abrahamic traditions — but they're (hypothetically) powerful beings nonetheless. Again, these aren't scientific questions per se — they're more Platonic thought experiments that force us to confront the limits of human experience and inquiry.
5. Is there life after death?
Before everyone gets excited, this is not a suggestion that we'll all end up strumming harps on some fluffy white cloud, or find ourselves shoveling coal in the depths of Hell for eternity. Because we cannot ask the dead if there's anything on the other side, we're left guessing as to what happens next. Materialists assume that there's no life after death, but it's just that — an assumption that cannot necessarily be proven. Looking closer at the machinations of the universe (or multiverse), whether it be through a classical Newtonian/Einsteinian lens or through the spooky filter of quantum mechanics, there's no reason to believe that we only have one shot at this thing called life. It's a question of metaphysics and the possibility that the cosmos (what Carl Sagan described as "all that is or ever was or ever will be") cycles and percolates in such a way that lives are infinitely recycled. Hans Moravec put it best when, speaking in relation to the quantum many-worlds interpretation, he said that non-observance of the universe is impossible; we must always find ourselves alive and observing the universe in some form or another. This is highly speculative stuff, but like the God problem, it is one that science cannot yet tackle, leaving it to the philosophers.
6. Can you really experience anything objectively?
There's a difference between understanding the world objectively (or at least trying to) and experiencing it through an exclusively objective framework. This is essentially the problem of qualia — the notion that our surroundings can only be observed through the filter of our senses and the cogitations of our minds. Everything you know, everything you've touched, seen, and smelled, has been filtered through any number of physiological and cognitive processes. Subsequently, your subjective experience of the world is unique. In the classic example, the subjective appreciation of the color red may vary from person to person. The only way you could possibly know is if you were to somehow observe the universe from the "conscious lens" of another person in a sort of Being John Malkovich kind of way — not anything we're likely to accomplish at any stage of our scientific or technological development. Another way of saying all this is that the universe can only be observed through a brain (or potentially a machine mind), and by virtue of that, can only be interpreted subjectively. But given that the universe appears to be coherent and (somewhat) knowable, should we continue to assume that its true objective quality can never be observed or known? It's worth noting that much of Buddhist philosophy is predicated on this fundamental limitation (what they call emptiness), and that it is a complete antithesis to Plato's idealism.
7. What is the best moral system?
Essentially, we'll never truly be able to distinguish between "right" and "wrong" actions. At any given time in history, however, philosophers, theologians, and politicians will claim to have discovered the best way to evaluate human actions and establish the most righteous code of conduct. But it's never that easy. Life is far too messy and complicated for there to be anything like a universal morality or an absolutist ethics. The Golden Rule is great (the idea that you should treat others as you would like them to treat you), but it disregards moral autonomy, leaves no room for the imposition of justice (such as jailing criminals), and can even be used to justify oppression (Immanuel Kant was among its staunchest critics). Moreover, it's a highly simplified rule of thumb that doesn't account for more complex scenarios. For example, should the few be spared to save the many? Who has more moral worth: a human baby or a full-grown great ape? And as neuroscientists have shown, morality is not only a culturally-ingrained thing, it's also a part of our psychologies (the Trolley Problem is the best demonstration of this). At best, we can only say that morality is normative, while acknowledging that our sense of right and wrong will change over time.
8. What are numbers?
We use numbers every day, but taking a step back, what are they, really — and why do they do such a damn good job of helping us explain the universe (such as Newtonian laws)? Mathematical structures can consist of numbers, sets, groups, and points — but are they real objects, or do they simply describe relationships that necessarily exist in all structures? Plato argued that numbers were real (it doesn't matter that you can't "see" them), but formalists insisted that they were merely formal systems (well-defined constructions of abstract thought based on math). This is essentially an ontological problem, where we're left baffled about the true nature of the universe and which aspects of it are human constructs and which are truly tangible.