Capital punishment has been practiced on American soil for more than 400 years. Historians have documented nearly 16,000 executions, accomplished by burning, hanging, firing squad, electrocution, lethal gas and lethal injection. An untold number of others doubtless occurred but escaped recognition.
We helped create the University at Albany’s National Death Penalty Archive, a rich repository of primary source material encompassing the long and growing history of the death penalty.
Capital punishment has long been and continues to be controversial, but there is no disputing its historical and contemporary significance. More than 2,700 men and women are currently under sentence of death throughout the U.S., although they are distributed in wildly uneven fashion. California’s death row, by far the nation’s largest, tops out at well over 700, while three or fewer inmates await execution in seven states.
Executions similarly vary markedly by jurisdiction. Texas has been far and away the leader over the last half century, with five times as many executions as the next leading state.
We established the National Death Penalty Archive to help preserve a record of the country’s past and current capital punishment policies and practices, and to ensure that scholars and the general public can gain access to this critical information.
The archive currently holds numerous collections from diverse sources, including academics, activists, litigators and researchers. We remain open to new donations of materials relating to capital punishment. The materials are stored in a climate-controlled environment and are accessible to the public.
One of our prized collections is the voluminous set of execution records compiled by M. Watt Espy Jr. Espy spent more than three decades, encompassing the 1960s into the 1990s, traversing the countryside, collaborating with others to uncover primary and secondary sources documenting more than 15,000 executions carried out in the U.S. between the 1600s and the late 20th century. Espy’s data set has since been updated to include information on executions through 2002.
The National Death Penalty Archive houses the court records, newspaper articles, magazine stories, bulletins, photographs and index cards created for each execution that Espy and his assistants painstakingly collected. These items vividly capture this unparalleled history of executions within the American colonies and the U.S.
Among those documented is the 1944 electrocution in South Carolina of George Stinney Jr., who at age 14 was the youngest person punished by death during the 20th century. Seventy years later, a South Carolina judge vacated Stinney’s conviction, ruling that he did not receive a fair trial.
In July, after the documents are fully digitized, the National Death Penalty Archive will make all of Espy’s materials available online.
Another prized holding consists of nearly 150 boxes of materials from Eugene Wanger. As a delegate to the Michigan Constitutional Convention, Wanger drafted the provision prohibiting capital punishment that was incorporated into the state constitution in 1961.
For more than 50 years, Wanger compiled a treasure trove of items spanning the 18th through 21st centuries relating to the death penalty, including numerous rare documents and paraphernalia. Among the thousands of items in the extensive bibliography are copies of anti-capital-punishment essays written by Pennsylvania’s Benjamin Rush shortly after the nation’s founding.
We also have collected the work of notable scholars. For example, the National Death Penalty Archive houses research completed by the late David Baldus, known primarily for his analysis of racial disparities in the administration of the death penalty; the writings of the late Hugo Adam Bedau, perhaps the country’s leading philosopher on issues of capital punishment; and the papers of the late Ernest van den Haag, a prolific academic proponent of capital punishment.
The National Death Penalty Archive additionally contains more than 150 clemency petitions filed on behalf of condemned prisoners, as well as materials relating to notable U.S. Supreme Court decisions, including Ford v. Wainwright, prohibiting execution of the insane, and Herrera v. Collins, in which the justices were asked to rule that the Constitution forbids executing an innocent person wrongfully sentenced to death.
On the decline
The recent history of capital punishment in the U.S. has been marked by declining popularity and usage. Within the past 15 years, eight states have abandoned the death penalty through legislative repeal or judicial invalidation.
The number of new death sentences imposed annually nationwide has plummeted from more than 300 in the mid-1990s to a fraction of that – just 42 – in 2018. Last year, there were 25 executions in the U.S., down from the modern-era high of 98 in 1999.
Meanwhile, public support for capital punishment as measured by the Gallup Poll registered at 56 percent in 2018, compared to its peak of 80 percent in 1995. Only a few counties, primarily within California and a few southern states, are responsible for sending vastly disproportionate numbers of offenders to death row.
What these trends bode for the future of the death penalty in the U.S. remains to be seen. When later generations reflect on the nation’s long and complicated history with the death penalty, we hope that the National Death Penalty Archive will offer important insights into the currents that have helped shape it.
James Acker, Distinguished Teaching Professor of Criminal Justice, University at Albany, State University of New York and Brian Keough, Co-Director, National Death Penalty Archive, University at Albany, State University of New York
Rock inscriptions made by crews from two North American whaleships in the early 19th century were found superimposed over earlier Aboriginal engravings in the Dampier Archipelago.
Details of the find in northern Western Australia are in a paper published today in Antiquity.
They provide the earliest evidence for North American whalers’ memorialising practices in Australia, and have substantial implications for maritime history.
At the time, the Dampier Archipelago (Murujuga) was home to the Yaburara people. The rock art across the archipelago is testament to their artists asserting their connections to this place for millennia.
So did the whalers encounter the Yaburara? Did they engrave over earlier Aboriginal markings as an act of assertion, a realignment of a shifting political landscape? Or were they simply marking a milestone in their multi-year voyages, celebrating landfall after many months at sea?
The answer to all these questions is, we don’t know.
But these inscriptions provide a rare insight into the lives of whalers, filling a gap in our knowledge about this earliest industry on our northwestern coast.
Such historical inscriptions might be dismissed as graffiti. However, like other rock art, they tell important stories about our human past that cannot be gleaned from other sources.
Whaling in Australia
Ship-based whaling was a global phenomenon that lasted centuries. At its peak in the mid-19th century, around 900 wooden sailing ships were at sea on multi-year voyages, crewed by around 22,000 whalemen.
Most whaling in Australian waters was conducted by foreign vessels, and in the 19th century North American whalers dominated the globe.
Whaling led to some of the earliest contacts between Americans, Europeans and a range of Indigenous societies in Africa, Australasia and the Pacific.
But early visits by foreign whalers to Australia’s northwest are poorly documented given the absence of a British colonial land-based presence in the area until the 1860s.
While explorer William Dampier named the Dampier Archipelago and Rosemary Island in 1699, British naval Captain Phillip Parker King was the first to document encounters with the Yaburara people in 1818. His visit to the archipelago in the rainy season (February) coincided with large groups of people using the seasonally abundant resources at this time.
The Swan River Colony (Perth) was established in 1829, but permanent European colonisation of the northwest only began in the early 1860s with an influx of pastoralists and pearlers.
Early whaling contact
A few surviving ship logbooks record English and North American whalers on the Dampier Archipelago from 1801, but the heyday of whaling near “The Rosemary Islands” was between the 1840s and 1860s.
The logbooks describe how American whaling ships worked together to hunt herds of humpback whales, which migrate along Australia’s northwest coastline during the winter months.
The ships’ crews made landfall to collect firewood and drinking water, and to post lookouts on vantage points to assist in sighting whales for the open boats to pursue.
Research by archaeologists from the University of Western Australia, working with the Murujuga Aboriginal Corporation and industry partner Rio Tinto, has found evidence of two such landfalls in inscriptions made by the crews of two North American whalers – the Connecticut and the Delta.
The earliest of these inscriptions records that the Connecticut visited Rosemary Island on August 18 1842. At least part of this inscription was made by Jacob Anderson, identified from the Connecticut’s crew list as a 19-year-old African-American sailor.
Research shows this set of ships’ and people’s names was placed over an earlier set of Aboriginal grid motifs, along a ridgeline elevated above this seascape where, for millennia, the Yaburara had produced rock art, raised standing stones and quarried tool-stone.
The dates and names found in the inscription correlate with port records that show the Connecticut left the town of New London in Connecticut, US, for the New Holland ground (as the waters off Australia’s northwest were known) in 1841, with Captain Daniel Crocker and a crew of 26.
The Connecticut returned to New London on June 16 1843, with 1,800 barrels of oil, travelling via Fremantle, New Zealand and Cape Horn.
The Connecticut’s logbook for the voyage is missing, so without these inscriptions we would know nothing of this ship’s visit to the Dampier Archipelago.
On another island, another set of inscriptions record a visit to a similar vantage point by crew of the Delta on July 12 1849.
Registered in Greenport, New York, the Delta made 18 global whaling voyages between 1832 and 1856. Its logbook confirms it was whaling in the Dampier Archipelago between June 2 and September 8 1849.
While the log records crew members going ashore to shoot kangaroos and collect water, no mention is made of them making inscriptions or having any contact with Yaburara people.
Given it was the dry season, and the lack of permanent water on the islands, this lack of contact is not surprising.
But again, these whalers chose to make their marks on surfaces that were already marked by the Yaburara. By recording their presence at these specific historical moments, the whalers continued the long tradition of the Yaburara in interacting with and marking their maritime environment.
Protecting the heritage
Between 1822 and 1963, whalers killed more than 26,000 southern right whales (Eubalaena australis) and 40,000 humpback whales (Megaptera novaeangliae) in Australia and New Zealand, driving populations to near-extinction.
Today there are signs of renewal, with whale populations increasing, and Aboriginal people are reclaiming responsibility for management of the archipelago.
There is a strong push for World Heritage Listing of Murujuga — one of the most significant concentrations for human artistic creativity on the planet, recording millennia of human responses to the sustainable use of this productive landscape.
These two whaling inscriptions provide the only known archaeological insight into this earliest global resource extraction in Australia’s northwest – the whale oil industry – which began over two centuries ago.
They demonstrate yet again the unique capacity of Murujuga’s rock art to shed light on previously unknown details of our shared human history.
Jo McDonald, Director, Centre for Rock Art Research + Management, University of Western Australia; Alistair Paterson, ARC Future Fellow, University of Western Australia, and Ross Anderson, Curator of Maritime Archaeology, Western Australian Museum
No one doubts the job of president of the United States is stressful and demanding. The chief executive deserves downtime.
But how much is enough, and when is it too much?
These questions came into focus after Axios’ release of President Donald Trump’s schedule. The hours blocked off for nebulous “executive time” seem, to many critics, disproportionate to the number of scheduled working hours.
While Trump’s workdays may ultimately prove to be shorter than those of past presidents, he’s not the first to face criticism. For every president praised for his work ethic, there’s one disparaged for sleeping on the job.
Teddy Roosevelt, locomotive president
Before Theodore Roosevelt ascended to the presidency in 1901, the question of how hard a president toiled was of little concern to Americans.
Except in times of national crisis, his predecessors neither labored under the same expectations, nor faced the same level of popular scrutiny. Since the country’s founding, Congress had been the main engine for identifying national problems and outlining legislative solutions. Congressmen were generally more accessible to journalists than the president was.
But when Roosevelt shifted the balance of power from Congress to the White House, he created the expectation that an activist president, consumed by affairs of state, would work endlessly in the best interests of the people.
Roosevelt, whom Sen. Joseph Foraker called a “steam engine in trousers,” personified the hard-working chief executive. He filled his days with official functions and unofficial gatherings. He asserted his personality on policy and stamped the presidency firmly on the nation’s consciousness.
Taft had a tough act to follow
His successor, William Howard Taft, suffered by comparison. While it’s fair to observe that nearly anyone would have looked like a slacker compared with Roosevelt, it didn’t help that Taft weighed 300 pounds, which his contemporaries equated with laziness.
Taft helped neither his cause nor his image when he snored through meetings, at evening entertainments and, as author Jeffrey Rosen noted, “even while standing at public events.” Watching Taft’s eyelids close, Sen. James Watson said to him, “Mr. President, you are the largest audience I ever put entirely to sleep.”
An early biographer called Taft “slow-moving, easy-going if not lazy” with “a placid nature.” Others have suggested that Taft’s obesity caused sleep apnea and daytime drowsiness, a finding not inconsistent with historian Lewis L. Gould’s conclusion that Taft was capable of work “at an intense pace” and “a high rate of efficiency.”
It seems that Taft could work quickly, but in short bursts.
Coolidge the snoozer
Other presidents were more intentional about their daytime sleeping. Calvin Coolidge’s penchant for hourlong naps after lunch earned him amused scorn from contemporaries. But when he missed his nap, he fell asleep at afternoon meetings. He even napped on vacation. Tourists stared in amazement as the president, blissfully unaware, swayed in a hammock on his front porch in Vermont.
This, for many Republicans, wasn’t a problem: The Republican Party of the 1920s was averse to an activist federal government, so the fact that Coolidge wasn’t seen as a hard-charging, incessantly busy president was fine.
Biographer Amity Shlaes wrote that “Coolidge made a virtue of inaction” while simultaneously exhibiting “a ferocious discipline in work.” Political scientist Robert Gilbert argued that after Coolidge’s son died during his first year as president, Coolidge’s “affinity for sleep became more extreme.” Grief, according to Gilbert, explained his growing penchant for slumbering, which expanded into a pre-lunch nap, a two- to four-hour post-lunch snooze and 11 hours of shut-eye nightly.
For Reagan, the jury’s out
Ronald Reagan may have had a tendency to nod off.
“I have left orders to be awakened at any time in case of a national emergency – even if I’m in a cabinet meeting,” he joked. Word got out that he napped daily, and historian Michael Schaller wrote in 1994 that Reagan’s staff “released a false daily schedule that showed him working long hours,” labeling his afternoon nap “personal staff time.” But some family members denied that he napped in the White House.
Journalists were divided. Some found him “lazy, passive, stupid or even senile” and “intellectually lazy … without a constant curiosity,” while others claimed he was “a hard worker,” who put in long days and worked over lunch. Perhaps age played a role in Reagan’s naps – if they happened at all.
Clinton crams in the hours
One president not prone to napping was Bill Clinton. Frustrated that he could not find time to think, Clinton ordered a formal study of how he spent his days. His ideal was four hours in the afternoon “to talk to people, to read, to do whatever.” Sometimes he got half that much.
Two years later, a second study found that, during Clinton’s 50-hour workweek, “regularly scheduled meetings” took up 29 percent of his time, “public events, etc.” made up 36 percent of his workday, while “thinking time – phone & office work” constituted 35 percent of his day. Unlike presidents whose somnolence drew sneers, Clinton was disparaged for working too much and driving his staff to exhaustion with all-nighters.
Partisanship at the heart of criticism?
The work of being president of the United States never ends. There is always more to be done. Personal time may be a myth, as whatever the president reads, watches or does can almost certainly be applied to some aspect of the job.
Trump’s “executive time” could be a rational response to the demands of the job or life circumstances. Trump, for example, appears to get only four or five hours of sleep a night, which suggests he has more time to tackle his daily duties than the rest of us.
But, like his predecessors, the appearance of taking time away from running the country will garner criticism. Though they can sometimes catch 40 winks, presidents can seldom catch a break.
Sometime in the autumn of 1621, a group of English Pilgrims who had crossed the Atlantic Ocean and created a colony called New Plymouth celebrated their first harvest.
They hosted a group of about 90 Wampanoags, their Algonquian-speaking neighbors. Together, migrants and Natives feasted for three days on corn, venison and fowl.
In their bountiful yield, the Pilgrims likely saw a divine hand at work.
As Gov. William Bradford wrote in 1623, “Instead of famine now God gave them plenty, and the face of things was changed, to the rejoicing of the hearts of many, for which they blessed God.”
But my recent research on the ways Europeans understood the Western Hemisphere shows that – despite the Pilgrims’ version of events – their survival largely hinged on two unrelated developments: an epidemic that swept through the region and a repository of advice from earlier explorers.
A ‘desolate wilderness’ or ‘Paradise of all parts’?
Bradford’s “Of Plymouth Plantation,” which he began to write in 1630 and finished two decades later, traces the history of the Pilgrims from their persecution in England to their new home along the shores of what is now Plymouth Harbor, Massachusetts.
Bradford and other Pilgrims believed in predestination. Every event in their lives marked a stage in the unfolding of a divine plan, which often echoed the experiences of the ancient Israelites.
Throughout his account, Bradford probed Scripture for signs. He wrote that the Puritans arrived in “a hideous and desolate wilderness, full of wild beasts and wild men.” They were surrounded by forests “full of woods and thickets,” and they lacked the kind of view Moses had on Mount Pisgah, after successfully leading the Israelites to Canaan.
Drawing on chapter 26 of the Book of Deuteronomy, Bradford declared that the English “were ready to perish in this wilderness,” but God had heard their cries and helped them. Bradford paraphrased from Psalm 107 when he wrote that the settlers should “praise the Lord” who had “delivered them from the hand of the oppressor.”
If you were reading Bradford’s version of events, you might think that the survival of the Pilgrims’ settlements was often in danger. But the situation on the ground wasn’t as dire as Bradford claimed.
Earlier European visitors had described pleasant shorelines and prosperous indigenous communities. In 1605, the French explorer Samuel de Champlain sailed past the site the Pilgrims would later colonize and noted that there were “a great many cabins and gardens.” He even provided a drawing of the region, which depicted small Native towns surrounded by fields.
About a decade later Captain John Smith, who coined the term “New England,” wrote that the Massachusetts, a nearby indigenous group, inhabited what he described as “the Paradise of all those parts.”
‘A wonderful plague’
Champlain and Smith understood that any Europeans who wanted to establish communities in this region would need either to compete with Natives or find ways to extract resources with their support.
But after Champlain and Smith visited, a terrible illness spread through the region. Modern scholars have argued that indigenous communities were devastated by leptospirosis, a disease caused by Old World bacteria that had likely reached New England through the feces of rats that arrived on European ships.
The absence of accurate statistics makes it impossible to know the ultimate toll, but perhaps up to 90 percent of the regional population perished between 1617 and 1619.
To the English, divine intervention had paved the way.
“By God’s visitation, reigned a wonderful plague,” King James’ patent for the region noted in 1620, “that had led to the utter Destruction, Devastacion, and Depopulation of that whole territory.”
The epidemic benefited the Pilgrims, who arrived soon thereafter: The best land had fewer residents and there was less competition for local resources, while the Natives who had survived proved eager trading partners.
The wisdom of those who came before
Just as important, the Pilgrims understood what to do with the land.
By the time that these English planned their communities, knowledge of the Atlantic coast of North America was widely available.
Those hoping to create new settlements had read accounts of earlier European migrants who had established European-style villages near the water, notably along the shores of Chesapeake Bay, where the English had founded Jamestown in 1607.
These first English migrants to Jamestown endured terrible disease and arrived during a period of drought and colder-than-normal winters. The migrants to Roanoke on the outer banks of Carolina, where the English had gone in the 1580s, disappeared. And a brief effort to settle the coast of Maine in 1607 and 1608 failed because of an unusually bitter winter.
Many of these migrants died or gave up. But none disappeared without record, and their stories circulated in books printed in London. Every English effort before 1620 had produced accounts useful to would-be colonizers.
The most famous account, by the English mathematician Thomas Harriot, enumerated the commodities that the English could extract from America’s fields and forests in a report he first published in 1588.
The artist John White, who was on the same mission to modern Carolina, painted a watercolor depicting the wide assortment of marine life that could be harvested, another of large fish on a grill, and a third showing the fertility of fields at the town of Secotan. By the mid-1610s, actual commodities had started to arrive in England too, providing support for those who had claimed that North American colonies could be profitable. The most important of these imports was tobacco, which many Europeans considered a wonder drug capable of curing a wide range of human ailments.
These reports (and imports) encouraged many English promoters to lay plans for colonization as a way to increase their wealth. But those who thought about going to New England, especially the Pilgrims who were kindred souls of Bradford, believed that there were higher rewards to be reaped.
Bradford and the other Puritans who arrived in Massachusetts often wrote about their experience through the lens of suffering and salvation.
But the Pilgrims were better equipped to survive than they let on.
Peter C. Mancall, Andrew W. Mellon Professor of the Humanities, University of Southern California – Dornsife College of Letters, Arts and Sciences
In 1492, when Christopher Columbus crossed the Atlantic Ocean in search of a fast route to East Asia and the southwest Pacific, he landed in a place that was unknown to him. There he found treasures – extraordinary trees, birds and gold.
But there was one thing that Columbus expected to find that he didn’t.
Upon his return, in his official report, Columbus noted that he had “discovered a great many islands inhabited by people without number.” He praised the natural wonders of the islands.
But, he added, “I have not found any monstrous men in these islands, as many had thought.”
Why, one might ask, had he expected to find monsters?
My research and that of other historians reveal that Columbus’ views were far from abnormal. For centuries, European intellectuals had imagined a world beyond their borders populated by “monstrous races.”
Of course the ‘monstrous races’ exist
One of the earliest accounts of these non-human beings was written by the Roman natural historian Pliny the Elder in 77 A.D. In a massive treatise, he told his readers about dog-headed people, known as cynocephali, and the Astomi, creatures with no mouth and no need to eat.
Across medieval Europe, tales of marvelous and inhuman creatures – of cyclops, blemmyes, creatures with heads in their chests, and sciapods, who had a single leg with a giant foot – circulated in manuscripts hand-copied by scribes who often embellished their treatises with illustrations of these fantastic creatures.
Though there were always some skeptics, most Europeans believed that distant lands would be populated by these monsters, and stories of monsters traveled far beyond the rarefied libraries of elite readers.
For example, churchgoers in Fréjus, an ancient market town in the south of France, could wander into the cloister of the Cathédrale Saint-Léonce and study monsters on the more than 1,200 painted wooden ceiling panels. Some panels portrayed scenes of daily life – local monks, a man riding a pig and contorted acrobats. Many others depicted monstrous hybrids, dog-headed people, blemmyes and other fearsome wretches.
Perhaps no one did more to spread news of monsters’ existence than a 14th-century English knight named John Mandeville, who, in his account of his travels to faraway lands, claimed to have seen people with the ears of an elephant, one group of creatures who had flat faces with two holes, and another that had the head of a man and the body of a goat.
Scholars debate whether Mandeville could have ventured far enough to see the places that he described, and whether he was even a real person. But his book was copied time and again, and likely translated into every known European language.
Leonardo da Vinci had a copy. So did Columbus.
Old beliefs die hard
Even though Columbus didn’t see monsters, his report wasn’t enough to dislodge prevailing ideas about the creatures Europeans expected to find in parts unknown.
In 1493 – around the time Columbus’ first report began to circulate – printers of the “Nuremberg Chronicle,” a massive volume of history, included images and descriptions of monsters. And soon after the explorer’s return, an Italian poet offered a verse translation describing Columbus’ journey, which its printer illustrated with monsters, including a sciapod and a blemmye.
Indeed, the belief that monsters lived at the Earth’s edge remained for generations.
In the 1590s, the English explorer Sir Walter Raleigh told readers about the American monsters he heard about in his travels to Guiana, some of which had “their eyes in their shoulders, and their mouths in the middle of their breasts, & that a long train of haire groweth backward between their shoulders.”
Soon after, the English natural historian Edward Topsell translated a mid-16th-century treatise of the various animals of the world, a book that appeared in London in 1607, the same year that colonists established a small community at Jamestown, Virginia. Topsell was eager to integrate descriptions of American animals in his book.
But alongside chapters on Old World horses, pigs and beavers, readers learned about the “Norwegian monster” and a “very deformed beast” that Americans called an “haut.” Another, known as a “su,” had “a very deformed shape, and monstrous presence” and was “cruell, untamable, impatient, violent, [and] ravening.”
Of course, in the New World, the gains for Europeans came at a terrifying cost for Native Americans: The newcomers stole their land and treasures, enslaved them, introduced Old World diseases and spurred long-term environmental change.
In the end, perhaps these indigenous Americans saw the invaders of their homelands as a “monstrous race” of their own – creatures who destabilized their communities, took their possessions and threatened their lives.
Peter C. Mancall, Andrew W. Mellon Professor of the Humanities, University of Southern California – Dornsife College of Letters, Arts and Sciences
This article is part of our series of explainers on key moments in the past 100 years of world political history. In it, our authors examine how and why an event unfolded, its impact at the time, and its relevance to politics today.
At 8:46am on a sunny Tuesday morning in New York City, a commercial jet plane flew into the North Tower of the World Trade Centre, cutting through floors 93 to 99.
As the news was beamed around the world, shaken reporters wondered whether the crash had been an accident or an act of terrorism. At 9:03am, viewers watching the smoke billowing from the gash in the building were stunned to see a second jet plane dart into view and fly directly into the South Tower. Suddenly, it was clear that the United States was under attack.
The scale of the assault became apparent about 40 minutes later, when a third jet crashed into the Pentagon. Not long after, in the fourth shock of the morning, the South Tower of the World Trade Centre unexpectedly crumbled to the ground in a few seconds, its structural integrity destroyed by the inferno set off by the plane’s thousands of gallons of jet fuel. Its twin soon succumbed to the same fate.
Over the next days and weeks, the world learned that 19 militants belonging to the Islamist terrorist group al Qaeda, armed with box cutters and knives missed by airport security, had hijacked four planes.
Three hit their targets. The fourth, intended for the White House or the Capitol, crashed in a field in Pennsylvania when passengers, who had learned of the other attacks, struggled for control of the plane. All told, close to 3,000 people were killed and 6,000 were injured.
Immediate impact of the attacks
The events of 9/11 seared the American psyche. A country whose continental states had not seen a major attack in nearly 200 years was stunned to find that its financial and military centres had been hit by a small terrorist group based thousands of miles away. More mass attacks suddenly seemed not just probable but inevitable.
The catastrophe set in motion a sequence of reactions and unintended consequences that continue to reverberate today. Its most lasting and consequential effects are interlinked: a massively expensive and unending “war on terror”, heightened suspicion of government and the media in many democratic countries, a sharp uptick in Western antagonism toward Muslims, and the decline of US power alongside rising international disorder – developments that aided the rise of Donald Trump and leaders like him.
War without end?
Just weeks after 9/11, the administration of US President George W. Bush invaded Afghanistan with the aim of destroying al Qaeda, which had been granted safe haven by the extremist Taliban regime. With the support of dozens of allies, the invasion quickly toppled the Taliban government and crippled al Qaeda. But it was not until 2011, under President Barack Obama, that US forces found and killed al Qaeda’s leader and 9/11 mastermind – Osama bin Laden.
Though there have been efforts to end formal combat operations since then, over 10,000 US troops remain in Afghanistan today, fighting an intensifying Taliban insurgency. It is now the longest war the United States has fought. Far from being eradicated, the Taliban is active in most of the country. Even though the war’s price tag is nearing a trillion dollars, domestic pressure to end the war is minimal, thanks to an all-volunteer army and relatively low casualties that make the war seem remote and abstract to most Americans.
Even more consequential has been the second major armed conflict triggered by 9/11: the US-led invasion of Iraq in 2003. Although Iraqi dictator Saddam Hussein was not linked to 9/11, officials in the Bush administration were convinced his brutal regime was a major threat to world order. His past aggression, his willingness to defy the United States, and his aspirations to build or expand nuclear, chemical and biological weapons programs made it seem likely that he would help groups planning terrorist attacks on the West.
The invading forces quickly ousted Saddam, but the poorly executed, error-ridden occupation destabilised the entire region.
In Iraq, it triggered a massive, long-running insurgency. In the Middle East more broadly, it boosted Iran’s regional influence, fostered the rise of the Islamic State, and created lasting disorder that has led to civil wars, countless terrorist attacks, and radicalisation.
In many parts of the world, the war fuelled anti-Americanism; in Europe, public opinion about the war set in motion a widening estrangement between the United States and its key European allies.
Monetary and social costs
Today, the United States spends US$32 million every hour on the wars fought since 9/11. The total cost now exceeds US$5.6 trillion. The so-called war on terror has spread into 76 countries where the US military is now conducting counter-terror activities, ranging from drone strikes to surveillance operations.
The mind-boggling sums have been financed by borrowing, which has increased social inequality in the United States. Some observers have suggested that government war spending was even more important than financial deregulation in causing the 2007-2008 Global Financial Crisis.
The post-9/11 era has eroded civil liberties across the world. Many governments have cited the urgent need to prevent future attacks as justification for increased surveillance of citizens, curbing of dissent, and enhanced capacity to detain suspects without charge.
The well-publicised missteps of the FBI and the CIA in failing to detect and prevent the 9/11 plot, despite ample warnings, fed public distrust of intelligence and law enforcement agencies. Faulty intelligence about what turned out to be nonexistent Iraqi “weapons of mass destruction” (WMDs) undermined public confidence not only in the governments that touted those claims but also in the media that purveyed the false information.
The result has been a climate of widespread distrust of the voices of authority. In the United States and in other countries, citizens are increasingly suspicious of government sources and the media — at times even questioning whether truth is knowable. The consequences for democracy are dire.
Across the West, 9/11 also set off a wave of Islamophobia. Having fought a decades-long Cold War not long before, Americans framed the attack as a struggle of good versus evil, casting radical Islam as the latest enemy. In many countries, voices in the media and in politics used the extremist views and actions of Islamic terrorists to castigate Muslims in general. Since 9/11, Muslims in the United States and elsewhere have experienced harassment and violence.
In Western countries, Muslims are now often treated as the most significant public enemy. European populists have risen to power by denouncing refugees from Muslim-majority countries like Syria, and the willingness and ability of Muslims to assimilate is viewed with increasing scepticism.
A week after his inauguration, US President Donald Trump kept a campaign promise by signing the so-called “Muslim ban”, designed to prevent citizens of six Muslim-majority countries from entering the United States.
One of the most widely expected consequences of 9/11 has so far been averted. Though Islamic terrorists have engaged in successful attacks in the West since 9/11, including the 2002 Bali bombings, the 2004 Madrid train bombings, and the 2015 attacks in Paris, there has been no attack on the scale of 9/11. Instead, it is countries with large Muslim populations that have seen a rise in terrorist attacks.
Yet the West still pays the price for its militant and militarised response to terrorism through the weakening of democratic norms and values. The unleashing of US military power that was supposed to intimidate terrorists has diminished America’s might, creating a key precondition for Donald Trump’s promise to restore American greatness.
Although many of the issues confronting us today have very long roots, the world we live in has been indelibly shaped by 9/11 and its aftermath.