Category Archives: USA







The death penalty, an American tradition on the decline



George Stinney, a 14-year-old wrongfully executed for murder in 1944.
M. Watt Espy Papers, University at Albany, CC BY-ND

James Acker, University at Albany, State University of New York and Brian Keough, University at Albany, State University of New York

Capital punishment has been practiced on American soil for more than 400 years. Historians have documented nearly 16,000 executions, accomplished by burning, hanging, firing squad, electrocution, lethal gas and lethal injection. An untold number of others have doubtlessly occurred yet escaped recognition.

The book ‘Not in Our Name: Murder Victims Families Speak Out against the Death Penalty,’ published in 1997.
Murder Victims’ Families for Reconciliation Records, University at Albany, CC BY-ND

We helped create the University at Albany’s National Death Penalty Archive, a rich repository of primary source material encompassing the long and growing history of the death penalty.

Capital punishment has long been and continues to be controversial, but there is no disputing its historical and contemporary significance. More than 2,700 men and women are currently under sentence of death throughout the U.S., although they are distributed in wildly uneven fashion. California’s death row, by far the nation’s largest, tops out at well over 700, while three or fewer inmates await execution in seven states.

Executions similarly vary markedly by jurisdiction. Texas has been far and away the leader over the last half century, with five times as many executions as the next leading state.

https://datawrapper.dwcdn.net/mVj62/1/

Prized archives

We established the National Death Penalty Archive to help preserve a record of the country’s past and current capital punishment policies and practices, and to ensure that scholars and the general public can gain access to this critical information.

The archive currently holds numerous collections from diverse sources, including academics, activists, litigators and researchers. We remain open to new donations of materials relating to capital punishment. The materials are stored in a climate-controlled environment and are accessible to the public.

One of our prized collections is the voluminous set of execution records compiled by M. Watt Espy Jr. Espy spent more than three decades, from the 1960s into the 1990s, traversing the countryside and collaborating with others to uncover primary and secondary sources documenting more than 15,000 executions carried out in the U.S. between the 1600s and the late 20th century. Espy’s data set has since been updated to include information on executions through 2002.

https://datawrapper.dwcdn.net/UOBeT/1/

The National Death Penalty Archive houses the court records, newspaper articles, magazine stories, bulletins, photographs and index cards created for each execution that Espy and his assistants painstakingly collected. These items vividly capture this unparalleled history of executions within the American colonies and the U.S.

Among those documented is the 1944 electrocution in South Carolina of George Stinney Jr., who at age 14 was the youngest person punished by death during the 20th century. Seventy years later, a South Carolina judge vacated Stinney’s conviction, ruling that he did not receive a fair trial.

In July, after the documents are fully digitized, the National Death Penalty Archive will make all of Espy’s materials available online.

Index card about George Stinney, created by death penalty historian M. Watt Espy.
M. Watt Espy Papers, University at Albany, CC BY-ND

Other papers

Another prized holding consists of nearly 150 boxes of materials from Eugene Wanger. As a delegate to the Michigan Constitutional Convention, Wanger drafted the provision prohibiting capital punishment that was incorporated into the state constitution in 1961.

For more than 50 years, Wanger compiled a treasure trove of items spanning the 18th through 21st centuries relating to the death penalty, including numerous rare documents and paraphernalia. Among the thousands of items in the extensive bibliography are copies of anti-capital-punishment essays written by Pennsylvania’s Benjamin Rush shortly after the nation’s founding.

We also have collected the work of notable scholars. For example, the National Death Penalty Archive houses research completed by the late David Baldus, known primarily for his analysis of racial disparities in the administration of the death penalty; the writings of the late Hugo Adam Bedau, perhaps the country’s leading philosopher on issues of capital punishment; and the papers of the late Ernest van den Haag, a prolific academic proponent of capital punishment.

The National Death Penalty Archive additionally contains more than 150 clemency petitions filed on behalf of condemned prisoners, as well as materials relating to notable U.S. Supreme Court decisions, including Ford v. Wainwright, prohibiting execution of the insane, and Herrera v. Collins, in which the justices were asked to rule that the Constitution forbids executing an innocent person wrongfully sentenced to death.

Remarks by Hugo Bedau against the death penalty in 1958.
Hugo Bedau Papers, University at Albany, CC BY-ND

On the decline

The recent history of capital punishment in the U.S. has been marked by declining popularity and usage. Within the past 15 years, eight states have abandoned the death penalty through legislative repeal or judicial invalidation.

The number of new death sentences imposed annually nationwide has plummeted from more than 300 in the mid-1990s to a fraction of that – just 42 – in 2018. Last year, there were 25 executions in the U.S., down from the modern-era high of 98 in 1999.

Meanwhile, public support for capital punishment as measured by the Gallup Poll registered at 56 percent in 2018, compared to its peak of 80 percent in 1995. Only a few counties, primarily within California and a few southern states, are responsible for sending vastly disproportionate numbers of offenders to death row.

https://datawrapper.dwcdn.net/atkRA/1/

What these trends bode for the future of the death penalty in the U.S. remains to be seen. When later generations reflect on the nation’s long and complicated history with the death penalty, we hope that the National Death Penalty Archive will offer important insights into the currents that have helped shape it.

James Acker, Distinguished Teaching Professor of Criminal Justice, University at Albany, State University of New York and Brian Keough, Co-Director, National Death Penalty Archive, University at Albany, State University of New York

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Rock art shows early contact with US whalers on Australia’s remote northwest coast



Detail of the Connecticut Inscription, with image enhancement.
Centre for Rock Art Research and Management database, Author provided

Jo McDonald, University of Western Australia; Alistair Paterson, University of Western Australia, and Ross Anderson, Western Australian Museum

Rock inscriptions made by crews from two North American whaleships in the early 19th century were found superimposed over earlier Aboriginal engravings in the Dampier Archipelago.

Details of the find in northern Western Australia are in a paper published today in Antiquity.

They provide the earliest evidence for North American whalers’ memorialising practices in Australia, and have substantial implications for maritime history.

At the time, the Dampier Archipelago (Murujuga) was home to the Yaburara people. The rock art across the archipelago is testament to their artists asserting their connections to this place for millennia.




Read more: Where art meets industry: protecting the spectacular rock art of the Burrup Peninsula


So did the whalers encounter the Yaburara? Did they engrave over earlier Aboriginal markings as an act of assertion, a realignment of a shifting political landscape? Or were they simply marking a milestone in their multi-year voyages, celebrating landfall after many months at sea?

The answer to all these questions is, we don’t know.

But these inscriptions provide a rare insight into the lives of whalers, filling a gap in our knowledge about this earliest industry on our northwestern coast.

Such historical inscriptions might be dismissed as graffiti. However, like other rock art, they tell important stories about our human past that cannot be gleaned from other sources.

Whaling in Australia

Ship-based whaling was a global phenomenon that lasted centuries. At its peak in the mid-19th century, around 900 wooden sailing ships were at sea on multi-year voyages, crewed by around 22,000 whalemen.

Most whaling in Australian waters was conducted by foreign vessels, and in the 19th century North American whalers dominated the globe.

Illustration of an American whaling ship in the 19th century.
Dr Kenneth McPherson, Indian Ocean Collection, WA Museum (with permission), Author provided

Whaling led to some of the earliest contacts between Americans, Europeans and a range of indigenous societies in Africa, Australasia and the Pacific.

But early visits by foreign whalers to Australia’s northwest are poorly documented given the absence of a British colonial land-based presence in the area until the 1860s.

While explorer William Dampier named the Dampier Archipelago and Rosemary Island in 1699, British naval Captain Phillip Parker King was the first to document encounters with the Yaburara people in 1818. His visit to the archipelago in the rainy season (February) coincided with large groups of people using the seasonally abundant resources at this time.

The Swan River Colony (Perth) was established in 1829, but permanent European colonisation of the northwest only began in the early 1860s with an influx of pastoralists and pearlers.

For the Yaburara, this colonisation was catastrophic. It culminated in the Flying Foam Massacre in 1868 in which many Yaburara people were killed.

Early whaling contact

A few surviving ship logbooks record English and North American whalers on the Dampier Archipelago from 1801, but the heyday of whaling near “The Rosemary Islands” was between the 1840s and 1860s.

The logbooks describe how American whaling ships worked together to hunt herds of humpback whales, which migrate along Australia’s northwest coastline during the winter months.

The ships’ crews made landfall to collect firewood and drinking water, and to post lookouts on vantage points to assist in sighting whales for the open boats to pursue.

Research by archaeologists from the University of Western Australia, working with the Murujuga Aboriginal Corporation and industry partner Rio Tinto, has found evidence of two such landfalls in inscriptions made by the crews of two North American whalers – the Connecticut and the Delta.

The earliest of these inscriptions records that the Connecticut visited Rosemary Island on August 18 1842. At least part of this inscription was made by Jacob Anderson, identified from the Connecticut’s crew list as a 19-year-old African-American sailor.

Research shows this set of ships’ and people’s names was placed over an earlier set of Aboriginal grid motifs. It sits along a ridgeline, elevated above this seascape, that holds millennia of evidence of the Yaburara producing rock art, raising standing stones and quarrying tool-stone.

Visualising the Connecticut inscription.

The dates and names found in the inscription correlate with port records that show the Connecticut left the town of New London in Connecticut, US, for the New Holland ground (as the waters off Australia’s northwest were known) in 1841, with Captain Daniel Crocker and a crew of 26.

Connecticut inscription, tracing by Ken Mulvaney.
Antiquity, Paterson et al 2019 (with permission)

The Connecticut returned to New London on June 16 1843, with 1,800 barrels of oil, travelling via Fremantle, New Zealand and Cape Horn.

The largest of the Connecticut inscriptions showing micro-analysis of the inscription over the Aboriginal engravings.
Antiquity, Paterson et al 2019 (with permission)

The Connecticut’s logbook for the voyage is missing, so without these inscriptions we would know nothing of this ship’s visit to the Dampier Archipelago.

On another island, another set of inscriptions records a visit to a similar vantage point by the crew of the Delta on July 12 1849.

Details of the Delta inscriptions.
Centre for Rock Art Research + Management

Registered in Greenport, New York, the Delta made 18 global whaling voyages between 1832 and 1856. Its logbook confirms it was whaling in the Dampier Archipelago between June 2 and September 8 1849.

The voyage of the Delta, as reconstructed from logbook entries.
Antiquity, Paterson et al 2019 (with permission)

While the log records crew members going ashore to shoot kangaroos and collect water, no mention is made of them making inscriptions or having any contact with Yaburara people.

Given that it was the dry season and that the islands lack permanent water, this absence of contact is not surprising.

But again, these whalers chose to make their marks on surfaces that were already marked by the Yaburara. By recording their presence at these specific historical moments, the whalers continued the long tradition of the Yaburara in interacting with and marking their maritime environment.

Protecting the heritage

Between 1822 and 1963, whalers killed more than 26,000 southern right whales (Eubalaena australis) and 40,000 humpback whales (Megaptera novaeangliae) in Australia and New Zealand, driving populations to near-extinction.

Commercial whaling in Australian waters ended 40 years ago on November 21 1978, with the closure of the Cheynes Beach Whaling Station in Albany, Western Australia.

Today there are signs of renewal: whale populations are increasing, and Aboriginal people are reclaiming responsibility for management of the archipelago.




Read more: Explainer: why the rock art of Murujuga deserves World Heritage status


There is a strong push for World Heritage Listing of Murujuga — one of the most significant concentrations of human artistic creativity on the planet, recording millennia of human responses to the sustainable use of this productive landscape.

These two whaling inscriptions provide the only known archaeological insight into this earliest global resource extraction in Australia’s northwest – the whale oil industry – which began over two centuries ago.

They demonstrate yet again the unique capacity of Murujuga’s rock art to shed light on previously unknown details of our shared human history.

Jo McDonald, Director, Centre for Rock Art Research + Management, University of Western Australia; Alistair Paterson, ARC Future Fellow, University of Western Australia, and Ross Anderson, Curator of Maritime Archaeology, Western Australian Museum

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A brief history of presidential lethargy



A television set turned on in the West Wing of the White House.
AP Photo/Susan Walsh

Stacy A. Cordery, Iowa State University

No one doubts the job of president of the United States is stressful and demanding. The chief executive deserves downtime.

But how much is enough, and when is it too much?

These questions came into focus after Axios’ release of President Donald Trump’s schedule. The hours blocked off for nebulous “executive time” seem, to many critics, disproportionate to the number of scheduled working hours.

While Trump’s workdays may ultimately prove to be shorter than those of past presidents, he’s not the first to face criticism. For every president praised for his work ethic, there’s one disparaged for sleeping on the job.

Teddy Roosevelt, locomotive president

Before Theodore Roosevelt ascended to the presidency in 1901, the question of how hard a president toiled was of little concern to Americans.

Except in times of national crisis, his predecessors neither labored under the same expectations, nor faced the same level of popular scrutiny. Since the country’s founding, Congress had been the main engine for identifying national problems and outlining legislative solutions. Congressmen were generally more accessible to journalists than the president was.

Teddy Roosevelt’s activist approach to governing shifted the public’s expectations for the president.
Library of Congress

But when Roosevelt shifted the balance of power from Congress to the White House, he created the expectation that an activist president, consumed by affairs of state, would work endlessly in the best interests of the people.

Roosevelt, whom Sen. Joseph Foraker called a “steam engine in trousers,” personified the hard-working chief executive. He filled his days with official functions and unofficial gatherings. He asserted his personality on policy and stamped the presidency firmly on the nation’s consciousness.

Taft had a tough act to follow

His successor, William Howard Taft, suffered by comparison. While it’s fair to observe that nearly anyone would have looked like a slacker compared with Roosevelt, it didn’t help that Taft weighed 300 pounds, which his contemporaries equated with laziness.

Taft’s girth only added to the perception that he lacked Roosevelt’s vigor.
Library of Congress

Taft helped neither his cause nor his image when he snored through meetings, at evening entertainments and, as author Jeffrey Rosen noted, “even while standing at public events.” Watching Taft’s eyelids close, Sen. James Watson said to him, “Mr. President, you are the largest audience I ever put entirely to sleep.”

An early biographer called Taft “slow-moving, easy-going if not lazy” with “a placid nature.” Others have suggested that Taft’s obesity caused sleep apnea and daytime drowsiness, a finding not inconsistent with historian Lewis L. Gould’s conclusion that Taft was capable of work “at an intense pace” and “a high rate of efficiency.”

It seems that Taft could work quickly, but in short bursts.

Coolidge the snoozer

Other presidents were more intentional about their daytime sleeping. Calvin Coolidge’s penchant for hourlong naps after lunch earned him amused scorn from contemporaries. But when he missed his nap, he fell asleep at afternoon meetings. He even napped on vacation. Tourists stared in amazement as the president, blissfully unaware, swayed in a hammock on his front porch in Vermont.

This, for many Republicans, wasn’t a problem: The Republican Party of the 1920s was averse to an activist federal government, so the fact that Coolidge wasn’t seen as a hard-charging, incessantly busy president was fine.

Biographer Amity Shlaes wrote that “Coolidge made a virtue of inaction” while simultaneously exhibiting “a ferocious discipline in work.” Political scientist Robert Gilbert argued that after Coolidge’s son died during his first year as president, Coolidge’s “affinity for sleep became more extreme.” Grief, according to Gilbert, explained his growing penchant for slumbering, which expanded into a pre-lunch nap, a two- to four-hour post-lunch snooze and 11 hours of shut-eye nightly.

For Reagan, the jury’s out

Ronald Reagan may have had a tendency to nod off.

“I have left orders to be awakened at any time in case of a national emergency – even if I’m in a cabinet meeting,” he joked. Word got out that he napped daily, and historian Michael Schaller wrote in 1994 that Reagan’s staff “released a false daily schedule that showed him working long hours,” labeling his afternoon nap “personal staff time.” But some family members denied that he napped in the White House.

Journalists were divided. Some found him “lazy, passive, stupid or even senile” and “intellectually lazy … without a constant curiosity,” while others claimed he was “a hard worker,” who put in long days and worked over lunch. Perhaps age played a role in Reagan’s naps – if they happened at all.

Clinton crams in the hours

One president not prone to napping was Bill Clinton. Frustrated that he could not find time to think, Clinton ordered a formal study of how he spent his days. His ideal was four hours in the afternoon “to talk to people, to read, to do whatever.” Sometimes he got half that much.

Two years later, a second study found that, during Clinton’s 50-hour workweek, “regularly scheduled meetings” took up 29 percent of his time, “public events, etc.” made up 36 percent of his workday, while “thinking time – phone & office work” constituted 35 percent of his day. Unlike presidents whose somnolence drew sneers, Clinton was disparaged for working too much and driving his staff to exhaustion with all-nighters.

Partisanship at the heart of criticism?

The work of being president of the United States never ends. There is always more to be done. Personal time may be a myth, as whatever the president reads, watches or does can almost certainly be applied to some aspect of the job.

Trump’s “executive time” could be a rational response to the demands of the job or his life circumstances. Trump, for example, seems to get only four or five hours of sleep a night, which suggests he has more time to tackle his daily duties than the rest of us.

But, as with his predecessors, the appearance of taking time away from running the country will garner criticism. Though they can sometimes catch 40 winks, presidents can seldom catch a break.

Stacy A. Cordery, Professor of History, Iowa State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Hidden women of history: Tarpe Mills, 1940s comic writer, and her feisty superhero Miss Fury


Miss Fury had cat claws, stiletto heels and a killer make-up compact.
Author provided

Camilla Nelson, University of Notre Dame Australia

In this series, we look at under-acknowledged women through the ages.

In April 1941, just a few short years after Superman came swooping out of the Manhattan skies, Miss Fury – originally known as Black Fury – became the first major female superhero to go to print. She beat William Moulton Marston’s Wonder Woman to the page by more than six months. More significantly, Miss Fury was the first female superhero to be written and drawn by a woman, Tarpé Mills.

Miss Fury’s creator – whose real name was June – shared much of the gritty ingenuity of her superheroine. Like other female artists of the Golden Age, Mills was obliged to make her name in comics by disguising her gender. As she later told the New York Post, “It would have been a major let-down to the kids if they found out that the author of such virile and awesome characters was a gal.”

Yet, this trailblazing illustrator, squeezed out of the comic world amid a post-WW2 backlash against unconventional images of femininity and a 1950s climate of heightened censorship, has been largely excluded from the pantheon of comic greats – until now.

Comics then and now tend to feature weak-kneed female characters who seem to exist for the sole purpose of being saved by a male hero – or, worse still, are “fridged”, a contemporary comic book colloquialism that refers to the gruesome slaying of an undeveloped female character to deepen the hero’s motivation and propel him on his journey.

But Mills believed there was room in comics for a different kind of female character, one who was able, level-headed and capable, mingling tough-minded complexity with Mills’ own taste for risqué behaviour and haute couture gowns.

Tarpe Mills was obliged to make her name in comics during the 1940s by disguising her gender.
Author provided

Where Wonder Woman’s powers are “marvellous” – that is, not real or attainable – Miss Fury and her alter ego Marla Drake use their collective brains, resourcefulness and the odd stiletto heel in the face to bring the villains to justice.

A WW2 plane featuring an image of Miss Fury.
http://www.tarpemills.com

And for a time they were wildly successful.

Miss Fury ran a full decade from April 1941 to December 1951, was syndicated in 100 different newspapers at the height of her wartime fame, and sold a million copies an issue in reprints released by Timely (now Marvel) comics.

Pilots flew bomber planes with Miss Fury painted on the fuselage. Young girls played with paper doll cut outs featuring her extensive high fashion wardrobe.

An anarchic, ‘gender flipped’ universe

Miss Fury’s “origin story” offers its own coolly ironic commentary on the masculine conventions of the comic genre.

One night a girl called Marla Drake finds out that her friend Carol is wearing an identical gown to a masquerade party. So, at the behest of her maid Francine, she dons a skin-tight black cat suit that – in an imperial twist typical of the period – was once worn as a ceremonial robe by a witch doctor in Africa.

On the way to the ball, Marla takes on a gun-toting killer, using her cat claws, stiletto heels, and – hilariously – a puff of powder blown from her makeup compact to disarm the villain. She leaves him trussed up with a hapless and unconscious police detective by the side of the road.

Tarpe Mills with her beloved Persian cat.
Author provided

Miss Fury could fly a fighter plane when she had to, jumping out in a parachute dressed in a red satin ball gown and matching shoes. She was also a crack shot.

This was an anarchic, gender-flipped comic book universe in which the protagonist and principal antagonists were women, and in which the supposed tools of patriarchy – high heels, makeup and mermaid-bottom ball gowns – were turned against the system. Arch nemesis Erica Von Kampf – a sultry vamp who hides a swastika-branded forehead behind a v-shaped blond fringe – also displayed amazing enterprise in her criminal antics.



Invariably the male characters required saving from the crime gangs, the Nazis or merely from themselves. Among the most ingenious panels in the strip were the ones devoted to hapless lovelorn men, endowed with the kind of “thought bubbles” commonly found hovering above the heads of angsty heroines in romance comics.

By contrast, the female characters possessed a gritty ingenuity inspired by Noir as much as by the changed reality of women’s wartime lives. Halfway through the series, Marla got a job and – astonishingly, for a Sunday comic supplement – became a single mother, adopting the son of her arch nemesis and wrestling with snarling dogs and chains to save the toddler from a deadly experiment.

Mills claimed to have modelled Miss Fury on herself. She even named Marla’s cat Peri-Purr after her own beloved Persian pet. Born in Brooklyn in 1918, Mills grew up in a house headed by a single widowed mother, who supported the family by working in a beauty parlour. Mills paid her way through New York’s Pratt Institute by working as a model and fashion illustrator.

Censorship

In the end, ironically, it was Miss Fury’s high fashion wardrobe that became a major source of controversy.

In 1947, no fewer than 37 newspapers declined to run a panel that featured one of Mills’ tough-minded heroines, Era – a South American Nazi-fighter who became a post-war nightclub entertainer – dressed as Eve, replete with snake and apple, in a spangled, two-piece costume.

This was not the only time the comic strip was censored. Earlier in the decade, Timely comics had refused to run a picture of the villainess Erica resplendent in her bath – surrounded by pink flamingo wallpaper.

Erica in the bath, surrounded by pink flamingo wallpaper.
Author provided.

But so many frilly negligées, cat fights, and shower scenes had escaped the censor’s eye. It’s not a leap to speculate that behind the ban lay the post-war backlash against powerful and unconventional women.

In wartime, nations had relied on women to fill the production jobs that men had left behind. Just as “Rosie the Riveter” encouraged women to get to work with the slogan “We Can Do It!”, so too the comparative absence of men opened up room for less conventional images of women in the comics.

A Miss Fury paper doll cut out.
Author provided

Once the war was over, women lost their jobs to returning servicemen. Comic creators were no longer encouraged to show women as independent or decisive. Politicians and psychologists attributed juvenile delinquency to the rise of unconventional comic book heroines and by 1954 the Comics Code Authority was policing the representation of women in comics, in line with increasingly conservative ideologies. In the 1950s, female action comics gave way to romance ones, featuring heroines who once again placed men at the centre of their existence.

Miss Fury was dropped from circulation in December 1951, and despite a handful of attempted comebacks, Mills and her anarchic creation slipped from public view.

Mills continued to work as a commercial illustrator on the fringes of a booming advertising industry. In 1971, she turned her hand to romance comics, penning a seven-page story that was published by Marvel, but it wasn’t her forte. In 1979, she began work on a graphic novel, Albino Jo, which remains unfinished.

Despite her chronic asthma, Mills – like the reckless Noir heroine she so resembled – chain-smoked to the bitter end. She died of emphysema on December 12, 1988, and is buried in New Jersey under the simple inscription, “Creator of Miss Fury”.

This year Mills’ work will be belatedly recognised. As a recipient of the 2019 Eisner Award, she will finally take her place in the Comics Hall of Fame, alongside the male creators of the Golden Age who have too long dominated the history of the genre. Hopefully this will bring her comic creation the kind of notoriety, readership and big screen adventures she thoroughly deserves.

Camilla Nelson, Associate Professor in Media, University of Notre Dame Australia

This article is republished from The Conversation under a Creative Commons license. Read the original article.





Why the Pilgrims were actually able to survive



‘Mayflower in Plymouth Harbor’ by William Halsall (1882).
Pilgrim Hall Museum

Peter C. Mancall, University of Southern California – Dornsife College of Letters, Arts and Sciences

Sometime in the autumn of 1621, a group of English Pilgrims who had crossed the Atlantic Ocean and created a colony called New Plymouth celebrated their first harvest.

They hosted a group of about 90 Wampanoags, their Algonquian-speaking neighbors. Together, migrants and Natives feasted for three days on corn, venison and fowl.

In their bountiful yield, the Pilgrims likely saw a divine hand at work.

As Gov. William Bradford wrote in 1623, “Instead of famine now God gave them plenty, and the face of things was changed, to the rejoicing of the hearts of many, for which they blessed God.”

But my recent research on the ways Europeans understood the Western Hemisphere shows that – despite the Pilgrims’ version of events – their survival largely hinged on two unrelated developments: an epidemic that swept through the region and a repository of advice from earlier explorers.

A ‘desolate wilderness’ or ‘Paradise of all parts’?

Bradford’s “Of Plymouth Plantation,” which he began to write in 1630 and finished two decades later, traces the history of the Pilgrims from their persecution in England to their new home along the shores of modern Plymouth Harbor.

William Bradford’s writings depicted a harrowing, desolate environment.

Bradford and other Pilgrims believed in predestination. Every event in their lives marked a stage in the unfolding of a divine plan, which often echoed the experiences of the ancient Israelites.

Throughout his account, Bradford probed Scripture for signs. He wrote that the Puritans arrived in “a hideous and desolate wilderness, full of wild beasts and wild men.” The land around them was “full of woods and thickets,” and they lacked the kind of view Moses had on Mount Pisgah after successfully leading the Israelites to Canaan.

Drawing on chapter 26 of the Book of Deuteronomy, Bradford declared that the English “were ready to perish in this wilderness,” but God had heard their cries and helped them. Bradford paraphrased from Psalm 107 when he wrote that the settlers should “praise the Lord” who had “delivered them from the hand of the oppressor.”

If you were reading Bradford’s version of events, you might think that the survival of the Pilgrims’ settlements was often in danger. But the situation on the ground wasn’t as dire as Bradford claimed.

The French explorer Samuel de Champlain depicted Plymouth as a region that was eminently inhabitable.
Author provided

Earlier European visitors had described pleasant shorelines and prosperous indigenous communities. In 1605, the French explorer Samuel de Champlain sailed past the site the Pilgrims would later colonize and noted that there were “a great many cabins and gardens.” He even provided a drawing of the region, which depicted small Native towns surrounded by fields.

About a decade later Captain John Smith, who coined the term “New England,” wrote that the Massachusetts, a nearby indigenous group, inhabited what he described as “the Paradise of all those parts.”

‘A wonderful plague’

Champlain and Smith understood that any Europeans who wanted to establish communities in this region would need either to compete with Natives or find ways to extract resources with their support.

But after Champlain and Smith visited, a terrible illness spread through the region. Modern scholars have argued that indigenous communities were devastated by leptospirosis, a disease caused by Old World bacteria that had likely reached New England through the feces of rats that arrived on European ships.

The absence of accurate statistics makes it impossible to know the ultimate toll, but perhaps up to 90 percent of the regional population perished between 1617 and 1619.

To the English, divine intervention had paved the way.

“By God’s visitation, reigned a wonderful plague,” King James’ patent for the region noted in 1620, “that had led to the utter Destruction, Devastacion, and Depopulation of that whole territory.”

The epidemic benefited the Pilgrims, who arrived soon thereafter: The best land had fewer residents and there was less competition for local resources, while the Natives who had survived proved eager trading partners.

The wisdom of those who came before

Just as important, the Pilgrims understood what to do with the land.

By the time that these English planned their communities, knowledge of the Atlantic coast of North America was widely available.

Those hoping to create new settlements had read accounts of earlier European migrants who had established European-style villages near the water, notably along the shores of Chesapeake Bay, where the English had founded Jamestown in 1607.

These first English migrants to Jamestown endured terrible disease and arrived during a period of drought and colder-than-normal winters. The migrants to Roanoke, on the Outer Banks of Carolina, where the English had gone in the 1580s, disappeared. And a brief effort to settle the coast of Maine in 1607 and 1608 failed because of an unusually bitter winter.

Many of these migrants died or gave up. But none disappeared without record, and their stories circulated in books printed in London. Every English effort before 1620 had produced accounts useful to would-be colonizers.

The most famous account, by the English mathematician Thomas Harriot, first published in 1588, enumerated the commodities that the English could extract from America’s fields and forests.

The artist John White, who was on the same mission to modern Carolina, painted a watercolor depicting the wide assortment of marine life that could be harvested, another of large fish on a grill, and a third showing the fertility of fields at the town of Secotan. By the mid-1610s, actual commodities had started to arrive in England too, providing support for those who had claimed that North American colonies could be profitable. The most important of these imports was tobacco, which many Europeans considered a wonder drug capable of curing a wide range of human ailments.

These reports (and imports) encouraged many English promoters to lay plans for colonization as a way to increase their wealth. But those who thought about going to New England, especially the Pilgrims who were kindred souls of Bradford, believed that there were higher rewards to be reaped.

Bradford and the other Puritans who arrived in Massachusetts often wrote about their experience through the lens of suffering and salvation.

But the Pilgrims were better equipped to survive than they let on.

Peter C. Mancall, Andrew W. Mellon Professor of the Humanities, University of Southern California – Dornsife College of Letters, Arts and Sciences

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Columbus believed he would find ‘blemmyes’ and ‘sciapods’ – not people – in the New World



The statue of Christopher Columbus in Columbus Circle, New York City.
Zoltan Tarlacz/Shutterstock.com

Peter C. Mancall, University of Southern California – Dornsife College of Letters, Arts and Sciences

In 1492, when Christopher Columbus crossed the Atlantic Ocean in search of a fast route to East Asia and the southwest Pacific, he landed in a place that was unknown to him. There he found treasures – extraordinary trees, birds and gold.

But there was one thing that Columbus expected to find that he didn’t.

Upon his return, in his official report, Columbus noted that he had “discovered a great many islands inhabited by people without number.” He praised the natural wonders of the islands.

But, he added, “I have not found any monstrous men in these islands, as many had thought.”

Why, one might ask, had he expected to find monsters?

My research and that of other historians reveal that Columbus’ views were far from abnormal. For centuries, European intellectuals had imagined a world beyond their borders populated by “monstrous races.”

Of course the ‘monstrous races’ exist

One of the earliest accounts of these non-human beings was written by the Roman natural historian Pliny the Elder in 77 A.D. In a massive treatise, he told his readers about dog-headed people, known as cynocephali, and the astomi, creatures with no mouth and no need to eat.

Across medieval Europe, tales of marvelous and inhuman creatures – of cyclopes; of blemmyes, creatures with heads in their chests; and of sciapods, who had a single leg with a giant foot – circulated in manuscripts hand-copied by scribes who often embellished their treatises with illustrations of these fantastic creatures.

A 1544 woodcut by Sebastian Münster depicts, from left to right, a sciapod, a cyclops, conjoined twins, a blemmye and a cynocephalus.
Wikimedia Commons

Though there were always some skeptics, most Europeans believed that distant lands would be populated by these monsters, and stories of monsters traveled far beyond the rarefied libraries of elite readers.

For example, churchgoers in Fréjus, an ancient market town in the south of France, could wander into the cloister of the Cathédrale Saint-Léonce and study monsters on the more than 1,200 painted wooden ceiling panels. Some panels portrayed scenes of daily life – local monks, a man riding a pig and contorted acrobats. Many others depicted monstrous hybrids, dog-headed people, blemmyes and other fearsome wretches.

The ceiling of the Cathédrale Saint-Léonce depicts an array of monstrous creatures.
Peter C. Mancall, Author provided

Perhaps no one did more to spread news of monsters’ existence than a 14th-century English knight named John Mandeville, who, in his account of his travels to faraway lands, claimed to have seen people with the ears of an elephant, one group of creatures who had flat faces with two holes, and another that had the head of a man and the body of a goat.

Scholars debate whether Mandeville could have ventured far enough to see the places that he described, and whether he was even a real person. But his book was copied time and again, and likely translated into every known European language.

Leonardo da Vinci had a copy. So did Columbus.

Old beliefs die hard

Even though Columbus didn’t see monsters, his report wasn’t enough to dislodge prevailing ideas about the creatures Europeans expected to find in parts unknown.

In 1493 – around the time Columbus’ first report began to circulate – printers of the “Nuremberg Chronicle,” a massive volume of history, included images and descriptions of monsters. And soon after the explorer’s return, an Italian poet offered a verse translation describing Columbus’ journey, which its printer illustrated with monsters, including a sciapod and a blemmye.

Indeed, the belief that monsters lived at the Earth’s edge remained for generations.

In the 1590s, the English explorer Sir Walter Raleigh told readers about the American monsters he heard about in his travels to Guiana, some of which had “their eyes in their shoulders, and their mouths in the middle of their breasts, & that a long train of haire groweth backward between their shoulders.”

Soon after, the English natural historian Edward Topsell translated a mid-16th-century treatise of the various animals of the world, a book that appeared in London in 1607, the same year that colonists established a small community at Jamestown, Virginia. Topsell was eager to integrate descriptions of American animals into his book. But alongside chapters on Old World horses, pigs and beavers, readers learned about the “Norwegian monster” and a “very deformed beast” that Americans called an “haut.” Another, known as a “su,” had “a very deformed shape, and monstrous presence” and was “cruell, untamable, impatient, violent, [and] ravening.”

Of course, in the New World, the gains for Europeans came at a terrifying cost for Native Americans: The newcomers stole their land and treasures, enslaved them, introduced Old World diseases and spurred long-term environmental change.

In the end, perhaps these indigenous Americans saw the invaders of their homelands as a ‘monstrous race’ of their own – creatures who destabilized their communities, took their possessions and threatened their lives.

Peter C. Mancall, Andrew W. Mellon Professor of the Humanities, University of Southern California – Dornsife College of Letters, Arts and Sciences

This article is republished from The Conversation under a Creative Commons license. Read the original article.


World politics explainer: The twin-tower bombings (9/11)



South Tower being hit during the 9/11 attacks. The events of September 11, 2001 have significantly shaped American attitudes and actions towards fighting terrorism, surveilling citizens and othering outsiders.
NIST SIPA/Wikicommons

Barbara Keys, University of Melbourne

This article is part of our series of explainers on key moments in the past 100 years of world political history. In it, our authors examine how and why an event unfolded, its impact at the time, and its relevance to politics today.


At 8:46am on a sunny Tuesday morning in New York City, a commercial jet plane flew into the North Tower of the World Trade Centre, cutting through floors 93 to 99.

As the news was beamed around the world, shaken reporters wondered whether the crash had been an accident or an act of terrorism. At 9:03am, viewers watching the smoke billowing from the gash in the building were stunned to see a second jet plane dart into view and fly directly into the South Tower. Suddenly, it was clear that the United States was under attack.

The scale of the assault became apparent about 40 minutes later, when a third jet crashed into the Pentagon. Not long after, in the fourth shock of the morning, the South Tower of the World Trade Centre unexpectedly crumbled to the ground in a few seconds, its structural integrity destroyed by the inferno set off by the plane’s thousands of gallons of jet fuel. Its twin soon succumbed to the same fate.

Firefighters on the scene after the 9/11 attacks.
Mike Goad/Wikicommons

What happened?

Over the next days and weeks, the world learned that 19 militants belonging to the Islamic terrorist group al Qaeda, armed with box cutters and knives missed by airport security, had hijacked four planes.

Three hit their targets. The fourth, intended for the White House or the Capitol, crashed in a field in Pennsylvania when passengers, who had learned of the other attacks, struggled for control of the plane. All told, close to 3,000 people were killed and 6,000 were injured.

Immediate impact of the attacks

The events of 9/11 seared the American psyche. A country whose continental states had not seen a major attack in nearly 200 years was stunned to find that its financial and military centres had been hit by a small terrorist group based thousands of miles away. More mass attacks suddenly seemed not just probable but inevitable.




Read more: How the pain of 9/11 still stays with a generation


The catastrophe set in motion a sequence of reactions and unintended consequences that continue to reverberate today. Its most lasting and consequential effects are interlinked: a massively expensive and unending “war on terror”, heightened suspicion of government and the media in many democratic countries, a sharp uptick in Western antagonism toward Muslims, and the decline of US power alongside rising international disorder – developments that aided the rise of Donald Trump and leaders like him.

War without end?

Just weeks after 9/11, the administration of US President George W. Bush invaded Afghanistan with the aim of destroying al Qaeda, which had been granted safe haven by the extremist Taliban regime. With the support of dozens of allies, the invasion quickly toppled the Taliban government and crippled al Qaeda. But it was not until 2011, under President Barack Obama, that US forces found and killed al Qaeda’s leader and 9/11 mastermind – Osama bin Laden.

American soldiers in Afghanistan, 2001.
Marine Corps New York/Flickr, CC BY

Though there have been efforts to end formal combat operations since then, over 10,000 US troops remain in Afghanistan today, fighting an intensifying Taliban insurgency. It is now the longest war the United States has fought. Far from being eradicated, the Taliban is active in most of the country. Even though the war’s price tag is nearing a trillion dollars, domestic pressure to end the war is minimal, thanks to an all-volunteer army and relatively low casualties that make the war seem remote and abstract to most Americans.

Even more consequential has been the second major armed conflict triggered by 9/11: the US-led invasion of Iraq in 2003. Although Iraqi dictator Saddam Hussein was not linked to 9/11, officials in the administration of George W. Bush were convinced his brutal regime was a major threat to world order. That conviction rested largely on Saddam Hussein’s past aggression, his willingness to defy the United States, and his aspirations to build or expand nuclear, chemical and biological weapons programs, which made it seem likely that he would help groups planning terrorist attacks on the West.

The invading forces quickly ousted Saddam, but the poorly executed, error-ridden occupation destabilised the entire region.

In Iraq, it triggered a massive, long-running insurgency. In the Middle East more broadly, it boosted Iran’s regional influence, fostered the rise of the Islamic State, and created lasting disorder that has led to civil wars, countless terrorist attacks, and radicalisation.

In many parts of the world, the war fuelled anti-Americanism; in Europe, public opinion about the war set in motion a widening estrangement between the United States and its key European allies.

Monetary and social costs

Today, the United States spends US$32 million every hour on the wars fought since 9/11. The total cost is over US$5.6 trillion. The so-called war on terror has spread into 76 countries where the US military is now conducting counter-terror activities, ranging from drone strikes to surveillance operations.

The mind-boggling sums have been financed by borrowing, which has increased social inequality in the United States. Some observers have suggested that government war spending was even more important than financial deregulation in causing the 2007-2008 Global Financial Crisis.

Eroding democracy

The post-9/11 era has eroded civil liberties across the world. Many governments have cited the urgent need to prevent future attacks as justification for increased surveillance of citizens, curbing of dissent, and enhanced capacity to detain suspects without charge.

The well-publicised missteps of the FBI and the CIA in failing to detect and prevent the 9/11 plot, despite ample warnings, fed public distrust of intelligence and law enforcement agencies. Faulty intelligence about what turned out to be nonexistent Iraqi “weapons of mass destruction” (WMDs) undermined public confidence not only in the governments that touted those claims but also in the media for purveying false information.

The result has been a climate of widespread distrust of the voices of authority. In the United States and in other countries, citizens are increasingly suspicious of government sources and the media — at times even questioning whether truth is knowable. The consequences for democracy are dire.

Increasing Islamophobia

Across the West, 9/11 also set off a wave of Islamophobia. Having fought a decades-long Cold War not long before, Americans framed the attack as a struggle of good versus evil, casting radical Islam as the latest enemy. In many countries, voices in the media and in politics used the extremist views and actions of Islamic terrorists to castigate Muslims in general. Since 9/11, Muslims in the United States and elsewhere have experienced harassment and violence.

Cartoon highlighting Islamophobia in Europe.
Carlos Latuff/Flickr, CC BY-SA

In Western countries, Muslims are now often treated as the most significant public enemy. European populists have risen to power by denouncing refugees from Muslim majority countries like Syria, and the willingness and ability of Muslims to assimilate is viewed with increasing scepticism.

A week after his inauguration, US President Donald Trump kept a campaign promise by signing the so-called “Muslim ban”, designed to prevent citizens of six Muslim-majority countries from entering the United States.

Following attacks

One of the most widely expected consequences of 9/11 has so far been averted. Though Islamic terrorists have engaged in successful attacks in the West since 9/11, including the 2002 Bali bombings, the 2004 Madrid train bombings, and the 2015 attacks in Paris, there has been no attack on the scale of 9/11. Instead, it is countries with large Muslim populations that have seen a rise in terrorist attacks.

Yet the West still pays the price for its militant and militarised response to terrorism through the weakening of democratic norms and values. The unleashing of US military power that was supposed to intimidate terrorists has diminished America’s might, creating a key precondition for Donald Trump’s promise to restore American greatness.

Although many of the issues confronting us today have very long roots, the world we live in has been indelibly shaped by 9/11 and its aftermath.

Barbara Keys, Associate Professor of US and International History, University of Melbourne

This article is republished from The Conversation under a Creative Commons license. Read the original article.

