Tag Archives: USA

Slavery Begins in America



Who won the war? We did, says everyone



Winston Churchill, Joseph Stalin and Franklin D Roosevelt at the Yalta Conference, 1945.
Wikipedia

Nick Chater, Warwick Business School, University of Warwick

Ask any of the few remaining World War II veterans what they did during the war and you’re likely to get a humble answer. But ask the person on the street how important their country’s contribution to the war effort was and you’ll probably hear something far less modest. A new study suggests people from Germany, Russia, the UK and the US on average all think their own country shouldered more than half the burden of fighting World War II.

Our national collective memories seem to be deceiving us, and this is part of a far more general pattern. Aside from those veterans who have no desire to revel in the horrors of war, we may have a general psychological tendency to believe our contributions are more significant than they really are.

You can see this in even the most mundane of tasks. Unloading the dishwasher can be a perennial source of family irritation. I suspect that I’m doing more than my fair share. The trouble is that so does everybody else. Each of us can think: “The sheer injustice! I’m overworked and under-appreciated.”

But we can’t all be right. This strange magnification of our own efforts seems to be ubiquitous. In business, sport or entertainment, it’s all too easy for each participant to think that their own special stardust is the real reason their company, team or show was a hit.

It works for nations, too. A study last year, led by US memory researcher Henry Roediger III, asked people from 35 countries for the percentage contribution their own nation had made to world history. A dispassionate judge would, of course, assign percentages that add up to no more than 100% (and, indeed, considerably less, given the 160 or so countries left out). In fact, the self-ratings add up to over 1,000%, with citizens of India, Russia and the UK each suspecting on average that their own nations had more than half the responsibility for world progress.

A sceptic might note that “contributing to world history” is a rather nebulous idea, which each nation can interpret to its advantage. (The Italians, at 40%, might focus on the Romans and the Renaissance, for example.) But what about our responsibility for specific world events? The latest study from Roediger’s lab addresses the question of national contributions to World War II.

The researchers surveyed people from eight former Allied countries (Australia, Canada, China, France, New Zealand, Russia/USSR, the UK and the US) and three former Axis powers (Germany, Italy and Japan). As might be expected, people from the winning Allied side ranked their own countries highly, and the average percentage responses added up to 309%. Citizens of the UK, US and Russia all believed their countries had contributed more than 50% of the war effort and were more than 50% responsible for victory.
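The arithmetic behind that conclusion is worth making explicit. Here is a minimal sketch in Python; the per-country figures are hypothetical, chosen only so that they total roughly the 309% reported for the Allied respondents, since the study’s individual averages are not listed here. If each nation’s average self-rated share were accurate, the shares could sum to at most 100%, so a total of around 309% is direct evidence of collective over-claiming.

```python
# Hypothetical average self-rated shares of the Allied war effort (percent).
# Illustrative only: the values are chosen to sum to ~309%, the total
# reported in the study, and are not the study's actual per-country figures.
avg_self_rated_share = {
    "Russia/USSR": 60, "US": 54, "UK": 51, "China": 40,
    "France": 35, "Canada": 25, "Australia": 22, "New Zealand": 22,
}

total = sum(avg_self_rated_share.values())
print(f"Sum of average self-rated shares: {total}%")        # 309%
print(f"Excess over a consistent 100%:    {total - 100}%")  # 209 points of over-claiming
```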

World War II deaths by country. How would you work out which country contributed the most?
Dna-Dennis/Wikimedia Commons

You might suspect that the losing Axis powers, whose historical record is inextricably tied to the immeasurable human suffering of the war, might not be so proud. As former US president John F Kennedy said (echoing the Roman historian Tacitus): “Victory has a hundred fathers and defeat is an orphan.” Perhaps the results for the Allied countries just reflect a general human tendency to claim credit for positive achievements. Yet citizens of the three Axis powers also over-claim shares of the war effort (totalling 140%). Rather than minimising their own contribution, even defeated nations seem to overstate their role.

Why? The simplest explanation is that we piece together answers to questions, of whatever kind, by weaving together whatever relevant snippets of information we can bring to mind. And the snippets of information that come to mind will depend on the information we’ve been exposed to through our education and cultural environment. Citizens of each nation learn a lot more about their country’s own war effort than those of other countries. These “home nation” memories spring to mind, and a biased evaluation is the inevitable result.

So there may not be inherent “psychological nationalism” in play here. And nothing special about collective, rather than individual, memory either. We simply improvise answers, perhaps as honestly as possible, based on what our memory provides – and our memory, inevitably, magnifies our own (or our nation’s) efforts.

How do you calculate real responsibility?

A note of caution is in order. Assigning responsibilities for past events baffles not just everyday citizens but also academic philosophers. Imagine a whodunit in which two hopeful murderers put lethal doses of cyanide into Lady Fotherington’s coffee. Each might say: “It’s not my fault – she would have died anyway.” Is each only “half” to blame, and hence due a reduced sentence? Or are they both 100% culpable? This poisoning is a simple matter compared with the tangled causes of military victory and defeat. So it is not entirely clear what even counts as over- or under-estimating our responsibilities, because responsibility is so difficult to assess.

Still, the tendency to overplay our own and our nation’s role in just about anything seems all too plausible. We see history through a magnifying glass that is pointing directly at ourselves. We learn the most about the story of our own nation. So our home nation’s efforts and contributions inevitably spring readily to mind (military and civilian deaths, key battles, advances in technology and so on). The efforts and contributions of other nations are sensed more dimly, and often not at all.

And the magnifying glass over our efforts is pervasive in daily life. I can find myself thinking irritably, as I unload the dishwasher, “Well, I don’t even remember the last time you did this!” But of course not. Not because you didn’t do it, but because I wasn’t there.

Nick Chater, Professor of Behavioural Science, Warwick Business School, University of Warwick

This article is republished from The Conversation under a Creative Commons license. Read the original article.


How the 1869 Cincinnati Red Stockings turned baseball into a national sensation



A drawing from Harper’s Weekly depicts a game between the Red Stockings and the Brooklyn Atlantics.
New York Public Library

Robert Wyss, University of Connecticut

This Major League Baseball season, fans may notice a patch on the players’ uniforms that reads “MLB 150.”

The logo commemorates the Cincinnati Red Stockings, who, in 1869, became the first professional baseball team – and went on to win an unprecedented 81 straight games.

As the league’s first openly salaried club, the Red Stockings made professionalism – which had been previously frowned upon – acceptable to the American public.

But the winning streak was just as pivotal.

“This did not just make the city famous,” John Thorn, Major League Baseball’s official historian, said in an interview for this article. “It made baseball famous.”

Pay to play?

In the years after the Civil War, baseball’s popularity exploded, and thousands of American communities fielded teams. Initially most players were gentry – lawyers, bankers and merchants whose wealth allowed them to train and play as a hobby. The National Association of Base Ball Players banned the practice of paying players.

At the time, the concept of amateurism was especially popular among fans. Inspired by classical ideas of sportsmanship, its proponents argued that playing sport for any reason other than the love of the game was immoral, even corrupt.

Nonetheless, some of the major clubs in the East and Midwest began disregarding the rule prohibiting professionalism and secretly hired talented young working-class players to get an edge.

After the 1868 season, the national association reversed its position and sanctioned the practice of paying players. The move recognized the reality that some players were already getting paid, and that was unlikely to change because professionals clearly helped teams win.

Yet the taint of professionalism restrained virtually every club from paying an entire roster of players.

The Cincinnati Red Stockings, however, became the exception.

The Cincinnati experiment

In the years after the Civil War, Cincinnati was a young, growing, grimy city.

The city had experienced an influx of German and Irish immigrants who toiled in the multiplying slaughterhouses. The stench of hog flesh wafted through the streets, while the black fumes of steamboats, locomotives and factories lingered over the skyline.

Nonetheless, money was pouring into the coffers of the city’s gentry. And with prosperity, the city sought respectability; it wanted to be as significant as the big cities that ran along the Atlantic seaboard – New York, Philadelphia and Baltimore.

Men slaughter hogs on an assembly line in a Cincinnati meat packing plant.
Library of Congress

Cincinnati’s main club, the Red Stockings, was run by an ambitious young lawyer named Aaron Champion. Prior to the 1869 season, he budgeted US$10,000 for his payroll and hired Harry Wright to captain and manage the squad. Wright was lauded later in his career as a “baseball Edison” for his ability to find talent. But the best player on the team was his 22-year-old brother, George, who played shortstop. George Wright would end up finishing the 1869 season with a .633 batting average and 49 home runs.

Only one player hailed from Cincinnati; the rest had been recruited from other teams around the nation. Wright had hoped to attract the top player in the country for each position. He didn’t quite get the best of the best, but the team was loaded with stars.

As the season began, the Red Stockings and their new salaries attracted little press attention.

“The benefits of professionalism were not immediately recognized,” Greg Rhodes, a co-author of “Baseball Revolutionaries: How the 1869 Red Stockings Rocked the Country and Made Baseball Famous,” told me. “So the Cincinnati experiment wasn’t seen as all that radical.”

The Red Stockings opened the season by winning 45 to 9. They kept winning and winning and winning – huge blowouts.

At first only the Cincinnati sports writers had caught on that something special was going on. Then, in June, the team took its first road trip east. Playing in hostile territory against what were considered the best teams in baseball, they were also performing before the most influential sports writers.

The pivotal victory was a tight 4-to-2 win over a club many considered the best in baseball, the powerful New York Mutuals, in a game played with Tammany Hall “boss” William Tweed watching from the stands.

Now the national press was paying attention. The Red Stockings continued to win, and, by the conclusion of the road trip in Washington, they were puffing stogies at the White House with their host, President Ulysses Grant.

The players chugged home in a boozy, satisfied revel and were met by 4,000 joyous fans at Cincinnati’s Union Station.

American idols

The Red Stockings had become a sensation. They were profiled in magazines and serenaded in sheet music. Ticket prices doubled to 50 cents. They drew such huge crowds that during a game played outside of Chicago, an overloaded bleacher collapsed.

Aaron Champion’s squad averaged 42 runs a game in the 1869 season.
From the collection of Greg Rhodes, Author provided

Most scores were ridiculously lopsided; during the 1869 season the team averaged 42 runs a game. Once they even scored 103. The most controversial contest was in August against the Haymakers of Troy, New York. The game was rife with rumors of $17,000 bets, and bookmakers bribing umpires and players. The game ended suspiciously at 17 to 17, when the Haymakers left the field in the sixth inning, incensed by an umpire’s call. The Red Stockings were declared the winners.

The season climaxed with a road trip west on the new transcontinental railroad, which had just opened in May. The players, armed with rifles, shot from the train windows at bison, antelope and even prairie dogs, and slept in wooden sleeping cars lighted with whale oil. More than 2,000 excited baseball fans greeted the team in San Francisco, where admission to games was one dollar in gold.

Cincinnati ended its season with an undefeated record: 57 wins, 0 losses. The nation’s most prominent sports writer of the day, Henry Chadwick, declared them “champion club of the United States.”

Despite fears that other clubs would outbid Cincinnati for their players, every Red Stockings player demonstrated his loyalty by signing a contract to return for the 1870 season.

The demise begins

The winning streak continued into the next season – up until a June 14, 1870, game against the Brooklyn Atlantics.

An error by second baseman Charles Sweasy ended the Red Stockings’ historic streak.
From the collection of John Thorn, Author provided

After nine innings, the teams were tied at 5. Under the era’s rules, the game could have been declared a draw, leaving the streak intact. Instead Harry Wright opted to continue, and the Red Stockings ended up losing in extra innings after an error by the second baseman, Charlie Sweasy.

The 81-game win streak had ended.

The Red Stockings did not return in 1871. Ticket sales had fallen after their first loss, and other teams began to outbid the Red Stockings for their star players. Ultimately the cost of retaining all of its players was more than the Cincinnati club could afford.

Yet the team had made its mark.

“It made baseball from something of a provincial fare to a national game,” Thorn explained.

A few years later, in 1876, the National League was founded and still exists today. The Cincinnati Reds were a charter member. And not surprisingly, some of the biggest 150-year celebrations of the first professional baseball team are occurring in the town they once called Porkopolis.

Robert Wyss, Professor of Journalism, University of Connecticut

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A brief history of presidential lethargy



A television set turned on in the West Wing of the White House.
AP Photo/Susan Walsh

Stacy A. Cordery, Iowa State University

No one doubts the job of president of the United States is stressful and demanding. The chief executive deserves downtime.

But how much is enough, and when is it too much?

These questions came into focus after Axios’ release of President Donald Trump’s schedule. The hours blocked off for nebulous “executive time” seem, to many critics, disproportionate to the number of scheduled working hours.

While Trump’s workdays may ultimately prove to be shorter than those of past presidents, he’s not the first to face criticism. For every president praised for his work ethic, there’s one disparaged for sleeping on the job.

Teddy Roosevelt, locomotive president

Before Theodore Roosevelt ascended to the presidency in 1901, the question of how hard a president toiled was of little concern to Americans.

Except in times of national crisis, his predecessors neither labored under the same expectations, nor faced the same level of popular scrutiny. Since the country’s founding, Congress had been the main engine for identifying national problems and outlining legislative solutions. Congressmen were generally more accessible to journalists than the president was.

Teddy Roosevelt’s activist approach to governing shifted the public’s expectations for the president.
Library of Congress

But when Roosevelt shifted the balance of power from Congress to the White House, he created the expectation that an activist president, consumed by affairs of state, would work endlessly in the best interests of the people.

Roosevelt, whom Sen. Joseph Foraker called a “steam engine in trousers,” personified the hard-working chief executive. He filled his days with official functions and unofficial gatherings. He asserted his personality on policy and stamped the presidency firmly on the nation’s consciousness.

Taft had a tough act to follow

His successor, William Howard Taft, suffered by comparison. While it’s fair to observe that nearly anyone would have looked like a slacker compared with Roosevelt, it didn’t help that Taft weighed 300 pounds, which his contemporaries equated with laziness.

Taft’s girth only added to the perception that he lacked Roosevelt’s vigor.
Library of Congress

Taft helped neither his cause nor his image when he snored through meetings, at evening entertainments and, as author Jeffrey Rosen noted, “even while standing at public events.” Watching Taft’s eyelids close, Sen. James Watson said to him, “Mr. President, you are the largest audience I ever put entirely to sleep.”

An early biographer called Taft “slow-moving, easy-going if not lazy” with “a placid nature.” Others have suggested that Taft’s obesity caused sleep apnea and daytime drowsiness, a finding not inconsistent with historian Lewis L. Gould’s conclusion that Taft was capable of work “at an intense pace” and “a high rate of efficiency.”

It seems that Taft could work quickly, but in short bursts.

Coolidge the snoozer

Other presidents were more intentional about their daytime sleeping. Calvin Coolidge’s penchant for hourlong naps after lunch earned him amused scorn from contemporaries. But when he missed his nap, he fell asleep at afternoon meetings. He even napped on vacation. Tourists stared in amazement as the president, blissfully unaware, swayed in a hammock on his front porch in Vermont.

This, for many Republicans, wasn’t a problem: The Republican Party of the 1920s was averse to an activist federal government, so the fact that Coolidge wasn’t seen as a hard-charging, incessantly busy president was fine.

Biographer Amity Shlaes wrote that “Coolidge made a virtue of inaction” while simultaneously exhibiting “a ferocious discipline in work.” Political scientist Robert Gilbert argued that after Coolidge’s son died during his first year as president, Coolidge’s “affinity for sleep became more extreme.” Grief, according to Gilbert, explained his growing penchant for slumbering, which expanded into a pre-lunch nap, a two- to four-hour post-lunch snooze and 11 hours of shut-eye nightly.

For Reagan, the jury’s out

Ronald Reagan may have had a tendency to nod off.

“I have left orders to be awakened at any time in case of a national emergency – even if I’m in a cabinet meeting,” he joked. Word got out that he napped daily, and historian Michael Schaller wrote in 1994 that Reagan’s staff “released a false daily schedule that showed him working long hours,” labeling his afternoon nap “personal staff time.” But some family members denied that he napped in the White House.

Journalists were divided. Some found him “lazy, passive, stupid or even senile” and “intellectually lazy … without a constant curiosity,” while others claimed he was “a hard worker,” who put in long days and worked over lunch. Perhaps age played a role in Reagan’s naps – if they happened at all.

Clinton crams in the hours

One president not prone to napping was Bill Clinton. Frustrated that he could not find time to think, Clinton ordered a formal study of how he spent his days. His ideal was four hours in the afternoon “to talk to people, to read, to do whatever.” Sometimes he got half that much.

Two years later, a second study found that, during Clinton’s 50-hour workweek, “regularly scheduled meetings” took up 29 percent of his time, “public events, etc.” made up 36 percent of his workday, while “thinking time – phone & office work” constituted 35 percent of his day. Unlike presidents whose somnolence drew sneers, Clinton was disparaged for working too much and driving his staff to exhaustion with all-nighters.

Partisanship at the heart of criticism?

The work of being president of the United States never ends. There is always more to be done. Personal time may be a myth, as whatever the president reads, watches or does can almost certainly be applied to some aspect of the job.

Trump’s “executive time” could be a rational response to the demands of the job or life circumstances. Trump, for example, reportedly sleeps only four or five hours a night, which would give him more waking hours to tackle his daily duties than the rest of us have.

But, as with his predecessors, the appearance of taking time away from running the country will garner criticism. Though they can sometimes catch 40 winks, presidents can seldom catch a break.

Stacy A. Cordery, Professor of History, Iowa State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A Confederate statue graveyard could help bury the Old South



A damaged Confederate statue lies on a pallet in a warehouse in Durham, N.C. on Tuesday, Aug. 15, 2017, after protesters yanked it off its pedestal in front of a government building.
AP Photo/Allen Breed

Jordan Brasher, University of Tennessee and Derek H. Alderman, University of Tennessee

An estimated 114 Confederate symbols have been removed from public view since 2015. In many cases, these cast-iron Robert E. Lees and Jefferson Davises were sent to storage.

If the aim of statue removal is to build a more racially just South, then, as many analysts have pointed out, putting these monuments in storage is a lost opportunity. Simply unseating Confederate statues from highly visible public spaces is just the first step in a much longer process of understanding, grieving and mending the wounds of America’s violent past. Merely hiding away the monuments does not necessarily change the structural racism that birthed them.

Studies show that the environment in which statues are displayed shapes how people understand their meaning. In that sense, relocating monuments, rather than eliminating them, can help people put this painful history into context.

For example, monuments to Confederate war heroes first appeared in cemeteries immediately following the Civil War. That likely evoked in visitors a direct and private honoring and grieving for the dead.

By the early 1900s, hundreds of Confederate statues dotted courthouse lawns and town squares across the South. This prominent, centrally located setting on government property sent an intentionally different message: that local officials endorsed the prevailing white social order.

So what should we do with rejected Confederate monuments? We have a modest proposal: a Confederate statue graveyard.

Lessons from the Soviet past

Our research as cultural geographers recognizes that Confederate monument controversies – while typically considered regional or national issues – are in fact part of global struggles to recognize and heal from the wounds of racism, white supremacy and anti-democratic regimes.

The idea of a Confederate monument graveyard is modeled after ways that the former communist bloc nations of Hungary, Lithuania and Estonia have dealt with statues of Soviet heroes like Joseph Stalin and Vladimir Lenin.

Under communist Soviet rule between 1945 and 1991, Eastern European countries suffered mass starvation, land theft, military rule and rigid censorship. An estimated 15 million people in the Soviet bloc died during this totalitarian reign.

Despite these horrors, many countries have opted not to destroy or hide their Soviet-era monuments, but they haven’t left them to rule over city hall or public plazas, either.

Rather, governments in Eastern Europe have altered the meaning of these politically charged Soviet statues by relocating them. Dozens of Soviet statues across Hungary, Lithuania and Estonia have been pulled from their pedestals and placed in open-air parks, where interested visitors can reflect on their new significance.

The idea behind relocating monuments is to dethrone dominant historical narratives that, in their traditional places of power, are tacitly endorsed.

A statue graveyard

The Eastern European effort to create a new memorial landscape has been met with mixed public reaction.

In Hungary, some see it as a step in the right direction. But, in Lithuania, people have expressed that re-erecting the statues of known dictators is in “poor taste” – an affront to those who suffered under totalitarianism.

The relocation of Soviet statues in Estonia has taken an even more interesting turn.

For the past decade, the Estonian History Museum has been collecting former Soviet monuments with the intention of making an outdoor exhibition out of them. For years it kept a decapitated Lenin and a noseless Stalin, among other degraded Soviet relics, in a field next to the museum.

The statues weathered Eastern European winters and languished in a defunct, toppled state. Weeds grew over them. The elements took their toll.

Travel writer Michael Turtle, who visited the museum in 2015, called the field a “statue graveyard.”

“Everything here seems to fit into some kind of purgatorial limbo,” he wrote on his blog. “The statues are not respected enough to be displayed as history but are culturally significant enough to not just be destroyed.”

To this we would add that these old statues, when repurposed thoughtfully and intentionally, have the potential to mend old wounds.

Confederate monument graveyard

What if the United States created its own graveyard for the distasteful relics of its racist past?

We envision a cemetery for the American South where removed Confederate statues would be displayed, perhaps, in a felled position – a visual condemnation of the white supremacy they fought to uphold. Already crumpled monuments, like the statue to “The Boys Who Wore Grey” that was forcefully removed from downtown Durham, North Carolina, might be placed in the Confederate statue graveyard in their defunct state.

One art critic has even suggested that old monuments be physically buried under tombstones with epitaphs written by the descendants of those they enslaved.

We are not the first to suggest relocating Confederate statues.

Democratic presidential candidate Elizabeth Warren, for example, has proposed that toppled Confederate statues be housed in a history museum – “where they belong.”

That has proven challenging for curators.

When The University of Texas moved a statue of the Confederate President Jefferson Davis from its pedestal on campus to a campus museum, some students criticized the ensuing exhibit’s “lack of focus on racism and slavery.” One suggested that the statue’s new setting inadvertently glorified Davis, given the inherent value conferred on objects in museums.

And since statues in museums are typically exhibited in their original, upright position, Confederate generals like Robert E. Lee still tower over visitors – maintaining an imposing sense of authority.

We believe felled and crumpled monuments, in contrast, would create a somber commemorative atmosphere that encourages visitors to grieve – without revering – their legacy. A carefully planned and aesthetically sensitive Confederate monument graveyard could openly and purposefully undermine the power these monuments once held, acknowledging, dissecting and ultimately rejecting the Confederacy’s roots in slavery.

Planning a Confederate monument graveyard will prompt many questions. Where should it be located? Will there be one central Confederate monument graveyard or many? Who will design and plan the graveyard?

Answering these questions would not just be part of a conversation about steel and stone but about the serious pursuit of peace, justice and racial healing in the nation — and about putting the Old South to rest.

Jordan Brasher and Derek H. Alderman are members of the American Association of Geographers. The association is a funding partner of The Conversation US.

Jordan Brasher, Doctoral Candidate in Geography, University of Tennessee and Derek H. Alderman, Professor of Geography, University of Tennessee

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Gulf War



An archaeologist in Alaska: how working with a Yup’ik community transformed my view of heritage



Alice Watterson, CC BY-SA

Alice Watterson, University of Dundee

In spite of the gore, I sit completely transfixed by the deft movements of Sarah’s hands as she butchers a young spotted seal laid out on a strip of cardboard on the floor. “Can I help with anything?” I ask. She laughs as she separates out the meat from the fat and the fat from the skin and suggests that I can do the dishes if I like.

This is my third trip to the village of Quinhagak on the western coast of Alaska, surrounded by a landscape of vast expanses of tundra and an intertwined tangle of lakes and rivers which feed into the Bering Sea.


Google Maps

The Yukon-Kuskokwim Delta region of Alaska is home to the Yup’ik people, who practise a largely subsistence lifestyle characterised by seasonal hunting, fishing and gathering. An archaeological dig happens here each year over the brief summer season between July and August, although seasonal changes, once like clockwork, are becoming less distinct because of climate disruption.

Local woman Mary Church hangs up salmon strips to dry.
Willard Church., CC BY-SA

Discovering Nunalleq

The dig, which goes back nearly a decade, was initiated by the local community with the aim of rescuing the remains of an old sod house in a nearby area known as Nunalleq, or “the old village”, before it is lost to permafrost melt and a crumbling coastline. The site dates from between 1570 and 1675, decades before Yup’ik first came into contact with Russian and European traders.

The archaeological excavations at Nunalleq.
Nunalleq Archaeology Project

The excavations, led by a team from Aberdeen University in Scotland, were well underway by the time I joined in 2017. The project has recovered some 100,000 artefacts which were put on public display for the first time in August 2018 at the Nunalleq Culture and Archaeology Centre in Quinhagak.

As a reconstruction artist, it is my job to translate archaeological findings into renderings of life in the past. In Quinhagak, I was tasked with collaborating with the local community to co-design a digital resource for schoolchildren. It tells the story of the excavations in a way that makes space for the traditional Yup’ik worldview and contemporary parallels in subsistence, dance and crafts.

A reconstructed interior of the sod house.
Alice Watterson with characters by Tom Paxton

Within this resource, 3D-scanned artefacts and animated reconstructions of village life at Nunalleq can be explored on a computer screen, accompanied by soundbites, videos and interactive content co-curated by the Quinhagak community and the archaeologists. It will be available to the public from July 2019.

3D scanned artefacts, narrated by local community and archaeologists.
Alice Watterson with characters by Tom Paxton

The purpose of my trip in April 2019 was to test the resource on school computers in the village. This trip was outside the usual dig season so I stayed with a local family. My host was schoolteacher Dora Strunk, who was raised in Quinhagak and whose children belong to a generation in the village who grew up with the archaeology project.

Whether it was bouncing across the tundra on Dora’s four-wheeler to collect kapuukuq greens, or sitting in her daughter Larissa’s bedroom listening to her explain the meaning behind her traditional dance regalia, these moments and friendships have gradually reshaped my own understanding of what it means to be Yup’ik in the 21st century.

What heritage means

I’ve heard objections to the collection being housed in the village: shouldn’t it be in a big museum in Anchorage or New York where more people can see it, “for the greater good”?

What I have learned during my visits here is that there is a need to maintain heritage within a community – and to allow it to be part of the here and now. Heritage is often seen as being focused on fragmented artefacts and ruinous buildings, but for many people, particularly indigenous and descendant communities, it can be intrinsically connected to a sense of social identity and cohesion.

Children gather for the annual show-and-tell to see the finds excavated after each dig season.
Nunalleq Archaeology Project

Like many indigenous communities across the world, Yup’ik are still dealing with the effects of deep historic trauma from centuries of colonisation, exploitation and misrepresentation. Yet unlike the majority of Native Americans in the lower 48 states, Alaska Native land isn’t consolidated into Indian reservations. People still traverse the vast expanse of tundra and coastline like their ancestors did thousands of years ago.

That said, maintaining this connection to land and tradition does not constitute a bygone era. Yup’ik is a living culture fully part of the modern world, with Snapchat and drum dances, microwave pizza and walrus ivory carving, snow machines and subsistence practices – even Facebook feeds filled with Yup’ik memes. Culture persists.

Establishing the Nunalleq centre in Quinhagak and helping to create the digital resource with the Yup’ik community is part of the same mindset that is prompting a handful of museums to repatriate artefacts and remains to descendant communities – while others come under mounting pressure to do so.

Here and now

My latest trip coincided with the district’s annual dance festival, which brought together schools from across the region. The previous summer I had worked with young people from the local group, who chose to interpret the excavation of dance regalia from Nunalleq by writing a new traditional drum song, or yuraq, about the site. They performed it again at this year’s festival.

During the festival, many youngsters came to the museum to see the artefacts. I witnessed teenagers pulling open drawers containing wooden dance masks, drum rings, ivory earrings, bentwood bowls and harpoons with trembling hands. Big kids lifted little kids up to peer into the cabinets and gasp, asking: “These all came from down there? From our beach?”

Volunteer Rufus holds out an ivory toggle he excavated from Nunalleq.
Nunalleq Archaeology Project

The “greater good” is right here: not only the collection being housed in Quinhagak, but also the work the village is doing to take charge of its story and share it with the wider world through outreach like the Nunalleq Educational Resource.

For Quinhagak, the past is not a place independent of the present. For the younger generation especially, the past is becoming a space for engaging with their heritage, which they are continually transforming and reimagining in the present.

Alice Watterson, Post-doctoral Research Assistant, 3DVisLab, University of Dundee

This article is republished from The Conversation under a Creative Commons license. Read the original article.


American War of Independence



Why Russia Sold Alaska



Hawaii


