
World politics explainer: The twin-tower bombings (9/11)



South Tower being hit during the 9/11 attacks. The events of September 11, 2001 have significantly shaped American attitudes and actions towards fighting terrorism, surveilling citizens and othering outsiders.
NIST SIPA/Wikicommons

Barbara Keys, University of Melbourne

This article is part of our series of explainers on key moments in the past 100 years of world political history. In it, our authors examine how and why an event unfolded, its impact at the time, and its relevance to politics today.


At 8:46am on a sunny Tuesday morning in New York City, a commercial jet plane flew into the North Tower of the World Trade Centre, cutting through floors 93 to 99.

As the news was beamed around the world, shaken reporters wondered whether the crash had been an accident or an act of terrorism. At 9:03am, viewers watching the smoke billowing from the gash in the building were stunned to see a second jet plane dart into view and fly directly into the South Tower. Suddenly, it was clear that the United States was under attack.

The scale of the assault became apparent about 40 minutes later, when a third jet crashed into the Pentagon. Not long after, in the fourth shock of the morning, the South Tower of the World Trade Centre unexpectedly crumbled to the ground in a few seconds, its structural integrity destroyed by the inferno set off by the plane’s thousands of gallons of jet fuel. Its twin soon succumbed to the same fate.

Firefighters on the scene after the 9/11 attacks.
Mike Goad/Wikicommons

What happened?

Over the next days and weeks, the world learned that 19 militants belonging to the Islamic terrorist group, al Qaeda, armed with box cutters and knives missed by airport security, had hijacked four planes.

Three hit their targets. The fourth, intended for the White House or the Capitol, crashed in a field in Pennsylvania when passengers, who had learned of the other attacks, struggled for control of the plane. All told, close to 3,000 people were killed and 6,000 were injured.

Immediate impact of the attacks

The events of 9/11 seared the American psyche. A country whose continental states had not seen a major attack in nearly 200 years was stunned to find that its financial and military centres had been hit by a small terrorist group based thousands of miles away. More mass attacks suddenly seemed not just probable but inevitable.




The catastrophe set in motion a sequence of reactions and unintended consequences that continue to reverberate today. Its most lasting and consequential effects are interlinked: a massively expensive and unending “war on terror”, heightened suspicion of government and the media in many democratic countries, a sharp uptick in Western antagonism toward Muslims, and the decline of US power alongside rising international disorder – developments that aided the rise of Donald Trump and leaders like him.

War without end?

Just weeks after 9/11, the administration of US President George W. Bush invaded Afghanistan with the aim of destroying al Qaeda, which had been granted safe haven by the extremist Taliban regime. With the support of dozens of allies, the invasion quickly toppled the Taliban government and crippled al Qaeda. But it was not until 2011, under President Barack Obama, that US forces found and killed al Qaeda’s leader and 9/11 mastermind – Osama bin Laden.

American soldiers in Afghanistan, 2001.
Marine Corps New York/Flickr, CC BY

Though there have been efforts to end formal combat operations since then, over 10,000 US troops remain in Afghanistan today, fighting an intensifying Taliban insurgency. It is now the longest war the United States has fought. Far from being eradicated, the Taliban is active in most of the country. Even though the war’s price tag is nearing a trillion dollars, domestic pressure to end the war is minimal, thanks to an all-volunteer army and relatively low casualties that make the war seem remote and abstract to most Americans.

Even more consequential has been the second major armed conflict triggered by 9/11: the US-led invasion of Iraq in 2003. Although Iraqi dictator Saddam Hussein was not linked to 9/11, officials in the administration of George W. Bush were convinced his brutal regime was a major threat to world order. Their conviction rested largely on Saddam Hussein’s past aggression, his willingness to defy the United States, and his aspirations to build or expand nuclear, chemical and biological weapons programs, all of which made it seem likely that he would help groups planning terrorist attacks on the West.

The invading forces quickly ousted Saddam, but the poorly executed, error-ridden occupation destabilised the entire region.

In Iraq, it triggered a massive, long-running insurgency. In the Middle East more broadly, it boosted Iran’s regional influence, fostered the rise of the Islamic State, and created lasting disorder that has led to civil wars, countless terrorist attacks, and radicalisation.

In many parts of the world, the war fuelled anti-Americanism; in Europe, public opinion about the war set in motion a widening estrangement between the United States and its key European allies.

Monetary and social costs

Today, the United States spends US$32 million every hour on the wars fought since 9/11. The total cost is over US$5.6 trillion. The so-called war on terror has spread into 76 countries, where the US military is now conducting counter-terror activities ranging from drone strikes to surveillance operations.

The mind-boggling sums have been financed by borrowing, which has increased social inequality in the United States. Some observers have suggested that government war spending was even more important than financial deregulation in causing the 2007-2008 Global Financial Crisis.

Eroding democracy

The post-9/11 era has eroded civil liberties across the world. Many governments have cited the urgent need to prevent future attacks as justification for increased surveillance of citizens, curbing of dissent, and enhanced capacity to detain suspects without charge.

The well publicised missteps of the FBI and the CIA in failing to detect and prevent the 9/11 plot, despite ample warnings, fed public distrust of intelligence and law enforcement agencies. Faulty intelligence about what turned out to be nonexistent Iraqi “weapons of mass destruction” (WMDs) undermined public confidence not only in the governments that touted those claims but also in the media for purveying false information.

The result has been a climate of widespread distrust of the voices of authority. In the United States and in other countries, citizens are increasingly suspicious of government sources and the media — at times even questioning whether truth is knowable. The consequences for democracy are dire.

Increasing Islamophobia

Across the West, 9/11 also set off a wave of Islamophobia. Having fought a decades-long Cold War not long before, Americans framed the attack as a struggle of good versus evil, casting radical Islam as the latest enemy. In many countries, voices in the media and in politics used the extremist views and actions of Islamic terrorists to castigate Muslims in general. Since 9/11, Muslims in the United States and elsewhere have experienced harassment and violence.

Cartoon highlighting Islamophobia in Europe.
Carlos Latuff/Flickr, CC BY-SA

In Western countries, Muslims are now often treated as the most significant public enemy. European populists have risen to power by denouncing refugees from Muslim majority countries like Syria, and the willingness and ability of Muslims to assimilate is viewed with increasing scepticism.

A week after his inauguration, US President Donald Trump kept a campaign promise by signing the so-called “Muslim ban”, designed to prevent citizens of six Muslim-majority countries from entering the United States.

Following attacks

One of the most widely expected consequences of 9/11 has so far been averted. Though Islamic terrorists have engaged in successful attacks in the West since 9/11, including the 2002 Bali bombings, the 2004 Madrid train bombings, and the 2015 attacks in Paris, there has been no attack on the scale of 9/11. Instead, it is countries with large Muslim populations that have seen a rise in terrorist attacks.

Yet the West still pays the price for its militant and militarised response to terrorism through the weakening of democratic norms and values. The unleashing of US military power that was supposed to intimidate terrorists has diminished America’s might, creating a key precondition for Donald Trump’s promise to restore American greatness.

Although many of the issues confronting us today have very long roots, the world we live in has been indelibly shaped by 9/11 and its aftermath.

Barbara Keys, Associate Professor of US and International History, University of Melbourne

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Rome: City + Empire contains wonderful objects but elides the bloody cost of imperialism



Coins from the Hoxne Treasure, Hoxne, England, late 4th – early 5th century CE. Silver. 1994,0401.299.1-20.
© Trustees of the British Museum, 2018. All rights reserved

Caillan Davenport, Macquarie University and Meaghan McEvoy, Macquarie University

“What have the Romans ever done for us?” asks Reg from the People’s Front of Judaea in Monty Python’s comedy classic, Life of Brian. Rome: City + Empire, now showing at the National Museum of Australia, offers visitors a clear answer: they brought civilization.

This collection of more than 200 objects from the British Museum presents a vision of a vast Roman empire, conquered by emperors and soldiers, who brought with them wealth and luxury. Quotations from ancient authors extolling the virtues of Rome and the rewards of conquest stare down from the walls. This is an exhibition of which the Romans themselves would have been proud.

Portrait head resembling Cleopatra. Italy, 50–30 BCE. Limestone. 1879,0712.15.
© Trustees of the British Museum, 2018. All rights reserved

Indeed, the major issue is that the displays present a largely uncritical narrative of Roman imperialism. One section, called “Military Might,” features a statue of the emperor Hadrian in armour, a defeated Dacian, and a bronze diploma attesting to the rewards of service in the Roman army. An explanatory panel informs us that those who resisted were “treated harshly” while those “who readily accepted Roman domination, benefited”. This is especially troubling to read in an Australian context.

The exhibition is beautifully laid out, with highly effective use of lighting and colour to emphasise the different themes: “The Rise of Rome”, “Military Might”, “The Eternal City”, “Peoples of the Empire” and “In Memoriam”. And it boasts impressive busts and statues of emperors, imperial women, priests and priestesses, gods and goddesses, most displayed in the open, rather than behind glass. This allows visitors to view them up close from many angles.

Mummy portrait of a woman. Rubaiyat, Egypt, 160–170 CE. Encaustic on limewood. 1939,0324.211.
© Trustees of the British Museum, 2018. All rights reserved

The use of imagery is one of the exhibition’s greatest strengths. Close-ups of coins and other small artefacts are projected against the wall, while enlarged 18th-century Piranesi prints of famous monuments such as the Pantheon provide a stunning backdrop.

There are some excellent curatorial choices. The number of images of women is commendable, enabling the exhibition to move beyond emperors, soldiers and magistrates to emphasise women as an intrinsic part of the life of Rome.

Stories of key monuments, such as the Colosseum, the Baths of Caracalla, and the Pantheon, are accompanied by busts of the emperors who built them as well as associated everyday objects such as theatre tickets and strigils. However, there is no map of the city of Rome to allow visitors to place these buildings in context. And the evidence for the true cost of Roman conquest is not sufficiently highlighted.

Where are the slaves?

Coins show emperors subduing prostrate peoples, including one featuring Judaea, where Vespasian and Titus cruelly crushed a revolt between 66 and 73 CE. The accompanying plaque refers obliquely to Roman “acts of oppression”, but one has to turn to the exhibition catalogue to find the true list of horrors, including the thousands enslaved and the sacking of the Temple of Jerusalem. Nor is there any mention that the construction of the Colosseum, profiled just a few feet away in the exhibition, was funded by the spoils of the Jewish War.

Relief showing two female gladiators. Halicarnassus (modern Bodrum), Turkey, 1st–2nd century CE. Marble. 1847,0424.19.
© Trustees of the British Museum, 2018. All rights reserved

The walls are covered with quotations extolling the Romans’ own imperialistic vision. “The divine right to conquer is yours”, a line from Virgil’s Aeneid, greets visitors at the start. Even more troubling is a quotation from Pliny the Elder which looms over the “Peoples of the Empire” section:

Besides, who does not agree that life has improved now the world is united under the splendour of the Roman Empire.

Toothpick from the Hoxne Treasure. Hoxne, England, late 4th – early 5th century CE. Silver and niello with gold gilding. 1994,0408.146.
© Trustees of the British Museum, 2018. All rights reserved

This section is full of objects displaying the luxurious lifestyle of provincial elites under Roman rule, from the stunning decorated spoons and bracelets of the British Hoxne treasure to beautiful funerary reliefs of rich Palmyrenes. The exhibition trumpets the “diversity” of Rome’s peoples, but this curious set of objects does not tell any coherent story beyond the comfortable lives of the privileged.

Slavery – the most horrifying aspect of Roman society – is all but absent. There are incidental references (a gladiator given his freedom, the funerary urn of a former slave), but they are presented with little context. Scholars have estimated that slaves composed at least 10 per cent of the empire’s total population of 60 million. They undertook domestic and agricultural labour, educated children, and served in the imperial household. Their stories remain largely untold.




Alternative narratives

The absence of any counterpoint to the Romans’ story in this exhibition is all the more surprising given that the catalogue contains an essay from the NMA that does show awareness of these problems. Curators Lily Withycombe and Mathew Trinca explore how the narrative of Roman conquest influenced imperial expansion in the modern age, including the colonisation of Australia.

Particularly revealing is their statement: “While the Classics may have once been in the service of British ideas of empire, they are now more likely to be taught using a critical postcolonial lens.” Yet this nuance does not make it into the exhibition itself.

Ring with sealstone depicting Mark Antony. Probably Italy, 40–30 BCE. Gold and jasper. 1867,0507.724.
© Trustees of the British Museum, 2018. All rights reserved

A very different narrative about the Roman world could have been presented. Even in their own time, Roman commentators were aware of the darker side of imperialism. In his account of the influx of Roman habits and luxuries into Britain, the historian Tacitus remarked:

The Britons, who had no experience of this, called it ‘civilization’, although it was a part of their enslavement. (Agricola 21, trans. A. R. Birley).

The colossal head of the empress Faustina the Elder from a temple in Sardis is a spectacular object, but its overwhelming size should remind us of the asymmetrical power dynamics of Roman rule. Emperors and their family members were meant to be figures of awe to peoples of the empire, to be feared like gods. Tacitus memorably described the imperial cult temple at Colchester in Britain as a “fortress of eternal domination”.




The Rome of the exhibition is a curiously timeless world. The grant of Roman citizenship to all free inhabitants of the empire in 212 CE goes unmentioned, and the coming of Christianity is presented almost as an afterthought.

There are some spectacular items from the vibrant world of Late Antiquity (3rd-7th centuries CE), such as the gold glass displaying Peter and Paul and parts of the Esquiline treasure. But this section is marred by factual errors and it misses the opportunity to explore the dynamics of fundamental religious and cultural change.

Horse-trappings from the Esquiline Treasure. Rome, Italy, 4th century CE. Silver and gold gilding. 1866,1229.26.
© Trustees of the British Museum, 2018. All rights reserved

Rome: City + Empire is a wonderful collection of objects, displayed in an engaging manner, which will be of interest to all Australians. The exhibition is likely to be a hit with children – there is a playful audio-guide specifically for kids and many hands-on experiences dotted throughout: from the chance to electronically “colour-in” the funerary relief of a Palmyrene woman on a digital screen, to feeling a Roman coin or picking up a soldier’s dagger.

But visitors should be aware that it presents a distinctly old-fashioned tale of Rome’s rise and expansion, which is out of step with contemporary scholarly thinking. The benefits of empire came at a bloody cost.

Rome: City + Empire is at the National Museum of Australia until 3 February 2019.

Caillan Davenport, Senior Lecturer in Roman History, Macquarie University and Meaghan McEvoy, Associate Lecturer in Byzantine Studies, Macquarie University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Bogs are unique records of history – here’s why


Henry Chapman, University of Birmingham; Ben Gearey, University College Cork; Jane Bunting, University of Hull; Kimberley Davies, Plymouth University, and Nicola Whitehouse, Plymouth University

Peat bogs, which cover 3% of the world’s land surface, are special places. While historically often considered worthless morasses, today they are recognised as beautiful habitats providing environmental benefits from biodiversity to climate regulation. However, they are threatened by drainage, land reclamation for agriculture and peat cutting for fuel, which have significantly reduced the extent and condition of these ecosystems on a global scale. Bogs are fragile and sensitive to change, whether by human hands or by processes such as climate change.

A less well known aspect of bogs is their remarkable archaeological potential. In their undisturbed state at least, bogs are anoxic (oxygen-free) environments due to their saturation. These conditions are hostile to the microbes and fungi that would normally decay organic material such as the remains of plants, which are the principal constituents of the peat. The same anoxic conditions also offer protection from decay for organic archaeological remains. The vast majority of objects and structures used by our ancestors were made from organic materials (in particular wood). These are normally lost on dryland archaeological sites but can be preserved in peatlands.

The saturated conditions mean that even soft tissue can survive, including both skin and internal organs. Probably the best known archaeological finds are the remains of “bog bodies” such as the famous prehistoric Tollund Man in Denmark, Lindow Man in the UK, or the more recent Irish discoveries of Clonycavan Man, Old Croghan Man and Ireland’s oldest known bog body, Cashel Man, dated to the Bronze Age.

Excavating a trackway on Hatfield Moors, South Yorkshire.
© Henry Chapman

Seeing hidden landscapes

But archaeology is only part of the story these environments have to tell. They are important archives of the past in other ways: the layers of moss and other vegetation that make up peat are themselves immensely valuable as records of past environments (palaeoenvironments). The manner in which peat accumulates gives the deposits stratigraphic integrity, so each layer contains the macroscopic and microscopic remains of plants and other organisms, which shed light on landscape change and biodiversity on timescales ranging from centuries to millennia. The high organic content of peat means that these records can be dated using the radiocarbon method.

The best known such records are probably pollen grains which provide evidence of past vegetation change. But evidence from other organic material can be used to reconstruct other past environmental processes. For example, single-celled organisms called testate amoebae, preserved in sub-fossil form, are highly sensitive to peatland hydrology and have been extensively used in recent years to reconstruct a history of climatic changes. Meanwhile, fossil beetles can tell us how the biodiversity and nutrient status of a peatland has altered over time.

Fossil beetle remains associated with Old Croghan Man bog body, Ireland.
© Nicki Whitehouse, Author provided

The potential of bogs to preserve both environmental and archaeological records means that they can be regarded as archives of “hidden landscapes”. The accumulating peat literally seals and protects evidence of human activity ranging from the macroscopic (in the form of archaeological sites, artefacts and larger plant and animal remains) through to the microscopic (pollen, testate amoebae and other remains) material that provides contextual evidence of environmental processes.

Through detailed integrated analyses these records can provide evidence of past human activity ranging from the everyday exploitation of economic resources of peatlands, through to the ceremonies associated with prehistoric human sacrifice and the deposition of the so-called bog bodies. The associated palaeoenvironmental record can be used to situate these cultural processes within long term patterns of environmental changes.

A bog in Estonia seen from above.
FotoHelin/Shutterstock.com

Taming the wild

There has been extensive study of the palaeoenvironmental record from bogs and notable archaeological excavations of sites and artefacts, but there have been relatively few concerted attempts to integrate these approaches. In part this is because generating sufficient data to model the development of a bog in four dimensions (the fourth being time) is a formidable research challenge. But some peatlands have seen relatively extensive archaeological and palaeoenvironmental research over the last few decades, providing an excellent starting point. Hatfield and Thorne Moors, situated primarily in South Yorkshire, are two such peatlands.

These two largest surviving areas of lowland bog in England are located within a wider lowland region known as the Humberhead Levels. After decades of industrial peat extraction, these bogs are now nature reserves managed by Natural England, and are becoming the “wild” bogs they once were. We are attempting to reconstruct the wildscape and bring the complex histories of this vast and dynamic boggy landscape to life.

Flora on Thorne Moors.
© Peter Roworth, Author provided

These moors are just two surviving parts of a once rich mosaic of wetland landscapes. In the past, this landscape was famed for its wildness – a remnant of an extensive complex of mires, rivers, meres and extensive floodplain wetlands. Antiquarians such as John Leland visited the area in the 16th century, and his descriptions provide a “window onto what must have been a truly fabulous ‘everglades-like’ landscape”, as described by local historian Colin Howes.

Now that the land has largely been drained, tamed and converted to farmland, it is hard to imagine the vast wetland landscapes that once characterised these areas. Following large-scale land reclamation in the 17th century, many of the traditional practices such as fishing, fowling, grazing and peat-cutting (turbary) rights were no longer available to commoners. Consequently, the connections between people and place became increasingly defined by a new, dryland landscape, disconnected from the former wetlands that were once so central to people’s lives.

Sphagnum moss on Thorne Moors.
© Peter Roworth

We are investigating and reconstructing this dynamic and changing wildscape throughout its history, reconnecting communities to these wetland landscapes. Drawing together previous research alongside targeted archaeological fieldwork and palaeoenvironmental analyses, we are combining these with newly available digital data and sophisticated modelling techniques to reconstruct their interwoven landscape and human histories. Together, for the first time, we are beginning to see the complexity of the dynamic and changing landscape that once characterised the Humberhead Levels.

Henry Chapman, Professor of Archaeology, University of Birmingham; Ben Gearey, Lecturer in Environmental Archaeology, University College Cork; Jane Bunting, Reader in Geography, University of Hull; Kimberley Davies, Research Assistant, Wildscape Project, Plymouth University, and Nicola Whitehouse, Associate Professor (Reader) in Physical Geography, Plymouth University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A history of sporting lingo: a linguistic ‘shirtfronting’ for lovers and haters of sports alike


Kate Burridge, Monash University and Howard Manns, Monash University

Like sport or hate it, it’s hard to deny the role that sporting lingo plays in our daily lives.

Corporate language everywhere groans with references to people levelling playing fields, getting balls rolling, moving goal posts, lighting fires under their teams, blocking and tackling, even touching base offline – and of course it’s all done by the playbook and at close of play.

Perhaps it’s just not cricket, but politics is also rife with sporting lingo. Shirtfronting has escaped the on-field aggression of the AFL to cover diplomatic spats. Both the captain’s pick and captain’s call have slipped out of sporting jargon and onto the political football field. Political parties have even been accused of ball-tampering.

And so, we say to you, tenez! (“take, receive”), as a 14th century tennis player is believed to have called out before serving a ball (a French cry that reputedly gave tennis its name).

Allow us to bandy around (a tennis term) a few ideas here as we run with (a football term) a brief review of sporting lingo inside the bloody arena and throughout our daily lives.

Tickets and etiquette in ‘disport’

The word sport is a shortening of an earlier term disport, which from the 14th century broadly encompassed any form of relaxation or diversion.

In fact, from the 15th century, one meaning of sport was a playful reference to romance and lovemaking. This died off in the 18th century, but another 15th century meaning, “activity of skill and exertion with set rules or customs”, has withstood the test of time.

At sports events, you might see the reverse side of your ticket setting out rules of etiquette for spectators. Both derive from an Old French word estiquette meaning “note or label”. The word etiquette emerged in late 17th century French as a note detailing the rules and customs for engaging with the Spanish court.

But etiquette in modern sporting contests includes being nice to umpires. Sure, they make some tough calls, but so do we as English speakers.




After all, the word umpire actually derives from the Norman French noumpere, corresponding to “non-peer”, the one who stood out among peers. (Linguistic boundary lines have been problematic for some time — but that, as the saying goes, is a whole nother story.)

Umpires try to keep the peace, but more than a few words derive from the punishing and warlike nature of sport. Melbourne Demons coach Simon Goodwin said his team would learn from the “drubbing” they received from West Coast.

We can only hope he intended the modern meaning of drub (“beat badly in a sporting contest”), and not the meaning associated with drub‘s 17th century Arabic origins (“the flogging of feet”).

Sporting language in everyday speech

We’re surrounded by sporting language, much of it from sports to which we no longer pay much heed — some forgotten entirely.

Archery has been a quiet contributor over the years. The verb to rove, “wander about with no purpose in mind”, for instance, comes from a 15th century archery term meaning “shoot arrows randomly at an arbitrarily selected target”.

The original upshot was the final shot in a match (a closing or parting shot). The first bolt was a crossbow projectile.

Even those disapproving of the “sport” of hunting have to admire its contributions to language. A tryst, now “an assignation with a lover”, was originally “an appointed station in hunting”. A ruse, these days a general term for “deception”, was the detour hunted animals made to elude the hounds.

These sagacious (“acute-smelling”) hounds would occasionally run riot, that is, “follow the scent of animals other than the intended prey”. Retrieving was flushing out their re-found quarry, and worrying, “seizing by the throat”, was what they did to it once they got it.




Hawking or falconry must once have played a central role in our lives, for this sport has donated a number of expressions. Haggard was originally used to describe wild hawks, and to pounce derives from their pounces or fore-claws.

And reclaim or rebate referred to calling the hawk “back from flight”. It was carried out by a special pipe known as a lure, which is now a general word meaning “magnetism” or “attraction”.

Pall-mall player.
Wikimedia Commons

Some sports have completely disappeared but have left behind relics in some common expressions. Pall-mall (probably from Middle French pale-mail “ball-mallet”) was a croquet-like lawn game in the 16th and 17th centuries. It gave its name to straight roads or promenades (such as Pall Mall in London), before it then morphed into the shopping malls of modern times.

Even the medieval jousting tournament is the source of a few current expressions like break a lance, tilt at and at full tilt, meaning “at full speed”. (The tilt was originally the barrier separating the combatants and later was applied to the sport itself.)

These days, jousting refers generally to any sort of banter or sparring between individuals who might have thrown down or taken up the gauntlet, meaning “challenged” or “accepted a challenge”. (The gauntlet refers to the knight’s mailed glove).

Unlucky players might end up being thrilled (originally pronounced “thirled”), which doesn’t mean ecstatic, but rather pierced by a lance or spear.




Up there Cazaly!: on with the ‘people’s tournament’

And so we cry Up there Cazaly! (after the famed footballer Roy Cazaly) — on with the Grand Final, a.k.a. the big dance.

A mob football match in 18th century London.
Wikimedia Commons

And spare a final thought for the “people’s tournament” — the medieval game that gave us the word football. As Heiner Gillmeister points out, there is evidence this game was also opened by the cry tenez!

Whether it was played in the monastery cloisters (the arches forming the original goals) or in an open space (so-called “mob football” played between two villages — a tough gig for the boundary umpire), it was a bloody and riotous affair getting that ball full of wynde to the target area.

As Sir Thomas Elyot put it in The Boke Named the Governour (1531):

Nothinge but beastly furie, and exstreme violence

Like sport or hate it, we hope you’ve found our linguistic shirtfronting here gentle, fun and appropriate as far as captain’s calls go.

Kate Burridge, Professor of Linguistics, Monash University and Howard Manns, Lecturer in Linguistics, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Why New Zealand was the first country where women won the right to vote



A memorial by sculptor Margriet Windhausen depicts the life-size figures of Kate Sheppard and other leaders of the Aotearoa New Zealand suffrage movement.
Bernard Spragg/Wikimedia Commons, CC BY-ND

Katie Pickles

125 years ago today Aotearoa New Zealand became the first country in the world to grant all women the right to vote.

The event was part of an ongoing international movement for women to exit from an inferior position in society and to enjoy equal rights with men.

But why did this global first happen in a small and isolated corner of the South Pacific?




Setting the stage

In the late 19th century, Aotearoa New Zealand was a volatile and rapidly changing contact zone where British settlers confidently introduced systematic colonisation, often at the expense of the indigenous Māori population. Settlers were keen to create a new world society that adapted the best of Britain and left behind the negative aspects of the industrial revolution – Britain’s dark satanic mills.

Many supported universal male suffrage and a less rigid class structure, enlightened race relations and humanitarianism that also extended to improving women’s lives. These liberal aspirations towards societal equality contributed to the 1893 women’s suffrage victory.

At the end of the 19th century, feminists in New Zealand had a long list of demands. It included equal pay, prevention of violence against women, economic independence for women, old age pensions and reform of marriage, divorce, health and education – and peace and justice for all.

The women’s suffrage cause captured widespread support and emerged as the uniting right for women’s equality in society. As suffragist Christina Henderson later summed up, 1893 captured “the mental and spiritual uplift” women experienced upon release “from their age-long inferiority complex”.

Two other factors assisted New Zealand’s global first for women: a relatively small size and population and the lack of an entrenched conservative tradition. In Britain, John Stuart Mill presented a first petition for women’s suffrage to the British Parliament in 1866, but it took until wartime 1918 for limited women’s suffrage there.

Women as moral citizens

As a “colonial frontier”, New Zealand had a surplus of men, especially in resource towns. Pragmatically, this placed a premium on women for their part as wives, mothers and moral compasses.

There was a fear of a chaotic frontier full of marauding single men. This colonial context saw conservative men who supported family values supporting suffrage. During the 1880s, depression and its accompanying poverty, sexual licence and drunken disorder further enhanced women’s value as settling maternal figures. Women voters promised a stabilising effect on society.

New Zealand gained much strength from an international feminist movement. Women were riding a first feminist wave that, most often grounded in their biological difference as life givers and carers, cast them as moral citizens.

Local feminists eagerly drew upon and circulated the best knowledge from Britain, America and Europe. When Mary Leavitt, the leader of the US-based Women’s Christian Temperance Union (WCTU) visited New Zealand in 1885, her goal was to set up local branches. This had a direct impact, leading to the country’s first national women’s organisation and providing a platform for women to secure the vote in order to affect their colonial feminist concerns.

Other places early to grant women’s suffrage shared the presence of liberal and egalitarian beliefs, a surplus of men over women, and less entrenched conservatism. The four frontier US western mountain states led the way with Wyoming (1869), Utah (1870), Colorado (1893) and Idaho (1895). South Australia (1894) and Western Australia (1899) followed in the 19th century and, before the first world war, were joined by other western US states, Australia, Finland and Scandinavia.

Local agency

Social reformer and suffragist Kate Sheppard, around 1905.
Wikimedia Commons, CC BY-ND

New Zealand was fortunate to have many effective women leaders. Most prominent among them was Kate Sheppard. In 1887, Sheppard became head of the WCTU’s Christchurch branch and led the campaign for the vote.

The campaign leaders were well organised and hard working. Their tactics were petitions, pamphlets, letters, public talks and lobbying politicians – this was a peaceful era before the suffragette militancy during the early 20th century elsewhere.




The women were persistent and overcame setbacks. It took multiple attempts in parliament before the Electoral Act 1893 was passed. Importantly, the suffragists got public opinion behind the cause. Mass support was demonstrated through petitions between 1891 and 1893, in total garnering 31,872 signatures, amounting to a quarter of Aotearoa’s adult women.

Pragmatically, the women worked in allegiance with men in parliament who could introduce the bills. In particular, veteran conservative Sir John Hall viewed women’s suffrage as a way to a more moral and civil society.

The Suffrage 125 celebratory slogan “whakatū wāhine – women stand up!” captures the intention of continuing progressive and egalitarian traditions. Recognising diverse cultural backgrounds is now important. With hindsight, the feminist movement can be implicated as an agent of colonisation, but it did support votes for Māori women. Meri Te Tai Mangakāhia presented a motion to the newly formed Māori parliament to allow women to vote and sit in it.

New Zealand remains a small country that can experience rapid social and economic change. Evoking its colonial past, however, it retains both a reputation as a tough and masculine place of beer-swilling, rugby-playing blokes and a tradition of staunch, tea-drinking, domesticated women.

Katie Pickles, Professor of History at the University of Canterbury and current Royal Society of New Zealand Te Apārangi James Cook Research Fellow

This article is republished from The Conversation under a Creative Commons license. Read the original article.


As we celebrate the rediscovery of the Endeavour let’s acknowledge its complicated legacy


Natali Pearson, University of Sydney

Researchers, including Australian maritime archaeologists, believe they have found Captain Cook’s historic ship HMB Endeavour in Newport Harbour, Rhode Island. An official announcement will be made on Friday.

The discovery is the culmination of decades of work by the Rhode Island Marine Archaeology Project and the Australian National Maritime Museum to locate and positively identify the vessel, which had been missing from the historical record for over two centuries. Plans are now under way to raise funds to excavate and conduct scientific testing in 2019.

As the first European seafaring vessel to reach the east coast of Australia, the Endeavour – much like James Cook himself – has become part of Australia’s national mythology. Unlike Cook, who famously met his end on Hawaiian shores, the fate of the Endeavour had long been unknown. The discovery has therefore resolved a long-standing maritime mystery.

In a serendipitous twist, it coincides with two significant dates: the 250th anniversary of the Endeavour’s departure from England in 1768 on its now (in)famous voyage south, and the 240th anniversary of the ship’s scuttling in 1778 during the American War of Independence.

Identifying the Endeavour’s location has been a 25-year process. Archaeologists initially identified 13 potential candidates in the harbour. Over time, the number of possible sites was narrowed to five.

This month, a joint diving team has worked to measure and inspect these sites, drawing upon knowledge of Endeavour’s size to identify a likely candidate. Excavation and timber analysis is expected to provide final confirmation. Those expecting an entire ship to be recovered will be disappointed, as very little of it remains.

But this is a controversial vessel, and celebrations of its discovery will be tempered by reflection about its complicity in the British colonisation of Indigenous Australian land. While Endeavour played an instrumental role in advancing science and exploration, its arrival in what is now known as Botany Bay in 1770 also precipitated the occupation of territory that its Aboriginal owners never ceded.




A ship by any other name …

Although Endeavour’s early days are well known, it has taken many years for researchers to piece together the rest of its story. One problem has been the many names the vessel was known by during its lifetime.

Built in 1764 in Whitby, England, as a collier (coal carrier), the vessel was originally named Earl of Pembroke. Its flat-bottomed hull and box-like shape, designed to transport bulk cargo, later proved helpful when navigating the treacherous coral reefs of the southern seas.

Endeavour, then known as Earl of Pembroke, leaving Whitby Harbour in 1768. Painting by Thomas Luny, c. 1790. (Some think Luny painted another ship after Endeavour became famous.)
Wikimedia

In 1768, Earl of Pembroke was sold into the service of the Royal Navy and the Royal Society. It underwent a major refit to accommodate a larger crew and sufficient provisions for a long voyage. In keeping with the ambitious spirit of the era, the vessel was renamed His Majesty’s Bark (HMB) Endeavour (bark being a nautical term to describe a ship with three masts or more).

Endeavour departed England in 1768 under the command of then-Lieutenant Cook. Ostensibly sailing to the South Pacific to observe the 1769 Transit of Venus, Cook was also under orders to search for the fabled southern continent. So it was that a coal carrier and a rare astronomical event changed the history of the Australian continent and its people.




Mysterious ends

Following Endeavour’s circumnavigation of the globe (1768-1771), the vessel was used as a store ship before the Royal Navy sold it in 1775. Here, the ship’s fate became mysterious.

Many believed it had been renamed La Liberté and put to use as a French whaling ship before succumbing to rotting timbers in Newport Harbour in 1793. Others rejected this theory, suggesting instead that Endeavour had spent her final days on the river Thames.

A breakthrough came in 1997. Australian researchers suggested the Endeavour had in fact been renamed Lord Sandwich. The theory gained weight following an archival discovery by Kathy Abbass, director of the Rhode Island project, in 2016, which indicated that Lord Sandwich had been used as a troop transport and prison ship during the American War of Independence before being scuttled in Newport Harbour in 1778.

Lord Sandwich was one of a number of transport ships deliberately sunk by the British in an attempt to prevent the French fleet from approaching the shore.

Finding a shipwreck is not impossible, but finding the one you’re looking for is hard. Rhode Island volunteers have been searching for this vessel since 1993, slowly narrowing down the search area and eliminating potential contenders as they explore the often-murky waters of Newport Harbour.

They were joined in their efforts by the Australian National Maritime Museum in 1999 and, in more recent years, by the Silentworld Foundation, a not-for-profit organisation with a particular interest in Australasian maritime archaeology.

Endeavour’s voyage across the Pacific Ocean.
Wikimedia

Museums around the world are already turning their attention to the significant Cook anniversaries on the horizon and the complex legacy of these expeditions. These interpretive endeavours will only be heightened by the planned excavation of the ship’s remains in the near future.

Shipwrecks are a productive starting point for thinking about how we make meaning from the past because of the firm hold they have on the public imagination. They conjure images of lost treasure, pirates and, especially in the case of Endeavour, bold adventures to distant lands.

But as we celebrate the spirit of exploration that saw a humble coal carrier circumnavigate the globe – and the same spirit of exploration that has led to its discovery centuries later – we must also make space for the unsettling stories that will resurface as a result of this discovery.

Natali Pearson, Deputy Director, Sydney Southeast Asia Centre, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.


World politics explainer: the Iranian Revolution


Protests during the Iranian Revolution, 1978. The protests reflected broader struggles across the region between secular and Islamic models of governance.
Wikicommons

Mehmet Ozalp, Charles Sturt University

This article is part of our series of explainers on key moments in the past 100 years of world political history. In it, our authors examine how and why an event unfolded, its impact at the time, and its relevance to politics today.


To understand what caused the Iranian Revolution, we must first consider the ongoing conflict between proponents of secular versus Islamic models of governance in Muslim societies.

It all began with the British colonisation of India in 1858, which precipitated the collapse of classic Islamic civilisation. By the early 20th century, almost the entire Muslim world was colonised by European powers.

The Ottoman Empire, the last representative of the classic Islamic civilisation, collapsed after world war one in 1918. So, the first half of the 20th century saw Muslim nations fight to regain their independence.

It was the secular-nationalist, Western-educated elites who first led these movements, gaining political control and leadership of their respective countries. These leaders wanted to mimic Europe’s progressive leaps, which took place after Christianity’s grip on society and politics had been diminished. They believed Muslim societies would progress if Islam was reformed and its influence on society reduced through separating religion and state.

A key reform enforced by the new secular Republic of Turkey, for example, was the abolition of the Ottoman Caliphate in 1924, removing the Caliph (the religious and political leader considered the successor to the Prophet Muhammad) from his position and sending shockwaves across the Muslim world.

This caused the emergence of alternative grassroots Islamic revivalist movements led by the ulama (Muslim scholars), who believed the very existence of Islam was in jeopardy.

These movements were non-political in their inception and gained mass support at a time when Muslim masses needed spiritual solace and social support. In time, they developed an Islamic vision for society and became increasingly active in the social and political landscape.

The impact of the Cold War

By the end of the second world war, Muslim countries had largely escaped from the constraints of western colonisation, only to fall victim to the Cold War.

Iran and Turkey were key countries where Soviet expansion efforts intensified. In response, the United States provided both countries with economic and political support in return for their membership in the democratic Western bloc. Turkey and Iran accepted this support and became democratic in 1950 and 1951 respectively.

Soon after, Mohammad Mosaddeq’s National Front became the first democratically-elected Iranian government in 1951. Mosaddeq was a modern, secular leaning, progressive leader who was able to gain the broad support of both the secular elite and the Iranian ulama.

US President Harry S Truman (left) and Prime Minister Mohammad Mossadegh, 1951.
Nara.gov/Wikicommons

He was helped by a growing disdain for the reigning monarchy of Shah (king) Mohammad Reza Pahlavi, and by Iranian anger at the exploitation of their oil fields.

Whilst Persian oil was used by Britain and Russia to survive the Nazi onslaught during the second world war and greatly helped boost the British economy, Iranians were only receiving 20% of the profits.

Mosaddeq made the bold move to address this issue through nationalising the previously British-owned Anglo-Iranian Oil Company (AIOC). This did not work out in his favour, as it attracted British and US economic sanctions. This in turn crippled the Iranian economy.

In 1953, he was replaced in a military coup organised by the CIA and British Intelligence. The Shah was returned to power and the Anglo-Iranian Oil Company became BP, British Petroleum, with a 50-50 divide of profits.

Not only did this intervention leave Iranians with a sense of bitter humiliation, betrayal and impotence, its impact also reverberated within the wider Muslim world.

It sent the message that a democratically-elected government would be toppled if it did not fit with Western interests. This narrative continues to be the dominant discourse of Islamist activists to this day, used in explaining world events that affect the Muslim masses.

Between 1953 and 1977, the Shah relied heavily on the US in his efforts to modernise the army and Iranian society, and to build the economy through what he called the White Revolution.

The Shah (left) meeting with US officials including President Jimmy Carter, 1977.
National Archives ARC/Wikicommons

Though his economic program brought prosperity and industrialisation to Iran, and his educational initiatives increased literacy levels, this all came at a hefty cost. Wealth was unequally distributed, an underclass of peasants migrating to urban centres developed, and political dissent was suppressed on a large scale. Disillusioned religious scholars were alarmed at the top-down imposition of a Western lifestyle, believing Islam was being completely removed from society.

The revolution – what happened?

Iranian dissidents finally responded to the Shah’s political suppression with violence. Two militant groups, the Marxist Fadaiyan-e Khalq and the Islamic leftist Mujahedin-e Khalq, started to mount attacks on government officials in the 1960s. More sustained and indirect opposition came from religious circles led by Ayatollah Khomeini and intellectual circles led by Ali Shari’ati.

Shari’ati, a French-educated intellectual, was inspired by the Algerian and Cuban revolutions. He called for an active struggle for social justice and insisted on the prominence of Islamic cultural heritage instead of the Western model for society. He criticised the Shi’ite scholars for being stuck in their centuries-old doctrine of political quietism – seen as a significant barrier to the revolutionary fervour.

The barrier was broken by Ayatollah Khomeini, who rose to prominence for his outspoken role in the 1963 protests and was exiled as a result. His recorded sermons openly criticising the Shah were circulated widely in Iran.

Protesters holding Khomeini’s photo during the Iranian revolution, 1978.
Wikicommons

Influenced by the new idea of an Islamic state in which Islam could be implemented fully, thus ending the imperialism of the colonial West, Khomeini argued it was incumbent on Muslims to establish an Islamic government based on the Qur’an and the example of the Prophet Muhammad.

Khomeini’s return, 1979.
Wikicommons

In his book Wilayat-i Faqih: Hukumat-i Islami (Guardianship of the Jurist: Islamic Government), Khomeini insisted that in the absence of the true Imam (the only legitimate leader from the lineage of the Prophet Muhammad in Shi’ite theology), the scholars were his proxies, charged to fulfil the obligation by virtue of their knowledge of Islamic scriptures. This idea was an important innovation that gave scholars licence to become involved in politics.

With the conditions ripe, the persistent protests instigated by Khomeini’s followers swelled to include all major cities. This culminated in the revolution on February 1, 1979, when Khomeini triumphantly returned to Iran.

The impact of the revolution

The Iranian revolution was a cataclysmic event that not only transformed Iran completely, but also had far-reaching consequences for the world.

It caused a deep shift in Cold War and global geopolitics. The US not only lost a key strategic ally against the communist threat, but it also gained a new enemy.

Emboldened by developments in Iran, the Soviet Union invaded Afghanistan in 1979. This was followed in 1980 by the eruption of the Iran-Iraq war, launched by Iraq in an effort to bring down the new Iranian theocratic regime. The US supported Saddam Hussein with weapons and training, helping him tighten his grip on power in Iraq.

Contemporary relevance

These two conflicts and the series of events that followed – Saddam Hussein’s invasion of Kuwait in 1990, two Gulf Wars, the emergence of Al-Qaeda, and the 9/11 terrorist attacks on the World Trade Centre and subsequent war on terror – have defined geopolitics for the last three decades and continue to do so today.

World Trade Centre under attack, September 11, 2001.
Ken Tannenbaum/Shutterstock

The Iranian revolution also dramatically altered Middle Eastern politics. It fanned a regional sectarian cold war between Iran and Saudi Arabia. The revolution challenged Saudi Arabia’s monarchy and its claim to leadership of the Muslim world.

The religious and ideological cold war between Iran and Saudi Arabia continues to this day with their involvement in the Syrian and Yemeni conflicts.

Another impact of the revolution is the resurgence of political Islam throughout the Muslim world. Iran’s success showed that establishing an Islamic state was not just a dream. It was possible to take on the West and its collaborating monarchs and dictators, and win.

Throughout the 1980s and 90s, Islamic political parties popped up in almost all Muslim countries, aiming to Islamise societies through the instruments of state. They declared the secular model had failed to deliver progress and full independence, and the Islamic model was the only alternative. For them, the Iranian revolution was proof it could be a reality.

Was the revolution a success?

From the perspective of longevity, the revolution still stands. It has managed to survive four decades, including the eight-year Iran-Iraq war as well as decades of economic sanctions. Comparatively, the Taliban’s attempt at establishing an Islamic state only lasted five years.

On the other hand, Khomeini and his supporters promised to end the gap between the rich and the poor, and deliver economic and social progress. Today, the Iranian economy is in poor shape, despite the oil revenues that keep it from the brink of collapse. People are dissatisfied with high unemployment and hyper-inflation, and have little hope that the country’s economic fortunes will turn.

The most important premise of Islamism – making society more religious through political power – has also failed to produce the desired results. Even though 63% of Iranians were born after the revolution, they are no more religious than before the revolution.

Although there is still significant support for the current regime, a significant proportion of Iranians want more freedoms, and disdain religion being forced from above. There are growing protests demanding economic, social and political reforms as well as an end to the Islamic republic.

Most Iranians blame the failures of the revolution on the never-ending US sanctions. Even though Iran trades with European powers, China and Russia, they believe the West is determined, whatever the cost, to prevent Iran from succeeding.

Ultimately, world geopolitics is a competitive business driven by national interests. The challenge before Muslim societies is to develop models that harmonise Islam and the modern world in a way that appeals to and benefits humanity, rather than being seen as a threat.

Hard social and political conditions, and the forces of time, have an uncanny ability to test and smooth ideologies. While the struggle between secular and Islamic models of society continues in Iran and the greater Muslim world, it is likely that Iran will evolve into a moderate society in the 21st century.

Mehmet Ozalp, Associate Professor in Islamic Studies, Director of The Centre for Islamic Studies and Civilisation and Executive Member of Public and Contextual Theology, Charles Sturt University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


World politics explainer: Pinochet’s Chile



Pinochet in the car, 1982, celebrating the 8th anniversary of the coup. His dictatorship in Chile was both a step forward for neoliberalism and a step back for democracy and human rights.
Wikimedia Commons, CC BY-SA

Peter Read, Australian National University

This article is part of our series of explainers on key moments in the past 100 years of world political history. In it, our authors examine how and why an event unfolded, its impact at the time, and its relevance to politics today. You can read parts one, two and three here.


General Augusto Pinochet Ugarte, a career military officer, was appointed Commander in Chief of the Chilean army by President Salvador Allende in August 1973. Eighteen days later, with the connivance, if not the assistance, of the US, he authorised a coup against Allende’s Socialist government.

To be clear, Pinochet’s rule was not the first, last or worst dictatorship in the history of Latin America. But it did grip the attention of western countries because of Chile’s comparatively orderly and democratic past, its institutions that made it seem closer to Great Britain than to Spain, its status as the first freely-elected Marxist government in the west, and the questionable role of the CIA in undermining the socialist Allende’s government.

What happened?

Augusto Pinochet Ugarte, 1986.
Biblioteca del Congreso Nacional, CC BY

On hearing the news of the coup, Allende dashed to his seat of government in the capital. Then, after a last and remarkable radio address, he shot himself rather than become a prisoner. Pinochet proclaimed himself president of the military junta (dictatorship) that followed.

The initial plan held that Pinochet would rule for only a year, to be succeeded by the chiefs of the navy, police and air force. However, Pinochet continued to rule, eventually as President of the Republic by decree (in effect, Chile’s military dictator), until 1988. At that point, fulfilling a constitutional obligation signed eight years earlier, he held a national plebiscite. To the surprise of his followers, and no doubt himself, 55% of the country voted against him.

Pinochet retired soon after, in 1990, to what he hoped would be a quieter life as lifetime senator. But in 1998, he was detained in Britain to answer charges of torturing Spanish citizens in Chile during his rule. He was held in Britain for 18 months before being allowed to return to Chile to answer further charges. It was the first time a former head of state had been arrested based on the principle of universal jurisdiction.

Augusto Pinochet Ugarte (left) with Mario Arnello.
Biblioteca del Congreso Nacional, CC BY

He returned to face 59 criminal complaints for kidnapping, murder and torture. Those charges never came to trial, owing to a variety of legal complexities, principally because the Chilean Supreme Court ruled him mentally and physically unable to answer them.

He died in 2006 without answering those charges.

Nevertheless, by then his reputation was damaged, even among his supporters, because of the findings of two National Commissions detailing the arbitrary arrests, torture, incarceration, disappearances and political executions that had occurred under his dictatorship. He had directed his forces first at the more extreme of the left-wing parties, the Revolutionary Left Movement (El MIR) and the Socialists, but later no member of any left-wing party could consider themselves safe.

Some Chileans who had supported Pinochet’s attempt to rid the country of what he called the “communist cancer” withdrew support after allegations of serious financial mismanagement for his own benefit were revealed. For all the accusations levelled against him, Pinochet admitted nothing. Instead, he blamed his senior operatives like Manuel Contreras, his hated head of the secret police, for the terrible abuses that he himself had authorised.

The impact on the development of neoliberalism

One of the biggest impacts of Pinochet’s coup was his contribution to the advancement of an economic theory known as neoliberalism, which has arguably shaped the economies of many modern western countries to this day. Neoliberalism in essence means a marked retreat by the state from total economic management: it wants the state to withdraw from much regulation, encourage free enterprise and competition, and let the market determine real value. By contrast, socialist “command” economies seek to regulate supply, demand and wages.

The chaotic last year of Allende’s presidency, marked by massive protectionism, disorderly land expropriations, strikes, food shortages (some artificially induced) and galloping inflation, certainly demanded reform. This provided the opening for a program that a group of conservative Chilean economists had discussed and planned for a decade, and which was enacted after 1973.

Members of the Government Junta in 1985 and Augusto Pinochet (middle)
Wikicommons

These economists renewed international trade, reduced inflation and divested the state of some of its assets. Some of these actions proved unwise, including selling some national utilities to Spanish companies, which did not necessarily run them in the interests of Chile.

The debates about Pinochet’s economic achievements continue, especially for the period after 1982, when the benefits of neoliberal practice faltered. His successes are still held by some to be a Chilean miracle, but the reality was a situation heavily tilted in his favour at a time when political opposition had been eliminated, trade unions weakened and working-class wages determined by the military dictatorship. The revelations of massive human rights abuses have further tarnished this achievement.

Contemporary relevance

We can now also detect some unforeseen consequences, thanks to Chile’s long and successful tradition of reconciliation after political trauma. The twenty years of centre-left rule that followed Pinochet were a remarkable achievement, as was the first four-year term of the moderate centre-right Piñera government from 2010. This was the product of the peacemaking tradition called the via Chilena, the Chilean way.

Those who had found political exile in East Germany or the Soviet Union during Pinochet’s rule did not take long to discover that life under a communist state was not the people’s utopia they had hoped to achieve in their own country.

Some returned, somewhat disillusioned, after 1990 to become high officials in much more moderate administrations than those they had planned many years before. They had learned not to make political changes too fast. Other exiles, taking refuge in western Europe, learned alternatives to the program of people’s revolution in every Latin American nation taught by Che Guevara. They came to appreciate the lessons of Euro-Communism: that political change need not be wrought by violence, but by negotiation and co-operation with less radical left-wing parties.

Some reverberations from Pinochet’s rule are still working themselves out. Members of what was once the radical and optimistic left, who gave so much to the cause and suffered so grievously, now wonder about the value of their struggle under Pinochet as they contemplate today’s low wages, high unemployment and, especially, widespread disillusionment with the processes of government.

Some of their children have come to the same, but more pointed, conclusion. Fifty or 60 Chilean children were actually sent to Cuba by their parents so they could re-enter the country at a later stage to continue the armed struggle against Pinochet. The children did not enjoy the experience. The documentary El edificio de los chilenos (The Chilean Building) shows the filmmaker subjecting her once-radical mother to an excoriating interrogation as to whether her ideology, and by inference any political ideology, should have overridden her duty to care for her children.

Pinochet is now remembered not so much as someone who saved his country from becoming a second Cuba, or for clearing the ground to test economic theory. Rather, internationally, he is recalled for his sensational detention in the UK. That is an important outcome, and one that perhaps makes every retired dictator think twice before venturing from their homeland.

Peter Read, Professor of History, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


World politics explainer: The Holocaust



The horrific incarceration of European Jews during WWII should never be forgotten, particularly as we confront contemporary genocide and forced migration.
Shutterstock

Daniella Doron, Monash University

This article is part of our series of explainers on key moments in the past 100 years of world political history. In it, our authors examine how and why an event unfolded, its impact at the time, and its relevance to politics today.

Warning: some of the following photos may be disturbing to readers.


The event traditionally defined as the Holocaust — by which I mean the systematic extermination of European Jewry between 1941 and 1945 — defies an overly simplified explanation.

That said, making sense of the who, what, where and when is the somewhat easier task.

What happened?

The Jews of Europe have been traditionally understood by historians as the principal targets of annihilation by the Nazis. Though Germans were perpetrators of this genocidal act, so too were other complicit Europeans, who either directly participated in murder or looked the other way.

Onlookers watching as Jewish people are forced to scrub the pavement.
USHMM/Wikicommons

Eastern Europe housed the sites of mass death (ghettos, labour camps, and concentration and death camps) built by the Nazis to rid Europe of Jews. Scholars generally point to 1941–1945 as the woeful years in which approximately 6 million European Jews were killed by the Nazis.

The question of why the Nazis murdered the Jews of Europe remains somewhat thornier and far more controversial amongst scholars.

When the Nazis rose to power in 1933, the issue of how to handle the so-called “Jewish problem” ranked high on their agenda. The Germans held a racialised form of antisemitism, which defined Jews not only as biologically distinct from Germans, but also as a critical threat to the health of the German nation.

Given their view of Jews as parasites, sapping the strength of the national body, the German Nazis experimented with a series of mechanisms to “eliminate” Jews from German life. The notion of murdering the Jews of Europe, we should note, had not yet surfaced.

Rather, in the early years of the regime, the Nazis saw emigration as an acceptable solution to the Jewish problem.

Legislation that disenfranchised Jews, stripped them of their financial assets, and denied them their livelihoods sought to clarify to Jews their new degraded position within Germany. The days of their security, stability, and equality under the law had come to an end. It was time to make a new life elsewhere. But these measures failed to bring about the results desired by the Nazis.

By 1939, slightly more than half of German Jews had decided to chart a new course abroad. Unfortunately, the outbreak of war in the same year only exacerbated the Nazis’ dilemma of how to rid the territory under German control of Jews.

The extent of Nazi Germany’s spread across Europe by 1942.
Shutterstock

As Germany quickly and violently occupied large swaths of territory in eastern and western Europe, the number of Jews in its territory exploded. Whereas the population of Jews in Germany in 1933 stood at roughly half a million, approximately nine million Jews resided in Europe as a whole.

Emigration no longer seemed like a viable option. Which nation would be the new home to all those Jews? As it stood, the half a million Jews of Germany struggled to find nations willing to house them.

And it was soon decided that new solutions needed to be found.

The impact of the Holocaust

Scholars debate why and when and even who arrived at the new solution of mass murder to solve Europe’s so-called “Jewish problem.”

We know that in 1939, when war broke out in the East and Germany occupied Poland, the Nazis turned to the creation of ghettos in Poland, which served to concentrate and isolate Jews from the greater Polish non-Jewish population. In these ghettos, often surrounded by high walls and gates, Jews engaged in forced labour and succumbed in great numbers to starvation and disease.

We also know that in 1941, when war broke out with the Soviet Union, the Nazis turned to rounding up Jews in the towns and villages of eastern Europe and murdering them with bullets. At least 1.6 million eastern European Jews died in this fashion. But why this turn from “passive murder” in the ghettos to “active murder” in the killing fields and, later, the concentration and death camps of eastern Europe? Who proposed this radical solution?

Row of bodies found at a liberated concentration camp, 1945.
Shutterstock

These questions defy scholarly consensus. It may be that the earlier policies of emigration, isolation and concentration of Europe’s Jews were eventually perceived as insufficient; it may be that the murder of Europe’s Jews by starvation and later bullets proved too slow and costly. And perhaps it was not Hitler who first arrived at the idea of mass murder, but Nazi bureaucrats and functionaries working on the ground in eastern Europe, who first conceived and experimented with this genocidal policy.

Regardless, by 1942, Jews across Europe were rounded up — from their homes, their hiding places, and Nazi-run ghettos and labour camps — to be deported to killing centres and concentration camps, where the vast majority lost their lives.

Contemporary implications

The legacy of the Holocaust has loomed large for more than 70 years, and continues to inform our culture and politics. It has come to be seen as arguably one of, if not the, defining events of the 20th century.

The Holocaust is commonly perceived as a truly rupturing occurrence in which a modern state used the mechanisms of modernity — technology, scientific knowledge and bureaucracy – not for the benefit of humanity but to inflict suffering and death.

We are now well aware that modernity does not necessarily result in progress and an improved standard of living, but that it comes with a dark underbelly that can lead to the violent purging of segments of society perceived as undesirable.

In its aftermath, the Holocaust became paradigmatic for defining genocide. The murder of European Jewry inspired the term “genocide”, which was coined by the jurist Raphael Lemkin in 1944. It later prompted the Genocide Convention agreed to by the United Nations in 1948.

The 1948 Genocide Convention, the memory of the Holocaust, and the phrase “never again” are therefore routinely invoked in the face of atrocity. And yet these words ring hollow in the face of the genocides in Rwanda, Cambodia and Bosnia, and the current genocide against the Rohingya in Myanmar.

Rohingya Muslims walking through a broken road in Bangladesh late last year.
Shutterstock

It happens again and again. Not to mention that the largest refugee crisis since the second world war is occurring at this very moment, as desperate Syrians undertake perilous journeys across the Aegean.

Approximately 80 years ago, Nazi persecution likewise culminated in an international refugee crisis. World leaders recognised the plight of Europe’s Jews during the 1930s and into the 1940s, even if they could not foretell their eventual genocidal fate.

International leaders even convened a conference (the Evian Conference) to debate how best to aid German Jews. Country after country expressed sympathy for Jewish refugees but ultimately denied them refuge.

International news at the time closely followed the journey of the SS St Louis, as it sailed from port to port with 900 Jewish refugees unable to disembark because nations refused to grant asylum. We now know the future that befell many of these refugees.

These days, images of desperate Syrians undertaking perilous journeys across the Aegean, or the Rohingya fleeing their persecution in Myanmar, occupy our front pages. As we contemplate our responsibility towards these desperate individuals, the SS St Louis and the Evian Conference have been routinely invoked in our public discourse as a reminder of the devastating consequences of restrictive refugee and immigration policies.

We must remain haunted by our past failings. It is the legacy of the Holocaust that compels us to examine our responsibility to intervene and to turn our attention to the plight of refugees. And we should consider whether that legacy has remained sufficient, or whether we need a reminder of the dire consequences that come with numbing ourselves to the suffering of others.

Daniella Doron, Senior lecturer, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Pistols at dawn: why there’s more to duelling than what’s seen on our screens


Warren Clarke, Richard Harrington, Ruby Bentall, Aidan Turner, and Kyle Soller in Poldark.
Mammoth Screen

Ryna Ordynat, Monash University

Duelling has gone down in history as a rather quaint and misunderstood practice, the butt of jokes in historical comedies and references. However, duelling was once not only common but considered the pinnacle of honour and bravery, an event that could change one’s reputation – and indeed end one’s life – with the pull of a trigger.

Alexander Hamilton by John Trumbull: he famously died in a duel.
Wikimedia Commons

It is extraordinary, and a little fantastical to the modern mind, that some of the most famous and respected individuals in history fought duels: Alexander Hamilton, one of the Founding Fathers of the US, and Andrew Jackson, its seventh president, among them. Hamilton died in a duel in 1804. Jackson duelled over 100 times, was wounded in two duels and killed at least one man.

In our age of “trolling” and social media wars, the idea that one man may calmly, and according to proper social rules, kill another over an accusation of cheating at cards, or of being a corrupt or incompetent politician, seems utterly barbaric. This modern incomprehension frequently shows in popular media. Modern filmmakers and writers of TV series and musicals can’t help projecting their own feelings when interpreting duelling in their work.

Take, for example, the successful musical Hamilton, based on Alexander Hamilton’s life. The Burr-Hamilton duel, which ended Hamilton’s life, is portrayed in several songs. In one song, founding father Aaron Burr sings, “Can we agree that duels are dumb and immature?” and declares that the whole affair is “absurd”. It must certainly have seemed so to Lin-Manuel Miranda, who wrote the lyrics, but Burr himself probably had very different feelings on the matter.

Recent TV shows set in the 18th and early 19th century, notably Poldark and the 2016 BBC mini-series adaptation of Tolstoy’s War and Peace, depict duels similarly.

In the duel between Pierre and Dolokhov in the miniseries of Tolstoy’s novel, the words “I know it’s stupid but I think I must go through with it” are put into Pierre’s mouth. They encompass what the creators probably understood about duels – that they are stupid, but must be fought, for some unknown reason.

And in season four of Poldark, the main character utters, “My only regret is that I apologised in the first place”, to drive home the point that his misplaced pride regrettably caused the duel.

This is not to argue that duels were in any way positive affairs, or that they should be portrayed as such.

The duel was a highly ritualised activity practised mainly by the upper classes from about 1500 to 1900. It was held in private, usually at dawn, as duelling was illegal throughout Europe and America. It was neither a recreational sport nor an outburst of uncontrollable male aggression – the duel was an affair of honour. In the words of Samuel Johnson:

In a state of highly polished society, an affront is held to be a serious injury. It must, therefore, be resented, or rather a duel must be fought upon it.

Honour was a crucial concept for gentlemen, and for ladies, bound up with one’s reputation. The importance placed on defending honour made refusing a duel challenge nearly impossible; the social consequences of doing so were severe. Indeed, gentlemen did not shoot each other over trivial matters, but rather over slander and accusations of falsehood or dishonesty.

Duels involving women were not fought to win a woman’s love, as some modern adaptations suggest, but because men took responsibility for protecting the honour of certain women in their lives. The duel, therefore, was a way to honourably and privately resolve offences. Its causes ranged from accusations of cheating at cards to women’s infidelity.

Alexander Pushkin, considered by many to be Russia’s greatest poet, died in a duel in 1837, defending his wife Natalya against accusations of infidelity. His death echoed in many ways the famous duel between Eugene Onegin and Vladimir Lensky in Pushkin’s Eugene Onegin.

Eugene Onegin and Vladimir Lensky’s duel, Ilya Repin, 1899, Pushkin Museum of Fine Arts.
Wikimedia Commons

Scrupulous regulation

A duel was scrupulously regulated by an elaborate and detailed set of rules, though the specifics of the duelling code varied between countries. Many duelling codes and instruction manuals were published throughout the 18th and 19th centuries, the most popular being the Irish code duello, published in 1777.

The duelling gentlemen would always have “seconds” – friends whose role was to negotiate a resolution of the dispute to avoid a potentially lethal confrontation, usually with very little success.

French cased duelling pistols circa 1794-1797.
Wikimedia Commons

The high probability of death was, of course, ever present in duels, especially when pistols became more fashionable than rapiers. Pistols could misfire and rarely shot straight, and could also be deadly in the hands of incompetent seconds, whose task it was to provide and load them.

Doctors were also indispensable in duels. The Art of Duelling, published by “A Traveller” in 1836, warns the duellist to remember to “secure the services of his medical attendant, who will provide himself with all the necessary apparatus for tying up wounds and arteries, and extracting balls”.

Public opinion (and ridicule) eventually led to the death of the duel. By the late 19th century, it was successfully banned by most countries, heavily criticised in the press, and frowned upon by the public.

This was, of course, a good thing, as we can all agree there are far better ways of resolving disputes. But next time you watch a duel on television or in a film, it might be worth recalling the history and meaning of this very serious rite of honour.

Ryna Ordynat, PhD Candidate in History, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

