Tag Archives: history

How our discovery of Julius Caesar’s first landing point in Britain could change history




Wellcome Trust/Wikimedia Commons, CC BY-SA

Andrew Fitzpatrick, University of Leicester

During the nine-year-long Battle for Gaul, Julius Caesar fought his way across northwest Europe. He invaded Britain twice: in 55BC, and again in 54BC. But while archaeologists have found evidence of the war in France, very little has been discovered in Britain – until now.

At a site called Ebbsfleet, in northeast Kent, my colleagues from the University of Leicester and I finally uncovered the site where Julius Caesar’s fleet landed in 54BC. A series of surveys and excavations, spanning from 2015 to 2017, revealed a large enclosure, defended by a ditch five metres wide and two metres deep.

What a find: pilum tip from Ebbsfleet.
University of Leicester, Author provided

We dated the ditch to the first century BC by examining the pottery and using radiocarbon dating techniques.

At the bottom of the ditch, we found the tip of an iron weapon, which was later identified as a Roman spear, or “pilum”. Similar weapons were discovered at the site of Alésia in France, where the decisive encounter in the Battle for Gaul took place. What’s more, the defensive ditches at Alésia are the same size and shape as those we discovered at Ebbsfleet.

In Caesar’s own words

Our dig was situated next to Pegwell Bay, a large, sandy beach with chalk cliffs at its northern end. This striking landscape also helps to confirm that we really have found the location of Caesar’s base. Most of what is known about Caesar’s voyage comes from his own written accounts, based on his annual reports to the Roman senate.

When the Roman fleet set sail from France, they intended to use the wind to help them cross the Channel to find a large, safe place to lay anchor and prepare for battle. But the wind dropped, and the fleet was carried too far northeast by the tide.

We came, we saw, we excavated.
University of Leicester, Author provided

At sunrise, Caesar saw Britain “left afar on the port side”. Only high land would have been visible from a small ship far out at sea. And the only such land in northeast Kent is the cliffs near Ebbsfleet. Caesar also describes how he left the ships riding at anchor next to a “sandy, open shore” – a perfect description of Pegwell Bay.

Given that Caesar’s own words seem so clear, it’s surprising that Pegwell Bay has never before been considered as a possible landing site. Instead, Caesar was long thought to have landed at Walmer, 15 kilometres to the south. One reason might be that, until the Middle Ages, Thanet was an island.

The Isle of Thanet was separated from the mainland by the Wantsum Channel. But no one knows how big the channel was 2,000 years ago; it could be that whatever disadvantages it presented were offset by the presence of a large and safe beach, where 800 ships could land and disembark 20,000 men and 2,000 horses in one day.

Peace by force

Despite the imposing size of Caesar’s fleet, it was long thought that his landing had little lasting impact on Britain. Caesar himself returned to France immediately after the two campaigns, without leaving a garrison. Yet the discovery of the landing site gives us cause to question this assumption.

Making history at Ebbsfleet.
Andrew Fitzpatrick/University of Leicester, Author provided

Historical sources, royal burials and ancient coins indicate that from about 20BC, the kings of southeast England had strong links to Rome. But historians have found it hard to explain how these alliances came into existence. The suggestion that they sprang from diplomatic ties forged by the emperor Augustus at that time has never been convincing.

But Caesar tells us that he reached a peace accord with the Britons in 54BC, even taking hostages from the ruling families to ensure the agreement was respected. Perhaps the alliances which came to light in the 20s BC were originally established by Caesar, a generation before emperor Augustus asserted his authority over the Roman Empire.

The close ties between Rome and the kings of southeast England assured emperor Claudius of a relatively easy military victory when he set out to conquer Britain in AD 43. So it seems Caesar’s earlier conquest could have laid the foundations for the Roman occupation of Britain, which lasted more than 300 years.

For Caesar, the consequences of his invasions were clear. In his day, Britain lay beyond the known world. By crossing the ocean and conquering Britain, Caesar caused a sensation in his homeland. He was awarded the longest public thanksgiving in Rome, winning great acclaim and glory in the process. Mission accomplished.

Andrew Fitzpatrick, Research Associate, University of Leicester

This article was originally published on The Conversation. Read the original article.


Mungo Man returns home: there is still much he can teach us about ancient Australia



Mungo Man finally returns to where he was found in the Mungo National Park.
Office of Environment and Heritage/J Spencer

Michael Westaway, Griffith University and Arthur Durband, Kansas State University

The remains of the first known Australian, Mungo Man, today begin their return to the Willandra area of New South Wales, where they were discovered in 1974.

They’ll be accompanied by the remains of around 100 other Aboriginal people who lived in the Willandra landscape during the last ice age.

Their modern descendants, the Mutti Mutti, Paakantyi and Ngyampaa people, will receive the ancestral remains, and will ultimately decide their future.


Read more: Buried tools and pigments tell a new history of humans in Australia for 65,000 years


But the hope is that scientists will have some access to the returned remains, which still have much to tell us about the lives of early Aboriginal Australians.

The Mungo discoveries

For more than a century, non-Indigenous people have collected the skeletal remains of Aboriginal Australians. This understandably created enormous resentment among many Aboriginal people, who objected to the desecration of their gravesites.

The remote landscape of the Willandra region where Mungo Man was first discovered.
Arthur Durband, Author provided

The removal of the remains from the Willandra was quite different: it was done not only to prevent the erosion and destruction of fragile human remains, but also to make sense of their meaning. In 1967 Mungo Woman’s cremated remains were found buried in a small pit on the shores of Lake Mungo.

Careful excavation by scientists from the Australian National University revealed they were the world’s oldest cremation, dated to some 42,000 years ago.

Several years later, and only a few hundred metres from where Mungo Woman was buried, Mungo Man was discovered, adorned in ochre that is thought to have been obtained from about 200km away to the north.

Mungo Man provided a further glimpse into a past that all of a sudden appeared far more complex than archaeologists across the world had previously thought possible. A picture was emerging that here, at a time when Europe was largely populated by Neanderthals, was an ancient culture of far more sophistication, full of symbolism with a thriving and complex belief system.

The discoveries made possible by the initial research of a young Jim Bowler rewrote our understanding of human history.

Some have argued that 42 years of scientific access is long enough for research to have learned everything we can from the remains.

Limited research on the remains

While it is true that Mungo Man was excavated in 1975 and has been in Canberra ever since, the perception that scientists have been undertaking research on his remains since this time is not accurate.

In reality, very few scientists, probably fewer than ten, have been privileged with the opportunity to study the remains. Very little work has been published, which is unfortunate considering the importance of these remains to human history.

Before 2005 only a few papers from a couple of different authors were published, dealing mainly with dating and comparisons with other fossil human remains. None of these provided an actual description of the skeletal remains of Mungo Man.

Science works best when a variety of perspectives are collected by different scientists working on different questions. Science has not truly had this opportunity with Mungo Man.

We are fortunate to be working at a time when technology allows us to understand ancient human remains in ways that couldn’t have been imagined, even ten years ago. The collection of remains from the Willandra Lakes was CT scanned only four years ago, providing a wealth of new data that can be used to understand those populations.

Much to learn from further research

The study of ancient DNA has finally progressed to the point where we can potentially learn a great deal of information from ancient skeletons.

While DNA from contemporary populations can provide significant information, living people can never replace the information we can recover from people that lived 42,000 years ago.

Isotopes are geochemical signatures that can reveal how people may have moved across the landscape, from one geological catchment to another. This type of work was recently applied to questions in other parts of Australia, where research revealed that the ancient megafauna were probably migratory animals.

Further research may allow us to see how ancient Australians interacted with the seasonal movements of the great megafauna herds, whose migrations we now know overlapped with people in the Willandra as recently as 32,000 years ago.

Only three of the ancient remains from the Willandra have been reliably dated, and there are more than 100 other skeletons that have no direct age estimates associated with them.

The early dates from Australia’s north raise the possibility that some of the ancient remains recovered from the Willandra system may be older than those of Mungo Man and Woman. This could further rewrite the history of the peopling of Australia.

Who knows what will be possible as science continues to progress? It is impossible to predict what else we may be able to learn from Mungo Man and the other individuals from the Willandra as technology advances.

Will the story continue?

The discovery of Mungo Man and Mungo Woman sent shockwaves through archaeology. Ancient burials with such sophisticated funerary rituals were unexpected in Pleistocene Australia.

The discovery forced a greater appreciation of the culture of the first Australians and was one of the main reasons that the Willandra Lakes area was given World Heritage status in 1981.

Those of us interested in the origins of the First Australians hope that the long overdue repatriation of Mungo Man will not mark the end of scientific work on his remains.


Read more: Aboriginal Australians co-existed with the megafauna for at least 17,000 years


A keeping place at Lake Mungo would allow for scientific work to be done in the future in greater collaboration with the Traditional Owners, while preserving the remains in a culturally appropriate and respectful way.

The story of the people from the ancient Willandra has been told so far by a small handful of white scientists. One day soon there will be Aboriginal scientists who will bring an entirely different approach to studying the past. A keeping place will give future generations the opportunity to seek answers to those questions.

As scientists interested in the study of human remains, we understand and appreciate the sensitivity involved in our work, and strive to treat these remains with the respect and dignity they deserve.

We are glad that Mungo Man will be returning to country, but equally we hope that he and the other 100 ancient people will be allowed to continue to tell the remarkable story of the First Australians.

Michael Westaway, Senior Research Fellow, Research Centre for Human Evolution, Griffith University and Arthur Durband, Associate Professor of Anthropology, Kansas State University

This article was originally published on The Conversation. Read the original article.


Friday essay: when did Australia’s human history begin?



Fossilised ancient human footprints at the Mungo National Park. How are we to engage with a history that spans 65,000 years?
Michael Amendolia/AAP

Billy Griffiths, Deakin University; Lynette Russell, Monash University, and Richard ‘Bert’ Roberts, University of Wollongong

In July, a new date was published that pushed the opening chapters of Australian history back to 65,000 years ago. It is the latest development in a time revolution that has gripped the nation over the past half century.

In the 1950s, it was widely believed that the first Australians had arrived on this continent only a few thousand years earlier. They were regarded as “primitive” – a fossilised stage in human evolution – but not necessarily ancient.

In the decades since, Indigenous history has been pushed back into the dizzying expanse of deep time. While people have lived in Australia, volcanoes have erupted, dunefields have formed, glaciers have melted and sea levels have risen about 125 metres, transforming Lake Carpentaria into a Gulf and the Bassian Plain into a Strait.

Australia’s Indigenous history has been pushed back into deep time.
Michael Amendolia/AAP

How are we to engage with a history that spans 65,000 years? There is a “gee whiz” factor to any dates that transcend our ordinary understanding of time as lived experience. Human experiences are reduced to numbers. And aside from being “a long time ago”, they are hard to grasp imaginatively.

It is all too easy to approach this history as one might read the Guinness Book of Records, to search the vast expanse of time for easily identifiable “firsts”: the earliest site, the oldest tool, the most extreme conditions. The rich contours of Australia’s natural and cultural history are trumped by the mentality that older is better.

To political leaders, old dates bestow a veneer of antiquity to a young settler nation. To scientists, they propel Australian history into a global human story and allow us to see ourselves as a species. To Indigenous Australians, they may be valued as an important point of cultural pride or perceived as utterly irrelevant. Their responses are diverse.


Further reading: Buried tools and pigments tell a new history of humans in Australia for 65,000 years


Recently, one of us, Lynette Russell, asked 35 Aboriginal friends and colleagues of varying ages, genders and backgrounds for their thoughts about Australia’s deep history.

Many of the responses were statements of cultural affirmation (“We have always been here” or “We became Aboriginal here”), while others viewed the long Indigenous history on this continent through the lens of continuity, taking pride in being members of “the oldest living population in the world” and “the world’s oldest continuing culture”.

As expressions of identity, these are powerful statements. But when others uncritically repeat such notions as historical fact, they risk suggesting that Aboriginal culture has been frozen in time. We need to be careful not to echo the language of past cultural evolutionists, who believed, in Robert Pulleine’s infamous words, that Aboriginal people were “an unchanging people, living in an unchanging environment”.

Rock art at Nourlangie Rock in Kakadu National Park.
Dean Lewins/AAP

This article seeks to move beyond the view of ancient Australia as a timeless and traditional foundation story to explore the ways in which scientists and humanists are engaging with the deep past as a transformative human history.

Memories of time

The revolution in Australia’s timescale was driven by the advent of radiocarbon dating in the mid-20th century. The nuclear chemist Willard Libby first realised the dating potential of carbon-14 isotopes while working on the Manhattan Project (which also produced the atom bomb). In 1949, he and James Arnold outlined a way to date organic materials from a couple of hundred years old to tens of thousands of years old. The key was to measure the memories of time preserved in carbon atoms.

By comparing the decaying isotope, carbon-14, with the stable isotope, carbon-12, they were able to measure the age of a sample with relative precision. The known rate of decay and the amount of carbon-14 remaining together provided the date.
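To make the arithmetic concrete, here is a minimal sketch of the conventional radiocarbon age calculation, assuming the standard Libby mean-life of 8,033 years; the measured ratio in the example is purely illustrative, not a figure from the Mungo analyses.

```python
import math

LIBBY_MEAN_LIFE = 8033  # years: the conventional 5,568-year half-life / ln(2)

def radiocarbon_age(ratio_to_modern: float) -> float:
    """Conventional radiocarbon age in years, from the measured
    carbon-14/carbon-12 ratio expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(ratio_to_modern)

# A sample retaining only 0.5% of its original carbon-14 sits near the
# practical limit of the technique, at roughly 42,500 years.
print(round(radiocarbon_age(0.005)))
```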

“A new time machine has been invented”, Australian archaeologist John Mulvaney declared when he realised the implications of the method. In 1962, he used the new technique at Kenniff Cave in the central Queensland highlands and was stunned to discover that Australia had been occupied during the last Ice Age. The dates of 19,000 years overturned the long-standing idea that Australia was the last continent to be inhabited by modern humans, and the artefacts he uncovered in his excavations revealed a rich history of cultural adaptation.

The remains of Mungo Man.
AAP

The following decade, at Lake Mungo, Australia’s human history was pushed back to the limits of the radiocarbon technique. A sample from spit 17 of Mulvaney and Wilfred Shawcross’ excavations revealed that the ancestors of the Mutthi Mutthi, Ngyiampaa and Paakantji peoples had thrived on these lakeshores over 40,000 years ago. Geomorphologist Jim Bowler also revealed the dramatic environmental fluctuations these people endured: what is now a dusty and desiccated landscape was then a fertile lake system with over 1,000 km² of open water.


Further reading: Mungo Man returns home: there is still much he can teach us about ancient Australia


The date of 40,000 years had a profound public impact and announced the coming of age of Australian archaeology. The phrase “40,000 years” quickly appeared on banners outside the Tent Embassy in Canberra, in songs by Aboriginal musicians and in land rights campaigns. When the bicentenary of European settlement was marked on 26 January 1988, thousands of Australians protested the celebrations with posters reading “White Australia has a Black History” and “You have been here for 200 years, we for 40,000”. The comparison magnified the act of dispossession.

A mural in Redfern, Sydney, based on the lyrics of the Joe Geia song ‘40,000 Years’.
Billy Griffiths

The discovery of 65,000 years of human occupation at Madjedbebe rock shelter on Mirrar land, at the edge of the Arnhem Land escarpment, draws on a different dating method: optically stimulated luminescence. This technique analyses individual grains of sand and the charge that builds up in their quartz crystal lattice over time. By releasing and measuring this charge, geochronologists are able to reveal the moment a grain of sand was last exposed to sunlight.
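In outline, the age equation behind such a luminescence date is a simple ratio of absorbed radiation dose to the dose delivered each year by the surrounding sediment. The sketch below uses purely illustrative numbers, not the published Madjedbebe values.

```python
def osl_age_ka(equivalent_dose_gy: float, dose_rate_gy_per_ka: float) -> float:
    """Luminescence age in thousands of years (ka): the total radiation dose
    a buried quartz grain has absorbed (in grays), divided by the dose its
    surroundings deliver per thousand years."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# A grain that absorbed 97.5 grays in sediment delivering 1.5 grays per
# thousand years was last exposed to sunlight about 65,000 years ago.
print(osl_age_ka(97.5, 1.5))  # 65.0
```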

The archaeological site at Madjedbebe is far more than an old date; it reveals a long and varied history of human occupation, with evidence of profound cultural and ecological connections across the landscape, cutting-edge Ice Age technology (such as the world’s earliest ground-edge axe) and dramatic environmental change.

Perhaps most evocatively, throughout the deposit, even at the lowest layers, archaeologists found ochre crayons: a powerful expression of artistic endeavour and cultural achievement.

Scientist Elspeth Hayes with Mark Djandjomerr (centre) and traditional owner May Nango, extracting comparative samples at a cave adjacent to the Madjedbebe rock shelter in Kakadu National Park.
Vincent Lamberti/GUNDJEIHMI ABORIGINAL CORPORATION

In the wake of the discovery, in August 2017, Prime Minister Malcolm Turnbull seized upon the new date in his speech at Garma, singling out the possibilities of this deep time story for political reconciliation:

I am filled with optimism about our future together as a reconciled Australia. Last month scientists and researchers revealed new evidence that our First Australians have been here in this land for 65,000 years. … This news is a point of great pride for our nation. We rejoice in it, as we celebrate your Indigenous cultures and heritage as our culture and heritage – uniquely Australian.

Although Turnbull revels in the deep time story, his speech avoids reflecting on the more recent past. Here is a statement of reconciliation that does not address the estrangement that it is seeking to overcome. As such it opens itself up to being dismissed as simply a prolonged platitude.

We cannot engage with the past 65,000 years without acknowledging the turbulent road of the past two centuries.

A story of rupture and resilience

When Europeans arrived in Australia in the 17th and 18th centuries they were setting foot onto a land that had been home to thousands of generations of Indigenous men and women. These groups lived along the coasts and hinterlands and travelled into the mountains and across stone plateaus; they thrived in the harsh deserts and gathered in great numbers along waterways and rivers.

Although Australia is a continent, it is home to hundreds of different nations, over 200 language groups and an immense variety of cultural, geographic and ecological regions. To the newcomers these people were simply perceived as “the natives”, and despite the immense cultural diversity across vastly different environmental zones, the disparate groups became labelled with the umbrella term: “the Aborigines”.

There is a similar tendency today to homogenise the deep history of the first Australians. The dynamic natural and cultural history of Australia is too often obscured by tropes of timelessness. Tourism campaigns continue to tell us that this is the land of the “never never”, the home of “ancient traditions” and “one of the world’s oldest living groups”.

Such slogans imply a lack of change and hide the remarkable variety of human experiences on this continent over tens of thousands of years. While there is great continuity in the cultural history of Indigenous peoples, theirs is also a story of rupture and resilience.

The 1989 excavations at Madjedbebe (Malakunanja II), Arnhem Land.
Mike Smith

The discovery of old dates at Madjedbebe does not make the history of the site any more or less significant. It simply reminds us that science, like history, is an ongoing inquiry. All it takes is a new piece of evidence to turn on its head what we thought we knew. Science is a journey and knowledge is ever evolving.

The epic story of Australia will continue to shift with the discovery of new sites and new techniques, and by engaging and collaborating with different worldviews. It is a history that can only be told by working across cultures and across disciplines; by bridging the divide between the sciences and the humanities and translating numbers and datasets into narratives that convey the incredible depth and variety of human experience on this continent.

The authors of this article will continue this conversation at a public event in Wollongong on Friday 24 November 2017 at the annual meeting of the Australasian Association for the History, Philosophy and Social Studies of Science. There will be two other sets of speakers, exploring issues surrounding precision medicine and artificial intelligence. Register here.

Billy Griffiths, Research fellow, Deakin University; Lynette Russell, Professor, Indigenous Studies and History, Monash University, and Richard ‘Bert’ Roberts, ARC Australian Laureate Fellow and Director, ARC Centre of Excellence for Australian Biodiversity and Heritage (CABAH), University of Wollongong

This article was originally published on The Conversation. Read the original article.


Was agriculture the greatest blunder in human history?



Rice farmers near Siem Reap, Cambodia.
Darren Curnoe, Author provided

Darren Curnoe, UNSW

Twelve thousand years ago everybody lived as hunters and gatherers. But by 5,000 years ago most people lived as farmers.

This brief period marked the biggest shift ever in human history with unparalleled changes in diet, culture and technology, as well as social, economic and political organisation, and even the patterns of disease people suffered.

While there were upsides and downsides to the invention of agriculture, was it the greatest blunder in human history? Three decades ago Jared Diamond thought so, but was he right?

Agriculture developed worldwide within a single, narrow window of time: between about 12,000 and 5,000 years ago. But it wasn’t invented just once: as far as we know, it originated independently at least seven times, and perhaps as many as 11.

Farming was invented in places like the Fertile Crescent of the Middle East, the Yangzi and Yellow River Basins of China, the New Guinea highlands, in the Eastern USA, Central Mexico and South America, and in sub-Saharan Africa.

And while its impacts were tremendous for people living in places like the Middle East or China, its impacts would have been very different for the early farmers of New Guinea.

The reasons why people took up farming in the first place remain elusive, but dramatic changes in the planet’s climate during the last Ice Age — from around 20,000 years ago until 11,600 years ago — seem to have played a major role in its beginnings.

The invention of agriculture thousands of years ago led to the domestication of today’s major food crops like wheat, rice, barley, millet and maize, legumes like lentils and beans, sweet potato and taro, and animals like sheep, cattle, goats, pigs, alpacas and chickens.

It also dramatically increased the human carrying capacity of the planet. But in the process the environment was dramatically transformed. What started as modest clearings gave way to fields, with forests felled and vast tracts of land turned over to growing crops and raising animals.

In most places the health of early farmers was much poorer than that of their hunter-gatherer ancestors, because of the narrower range of foods they consumed alongside widespread dietary deficiencies.

At archaeological sites like Abu Hureyra in Syria, for example, the changes in diet accompanying the move away from hunting and gathering are clearly recorded. The diet of Abu Hureyra’s occupants dropped from more than 150 wild plants consumed as hunter-gatherers to just a handful of crops as farmers.

In the Americas, where maize was domesticated and heavily relied upon as a staple crop, iron absorption was low, dramatically increasing the incidence of anaemia. A rice-based diet, the main staple of early farmers in southern China, was deficient in protein and inhibited vitamin A absorption.

There was a sudden increase in the number of human settlements, signalling a marked shift in population. While maternal and infant mortality increased, female fertility rose with farming – the fuel in the engine of population growth.

The planet had supported roughly 8 million people when we were only hunter-gatherers. But the population exploded with the invention of agriculture, climbing to 100 million people by 5,000 years ago and reaching 7 billion today.

People began to build permanently occupied settlements covering more than ten hectares – the size of ten rugby fields. At archaeological sites like Çatalhöyük in Turkey, early towns housed up to ten thousand people in rectangular stone houses entered through doors in their roofs.

By way of comparison, traditional hunting and gathering communities were small, perhaps up to 50 or 60 people.

Crowded conditions in these new settlements, human waste, animal handling and the pest species attracted to them all led to increased illness and the rapid spread of infectious disease.

Today, around 75% of infectious diseases suffered by humans are zoonoses, ones obtained from or more often shared with domestic animals. Some common examples include influenza, the common cold, various parasites like tapeworms and highly infectious diseases that decimated millions of people in the past such as bubonic plague, tuberculosis, typhoid and measles.

In response, natural selection dramatically sculpted the genome of these early farmers. Genes for immunity are over-represented in the evidence for natural selection, and most of the changes can be timed to the adoption of farming. Geneticists suggest that 85% of the disease-causing gene variants among contemporary populations arose alongside the rise and spread of agriculture.

In the past, humans could only tolerate lactose during childhood. But with the domestication of dairy cows, natural selection provided northern European farmers and pastoralist populations in Africa and West Asia with a variant that keeps the lactase gene active into adulthood. This variant is almost completely absent elsewhere in the world, and it allowed adults to tolerate lactose for the first time.

Starch consumption is also a feature of agricultural societies and of some hunter-gatherers living in arid environments. The amylase genes, which increase people’s ability to digest starch in their diet, were also subject to strong natural selection and increased dramatically in number with the advent of farming.

Another surprising change seen in the skeletons of early farmers is a smaller skull, especially the bones of the face. Palaeolithic hunter-gatherers had larger skulls due to their more mobile and active lifestyle, including a diet that required much more chewing.

Smaller faces affected oral health because human teeth didn’t reduce proportionately to the smaller jaw, so dental crowding ensued. This led to increased dental disease along with extra cavities from a starchy diet.

Living in densely populated villages and towns created, for the first time in human history, private living spaces in which people no longer shared their food or possessions with their community.

These changes dramatically shaped people’s attitudes to material goods and wealth. Prestige items became highly sought after as hallmarks of power. And with larger populations came growing social and economic complexity and inequality and, naturally, increasing warfare.

Inequalities of wealth and status cemented the rise of hierarchical societies — first chiefdoms then hereditary lineages which ruled over the rapidly growing human settlements.

Eventually they expanded to form large cities, and then empires, with vast areas of land taken by force with armies under the control of emperors or kings and queens.

This inherited power was the foundation of the ‘great’ civilisations that developed across the ancient world and into the modern era with its colonial legacies that are still very much with us today.

No doubt the bad well and truly outweighs all the good that came from the invention of farming all those millennia ago. Jared Diamond was right: the invention of agriculture was without doubt the biggest blunder in human history. But we’re stuck with it, and with so many mouths to feed today we have to make it work better than ever – for the future of humankind and the planet.

Darren Curnoe, Associate Professor and Chief Investigator, ARC Centre of Excellence for Australian Biodiversity and Heritage, UNSW

This article was originally published on The Conversation. Read the original article.


From shouting it out to staying at home: a brief history of British voting


Hogarth’s The Polling, from the Humours of an Election series.
Wikipedia

Malcolm Crook, Keele University and Tom Crook, Oxford Brookes University

Most of the voters who will be casting their ballots in the general election on Thursday June 8 will take their right to do so for granted, unaware of the contested history of this now familiar action. It’s actually less than 100 years since all adult males in the UK were awarded the franchise for parliamentary elections, in 1918, in the wake of World War I. That right wasn’t extended to all adult women for a further ten years after that.

Even today, it might be argued, the democratic principle of “one person, one vote” has not been fully implemented, since the royal family and members of the House of Lords are not allowed to vote in parliamentary elections. And even after the mass enfranchisement of the early 20th century, university graduates and owners of businesses retained a double vote, the former in their university constituencies as well as where they lived. These privileges were only abolished in 1948, in the face of overwhelming Conservative opposition.

How Britain votes today is also a relatively late development in electoral history. Until 1872, parliamentary electors cast their votes orally, sometimes in front of a crowd, and these choices were then published in a poll book. Public voting was often a festive, even riotous affair. Problems of intimidation were widespread, and sanctions might be applied by landlords and employers if voters failed to follow their wishes, though this was widely accepted at the time as the “natural” state of affairs.

Open voting even had its defenders, notably the political radical John Stuart Mill, who regarded it as a manly mark of independence.

But as the franchise was partially extended in the 19th century, the campaign for secrecy grew. The method that was eventually adopted was borrowed from Australia, where the use of polling booths and uniform ballot papers marked with an “X” was pioneered in the 1850s.

More recent reforms took place in 1969, when the voting age was lowered from 21 to 18. Party emblems were also allowed on the ballot paper for the first time that year. It’s this kind of paper that will be used on June 8.

Staying at home

What no one predicted when these franchise and balloting reforms were first implemented, however, is that voters would simply not bother to turn out, abstaining in considerable numbers.

To be sure, this is a relatively recent phenomenon. In fact, turnout at general elections remained high for much of the 20th century, even by European standards. The best turnout was secured in the 1950 general election, when some 84% of those eligible voted. And the figure didn’t dip below 70% until 2001, when only 59% voted. Since then things have improved slightly. In 2010, turnout was 65%. In 2015, it was 66%. But the fact remains that, today, a massive one-third of those eligible to vote fail to do so, preferring instead to stay at home (and the situation in local elections is far worse).

Turnout over the years.
Author provided

What was a regular habit for a substantial majority of the electorate has now become a more intermittent practice. Among the young and marginalised, non-voting has become widely entrenched. Greater personal mobility and the decline of social solidarity have made the decision to vote a more individual choice, which may or may not be exercised according to specific circumstances, whereas in the past it was more of a duty to be fulfilled.

Voters rarely spoil their papers in the UK, whereas in France it is a traditional form of protest that has reached epidemic proportions: some 4m ballot papers were deliberately invalidated in the second round of the recent presidential election. Like the rise in abstention in both countries, it surely reflects disenchantment with the electoral process as well as disappointment with the political elite.

In these circumstances, the idea of compulsory voting has re-emerged, though in liberal Britain the idea of forcing people to the polling station has never exerted the same attraction as on the continent. The obligation to vote is a blunt instrument for tackling a complex political and social problem. When the interest of the electorate is fully engaged, as in the recent Scottish or EU referendums, then turnout can still reach the 75% to 80% mark.

However, in the forthcoming parliamentary election – following hard on the heels of its predecessor in 2015, the EU vote and elections to regional assemblies in 2016, plus the local elections in May – voter fatigue may take a toll. It’s hard to envisage more than two-thirds of those entitled to do so casting their ballot on June 8. Given the relatively small cost involved in conducting this civic act, which is the product of so much historical endeavour, such disaffection must be a cause for significant concern.

Malcolm Crook, Emeritus Professor of French History, Keele University and Tom Crook, Senior Lecturer in Modern British History, Oxford Brookes University

This article was originally published on The Conversation. Read the original article.


Dove, real beauty and the racist history of skin whitening



The Dove ad published on Facebook, which the company took down after many complaints of racial insensitivity.
NayTheMUA/Facebook

Liz Conor, La Trobe University

This week the marketing office of Dove, a personal care brand of Unilever, found itself in hot water over an ad that many people have taken to be racially insensitive. Social media users called for a boycott of the brand’s products.

The offending ad showed a black woman appearing to turn white after using its body lotion. This online campaign was swiftly removed but had already hurtled through social media after a US makeup artist, Naomi Blake (Naythemua), posted her dismay on Facebook, calling the ad “tone deaf”.


Dove responded initially via Twitter.


The company then followed up with a longer statement: “As a part of a campaign for Dove body wash, a three-second video clip was posted to the US Facebook page … It did not represent the diversity of real beauty which is something Dove is passionate about and is core to our beliefs, and it should not have happened.”


One has to ask: were the boys destined for Dove marketing kicking on at the pub instead of going to their History of Advertising lecture, the one with the 1884 Pears’ soap ad PowerPoint? Jokes aside, Dove’s troubling ad buys into a racist history of seeing white skin as clean, and black skin as something to be cleansed.

The original Pears’ soap advert based on the fable Washing the Blackamoor white, published in the Graphic for Christmas 1884.
Author provided

Racist history

Dove has missed the mark before. In a 2011 ad, three progressively paler-skinned women stand in towels under two boards labelled “Before” and “After”, implying transitioning to lighter skin was the luminous beauty promise of Dove (Dove responded that all three women represented the “after” image).

Many of the indignant comments reference the longstanding trope of black babies and women scrubbed white. Australia has particular form on this front. Gamilaraay Yuwaalaraay historian Frances Peters-Little (filmmaker and performing artist) has demanded an apology from Dove. She posted a 1901 advertisement for Nulla Nulla soap on Facebook to show the long reach of racism through entrenched tropes still at work in the Dove ads.

An advertisement for Nulla Nulla soap from 1901.
Author provided

Wiradjuri author Kathleen Jackson has also written about the Nulla Nulla ad and the kingplate, a badge of honour given by white settlers to Aboriginal people, labelled “DIRT”. She explains that whiteness was seen as purity, while blackness was seen as filth, something that colonialists were charged to expunge from the face of the Earth. Advertising suggested imperial soap had the power to eradicate indigeneity.

This coincided with policies that were expressly aimed at eliminating the “native”. In Australia the policy of assimilation was based on the entirely spurious scientific whimsy of “biological absorption”, that dark skin and indigenous features could be eliminated through “breeding out the colour”.

In New South Wales, “half-caste” girls were targeted for removal from their families and placed as domestic servants in white homes where it was assumed “lower-class” white men would marry them. These women were often vulnerable to sexual violence. Any resulting children, however begotten, would be fairer-skinned, due implicitly to the bleaching properties of white men’s semen.

Aboriginal mothers were vilified as unhygienic and neglectful. In fact, they battled against often impossible privation to turn their children out immaculately in the hope police would have less cause to remove them.

Real beauty?

Cleanliness and godliness, whiteness and maternal competency: these are the lacerations Dove liberally salted with its history-blind ad. It unwittingly strikes at the resistance and resilience of Aboriginal families who for generations fended off fragmentation, draconian administration and intrusive surveillance by state administrators. Its myopic implied characterisation of beauty as resulting from shedding blackness is mystifying.

In 2004, Dove kicked off a campaign for “Real Beauty”. It proclaims itself “an agent of change to educate and inspire girls on a wider definition of beauty and to make them feel more confident about themselves”. Dove’s online short films about beauty standards – including Daughters, Onslaught, Amy and Evolution – have been recognised with international advertising awards.

Yet Dove also sits within Unilever alongside Fair and Lovely, a skin-whitening product and brand developed in India in 1975. This corporate cousin to Dove touts its bleaching agent as the No. 1 “fairness cream” and purports to work by activating “the Fair and Lovely vitamin system to give radiant even toned skin”. It is sold in over 40 countries.

Skin whitening products (there is also a Fair and Handsome for men, not associated with Unilever) are popular in Asia, where more than 60 companies compete in a market estimated at US$18 billion. They enforce social hierarchies around caste and ethnicity. Since the 1920s the racialised politics of skin lightening have spread around the globe as consumer capitalism reached into China, India and South Africa.

Dove responded to its controversial ad by saying that “the diversity of real beauty… is core to our beliefs”. But “core” here seems skin-deep when it fails to penetrate into the pores of its parent company and its subsidiaries.

Liz Conor, ARC Future Fellow, La Trobe University

This article was originally published on The Conversation. Read the original article.





10 Best History Apps for Android


The link below is to an article that looks at 10 of the best history apps for Android.

For more visit:
http://www.androidauthority.com/best-history-apps-for-android-801017/


A bloody decade of the iPhone



Foxconn was nominated for the 2011 Public Eye Award, which produced this image as part of its campaign to end labour exploitation.
Greenpeace Switzerland/flickr, CC BY-NC-ND

Jack Linchuan Qiu, Chinese University of Hong Kong

This article is part of the Democracy Futures series, a joint global initiative with the Sydney Democracy Network. The project aims to stimulate fresh thinking about the many challenges facing democracies in the 21st century.


Ten years ago the first iPhone went on sale. The iconic product profoundly altered not only the world of gadgets, but also the worlds of consumption and towering corporate profit – a world that would be impossible without the toil of millions along the assembly line.

I look back at the first ten years of the iPhone and see a bloody decade of labour abuse, especially in Chinese factories such as those run by Foxconn, the world’s largest electronics manufacturer. At one point Foxconn had more employees in China than the entire US armed forces.

Foxconn makes most of its money from assembling iPhones, iPads, iMacs and iPods. Its notorious “military management” was blamed for causing a string of 17 worker suicides in 2010.

The company tried so hard to stop the suicides, not by digging out the roots of exploitation, but by erecting “anti-jumping nets” atop its buildings. Never before has a modern factory hidden behind such suicide-prevention netting, which last appeared on transatlantic slave ships centuries ago.

Foxconn is only one part of the Apple empire. The long and complicated supply chain has caused innumerable work injuries, occupational diseases and premature deaths over the past decade.

To date, Apple has not offered a full account of the lives damaged. Counting all Apple suppliers, the victims must number many, many thousands. And yet factories like Foxconn often enjoy immunity, sometimes taking no responsibility at all.

Readers unfamiliar with the dark reality behind the iPhone need only watch Complicit.

To make a living, workers must break the law

Apple continues to put out bogus claims:

Products made to have a positive impact. On the world and the people who make them.

The company claims to hold its suppliers accountable “to the highest standards”.

In reality, corporate practices in the making of the iPhone are substandard when held up against either Chinese labour regulations or ethical smartphone companies such as Fairphone. Apple’s standards for their workers are anything but “the highest”.

Wages remain low. Students and Scholars Against Corporate Misbehaviour calculate that the living wage for an iPhone worker in Shenzhen, China, should be about $650 per month. But to earn this amount today, an average worker would need to put in 80-90 hours of overtime every month – more than double the legal cap of 36 hours.

In other words, to make a living, workers have no choice but to break Chinese law.

Back in 2012, Apple vowed to work with Foxconn to bring the amount of overtime down to no more than 49 hours a week. It later broke its promise and retreated to adopt the Electronic Industry Code of Conduct (EICC), which stipulates “no more than 60 hours a week”.

The EICC standard is 25% lower than the Chinese legal threshold. So why did Apple opt for a less-than-legal code of conduct in the Chinese context over a higher standard? Tim Cook owes us an explanation.

Even with the EICC, workers refusing to do excessive overtime at the current wage level simply won’t be able to make ends meet. The only way for workers to earn a livelihood without doing an illegal amount of overtime, and without compromising their physical, mental and social health, is for Apple and their suppliers to raise basic wages.

Is there real progress behind the progress reports?

Apple also brags about its training programs. According to its 2017 Supplier Responsibility Progress Report, the company partnered with its suppliers to train more than 2.4 million workers on their rights as employees. One basic right is for workers to unionise.

However, those at Foxconn are stuck with a management-run fake union that is ineffective and fooling no one.

If Apple is serious about its words, it should let workers know about their rights to genuine union representation and use its influence to let workers exercise this right. Unfortunately, no such thing has occurred in the past ten years. Will it happen in the next ten?

Apple’s standards for their workers are anything but ‘the highest’.
Annette Bernhardt/flickr

Considering that Apple has recently backed out of the Fair Labor Association (FLA), a third-party auditor of corporate social responsibility (CSR), I’m sceptical. The FLA is not exactly “the highest standard” in labour-related auditing to begin with. But Apple no longer even bothers to ask it to assess supplier working conditions.

Despite this regressive move, Apple declared in its annual CSR report that it “continue(s) to partner with independent third-party auditors”.

The glossy report offers no information on who the auditors actually are, and how their independence is guaranteed. This is fairly inconsistent with Apple’s claim to be the most transparent of IT companies.

What then, are “the highest standards”? The least Apple can do is to let international trade union federations audit Foxconn and other suppliers to ensure their workers are not mistreated. If Apple and Foxconn are so proud of what they have done for workers, why would they be afraid?

Apple should also stop pretending it doesn’t know about Fairphone, the Lovie Award-winning Dutch smartphone firm that was Europe’s “fastest-growing tech startup” in 2015.

Fairphone, with its modular design, information transparency and worker welfare fund, has brought revolutionary change to the ethical design, manufacture and recycling of smartphones, setting a truly new standard for the likes of Apple.

Last August, I visited Hi-P, a factory in Suzhou, eastern China, that assembles Fairphones. Hi-P also happens to be a supplier for Apple. According to a worker I spoke to, she and her colleagues preferred to make Fairphones because the job was less demanding and more generously remunerated.

“It’s much harder working for Apple. They are so stingy,” the assembly-line worker in her late 30s told me. “Our managers asked them [Apple] to give us similar bonuses [as we received from Fairphone]. They tried again and again, but ended up getting nothing even close.”

If an ordinary worker can plainly demonstrate that Apple does not, in fact, have the “highest standards”, surely it’s time the company stopped pleading ignorance or innocence of its labour abuse.

There’s no excuse for Apple’s first bloody decade of the iPhone. And even less so for its next ten years.


Jack Linchuan Qiu’s book, Goodbye iSlave: A Manifesto for Digital Abolition, is available from The University of Illinois Press.

Jack Linchuan Qiu, Professor, School of Journalism and Communication, Chinese University of Hong Kong

This article was originally published on The Conversation. Read the original article.


A home for everyone? Property ownership has been about status and wealth since our convict days



A house and land on the River Derwent, Tasmania, 1822.
National Library of Australia

Imogen Wegman, University of Tasmania

While Australia has an egalitarian mythology, where everyone has a chance, the roots of problems with access to housing lie in our history. The first land grants were given to former convicts as a way to control an unfenced prison colony. As free settlers arrived in Australia, priorities changed, land ownership gained prestige, and smaller landholders were pushed out of the market.

When Governor Phillip stepped onto Australian soil for the first time, in 1788, he carried with him a set of instructions to guide him through the early days of the newest British colony. Included were the authority to grant land and the number of acres each male convict could receive at the end of his sentence. Eighteen months later, the colony received further instructions from Home Secretary William Grenville, permitting soldiers and free settlers to receive parcels of land if they chose to stay in the colony.

Grants given to former convicts at Norfolk Plains, northern Tasmania, 1814.
G.W. Evans, held by Tasmanian Archives and Heritage Office, AF 396/1/1325

Grenville’s instructions also set out the pattern of land granting that would dominate the colony for the next two decades. Groups of grants were to be placed at the edge of a waterway, with each individual property stretching back into the land rather than along the bank. These rules had a long history; the American colony of Georgia received almost identical phrasing in 1754, but other versions had been in place since the early 18th century.

The rules had two specific purposes in Australia: to foster productivity; and to maintain surveillance over the landholding population, which consisted largely of former convicts.

Initially, all land grants were required to conform to these instructions, and status was shown by the amount of land received. Former convicts started at 30 acres, while free settlers got at least 100 acres.

Under this scheme everyone would receive a mixture of good and bad soils, access to a navigable river and the safety of a surrounding community – important in an unfamiliar land. These grants would reduce the colony’s reliance on imported provisions. Instead, it could feed excess produce into the ports that restocked passing ships.

Colonial exploration and expansion could then continue to stretch to the furthest parts of the globe. But the rules also kept the grantees contained and within a day's travel of a centre of governance (Hobart or Launceston, for example).

Free settlers’ arrival changed the rules

In 1817, the Colonial Office began to encourage voluntary emigration to the Australian colonies, and ambitious free settlers arrived. People complained about the failings of the former convicts, who practised a rough agriculture that did not fit British ideals.

At the same time, the management of convicts in Van Diemen’s Land (Tasmania) moved towards the harsh penitentiary system today associated with convicts. Using land grants to pin the former convict population to specific locations, while permitting them the freedom to live their lives, conflicted with free settlers' aspirations for the colony.

It is no accident that Bothwell, in Tasmania’s Derwent Valley, was not directly connected to Hobart by river and was dominated by free settlers. The spread of Europeans across the land resulted from the mix of an expanding overland road network and the reduced need to keep these higher-status settlers within arm's reach.

Grants at Bothwell were given primarily to free settlers.
Surveyor and date unknown, Tasmanian Archives and Heritage Office, AF 396/1/338

Land granting policies that excluded poorer settlers (most of whom were former convicts or the children of convicts) were introduced. Only those people with £500 capital and assets (roughly A$80,000) would be eligible. The minimum grant would be 320 acres.

One writer, the colonial surveyor G.W. Evans, asked at the time whether this was intended to drive those without means to the United States of America instead. Even if they scraped together the money, the sheer quantity of land would be beyond their ability to cultivate.

Average grant sizes, taken from specific representative regions to eliminate duplicates in the records.
Author, 2017

Locating former convicts on the rivers ensured productivity and the reliable transportation of goods, but these grants also kept them under close observation. As the penal system became more punitive, convicts lost the hope of gaining a small piece of land after their sentence.

But before this, far from being intended as any kind of reward or enticement, the first land grants given in Australia represented ongoing control over the lowest class of settlers – those who had been “transported beyond the seas”. Since the beginning of our colonial history, land ownership in Australia has been intricately connected with role and status.

Imogen Wegman, PhD candidate, History and Classics, University of Tasmania

This article was originally published on The Conversation. Read the original article.

