
Stay alert, infodemic, Black Death: the fascinating origins of pandemic terms




Simon Horobin, University of Oxford

Language always tells a story. As COVID-19 shakes the world, many of the words we’re using to describe it originated during earlier calamities – and have colourful tales behind them.

In the Middle Ages, for example, fast-spreading infectious diseases were known as plagues – as in the Bubonic plague, named for the characteristic swellings (or buboes) that appear in the groin or armpit. With its origins in the Latin word plaga meaning “stroke” or “wound”, plague came to refer to a wider scourge through its use to describe the ten plagues suffered by the Egyptians in the biblical book of Exodus.




Read more:
Lost in translation: five common English phrases you may be using incorrectly


An alternative term, pestilence, derives from Latin pestis (“plague”), which is also the origin of French peste, the title of the 1947 novel by Albert Camus (La Peste, or The Plague) which has soared up the bestseller charts in recent weeks. Latin pestis also gives us pest, now used to describe animals that destroy crops, or any general nuisance or irritant. Indeed, the bacterium that causes Bubonic plague is called Yersinia pestis.

The bacterium Yersinia pestis, which causes Bubonic plague.
Shutterstock

The Bubonic plague outbreak of the 14th century was also known as the Great Mortality or the Great Death. The Black Death, which is now most widely used to describe that catastrophe, is, in fact, a 17th-century translation of a Danish name for the disease: “Den Sorte Død”.

Snake venom, the original ‘virus’

The later plagues of the 17th century led to the coining of the word epidemic. This came from a Greek word meaning “prevalent”, from epi “upon” and demos “people”. The more severe pandemic is so called because it affects everyone (from Greek pan “all”).

A more recent coinage, infodemic, a blend of info and epidemic, was introduced in 2003 to refer to the deluge of misinformation and fake news that accompanied the outbreak of SARS (an acronym formed from the initial letters of “severe acute respiratory syndrome”).

The 17th-century equivalent of social distancing was “avoiding someone like the plague”. According to Samuel Pepys’s account of the outbreak that ravaged London in 1665, infected houses were marked with a red cross and had the words “Lord have mercy upon us” inscribed on the doors. Best to avoid properties so marked.

The current pandemic, COVID-19, is a contracted form of Coronavirus disease 2019. The term for this genus of viruses was coined in 1968 and referred to their appearance under the microscope, which reveals a distinctive halo or crown (Latin corona). Virus comes from a Latin word meaning “poison”, first used in English to describe a snake’s venom.

The word vaccine comes from the Latin ‘vacca’, meaning ‘cow’.
Shutterstock

The race to find a vaccine has focused on the team at Oxford University’s Jenner Institute, named for Edward Jenner (1749-1823). It was his discovery that milkmaids who had contracted cowpox became immune to the far more severe smallpox. This discovery lies behind the term vaccine (from the Latin vacca “cow”); a successful vaccine gives individuals immunity (originally a term certifying exemption from public service). Inoculation was initially a horticultural term describing the grafting of a bud into a plant: from Latin oculus, meaning “bud” as well as “eye” (as in binoculars “having two eyes”).

Although we are currently adjusting to social distancing as part of the “new normal”, the term itself has been around since the 1950s. It was initially coined by sociologists to describe individuals or groups deliberately adopting a policy of social or emotional detachment.




Read more:
A history of English … in five words


Its use to refer to a strategy for limiting the spread of a disease goes back to the early 2000s, with reference to outbreaks of flu. Flu is a shortening of influenza, adopted into English from Italian following a major outbreak which began in Italy in 1743. Though the 1918 pandemic is often called the Spanish flu, the strain that triggered it most likely began elsewhere; its origins remain uncertain. The name derives from a particularly severe outbreak in Spain.

To the watchtower

Self-isolation, the measure of protection which involves deliberately cutting oneself off from others, is first recorded in the 1830s – isolate goes back to the Latin insulatus “insulated”, from insula “island”. An extended mode of isolation, known as quarantine, is from the Italian quarantina referring to “40 days”. The specific period derives from its original use to refer to the period of fasting in the wilderness undertaken by Jesus in the Christian gospels.

Lockdown, the most extreme form of social containment, in which citizens must remain in their homes at all times, comes from its use in prisons to describe a period of extended confinement following a disturbance.

Many governments have recently announced a gradual easing of restrictions and a call for citizens to “stay alert”. While some have expressed confusion over this message, for etymologists the required response is perfectly clear: we should all take to the nearest tall building, since alert is from the Italian all’erta “to the watchtower”.

Simon Horobin, Professor of English Language and Literature, University of Oxford

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Museums are losing millions every week but they are already working hard to preserve coronavirus artefacts



The Smithsonian Institution closed all of its museums due to the worldwide COVID-19 pandemic.
Shutterstock

Anna M. Kotarba-Morley, Flinders University

The COVID-19 pandemic has no borders and has caused hundreds of thousands of deaths in countries across the globe. But this outbreak is not just affecting the societies of today; it is also impacting our past.

Cultural resources and heritage assets – from sites and monuments, historic gardens and parks, museums and galleries, to the intangible lifeways of traditional culture bearers – require ongoing safeguarding and maintenance in an overstretched world increasingly prone to major crises.

Meanwhile, the heritage sector is already working hard to preserve the COVID-19 moment, predicting that future generations will need documentary evidence, photographic archives and artefacts to help them understand this period of history.

Closed to visitors

The severity of the pandemic, and the infection control responses that followed, has caused great uncertainty and potential long-term knock-on effects within the sector, especially for small and medium-sized institutions and businesses.

A survey published by the Network of European Museum Organisations (NEMO) and communications within organisations such as the International Committee on Archaeological Heritage Management (ICAHM) show that the majority of European museums are closed, incurring significant losses of income. By the beginning of April, 650 museums from 41 countries had responded to the NEMO survey, with 92% reporting they were closed.

Large museums such as the Kunsthistorisches Museum in Vienna and the Rijksmuseum and Stedelijk Museum in Amsterdam are losing €100,000-€600,000 (A$168,700-A$1,012,000) per week. On average, only about 70% of staff are currently being retained at these institutions.

Museums (both private and national) located in tourist areas have privately reported initial income losses of 75-80%, according to the Heritage Sector Briefing to the UK government. Reports are also emerging of philanthropic income falling by 80-90% at heritage charities, with many heading towards insolvency within weeks.

Cambodia’s Angkor Wat heritage site has lost 99.5% of its income in April compared to the same time last year.

Meanwhile, restoration work on the cathedral of Notre-Dame de Paris came to an abrupt halt due to coronavirus, just prior to the first anniversary of the fire that damaged it. Builders have since returned to the site.

The situation is especially dire for culture bearers within remote and isolated Indigenous communities still reeling from other catastrophes, such as the disastrous fires in Australia and the Amazon. Without the means to practise social distancing, these communities are at much higher risk of infection, which in turn threatens their cultural custodianship.




Read more:
Coronavirus: as culture moves online, regional organisations need help bridging the digital divide


The right to culture

It is interesting to think about how this crisis will reshape visitor experience in the future.

The NEMO survey reports that more than 60% of museums have increased their online presence since closing under social distancing measures, but only 13.4% have increased their budget for online activities. Comprehensive data on traffic to virtual museums and tours is yet to come, but early indications point to a significant increase.

As highlighted in the preamble of the 2003 UNESCO Declaration:

cultural heritage is an important component of cultural identity and of social cohesion, so that its intentional destruction may have adverse consequences on human dignity and human rights.

The human right of access to and enjoyment of cultural heritage is guaranteed by international law, and was emphasised by the Human Rights Council in its Resolution 33/20 (2016), which notes:

the destruction of or damage to cultural heritage may have a detrimental and irreversible impact on the enjoyment of cultural rights.

Article 27 of the Universal Declaration of Human Rights states that:

everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits.




Read more:
Protecting heritage is a human right


Future generations will need the means to understand how the coronavirus pandemic affected our world, just as we can now reflect on the Spanish flu or the Black Death.

Preserving a pandemic

Work is underway to preserve this legacy, with organisations such as Historic England collecting “lockdown moments in living memories” by sourcing photographs from the public for its archive. The Twitter account @Viral_Archive, run by a number of academic archaeologists, is working in the same vein with its #ViralShadows theme.

In the United States, the Smithsonian’s National Museum of American History has assembled a dedicated COVID-19 collection task force. They are already collecting objects including personal protective equipment such as N95 respirators and homemade cloth masks, empty boxes (to show scarcity), and patients’ illustrations.

The National Museum of Australia has invited Australians to share their “experiences, stories, reflections and images of the COVID-19 pandemic” so curators can enhance the “national conversation about an event which is already a defining moment in our nation’s history”. The State Library of New South Wales is collecting images of life in isolation to “help tell this story to future generations”.

Citizen science is a great way to engage the public, and although such work is labour-intensive, it can drive more online traffic and potentially help fill financial deficits by enticing visitors back to sites.

The closed Van Gogh Museum in Amsterdam, Netherlands on March 22.
Shutterstock

Priorities here

The timing of the COVID-19 pandemic – occurring in the immediate aftermath of severe drought, a catastrophic fire season and then floods, with inadequate intervening time for maintenance and conservation efforts – presents new challenges.

The federal government reports that in the 2018-19 financial year, Australia generated A$60.8 billion in direct tourism gross domestic product (GDP). This represents growth of 3.5% over the previous year – faster than national GDP growth. Tourism directly employed 666,000 Australians, making up 5% of the workforce. Museums and heritage sites are a significant pillar of tourism income and employment.

Even though the government assures us “heritage is all the things that make up Australia’s identity – our spirit and ingenuity, our historic buildings, and our unique, living landscapes”, its placement within the Department of Agriculture, Water and the Environment’s portfolio shows a lack of prioritisation of the sector.

Given the struggles we are already seeing in the arts and culture sector, which was recently moved to the portfolio of the Department of Infrastructure, Transport, Regional Development and Communications, the future of our heritage (and our past) is far from certain.

Anna M. Kotarba-Morley, Lecturer, Archaeology, Flinders University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Face masks: what the Spanish flu can teach us about making them compulsory



Red Cross nurses in San Francisco, 1918.
Wikimedia

Samuel Cohn, University of Glasgow

Should people be forced to wear face masks in public? That’s the question facing governments as more countries unwind their lockdowns. Over 30 countries have made masks compulsory in public, including Germany, Austria and Poland. This is despite the science suggesting that masks do little to protect wearers, and might only prevent them from infecting other people.

Nicola Sturgeon, the Scottish first minister, has nonetheless announced new guidelines advising Scots to wear masks for shopping or on public transport, while the UK government is expected to announce a new stance shortly. Meanwhile, US vice president Mike Pence has controversially refused to mask up.

This all has echoes of the great influenza pandemic, aka the Spanish flu, which killed some 50 million people in 1918-20. It’s a great case study in how people will put up with very tough restrictions, so long as they think they have merit.

The great shutdown

In the US, no disease in history led to such intrusive restrictions as the great influenza. These included closures of schools, churches, soda fountains, theatres, movie houses, department stores and barber shops, and regulations on how much space should be allocated to people in indoor public places.

There were fines against coughing, sneezing, spitting, kissing and even talking outdoors – those the Boston Globe called “big talkers”. Special influenza police were hired to round up children playing on street corners and occasionally even in their own backyards.

Restrictions were similarly tough in Canada, Australia and South Africa, though much less so in the UK and continental Europe. Where there were such restrictions, the public accepted it all with few objections. Unlike the long history of cholera, especially in Europe, or the plague in the Indian subcontinent from 1896 to around 1902, no mass violence erupted and blame was rare – even against Spaniards or minorities.

Face masks came closest to being the measure that people most objected to, even though masks were often popular at first. The Oklahoma City Times in October 1918 described an “army of young women war workers” appearing “on crowded street cars and at their desks with their faces muffled in gauze shields”. From the same month, The Ogden Standard reported that “masks are the vogue”, while the Washington Times told of how they were becoming “general” in Detroit.

Shifting science

There was scientific debate from the beginning about whether the masks were effective, but the game began to change after the French bacteriologist Charles Nicolle discovered in October 1918 that the influenza pathogen was much smaller than any known bacterium.

The news spread rapidly, even in small-town American newspapers. Cartoons were published that read, “like using barbed wire fences to shut out flies”. Yet this was just at the point that mortality rates were ramping up in the western states of the US and Canada. Despite Nicolle’s discovery, various authorities began making masks compulsory. San Francisco was the first major US city to do so in October 1918, continuing on and off over a three-month period.

Alberta in Canada did likewise, and New South Wales, Australia, followed suit when the disease arrived in January 1919 (the state based its decision on scientific evidence that predated Nicolle’s findings). The only American state to make masks mandatory was (briefly) California, while on the east coast and in other countries including the UK they were merely recommended for most people.

San Francisco gathering, 1918.
Wikimedia

Numerous photographs, like the one above, survive of large crowds wearing masks in the months after Nicolle’s discovery. But many had begun to distrust masks, and saw them as a violation of civil liberties. According to a November 1918 front page report from Utah’s Garland City Globe:

The average man wore the mask slung to the back of his neck until he came in sight of a policeman, and most people had holes cut into them to stick their cigars and cigarettes through.

Disobedience aplenty

San Francisco saw the creation of the Anti-Mask League, as well as protests and civil disobedience. People refused to wear masks in public or flouted the rules by wearing them improperly. Some went to prison for not wearing them or for refusing to pay fines.

In Tucson, Arizona, a banker insisted on going to jail instead of paying his fine for not masking up. In other western states, judges regularly refused to wear them in courtrooms. In Alberta, “scores” were fined in police courts for not wearing masks. In New South Wales, reports of violations flooded newspapers immediately after masks were made compulsory. Not even stretcher bearers carrying influenza victims followed the rules.

England was different. Masks were only advised as a precautionary measure in large cities, and then only for certain groups, such as influenza nurses in Manchester and Liverpool. Serious questions about efficacy only arose in March 1919, and only within the scientific community. Most British scientists now united against them, with the Lancet calling masks a “dubious remedy”.

These arguments were steadily being bolstered by statistics from the US. The head of California’s state board of health had presented late-1918 findings from San Francisco’s best-run hospital showing that 78% of nurses became infected despite their careful wearing of masks.

Physicians and health authorities also presented statistics comparing San Francisco’s mortality rates with nearby San Mateo, Los Angeles and Chicago, none of which had made masks compulsory. Those cities’ mortality rates were either “no worse” or lower. By the end of the pandemic in 1919, most scientists and health commissions had come to a consensus not unlike ours about the benefits of wearing masks.

Clearly, many of these details are relevant today. It’s telling that a frivolous requirement became such an issue while more severe rules banned things like talking on street corners, kissing your fiancé or attending religious services – even in the heart of America’s Bible belt.

Perhaps there’s something about masks and human impulses that has yet to be studied properly. If mass resistance to the mask should arise in the months to come, it will be interesting to see if new research will produce any useful findings on phobias about covering the face.

Samuel Cohn, Professor of History, University of Glasgow

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Coronavirus lessons from past crises: how WWI and WWII spurred scientific innovation in Australia




Tom Spurling, Swinburne University of Technology and Garrett Upstill, Swinburne University of Technology

In the wake of COVID-19, we’re seeing intense international competition for urgently needed supplies, including personal protective equipment and ventilators. In Australia, this could extend to other critical imports such as pharmaceuticals and medicines. And when our manufacturing sector can’t fill unexpected breaks in supply chains, we all face risk.

However, Australians have lived through crises of comparable magnitude before. During and after the two world wars, scientific innovation played a crucial role in reform. It led to the creation of the Council for Scientific and Industrial Research (CSIR) and an array of subsequent discoveries.

Some may assume life will go back to normal once COVID-19 withdraws. But if the past is to be learnt from, Australia should prepare for a greatly different future – hopefully one in which science and innovation once more take centre stage.




Read more:
How Australia played the world’s first music on a computer


The birth of the CSIR

It was WWI that heightened awareness of the role of science in defence and economic growth. In December 1915, Prime Minister William (Billy) Hughes announced he would set up a national laboratory “which would allow men of all branches of science to use their capabilities in application to industry”.

A CSIR council meeting in 1935, held at the McMaster Laboratory in Sydney.
CSIRO Archives, CC BY

This led to the formation of the CSIR in 1926, and its rebirth as the CSIRO in 1949. In the years after WWI, the CSIR contributed greatly to improvements in primary production, including through animal nutrition, disease prevention, and the control of weeds and pests in crops. It also advanced primary product processing and overseas product transport.

In 1937, the CSIR’s mandate was expanded to include secondary industry research, including a national Aircraft and Engine Testing and Research Laboratory. This was motivated by the government’s concern to increase Australia’s manufacturing capabilities and reduce its dependence on technology imports.

War efforts in the spotlight

The CSIR’s research focus shifted in 1941 with the attack on Pearl Harbor. Australian war historian Boris Schedvin has written about the hectic scramble to increase the nation’s defence capacities and expand essential production following the attack, including the expansion of the scientific workforce.

Minister John Dedman died in 1973.
Wikipedia (public domain)

The John Curtin government was commissioned in October 1941. Curtin appointed John Dedman as the Minister for War Organisation and Industry, as well as the minister in charge of the CSIR. Dedman’s department was concerned with producing military supplies and equipment, and other items to support society in wartime.

Dedman instructed the council to concentrate on “problems connected with the war effort”. The CSIR responded robustly. By 1942, the divisions of food preservation and transport, forest products, aeronautics, industrial chemistry, the national standards laboratory and the lubricants and bearings section were practically focused on war work full-time.

Scaling up production

The Division of Industrial Chemistry was the division most closely involved in actual production. It was formed in 1940 with Ian Wark, who had previously worked at the Electrolytic Zinc Company, as its chief.

Wark was familiar with the chemical industry, and quickly devoted resources to developing processes (using Australian materials) to produce essential chemicals to the pilot plant stage. They were soon producing chemicals for drugs at the Fishermans Bend site, including the starting material for the synthesis of the anaesthetic drug novocaine (procaine).

The researchers developed a method to separate the drug ergot, which is now essential in gynaecology, from rye. They also contributed directly to the war effort by manufacturing the plasticiser used in the nose caps of bullets and shells.

CSIRO today

In response to the current pandemic, CSIRO researchers at the Australian Centre for Disease Preparedness in Geelong, Victoria, are working with the international Coalition for Epidemic Preparedness Innovations to improve understanding of the SARS-CoV-2 virus. They are currently testing two vaccine candidates for efficacy, and evaluating the best way to administer a vaccine.

CSIRO’s directors Trevor Drew and Rob Grenfell share progress on COVID-19 vaccine testing being carried out at the Australian Centre for Disease Preparedness in Geelong.

Australian scientists have made monumental contributions on this front in the past. In the 1980s, CSIRO and its university collaborators began efforts that led to the creation of anti-flu drug Relenza, the first drug to successfully treat the flu. Relenza was then commercialised by Australian biotech company Biota, which licensed the drug to British pharmaceutical company GlaxoSmithKline.

The CSIRO also invented the Hendra virus vaccine for horses, launched in 2012.

Prior to that, Ian Frazer at the University of Queensland developed the human papillomavirus (HPV) vaccine which was launched in 2006.




Read more:
How we developed the Hendra virus vaccine for horses


What can we take away?

COVID-19 is one of many viral diseases that need either a vaccine or a drug (or both). Others are hepatitis B, dengue fever, HIV and the viruses that cause the common cold. Now may be Australia’s chance to use our world class medical research and medicinal chemistry capabilities to become a dominant world supplier of anti-viral medications.

As was the case during WWI and WWII, this pandemic drives home the need to retain our capabilities at a time of supply chain disruption. While it’s impossible for a medium-sized economy like Australia’s to be entirely self-sufficient, it’s important we lean on our strengths to not only respond, but thrive during these complicated times.

In 2020, Australia has a much greater and broader research and production capacity than it did in 1940. And as we march through this pandemic, we can learn from the past and forge new paths to enhance our position as pioneers in scientific innovation.

Tom Spurling, Professor of Innovation Studies, Swinburne University of Technology and Garrett Upstill, Visiting Fellow, Swinburne University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Diary of Samuel Pepys shows how life under the bubonic plague mirrored today’s pandemic



There were eerie similarities between Pepys’ time and our own.
Justin Sullivan/Getty Images

Ute Lotz-Heumann, University of Arizona

In early April, writer Jen Miller urged New York Times readers to start a coronavirus diary.

“Who knows,” she wrote, “maybe one day your diary will provide a valuable window into this period.”

During a different pandemic, one 17th-century British naval administrator named Samuel Pepys did just that. He fastidiously kept a diary from 1660 to 1669 – a period of time that included a severe outbreak of the bubonic plague in London. Epidemics have always haunted humans, but rarely do we get such a detailed glimpse into one person’s life during a crisis from so long ago.

There were no Zoom meetings, drive-through testing or ventilators in 17th-century London. But Pepys’ diary reveals that there were some striking resemblances in how people responded to the pandemic.

A creeping sense of crisis

For Pepys and the inhabitants of London, there was no way of knowing whether an outbreak of the plague that occurred in the parish of St. Giles, a poor area outside the city walls, in late 1664 and early 1665 would become an epidemic.

The plague first entered Pepys’ consciousness enough to warrant a diary entry on April 30, 1665: “Great fears of the Sickenesse here in the City,” he wrote, “it being said that two or three houses are already shut up. God preserve us all.”

Portrait of Samuel Pepys by John Hayls (1666).
National Portrait Gallery

Pepys continued to live his life normally until the beginning of June, when, for the first time, he saw houses “shut up” – the term his contemporaries used for quarantine – with his own eyes, “marked with a red cross upon the doors, and ‘Lord have mercy upon us’ writ there.” After this, Pepys became increasingly troubled by the outbreak.

He soon observed corpses being taken to their burial in the streets, and a number of his acquaintances died, including his own physician.

By mid-August, he had drawn up his will, writing, “that I shall be in much better state of soul, I hope, if it should please the Lord to call me away this sickly time.” Later that month, he wrote of deserted streets; the pedestrians he encountered were “walking like people that had taken leave of the world.”

Tracking mortality counts

In London, the Company of Parish Clerks printed “bills of mortality,” the weekly tallies of burials.

Because these lists noted London’s burials – not deaths – they undoubtedly undercounted the dead. Just as we follow these numbers closely today, Pepys documented the growing number of plague victims in his diary.

‘Bills of mortality’ were regularly posted.
Photo 12/Universal Images Group via Getty Images

At the end of August, he cited the bill of mortality as having recorded 6,102 victims of the plague, but feared “that the true number of the dead this week is near 10,000,” mostly because the victims among the urban poor weren’t counted. A week later, he noted the official number of 6,978 in one week, “a most dreadfull Number.”

By mid-September, all attempts to control the plague were failing. Quarantines were not being enforced, and people gathered in places like the Royal Exchange. Social distancing, in short, was not happening.

He was equally alarmed by people attending funerals in spite of official orders. Although plague victims were supposed to be interred at night, this system broke down as well, and Pepys griped that burials were taking place “in broad daylight.”

Desperate for remedies

There are few known effective treatment options for COVID-19. Medical and scientific research need time, but people hit hard by the virus are willing to try anything. Fraudulent treatments, from teas and colloidal silver, to cognac and cow urine, have been floated.

Although Pepys lived during the Scientific Revolution, nobody in the 17th century knew that the Yersinia pestis bacterium carried by fleas caused the plague. Instead, the era’s scientists theorized that the plague was spreading through miasma, or “bad air” created by rotting organic matter and identifiable by its foul smell. Some of the most popular measures to combat the plague involved purifying the air by smoking tobacco or by holding herbs and spices in front of one’s nose.

Tobacco was the first remedy that Pepys sought during the plague outbreak. In early June, seeing shut-up houses “put me into an ill conception of myself and my smell, so that I was forced to buy some roll-tobacco to smell … and chaw.” Later, in July, a noble patroness gave him “a bottle of plague-water” – a medicine made from various herbs. But he wasn’t sure whether any of this was effective. Having participated in a coffeehouse discussion about “the plague growing upon us in this town and remedies against it,” he could only conclude that “some saying one thing, some another.”

A 1666 engraving by John Dunstall depicts deaths and burials in London during the bubonic plague.
Museum of London

During the outbreak, Pepys was also very concerned with his frame of mind; he constantly mentioned that he was trying to be in good spirits. This was not only an attempt to “not let it get to him” – as we might say today – but also informed by the medical theory of the era, which claimed that an imbalance of the so-called humors in the body – blood, black bile, yellow bile and phlegm – led to disease.

Melancholy – which, according to doctors, resulted from an excess of black bile – could be dangerous to one’s health, so Pepys sought to suppress negative emotions; on Sept. 14, for example, he wrote that hearing about dead friends and acquaintances “doth put me into great apprehensions of melancholy. … But I put off the thoughts of sadness as much as I can.”

Balancing paranoia and risk

Humans are social animals and thrive on interaction, so it’s no surprise that so many have found social distancing during the coronavirus pandemic challenging. It can require constant risk assessment: How close is too close? How can we avoid infection and keep our loved ones safe, while also staying sane? What should we do when someone in our house develops a cough?

During the plague, this sort of paranoia also abounded. Pepys found that when he left London and entered other towns, the townspeople became visibly nervous about visitors.

“They are afeared of us that come to them,” he wrote in mid-July, “insomuch that I am troubled at it.”

Pepys succumbed to paranoia himself: In late July, his servant Will suddenly developed a headache. Fearing that his entire house would be shut up if a servant came down with the plague, Pepys mobilized all his other servants to get Will out of the house as quickly as possible. It turned out that Will didn’t have the plague, and he returned the next day.

In early September, Pepys refrained from wearing a wig he bought in an area of London that was a hotspot of the disease, and he wondered whether other people would also fear wearing wigs because they could potentially be made of the hair of plague victims.

And yet he was willing to risk his health to meet certain needs; by early October, he visited his mistress without any regard for the danger: “round about and next door on every side is the plague, but I did not value it but there did what I could con ella.”

Just as people around the world eagerly wait for a falling death toll as a sign of the pandemic letting up, so did Pepys derive hope – and perhaps the impetus to see his mistress – from the first decline in deaths in mid-September. A week later, he noted a substantial decline of more than 1,800.

Let’s hope that, like Pepys, we’ll soon see some light at the end of the tunnel.


Ute Lotz-Heumann, Heiko A. Oberman Professor of Late Medieval and Reformation History, University of Arizona

This article is republished from The Conversation under a Creative Commons license. Read the original article.


How the rich reacted to the bubonic plague has eerie similarities to today’s pandemic



Franz Xaver Winterhalter’s ‘The Decameron’ (1837).
Heritage Images via Getty Images

Kathryn McKinley, University of Maryland, Baltimore County

The coronavirus can infect anyone, but recent reporting has shown your socioeconomic status can play a big role, with a combination of job security, access to health care and mobility widening the gap in infection and mortality rates between rich and poor.

The wealthy work remotely and flee to resorts or pastoral second homes, while the urban poor are packed into small apartments and compelled to keep showing up to work.

As a medievalist, I’ve seen a version of this story before.

Following the 1348 Black Death in Italy, the Italian writer Giovanni Boccaccio wrote a collection of 100 novellas titled “The Decameron”. These stories, though fictional, give us a window into medieval life during the Black Death – and how some of the same fissures opened up between the rich and the poor. Cultural historians today see “The Decameron” as an invaluable source of information on everyday life in 14th-century Italy.

Giovanni Boccaccio.
Leemage via Getty Images

Boccaccio was born in 1313 as the illegitimate son of a Florentine banker. A product of the middle class, he wrote, in “The Decameron,” stories about merchants and servants. This was unusual for his time, as medieval literature tended to focus on the lives of the nobility.

“The Decameron” begins with a gripping, graphic description of the Black Death, which was so virulent that a person who contracted it would die within four to seven days. Between 1347 and 1351, it killed between 40% and 50% of Europe’s population. Some of Boccaccio’s own family members died.

In this opening section, Boccaccio describes the rich secluding themselves at home, where they enjoy quality wines and provisions, music and other entertainment. The very wealthiest – whom Boccaccio describes as “ruthless” – deserted their neighborhoods altogether, retreating to comfortable estates in the countryside, “as though the plague was meant to harry only those remaining within their city walls.”

Meanwhile, the middle class or poor, forced to stay at home, “caught the plague by the thousand right there in their own neighborhood, day after day” and swiftly passed away. Servants dutifully attended to the sick in wealthy households, often succumbing to the illness themselves. Many, unable to leave Florence and convinced of their imminent death, decided to simply drink and party away their final days in nihilistic revelries, while in rural areas, laborers died “like brute beasts rather than human beings; night and day, with never a doctor to attend them.”

Josse Lieferinxe’s ‘Saint Sebastian Interceding for the Plague Stricken’ (c. 1498).
Wikimedia Commons

After the bleak description of the plague, Boccaccio shifts to the 100 stories. They’re narrated by 10 nobles who have fled the pallor of death hanging over Florence to luxuriate in amply stocked country mansions. From there, they tell their tales.

One key issue in “The Decameron” is how wealth and advantage can impair people’s abilities to empathize with the hardships of others. Boccaccio begins the foreword with the proverb, “It is inherently human to show pity to those who are afflicted.” Yet in many of the tales he goes on to present characters who are sharply indifferent to the pain of others, blinded by their own drives and ambition.

In one fantasy story, a dead man returns from hell every Friday and ritually slaughters the same woman who had rejected him when he was alive. In another, a widow fends off a leering priest by tricking him into sleeping with her maid. In a third, the narrator praises a character for his undying loyalty to his friend when, in fact, he has profoundly betrayed that friend over many years.

Humans, Boccaccio seems to be saying, can think of themselves as upstanding and moral – but unawares, they may show indifference to others. We see this in the 10 storytellers themselves: They make a pact to live virtuously in their well-appointed retreats. Yet while they pamper themselves, they indulge in some stories that illustrate brutality, betrayal and exploitation.

Boccaccio wanted to challenge his readers, and make them think about their responsibilities to others. “The Decameron” raises the questions: How do the rich relate to the poor during times of widespread suffering? What is the value of a life?

In our own pandemic, with millions unemployed due to a virus that has killed thousands, these issues are strikingly relevant.

This is an updated version of an article originally published on April 16, 2020.


Kathryn McKinley, Professor of English, University of Maryland, Baltimore County

This article is republished from The Conversation under a Creative Commons license. Read the original article.


This isn’t the first global pandemic, and it won’t be the last. Here’s what we’ve learned from 4 others throughout history




David Griffin, The Peter Doherty Institute for Infection and Immunity and Justin Denholm, Melbourne Health

The course of human history has been shaped by infectious diseases, and the current crisis certainly won’t be the last.

However, we can capitalise on the knowledge gained from past experiences, and reflect on how we’re better off this time around.




Read more:
Four of the most lethal infectious diseases of our time and how we’re overcoming them


1. The Plague, or ‘Black Death’ (14th century)

While outbreaks of the plague (caused by the bacterium Yersinia pestis) still occur in several parts of the world, there are two that are particularly infamous.

The 200-year-long Plague of Justinian began in 541 CE, wiping out millions in several waves across Europe, North Africa and the Middle East and crimping the expansionary aspirations of the Roman Empire (although some scholars argue that its impact has been overstated).

Then there’s the better-known 14th-century pandemic, which likely emerged from China and decimated populations in Asia, Europe and Northern Africa.

Perhaps one of the greatest public health legacies to have emerged from the 14th century plague pandemic is the concept of “quarantine”, from the Venetian term “quarantena” meaning forty days.

The 14th century Black Death pandemic is thought to have catalysed enormous societal, economic, artistic and cultural reforms in Medieval Europe. It illustrates how infectious disease pandemics can be major turning points in history, with lasting impacts.

For example, widespread death caused labour shortages across feudal society, and often led to higher wages, cheaper land, better living conditions and increased freedoms for the lower class.

Various authorities lost credibility, since they were seen to have failed to protect communities from the overwhelming devastation of plague. People began to openly question long-held certainties around societal structure, traditions, and religious orthodoxy.

This prompted fundamental shifts in people’s interactions and experiences with religion, philosophy and politics. The Renaissance period, which encouraged humanism and learning, soon followed.

The Dance of Death, or Danse Macabre, was a common artistic trope at the time of the Black Death.
Public Domain/Wikimedia

The Black Death also had profound effects on art and literature, which took on more pessimistic and morbid themes. There were vivid depictions of violence and death in Biblical narratives, still seen in many Christian places of worship across Europe.

How COVID-19 will reshape our culture, and what unexpected influence it will have for generations to come is unknown. There are already clear economic changes arising from this outbreak, as some industries rise, others fall and some businesses seem likely to disappear forever.

COVID-19 may permanently normalise the use of virtual technologies for socialising, business, education, healthcare, religious worship and even government.

2. Spanish influenza (1918)

The 1918 “Spanish Flu” pandemic’s reputation as one of the deadliest in human history is due to a complex interplay between how the virus works, the immune response and the social context in which it spread.

It arose in a world left vulnerable by the preceding four years of World War I. Malnutrition and overcrowding were common.

Around 500 million people were infected – a third of the global population at the time – leading to 50-100 million deaths.

A unique characteristic of infection was its tendency to kill healthy adults between the ages of 20 and 40.

At the time, influenza infection was attributed to a bacterium (Haemophilus influenzae) rather than a virus. Antibiotics for secondary bacterial infections were still more than a decade away, and intensive care wards with mechanical ventilators were unheard of.

Clearly, our medical and scientific understanding of the ‘flu in 1918 made it difficult to combat. However, public health interventions, including quarantine, the use of face masks and bans on mass gatherings helped limit the spread in some areas, building on prior successes in controlling tuberculosis, cholera and other infectious diseases.

Australia imposed maritime quarantine, requiring all arriving ships to be cleared by Commonwealth quarantine officials before disembarkation. This likely delayed and reduced the Spanish flu’s impact on Australia, and had secondary effects on other Pacific islands.

The effect of maritime quarantine was most striking in Western and American Samoa, with the latter enforcing strict quarantine and experiencing no deaths. By contrast, 25% of Western Samoans died, after influenza was introduced by a ship from New Zealand.

In some cities, mass gatherings were banned, and schools, churches, theatres, dance and pool halls closed.

In the United States, cities that committed earlier, longer and more aggressively to social distancing interventions, not only saved lives, but also emerged economically stronger than those that didn’t.

Face masks and hand hygiene were popularised and sometimes enforced in cities.

In San Francisco, a Red Cross-led public education campaign was combined with mandatory mask-wearing outside the home.

This was tightly enforced in some jurisdictions by police officers issuing fines, and at times using weapons.



3. HIV/AIDS (20th century)

The first reported cases of HIV/AIDS in the Western world emerged in 1981.

Since then, around 75 million people have become infected with HIV, and about 32 million people have died.

Many readers may remember how baffling and frightening the HIV/AIDS pandemic was in its early days (and still is in many parts of the developing world).

We now understand that people living with HIV infection who are on treatment are far less likely to develop serious complications.

These treatments, known as antiretrovirals, stop HIV from replicating. This can lead to an “undetectable viral load” in a person’s blood. Evidence shows that people with an undetectable viral load can’t pass the virus on to others during sex.

Condoms and PrEP (short for “pre-exposure prophylaxis”, where people take an oral antiretroviral pill once a day) can be used by people who don’t have HIV to reduce the risk of acquiring the virus.

Unfortunately, there are currently no proven antivirals available for the prevention or treatment of COVID-19, though research is ongoing.

The HIV pandemic taught us about the value of a well-designed public health campaign, and the importance of contact tracing. Broad testing in appropriate people is fundamental to this, to understand the extent of infection in the community and allow appropriately targeted individual and population-level interventions.

It also demonstrated that words and stigma matter; people need to feel they can test safely and be supported, rather than ostracised. Stigmatising language can fuel misconceptions, discrimination and discourage testing.

4. Severe Acute Respiratory Syndrome (SARS) (2002-2003)

The current pandemic is the third coronavirus outbreak in the past two decades.

The first was in 2002, when SARS emerged from horseshoe bats in China and spread to at least 29 countries around the world, causing 8,098 cases and 774 deaths.

SARS was finally contained in July 2003. SARS-CoV-2, however, appears to spread much more easily than the original SARS coronavirus.

To some extent, SARS was a practice run for COVID-19. Researchers focused on SARS and MERS (Middle East Respiratory Syndrome, another coronavirus that remains a problem in some regions) are providing important foundational research for potential vaccines against SARS-CoV-2.

Knowledge gleaned from SARS may also lead to antiviral drugs to treat the current virus.

SARS also emphasised the importance of communication in a pandemic, and the need for frank, honest and timely information sharing.

Certainly, SARS was a catalyst for change in China; the government invested in enhanced surveillance systems that facilitate the real-time collection and communication of data on infectious diseases and syndromes from emergency departments back to a centralised government database.

This was coupled with the International Health Regulations, which require the reporting of unusual and unexpected outbreaks of disease.

Advances in science and information technology, together with knowledge gained from SARS, allowed us to quickly isolate, sequence and share SARS-CoV-2 data globally. Likewise, important clinical information was distributed early to the medical community.

SARS demonstrated how quickly and comprehensively a virus could spread around the world in the era of air transportation, and the role of individual “superspreaders”.

SARS also underlined the inextricable link between human, animal and environmental health – known as “One Health” – and how close contact between species may facilitate the crossover of germs.

Finally, a crucial, but perhaps overlooked lesson from SARS is the need for sustained investment in vaccine and infectious disease treatment research.




Read more:
Coronavirus is a wake-up call: our war with the environment is leading to pandemics


Few infectious disease researchers were surprised when another coronavirus pandemic broke out. A globalised world, with overcrowded, well-connected people and cities, where humans and animals live in close proximity, provides fertile conditions for infectious diseases.

We must be ever prepared for the emergence of another pandemic, and learn the lessons of history to navigate the next threat.

David Griffin, Infectious Diseases Fellow, The Peter Doherty Institute for Infection and Immunity and Justin Denholm, Associate Professor, Melbourne Health

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Florence Nightingale: a pioneer of hand washing and hygiene for health



Helping the wounded.
Shutterstock/Everett Historical

Richard Bates, University of Nottingham

Florence Nightingale, who was born 200 years ago, is rightly famed for revolutionising nursing. Her approach to caring for wounded soldiers and training nurses in the 19th century saved and improved countless lives. And her ideas on how to stay healthy still resonate today – as politicians give official guidance on how best to battle coronavirus.

For example, although Nightingale did not fully subscribe to the idea that many diseases are caused by specific micro-organisms known as germs until the 1880s, when she was in her sixties, she was well aware of the importance of hand washing. In her book Notes on Nursing (1860), she wrote that:

Every nurse ought to be careful to wash her hands very frequently during the day. If her face, too, so much the better.

During the Crimean War (1853-1856), Nightingale had implemented hand washing and other hygiene practices in British army hospitals. This was relatively new advice, first publicised in the 1840s by the Hungarian doctor Ignaz Semmelweis, who had observed the dramatic difference it made to death rates on maternity wards.

Nightingale’s attention to international medical research and developments was just one factor behind her ability to make effective interventions in public health. Like many public health experts of her age, Nightingale considered the home to be a crucial site for disease-preventing interventions. This was the place where most people contracted and suffered from infectious diseases. (The same is true today: in Wuhan’s coronavirus outbreak, around 75-80% of transmissions were reportedly in family clusters).

Nightingale’s book, Notes on Nursing (1860), was more of a public health instruction book than a nursing manual. It advised ordinary people how to maintain healthy homes – particularly women, in accordance with the worldview of the times. There was straightforward advice on everything from how to avoid excessive smoke from fireplaces (don’t let the fire get too low, and don’t overwhelm it with coal) to the safest material with which to cover walls (oil paints, not wallpaper).

Nightingale strongly counselled that people open windows to maximise light and ventilation and displace “stagnant, musty and corrupt” air. And she advocated improving drainage to combat water-borne diseases like cholera and typhoid.

In her view, all domestic interiors must be kept clean. Dirty carpets and unclean furniture, she wrote with characteristic bluntness, “pollute the air just as much as if there were a dung heap in the basement”.

Notes on Nursing also called upon the “mistress” of every building to clean “every hole and corner” of her home regularly, for the sake of her family’s health. But Nightingale also recommended a more holistic approach to health. She encouraged soldiers to read, write and socialise during their convalescence so they would not sink into boredom and alcoholism.

Good data

During her youth, Nightingale’s father had introduced her to a leading practitioner of statistics, then a brand new academic field, and paid for her to have a mathematics tutor. During and after the Crimean War, Nightingale seized on statistics as a way of proving the effectiveness of different interventions.

She went on to produce her famous diagrams, which demonstrated the high proportion of soldiers’ deaths caused by disease as opposed to battle wounds, and became the first woman admitted to the London Statistical Society in 1858.

Thereafter she designed questionnaires to obtain data on such questions as the sanitary condition of army stations in India, or the mortality rates of aboriginal populations in Australia. Her guiding principle was that a health problem could only be effectively tackled once its dimensions were reliably established.

In 1857, around a year after returning from the Crimean War, Nightingale suffered a severe collapse, now believed to have been caused by a flu-like infection called brucellosis. For much of her subsequent life, she was racked with chronic pain, often unable to walk or leave her bed.

Working from home

Having been declared an invalid, she imposed a rule of seclusion on herself because of pain and tiredness rather than from fears of contagion – a form of self-isolation that extended to her closest family (though she still had servants and other visitors).

During her first years of working entirely from home, Nightingale’s productivity was extraordinary. As well as writing Notes on Nursing, she produced an influential 900-page report on the medical failings during the Crimean War, and a book on hospital design.

This was in addition to setting up the Nightingale Training School for nurses at St Thomas’ hospital in London in 1860, and a midwifery training programme at King’s College Hospital in 1861, plus advising on the design of a number of new hospitals.

Later in the 1860s, Nightingale proposed a reform of workhouse infirmaries to make them high-quality taxpayer-funded hospitals, and also worked on sanitary and social reforms in India. All of this she accomplished without leaving her house (though government ministers sometimes came to her home for meetings).

Having said this, it is worth remembering that Nightingale’s was a privileged form of self-isolation. Her father’s fortune, derived from Derbyshire mining interests, meant she had no money worries.

She lived in a nice house in London with various assistants and servants to help, shop and cook for her, and had no children to look after. Her entire waking time could be devoted to reading and writing. So while this is an appropriate time to recall and celebrate the huge contribution Nightingale made to modern nursing and public health care, we shouldn’t feel too bad if we don’t quite live up to her high standards of isolated productivity.

Richard Bates, Postdoctoral Research Fellow, Department of History, University of Nottingham

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Lessons from the Great Depression: how to prevent evictions in an economic crisis



Eviction in Redfern, NSW, in 1934.
State Library of New South Wales

Vanessa Whittington, Western Sydney University

The queues of unemployed people outside Centrelink offices in recent days are reminiscent of the dole queues seen across Australia during the Great Depression of the 1930s.

At that time, most states provided people with inadequate food vouchers rather than cash income support payments. This made it particularly difficult for renters – many of whom were unemployed due to the mass closure of factories – to continue paying rent.

In NSW, lower-income areas of Sydney were particularly badly hit by unemployment, and because the working class was a renting class, this quickly translated into homelessness.




Read more:
As coronavirus hits holiday lettings, a shift to longer rentals could help many of us


For example, male unemployment reached 38.9% in the then-working class suburb of Newtown by 1933, well above the NSW average of 32% and three times the rate in the affluent suburb of Vaucluse.

Tent cities sprang up in Sydney’s Domain and on the outskirts of the city in suburbs like La Perouse – including the ironically named Happy Valley. Although the figures likely underestimate the number of homeless at the time, the 1933 census reported

33,000 people [were] travelling in the hope of work and 400,000 [were]
living in shelters made of ‘iron, calico, canvas, bark, hessian and other scavenged materials’.

Residents in Happy Valley in the 1930s.
State Library of New South Wales

COVID-19 and assistance for renters

There are distinct parallels between the severe economic downturn of the 1930s and the economic repercussions of the COVID-19 crisis in terms of mass business closures and worker layoffs.

The Australian government has estimated that one million Australians could become unemployed as a result of the coronavirus. However, it is not clear whether this counts only those directly affected by business closures or also includes people hit by the flow-on effects.

Taking into account the current unemployment rate, an additional one million unemployed would, by my own estimate, bring the rate to around 13% of the Australian workforce.
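As a rough back-of-envelope sketch (using figures not given in the article: a labour force of about 13.7 million and a pre-crisis unemployment rate of about 5.2%, approximately the early-2020 ABS figures):

0.052 × 13.7 million ≈ 0.7 million already unemployed; (0.7 million + 1 million) ÷ 13.7 million ≈ 12.5%, or roughly 13%.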

Although the increase in Centrelink payments announced by the Morrison government will help those workers suddenly without jobs, additional measures are needed to protect people who can’t pay their rent and face possible eviction.




Read more:
Why housing evictions must be suspended to defend us against coronavirus


The National Cabinet is working on a range of strategies to assist renters, including preventing landlords from evicting tenants directly impacted by the coronavirus and offering tax relief to landlords who reduce or waive rents.

But these need to be supplemented by strong legislative measures, such as the amendment passed by the NSW parliament this week that empowers the housing minister to ban evictions of renters for six months.

Emergency laws to protect renters are also currently being debated in Tasmania.

Queues of people formed outside Centrelink offices nationwide this week.
JOEL CARRETT/AAP

Staving off homelessness in the Great Depression

There is precedent for legislative reform of this kind from the Great Depression.

In response to the mass job losses in NSW, the government of the day, led by Premier Jack Lang, passed two pieces of legislation aimed at providing relief for renters. The legislation was highly significant: it was the first of its kind to afford tenants across NSW any serious protection.

One of the bills, passed as the Reduction of Rent Act 1931, reduced rents state-wide by 22.5% and made illegal any lease that did not acknowledge the reduction.




Read more:
Coronavirus puts casual workers at risk of homelessness unless they get more support


The other significant piece of tenancy reform was the Ejectments Postponement Bill 1931. This bill prohibited eviction from a dwelling house without a court order. If the tenant could show the court the rent could not be paid, the tenancy could be extended indefinitely.

In his second reading speech, William McKell, minister for justice in the Lang government, described the bill as “a bona fide effort to provide against hardship due to unemployment”.

As honourable members are aware, there is a large amount of unemployment, and there are many very deserving and reputable people who, unfortunately, are not able to pay their rent. It is a tragedy that people of that type, with their families, are being evicted from their homes, and the Government is desirous of preventing as far as possible evictions of that character.

Though the government was committed to helping renters, McKell clearly distinguished in his speech between the deserving and undeserving unemployed – an unhelpful way of thinking that is still with us today.

Although it is not known how many evictions the 1931 reforms prevented, the news coverage of the time suggests the new laws were a boon for renters: landlords and their representatives complained about the impact the laws had on their ability to evict tenants.

In fact, the Real Estate Institute noted the financial hardship the Ejectments Postponement Act was placing on landlords.

Hundreds of cases have been reported to the Real Estate Institute, where the owners of houses, dependent on rents for their livelihood, have been refused possession, and have also been refused relief under the dole system, on the grounds that they are property owners.

Unfortunately for renters, these reforms were relatively short-lived. The Lang government was sacked by the NSW governor in May 1932 and replaced in the next election by the more conservative United Australia Party and Country Party coalition government.

This change in government saw the passage of the Landlord and Tenant (Amendment) Act 1932, which repealed the Ejectments Postponement Act 1931. The rent reduction law was also made more favourable to landlords.

The interests of landlords were prioritised over those of unemployed renters. That is a salutary lesson for present-day governments: don’t let ideology and vested interests stand in the way of reforms that would benefit a significant portion of the population during a crisis not of their making.

Vanessa Whittington, PhD Candidate, Institute for Culture and Society, Western Sydney University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The fashionable history of social distancing



Crinolines, by design, made physical contact nearly impossible.
Hulton Archive/Stringer via Getty Images

Einav Rabinovitch-Fox, Case Western Reserve University

As the world grapples with the coronavirus outbreak, “social distancing” has become a buzzword of these strange times.

Instead of stockpiling food or rushing to the hospital, authorities are saying social distancing – deliberately increasing the physical space between people – is the best way ordinary people can help “flatten the curve” and stem the spread of the virus.

Fashion might not be the first thing that comes to mind when we think of isolation strategies. But as a historian who writes about the political and cultural meanings of clothing, I know that fashion can play an important role in the project of social distancing, whether the space created helps solve a health crisis or keep away pesky suitors.

Clothing has long served as a useful way to mitigate close contact and unnecessary exposure. In this current crisis, face masks have become a fashion accessory that signals, “stay away.”

A copper engraving of a plague doctor in 17th-century Rome.
Wikimedia Commons

Fashion also proved handy during past epidemics such as the bubonic plague, when doctors wore pointed, bird-like masks to keep their distance from sick patients. Some lepers were forced to wear a heart on their clothes and carry bells or clappers to warn others of their presence.

However, more often than not, it doesn’t take a worldwide pandemic for people to want to keep others at arm’s length.

In the past, maintaining distance – especially between genders, classes and races – was an important aspect of social gatherings and public life. Social distancing didn’t have anything to do with isolation or health; it was about etiquette and class. And fashion was the perfect tool.

Take the Victorian-era “crinoline.” This large, voluminous skirt, which became fashionable in the mid-19th century, was used to create a barrier between the genders in social settings.

While the origins of this trend can be traced to the 15th-century Spanish court, these voluminous skirts became a marker of class in the 18th century. Only those privileged enough to avoid household chores could wear them; you needed a house with enough space to be able to comfortably move from room to room, along with a servant to help you put it on. The bigger your skirt, the higher your status.

A satirical comic pokes fun at the ballooning crinolines of the mid-19th century.
Wikimedia Commons

In the 1850s and 1860s, more middle-class women began wearing the crinoline as caged hoop skirts started to be mass-produced. Soon, “Crinolinemania” swept the fashion world.

Despite critiques from dress reformers, who saw it as another tool to restrict women’s mobility and freedom, the large hoop skirt was a sophisticated way of maintaining women’s social safety. The crinoline ensured that a potential suitor – or, worse yet, a stranger – kept a safe distance from a woman’s body and cleavage.

Although these skirts probably, if inadvertently, helped mitigate the dangers of the era’s smallpox and cholera outbreaks, crinolines could be a health hazard: Many women burned to death after their skirts caught fire. By the 1870s, the crinoline gave way to the bustle, which emphasized the skirt’s fullness only at the posterior.

Women nonetheless continued to use fashion as a weapon against unwanted male attention. As skirts got narrower in the 1890s and early 1900s, large hats – and, more importantly, hat pins, which were sharp metal needles used to fasten the hats – offered women the protection from harassers that crinolines once gave.

As for keeping healthy, germ theory and a better understanding of hygiene led to the popularization of face masks – very similar to the ones we use today – during the Spanish flu. And while women still needed to keep their distance from pesky suitors, hats were now used more to hold masks in place than to push strangers away.

Today, it isn’t clear whether the coronavirus will lead to new styles and accessories. Perhaps we’ll see the rise of novel forms of protective outerwear, like the “wearable shield” that one Chinese company developed.

But for now, it seems most likely that we’ll all just continue wearing pajamas.


Einav Rabinovitch-Fox, Visiting Assistant Professor, Case Western Reserve University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

