
Lockdowns, second waves and burn outs. Spanish flu’s clues about how coronavirus might play out in Australia



National Museum of Australia

Jeff Kildea, UNSW

In a remarkable coincidence, the first media reports about Spanish flu and COVID-19 in Australia both occurred on January 25 – exactly 101 years apart.

This is not the only similarity between the two pandemics.

Although history does not repeat, it rhymes. The story of how Australia – and in particular the NSW government – handled Spanish flu in 1919 provides some clues about how COVID-19 might play out here in 2020.


Spanish flu arrives

Australia’s first case of Spanish flu was likely admitted to hospital in Melbourne on January 9, 1919, though it was not diagnosed as such at the time. Ten days later, there were 50 to 100 cases.

Commonwealth and Victorian health authorities initially believed the outbreak was a local variety of influenza prevalent in late 1918.

Consequently, Victoria delayed until January 28 before notifying the Commonwealth, as required by a 1918 federal-state agreement designed to coordinate state responses.




Read more:
Fleas to flu to coronavirus: how ‘death ships’ spread disease through the ages


Meanwhile, travellers from Melbourne had carried the disease to NSW. On January 25, Sydney’s newspapers reported that a returned soldier from Melbourne was in hospital at Randwick with suspected pneumonic influenza.

Shutdown circa 1919: libraries, theatres, churches close

The NSW government quickly imposed restrictions on the population when Spanish flu first arrived.
National Library of Australia

Acting quickly, in late January, the NSW government ordered “everyone shall wear a mask,” while all libraries, schools, churches, theatres, public halls, and places of indoor public entertainment in metropolitan Sydney were told to close.

It also imposed restrictions on travel from Victoria in breach of the federal-state agreement.

Thereafter, each state went its own way and the Commonwealth, with few powers and little money compared with today, effectively left them to it.

Generally, the restrictions were received with little demur. But inconsistencies led to complaints, especially from churches and the owners of theatres and racecourses.

People were allowed to ride in crowded public transport to thronged beaches. But masked churchgoers, observing physical distancing, were forbidden to assemble outside for worship.

Later, crowds of spectators would be permitted to watch football matches while racecourses were closed.

Spanish flu subsides

Nevertheless, NSW’s prompt and thorough application of restrictions initially proved successful.

During February, Sydney’s hospital admissions were only 139, while total deaths across the state were 15. By contrast, Victoria, which had waited three weeks before introducing more limited restrictions, recorded 489 deaths.

At the end of February, NSW lifted most restrictions.

Even so, the state government did not escape a political attack. The Labor opposition accused it of overreacting and imposing unnecessary economic and social burdens on people. It was particularly critical that the order requiring mask-wearing was not limited to confined spaces, such as public transport.

There was also debate about the usefulness of closing schools, especially in the metropolitan area.

But then it returns

In mid-March, new cases began to rise. Chastened by the criticism of its earlier measures, the government delayed reimposing restrictions until early April, allowing the virus to take hold.

This led The Catholic Press to declare

the Ministry fiddled for popularity while the country was threatened with this terrible pestilence.

Sydney’s hospital capacity was exceeded and the state’s death toll for April totalled 1,395. Then the numbers began falling again. After ten weeks the epidemic seemed to have run its course, but as May turned to June, new cases appeared.

The resurgence came with a virulence surpassing the worst days of April. This time, notwithstanding a mounting death toll, the NSW cabinet decided against reinstating restrictions, but urged people to impose their own restraints.

The government goes for “burn out”

After two unsuccessful attempts to defeat the epidemic – at great social and economic cost – the government decided to let it take its course.

It hoped the public by now realised the gravity of the danger, and that it would be sufficient to warn them to avoid the chances of infection. The Sydney Morning Herald concurred, declaring

there is a stage at which governmental responsibility for the public health ends.

The second wave’s peak arrived in the first week of July, with 850 deaths across NSW and 2,400 for the month. Sydney’s hospital capacity again was exceeded. Then, as in April, the numbers began to decline. In August the epidemic was officially declared over.

Cases continued intermittently for months, but by October, admissions and deaths were in single figures. Like its predecessor, the second wave lasted ten weeks. But this time the epidemic did not return.




Read more:
How Australia’s response to the Spanish flu of 1919 sounds warnings on dealing with coronavirus


More than 12,000 Australians had died.

While Victoria had suffered badly early on compared to NSW, in the end, NSW had more deaths than Victoria – about 6,000 compared to 3,500. The NSW government’s decision not to restore restrictions saw the epidemic “burn out”, but at a terrible cost in lives.

That decision did not cause a ripple of objection. At the NSW state elections in March 1920, Spanish flu was not even a campaign issue.

The lessons of 1919

In many ways we have learned the lessons of 1919.

We have better federal-state coordination, sophisticated testing and contact tracing, staged lifting of restrictions and improved knowledge of virology.

Australia’s response to coronavirus has seen sophisticated testing and contact tracing.
Dean Lewis/AAP

But in other ways we have not learned the lessons.

Despite our increased medical knowledge, we are struggling to find a vaccine and effective treatments. And we are debating the same issues – to mask or not, to close schools or not.

Meanwhile, inconsistencies and mixed messaging undermine confidence that restrictions are necessary.

Yet, we are still to face the most difficult question of all.

The Spanish flu demonstrated that a suppression strategy requires rounds of restrictions and relaxations. And that these involve significant social and economic costs.

With the federal and state governments’ current suppression strategies we are already seeing signs of social and economic stress, and this is just round one.

Would Australians today tolerate a “burn out”?

The Spanish flu experience also showed that a “burn out” strategy is costly in lives – nowadays it would be measured in tens of thousands. Would Australians today abide such an outcome as people did in 1919?

It is not as if Australians back then were more trusting of their political leaders than we are today. In fact, in the wake of the wartime split in the Labor Party and shifting political allegiances, respect for political leaders was at a low ebb in Australia.

Australians today may not tolerate the large numbers of deaths we saw in 1919.
James Gourley/AAP

A more likely explanation is that people then were prepared to tolerate a death toll that Australians today would find unacceptable. People in 1919 were much more familiar with death from infectious diseases.

Also, they had just emerged from a world war in which 60,000 Australians had died. These days the death of a single soldier in combat prompts national mourning.

Yet, in the absence of an effective vaccine, governments may end up facing a “Sophie’s Choice”: is the community willing and able to sustain repeated and costly disruptions in order to defeat this epidemic or, as the NSW cabinet decided in 1919, is it better to let it run its course notwithstanding the cost in lives?




Read more:
Coronavirus is a ‘sliding doors’ moment. What we do now could change Earth’s trajectory




Jeff Kildea, Adjunct Professor Irish Studies, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Before epidemiologists began modelling disease, it was the job of astrologers



Women, representing nature, argue the influence of the zodiac with scholars in this undated 17th century engraving.
Wellcome Collection, CC BY

Michelle Pfeffer, The University of Queensland

The internet is awash with comparisons between life during COVID-19 and life during the Bubonic plague. The two have many similarities, from the spread of misinformation and the tracking of mortality figures, to the ubiquity of the question “when will it end?”

But there are, of course, crucial differences between the two. Today, when looking for information on the incidence, distribution, and likely outcome of the pandemic, we turn to epidemiologists and infectious disease models. During the Bubonic plague, people turned to astrologers.

Exploring the role played by astrologers in past epidemics reminds us that although astrology has been debunked, it was integral to the development of medicine and public health.

The flu, written in the stars

Before germ theory, the Scientific Revolution and then the Age of Enlightenment, it was common for medical practitioners to use astrological techniques in their everyday practice.

Hans Holbein’s Danse Macabre woodcut (1523-25).
Wikimedia

Compared to the simplistic horoscopes in today’s magazines, premodern astrology was a complex field based on detailed astronomical calculations. Astrologers were respected health authorities who were taught at the finest universities throughout Europe, and hired to treat princes and dukes.

Astrology provided physicians with a naturalistic explanation for the onset and course of disease. They believed the movements of the celestial bodies, in relation to each other and the signs of the Zodiac, governed events on earth. Horoscopes mapped the heavens, allowing physicians to draw conclusions about the onset, severity, and duration of illness.

The impact of astrology on the history of medicine can still be seen today. The term “influenza” was derived from the idea that respiratory disease was a product of the influence of the stars.




Read more:
Altered mind this morning? Hehe, just blame the planets


Public health and plague

Astrologers were seen as important authorities for the health of communities as well as individuals. They offered public health advice in annual almanacs, which were some of the most widely read literature in the premodern world.

Almanacs provided readers with tables for astrological events for the coming year, as well as advice on farming, political events, and the weather.

The publications were also important disseminators of medical knowledge. They explained basic medical principles and suggested remedies. They made prognostications about national health, using astrology to predict when an influx of venereal disease or plague was likely to arise.

These public health predictions were often based on the astrological theory of conjunctions. According to this theory, when certain planets seem to approach each other in the sky from our perspective on earth, great socio-cultural events are bound to occur.

When Bubonic plague hit France in 1348, the King asked the physicians at the University of Paris to account for its origins. Their answer was that the plague was caused by a conjunction of Saturn, Mars, and Jupiter.




Read more:
How medieval writers struggled to make sense of the Black Death


Predictions from above

Astrological accounts of plague remained popular into the 17th century. In this period, astrology was increasingly attacked as superstitious, so some astrologers tried to set their field on a more scientific grounding.

In an effort to make astrology more scientific, the English astrologer John Gadbury produced one of the earliest epidemiological studies of disease.

In London’s Deliverance Predicted (1655), Gadbury claimed his contemporaries couldn’t explain when plagues would arrive, or how long they’d last.

Gadbury proposed that if planets caused plagues, then planets also stopped plagues. Studying astrological events would therefore allow one to predict the course of an epidemic.

He gathered data from the previous four great London plagues (in 1593, 1603, 1625, and 1636), scouring the Bills of Mortality for weekly plague death rates, and compiling A Table shewing the Increase and Abatement of the Plague. Gadbury also used planetary tables to locate the planets’ positions throughout the epidemics. He then compared his data sets, looking for correlations.

Gadbury found a correlation between intensity of plague and the positions of Mars and Venus. Plague deaths increased sharply in July 1593, at which point Mars had moved into an astrologically significant position. Deaths then abated in September, when Venus’s position became more significant. Gadbury concluded that the movement of “the fiery Planet Mars” was the origin of pestilence and the “cause of its raging”, while the influence of the “friendly” Venus helped abate it.
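Stripped of the astrology, Gadbury’s procedure is recognisably a correlation study: line up a weekly mortality series against an explanatory series and measure how closely they move together. As a rough modern illustration only, the short Python sketch below computes a Pearson correlation between two invented series – the weekly death counts and the “Mars prominence” scores are hypothetical stand-ins, not figures from the Bills of Mortality or Gadbury’s planetary tables.

```python
# A minimal modern sketch of a Gadbury-style correlation hunt.
# All numbers below are invented for illustration; they are not
# drawn from the Bills of Mortality or Gadbury's planetary tables.

from statistics import correlation  # requires Python 3.10+

# Hypothetical weekly plague burials over a summer epidemic
plague_deaths = [23, 41, 90, 210, 480, 620, 540, 300, 150, 60]

# Hypothetical 0-1 scores for how "astrologically significant"
# Mars's position was judged to be in each of those weeks
mars_prominence = [0.1, 0.2, 0.4, 0.6, 0.9, 1.0, 0.8, 0.5, 0.3, 0.1]

# Pearson's r between the two weekly series
r = correlation(plague_deaths, mars_prominence)
print(f"Pearson correlation: {r:.2f}")  # close to 1.0 for these data
```

A strong positive r here would have “confirmed” the hypothesis, yet both invented series simply rise and fall with the season – exactly the kind of lurking variable, noted below, that Gadbury failed to account for.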

Gadbury then applied his findings to the pestilence plaguing London at the time. He was able to correlate the beginnings of the plague in late 1664 and its growing intensity in June 1665 with recent astrological events.

He predicted the upcoming movement of Venus in August would see a fall in plague deaths. Then the movement of Mars in September would make the plague deadlier, but the movements of Venus in October, November, and December would halt the death rate.

The Black Death in London, circa 1665. Creator unknown.

Looking for patterns

Unfortunately for Gadbury, plague deaths increased dramatically in August. However, he was right in predicting a peak in September followed by a steep decrease at the end of the year. If Gadbury had accounted for other correlates – such as the coming of winter – his study might have been received more favourably.

The medical advice in Gadbury’s book certainly doesn’t stand up today. He argued the plague was not contagious, and that isolating at home only caused more deaths. Yet his attempt to find correlations with fluctuating mortality rates offers an early example of what we now call epidemiology.

While we may discredit Gadbury’s astrological assumptions, examples such as this illustrate the important role astrology played in the history of medicine, paving the way for naturalistic explanations of infectious disease.

Michelle Pfeffer, Postdoctoral Research Fellow in History, The University of Queensland

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Stay alert, infodemic, Black Death: the fascinating origins of pandemic terms



Shutterstock

Simon Horobin, University of Oxford

Language always tells a story. As COVID-19 shakes the world, many of the words we’re using to describe it originated during earlier calamities – and have colourful tales behind them.

In the Middle Ages, for example, fast-spreading infectious diseases were known as plagues – as in the Bubonic plague, named for the characteristic swellings (or buboes) that appear in the groin or armpit. With its origins in the Latin word plaga meaning “stroke” or “wound”, plague came to refer to a wider scourge through its use to describe the ten plagues suffered by the Egyptians in the biblical book of Exodus.




Read more:
Lost in translation: five common English phrases you may be using incorrectly


An alternative term, pestilence, derives from Latin pestis (“plague”), which is also the origin of French peste, the title of the 1947 novel by Albert Camus (La Peste, or The Plague) which has soared up the bestseller charts in recent weeks. Latin pestis also gives us pest, now used to describe animals that destroy crops, or any general nuisance or irritant. Indeed, the bacterium that causes Bubonic plague is called Yersinia pestis.

The bacterium Yersinia pestis, which causes Bubonic plague.
Shutterstock

The Bubonic plague outbreak of the 14th century was also known as the Great Mortality or the Great Death. The Black Death, which is now most widely used to describe that catastrophe, is, in fact, a 17th-century translation of a Danish name for the disease: “Den Sorte Død”.

Snake venom, the original ‘virus’

The later plagues of the 17th century led to the coining of the word epidemic. This came from a Greek word meaning “prevalent”, from epi “upon” and demos “people”. The more severe pandemic is so called because it affects everyone (from Greek pan “all”).

A more recent coinage, infodemic, a blend of info and epidemic, was introduced in 2003 to refer to the deluge of misinformation and fake news that accompanied the outbreak of SARS (an acronym formed from the initial letters of “severe acute respiratory syndrome”).

The 17th-century equivalent of social distancing was “avoiding someone like the plague”. According to Samuel Pepys’s account of the outbreak that ravaged London in 1665, infected houses were marked with a red cross and had the words “Lord have mercy upon us” inscribed on the doors. Best to avoid properties so marked.

The name of the current pandemic disease, COVID-19, is a contracted form of coronavirus disease 2019. The term for this genus of viruses was coined in 1968 and referred to their appearance under the microscope, which reveals a distinctive halo or crown (Latin corona). Virus comes from a Latin word meaning “poison”, first used in English to describe a snake’s venom.

The word vaccine comes from the Latin ‘vacca’, meaning ‘cow’.
Shutterstock

The race to find a vaccine has focused on the team at Oxford University’s Jenner Institute, named for Edward Jenner (1749-1823). It was his discovery that contact with cowpox resulted in milkmaids becoming immune to the more severe strain found in smallpox. This discovery is behind the term vaccine (from the Latin vacca “cow”) which gives individuals immunity (originally a term certifying exemption from public service). Inoculation was initially a horticultural term describing the grafting of a bud into a plant: from Latin oculus, meaning “bud” as well as “eye” (as in binoculars “having two eyes”).

Although we are currently adjusting to social distancing as part of the “new normal”, the term itself has been around since the 1950s. It was initially coined by sociologists to describe individuals or groups deliberately adopting a policy of social or emotional detachment.




Read more:
A history of English … in five words


Its use to refer to a strategy for limiting the spread of a disease goes back to the early 2000s, with reference to outbreaks of flu. Flu is a shortening of influenza, adopted into English from Italian following a major outbreak which began in Italy in 1743. Although it is often called the Spanish flu, the strain that triggered the 1918 pandemic most likely began elsewhere, though its origins are uncertain. Its name derives from a particularly severe outbreak in Spain.

To the watchtower

Self-isolation, the measure of protection which involves deliberately cutting oneself off from others, is first recorded in the 1830s – isolate goes back to the Latin insulatus “insulated”, from insula “island”. An extended mode of isolation, known as quarantine, is from the Italian quarantina referring to “40 days”. The specific period derives from its original use to refer to the period of fasting in the wilderness undertaken by Jesus in the Christian gospels.

Lockdown, the most extreme form of social containment, in which citizens must remain in their homes at all times, comes from its use in prisons to describe a period of extended confinement following a disturbance.

Many governments have recently announced a gradual easing of restrictions and a call for citizens to “stay alert”. While some have expressed confusion over this message, for etymologists the required response is perfectly clear: we should all take to the nearest tall building, since alert is from the Italian all’erta “to the watchtower”.

Simon Horobin, Professor of English Language and Literature, University of Oxford

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Museums are losing millions every week but they are already working hard to preserve coronavirus artefacts



The Smithsonian Institution closed all of its museums due to the worldwide COVID-19 pandemic.
Shutterstock

Anna M. Kotarba-Morley, Flinders University

The COVID-19 pandemic has no borders and has caused the deaths of hundreds of thousands of people in countries across the globe. But this outbreak is not just having an effect on the societies of today – it is also impacting our past.

Cultural resources and heritage assets – from sites and monuments, historic gardens and parks, museums and galleries, to the intangible lifeways of traditional culture bearers – require ongoing safeguarding and maintenance in an overstretched world increasingly prone to major crises.

Meanwhile, the heritage sector is already working hard to preserve the COVID-19 moment, predicting that future generations will need documentary evidence, photographic archives and artefacts to help them understand this period of history.

Closed to visitors

The severity of the pandemic, and the infection control responses that followed, have caused great uncertainty and potential long-term knock-on effects within the sector, especially for small and medium-sized institutions and businesses.

A survey published by the Network of European Museum Organisations (NEMO) and communications within organisations such as the International Committee for Archaeological Heritage Management (ICAHM) show that the majority of European museums are closed, incurring significant losses of income. By the beginning of April, 650 museums from 41 countries had responded to the NEMO survey, with 92% reporting they were closed.

Large museums such as the Kunsthistorisches Museum in Vienna and the Rijksmuseum and Stedelijk Museum in Amsterdam are losing €100,000-€600,000 (A$168,700-A$1,012,000) per week. On average, only about 70% of staff are currently being retained at these institutions.

According to the Heritage Sector Briefing to the UK government, museums (both private and national) located in tourist areas have privately reported initial income losses of 75-80%. Reports are also emerging of philanthropic income falling 80-90% at heritage charities, with many heading towards insolvency within weeks.

Cambodia’s Angkor Wat heritage site has lost 99.5% of its income in April compared to the same time last year.

Meanwhile, restorations to the cathedral of Notre-Dame de Paris came to an abrupt halt due to coronavirus just prior to the first anniversary of the fierce fire that damaged it. Builders have since returned to the site.

The situation is especially dire for culture bearers within remote and isolated Indigenous communities still reeling from other catastrophes, such as the disastrous fires in Australia and the Amazon. Without the means to practise social distancing, these communities are at much higher risk of infection, which in turn puts their cultural custodianship at risk.




Read more:
Coronavirus: as culture moves online, regional organisations need help bridging the digital divide


The right to culture

It is interesting to think about how this crisis will reshape visitor experience in the future.

The NEMO survey reports that more than 60% of museums have increased their online presence since closing due to social distancing measures, but only 13.4% have increased their budget for online activities. We have yet to see comprehensive data about online traffic to virtual museums and tours, but early indications point to a significant increase.

As highlighted in the preamble of the 2003 UNESCO Declaration:

cultural heritage is an important component of cultural identity and of social cohesion, so that its intentional destruction may have adverse consequences on human dignity and human rights.

The human right of access to and enjoyment of cultural heritage is guaranteed by international law and emphasised by the Human Rights Council in its recent Resolution 33/20 (2016), which notes:

the destruction of or damage to cultural heritage may have a detrimental and irreversible impact on the enjoyment of cultural rights.

Article 27 of the Universal Declaration of Human Rights states that:

everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits.




Read more:
Protecting heritage is a human right


In the future, generations will need the means to understand how the coronavirus pandemic affected our world, just as they can now reflect on the Spanish Flu or the Black Death.

Preserving a pandemic

Work is underway to preserve this legacy, with organisations such as Historic England collecting “lockdown moments in living memories” by sourcing photographs from the public for their archive. The Twitter account @Viral_Archive, run by a number of academic archaeologists, is working in a similar vein with its theme #ViralShadows.

In the United States, the Smithsonian’s National Museum of American History has assembled a dedicated COVID-19 collection task force. It is already collecting objects including personal protective equipment such as N95 and homemade cloth masks, empty boxes (to show scarcity), and patients’ illustrations.

The National Museum of Australia has invited Australians to share their “experiences, stories, reflections and images of the COVID-19 pandemic” so curators can enhance the “national conversation about an event which is already a defining moment in our nation’s history”. The State Library of New South Wales is collecting images of life in isolation to “help tell this story to future generations”.

Citizen science is a great way to engage the public, and although such work is labour-intensive, it can lead to more online traffic and potentially help offset financial deficits by enticing visitors back to the sites.

The closed Van Gogh Museum in Amsterdam, Netherlands on March 22.
Shutterstock

Priorities here

The timing of the COVID-19 pandemic – occurring in the immediate aftermath of severe drought, a catastrophic fire season and then floods, with inadequate intervening time for maintenance and conservation efforts – presents new challenges.

The federal government reports that in the financial year 2018-19, Australia generated A$60.8 billion in direct tourism gross domestic product (GDP). This represents growth of 3.5% over the previous year – faster than national GDP growth. Tourism directly employed 666,000 Australians, making up 5% of Australia’s workforce. Museums and heritage sites are a significant pillar of tourism income and employment.

Even though the government assures us “heritage is all the things that make up Australia’s identity – our spirit and ingenuity, our historic buildings, and our unique, living landscapes”, its placement within the Department of Agriculture, Water and Environment’s portfolio shows a lack of prioritisation of the sector.

Given the struggles we are already seeing in the arts and culture sector, which has recently been moved to the portfolio of the Department of Infrastructure, Transport, Regional Development and Communications, the future of our heritage (and our past) is far from certain.

Anna M. Kotarba-Morley, Lecturer, Archaeology, Flinders University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Face masks: what the Spanish flu can teach us about making them compulsory



Red Cross nurses in San Francisco, 1918.
Wikimedia

Samuel Cohn, University of Glasgow

Should people be forced to wear face masks in public? That’s the question facing governments as more countries unwind their lockdowns. Over 30 countries have made masks compulsory in public, including Germany, Austria and Poland. This is despite the science saying masks do little to protect wearers, and only might prevent them from infecting other people.

Nicola Sturgeon, the Scottish first minister, has nonetheless announced new guidelines advising Scots to wear masks for shopping or on public transport, while the UK government is expected to announce a new stance shortly. Meanwhile, US vice president Mike Pence has controversially refused to mask up.

This all has echoes of the great influenza pandemic, aka the Spanish flu, which killed some 50 million people in 1918-20. It’s a great case study in how people will put up with very tough restrictions, so long as they think they have merit.

The great shutdown

In the US, no disease in history led to such intrusive restrictions as the great influenza. These included closures of schools, churches, soda fountains, theatres, movie houses, department stores and barber shops, and regulations on how much space should be allocated to people in indoor public places.

There were fines against coughing, sneezing, spitting, kissing and even talking outdoors – those the Boston Globe called “big talkers”. Special influenza police were hired to round up children playing on street corners and occasionally even in their own backyards.

Restrictions were similarly tough in Canada, Australia and South Africa, though much less so in the UK and continental Europe. Where there were such restrictions, the public accepted it all with few objections. Unlike the long history of cholera, especially in Europe, or the plague in the Indian subcontinent from 1896 to around 1902, no mass violence erupted and blame was rare – even against Spaniards or minorities.

Face masks came closest to being the measure that people most objected to, even though masks were often popular at first. The Oklahoma City Times in October 1918 described an “army of young women war workers” appearing “on crowded street cars and at their desks with their faces muffled in gauze shields”. From the same month, The Ogden Standard reported that “masks are the vogue”, while the Washington Times told of how they were becoming “general” in Detroit.

Shifting science

There was scientific debate from the beginning about whether the masks were effective, but the game began to change after the French bacteriologist Charles Nicolle discovered in October 1918 that the influenza pathogen was much smaller than any known bacterium.

The news spread rapidly, even in small-town American newspapers. Cartoons were published that read, “like using barbed wire fences to shut out flies”. Yet this was just at the point that mortality rates were ramping up in the western states of the US and Canada. Despite Nicolle’s discovery, various authorities began making masks compulsory. San Francisco was the first major US city to do so in October 1918, continuing on and off over a three-month period.

Alberta in Canada did likewise, and New South Wales, Australia, followed suit when the disease arrived in January 1919 (the state basing its decision on scientific evidence older than Charles Nicolle’s findings). The only American state to make masks mandatory was (briefly) California, while on the east coast and in other countries including the UK they were merely recommended for most people.

San Francisco gathering, 1918.
Wikimedia

Numerous photographs, like the one above, survive of large crowds wearing masks in the months after Nicolle’s discovery. But many had begun to distrust masks, and saw them as a violation of civil liberties. According to a November 1918 front page report from Utah’s Garland City Globe:

The average man wore the mask slung to the back of his neck until he came in sight of a policeman, and most people had holes cut into them to stick their cigars and cigarettes through.

Disobedience aplenty

San Francisco saw the creation of the Anti-Mask League, as well as protests and civil disobedience. People refused to wear masks in public or ostentatiously wore them improperly. Some went to prison for not wearing them or for refusing to pay fines.

In Tucson, Arizona, a banker insisted on going to jail instead of paying his fine for not masking up. In other western states, judges regularly refused to wear them in courtrooms. In Alberta, “scores” were fined in police courts for not wearing masks. In New South Wales, reports of violations flooded newspapers immediately after masks were made compulsory. Not even stretcher bearers carrying influenza victims followed the rules.

England was different. Masks were only advised as a precautionary measure in large cities, and then only for certain groups, such as influenza nurses in Manchester and Liverpool. Serious questions about efficacy only arose in March 1919, and only within the scientific community. Most British scientists now united against them, with the Lancet calling masks a “dubious remedy”.

These arguments were steadily being bolstered by statistics from the US. The head of California’s state board of health had presented late 1918 findings from San Francisco’s best run hospital showing that 78% of nurses became infected despite their careful wearing of masks.

Physicians and health authorities also presented statistics comparing San Francisco’s mortality rates with those of nearby San Mateo, Los Angeles and Chicago, none of which had made masks compulsory. Their mortality rates were either “no worse” or lower. By the end of the pandemic in 1919, most scientists and health commissions had come to a consensus not unlike ours about the benefits of wearing masks.

Clearly, many of these details are relevant today. It’s telling that a frivolous requirement became such an issue while more severe rules banned things like talking on street corners, kissing your fiancé or attending religious services – even in the heart of America’s Bible belt.

Perhaps there’s something about masks and human impulses that has yet to be studied properly. If mass resistance to the mask should arise in the months to come, it will be interesting to see if new research will produce any useful findings on phobias about covering the face.

Samuel Cohn, Professor of History, University of Glasgow

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Coronavirus lessons from past crises: how WWI and WWII spurred scientific innovation in Australia



CSIRO Archives, CC BY-SA

Tom Spurling, Swinburne University of Technology and Garrett Upstill, Swinburne University of Technology

In the wake of COVID-19, we’re seeing intense international competition for urgently needed supplies, including personal protective equipment and ventilators. In Australia, this could extend to other critical imports such as pharmaceuticals and medicines. And when our manufacturing sector can’t fill unexpected breaks in supply chains, we all face risk.

However, Australians have lived through crises of comparable magnitude before. During and after the two world wars, scientific innovation played a crucial role in reform. It led to the creation of the Council for Scientific and Industrial Research (CSIR) and an array of subsequent discoveries.

Some may assume life will go back to normal once COVID-19 withdraws. But if the past is to be learnt from, Australia should prepare for a greatly different future – hopefully one in which science and innovation once more take centre stage.




Read more:
How Australia played the world’s first music on a computer


The birth of the CSIR

It was WWI that heightened awareness of the role of science in defence and economic growth. In December 1915, Prime Minister William (Billy) Hughes announced he would set up a national laboratory “which would allow men of all branches of science to use their capabilities in application to industry”.

A CSIR council meeting in 1935, held at the McMaster Laboratory in Sydney.
CSIRO Archives, CC BY

This led to the formation of the CSIR in 1926, and its rebirth as the CSIRO in 1949. In the years after WWI, the CSIR contributed greatly to improvements in primary production, including through animal nutrition, disease prevention, and the control of weeds and pests in crops. It also advanced primary product processing and overseas product transport.

In 1937, the CSIR’s mandate was expanded to include secondary industry research, including a national Aircraft and Engine Testing and Research Laboratory. This was motivated by the government’s concern to increase Australia’s manufacturing capabilities and reduce its dependence on technology imports.

War efforts in the spotlight

The CSIR’s research focus shifted in 1941 with the attack on Pearl Harbor. Australian war historian Boris Schedvin has written about the hectic scramble to increase the nation’s defence capacities and expand essential production following the attack, including the expansion of the scientific workforce.

Minister John Dedman died in 1973.
Wikipedia (public domain)

The John Curtin government was commissioned in October 1941. Curtin appointed John Dedman as the Minister for War Organisation and Industry, as well as the minister in charge of the CSIR. Dedman’s department was concerned with producing military supplies and equipment, and other items to support society in wartime.

Dedman instructed the council to concentrate on “problems connected with the war effort”. The CSIR responded robustly. By 1942, the divisions of food preservation and transport, forest products, aeronautics, industrial chemistry, the national standards laboratory and the lubricants and bearings section were practically focused on war work full-time.

Scaling up production

The Division of Industrial Chemistry was the division most closely involved in actual production. It was formed in 1940 with Ian Wark, who had previously worked at the Electrolytic Zinc Company, as its chief.

Wark was familiar with the chemical industry, and quickly devoted resources to developing processes (using Australian materials) to produce essential chemicals to the pilot plant stage. They were soon producing chemicals for drugs at the Fishermans Bend site, including the starting material for the synthesis of the anaesthetic drug novocaine (procaine).

The researchers developed a method to separate the drug ergot, which is now essential in gynaecology, from rye. They also contributed directly to the war effort by manufacturing the plasticiser used in the nose caps of bullets and shells.

CSIRO today

In response to the current pandemic, CSIRO researchers at the Australian Centre for Disease Preparedness in Geelong, Victoria, are working with the international Coalition for Epidemic Preparedness Innovations to improve understanding of the SARS-CoV-2 virus. They are currently testing two vaccine candidates for efficacy, and evaluating the best way to administer the vaccine.

CSIRO’s directors Trevor Drew and Rob Grenfell share progress on COVID-19 vaccine testing being carried out at the Australian Centre for Disease Preparedness in Geelong.

Australian scientists have made monumental contributions on this front in the past. In the 1980s, CSIRO and its university collaborators began efforts that led to the creation of anti-flu drug Relenza, the first drug to successfully treat the flu. Relenza was then commercialised by Australian biotech company Biota, which licensed the drug to British pharmaceutical company GlaxoSmithKline.

The CSIRO also invented the Hendra virus vaccine for horses, launched in 2012.

Prior to that, Ian Frazer at the University of Queensland developed the human papillomavirus (HPV) vaccine which was launched in 2006.




Read more:
How we developed the Hendra virus vaccine for horses


What can we take away?

COVID-19 is one of many viral diseases that need either a vaccine or a drug (or both). Others are hepatitis B, dengue fever, HIV and the viruses that cause the common cold. Now may be Australia’s chance to use our world class medical research and medicinal chemistry capabilities to become a dominant world supplier of anti-viral medications.

As was the case during WWI and WWII, this pandemic drives home the need to retain our capabilities at a time of supply chain disruption. While it’s impossible for a medium-sized economy like Australia’s to be entirely self-sufficient, it’s important we lean on our strengths to not only respond, but thrive during these complicated times.

In 2020, Australia has a much greater and broader research and production capacity than it did in 1940. And as we march through this pandemic, we can learn from the past and forge new paths to enhance our position as pioneers in scientific innovation.

Tom Spurling, Professor of Innovation Studies, Swinburne University of Technology and Garrett Upstill, Visiting Fellow, Swinburne University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Diary of Samuel Pepys shows how life under the bubonic plague mirrored today’s pandemic



There were eerie similarities between Pepys’ time and our own.
Justin Sullivan/Getty Images

Ute Lotz-Heumann, University of Arizona

In early April, writer Jen Miller urged New York Times readers to start a coronavirus diary.

“Who knows,” she wrote, “maybe one day your diary will provide a valuable window into this period.”

During a different pandemic, one 17th-century British naval administrator named Samuel Pepys did just that. He fastidiously kept a diary from 1660 to 1669 – a period of time that included a severe outbreak of the bubonic plague in London. Epidemics have always haunted humans, but rarely do we get such a detailed glimpse into one person’s life during a crisis from so long ago.

There were no Zoom meetings, drive-through testing or ventilators in 17th-century London. But Pepys’ diary reveals that there were some striking resemblances in how people responded to the pandemic.

A creeping sense of crisis

For Pepys and the inhabitants of London, there was no way of knowing whether an outbreak of the plague that occurred in the parish of St. Giles, a poor area outside the city walls, in late 1664 and early 1665 would become an epidemic.

The plague first entered Pepys’ consciousness enough to warrant a diary entry on April 30, 1665: “Great fears of the Sickenesse here in the City,” he wrote, “it being said that two or three houses are already shut up. God preserve us all.”

Portrait of Samuel Pepys by John Hayls (1666).
National Portrait Gallery

Pepys continued to live his life normally until the beginning of June, when, for the first time, he saw houses “shut up” – the term his contemporaries used for quarantine – with his own eyes, “marked with a red cross upon the doors, and ‘Lord have mercy upon us’ writ there.” After this, Pepys became increasingly troubled by the outbreak.

He soon observed corpses being taken to their burial in the streets, and a number of his acquaintances died, including his own physician.

By mid-August, he had drawn up his will, writing, “that I shall be in much better state of soul, I hope, if it should please the Lord to call me away this sickly time.” Later that month, he wrote of deserted streets; the pedestrians he encountered were “walking like people that had taken leave of the world.”

Tracking mortality counts

In London, the Company of Parish Clerks printed “bills of mortality,” the weekly tallies of burials.

Because these lists noted London’s burials – not deaths – they undoubtedly undercounted the dead. Just as we follow these numbers closely today, Pepys documented the growing number of plague victims in his diary.

‘Bills of mortality’ were regularly posted.
Photo 12/Universal Images Group via Getty Images

At the end of August, he cited the bill of mortality as having recorded 6,102 victims of the plague, but feared “that the true number of the dead this week is near 10,000,” mostly because the victims among the urban poor weren’t counted. A week later, he noted the official number of 6,978 in one week, “a most dreadfull Number.”

By mid-September, all attempts to control the plague were failing. Quarantines were not being enforced, and people gathered in places like the Royal Exchange. Social distancing, in short, was not happening.

He was equally alarmed by people attending funerals in spite of official orders. Although plague victims were supposed to be interred at night, this system broke down as well, and Pepys griped that burials were taking place “in broad daylight.”

Desperate for remedies

There are few known effective treatment options for COVID-19. Medical and scientific research need time, but people hit hard by the virus are willing to try anything. Fraudulent treatments, from teas and colloidal silver, to cognac and cow urine, have been floated.

Although Pepys lived during the Scientific Revolution, nobody in the 17th century knew that the Yersinia pestis bacterium carried by fleas caused the plague. Instead, the era’s scientists theorized that the plague was spreading through miasma, or “bad air” created by rotting organic matter and identifiable by its foul smell. Some of the most popular measures to combat the plague involved purifying the air by smoking tobacco or by holding herbs and spices in front of one’s nose.

Tobacco was the first remedy that Pepys sought during the plague outbreak. In early June, seeing shut-up houses “put me into an ill conception of myself and my smell, so that I was forced to buy some roll-tobacco to smell … and chaw.” Later, in July, a noble patroness gave him “a bottle of plague-water” – a medicine made from various herbs. But he wasn’t sure whether any of this was effective. Having participated in a coffeehouse discussion about “the plague growing upon us in this town and remedies against it,” he could only conclude that “some saying one thing, some another.”

A 1666 engraving by John Dunstall depicts deaths and burials in London during the bubonic plague.
Museum of London

During the outbreak, Pepys was also very concerned with his frame of mind; he constantly mentioned that he was trying to be in good spirits. This was not only an attempt to “not let it get to him” – as we might say today – but also informed by the medical theory of the era, which claimed that an imbalance of the so-called humors in the body – blood, black bile, yellow bile and phlegm – led to disease.

Melancholy – which, according to doctors, resulted from an excess of black bile – could be dangerous to one’s health, so Pepys sought to suppress negative emotions; on Sept. 14, for example, he wrote that hearing about dead friends and acquaintances “doth put me into great apprehensions of melancholy. … But I put off the thoughts of sadness as much as I can.”

Balancing paranoia and risk

Humans are social animals and thrive on interaction, so it’s no surprise that so many have found social distancing during the coronavirus pandemic challenging. It can require constant risk assessment: How close is too close? How can we avoid infection and keep our loved ones safe, while also staying sane? What should we do when someone in our house develops a cough?

During the plague, this sort of paranoia also abounded. Pepys found that when he left London and entered other towns, the townspeople became visibly nervous about visitors.

“They are afeared of us that come to them,” he wrote in mid-July, “insomuch that I am troubled at it.”

Pepys succumbed to paranoia himself: In late July, his servant Will suddenly developed a headache. Fearing that his entire house would be shut up if a servant came down with the plague, Pepys mobilized all his other servants to get Will out of the house as quickly as possible. It turned out that Will didn’t have the plague, and he returned the next day.

In early September, Pepys refrained from wearing a wig he bought in an area of London that was a hotspot of the disease, and he wondered whether other people would also fear wearing wigs because they could potentially be made of the hair of plague victims.

And yet he was willing to risk his health to meet certain needs; by early October, he visited his mistress without any regard for the danger: “round about and next door on every side is the plague, but I did not value it but there did what I could con ella.”

Just as people around the world eagerly wait for a falling death toll as a sign of the pandemic letting up, so did Pepys derive hope – and perhaps the impetus to see his mistress – from the first decline in deaths in mid-September. A week later, he noted a substantial decline of more than 1,800.

Let’s hope that, like Pepys, we’ll soon see some light at the end of the tunnel.


Ute Lotz-Heumann, Heiko A. Oberman Professor of Late Medieval and Reformation History, University of Arizona

This article is republished from The Conversation under a Creative Commons license. Read the original article.


How the rich reacted to the bubonic plague has eerie similarities to today’s pandemic



Franz Xavier Winterhalter’s ‘The Decameron’ (1837).
Heritage Images via Getty Images

Kathryn McKinley, University of Maryland, Baltimore County

The coronavirus can infect anyone, but recent reporting has shown your socioeconomic status can play a big role, with a combination of job security, access to health care and mobility widening the gap in infection and mortality rates between rich and poor.

The wealthy work remotely and flee to resorts or pastoral second homes, while the urban poor are packed into small apartments and compelled to keep showing up to work.

As a medievalist, I’ve seen a version of this story before.

Following the 1348 Black Death in Italy, the Italian writer Giovanni Boccaccio wrote a collection of 100 novellas titled “The Decameron.” These stories, though fictional, give us a window into medieval life during the Black Death – and how some of the same fissures opened up between the rich and the poor. Cultural historians today see “The Decameron” as an invaluable source of information on everyday life in 14th-century Italy.

Giovanni Boccaccio.
Leemage via Getty Images

Boccaccio was born in 1313 as the illegitimate son of a Florentine banker. A product of the middle class, he wrote, in “The Decameron,” stories about merchants and servants. This was unusual for his time, as medieval literature tended to focus on the lives of the nobility.

“The Decameron” begins with a gripping, graphic description of the Black Death, which was so virulent that a person who contracted it would die within four to seven days. Between 1347 and 1351, it killed between 40% and 50% of Europe’s population. Some of Boccaccio’s own family members died.

In this opening section, Boccaccio describes the rich secluding themselves at home, where they enjoy quality wines and provisions, music and other entertainment. The very wealthiest – whom Boccaccio describes as “ruthless” – deserted their neighborhoods altogether, retreating to comfortable estates in the countryside, “as though the plague was meant to harry only those remaining within their city walls.”

Meanwhile, the middle class or poor, forced to stay at home, “caught the plague by the thousand right there in their own neighborhood, day after day” and swiftly passed away. Servants dutifully attended to the sick in wealthy households, often succumbing to the illness themselves. Many, unable to leave Florence and convinced of their imminent death, decided to simply drink and party away their final days in nihilistic revelries, while in rural areas, laborers died “like brute beasts rather than human beings; night and day, with never a doctor to attend them.”

Josse Lieferinxe’s ‘Saint Sebastian Interceding for the Plague Stricken’ (c. 1498).
Wikimedia Commons

After the bleak description of the plague, Boccaccio shifts to the 100 stories. They’re narrated by 10 nobles who have fled the pallor of death hanging over Florence to luxuriate in amply stocked country mansions. From there, they tell their tales.

One key issue in “The Decameron” is how wealth and advantage can impair people’s ability to empathize with the hardships of others. Boccaccio begins the foreword with the proverb, “It is inherently human to show pity to those who are afflicted.” Yet in many of the tales he goes on to present characters who are sharply indifferent to the pain of others, blinded by their own drives and ambition.

In one fantasy story, a dead man returns from hell every Friday and ritually slaughters the same woman who had rejected him when he was alive. In another, a widow fends off a leering priest by tricking him into sleeping with her maid. In a third, the narrator praises a character for his undying loyalty to his friend when, in fact, he has profoundly betrayed that friend over many years.

Humans, Boccaccio seems to be saying, can think of themselves as upstanding and moral – but unawares, they may show indifference to others. We see this in the 10 storytellers themselves: They make a pact to live virtuously in their well-appointed retreats. Yet while they pamper themselves, they indulge in some stories that illustrate brutality, betrayal and exploitation.

Boccaccio wanted to challenge his readers, and make them think about their responsibilities to others. “The Decameron” raises the questions: How do the rich relate to the poor during times of widespread suffering? What is the value of a life?

In our own pandemic, with millions unemployed due to a virus that has killed thousands, these issues are strikingly relevant.

This is an updated version of an article originally published on April 16, 2020.


Kathryn McKinley, Professor of English, University of Maryland, Baltimore County

This article is republished from The Conversation under a Creative Commons license. Read the original article.


This isn’t the first global pandemic, and it won’t be the last. Here’s what we’ve learned from 4 others throughout history



Wikimedia/Pierart dou Tielt

David Griffin, The Peter Doherty Institute for Infection and Immunity and Justin Denholm, Melbourne Health

The course of human history has been shaped by infectious diseases, and the current crisis certainly won’t be the last time.

However, we can capitalise on the knowledge gained from past experiences, and reflect on how we’re better off this time around.




Read more:
Four of the most lethal infectious diseases of our time and how we’re overcoming them


1. The Plague, or ‘Black Death’ (14th Century)

While outbreaks of the plague (caused by the bacterium Yersinia pestis) still occur in several parts of the world, there are two that are particularly infamous.

The 200-year long Plague of Justinian began in 541 CE, wiping out millions in several waves across Europe, North Africa and the Middle East and crimping the expansionary aspirations of the Roman Empire (although some scholars argue that its impact has been overstated).

Then there’s the better known 14th century pandemic, which likely emerged from China and decimated populations in Asia, Europe and Northern Africa.

Perhaps one of the greatest public health legacies to have emerged from the 14th century plague pandemic is the concept of “quarantine”, from the Venetian term “quarantena” meaning forty days.

The 14th century Black Death pandemic is thought to have catalysed enormous societal, economic, artistic and cultural reforms in Medieval Europe. It illustrates how infectious disease pandemics can be major turning points in history, with lasting impacts.

For example, widespread death caused labour shortages across feudal society, and often led to higher wages, cheaper land, better living conditions and increased freedoms for the lower class.

Various authorities lost credibility, since they were seen to have failed to protect communities from the overwhelming devastation of plague. People began to openly question long-held certainties around societal structure, traditions, and religious orthodoxy.

This prompted fundamental shifts in peoples’ interactions and experience with religion, philosophy, and politics. The Renaissance period, which encouraged humanism and learning, soon followed.

The Dance of Death, or Danse Macabre was a common artistic trope of the time of the Black Death.
Public Domain/Wikimedia

The Black Death also had profound effects on art and literature, which took on more pessimistic and morbid themes. There were vivid depictions of violence and death in Biblical narratives, still seen in many Christian places of worship across Europe.

How COVID-19 will reshape our culture, and what unexpected influence it will have for generations to come is unknown. There are already clear economic changes arising from this outbreak, as some industries rise, others fall and some businesses seem likely to disappear forever.

COVID-19 may permanently normalise the use of virtual technologies for socialising, business, education, healthcare, religious worship and even government.

2. Spanish influenza (1918)

The 1918 “Spanish Flu” pandemic’s reputation as one of the deadliest in human history is due to a complex interplay between how the virus works, the immune response and the social context in which it spread.

It arose in a world left vulnerable by the preceding four years of World War I. Malnutrition and overcrowding were common.

Around 500 million people were infected – a third of the global population at the time – leading to 50-100 million deaths.

A unique characteristic of infection was its tendency to kill healthy adults between the ages of 20 and 40.

At the time, influenza infection was attributed to a bacterium (Haemophilus influenzae) rather than a virus. Antibiotics for secondary bacterial infections were still more than a decade away, and intensive care wards with mechanical ventilators were unheard of.

Clearly, our medical and scientific understanding of the ‘flu in 1918 made it difficult to combat. However, public health interventions, including quarantine, the use of face masks and bans on mass gatherings helped limit the spread in some areas, building on prior successes in controlling tuberculosis, cholera and other infectious diseases.

Australia imposed maritime quarantine, requiring all arriving ships to be cleared by Commonwealth quarantine officials before disembarkation. This likely delayed and reduced the impact of Spanish flu on Australia, and had flow-on effects for other Pacific islands.

The effect of maritime quarantine was most striking in Western and American Samoa: the latter enforced strict quarantine and experienced no deaths. By contrast, 25% of Western Samoans died after influenza was introduced by a ship from New Zealand.

In some cities, mass gatherings were banned, and schools, churches, theatres, dance and pool halls closed.

In the United States, cities that committed earlier, longer and more aggressively to social distancing interventions not only saved lives, but also emerged economically stronger than those that didn’t.

Face masks and hand hygiene were popularised and sometimes enforced in cities.

In San Francisco, a Red Cross-led public education campaign was combined with mandatory mask-wearing outside the home.

This was tightly enforced in some jurisdictions by police officers issuing fines, and at times using weapons.


The Conversation, CC BY-ND

3. HIV/AIDS (20th century)

The first reported cases of HIV/AIDS in the Western world emerged in 1981.

Since then, around 75 million people have become infected with HIV, and about 32 million people have died.

Many readers may remember how baffling and frightening the HIV/AIDS pandemic was in the early days (and still is in many parts of the developing world).

We now understand that people living with HIV infection who are on treatment are far less likely to develop serious complications.

These treatments, known as antiretrovirals, stop HIV from replicating. This can lead to an “undetectable viral load” in a person’s blood. Evidence shows that people with an undetectable viral load can’t pass the virus on to others during sex.

Condoms and PrEP (short for “pre-exposure prophylaxis,” where people take an oral antiretroviral pill once a day) can be used by people who don’t have HIV infection to reduce the risk of acquiring the virus.

Unfortunately, there are currently no proven antivirals available for the prevention or treatment of COVID-19, though research is ongoing.

The HIV pandemic taught us about the value of a well-designed public health campaign, and the importance of contact tracing. Broad testing in appropriate people is fundamental to this, to understand the extent of infection in the community and allow appropriately targeted individual and population-level interventions.

It also demonstrated that words and stigma matter; people need to feel they can test safely and be supported, rather than ostracised. Stigmatising language can fuel misconceptions, discrimination and discourage testing.

4. Severe Acute Respiratory Syndrome (SARS) (2002-2003)

The current pandemic is the third coronavirus outbreak in the past two decades.

The first was in 2002, when SARS emerged from horseshoe bats in China and spread to at least 29 countries around the world, causing 8,098 cases and 774 deaths.

SARS was finally contained in July 2003. SARS-CoV-2, however, appears to spread much more easily than the original SARS coronavirus.

To some extent, SARS was a practice run for COVID-19. Researchers focused on SARS and MERS (Middle East Respiratory Syndrome, another coronavirus that remains a problem in some regions) are providing important foundational research for potential vaccines against SARS-CoV-2.

Knowledge gleaned from SARS may also lead to antiviral drugs to treat the current virus.

SARS also emphasised the importance of communication in a pandemic, and the need for frank, honest and timely information sharing.

Certainly, SARS was a catalyst for change in China; the government invested in enhanced surveillance systems that facilitate the real-time collection and communication of data on infectious diseases and syndromes from emergency departments to a centralised government database.

This was coupled with the International Health Regulations, which require the reporting of unusual and unexpected outbreaks of disease.

Advances in science and information technology, combined with knowledge gained from SARS, allowed us to quickly isolate, sequence and share SARS-CoV-2 data globally. Likewise, important clinical information was distributed early to the medical community.

SARS demonstrated how quickly and comprehensively a virus could spread around the world in the era of air transportation, and the role of individual “superspreaders”.

SARS also underlined the inextricable link between human, animal and environmental health, known as “One Health”. Close contact between humans and animals can facilitate the crossover of germs between species.

Finally, a crucial, but perhaps overlooked lesson from SARS is the need for sustained investment in vaccine and infectious disease treatment research.




Read more:
Coronavirus is a wake-up call: our war with the environment is leading to pandemics


Few infectious disease researchers were surprised when another coronavirus pandemic broke out. A globalised world of crowded, well-connected cities, where humans and animals live in close proximity, provides fertile conditions for infectious diseases.

We must be ever prepared for the emergence of another pandemic, and learn the lessons of history to navigate the next threat.

David Griffin, Infectious Diseases Fellow, The Peter Doherty Institute for Infection and Immunity and Justin Denholm, Associate Professor, Melbourne Health

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Florence Nightingale: a pioneer of hand washing and hygiene for health



Helping the wounded.
Shutterstock/Everett Historical

Richard Bates, University of Nottingham

Florence Nightingale, who was born 200 years ago, is rightly famed for revolutionising nursing. Her approach to caring for wounded soldiers and training nurses in the 19th century saved and improved countless lives. And her ideas on how to stay healthy still resonate today – as politicians give official guidance on how best to battle coronavirus.

For example, Nightingale did not fully subscribe to the idea that many diseases are caused by specific micro-organisms known as germs until the 1880s, when she was in her sixties. But she was well aware of the importance of hand washing. In her book Notes on Nursing (1860), she wrote that:

Every nurse ought to be careful to wash her hands very frequently during the day. If her face, too, so much the better.

During the Crimean War (1853-1856), Nightingale had implemented hand washing and other hygiene practices in British army hospitals. This was relatively new advice, first publicised in the 1840s by the Hungarian doctor Ignaz Semmelweis, who had observed the dramatic difference it made to death rates on maternity wards.

Nightingale’s attention to international medical research and developments was just one factor behind her ability to make effective interventions in public health. Like many public health experts of her age, Nightingale considered the home to be a crucial site for disease-preventing interventions. This was the place where most people contracted and suffered from infectious diseases. (The same is true today: in Wuhan’s coronavirus outbreak, around 75-80% of transmissions were reportedly in family clusters).

Nightingale’s book, Notes on Nursing (1860), was more of a public health instruction book than a nursing manual. It advised ordinary people how to maintain healthy homes – particularly women, in accordance with the worldview of the times. There was straightforward advice on everything from how to avoid excessive smoke from fireplaces (don’t let the fire get too low, and don’t overwhelm it with coal) to the safest material with which to cover walls (oil paints, not wallpaper).

Nightingale strongly counselled that people open windows to maximise light and ventilation and displace “stagnant, musty and corrupt” air. And she advocated improving drainage to combat water-borne diseases like cholera and typhoid.

In her view, all domestic interiors must be kept clean. Dirty carpets and unclean furniture, she wrote with characteristic bluntness, “pollute the air just as much as if there were a dung heap in the basement”.

Notes on Nursing also called upon the “mistress” of every building to clean “every hole and corner” of her home regularly, for the sake of her family’s health. But Nightingale also recommended a more holistic approach to health. She encouraged soldiers to read, write and socialise during their convalescence so they would not sink into boredom and alcoholism.

Good data

During her youth, Nightingale’s father had introduced her to a leading practitioner of statistics, then a brand-new academic field, and paid for her to have a mathematics tutor.

She went on to produce her famous diagrams, which demonstrated the high proportion of soldiers’ deaths caused by disease as opposed to battle wounds, and became the first woman admitted to the London Statistical Society in 1858.

Thereafter she designed questionnaires to obtain data on such questions as the sanitary condition of army stations in India, or the mortality rates of aboriginal populations in Australia. Her guiding principle was that a health problem could only be effectively tackled once its dimensions were reliably established.

In 1857, around a year after returning from the Crimean War, Nightingale suffered a severe collapse, now believed to have been caused by brucellosis, a bacterial infection with flu-like symptoms. For much of her subsequent life, she was racked with chronic pain, often unable to walk or leave her bed.

Working from home

Having been declared an invalid, she imposed a rule of seclusion on herself because of pain and tiredness rather than from fears of contagion – a form of self-isolation that extended to her closest family (though she still had servants and other visitors).

During her first years of working entirely from home, Nightingale’s productivity was extraordinary. As well as writing Notes on Nursing, she produced an influential 900-page report on the medical failings during the Crimean War, and a book on hospital design.

This was in addition to setting up the Nightingale Training School for nurses at St Thomas’ hospital in London in 1860, and a midwifery training programme at King’s College Hospital in 1861, plus advising on the design of a number of new hospitals.

Later in the 1860s, Nightingale proposed a reform of workhouse infirmaries to make them high-quality, taxpayer-funded hospitals; she also worked on sanitary and social reforms in India. All of this she accomplished without leaving her house (though government ministers sometimes came to her home for meetings).

Having said this, it is worth remembering that Nightingale’s was a privileged form of self-isolation. Her father’s fortune, derived from Derbyshire mining interests, meant she had no money worries.

She lived in a nice house in London with various assistants and servants to help, shop and cook for her, and had no children to look after. Her entire waking time could be devoted to reading and writing. So while this is an appropriate time to recall and celebrate the huge contribution Nightingale made to modern nursing and public health care, we shouldn’t feel too bad if we don’t quite live up to her high standards of isolated productivity.

Richard Bates, Postdoctoral Research Fellow, Department of History, University of Nottingham

This article is republished from The Conversation under a Creative Commons license. Read the original article.

