The link below is to an infographic/timeline of the history of medical technology.
At Sydney’s enormous Rookwood Cemetery, a lichen-spotted headstone captures a family’s double burden of grief.
The grave contains the remains of 19-year-old Harriet Ann Ottaway, who died on 2 July 1919. Its monument also commemorates her brother Henry James Ottaway, who “died of wounds in Belgium, 23rd Sept 1917, aged 21 years”.
While Henry was killed at the infamous Battle of Passchendaele, Harriet’s headstone makes no mention of her own courageous combat with “Spanish flu”.
Harriet’s story typifies the enduring public silence around the pneumonic influenza pandemic of 1918–19. Worldwide, it killed an estimated 50–100 million people – at least three times the number of deaths caused by the First World War.
Why historians ignored the Spanish flu
After the disease came ashore in January 1919, about a third of all Australians were infected and the flu left nearly 15,000 dead in under a year. Those figures match the average annual death rate for the Australian Imperial Force throughout 1914–18.
Arguably, we could consider 1919 as another year of war, albeit against a new enemy. Indeed, the typical victims had similar profiles: fit, young adults aged 20-40. The major difference was that in 1919, women like Harriet formed a significant proportion of the casualties.
Deadly flu spread rapidly
There was no doubt about the medical and social impact of the “Spanish flu”. Although its origins remain contested, it certainly didn’t arise in Spain. What is known is that by early 1918, a highly infectious respiratory disease, caused by a then-unknown agent, was moving rapidly across Europe and the United States. By the middle of that year, as the war was reaching a tipping point, it had spread to Africa, India and Asia.
It also took on a much deadlier profile. While victims initially suffered the typical signs and symptoms of influenza – including aches, fever, coughing and an overwhelming weariness – a frighteningly high proportion went rapidly downhill.
Patients’ lungs filled with fluid – which is why it became known as “pneumonic influenza” – and they struggled to breathe. For nurses and doctors, a tell-tale sign of impending death was a blue, plum or mahogany colour in the victim’s cheeks.
This, sadly, was the fate of young Harriet Ottaway. Having nursed a dying aunt through early 1919, in June she tended her married sister Lillian, who had come down with pneumonic influenza.
Despite taking the recommended precautions, Harriet contracted the infection and died in hospital. Ironically, Lillian survived. But in the space of less than two years she had lost both a brother to the Great War and her younger sister to the Spanish flu.
An intimate impact worldwide
Indeed, as Harriet’s headstone reminds us, this was an intimate pandemic. The statistics can seem overwhelming until you realise what it means that about a third of the entire world’s population was infected.
It wasn’t just victims who were affected. Across Australia, regulations intended to reduce the spread and impact of the pandemic caused profound disruption. The nation’s quarantine system held back the flu for several months, meaning that a less deadly version came ashore in 1919.
But it caused delay and resentment for the 180,000 soldiers, nurses and partners who returned home by sea that year.
Responses within Australia varied from state to state but the crisis often led to the closure of schools, churches, theatres, pubs, race meetings and agricultural shows, plus the delay of victory celebrations.
The result was not only economic hardship, but significant interruptions in education, entertainment, travel, shopping and worship. The funeral business boomed, however, as the nation’s annual death rate went up by approximately 25%.
Yet for some reason, the silence of Harriet’s headstone is repeated across the country. Compared with the Anzac memorials that peppered our towns and suburbs in the decades after the Great War, few monuments mark the impact of pneumonic influenza.
Nevertheless, its stories of suffering and sacrifice have been perpetuated in other ways, especially within family and community memories. A century later, these stories deserve to be researched and commemorated.
Despite the disruption, fear and substantial personal risk posed by the flu, tens of thousands of ordinary Australians rose to the challenge. The wartime spirit of volunteering and community service saw church groups, civic leaders, council workers, teachers, nurses and organisations such as the Red Cross step up.
They staffed relief depots and emergency hospitals, delivered comforts from pyjamas to soup, and cared for victims who were critically ill or convalescent. A substantial proportion of these courageous carers were women, at a time when many were being commanded to hand back their wartime jobs to returning servicemen.
In resurrecting stories such as the sad tale of Harriet Ottaway, it’s time to restore our memories of the “Spanish flu” and commemorate how our community came together to battle this unprecedented public crisis.
The poor health conditions of eight young Aboriginal people who died around the time of early European colonisation have been revealed in their skeletal remains, according to a new study.
The bones provide evidence of the displacement of Indigenous Australians from their traditional lands as a result of European colonisation. We view this as an opportunity to undertake “truth-telling” of our colonial history, as outlined in the 2017 Uluru Statement from the Heart.
The remains were sold as “scientific specimens” to the Australian Museum in Sydney in the early 20th century, but were repatriated in the 1990s to the local community in remote northwest Queensland.
A discovery of skeletal remains
In 2015 one of us (Michael) was contacted by the Queensland Police for advice on the skeletal remains of several individuals. They had been found eroding from a floodplain just outside the town of Normanton.
They were identified as Aboriginal but it was obvious they were not from a traditional Aboriginal burial site.
The remains appeared to have been reburied together. They were heavily weathered and did not include complete skeletons, just skulls and some long bones.
The state archaeologist Stephen Nichols contacted several museums, and deduced that these individuals had been repatriated in the 1990s from the Australian Museum. At around the same time, local Aboriginal people told police that the remains had been reburied in this location after their repatriation.
It quickly became apparent that these were the remains of eight young people who had died of disease on the colonial frontier in the late 19th century and had been collected by the Aboriginal Protector, Walter Roth.
The collection of Aboriginal skeletal remains (ancestral remains) was common practice in the 19th and much of the 20th century. Today, many thousands of individuals remain in institutions around the world awaiting repatriation.
The Gkuthaarn and Kukatj people from Normanton wanted to find out more about the lives of these people who had been taken from their country. They discussed this after one of us (Michael) attended the site.
The human skeleton provides a unique record of an individual’s life history. Our investigation showed the remains were all young people, with an average age of about 15 years, and some as young as seven.
Evidence of stress
The remains told the story of young people who had undergone significant nutritional stress in their formative years. This was evident from linear stress markers recorded as defects in their tooth enamel, referred to as dental enamel hypoplasias.
The teeth also indicated that while traditional foods were still important in their diet they also regularly consumed European foods rich in sugar and carbohydrates. This had created dental caries (cavities) in their teeth, similar to those we see today in many modern populations but which are unknown in pre-contact Aboriginal remains.
Walter Roth, reporting in 1901, wrote about the high frequency of disease among Aboriginal people barely holding on in the fringe camps around Normanton. He reported that “about half” of the 176 Aboriginal inhabitants were suffering from introduced venereal diseases.
The remains provide first-hand pathological evidence in the wake of colonisation. In one individual there were signs of a pathological lesion defined as caries sicca, a lesion diagnostic of syphilis.
Syphilis was also evident in two tibiae (lower leg bones) reburied with the crania (skulls minus the jaws) in the form of a condition known as sabre shin, where significant bowing of these long bones is evident.
This all provides evidence of the stress that Aboriginal people endured during the early colonial period.
‘Truth telling’ and history
The Gkuthaarn and Kukatj people’s request for help was in the spirit of the Uluru Statement from the Heart where “truth telling” about the colonial past was emphasised as a priority for reconciliation between all Australians.
Research into our shared colonial past plays a fundamental role in this objective. Bioarchaeology can offer new narratives from the historic period that have not been captured in the historic record.
Some archaeologists have called for a post-colonial approach to the discipline, in which we establish, together with Aboriginal people, the types of historic investigations they consider important.
Traditionally this has not included research on the skeletal remains of their ancestors, as this has been a taboo research area for many Aboriginal groups.
But in parts of the country, Indigenous attitudes towards research are changing, with groups such as the Gkuthaarn and Kukatj people wanting to know more about their past.
As one Indigenous leader from this community said:
… these were young people who left behind such a sad story that needs to be told so non-Indigenous people, not just throughout Australia but particularly in our region of northwest Queensland, know and understand that these traumas still impact on our people 120 years later.
These eight young people from Normanton, who died at the end of the 19th century, are not forgotten. They provide tangible evidence of the hardships that Aboriginal people endured through the colonial acquisition of their land and displacement of their way of life.
Susan Burton Phillips, Counsel to the Gkuthaarn and Kukatj people, contributed to this article.
Shaun Adams, Isotope Bioarchaeologist Research Fellow, Griffith University; Michael Westaway, Senior Research Fellow, Australian Research Centre for Human Evolution, Griffith University, and Richard Martin, Senior lecturer, The University of Queensland
But there’s a long history of opposition to childhood vaccination, from when it was introduced in England in 1796 to protect against smallpox. And many of the themes played out more than 200 years ago still resonate today.
For instance, whether childhood vaccination should be compulsory, or whether there should be penalties for not vaccinating, was debated then as it is now.
Throughout the 19th century, anti-vaxxers widely opposed Britain’s compulsory vaccination laws, leading to their effective end in 1907, when it became much easier to be a conscientious objector. Today, the focus in Australia has turned to ‘no jab, no pay’ or ‘no jab, no play’, policies linking childhood vaccination to welfare payments or childcare attendance.
Of course, the methods vaccine objectors use to discuss their position have changed. Today, people share their views on social media, blogs and websites; then, they wrote letters to newspapers for publication – the focus of my research.
Many studies have looked at the role of organised anti-vaccination societies in shaping the vaccination debate. However, “letters to the editor” let us look beyond the inner workings of these societies to show what ordinary people thought about vaccination.
Many of the UK’s larger metropolitan newspapers were wary of publishing letters opposing vaccination, especially those criticising the laws. However, regional newspapers would often publish them.
As part of my research, I looked at more than 1,100 letters to the editor, published in 30 newspapers from south-west England. Here are some of the recurring themes.
Smallpox vaccination a gruesome affair
In 19th century Britain, the only vaccine widely available to the public was against smallpox. Vaccination involved making a series of deep cuts to the arm of the child into which the doctor would insert matter from the wound of a previously vaccinated child.
These open wounds left many children vulnerable to infections, blood poisoning and gangrene. Parents and anti-vaccination campaigners alike described the gruesome scenes that often accompanied the procedure, like this example from the Royal Cornwall Gazette from December 1886:
Some of these poor infants have been borne on pillows for weeks, decaying alive before death ended their sufferings.
Conspiracy theories and vaccine cults
Side-effects were so widespread many parents refused to vaccinate their children. And letters to the editor show they became convinced the medical establishment and the government were aware of the dangers of vaccination.
If this was the case, why was vaccination compulsory? The answer, for many, could be found in a conspiracy theory.
Their letters argued doctors had conned the government into enforcing compulsory vaccination so they could reap the financial benefits. After all, public vaccinators were paid a fee for each child they vaccinated. So people believed compulsory vaccination must have been introduced to maximise doctors’ profits, as this example from the Wiltshire Times in February 1894 shows:
What are the benefits of vaccination? Salaries and bonuses to public vaccinators; these are the benefits; while the individuals who have to endure the operation also have to endure the evils which result from it. Health shattered, lives crippled or destroyed – are these benefits?
Conspiracy theories went further. If doctors knew vaccination could result in infections, then they knew children died from the procedure. As a result, some conspiracy theorists began to argue there was something inherently evil about vaccination. Some saw vaccination as “the mark of the beast”, a ritual perpetuated by a “vaccine cult”. Writing in the Salisbury Times, in December 1903, one critic said:
This is but the prototype of that modern species of doctorcraft, which would have us believe that their highly remunerative invocations of the vaccine god alone avert the utter extermination of the human race by small-pox.
For many, the issue of compulsory vaccination was directly related to the rights of the individual. Just like modern anti-vaccination arguments, many people in the 19th century believed compulsory vaccination laws were an incursion into the rights enjoyed by free citizens.
By submitting to the compulsory vaccination laws, a parent was allowing the government to insert itself into the individual home, and take control of a child’s body, something traditionally protected by the parent. Here’s an example from the Royal Cornwall Gazette in April 1899:
[…] civil and religious liberty must of necessity include the right to protect healthy children from calf-lymph defilement […] trust […] cannot be handed over at the demand of a medical tradesunion, or tamely relinquished at the cool request of some reverend rural justice of the peace.
What can we learn by looking at the past?
If anti-vaccination arguments from the past significantly overlap with those presented by their counterparts today, then we can learn about how to deal with anti-vaccination movements in the future.
Not only can we see that compulsory vaccination laws in Australia could, as some researchers say, be problematic; we can also use the history of vaccine opposition to better understand why vaccination remains so controversial for some people.
Surgeries and treatments come and go. A new BMJ guideline, for example, makes “strong recommendations” against the use of arthroscopic surgery for certain knee conditions. But while this keyhole surgery may slowly be scrapped in some cases due to its ineffectiveness, a number of historic “cures” fell out of favour because they were more akin to a method of torture. Here are five of the most extraordinary and unpleasant.
1. Trepanation

Trepanation (drilling or scraping a hole in the skull) is the oldest form of surgery we know of. Humans have been performing it since Neolithic times. We don’t know why people did it, but some experts believe it could have been to release demons from the skull. Surprisingly, some people lived for many years after this brutal procedure was performed on them, as revealed by ancient skulls that show evidence of healing.
Although surgeons no longer scrape holes in people’s skulls to release troublesome spirits, there are still reports of doctors performing the procedure to relieve pressure on the brain. For example, a GP at a district hospital in Australia used an electric drill he found in a maintenance cupboard to bore a hole in a 13-year-old boy’s skull. Without the surgery, the boy would have died from a blood clot on the brain.
2. Lobotomy

It’s hard to believe that a procedure more brutal than trepanation was widely performed in the 20th century. Lobotomy involved severing connections in the brain’s prefrontal lobe with an implement resembling an icepick (a leucotome).
Antonio Egas Moniz, a Portuguese neurologist, invented the procedure in 1935. A year later, Walter Freeman brought the procedure to the US. Freeman was an evangelist for this new form of “psychosurgery”. He drove around the country in his “loboto-mobile” performing the procedure on thousands of hapless patients.
Instead of a leucotome, Freeman used an actual icepick, which he would hammer through the corner of an eye socket using a mallet. He would then jiggle the icepick around in a most unscientific manner. Patients weren’t anaesthetised; instead, they were rendered unconscious by an electrically induced seizure.
Thankfully, advances in psychiatric drugs saw the procedure fall from favour in the 1960s. Freeman performed his last two icepick lobotomies in 1967. One of the patients died from a brain haemorrhage three days later.
3. Lithotomy

Ancient Greek, Roman, Persian and Hindu texts refer to a procedure, known as lithotomy, for removing bladder stones. The patient would lie on their back, feet apart, while a blade was passed into the bladder through the perineum – the soft flesh between the sex organs and anus. Further indignity was inflicted by surgeons inserting their fingers or surgical instruments into the rectum or urethra to assist in the removal of the stone. It was an intensely painful procedure with a mortality rate of about 50%.
The number of lithotomy operations performed began to fall in the 19th century, and it was replaced by more humane methods of stone extraction. Healthier diets in the 20th century helped make bladder stones a rarity, too.
4. Rhinoplasty (old school)
Syphilis arrived in Italy in the 16th century, possibly carried by sailors returning from the newly exploited Americas (the so-called Columbian exchange).
The sexually transmitted disease had a number of cruel symptoms, one of which was known as “saddle-nose”, where the bridge of the nose collapses. This nasal deformity was an indicator of indiscretions, and many used surgery to try and hide it.
An Italian surgeon, Gaspare Tagliacozzi, developed a method for concealing this nasal deformity. He created a new nose using tissue from the patient’s arm. He would then cover this with a flap of skin from the upper arm, which was rather awkwardly still attached to the limb. Once the skin graft was firmly attached – after about three weeks – Tagliacozzi would separate the skin from the arm.
There were reported cases of patients’ noses turning purple in cold winter months and falling off.
Today, syphilis is easily treated with a course of antibiotics.
5. Bloodletting

Losing blood, in modern medicine, is generally considered to be a bad thing. But, for about 2,000 years, bloodletting was one of the most common procedures performed by surgeons.
The procedure was based on a flawed scientific theory that humans possessed four “humours” (fluids): blood, phlegm, black bile and yellow bile. An imbalance in these humours was thought to result in disease. Lancets, blades or fleams (some spring loaded for added oomph) were used to open superficial veins, and in some cases arteries, to release blood over several days in an attempt to restore balance to these vital fluids.
Bloodletting in the West continued up until the 19th century. In 1838, Henry Clutterbuck, a lecturer at the Royal College of Physicians, claimed that “blood-letting is a remedy which, when judiciously employed, it is hardly possible to estimate too highly”.
Finally, one medical procedure, dating from one of the earliest Egyptian medical texts, that isn’t used anymore – and I can’t for the life of me think why – is the administration of half an onion and the froth of beer. It cures death, apparently.
This is a post I’m putting up on all my Blogs, even though a particular Blog may be unaffected thanks to scheduled posting. Posts have been a little sparse over the last week or so, and may stay that way for the moment. I have been ill with various illnesses and complaints for the last several weeks, so I have decided to take the next week off from most Blogging activities in an attempt to rest and recover – if I can while still doing my very physically demanding actual job in the real world. I hope to return to Blogging full time in about a week.
The link below is to an article taking a look at ‘hazmat suits’ from the great Bubonic Plague era.
The link below is to an article that looks at the old practice of body snatching for science.