Category Archives: Health and Fitness

A Brief History of the Lobotomy


The link below is to an article that takes a brief look at the history of the lobotomy.

For more visit:
https://lithub.com/a-brief-and-awful-history-of-the-lobotomy/


Infographic: Timeline of Medical Technology


The link below is to an infographic/timeline of the history of medical technology.

For more visit:
https://coolinfographics.com/blog/2019/4/5/5000-year-timeline-of-medical-technology


100 years later, why don’t we commemorate the victims and heroes of ‘Spanish flu’?


Women were at the forefront of managing the influenza pandemic.
AUSTRALIAN WAR MEMORIAL

Peter Hobbins, University of Sydney

At Sydney’s enormous Rookwood Cemetery, a lichen-spotted headstone captures a family’s double burden of grief.

The grave contains the remains of 19-year-old Harriet Ann Ottaway, who died on 2 July 1919. Its monument also commemorates her brother Henry James Ottaway, who “died of wounds in Belgium, 23rd Sept 1917, aged 21 years”.

While Henry was killed at the infamous Battle of Passchendaele, Harriet’s headstone makes no mention of her own courageous combat with “Spanish flu”.

Harriet’s story typifies the enduring public silence around the pneumonic influenza pandemic of 1918–19. Worldwide, it killed an estimated 50-100 million people – at least three times all of the deaths caused by the First World War.




Read more:
Why historians ignored the Spanish flu


After the disease came ashore in January 1919, about a third of all Australians were infected and the flu left nearly 15,000 dead in under a year. Those figures match the average annual death rate for the Australian Imperial Force throughout 1914–18.

Arguably, we could consider 1919 as another year of war, albeit against a new enemy. Indeed, the typical victims had similar profiles: fit, young adults aged 20-40. The major difference was that in 1919, women like Harriet formed a significant proportion of the casualties.




Read more:
World politics explainer: The Great War (WWI)


Deadly flu spread rapidly

There was no doubt about the medical and social impact of the “Spanish flu”. Although its origins remain contested, it certainly didn’t arise in Spain. What is known is that by early 1918, a highly infectious respiratory disease, caused by a then-unknown agent, was moving rapidly across Europe and the United States. By the middle of that year, as the war was reaching a tipping point, it had spread to Africa, India and Asia.

About a third of the entire world’s population was infected with Spanish flu.
Macleay Museum, Author provided

It also took on a much deadlier profile. While victims initially suffered the typical signs and symptoms of influenza – including aches, fever, coughing and an overwhelming weariness – a frighteningly high proportion went rapidly downhill.

Patients’ lungs filled with fluid – which is why it became known as “pneumonic influenza” – and they struggled to breathe. For nurses and doctors, a tell-tale sign of impending death was a blue, plum or mahogany colour in the victim’s cheeks.

This, sadly, was the fate of young Harriet Ottaway. Having nursed a dying aunt through early 1919, in June she tended her married sister Lillian, who had come down with pneumonic influenza.

Despite taking the recommended precautions, Harriet contracted the infection and died in hospital. Ironically, Lillian survived. But in the space of less than two years she had lost both a brother to the Great War and her younger sister to the Spanish flu.

An intimate impact worldwide

Indeed, as Harriet’s headstone reminds us, this was an intimate pandemic. The statistics can seem overwhelming until you realise what it means that about a third of the entire world’s population was infected.

Whatever your heritage, your ancestors and their communities were almost certainly touched by the disease. It’s a part of all of our family histories and many local histories.




Read more:
How infectious diseases have shaped our culture, habits and language


It wasn’t just victims who were affected. Across Australia, regulations intended to reduce the spread and impact of the pandemic caused profound disruption. The nation’s quarantine system held back the flu for several months, meaning that a less deadly version came ashore in 1919.

But it caused delay and resentment for the 180,000 soldiers, nurses and partners who returned home by sea that year.

Leaflets like this one from Victoria tried to warn people of the dangers of Spanish flu.
Board of Public Health, Victoria/Public Records Office of Victoria

Responses within Australia varied from state to state but the crisis often led to the closure of schools, churches, theatres, pubs, race meetings and agricultural shows, plus the delay of victory celebrations.

The result was not only economic hardship, but significant interruptions in education, entertainment, travel, shopping and worship. The funeral business boomed, however, as the nation’s annual death rate went up by approximately 25%.

Yet for some reason, the silence of Harriet’s headstone is repeated across the country. Compared with the Anzac memorials that peppered our towns and suburbs in the decades after the Great War, few monuments mark the impact of pneumonic influenza.

Nevertheless, its stories of suffering and sacrifice have been perpetuated in other ways, especially within family and community memories. A century later, these stories deserve to be researched and commemorated.




Read more:
Speaking with: Peter Doherty about infectious disease pandemics


Despite the disruption, fear and substantial personal risk posed by the flu, tens of thousands of ordinary Australians rose to the challenge. The wartime spirit of volunteering and community service saw church groups, civic leaders, council workers, teachers, nurses and organisations such as the Red Cross step up.

They staffed relief depots and emergency hospitals, delivered comforts from pyjamas to soup, and cared for victims who were critically ill or convalescent. A substantial proportion of these courageous carers were women, at a time when many were being commanded to hand back their wartime jobs to returning servicemen.

In resurrecting stories such as the sad tale of Harriet Ottaway, it’s time to restore our memories of the “Spanish flu” and commemorate how our community came together to battle this unprecedented public crisis.

Peter Hobbins, ARC DECRA Fellow, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The History of Aspirin



A brief history of fake doctors, and how they get away with it



Impersonation of doctors is a modern phenomenon that grew out of Western medicine’s drive towards professionalism.
from shutterstock.com

Philippa Martyr, University of Western Australia

Melbourne man Raffaele Di Paolo pleaded guilty last week to a number of charges related to practising as a medical specialist when he wasn’t qualified to do so. Di Paolo is in jail awaiting his sentence after being found guilty of fraud, indecent assault and sexual penetration.

This case follows that of another so-called “fake doctor” in New South Wales. Sarang Chitale worked in the state’s public health service as a junior doctor from 2003 until 2014. It was only in 2016, after his last employer – the research firm Novotech – reported him to the Australian Health Practitioner Regulation Agency (AHPRA), that his qualifications were investigated.

“Dr” Chitale turned out to be Shyam Acharya, who had stolen the real Dr Chitale’s identity and obtained Australian citizenship and employment at a six-figure salary. Acharya had no medical qualifications at all.

Cases of impersonation, identity theft and fraudulent practice happen across a range of disciplines. There have been instances of fake pilots, veterinarians and priests. It’s especially confronting when it happens in medicine, because of the immense trust we place in those looking after our health.

So what drives people to go to such extremes, and how do they get away with it?

A modern phenomenon

Impersonation of doctors is a modern phenomenon. It grew out of Western medicine’s drive towards professionalism in the 19th century, which ran alongside the explosion of scientific medical research.

Before this, doctors would be trained by an apprentice-type system, and there was little recourse for damages. A person hired a doctor if they could afford it, and if the treatment was poor, or killed the patient, it was a case of caveat emptor – buyer beware.

But as science made medicine more reliable, the title of “doctor” really began to mean something – especially as the fees began to rise. By the end of the 19th century in the British Empire, becoming a doctor was a complex process. It required long university training, an independent income and the right social connections. Legislation backed this up, with medical registration acts controlling who could and couldn’t use medical titles.

Given the present social status and salaries of medical professionals, it’s easy to see why people would aspire to be doctors. And when the road ahead looks too hard and expensive, it may be tempting to take short cuts.

Today, there are four common elements that point to weaknesses in our health-care systems, which allow fraudsters to slip through the cracks and practise medicine.

1. Misplaced trust

Everyone believes someone, somewhere, has checked and verified a person’s credentials. But sometimes this hasn’t been done, or it takes a long time.

Fake psychiatrist Mohamed Shakeel Siddiqui – a qualified doctor who stole a real psychiatrist’s identity and worked in New Zealand for six months in 2015 – left a complicated trail of identity theft that required the assistance of the FBI to unravel.

Last year, in Germany, a man was found to have forged foreign qualifications that he presented to the registering body in early 2016. He was issued with a temporary licence while these were checked. When the qualifications turned out to be fraudulent, he was fired from his job as a junior doctor in a psychiatric ward. But this wasn’t until June 2017.

2. Foreign credentials

Credentials from a foreign university, issued in a different language, are another common element among medical fraudsters. Verifying these can be time-consuming, so a health system desperate for staff may cut corners.

Ioannis Kastanis was appointed as head of medicine at Skyros Regional Hospital in Greece in 1999 with fake degrees from Sapienza University of Rome. The degrees were recognised and the certificates translated, but their authenticity was never checked.

Dusan Milosevic, who practised as a psychologist for ten years, registered in Victoria in 1998. He held bogus degrees from the University of Belgrade in Serbia – at the time a war-torn corner of Europe, which made verification difficult.

3. Regional and remote practice

It’s easier to get away with faking in regional or remote areas where there is less scrutiny. Desperation to retain staff may also silence complaints.

“Dr” Balaji Varatharaju fraudulently gained employment in remote Alice Springs, where he worked as a junior doctor for nine months.

Ioannis Kastanis had worked on a distant Greek island with a population of only around 3,000 people.

4. It’s not easy to dob

Finally, there are two unnerving questions. How do you tell a poorly trained but legally qualified practitioner from a faker? And who do you tell if you suspect something is off?

The people best placed to spot the fakes – other hospital and health-care staff – work in often stressful conditions where complaints about colleagues can lead to reprisals. If the practitioner is from another ethnicity or culture, this adds an extra layer of sensitivity. It was only after “Dr Chitale” was exposed that staff were willing to say his practice had been “shabby”, “unsavoury” and “poor”.

So, why do they do it?

The reasons for fakery are as diverse as the fakers. “Dr Nick Delaney”, at Lady Cilento Children’s Hospital in Brisbane, reportedly pretended to be a doctor to “make friends” and keep a fling going with a security guard at the same hospital.

On a more sinister level, there are possible sexually predatory reasons, like those of bogus gynaecologist Raffaele Di Paolo. Fake psychiatrist Mohamed Shakeel Siddiqui said he only did it to help people.

There are also the less easily understood fakers, like “Dr” Adam Litwin, who worked as a resident in surgery at UCLA Medical Center in California for six months in 1999. Questions only began to be asked when he turned up to work in his white coat with a picture of himself silk-screened on it: even by Californian standards, this was going too far.

So how do we stop this happening?

Part of the problem is our cultural dependence on qualifications as the passkey to higher income and social status, making them an easy target for fraudsters. Qualifications can only reduce risk; they can’t eliminate it. Qualified doctors can also cause havoc: think Jayant Patel and other bona fide qualified practitioners who have been struck off for malpractice, mutilation and manslaughter.

Conversely, no one complained about “Dr Chitale” in 11 years. The only complaints Kastanis received in 14 years were from people who thought his Ferrari was vulgar. The German junior doctor had an excellent knowledge of mental health-care procedures and language – obtained from his time as a psychiatric patient.

Most of these loopholes can be closed with time and patience. What would help is if hospital and health-care staff felt sufficiently supported to report their suspicions to their employer, rather than to their colleagues. This would foster a more open culture of flagging concerns about fellow practitioners without fear of formal or informal punishment. It might also uncover more “Dr Chitales” before anyone is seriously harmed.

Philippa Martyr, Lecturer, Pharmacology, University of Western Australia

This article was originally published on The Conversation. Read the original article.


England: The Black Death



1918 Flu Pandemic



A short history of vaccine objection, vaccine cults and conspiracy theories


Edward Jenner, who pioneered vaccination, and two colleagues (right) seeing off three anti-vaccination opponents, with the dead lying at their feet (1808).
I Cruikshank/Wellcome Images/Wikimedia Commons, CC BY-SA

Ella Stewart-Peters, Flinders University and Catherine Kevin, Flinders University

When we hear phrases like vaccine objection, vaccine refusal and anti-vaxxers, it’s easy to assume these are new labels used in today’s childhood vaccination debates.

But there’s a long history of opposition to childhood vaccination, from when it was introduced in England in 1796 to protect against smallpox. And many of the themes played out more than 200 years ago still resonate today.

For instance, whether childhood vaccination should be compulsory, or whether there should be penalties for not vaccinating, was debated then as it is now.

Throughout the 19th century, anti-vaxxers widely opposed Britain’s compulsory vaccination laws, leading to their effective end in 1907, when it became much easier to be a conscientious objector. Today, the focus in Australia has turned to ‘no jab, no pay’ or ‘no jab, no play’, policies linking childhood vaccination to welfare payments or childcare attendance.

Of course, the methods vaccine objectors use to discuss their position have changed. Today, people share their views on social media, blogs and websites; then, they wrote letters to newspapers for publication, the focus of my research.

Many studies have looked at the role of organised anti-vaccination societies in shaping the vaccination debate. However, “letters to the editor” let us look beyond the inner workings of these societies to show what ordinary people thought about vaccination.

Many of the UK’s larger metropolitan newspapers were wary of publishing letters opposing vaccination, especially those criticising the laws. However, regional newspapers would often publish them.

As part of my research, I looked at more than 1,100 letters to the editor, published in 30 newspapers from south-west England. Here are some of the recurring themes.

Smallpox vaccination a gruesome affair

In 19th century Britain, the only vaccine widely available to the public was against smallpox. Vaccination involved making a series of deep cuts to the arm of the child into which the doctor would insert matter from the wound of a previously vaccinated child.

These open wounds left many children vulnerable to infections, blood poisoning and gangrene. Parents and anti-vaccination campaigners alike described the gruesome scenes that often accompanied the procedure, like this example from the Royal Cornwall Gazette from December 1886:

Some of these poor infants have been borne on pillows for weeks, decaying alive before death ended their sufferings.

Conspiracy theories and vaccine cults

Side-effects were so widespread that many parents refused to vaccinate their children. And letters to the editor show they became convinced the medical establishment and the government were aware of the dangers of vaccination.

If this was the case, why was vaccination compulsory? The answer, for many, could be found in a conspiracy theory.

Their letters argued doctors had conned the government into enforcing compulsory vaccination so they could reap the financial benefits. After all, public vaccinators were paid a fee for each child they vaccinated. So people believed compulsory vaccination must have been introduced to maximise doctors’ profits, as this example from the Wiltshire Times in February 1894 shows:

What are the benefits of vaccination? Salaries and bonuses to public vaccinators; these are the benefits; while the individuals who have to endure the operation also have to endure the evils which result from it. Health shattered, lives crippled or destroyed – are these benefits?

Conspiracy theories went further. If doctors knew vaccination could result in infections, then they knew children died from the procedure. As a result, some conspiracy theorists began to argue there was something inherently evil about vaccination. Some saw vaccination as “the mark of the beast”, a ritual perpetuated by a “vaccine cult”. Writing in the Salisbury Times, in December 1903, one critic said:

This is but the prototype of that modern species of doctorcraft, which would have us believe that their highly remunerative invocations of the vaccine god alone avert the utter extermination of the human race by small-pox.

Of course, this is an extreme view. But issues of morality and religion still permeate the anti-vaccination movement today.

Individual rights

For many, the issue of compulsory vaccination was directly related to the rights of the individual. Just like modern anti-vaccination arguments, many people in the 19th century believed compulsory vaccination laws were an incursion into the rights enjoyed by free citizens.

By submitting to the compulsory vaccination laws, a parent was allowing the government to insert itself into the individual home, and take control of a child’s body, something traditionally protected by the parent. Here’s an example from the Royal Cornwall Gazette in April 1899:

[…] civil and religious liberty must of necessity include the right to protect healthy children from calf-lymph defilement […] trust […] cannot be handed over at the demand of a medical tradesunion, or tamely relinquished at the cool request of some reverend rural justice of the peace.

What can we learn by looking at the past?

If anti-vaccination arguments from the past significantly overlap with those presented by their counterparts today, then we can learn how to deal with anti-vaccination movements in the future.

Not only can we see that compulsory vaccination laws in Australia could, as some researchers say, be problematic, but we can also use the history of vaccine opposition to better understand why vaccination remains so controversial for some people.

Ella Stewart-Peters, PhD Candidate in History, Flinders University and Catherine Kevin, Senior Lecturer in Australian History, Flinders University

This article was originally published on The Conversation. Read the original article.


Five bloodcurdling medical procedures that are no longer performed … thankfully




Kunstmuseum St Gallen/Wikimedia Commons

Adam Taylor, Lancaster University

Surgeries and treatments come and go. A new BMJ guideline, for example, makes “strong recommendations” against the use of arthroscopic surgery for certain knee conditions. But while this key-hole surgery may slowly be scrapped in some cases due to its ineffectiveness, a number of historic “cures” fell out of favour because they were more akin to a method of torture. Here are five of the most extraordinary and unpleasant.

1. Trepanation

Trepanation (drilling or scraping a hole in the skull) is the oldest form of surgery we know of. Humans have been performing it since neolithic times. We don’t know why people did it, but some experts believe it could have been to release demons from the skull. Surprisingly, some people lived for many years after this brutal procedure was performed on them, as revealed by ancient skulls that show evidence of healing.

Although surgeons no longer scrape holes in peoples’ skulls to release troublesome spirits, there are still reports of doctors performing the procedure to relieve pressure on the brain. For example, a GP at a district hospital in Australia used an electric drill he found in a maintenance cupboard to bore a hole in a 13-year-old boy’s skull. Without the surgery, the boy would have died from a blood clot on the brain.

2. Lobotomy

It’s hard to believe that a procedure more brutal than trepanation was widely performed in the 20th century. Lobotomy involved severing connections in the brain’s prefrontal lobe with an implement resembling an icepick (a leucotome).

Antonio Egas Moniz, a Portuguese neurologist, invented the procedure in 1935. A year later, Walter Freeman brought the procedure to the US. Freeman was an evangelist for this new form of “psychosurgery”. He drove around the country in his “loboto-mobile” performing the procedure on thousands of hapless patients.

Instead of a leucotome, Freeman used an actual icepick, which he would hammer through the corner of an eye socket using a mallet. He would then jiggle the icepick around in a most unscientific manner. Patients weren’t anaesthetised – rather, they were rendered insensible by an induced seizure.

Thankfully, advances in psychiatric drugs saw the procedure fall from favour in the 1960s. Freeman performed his last two icepick lobotomies in 1967. One of the patients died from a brain haemorrhage three days later.

Walter Freeman (left) and James Watts study an x-ray prior to conducting ‘psychosurgery’.
Wikimedia Commons/Harris A Ewing

3. Lithotomy

This Dutch blacksmith, Jan de Doot, removed his own bladder stone.
Wikimedia Commons

Ancient Greek, Roman, Persian and Hindu texts refer to a procedure, known as lithotomy, for removing bladder stones. The patient would lie on their back, feet apart, while a blade was passed into the bladder through the perineum – the soft bit of flesh between the sex organ and anus. Further indignity was inflicted by surgeons inserting their fingers or surgical instruments into the rectum or urethra to assist in the removal of the stone. It was an intensely painful procedure with a mortality rate of about 50%.

The number of lithotomy operations performed began to fall in the 19th century, and it was replaced by more humane methods of stone extraction. Healthier diets in the 20th century helped make bladder stones a rarity, too.

4. Rhinoplasty (old school)

Syphilis arrived in Italy in the 16th century, possibly carried by sailors returning from the newly exploited Americas (the so-called Columbian exchange).

The sexually transmitted disease had a number of cruel symptoms, one of which was known as “saddle-nose”, where the bridge of the nose collapses. This nasal deformity was an indicator of indiscretions, and many used surgery to try and hide it.

A patient undergoing Tagliacozzi’s procedure for fixing saddle-nose.
Wikimedia Commons/Wellcome Images

An Italian surgeon, Gaspare Tagliacozzi, developed a method for concealing this nasal deformity. He created a new nose using tissue from the patient’s arm. He would then cover this with a flap of skin from the upper arm, which was rather awkwardly still attached to the limb. Once the skin graft was firmly attached – after about three weeks – Tagliacozzi would separate the skin from the arm.

There were reported cases of patients’ noses turning purple in cold winter months and falling off.

Today, syphilis is easily treated with a course of antibiotics.

5. Bloodletting

Losing blood, in modern medicine, is generally considered to be a bad thing. But, for about 2,000 years, bloodletting was one of the most common procedures performed by surgeons.

The procedure was based on a flawed scientific theory that humans possessed four “humours” (fluids): blood, phlegm, black bile and yellow bile. An imbalance in these humours was thought to result in disease. Lancets, blades or fleams (some spring loaded for added oomph) were used to open superficial veins, and in some cases arteries, to release blood over several days in an attempt to restore balance to these vital fluids.

Bloodletting in the West continued up until the 19th century. In 1838, Henry Clutterbuck, a lecturer at the Royal College of Physicians, claimed that “blood-letting is a remedy which, when judiciously employed, it is hardly possible to estimate too highly”.

A barber surgeon’s bloodletting set.
Anagoria/Wikimedia Commons, CC BY

Finally, one medical procedure, dating from one of the earliest Egyptian medical texts, that isn’t used anymore – and I can’t for the life of me think why – is the administration of half an onion and the froth of beer. It cures death, apparently.

Adam Taylor, Director of the Clinical Anatomy Learning Centre & Senior Lecturer in Anatomy, Lancaster University

This article was originally published on The Conversation. Read the original article.


A short history of anaesthesia: from unspeakable agony to unlocking consciousness



General anaesthesia has come a long way since its first public demonstration in the 19th century, depicted here.
Wellcome Library, London/Wikimedia, CC BY-SA

David Liley, Swinburne University of Technology

We expect to feel no pain during surgery or at least to have no memory of the procedure. But it wasn’t always so.

Until the discovery of general anaesthesia in the middle of the 19th century, surgery was performed only as a last and desperate resort. Conscious and without pain relief, it was beset with unimaginable terror, unspeakable agony and considerable risk.

Not surprisingly, few chose to write about their experience in case it reawakened suppressed memories of a necessary torture.

One of the most well-known and vivid records of this “terror that surpasses all description” was by Fanny Burney, a popular English novelist who, on the morning of September 30, 1811, eventually submitted to a mastectomy:

When the dreadful steel was plunged into the breast … I needed no injunctions not to restrain my cries. I began a scream that lasted unintermittently during the whole time of the incision … so excruciating was the agony … I then felt the Knife [rack]ling against the breast bone – scraping it.

But it wasn’t only the patient who suffered. Surgeons too had to endure considerable anxiety and distress.

John Abernethy, a surgeon at London’s St Bartholomew’s Hospital at the turn of the 19th century, described walking to the operating room as like “going to a hanging” and was sometimes known to shed tears and vomit after a particularly gruesome operation.

Discovery of anaesthesia

It was against this background that general anaesthesia was discovered.

A young US dentist named William Morton, spurred on by the business opportunities afforded by technical advances in artificial teeth, doggedly searched for a surefire way to relieve pain and boost dental profits.

His efforts were soon rewarded. He discovered that when he or small animals inhaled sulfuric ether (now known as ethyl ether or simply ether), they passed out and became unresponsive.

A few months after this discovery, on October 16, 1846 and with much showmanship, Morton anaesthetised a young male patient in a public demonstration at Massachusetts General Hospital.

The hospital’s chief surgeon then removed a tumour on the left side of the jaw. This occurred without the patient apparently moving or complaining, much to the surgeon’s and audience’s great surprise.

So began the story of general anaesthesia, which for good reason is now widely regarded as one of the greatest discoveries of all time.

Anaesthesia used routinely

News of ether’s remarkable properties spread rapidly across the Atlantic to Britain, ultimately stimulating the discovery of chloroform, a volatile general anaesthetic.

According to its discoverer, James Simpson, it had none of ether’s “inconveniences and objections” – a pungent odour, irritation of throat and nasal passages and a perplexing initial phase of physical agitation instead of the more desirable suppression of all behaviour.

This chloroform inhaler was the type John Snow used on Queen Victoria to ease the pain of childbirth. Chloroform vapours were delivered down a tube via the brass and velvet face mask.
Science Museum, London/Wellcome Images/Wikimedia, CC BY-SA

Chloroform subsequently became the most commonly used general anaesthetic in British surgical and dental practice, mainly thanks to the founding father of scientific anaesthesia, John Snow, but it remained non-essential to the practice of most doctors.

This changed after Snow gave Queen Victoria chloroform during the birth of her eighth child, Prince Leopold. The publicity that followed made anaesthesia more acceptable and demand increased, whether during childbirth or for other reasons.

By the end of the 19th century, anaesthesia was commonplace, arguably becoming the first example in which medical practice was backed by emerging scientific developments.

Anaesthesia is safe

Today, sulfuric ether and chloroform have been replaced by much safer and more effective agents such as sevoflurane and isoflurane.

Ether was highly flammable so could not be used with electrocautery (which involves an electrical current being passed through a probe to stem blood flow or cut tissue) or when monitoring patients electronically. And chloroform was associated with an unacceptably high rate of deaths, mainly due to cardiac arrest (when the heart stops beating).

The practice of general anaesthesia has now evolved to the point that it is among the safest of all major routine medical procedures. For every 300,000 or so fit and healthy people having elective medical procedures, roughly one dies due to anaesthesia.

Despite the increasing clinical effectiveness with which anaesthesia has been administered over the past 170 years, and its scientific and technical foundations, we still have only the vaguest idea about how anaesthetics produce a state of unconsciousness.

Anaesthesia remains a mystery

General anaesthesia requires patients to be immobile, pain-free and unconscious. Of these, unconsciousness is the most difficult to define and measure.

For example, not responding to, or then not remembering, some event (such as the voice of the anaesthetist or the moment of surgical incision), while clinically useful, is not enough to decisively determine whether someone is or was unconscious.

We need some other way to define consciousness and to understand its disruption by the biological actions of general anaesthetics.

Early in the 20th century, we thought anaesthetics worked by dissolving into the fatty parts of the outside of brain cells (the cell membrane) and interfering with the way they worked.

But we now know anaesthetics directly affect the behaviour of a wide variety of proteins necessary to support the activity of neurones (nerve cells) and their coordinated behaviour.

For this reason the only way to develop an integrated understanding of the effects of these multiple, and individually insufficient, neuronal protein targets is by developing testable, mathematically formulated theories.

These theories need to not only describe how consciousness emerges from brain activity but to also explain how this brain activity is affected by the multiple targets of anaesthetic action.

Despite the tremendous advances in the science of anaesthesia, after almost 200 years we are still waiting for such a theory.

Until then we are still looking for the missing link between the physical substance of our brain and the subjective content of our minds.

David Liley, Professor, Centre for Human Psychopharmacology, Swinburne University of Technology

This article was originally published on The Conversation. Read the original article.

