Reality television shows based on surgical transformations, such as The Swan and Extreme Makeover, were not the first public spectacles to offer women the ability to compete for the chance to be beautiful.
In 1924, a competition ad in the New York Daily Mirror posed the insulting question "Who is the homeliest girl in New York?" It promised the unfortunate winner that a plastic surgeon would "make a beauty of her". Entrants were reassured that they would be spared embarrassment, as the paper's art department would paint "masks" on their photographs when they were published.
Cosmetic surgery instinctively seems like a modern phenomenon. Yet it has a much longer and more complicated history than most people likely imagine. Its origins lie in part in the correction of syphilitic deformities and racialised ideas about “healthy” and acceptable facial features as much as any purely aesthetic ideas about symmetry, for instance.
In her study of how beauty is related to social discrimination and bias, sociologist Bonnie Berry estimates that 50% of Americans are "unhappy with their looks". Berry links this prevalence to mass media images. However, people have long been driven to painful, surgical measures to "correct" their facial features and body parts, even prior to the use of anaesthesia and the discovery of antiseptic principles.
Some of the first recorded surgeries took place in 16th-century Britain and Europe. Tudor "barber-surgeons" treated facial injuries, which, as medical historian Margaret Pelling explains, was crucial in a culture where damaged or ugly faces were seen to reflect a disfigured inner self.
With the pain and risks to life inherent in any kind of surgery at this time, cosmetic procedures were usually confined to severe and stigmatised disfigurements, such as the loss of a nose through trauma or epidemic syphilis.
The first pedicle flap grafts to fashion new noses were performed in 16th-century Europe. A section of skin would be cut from the forehead, folded down, and stitched, or would be harvested from the patient’s arm.
A later representation of this procedure in Iconografia d'anatomia, published in 1841 and reproduced in Richard Barnett's Crucial Interventions, shows the patient with his raised arm still gruesomely attached to his face during the graft's healing period.
As socially crippling as facial disfigurements could be and as desperate as some individuals were to remedy them, purely cosmetic surgery did not become commonplace until operations were not excruciatingly painful and life threatening.
In 1846, what is frequently described as the first “painless” operation was performed by American dentist William Morton, who gave ether to a patient. The ether was administered via inhalation through either a handkerchief or bellows. Both of these were imprecise methods of delivery that could cause an overdose and kill the patient.
The removal of the second major impediment to cosmetic surgery occurred in the 1860s. English doctor Joseph Lister’s model of aseptic, or sterile, surgery was taken up in France, Germany, Austria and Italy, reducing the chance of infection and death.
By the 1880s, with the further refinement of anaesthesia, cosmetic surgery became a relatively safe and painless prospect for healthy people who felt unattractive.
The Derma-Featural Co advertised its “treatments” for “humped, depressed, or… ill-shaped noses”, protruding ears, and wrinkles (“the finger marks of Time”) in the English magazine World of Dress in 1901.
A report from a 1908 court case involving the company shows that they continued to use skin harvested from – and attached to – the arm for rhinoplasties.
The report also refers to the non-surgical “paraffin wax” rhinoplasty, in which hot, liquid wax was injected into the nose and then “moulded by the operator into the desired shape”. The wax could potentially migrate to other parts of the face and be disfiguring, or cause “paraffinomas” or wax cancers.
Advertisements for the likes of the Derma-Featural Co were rare in women's magazines around the turn of the 20th century. But ads were frequently published for bogus devices promising to deliver dramatic face and body changes that might reasonably be expected only from surgical intervention.
Various models of chin and forehead straps, such as the patented “Ganesh” brand, were advertised as a means for removing double chins and wrinkles around the eyes.
Bust reducers and hip and stomach reducers, such as the JZ Hygienic Beauty Belt, also promised non-surgical ways to reshape the body.
The frequency of these ads in popular magazines suggests that use of these devices was socially acceptable. In comparison, coloured cosmetics such as rouge and kohl eyeliner were rarely advertised. The ads for “powder and paint” that do exist often emphasised the product’s “natural look” to avoid any negative association between cosmetics and artifice.
The racialised origins of cosmetic surgery
The most common cosmetic operations requested before the 20th century aimed to correct features such as ears, noses, and breasts classified as “ugly” because they weren’t typical for “white” people.
At this time, racial science was concerned with “improving” the white race.
In the United States, with its growing populations of Jewish and Irish immigrants and African Americans, “pug” noses, large noses and flat noses were signs of racial difference and therefore ugliness.
Sander L. Gilman suggests that the “primitive” associations of non-white noses arose “because the too-flat nose came to be associated with the inherited syphilitic nose”.
American otolaryngologist John Orlando Roe’s discovery of a method for performing rhinoplasties inside the nose, without leaving a tell-tale external scar, was a crucial development in the 1880s. As is the case today, patients wanted to be able to “pass” (in this case as “white”) and for their surgery to be undetectable.
In 2015, 627,165 American women, or an astonishing 1 in 250, received breast implants. In the early years of cosmetic surgery, breasts were never made larger.
Breasts acted historically as a “racial sign”. Small, rounded breasts were viewed as youthful and sexually controlled. Larger, pendulous breasts were regarded as “primitive” and therefore as a deformity.
In the age of the flapper, in the early 20th century, breast reductions were common. It was not until the 1950s that small breasts were transformed into a medical problem and seen to make women unhappy.
Shifting views about desirable breasts illustrate how beauty standards change across time and place. Beauty was once considered as God-given, natural or a sign of health or a person’s good character.
When beauty began to be understood as located outside of each person and as capable of being changed, more women, in particular, tried to improve their appearance through beauty products; today they increasingly turn to surgery.
As Elizabeth Haiken points out in Venus Envy, 1921 not only marked the first meeting of an American association of plastic surgery specialists, but also the first Miss America pageant in Atlantic City. All of the finalists were white. The winner, sixteen-year-old Margaret Gorman, was short compared to today’s towering models at five-feet-one-inch tall, and her breast measurement was smaller than that of her hips.
There is a close link between cosmetic surgical trends and the qualities we value as a culture, as well as shifting ideas about race, health, femininity, and ageing.
Last year was celebrated by some within the field as the 100th anniversary of modern cosmetic surgery. New Zealander Dr Harold Gillies has been championed for inventing the pedicle flap graft during World War I to reconstruct the faces of maimed soldiers. Yet as is well documented, primitive versions of this technique had been in use for centuries.
Such an inspiring story obscures the fact that modern cosmetic surgery was really born in the late 19th century and that it owes as much to syphilis and racism as to rebuilding the noses and jaws of war heroes.
The surgical fraternity – and it is a brotherhood, as more than 90% of cosmetic surgeons are male – conveniently places itself in a history that begins with reconstructing the faces and work prospects of the war wounded.
In reality, cosmetic surgeons are instruments of shifting whims about what is attractive. They have helped people to conceal or transform features that might make them stand out as once diseased, ethnically different, “primitive”, too feminine, or too masculine.
The sheer risks that people have been willing to run in order to pass as "normal", or even to turn the "misfortune" of ugliness, as the homeliest girl contest put it, into beauty, show how strongly people internalise ideas about what is beautiful.
Looking back at the ugly history of cosmetic surgery should give us the impetus to more fully consider how our own beauty norms are shaped by prejudices including racism and sexism.
There are fashions in diseases, as in anything else. It’s understandable that a new, infectious and life-threatening malady could preoccupy us, such as cholera in the 19th century or Ebola in recent times.
It is harder to see why a panic erupts around a diagnosis that’s a century old, but a telegenic celebrity death can help. When the singer Karen Carpenter died aged 32 in 1983, her heart gave out because of complications due to anorexia. Her death is widely credited with pushing eating disorders into the public consciousness.
Karen Carpenter was not the first famous young woman to starve to death. Sarah Jacob, "the Welsh Fasting Girl", was once a national craze across Britain. She died at her parents' farm in December 1869 in front of a team of nurses who had been sent from London to Carmarthenshire to monitor her.
Sarah was believed by her family and her local clergyman to eat nothing at all. Her parents agreed to have her watched to make sure she was not secretly eating, but their faith in her was strong enough that they refused to have her force-fed.
As with other fasting girls, her alleged ability to live without food was taken by her supporters as a sign of special spiritual status, and seen by materialist physicians as evidence of hysteria and deceit.
Did Sarah Jacob, like Karen Carpenter, die of anorexia?
The diagnostic label “anorexia nervosa” was not coined until shortly after Sarah Jacob died, but of course a disease can exist prior to being named. She did not have all the symptoms associated with the modern diagnosis, but most mental disorders vary from patient to patient.
Anorexia is often seen as an expression of will – an assertion of autonomy and control by a young woman who is engaged in a battle with her family and therapists. If that’s the crucial point about anorexia then maybe Sarah Jacob was anorexic. Her fast turned her whole domestic world upside down and she maintained it right to the end.
In her 1988 history of anorexia, Fasting Girls, Joan Jacobs Brumberg, noting the presence of the medical team watching in her room, asserted that Sarah was “killed by experimental design”. But maybe she died of pride.
If the assertion of will, over both one’s own appetite and the authority of others, is the heart of anorexia, then perhaps we can push its history back further. In Holy Anorexia (1985), Rudolph Bell argued that anorexia shaped the lives of many medieval saints and other holy women, who ate next to nothing.
Saint Catherine of Siena fasted for days, far beyond what was expected of even the most pious young women in 14th-century Italy. She did so even when the male priests she was supposed to defer to expressly told her to eat something, on the grounds that her spiritual husband, Jesus himself, outranked them.
For Bell, it is Catherine’s assertion of her will – she sent angry letters to the Pope – that marks her out and puts her in a long line of anorexics extending to the present day.
Brumberg attacks Bell for assuming that female psychology has not changed over the centuries and that the past and present are the same.
But that’s unfair. It is certainly possible to acknowledge that both psychology and culture have changed dramatically over the years while also thinking that two people share enough relevant symptoms and personality features to justify applying the same diagnostic label to them both even if they lived centuries apart.
But obviously not just any remote similarity is enough, so how can we decide?
Archaeologists can find on ancient skeletons the traces of familiar diseases, but there is no physical marker to point to that would decide whether a mental illness was present in the middle ages.
Clearly, young women (and men) have been dramatically restricting their calorie intake for centuries, but not all the symptoms of modern anorexia have always been present, and some saintly behaviours are no longer associated with eating disorders.
Similarly, melancholy has a very long history, and many scholars see modern depression as essentially the same thing.
But modern clinical depression has dropped the distinction between melancholy, which has no obvious cause, and ordinary sadness, which is a reasonable response to the tragedies of life. “Depression” pathologises parts of our mental life that “melancholy” treated as normal – is it the same disease, or not?
Well, if you think mental illness is above all a problem with a neurological system, then there might seem to be an easy answer. The disease label refers to what is going wrong within your brain, and the cultural context just supplies the input and output.
Take an anorexic brain and plug it into 14th-century Italy and you get one set of symptoms. Plug it into modern Western societies and you get another. The different symptoms are reflections of different cultures acting on the brain.
Joel and Ian Gold, in Suspicious Minds, have discussed the emergence of what they consider to be a new form of psychopathology – the “Truman Show delusion” – in which, like the hero of the movie of that name, subjects imagine themselves as the star of a reality TV show. The existence of the show is known but kept secret by their friends.
The Golds argue that the delusion was caused by the rise of new forms of media and an attendant loss of privacy. It’s what you get when a paranoid brain deals with the contemporary social world, whereas perhaps a few hundred years ago these subjects would have been afraid of witches, not TV producers.
It's a simple picture, and the brain-based concept of mental illness has great power. But culture shapes the brain in ways that make the simple opposition too stark – London taxi drivers have extra-large hippocampi, which have grown from use (the hippocampus keeps a mental map of your surroundings) like the muscles of an athlete.
Over the centuries our brains have been sculpted by our cultural selection just as by natural selection, and mental illness has been shaped accordingly.
At different times, different aspects of a syndrome will predominate, to be succeeded by others as the culture shifts. Historians need to argue about how to apply the labels, but the history of human society is reflected in the ways our minds go wrong.
This is the second instalment in our disease evolution package; the first is Disease evolution: our long history of fighting viruses.
The 20th anniversary of the massacre at Port Arthur again raises pressing questions – for surviving victims, their families and the Australian community more broadly – about ways of remembering the tragedy.
The relationship between trauma, tourism, commemoration and the nature of the place itself is a complicated one.
From the time it was established, the settlement at Port Arthur was associated with trauma. It was meant to be.
The isolated prison, housing the worst convicts, was intended to instil fear to deter others. And the authorities played up the horror of punishment there.
Here convicts – already languishing as far from their homes as possible – were now subjected to unknown terrors in an alien wilderness. Though the actual administration was relatively “enlightened”, the image was unrelentingly negative.
Everyone, it seemed, had an interest in playing up the horror.
The full circle
In 1877, the prison was closed. The government sought to obliterate its dark history and the shame of a convict past by changing the township's name to Carnarvon, and by selling off the prison buildings on condition they were demolished.
Yet almost immediately tourists began to flock to the place, creating an important local industry. Souvenirs, guidebooks and postcards appeared; convict buildings were turned into guesthouses.
Fishing and hunting were popular but many tourists were drawn by morbid curiosity and a taste for the macabre.
Those early tourists could be a raucous mob. Reports spoke of “merry crowds” who danced in the mess rooms; pilfered “relics”; enjoyed the “thrill” of being shut up in a cell; and shrieked at the tales of horror told by the guides.
Some were ex-convicts: one would, for an extra shilling, remove his shirt and display the scars left by the lash.
Some tourists might reflect on the past’s brutality or British perfidy, but generally, a good time was had by all. The violence and gruesomeness were an entertainment.
The horror was in stark contrast to the landscape itself. Though at first seen as gloomy, alien and oppressive, the natural setting soon came to be regarded as romantically wild, awe-inspiring and picturesque.
Tastes were changing. As romanticism seeped into popular consciousness, the idea of wilderness took on new meaning, something to be sought out rather than avoided.
The site’s neo-Gothic church, badly damaged by fire and covered with ivy, came to be seen as a romantically picturesque ruin.
Visitors drew attention to the irony of somewhere so beautiful being the scene of horror. Trauma amid beauty would become a common theme, revisited following the events of 1996.
A fine balance
Successive governments could not ignore the fact that Port Arthur was a money-spinner. In 1916, the site received some minimal protection. And, in 1928, the name was changed back to Port Arthur – Carnarvon had never caught on.
By 1937, the Tasmanian treasurer commended Port Arthur as:
The Stone Henge of Australia and one of the greatest tourist assets which this state possesses.
A more middlebrow and respectable class of tourist began to take an interest, admiring the site’s “Englishness” and the beauty of its historic ruins.
As one visitor put it in 1918, “bitter memories are fading into romantic interest”: the “beautiful workmanship” of the carved stone conjured up an English monastery rather than an Australian gaol.
A new management authority in 1987 treated the convict past with more sensitivity and respect, contrasting with some of the tackier commercial exploitation. But it still introduced a ghost tour that, on Viator’s tourism website, promises “ghoulish stories”, “terrifying tales”, “harrowing history” and a generally “spine-chilling” and “spooky” experience.
The melancholy and reflective were still jostled by people having a good time. For some reason, convict suffering is fun.
This touristic enjoyment of trauma poses a problem.
At places as diverse as Auschwitz, Ghana’s “slave castles”, the Tower of London, Gallipoli and Aboriginal massacre sites, this “dark tourism” is an important way of respecting the memory of past atrocity.
But often the response can verge on voyeurism and emotional indulgence; melancholy, pity and sorrow can be perversely pleasurable emotions.
What marks out convict tourism is the way that, while some tourists are moved, others are simply entertained. This lies at the core of the dilemma facing Port Arthur managers on the 20th anniversary of the massacre.
The tragedy that unfolded 20 years ago added another layer of horror to a site already scarred by atrocity, but one where heartbreak jostled awkwardly with holiday making.
The management’s immediate response was purposely low key, with a sensitively understated memorial to the massacre – off the beaten tourist track. It allowed tourists and workers to quietly remember the dead, who were also tourists and workers.
The switch to a more public commemoration for the 20th anniversary shows the dilemma remains: how to commemorate Port Arthur as a tourist site.
In truth, the best memorial to the victims of Martin Bryant, his Colt AR-15 and his FN FAL, will always be effective gun control.
This article is part of a package marking the 20th anniversary of the 1996 Port Arthur massacre.