Category Archives: war

The Molotov-Ribbentrop Pact



What happens now we’ve found the site of the lost Australian freighter SS Iron Crown, sunk in WWII



A bathymetric map showing SS Iron Crown on the sea floor.
CSIRO, Author provided

Emily Jateff, Flinders University and Maddy McAllister, James Cook University

Finding shipwrecks isn’t easy – it’s a combination of survivor reports, excellent archival research, a highly skilled team, top equipment and some good old-fashioned luck.

And that’s just what happened with the recent discovery of SS Iron Crown, lost off the coast of Victoria in Bass Strait during the second world war.

Based on archival research by Heritage Victoria and the Maritime Archaeological Association of Victoria, we scoped an area for investigation of approximately 3 by 5 nautical miles, at a location 44 nautical miles SSW of Gabo Island.

Hunting by sound

We used the CSIRO research vessel Investigator to look for the sunken vessel. The Investigator deploys multibeam echosounder technology on a gondola 1.2 metres below the hull.

Multibeam echosounders send acoustic signal beams down and out from the vessel and measure both the signal strength and time of return on a receiver array.

The science team watches the survey from the operations room of the CSIRO research vessel (RV) Investigator.
CSIRO, Author provided

The receiver transmits the data to the operations room for real-time processing. These data provide topographic information and register features within the water column and on the seabed.
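
To make the geometry concrete, here is a minimal sketch – ours, not the Investigator’s actual processing software – of how a single beam’s two-way travel time becomes a depth and an across-track position. The nominal sound speed and the example travel time are illustrative assumptions.

```python
# A minimal sketch of single-beam echosounder geometry. Not the RV
# Investigator's processing software; SOUND_SPEED_MS and the example
# travel time below are assumed, illustrative values.

import math

SOUND_SPEED_MS = 1500.0  # nominal speed of sound in seawater (m/s)

def beam_footprint(two_way_travel_s, beam_angle_deg):
    """Convert a beam's two-way travel time and launch angle (degrees
    from vertical) into depth below the transducer and horizontal
    across-track distance to the point the beam struck."""
    slant_range = SOUND_SPEED_MS * two_way_travel_s / 2.0  # one-way distance
    angle = math.radians(beam_angle_deg)
    depth = slant_range * math.cos(angle)
    across_track = slant_range * math.sin(angle)
    return depth, across_track

# A return after ~0.87 s on the nadir (straight-down) beam corresponds
# to roughly the 650 m depth at which the wreck lies.
print(beam_footprint(0.867, 0.0))   # ≈ (650.25, 0.0)
print(beam_footprint(0.867, 30.0))  # same slant range, 30 degrees off nadir
```

Real systems also correct for the sound-speed profile of the water column, vessel motion and tides, which is why the raw returns are processed in the operations room rather than read off directly.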

At 8pm on April 16, we arrived on site and within a couple of hours noted a feature in the multibeam data that looked suspiciously like a shipwreck. It measured 100m in length, with an approximate beam of 16-22m and a profile of 8m, sitting at a water depth of 650m.

Given that we were close to maxing out what the multibeam could do, this was an excellent opportunity to put the drop camera in the water and get “eyes on”.

Down goes the camera.

The camera collected footage of the stern, midship and bow sections of the wreck. These were compared to archival photos. Given the location, dimensions and noted features, we identified it as SS Iron Crown.

The merchant steamer

SS Iron Crown was an Australian merchant vessel built at the government dockyard at Williamstown, Victoria, in 1922.

SS Iron Crown afloat.
South Australian Maritime Museum, Author provided

On June 4 1942, the steel screw steamer of the merchant navy was transporting manganese ore and iron ore from Whyalla to Newcastle when it was torpedoed by the Japanese Imperial Type B (巡潜乙型) submarine I-27.

Survivor accounts state that the torpedo struck the vessel on the port side, aft of the bridge. It sank within minutes. Thirty-eight of the 43 crew went down with the ship.

This vessel is one of four WWII losses in Victorian waters (the others were HMAS Goorangai lost in a collision, SS Cambridge and MV City of Rayville lost to mines) and the only vessel torpedoed.

After the discovery

Now that we’ve finally located the wreck – 77 years after it was sunk – it is what happens next that is truly interesting.

A bathymetric map showing SS Iron Crown on the sea floor with its bow on the right.
CSIRO, Author provided

It’s not just the opportunity to finally do an in-depth review of the collected footage (currently stored on an external hard drive shoved in my backpack), but the chance to take the important step of shaping how the story is told going forward.

When a shipwreck is located, the finder must report it within seven days to the Commonwealth’s Historic Shipwreck Program or to the recognised delegate in each state/territory with location information and as much other relevant data as possible.

Shipwrecks aren’t just found by professionals, but are often located by knowledgeable divers, surveyors, the military, transport ships and beachcombers. It’s no big surprise that many shipwrecks are well-known community fishing spots.

While it is possible to access the site using remotely operated vehicles or submersibles, we hope the data retrieved from this voyage will be enough.

It was only 77 years ago that the SS Iron Crown went down. This means it still has a presence in the memories of the communities and families that were touched by the event and its aftermath.

No war grave, but protected

Even though those who died were merchant navy, the site isn’t officially recognised yet as a war grave. But thanks to both state and Commonwealth legislation, the SS Iron Crown was protected before it was even located.

All shipwrecks over 75 years of age are protected under the Commonwealth Historic Shipwrecks Act 1976. It is an offence to damage or remove anything from the site.

A drop camera view of the bow of SS Iron Crown with anchor chains.
CSIRO, Author provided

This protection is enhanced by its location in deeper water and, one hopes, by respect for the circumstances of its loss.

Sitting on the sea floor in Bass Strait, SS Iron Crown is well below the reach of even technical divers. So the site is unlikely to be illegally salvaged for artefacts and treasures.

Yet this also means that maritime archaeologists have limited access to the site, and to what can be learnt from an untouched, well-preserved shipwreck.

Virtual wreck sites

But, alongside the increasing capabilities for locating such sites, maritime archaeologists now have access to digital mapping, 3D modelling technologies and high-resolution imagery, as were used for the British Merchant Navy shipwreck SS Thistlegorm.


These can even allow us to record shipwreck sites (at whatever depth) and present them to the public in a vibrant and engaging medium.
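
As a toy illustration of the idea (not the actual workflow used for SS Thistlegorm), the sketch below turns a bathymetric depth grid into a point cloud in the ASCII PLY format that free 3D viewers such as MeshLab can open. The grid spacing, file name and mound dimensions are all invented for the example.

```python
# A minimal sketch, assuming a bathymetric survey exported as a regular
# grid of depths in metres, of turning that grid into a 3D point cloud
# in ASCII PLY format. Grid spacing and file name are hypothetical.

import numpy as np

GRID_SPACING_M = 2.0  # assumed distance between adjacent soundings

def grid_to_ply(depths, out_path):
    """Write one PLY vertex per grid cell; z is negative depth so the
    seabed renders below the origin in a 3D viewer."""
    rows, cols = depths.shape
    with open(out_path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {rows * cols}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for i in range(rows):
            for j in range(cols):
                f.write(f"{j * GRID_SPACING_M} {i * GRID_SPACING_M} "
                        f"{-depths[i, j]}\n")

# A flat 650 m seabed with a low mound standing in for a wreck:
seabed = np.full((100, 100), 650.0)
seabed[45:55, 40:60] -= 8.0  # an 8 m-high feature, like Iron Crown's profile
grid_to_ply(seabed, "wreck_site.ply")
```

Photogrammetry from high-resolution imagery works differently, reconstructing geometry from overlapping photographs, but the end product is the same kind of navigable 3D model.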




Read more:
VR technology gives new meaning to ‘holidaying at home’. But is it really a substitute for travel?


Better than a thousand words ever could, these realistic models allow us to convey the excitement, wonder and awe that we have all felt at a shipwreck.
Digital 3D models enable those who cannot dive, travel or ever dream of visiting shipwrecks to do so through their laptops, mobiles and other digital devices.

Without these capabilities to record, visualise and manage these deepwater sites, they will fade back into the depths of the ocean, leaving only the archaeologists and a few shipwreck enthusiasts to investigate and appreciate them.

So that’s the next step, and a bigger challenge than finding a site: to record a deepwater shipwreck and enable the public to experience a well-preserved wreck.

The Conversation

SS Iron Crown alongside SS Hagen.
National Library of Australia, Author provided

Emily Jateff, Adjunct lecturer in archaeology, Flinders University and Maddy McAllister, Senior Curator – Maritime Archaeology, James Cook University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Before the Anzac biscuit, soldiers ate a tile so hard you could write on it



Christmas hard tack biscuit: Boer War. Australian War Memorial. Accession Number: REL/10747.
Courtesy of the Australian War Memorial

Lindsay Kelley, UNSW

Before Anzac biscuits found the sticky sweet form we bake and eat today, Anzac soldiers ate durable but bland “Anzac tiles”, a new name for an ancient ration.

Anzac tiles are also known as army biscuits, ship’s biscuits, or hard tack. A variety of homemade sweet biscuits sent to soldiers during the first world war may have been referred to as “Anzac biscuits” to distinguish them from “Anzac tiles” on the battlefield.




Read more:
Feeding the troops: the emotional meaning of food in wartime


Rations and care package treats alike can be found in museum collections, often classified as “heraldry” alongside medals and uniforms. They sometimes served novel purposes: Sergeant Cecil Robert Christmas wrote a Christmas card from Gallipoli on a hard tack biscuit in 1915.

The back of the biscuit reads “M[erry] Christ[mas] [Illegible] / Prosperous New Y[ear] / from Old friends / Anzac / Gallipoli 1915 / [P]te C.R. Christmas MM / 3903 / [illegible] / AIF AAMC”. More than a Christmas card, biscuits like these gave family at home a taste of foods soldiers carried and ate in battle. Archives around the world hold dozens of similar edible letters home.

Damaged army hard tack biscuit used as a Christmas card. Accession number REL/00918.
Courtesy of Australian War Memorial

Biscuit as stationery

This Anzac tile was made in Melbourne. In pencil, an anonymous soldier has documented his location directly on the biscuit’s surface: “Engineers Camp, Seymour. April 2nd to 25th 1917.”

Army Hard-tack Biscuit. Australian War Memorial. Accession Number: REL/03116.
Courtesy of the Australian War Memorial

In her history of the Anzac biscuit, culinary historian Allison Reynolds observes that “soldiers creatively made use of hardtack biscuits as a way of solving the shortage of stationery”.

Hardtack art

Army biscuits also became art materials on the battlefield. This Boer War era “Christmas hardtack biscuit”, artist unknown, serves as an elaborate picture frame.

Incorporating embroidery that uses the biscuit’s perforations as a guide, it also includes bullets, which form a metallic border for the photograph mounted on the biscuit.

Christmas hard tack biscuit: Boer War. Australian War Memorial. Accession Number: REL/10747.
Courtesy of the Australian War Memorial

A tin sealed with sadness

During WWI, any care package biscuit that was sweetly superior to an Anzac tile might have been called an “Anzac biscuit”. Eventually, the name “Anzac biscuit” was given to a specific recipe containing golden syrup, desiccated coconut and oats, but never eggs.

Anzac biscuits held in our archives evoke everyday experiences of baking and eating. In one case, the biscuits also tell a story of loss. Lance Corporal Terry Hendle was killed in action just hours after his mother’s homemade biscuits arrived in Vietnam. The tin was returned to his mother, Adelaide, who kept it sealed and passed it down to his sister, Desley.

Australian War Memorial curator Dianne Rutherford explains that the museum will never open the sealed tin, because “this tin became a family Memorial to Terry and is significant for that reason. After Terry’s death, Adelaide and Desley never baked Anzac biscuits again”.

Sealed biscuit tin with Anzac biscuits: Lance Corporal Terence ‘Terry’ Edward Hendle, 6th Battalion, Royal Australian Regiment. Australian War Memorial. Accession Number: AWM2016.460.1.
Courtesy of the Australian War Memorial

Today, biscuit manufacturers must apply for Department of Veterans’ Affairs permission to use the word “Anzac”, which will only be granted if “the product generally conforms to the traditional recipe and shape”. Variations on the name are also not permitted – in a recent example, ice cream chain Gelato Messina was asked to change the name of a gelato from “Anzac Bikkie” to “Anzac Biscuit”.

The Anzac tile, on the other hand, rarely rates a mention in our commemorations of Anzacs at war – although school children and food critics alike undertake taste tests today in an effort to understand the culinary “trials” of the Anzac experience.

Scholar Sian Supski argues that Anzac biscuits have become a “culinary memorial”. What if the biscuits you bake this Anzac Day ended up in a museum? What stories do your biscuits tell?


Lindsay will be launching a three-year project about biscuits called “Tasting History” during the Everyday Militarisms Symposium at the University of Sydney on April 26.

She is recruiting participants for upcoming biscuit tasting workshops. Sign up here.

The Conversation

Lindsay Kelley, Lecturer, Art & Design, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A whole new world: how WWI brought new skills and professions back to Australia



The war spurred surgeons to develop new techniques, such as traction splints and blood transfusions.
from shutterstock.com

James Waghorne, University of Melbourne and Kate Darian-Smith, University of Tasmania

The first world war was significant in forming Australian national identity and in defining national characteristics, such as making do and mateship. This is well acknowledged.

But it was also a technical war, which spurred advances in knowledge and expertise. Combined with the status of professionals in the public service, it profoundly reshaped Australia. It also led to the development of universities as places for training and professional qualification, as well as important research.

Before the war, concern about efficient use of public money and a desire to protect the public led governments to pass legislation to control professional practice. This ensured only qualified doctors could provide medical treatment, only qualified teachers taught in schools, and so on.

The recently released book The First World War, the Universities and the Professions in Australia, 1914–1939, edited by the authors, outlines how the war sped up these developments and widened the range of workers, such as physiotherapists, who saw themselves as part of a professional group.

New knowledge created in war

During the war, surgeons and dentists developed new techniques, such as traction splints and blood transfusions. The use of saline fluid to treat shock dramatically improved the survival rate of the wounded. Advances in plastic surgery – led by New Zealand-born but London-based Harold Gillies and assisted by Australian surgeons – helped those with devastating facial injuries. Psychiatrists contended with the new condition of shell shock.




Read more:
World War I: the birth of plastic surgery and modern anaesthesia


Engineers gained experience in logistics and the management of people. John Monash received a Doctor of Engineering in 1920 for his wartime developments in the coordinated offensive.

New ideas spread rapidly. As the noted surgeon Victor Hurley observed in 1950:

… treatment of large numbers of wounded and the stimulus of war necessities presented the opportunity for close observations and investigations on a large scale, such as were not readily possible in civil life.

The “regular contacts with officers of other medical services” allowed developments to be exchanged.

Professional contributions to the war

Professionals were also important to the war effort at home. Linguists provided translating and censorship services, lawyers drafted international treaties, while scientists and engineers developed processes for the mass manufacture of munitions and tested materials for use in military equipment.

The gas mask developed at the University of Melbourne.
Australian War Memorial

Often these initiatives combined expertise from different professions. Medical, engineering and science professors at the University of Melbourne developed a gas mask, manufactured in large quantities but not deployed.

Back in Australia, the Commonwealth government established the first federally funded research body – the Advisory Council of Science and Industry (later CSIRO) – in 1916. Its first task was to tackle agricultural production issues, such as the spread of prickly pear. Australia’s farm production was essential to the war effort.

University research expanded after the war, as government and industry worked with the universities.

The greatest need was for doctors and nurses. Medical students who had interrupted their studies to enlist were brought back from the front to complete their training before returning. University medical schools shortened courses to rush more graduate doctors to the front. Women medical graduates, such as Vera Scantlebury-Brown, also served in Europe, although they could not join the medical corps.




Read more:
The forgotten Australian women doctors of the Great War


The Great War’s broader influences

More broadly, the experience of travelling to European theatres of war exposed professionals to international ideas. Architect soldiers, in particular, brought the influences of European and Middle Eastern sites to Australian buildings.

A notable example is the Royal Australasian College of Surgeons building. This was designed in a Greek revival style by returned soldiers Leighton Irwin and Roy Kenneth Stevenson. It opened in 1935 to house the college, which accredited Australian surgeons and sought to raise the standard of surgery and hospitals, efforts also spurred by the Great War.

The Royal Australasian College of Surgeons building was designed in a Greek revival style.
from shutterstock.com

Repatriation efforts cemented the position of professionals in the public sphere. Doctors determined eligibility for invalid benefits and managed treatment.

Returned soldiers received training, both in technical skills and in professional degrees. Many took the opportunity to study in overseas institutions, including British and European universities, the schools of the Architectural Association, London, and the Royal College of Surgeons.

Australia’s universities remitted tuition fees for returned soldiers. This allowed individuals such as Albert Coates to go to university and become a noted surgeon. Coates would later gain renown for his work with prisoners of war in the second world war.

How did the war change professions?

After the war, new communication technologies created careers in radio broadcasting and advertising.

In response to the cascade of new knowledge, and to keep up with professional developments, university courses became increasingly specialised, at the expense of the generalist. The gaps created by specialisation allowed new groups to seek professional status, often competing with other professionals.

For instance, the number of war wounded, combined with poliomyelitis (polio) epidemics, created unprecedented demand for masseurs. Universities had offered individual subjects in massage at the turn of the century. Now masseurs pressed for full degree status, clashing with doctors who controlled medical practice.

By the time of the second world war, masseurs had become physiotherapists, with professional status.

Nurses learnt new skills during the war, and achieved greater social recognition.
Wikimedia Commons

Nurses had learnt new skills during the first world war and achieved greater social recognition. To build on this, the Australian Nursing Federation (now known as the Australian Nursing and Midwifery Federation) – established in 1924 – lobbied for university qualifications. It sought to overcome the prevailing conception that nursing was marked by “service and sacrifice”, ideals encouraged by the reliance on volunteer nurses during the war.

All Australian states had nursing registers by 1928, admitting only qualified nurses. Although nurses could attend subjects in some universities before the second world war, a full university course waited until the latter part of the 20th century.




Read more:
Friendship in war was not just confined to bonds between men


A new national sentiment, fostered by the war, was evident in all of these developments. Professionals no longer fought battles only within local and state areas. Now they argued in general terms, confident their expertise supported national priorities.

Professionals lobbied through national associations, such as the Institution of Engineers (established in 1919), the Australian Veterinary Association (established in 1926), and the Law Council of Australia (established in 1933). These groups sought to raise the standing of their members and defend their interests on this new national basis.

The histories of professional groups and higher education have often focused on the period after the second world war, and the expansion of the sector. However, this overlooks the role of the first world war in transforming Australia into a nation that valued expertise, knowledge and professional standing.

The Conversation

James Waghorne, Academic Historian, Melbourne Graduate School of Education, University of Melbourne and Kate Darian-Smith, Executive Dean and Pro Vice-Chancellor, College of Arts, Law and Education, University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Second Opium War



Japan and Russia Unofficially at War



WWII: Germany Invades Denmark



When science is put in the service of evil



Nazi leadership saw medical and pharmaceutical research as a front-line tool to contribute to the war effort.
Akanbatt / Pixabay

Francisco López-Muñoz, Universidad Camilo José Cela

The Holocaust is one of the worst collective crimes in the history of humanity – and medical science was complicit in the horrors.

After World War II, evidence was given at the Nuremberg Trials of reprehensible research carried out on humans. This included subjects being frozen, infected with tuberculosis, or having limbs amputated.

There was also specific research into pharmacology that is less well-known, as can be seen from the articles we have published over the past 15 years.




Read more:
Is it ethical to use data from Nazi medical experiments?


The prestige of German medicine

German pharmacology and chemistry enjoyed great international prestige from the second half of the 19th century.

This golden age ended with the Nazi Party’s rise to power in 1933 and was replaced with institutionalised criminal behaviour in public health and human research.

At the beginning of World War II, Nazi leadership saw medical and pharmaceutical research as a front-line tool to contribute to the war effort and reduce the impact of injury, disease and epidemics on troops.

Nazi leaders believed concentration camps were a source of “inferior beings” and “degenerates” who could (and should) be used as research subjects.

German pharmacology and medicine lost all dignity. As Louis Falstein pointed out:

the Nazis prostituted law, perverted education and corrupted the civil service, but they made killers out of physicians.

The rise of eugenics in central Europe at the beginning of the 20th century paved the way for the Nazi government to implement a disastrous policy of “racial hygiene”.

The Aktion T4 Programme

The Nazi ideology promoted the persecution of those who were considered “abnormal”, as part of the Aktion T4 program.

September 1, 1939 – the date of the start of World War II – marked the beginning of the mass extermination of patients with “deficiencies” or mental conditions, who were deemed to be “empty human shells”.

Aktion T4 patients boarding a bus, 1941.
Wikimedia Commons

At first, the crimes were carried out via carbon monoxide poisoning.

In 1941, a second phase was launched: so-called “discrete euthanasia” via a lethal injection of drugs such as opiates and scopolamine (an anti-nausea medication), or the use of low doses of barbiturates to cause terminal pneumonia.

These techniques were combined with reduced food rations and turning off the hospital heating during winter.

These euthanasia programmes led to what amounted to psychiatric genocide, with the murder of more than 250,000 patients. This is possibly the most heinous criminal act in the history of medicine.

Experimenting with healthy subjects

Medical experimentation became another tool of political power and social control, over both sick people from the T4 program and healthy people.

Those in good health were recruited from the concentration camps, drawn from ostracised ethnic and social groups such as Jews, Gypsies, Slavs and homosexuals.

A number of experiments were undertaken, including the study of:

  • the effect of sulfonamides (antibiotics) on induced gas gangrene (Ravensbrück)
  • the use of the toxic chemical formalin for female sterilisation (Auschwitz-Birkenau)
  • the use of vaccines and other drugs to prevent or treat people intentionally infected with malaria (Dachau)
  • the effects of methamphetamine in extreme exercise (Sachsenhausen)
  • the anaesthetic properties of hexobarbital (a barbiturate derivative) and chloral hydrate (a sedative) in amputations (Buchenwald)
  • the use of barbiturates and high doses of mescaline (a hallucinogenic drug) in “brainwashing” studies (Auschwitz and Dachau).

Block 10 of the Auschwitz concentration camp, where medical experiments with prisoners were carried out.
Francisco López Muñoz

Faced with all this evidence, how is it possible that up to 45% of German doctors joined the Nazi Party? No other profession reached this level of political affiliation.

What were the reasons and circumstances that led to these perverse abuses?

The banality of evil in medicine

The answer is difficult. Many doctors argued that regulations were designed for the benefit of the nation and not the patient. They invoked such misleading concepts as “force majeure” or “sacred mission”.




Read more:
Two steps forward, one step back: how World War II changed how we do human research


Some believed everything was justified by science, even the inhumane experiments carried out in the camps, while others considered themselves patriots whose actions were justified by the needs of wartime.

Some were followers of the perverse Nazi ethos and others, the more ambitious, became involved in these activities as a means of promoting their professional and academic careers.

Lastly, avoiding association with the Nazi apparatus may have been difficult in a health sector where fear had become a system of social pressure and control.

A monument by Richard Serra in Berlin honouring the victims of the Aktion T4 programme.
Wikimedia Commons

Arturo Pérez-Reverte, in his book Purity of Blood, defines this type of motivation very well:

… although all men are capable of good and evil, the worst are always those who, when they administer evil, do so on the authority of others or on the pretext of carrying out orders.

However, as has happened in many moments of history, sometimes tragedies bring positive posthumous effects.

After the trial of the Nazi doctors, the first international code of ethics for research with human beings was enacted: the Nuremberg Code, under the Hippocratic precept “primum non nocere” (first, do no harm). This code has had immense influence on human rights and bioethics.






The Conversation


Francisco López-Muñoz, Professor of Pharmacology and Director of the International Doctoral School, Universidad Camilo José Cela

This article is republished from The Conversation under a Creative Commons license. Read the original article.


China: National Protection War



When political leaders choose catastrophe – how Europe walked willingly into World War I


William Mulligan, University College Dublin

Some political catastrophes come without warning. Others are long foretold, but governments still walk open-eyed into disaster. As the possibility of a no-deal Brexit looms, most analysts agree that there will be severe economic and political consequences for the UK and the EU. And yet a no-deal Brexit still remains an option on the table.

The July crisis in 1914 that led up to World War I, which I’ve analysed in a recent paper, provides a timely case study of how politicians chose a catastrophic path. World leaders knew that a European war would most likely bring economic dislocation, social upheaval, and political revolution – not to mention mass death – but they went ahead anyway. Far from thinking that the war would be short – “over by Christmas” as the cliché goes – leaders across Europe shared the view expressed by the British chancellor, David Lloyd George, that war would be “armageddon”.

So why did European leaders not swerve away from catastrophe in 1914? A toxic mix of wishful thinking, brinkmanship, finger-pointing and fatalism – features increasingly evident in the Brexit dénouement – conspired to make the risk of catastrophic war appear a legitimate, even rational, option.

First, a small number of leaders, mainly generals, believed that war would cleanse society of its materialist and cosmopolitan values. The more terrible the consequences, the more effective the war would be in achieving national renewal. War, they argued, would bolster the values of self-sacrifice and cement social cohesion. Instead, material shortage led to military defeat and social disintegration in Russia, Austria-Hungary and Germany.

Second, some politicians believed that the prospect of catastrophe could be used to lever their opponents into concession. Kurt Riezler, adviser to the German chancellor Bethmann Hollweg, had coined the term Risikopolitik, or risk policy. He predicted that, faced with the possibility of a European war, the great powers with less at stake would back down in any given crisis. But this logic broke down if both sides considered their vital interests in danger and if both sides faced similarly catastrophic consequences from war. This led to absurdities in the July crisis, such as the comment from Germany’s Kaiser William II that: “If we should bleed to death, at least England should lose India.”

A cartoon published in the Chicago Daily News in 1914.
Luther Daniels Bradley

Shifting blame

Third, politicians framed the crisis as a choice between two catastrophes. If they backed down, they feared the permanent loss of status, allies, and, ultimately, security. For Austro-Hungarian leaders, compromise rendered them vulnerable to further Serbian provocations and the slow disintegration of the Habsburg empire. War became the lesser of two evils, a highly risky strategy that might, but probably would not, avert certain ruin.

As states began to mobilise, military and political leaders feared that whichever side moved first could gain a significant military advantage. Waiting too long risked the dual catastrophe of being at war and suffering an initial defeat. This logic was particularly important in the spiral of mobilisation on the eastern front, between Austria-Hungary, Russia and Germany.

Fourth, as war became increasingly likely, leaders began to deny their own ability to resolve the conflict. Politicians began to lay the blame for the coming conflict on their opponents. Lloyd George, who went on to become prime minister in 1916, later famously claimed that Europe had “slithered” into war. The denial of agency encouraged the sense of fatalism that facilitated the outbreak of war. If leaders perceived war as inevitable, this inevitability made it psychologically easier to accept the appalling consequences.

Fifth, individual decisions, such as allies assuring their unfailing loyalty to their partners, were often intended to avoid war by forcing the other side to back down. Yet, instead of making concessions, states doubled down on their demands and stood full-square by their allies, without urging compromise. The outcome was the rapid escalation of the crisis into war.

Seasoned diplomats at the helm

Most of the key diplomats in July 1914 had recently resolved major international crises, notably during the Moroccan Crisis in 1911 and the remaking of the Balkans during regional wars in 1912 and 1913. They had the diplomatic skills to avoid disaster.

Yet, by framing the July crisis in terms of an existential test – of status, territorial integrity, and the value of alliances – leaders in all the great powers trapped themselves in a spiral of escalating tensions and decisions. This meant they began to rationalise war as a possible option from early July.

Although the consequences of a no-deal Brexit will be much less terrible, there are similarities in certain patterns of thinking and political behaviour, from the few who embrace disaster to the systemic pressures which prevent compromise. Avoiding disaster in 1914 would have required framing the stakes of the July crisis in less zero-sum ways and refusing to rationalise a general European war as an acceptable policy option. It required leaders with enough courage to compromise, even to accept defeat, and for states to offer rivals the prospect of long-term security and future gains in exchange for accepting short-term setbacks.

The Conversation

William Mulligan, Professor, School of History, University College Dublin

This article is republished from The Conversation under a Creative Commons license. Read the original article.

