Tag Archives: history

From shouting it out to staying at home: a brief history of British voting


Hogarth’s The Polling, from the Humours of an Election series.
Wikipedia

Malcolm Crook, Keele University and Tom Crook, Oxford Brookes University

Most of the voters who will be casting their ballots in the general election on Thursday June 8 will take their right to do so for granted, unaware of the contested history of this now familiar action. It’s actually less than 100 years since all adult males in the UK were awarded the franchise for parliamentary elections, in 1918, in the wake of World War I. That right wasn’t extended to all adult women for a further ten years after that.

Even today, it might be argued, the democratic principle of “one person, one vote” has not been fully implemented, since the royal family and members of the House of Lords are not allowed to vote in parliamentary elections. And even after the mass enfranchisement of the early 20th century, university graduates and owners of businesses retained a double vote, the former in their university constituencies as well as where they lived. These privileges were abolished only in 1948, in the face of overwhelming Conservative opposition.

How Britain votes today is also a relatively late development in electoral history. Until 1872, parliamentary electors cast their votes orally, sometimes in front of a crowd, and these choices were then published in a poll book. Public voting was often a festive, even riotous affair. Problems of intimidation were widespread, and sanctions might be applied by landlords and employers if voters failed to follow their wishes, though this was widely accepted at the time as the “natural” state of affairs.

Open voting even had its defenders, notably the political radical John Stuart Mill, who regarded it as a manly mark of independence.

But as the franchise was partially extended in the 19th century, the campaign for secrecy grew. The method that was eventually adopted was borrowed from Australia, where the use of polling booths and uniform ballot papers marked with an “X” was pioneered in the 1850s.

More recent reforms took place in 1969, when the voting age was lowered from 21 to 18. Party emblems were also allowed on the ballot paper for the first time that year. It’s this kind of paper that will be used on June 8.

Staying at home

What no one predicted, however, when these franchise and balloting reforms were first implemented, is that voters would simply not bother to turn out and that they would abstain in such considerable numbers.

To be sure, this is a relatively recent phenomenon. In fact, turnout for much of the 20th century at general elections remained high, even by European standards. The best turnout was secured in the 1950 general election, when some 84% of those eligible to do so voted. And the figure didn’t dip below 70% until 2001, when only 59% voted. Since then things have improved slightly. In 2010, turnout was 65%. In 2015, it was 66%. But the fact remains that, today, a massive one-third of those eligible to vote fail to do so, preferring instead to stay at home (and the situation in local elections is far worse).

Turnout over the years.
Author provided

What was a regular habit for a substantial majority of the electorate has now become a more intermittent practice. Among the young and marginalised, non-voting has become widely entrenched. Greater personal mobility and the decline of social solidarity have made the decision to vote a more individual choice, which may or may not be exercised according to specific circumstances, whereas in the past it was more of a duty to be fulfilled.

Voters rarely spoil their papers in the UK, whereas in France it is a traditional form of protest that has reached epidemic proportions: some 4m ballot papers were deliberately invalidated in the second round of the recent presidential election. Like the rise in abstention in both countries, it surely reflects disenchantment with the electoral process as well as disappointment with the political elite.

In these circumstances, the idea of compulsory voting has re-emerged, though in liberal Britain the idea of forcing people to the polling station has never exerted the same attraction as on the continent. The obligation to vote is a blunt instrument for tackling a complex political and social problem. When the interest of the electorate is fully engaged, as in the recent Scottish or EU referendums, then turnout can still reach the 75% to 80% mark.

However, in the forthcoming parliamentary election, following hard on the heels of its predecessor in 2015, the EU vote and elections to regional assemblies in 2016, plus the local elections in May, voter fatigue may take a toll. It’s hard to envisage more than two-thirds of those entitled to do so casting their ballot on June 8. Given the relatively small cost involved in conducting this civic act, which is the product of so much historical endeavour, such disaffection must be a cause for significant concern.

Malcolm Crook, Emeritus Professor of French History, Keele University and Tom Crook, Senior Lecturer in Modern British History, Oxford Brookes University

This article was originally published on The Conversation. Read the original article.


Dove, real beauty and the racist history of skin whitening



The Dove ad published on Facebook, which the company took down after many complaints of racial insensitivity.
NayTheMUA/Facebook

Liz Conor, La Trobe University

This week the marketing office of Dove, a personal care brand of Unilever, found itself in hot water over an ad that many people have taken to be racially insensitive. Social media users called for a boycott of the brand’s products.

The offending ad showed a black woman appearing to turn white after using its body lotion. This online campaign was swiftly removed but had already hurtled through social media after a US makeup artist, Naomi Blake (Naythemua), posted her dismay on Facebook, calling the ad “tone deaf”.


Dove responded initially via Twitter.


The company then followed up with a longer statement: “As a part of a campaign for Dove body wash, a three-second video clip was posted to the US Facebook page … It did not represent the diversity of real beauty which is something Dove is passionate about and is core to our beliefs, and it should not have happened.”


One has to ask: were the boys destined for Dove marketing kicking on at the pub instead of going to their History of Advertising lecture, the one with the 1884 Pears’ soap ad PowerPoint? Jokes aside, Dove’s troubling ad buys into a racist history of seeing white skin as clean and black skin as something to be cleansed.

The original Pears’ soap advert based on the fable Washing the Blackamoor white, published in the Graphic for Christmas 1884.
Author provided

Racist history

Dove has missed the mark before. In a 2011 ad, three progressively paler-skinned women stand in towels beneath two boards labelled “Before” and “After”, implying that a transition to lighter skin was Dove’s promise of luminous beauty (Dove responded that all three women represented the “after” image).

Many of the indignant comments reference the longstanding trope of black babies and women scrubbed white. Australia has particular form on this front. Gamilaraay Yuwaalaraay historian, filmmaker and performing artist Frances Peters-Little has demanded an apology from Dove. She posted a 1901 advertisement for Nulla Nulla soap on Facebook to show the long reach of racism through entrenched tropes still at work in the Dove ads.

An advertisement for Nulla Nulla soap from 1901.
Author provided

Wiradjuri author Kathleen Jackson has also written about the Nulla Nulla ad and the kingplate, a badge of honour given by white settlers to Aboriginal people, labelled “DIRT”. She explains that whiteness was seen as purity, while blackness was seen as filth, something that colonialists were charged to expunge from the face of the Earth. Advertising suggested imperial soap had the power to eradicate indigeneity.

This coincided with policies that were expressly aimed at eliminating the “native”. In Australia the policy of assimilation was based on the entirely spurious scientific whimsy of “biological absorption”, that dark skin and indigenous features could be eliminated through “breeding out the colour”.

In New South Wales, “half-caste” girls were targeted for removal from their families and placed as domestic servants in white homes where it was assumed “lower-class” white men would marry them. These women were often vulnerable to sexual violence. Any resulting children, however begotten, would be fairer-skinned, due implicitly to the bleaching properties of white men’s semen.

Aboriginal mothers were vilified as unhygienic and neglectful. In fact, they battled against often impossible privation to turn their children out immaculately in the hope police would have less cause to remove them.

Real beauty?

Cleanliness and godliness, whiteness and maternal competency: these are the lacerations Dove liberally salted with its history-blind ad. It unwittingly strikes at the resistance and resilience of Aboriginal families who for generations fended off fragmentation, draconian administration and intrusive surveillance by state administrators. Its myopic implied characterisation of beauty as resulting from shedding blackness is mystifying.

In 2004, Dove kicked off a campaign for “Real Beauty”. It proclaims itself “an agent of change to educate and inspire girls on a wider definition of beauty and to make them feel more confident about themselves”. Dove’s online short films about beauty standards – including Daughters, Onslaught, Amy and Evolution – have been recognised with international advertising awards.

Yet Dove also sits within Unilever alongside Fair and Lovely, a skin-whitening product and brand developed in India in 1975. This corporate cousin of Dove touts its bleaching agent as the No. 1 “fairness cream” and purports to work by activating “the Fair and Lovely vitamin system to give radiant even toned skin”. It is sold in over 40 countries.

Skin whitening products (there is also a Fair and Handsome for men, not associated with Unilever) are popular in Asia, where more than 60 companies compete in a market estimated at US$18 billion. They enforce social hierarchies around caste and ethnicity. Since the 1920s the racialised politics of skin lightening have spread around the globe as consumer capitalism reached into China, India and South Africa.

Dove responded to its controversial ad by saying that “the diversity of real beauty… is core to our beliefs”. But “core” here seems skin-deep when it fails to penetrate into the pores of its parent company and its subsidiaries.

Liz Conor, ARC Future Fellow, La Trobe University

This article was originally published on The Conversation. Read the original article.


Poland



10 Best History Apps for Android


The link below is to an article that looks at 10 of the best history apps for Android.

For more visit:
http://www.androidauthority.com/best-history-apps-for-android-801017/


A bloody decade of the iPhone



Foxconn was nominated for the 2011 Public Eye Award, which produced this image as part of its campaign to end labour exploitation.
Greenpeace Switzerland/flickr, CC BY-NC-ND

Jack Linchuan Qiu, Chinese University of Hong Kong

This article is part of the Democracy Futures series, a joint global initiative with the Sydney Democracy Network. The project aims to stimulate fresh thinking about the many challenges facing democracies in the 21st century.


Ten years ago the first iPhone went on sale. The iconic product profoundly altered not only the world of gadgets but also those of consumption and towering corporate profit, worlds that would be impossible without the toil of millions along the assembly line.

I look back at the first ten years of the iPhone and see a bloody decade of labour abuse, especially in Chinese factories such as those run by Foxconn, the world’s largest electronics manufacturer. At one point Foxconn employed more people in China than the entire US armed forces.

Foxconn makes most of its money from assembling iPhones, iPads, iMacs and iPods. Its notorious “military management” was blamed for causing a string of 17 worker suicides in 2010.

The company tried to stop the suicides not by digging out the roots of exploitation, but by erecting “anti-jumping nets” atop its buildings. Never before had a modern factory hidden behind such suicide-prevention netting, which had last appeared on transatlantic slave ships centuries ago.

Foxconn is only one part of the Apple empire. The long and complicated supply chain has caused innumerable work injuries, occupational diseases and premature deaths over the past decade.

To date, Apple has not offered a full account of the damage done to victimised lives. Counting all Apple suppliers, the number must run to many thousands. And yet factories like Foxconn often enjoy immunity, sometimes taking no responsibility at all.

Readers unfamiliar with the dark reality behind the iPhone need only watch Complicit.

To make a living, workers must break the law

Apple continues to put out bogus claims:

Products made to have a positive impact. On the world and the people who make them.

The company claims to hold its suppliers accountable “to the highest standards”.

In reality, corporate practices in the making of the iPhone are substandard when held up against either Chinese labour regulations or ethical smartphone companies such as Fairphone. Apple’s standards for their workers are anything but “the highest”.

Wages remain low. Students and Scholars Against Corporate Misbehaviour calculate that the living wage for an iPhone worker in Shenzhen, China, should be about $650 per month. But to earn this amount today, an average worker would need to put in 80-90 hours of overtime every month – more than double the legal cap of 36 hours.

In other words, to make a living, workers have no choice but to break Chinese law.

Back in 2012, Apple vowed to work with Foxconn to bring the amount of overtime down to no more than 49 hours a week. It later broke its promise and retreated to adopt the Electronic Industry Code of Conduct (EICC), which stipulates “no more than 60 hours a week”.

The EICC cap allows roughly 25% more working hours than the Chinese legal threshold. So why did Apple opt for a code of conduct weaker than Chinese law over a higher standard? Tim Cook owes us an explanation.
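To make the hours arithmetic concrete, here is a minimal sketch in Python using the figures cited above. The 40-hour standard week and the 4.33 weeks-per-month conversion are my assumptions for illustration, not figures from the cited research.

```python
# Rough comparison of overtime caps, using figures cited in this article.
# Assumptions (mine, for illustration): a 40-hour standard work week and
# an average of 4.33 weeks per month.

STANDARD_WEEK_HOURS = 40
WEEKS_PER_MONTH = 4.33          # assumed monthly conversion factor

CHINA_MONTHLY_OT_CAP = 36       # Chinese legal overtime cap, hours/month
APPLE_2012_PLEDGE_WEEK = 49     # Apple's 2012 pledge, total hours/week
EICC_CAP_WEEK = 60              # EICC code of conduct, total hours/week

def monthly_overtime(total_hours_per_week: float) -> float:
    """Overtime implied per month by working right at a weekly cap."""
    return (total_hours_per_week - STANDARD_WEEK_HOURS) * WEEKS_PER_MONTH

for label, weekly_cap in [("Apple 2012 pledge (49 h/wk)", APPLE_2012_PLEDGE_WEEK),
                          ("EICC cap (60 h/wk)", EICC_CAP_WEEK)]:
    overtime = monthly_overtime(weekly_cap)
    verdict = "exceeds" if overtime > CHINA_MONTHLY_OT_CAP else "within"
    print(f"{label}: ~{overtime:.0f} h overtime/month, "
          f"{verdict} the {CHINA_MONTHLY_OT_CAP} h legal cap")

# ~39 h/month for the 49-hour pledge and ~87 h/month for the EICC cap:
# even full EICC compliance implies far more overtime than Chinese law
# allows, consistent with the 80-90 hours the article says workers need
# to reach a living wage.
```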

Even with the EICC, workers refusing to do excessive overtime at the current wage level simply won’t be able to make ends meet. The only way for workers to earn a livelihood without doing an illegal amount of overtime, and without compromising their physical, mental and social health, is for Apple and their suppliers to raise basic wages.

Is there real progress behind the progress reports?

Apple also brags about its training programs. According to its 2017 Supplier Responsibility Progress Report, the company partnered with its suppliers to train more than 2.4 million workers on their rights as employees. One basic right is for workers to unionise.

However, those at Foxconn are stuck with a management-run fake union that is ineffective and fools no one.

If Apple is serious about its words, it should let workers know about their rights to genuine union representation and use its influence to let workers exercise this right. Unfortunately, no such thing has occurred in the past ten years. Will it happen in the next ten?

Apple’s standards for their workers are anything but ‘the highest’.
Annette Bernhardt/flickr

Considering that Apple has recently backed out of the Fair Labor Association (FLA), a third-party auditor of corporate social responsibility (CSR), I’m sceptical. The FLA is not exactly “the highest standard” in labour-related auditing to begin with. But Apple no longer even bothers to ask it to assess supplier working conditions.

Despite this regressive move, Apple declared in its annual CSR report that it “continue(s) to partner with independent third-party auditors”.

The glossy report offers no information on who the auditors actually are, or how their independence is guaranteed. That is hard to square with Apple’s claim to be the most transparent of IT companies.

What then, are “the highest standards”? The least Apple can do is to let international trade union federations audit Foxconn and other suppliers to ensure their workers are not mistreated. If Apple and Foxconn are so proud of what they have done for workers, why would they be afraid?

Apple should also stop pretending it doesn’t know about Fairphone, the Lovie Award-winning Dutch smartphone firm that was Europe’s “fastest-growing tech startup” in 2015.

Fairphone, with its modular design, information transparency and worker welfare fund, has brought revolutionary change to the ethical design, manufacture and recycling of smartphones, setting a truly new standard for the likes of Apple.

Last August, I visited Hi-P, a factory in Suzhou, eastern China, that assembles Fairphones. Hi-P also happens to be a supplier for Apple. According to a worker I spoke to, she and her colleagues preferred to make Fairphones because the job was less demanding and more generously remunerated.

“It’s much harder working for Apple. They are so stingy,” the assembly-line worker in her late 30s told me. “Our managers asked them [Apple] to give us similar bonuses [as we received from Fairphone]. They tried again and again, but ended up getting nothing even close.”

If an ordinary worker can plainly demonstrate that Apple does not, in fact, have the “highest standards”, surely it’s time the company stopped pleading ignorance or innocence of its labour abuse.

There’s no excuse for Apple’s first bloody decade of the iPhone. And even less so for its next ten years.


Jack Linchuan Qiu’s book, Goodbye iSlave: A Manifesto for Digital Abolition, is available from The University of Illinois Press.

Jack Linchuan Qiu, Professor, School of Journalism and Communication, Chinese University of Hong Kong

This article was originally published on The Conversation. Read the original article.


A home for everyone? Property ownership has been about status and wealth since our convict days



A house and land on the River Derwent, Tasmania, 1822.
National Library of Australia

Imogen Wegman, University of Tasmania

While Australia has an egalitarian mythology in which everyone has a chance, the roots of today’s problems with access to housing lie in our history. The first land grants were given to former convicts as a way to control an unfenced prison colony. As free settlers arrived in Australia, priorities changed, land ownership gained prestige, and smaller landholders were pushed out of the market.

When Governor Phillip stepped onto Australian soil for the first time, in 1788, he carried with him a set of instructions to guide him through the early days of the newest British colony. These included the authority to grant land and specified the number of acres each male convict could receive at the end of his sentence. Eighteen months later, the colony received further instructions from Home Secretary William Grenville, permitting soldiers and free settlers to receive parcels of land if they chose to stay in the colony.

Grants given to former convicts at Norfolk Plains, northern Tasmania, 1814.
G.W. Evans, held by Tasmanian Archives and Heritage Office, AF 396/1/1325

Grenville’s instructions also set out the pattern of land granting that would dominate the colony for the next two decades. Groups of grants were to be placed at the edge of a waterway, with each individual property stretching back into the land rather than along the bank. These rules had a long history; the American colony of Georgia received almost identical phrasing in 1754, but other versions had been in place since the early 18th century.

The rules had two specific purposes in Australia: to foster productivity, and to maintain surveillance over the landholding population, which consisted largely of former convicts.

Initially, all land grants were required to conform to these instructions, and status was shown by the amount of land received. Former convicts started at 30 acres, while free settlers got at least 100 acres.

Under this scheme everyone would receive a mixture of good and bad soils, access to a navigable river and the safety of a surrounding community – important in an unfamiliar land. These grants would reduce the colony’s reliance on imported provisions. Instead, it could feed excess produce into the ports that restocked passing ships.

Colonial exploration and expansion could then continue to stretch to the furthest parts of the globe. But the rules also kept the grantees contained and within a day’s travel of a centre of governance (Hobart or Launceston, for example).

Free settlers’ arrival changed the rules

In 1817, the Colonial Office began to encourage voluntary emigration to the Australian colonies, and ambitious free settlers arrived. People complained about the failings of the former convicts, as they practised a rough agriculture that did not fit British ideals.

At the same time the management of convicts in Van Diemen’s Land (Tasmania) moved towards the harsh penitentiary system today associated with convicts. Using land grants to pin the former convict population to specific locations, while permitting them the freedom to live their lives, conflicted with free settlers’ aspirations for the colony.

It is no accident that Bothwell, in Tasmania’s Derwent Valley, was not directly connected to Hobart by river and was dominated by free settlers. The spread of Europeans across the land resulted from the mix of an expanding overland road network and the reduced need to keep these higher-status settlers within arm’s reach.

Grants at Bothwell were given primarily to free settlers.
Surveyor and date unknown, Tasmanian Archives and Heritage Office, AF 396/1/338

Land granting policies that excluded poorer settlers (most of whom were former convicts or the children of convicts) were introduced. Only those people with £500 capital and assets (roughly A$80,000) would be eligible. The minimum grant would be 320 acres.

One writer, the colonial surveyor G.W. Evans, asked at the time whether this was intended to drive those without means to the United States of America instead. Even if they scraped together the money, the sheer quantity of land would be beyond their ability to cultivate.

Average grant sizes, taken from specific representative regions to eliminate duplicates in the records.
Author, 2017

Locating former convicts on the rivers ensured productivity and the reliable transportation of goods, but these grants also kept them under close observation. As the penal system became more punitive, convicts lost the hope of gaining a small piece of land after their sentence.

But before this, far from being intended as any kind of reward or enticement, the first land grants given in Australia represented ongoing control over the lowest class of settlers – those who had been “transported beyond the seas”. Since the beginning of our colonial history, land ownership in Australia has been intricately connected with role and status.

Imogen Wegman, PhD candidate, History and Classics, University of Tasmania

This article was originally published on The Conversation. Read the original article.


A short history of the office



In the seventeenth century lawyers, civil servants and other new professionals began to work from offices in Amsterdam, London and Paris.
British Museum/Flickr

Agustin Chevez, Swinburne University of Technology and DJ Huppatz, Swinburne University of Technology

For centuries, people have been getting up and joining a daily commute, or retreating to a room, to work. The office has become inseparable from work.

Its history illustrates not only how our work has changed but also how work’s physical spaces respond to cultural, technological and social forces.

The origins of the modern office lie with large-scale organisations such as governments, trading companies and religious orders that required written records or documentation. Medieval monks, for example, worked in quiet spaces designed specifically for sedentary activities such as copying and studying manuscripts. As depicted in Botticelli’s St Augustine in His Cell, these early “workstations” comprised a desk, chair and storage shelves.

Sandro Botticelli, St Augustin dans son cabinet de travail (St Augustine at Work).
Wikipedia Commons

Another of Botticelli’s paintings of St Augustine at work is now in Florence’s Uffizi Gallery. This building was originally constructed as the central administrative building of the Medici mercantile empire in 1560.

It was an early version of the modern corporate office: both a workplace and a visible statement of prestige and power.

But such spaces were rare in medieval times, as most people worked from home. In Home: The Short History of an Idea, Witold Rybczynski argues that the seventeenth century represented a turning point.

Lawyers, civil servants and other new professionals began to work from offices in Amsterdam, London and Paris. This led to a cultural distinction between the office, associated with work, and the home, associated with comfort, privacy and intimacy.

Despite these early offices, working from home continued. In the nineteenth century, banking dynasties such as the Rothschilds and Barings operated from luxurious homes so as to make clients feel at ease. And, even after the office was well established in the 1960s, Hugh Hefner famously ran his Playboy empire from a giant circular bed in a bedroom of his Chicago apartment.

A police station office in the 1970s.
Dave Conner/Flickr, CC BY

But these were exceptions to the general rule. Over the course of the nineteenth and twentieth centuries, increasingly specialised office designs – from the office towers of Chicago and New York to the post-war suburban corporate campuses – reinforced a distinction between work and home.

Managing the office

Various management theories also had a profound impact on the office. As Gideon Haigh put it in The Office: A Hardworking History, the office was “an activity long before it was a place”.

Work was shaped by social and cultural expectations even before the modern office existed. Monasteries, for example, introduced timekeeping that imposed strict discipline on monks’ daily routines.

Later, modern theorists understood the office as a factory-like environment. Inspired by Frank Gilbreth’s time-motion studies of bricklayers and Frederick Taylor’s Principles of Scientific Management, William Henry Leffingwell’s 1917 book, Scientific Office Management, depicted work as a series of tasks that could be rationalised, standardised and scientifically calculated into an efficient production regime. Even his concessions to the office environment, such as flowers, were intended to increase productivity.

Technology in the office

Changes in technology also influenced the office. In the nineteenth and early twentieth centuries, Morse’s telegraph, Bell’s telephone and Edison’s dictating machine revolutionised both concepts of work and office design. Telecommunications meant offices could be separate from factories and warehouses, separating white-collar from blue-collar workers. Ironically, while these new technologies suggested the possibility of a distributed workforce, in practice American offices in particular became more centralised.

The IBM Selectric typewriter marked a change in office technology.
Christine Mahler/Flickr, CC BY-NC-SA

In 1964, when IBM introduced a magnetic-card recording device into the Selectric typewriter, the future of the office, and our expectations of it, changed forever. This early word processor could store information; it marked the start of computer-based work and of early fears of a jobless society driven by automation.

Now digital maturity seems to be signalling the end of the office. With online connectivity, more people could potentially work from home.

But some of the same organisations that promoted and enabled the idea of work “anywhere, anytime” – Yahoo and IBM, for example – have cancelled work from home policies to bring employees back to bricks and mortar offices.

Why return to the office?

Anthropological research on how we interact with each other and how physical proximity increases interactions highlights the importance of being together in a physical space. The office is an important factor in communicating the necessary cues of leadership, not to mention enabling collaboration and communication.

Although employers might be calling their employees back to the physical space of the office, its boundaries are changing. For example, at recent “chip parties”, employees have celebrated receiving radio-frequency identification implants that allow their employers to monitor them. In the future, the office may be embedded under our skin.

While this might seem strange to us, it’s probably no stranger than the idea of making multiple people sit in cubicles to work would have seemed to a fifteenth-century craftsman. The office of the future may be as familiar as home, or even our neighbour’s kitchen table, but only time will tell.

Agustin Chevez, Adjunct Research Fellow, Centre For Design Innovation, Swinburne University of Technology and DJ Huppatz, Senior Lecturer, Swinburne University of Technology

This article was originally published on The Conversation. Read the original article.


Powerful and ignored: the history of the electric drill in Australia


Tom Lee, University of Technology Sydney and Berto Pandolfo, University of Technology Sydney

Portable electric drills didn’t always look like oversized handguns.

Before Alonzo G. Decker and Samuel D. Black intervened in the 1910s, the machines typically required the use of both hands. The two men, founders of the eponymous American company Black & Decker, developed a portable electric drill that incorporated a pistol grip and trigger switch, apparently inspired by Samuel Colt’s pistol.

We are documenting a collection of more than 50 portable electric drills made roughly between 1930 and 1980.

Seen as part of a history of technology, they have a lot to teach us about function and form, masculine values and the history of Australian craft.


Read more: Reengineering elevators could transform 21st-century cities


The collection also represents an important chapter in Australian manufacturing, and includes drills produced by local companies such as Sher, KBC and Lightburn that have since disappeared. It also features models made by Black & Decker, which once had manufacturing operations in Australia.

The CP2 manufactured by Black & Decker in Croydon, Victoria. There is evidence of this model being on the market from 1963 to 1966, although we suspect it was available earlier and for much longer.
Berto Pandolfo, Author provided

Design historians and collectors have paid little attention to the electric drill. It’s seen as an object of work, unlike domestic items such as the tea kettle, which can be statements of taste and luxury.

But the device deserves our attention. It’s considered the first portable electric power tool, and arguably helped to democratise the industry, putting construction in the hands of everyone from labourers to hobbyists.

The electric drill in Australia

Australia once played a significant role in producing the portable electric drill.

Ken Bowes & Co. Ltd, known as KBC, was a South Australian manufacturing company founded in 1936. Although it produced domestic appliances such as the bean slicer, it was the die casting of military components – ammunition parts (shell and bomb noses) and tank attack guns – that kept the company busy during World War II.

It appears that KBC entered the hardware market in 1948 with its first portable electric drill, designed for the cabinet maker and general handyman. The body of the drill was made from die-cast zinc alloy and it had a unique removable front plate on the handle to allow the user easy access to the connection terminals.

KBC drill and label (note the lack of integration between handle and body), circa 1950s.
Berto Pandolfo, Author provided

In 1956, Black & Decker established an Australian manufacturing plant in Croydon, Victoria, where drills such as the CP2 were manufactured.

Between 1960 and 1982, many power tool brands had a media presence. KBC sponsored a radio program called, appropriately enough, That’s The Drill. Wolf power tools were awarded as prizes on the television program Pick-A-Box.

Black & Decker ran advertisements that appeared during popular television programs and used endorsements by sporting celebrities such as cricketer Dennis Lillee.

While the popularity of portable power drills has endured, the manufacture of these objects in Australia had more or less vanished by the end of the 20th century.

Why we value some objects and not others

The portable electric drill has been poorly documented by designers, historians and museums.

Obvious repositories for their collection, such as museums of technology or innovation, are increasingly challenged by space and funding pressures. Apart from a few token examples, many everyday objects have not managed to establish a museum presence.

The Museum of Applied Arts and Sciences in Sydney holds at least two vintage portable electric drills: one a Desoutter, made in England, and another of unknown origin. Museums Victoria has one example of a Black & Decker electric drill from the 1960s in its digital archive.

The crude utility of the portable drill is part of the reason why it has escaped much academic scrutiny.

The Black & Decker U-500 drill, the first drill to be completely manufactured in Australia, at the Croydon factory in Victoria.
Berto Pandolfo, Author provided

Design studies and collections tend to focus on luxury objects such as Ferrari sports cars and Rolex wristwatches. Even kitchen and home appliances get more attention, especially those designs associated with high-end companies such as Alessi and designers such as Dieter Rams and Jasper Morrison.

By contrast, the electric drill remains a B-grade object. It is a stock weapon in horror films, although even there it lacks the status reserved for the more sublimely threatening implements of violence such as swords, spears and guns.

The case for the drill

Hard yakka and aesthetics have not typically been happy bedfellows. However, labour and its associated objects can provide a compelling look at contemporary life.

Like the laptop computer, the shape of which is tied to the “macho mystique” of the briefcase, the pistol form of the portable drill seems to be significantly influenced by ideas of power and masculinity.

The symbolic association with the pistol is also practical, and would have no doubt eased the burden for those early users struggling with the device’s weight.


Read More: Apple’s goodbye to the MP3 player reminds us why the iPod became an instant classic


A recent turn towards the everyday as a site for design anthropology will hopefully shift focus towards inconspicuous yet important technologies like portable electric drills.

These objects are part of a rich history that will be forgotten if institutions focus exclusively on luxury items, big name designers and cultures of display and ornament.

Even our most anonymous objects are sources of cultural expression, and they should not be overlooked.

Tom Lee, Lecturer, Faculty of Design, Architecture and Building, University of Technology Sydney and Berto Pandolfo, Director Industrial Design, University of Technology Sydney

This article was originally published on The Conversation. Read the original article.


Living blanket, water diviner, wild pet: a cultural history of the dingo



A watercolour of a dingo, pre-1793, from John Hunter’s drawing books.
By permission of The Hunterian Museum at the Royal College of Surgeons, London.

Justine M. Philip, University of New England

In traditional Aboriginal society, women travelled with canine companions draped around their waists like garments of clothing. Dingoes played an important role in the protection and mobility of the women and children, and are believed to have greatly extended women’s contribution to the traditional economy and food supply.

Wongapitcha women carrying dogs, which they hold across their backs to enjoy the warmth of the animals’ bodies.
Photo and caption: Herbert Basedow, 1924. Glass plate negative, by permission of the NMG Macintosh collection, J. L. Shellshear Museum, University of Sydney.

Dingo pups were taken from the wild when very young. Some were a highly valued ritual food source, while others were adopted into human society. They grew up in the company of women and children, providing an effective hunting aid, a living blanket and a guard against intruders.

Nursing young dingo pups was also deeply embedded in traditional customs. Interspecies breastfeeding was common in most human societies before industrialisation, historically providing the only safe way to ensure the survival of motherless mammalian young. Technological advances in milk pasteurisation made artificial feeding a viable alternative by the late 1800s.

Cohabitation with human society represented a transient phase of the dingo’s lifecycle: the pups generally returned to the wild once mature (at one or two years of age) to breed. As such, dingoes maintained the dual roles of human companion and top-order predator – retaining their independent and essentially wild nature over thousands of years.


Further reading: Dingoes do bark: why most dingo facts you think you know are wrong


Post-colonisation, it became too dangerous to keep the semi-wild canines in the Aboriginal camps. Dingoes were targeted for eradication as livestock holdings spread across the country. Their removal would have had a profound impact on the women, resulting in a great loss of traditional knowledge and status.

DNA studies estimate that the dingo arrived on the Australian continent between 4,700 and 18,000 years ago, representing perhaps the earliest example of human-assisted oceanic migration. They were adopted into Aboriginal society, maintaining a symbiotic partnership that lasted thousands of years, and for this reason have been celebrated as a cultural keystone species.

The dingo’s ability to locate water above and below ground was perhaps its most indispensable skill. Written records, artworks and photographs in museum archives document dingo water knowledge as recorded by European explorers, including a number of accounts of wild or semi-wild dingoes leading Europeans to lifesaving springs.

In Australian cartography, a “Dingo Soak” refers to a waterhole dug by a mythical or live canine. There are other freshwater landmarks across the continent – “Dingo Springs”, “Dingo Rock”, “Dingo Gap”.

In Aboriginal mythology, the travels of ancestral dingoes map out songlines, graphemic maps tracing pathways across the continent from one water source to the next. Their stories tell of the formation of mountains, waterholes and star constellations. In some accounts, dingoes emerged from the ground as rainbows; in others they dug the waterholes and made waterfalls as they travelled through the landscape.

Ethnographic evidence

Human-dingo heritage is preserved in ethnographic collections in Paris, London and Washington DC. Artefacts include talismans and ornaments made of canine teeth, bones and fur; rain incarnations and love charms. Funerary containers were decorated with dingo teeth, providing protection for the spirit in the afterlife.

In one portrait from the Smithsonian collection, an Aboriginal woman wears a dingo tail headdress – a talisman believed to hold great power and worn by warriors going to battle.

Woman with headdress looking to the side, 1870-1873 by John William Lindt, albumen print 20 x 15cm.
Smithsonian National Anthropological Archives USNM INV 04926900

The Smithsonian Institutional Archives also reveal a wealth of information about the post-colonial social history of the dingo. Many semi-wild dingoes were kept in early Euro-Australian settlements, then transported live to England, France and later to America as diplomatic gifts or exotic animal displays.

It was noted that the dingoes remained essentially wild despite numerous attempts to domesticate them – they failed to respond to any amount of discipline, kindness, bribery or coercion. Despite 230 years of surviving on the fringes of human settlements, travelling in menageries and circus troupes, living in zoos and semi-domestic arrangements, this remains true today. The dingo has retained its independent character and irreversible prey drive.

However, they did breed quite well in captivity and zoos often had excess pups to trade. Occasionally these pups went to private homes. They were affectionate and tractable when young, but eventually their carnivorous nature would get the better of them. Most soon ended up in difficult circumstances and back in the hands of officials.

Dingo pup, 1930. Popular official guide to the New York zoological park.
Zoo Ephemera collection, Smithsonian National Museum of Natural History Library.

A Roosevelt connection

The Smithsonian holds records of the first live dingoes to arrive in Washington DC in 1901, a gift from the US consul to New South Wales. A number of pups were later born at the National Zoo and documented in the daybooks and records of births, deaths, sales and exchange.

One file contains a curious letter, offering a home to one of the dingo pups on display, dated May 14 1908. The request for the pup was signed by Theodore Roosevelt junior. At the time, Theodore’s father was in office as the 26th president of the United States, and the Roosevelt family were in residence at the White House.


Further reading: A wolf in dog’s clothing? Why dingoes may not be Australian wildlife’s saviours


The Roosevelt children were well known for their eclectic pet collection. Many arrived as diplomatic gifts and ended up at the National Zoo, or were traded through the local Schmid’s Bird and Pet Animal Emporium at 712 12th Street NW. The list included snakes (which ended up, uninvited, in the Oval Office) and a pig, smuggled into the presidential residence under the care of a young Quentin Roosevelt.

The pig, once discovered, ended up in Schmid’s Emporium for sale under a sign stating: “this pig slept last night in the White House”. No records have surfaced about the Roosevelts’ dingo. However, five months later – around the time that the pup would have been challenging all boundaries of domestication – a sale notice appeared in The Washington Post, dated October 16 1908, under: DOGS, PETS, ETC.
“JUST RECEIVED, Dingo, Australian wild dog … SCHMID’S BIRD STORE 712 12TH.”

Baudin’s dingoes

The first live exhibit of dingoes in an international display appeared at the Ménagerie du Jardin des Plantes in Paris in 1803. The pair, a male and female, had been collected by Captain Nicolas Baudin in Port Jackson and transported to France aboard Le Naturaliste in the care of François Péron (zoologist) and Charles-Alexandre Lesueur (artist).

Frédéric Cuvier, naturalist and zookeeper, was assigned to care for the dingoes in Paris. In 1824, he wrote:

This dog, who was female, was about eighteen months when she arrived in Europe. She lived in freedom in the vessel where she was embarked, and despite the corrections inflicted on her, as well as a young male that died as a result of a punishment too harsh, she continued to evade punishment and consume all that suited her appetite.

The female lived for seven years in the gardens; the male survived just two months after arrival.

Eventually the bodies of both were transferred to the stores of the Muséum national d’Histoire naturelle and preserved in perpetuity. Notes in the museum state that early in the voyage of Le Naturaliste, before departing King Island for France, the male dingo had “been too brutally castrated because of his independent character”, and these injuries eventually killed him.

The taxidermy specimens remain in the vaults of the museum today.

NASA’s dingo encounter

In 2006, a team of NASA scientists from the Smithsonian’s Center for Earth and Planetary Studies was in Australia’s Simpson Desert studying the formation of parallel desert dunes similar to those found on Mars.

Senior scientist Ted Maxwell took a photo of a wild dingo casually observing the scientists while they were staking out a dune on the Colson Track. Maxwell recorded the co-ordinates of their location, noting that it was a 60-kilometre journey across the desert to Dalhousie Springs, the nearest known permanent water source. The dingo appears completely at ease.

A dingo inspects scientists’ work during their expedition in the Simpson Desert, Australia. May, 2006.
Ted Maxwell, Smithsonian National Air & Space Museum.

Another reference to a dingo appears in the Smithsonian records in 2014, this time as the NASA Curiosity Rover crossed the Martian landscape. The vehicle passed an ancient dried freshwater lake on Mars, before travelling up through “Dingo Gap” in the sand dunes.

The NASA scientists had mapped the surface of Mars into quadrangles, naming the locations after sites on Earth with similar ancient geology or rock formations. Dingo Gap is named after a location in the remote Kimberley region of Western Australia.

So, in contemporary celestial narrative, a valley on the Martian landscape is named after the Australian wild dog, the dingo, which thrived for thousands of years in one of the most extreme environments on Earth.

Justine M. Philip, Doctor of Philosophy, Ecosystem Management, University of New England

This article was originally published on The Conversation. Read the original article.


Animated History of Poland


