Category Archives: USA

Burial Sites of the US Presidents



The Spanish American War



The USA & UK as Allies



History of Hawaii Before the USA



Japanese-American Internment Camps



The American Founders made sure the president could never suspend Congress



The signing of the U.S. Constitution.
Architect of the Capitol

Eliga Gould, University of New Hampshire

The British monarch has the right to determine when Parliament is in session – or, more to the point, when it is not.

Breaking with longstanding tradition, and possibly with the United Kingdom’s unwritten constitution, new Prime Minister Boris Johnson asked Queen Elizabeth II to suspend, or “prorogue,” the national legislature for five weeks starting on Sept. 9, or shortly after. She agreed.

Freed from having to take pesky questions in the House of Commons, Johnson claims he will be able to concentrate on getting a better deal for Britain as it prepares to leave the European Union on Oct. 31. Many British lawmakers, including some in Johnson’s own party, are furious and fighting back. But if the ploy succeeds, it will be one of the longest parliamentary suspensions since the British last cut off their monarch’s head.

Given the similarities between the U.S. and U.K. political systems and the personal parallels – and affection – between Johnson and U.S. President Donald Trump, Americans might wonder whether the president has a similar power to suspend Congress.

The answer is a very clear no – thanks to the forethought, and strong historical knowledge, of the country’s Founders.

Johnson and Trump have similarities but differences too.
Erin Schaff, The New York Times, Pool

Breaking up, but still learning by example

On July 4, 1776, Congress severed all ties to Britain. The Declaration of Independence included a repudiation of George III, though Americans had initially admired him when he assumed the throne in 1760. They also rejected the monarchical form of government that King George embodied.

Initially admired: George III.
Allan Ramsay/Wikimedia Commons

Compared to other kingdoms in Europe, which were ruled by overbearing monarchs and aristocrats, the British monarchy was not that bad. In fact, the institution contained a number of features that Americans quite liked. One was the system of representative government. King George and his ministers could only enact laws, including laws that taxed the British people, with the consent of Parliament. The House of Commons, the legislature’s lower chamber, was an elective body, chosen in the 18th century by property-owning men – and occasionally property-owning women – in England, Scotland and Wales. Although Britain wasn’t a democracy, it wasn’t an absolute monarchy, and definitely not a dictatorship.

From the earliest days of English settlement, Americans held the legislative part of the British monarchy in high regard. They modeled their own colonial assemblies as far as possible on Parliament, especially the House of Commons. Each colony had a governor and a council, but the most important branch was the representative assembly. Only colonial assemblies could levy taxes, and all other laws required their approval as well.

After independence, the colonies became states. Americans, wrote David Ramsay of South Carolina in 1789, were now a “free people who collectively” had the right to rule themselves. If they were to have government based on “the consent of the governed,” as the Declaration proclaimed, they still needed legislatures, which needed to be as strong as possible. Parliament remained an example worth following.

Rejecting royalty

What Americans did not want was another king. The Founders admitted that even though the British monarchy had failed the colonists, it worked pretty well for the British, with the king’s ministers consulting Parliament on most matters of importance. But they knew that the “constitution” that required them to do so was an unwritten one based primarily in tradition, not legal statutes and documents.

A detail of a portrait of King Charles I, while his head was still attached.
Sir Anthony Van Dyck/Wikimedia Commons

They also knew that just over a century before, a different king, Charles I, had not been so accommodating. In 1629, when Parliament refused his request for taxes, Charles dissolved the legislature and governed as a personal monarch – not for five weeks, but for 11 years.

That didn’t go well for Parliament, the British people or the king. The civil war that ensued ended with Charles’s execution in 1649 on a scaffold outside the Banqueting House in Whitehall, a short walk from what is today Trafalgar Square. The crowd’s gasp as the axe severed his neck was a sound no one ever forgot. The kings and queens who followed him were mindful of it too. When Charles’s son, James II, suspended Parliament again, the British sent him packing and gave the crown to William and Mary.

The lesson, however, was largely a matter of custom. During the 18th century, the king’s ministers knew how to get along with Parliament, but the law did not require them to. British monarchs still had enormous powers, and Parliament usually did what they wanted. Although it was Parliament, not George III, that sparked the American Revolution by taxing the colonists without their consent, Americans placed most of the blame on the king’s ministers, and on the king himself.

Protecting the legislature

When Americans started debating what sort of government they wanted for the United States, they knew they needed an executive with some of the vigor that they associated with a monarchy. What they had in mind, however, was different from the British crown. The monarch, as Alexander Hamilton wrote in the “Federalist” essays, was a “perpetual magistrate,” who had powers that were limited only by whatever rules he or she chose to observe.

The newly created role of U.S. president, by contrast, had clearly defined powers under the Constitution, as did Congress. Crucially, the power to summon or dismiss Congress belonged to the House of Representatives and the Senate, which together decided when to convene and when to adjourn. The position of president, in other words, was intentionally designed without the authority to reproduce the 11-year tyranny of King Charles – or the five-week suspension of Queen Elizabeth II and her current prime minister.

Eliga Gould, Professor of History, University of New Hampshire

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Slavery Begins in America



Who won the war? We did, says everyone



Winston Churchill, Joseph Stalin and Franklin D Roosevelt at the Yalta Conference, 1945.
Wikipedia

Nick Chater, Warwick Business School, University of Warwick

Ask any of the few remaining World War II veterans what they did during the war and you’re likely to get a humble answer. But ask the person on the street how important their country’s contribution to the war effort was and you’ll probably hear something far less modest. A new study suggests people from Germany, Russia, the UK and the US on average all think their own country shouldered more than half the burden of fighting World War II.

Our national collective memories seem to be deceiving us, and this is part of a far more general pattern. Aside from those veterans who have no desire to revel in the horrors of war, we may have a general psychological tendency to believe our contributions are more significant than they really are.

You can see this in even the most mundane of tasks. Unloading the dishwasher can be a perennial source of family irritation. I suspect that I’m doing more than my fair share. The trouble is that so does everybody else. Each of us can think: “The sheer injustice! I’m overworked and under-appreciated.”

But we can’t all be right. This strange magnification of our own efforts seems to be ubiquitous. In business, sport or entertainment, it’s all too easy for each participant to think that their own special stardust is the real reason their company, team or show was a hit.

It works for nations, too. A study last year, led by US memory researcher Henry Roediger III, asked people from 35 countries for the percentage contribution their own nation has made to world history. A dispassionate judge would, of course, assign percentages that add up to no more than 100% (and, indeed, considerably less, given the 160 or so countries left out). In fact, the self-rating percentages add up to over 1,000%, with citizens of India, Russia and the UK each suspecting on average that their own nations had more than half the responsibility for world progress.

A sceptic might note that “contributing to world history” is a rather nebulous idea, which each nation can interpret to its advantage. (The Italians, at 40%, might focus on the Romans and the Renaissance, for example.) But what about our responsibility for specific world events? The latest study from Roediger’s lab addresses the question of national contributions to World War II.

The researchers surveyed people from eight former Allied countries (Australia, Canada, China, France, New Zealand, Russia/USSR, the UK and the US) and three former Axis powers (Germany, Italy and Japan). As might be expected, people from the winning Allied side ranked their own countries highly, and the average percentage responses added up to 309%. Citizens of the UK, US and Russia all believed their countries had contributed more than 50% of the war effort and were more than 50% responsible for victory.

World War II deaths by country. How would you work out which country contributed the most?
Dna-Dennis/Wikimedia Commons

You might suspect that the losing Axis powers, whose historical record is inextricably tied to the immeasurable human suffering of the war, might not be so proud. As former US president John F Kennedy said (echoing the Roman historian Tacitus): “Victory has a hundred fathers and defeat is an orphan.” Perhaps the results for the Allied countries just reflect a general human tendency to claim credit for positive achievements. Yet citizens of the three Axis powers also over-claim shares of the war effort (totalling 140%). Rather than minimising their own contribution, even defeated nations seem to overstate their role.

Why? The simplest explanation is that we piece together answers to questions, of whatever kind, by weaving together whatever relevant snippets of information we can bring to mind. And the snippets of information that come to mind will depend on the information we’ve been exposed to through our education and cultural environment. Citizens of each nation learn a lot more about their country’s own war effort than those of other countries. These “home nation” memories spring to mind, and a biased evaluation is the inevitable result.

So there may not be inherent “psychological nationalism” in play here. And nothing special about collective, rather than individual, memory either. We simply improvise answers, perhaps as honestly as possible, based on what our memory provides – and our memory, inevitably, magnifies our own (or our nation’s) efforts.

How do you calculate real responsibility?

A note of caution is in order. Assigning responsibilities for past events baffles not just everyday citizens, but academic philosophers. Imagine a whodunit in which two hopeful murderers put lethal doses of cyanide into Lady Fotherington’s coffee. Each might say: “It’s not my fault – she would have died anyway.” Is each only “half” to blame, and hence due a reduced sentence? Or are they both 100% culpable? This poisoning is a simple matter compared with the tangled causes of military victory and defeat. So it is not entirely clear what even counts as over- or under-estimating our responsibilities because responsibilities are so difficult to assess.

Still, the tendency to overplay our own and our nation’s role in just about anything seems all too plausible. We see history through a magnifying glass that is pointing directly at ourselves. We learn the most about the story of our own nation. So our home nation’s efforts and contributions inevitably spring readily to mind (military and civilian deaths, key battles, advances in technology and so on). The efforts and contributions of other nations are sensed more dimly, and often not at all.

And the magnifying glass over our efforts is pervasive in daily life. I can find myself thinking irritably, as I unload the dishwasher, “Well, I don’t even remember the last time you did this!” But of course not. Not because you didn’t do it, but because I wasn’t there.

Nick Chater, Professor of Behavioural Science, Warwick Business School, University of Warwick

This article is republished from The Conversation under a Creative Commons license. Read the original article.


How the 1869 Cincinnati Red Stockings turned baseball into a national sensation



A drawing from Harper’s Weekly depicts a game between the Red Stockings and the Brooklyn Atlantics.
New York Public Library

Robert Wyss, University of Connecticut

This Major League Baseball season, fans may notice a patch on the players’ uniforms that reads “MLB 150.”

The logo commemorates the Cincinnati Red Stockings, who, in 1869, became the first professional baseball team – and went on to win an unprecedented 81 straight games.

As the game’s first openly salaried club, the Red Stockings made professionalism – which had previously been frowned upon – acceptable to the American public.

But the winning streak was just as pivotal.

“This did not just make the city famous,” John Thorn, Major League Baseball’s official historian, said in an interview for this article. “It made baseball famous.”

Pay to play?

In the years after the Civil War, baseball’s popularity exploded, and thousands of American communities fielded teams. Initially most players were gentry – lawyers, bankers and merchants whose wealth allowed them to train and play as a hobby. The National Association of Base Ball Players banned the practice of paying players.

At the time, the concept of amateurism was especially popular among fans. Inspired by classical ideas of sportsmanship, its proponents argued that playing sport for a reason other than for the love of the game was immoral, even corrupt.

Nonetheless, some of the major clubs in the East and Midwest began disregarding the rule prohibiting professionalism and secretly hired talented young working-class players to get an edge.

After the 1868 season, the national association reversed its position and sanctioned the practice of paying players. The move recognized the reality that some players were already getting paid, and that was unlikely to change because professionals clearly helped teams win.

Yet the taint of professionalism restrained virtually every club from paying an entire roster of players.

The Cincinnati Red Stockings, however, became the exception.

The Cincinnati experiment

In the years after the Civil War, Cincinnati was a young, growing, grimy city.

The city had experienced an influx of German and Irish immigrants who toiled in the multiplying slaughterhouses. The stench of hog flesh wafted through the streets, while the black fumes of steamboats, locomotives and factories lingered over the skyline.

Nonetheless, money was pouring into the coffers of the city’s gentry. And with prosperity, the city sought respectability; it wanted to be as significant as the big cities that ran along the Atlantic seaboard – New York, Philadelphia and Baltimore.

Men slaughter hogs on an assembly line in a Cincinnati meat packing plant.
Library of Congress

Cincinnati’s main club, the Red Stockings, was run by an ambitious young lawyer named Aaron Champion. Prior to the 1869 season, he budgeted US$10,000 for his payroll and hired Harry Wright to captain and manage the squad. Wright was lauded later in his career as a “baseball Edison” for his ability to find talent. But the best player on the team was his 22-year-old brother, George, who played shortstop. George Wright would end up finishing the 1869 season with a .633 batting average and 49 home runs.

Only one player hailed from Cincinnati; the rest had been recruited from other teams around the nation. Wright had hoped to attract the top player in the country for each position. He didn’t quite get the best of the best, but the team was loaded with stars.

As the season began, the Red Stockings and their new salaries attracted little press attention.

“The benefits of professionalism were not immediately recognized,” Greg Rhodes, a co-author of “Baseball Revolutionaries: How the 1869 Red Stockings Rocked the Country and Made Baseball Famous,” told me. “So the Cincinnati experiment wasn’t seen as all that radical.”

The Red Stockings opened the season by winning 45 to 9. They kept winning and winning and winning – huge blowouts.

At first only the Cincinnati sports writers had caught on that something special was going on. Then, in June, the team took its first road trip east. Playing in hostile territory against what were considered the best teams in baseball, they were also performing before the most influential sports writers.

The pivotal victory was a tight 4-to-2 win against what had been considered by many the best team in baseball, the powerful New York Mutuals, in a game played with Tammany Hall “boss” William Tweed watching from the stands.

Now the national press was paying attention. The Red Stockings continued to win, and, by the conclusion of the road trip in Washington, they were puffing stogies at the White House with their host, President Ulysses Grant.

The players chugged home in a boozy, satisfied revel and were met by 4,000 joyous fans at Cincinnati’s Union Station.

American idols

The Red Stockings had become a sensation. They were profiled in magazines and serenaded in sheet music. Ticket prices doubled to 50 cents. They drew such huge crowds that during a game played outside of Chicago, an overloaded bleacher collapsed.

Aaron Champion’s squad averaged 42 runs a game in the 1869 season.
From the collection of Greg Rhodes, Author provided

Most scores were ridiculously lopsided; during the 1869 season the team averaged 42 runs a game. Once they even scored 103. The most controversial contest was in August against the Haymakers of Troy, New York. The game was rife with rumors of $17,000 bets, and bookmakers bribing umpires and players. The game ended suspiciously at 17 to 17, when the Haymakers left the field in the sixth inning, incensed by an umpire’s call. The Red Stockings were declared the winners.

The season climaxed with a road trip west on the new transcontinental railroad, which had just opened in May. The players, armed with rifles, shot out of train windows at bison, antelope and even prairie dogs, and slept in wooden Pullman cars lighted with whale oil. More than 2,000 excited baseball fans greeted the team in San Francisco, where admission to games was one dollar in gold.

Cincinnati ended its season with an undefeated record: 57 wins, 0 losses. The nation’s most prominent sports writer of the day, Henry Chadwick, declared them “champion club of the United States.”

Despite fears that other clubs would outbid Cincinnati for their players, every Red Stockings player demonstrated his loyalty by signing a contract to return for the 1870 season.

The demise begins

The winning streak continued into the next season – up until a June 14, 1870, game against the Brooklyn Atlantics.

An error by second baseman Charles Sweasy ended the Red Stockings’ historic streak.
From the collection of John Thorn, Author provided

After nine innings, the teams were tied at 5. Under the era’s rules, the game could have been declared a draw, leaving the streak intact. Instead Harry Wright opted to continue, and the Red Stockings ended up losing in extra innings after an error by the second baseman, Charlie Sweasy.

The 81-game win streak had ended.

The Red Stockings did not return in 1871. Ticket sales had fallen after their first loss, and other teams began to outbid the Red Stockings for their star players. Ultimately the cost of retaining all of its players was more than the Cincinnati club could afford.

Yet the team had made its mark.

“It made baseball go from something of a provincial affair to a national game,” Thorn explained.

A few years later, in 1876, the National League was founded and still exists today. The Cincinnati Reds were a charter member. And not surprisingly, some of the biggest 150-year celebrations of the first professional baseball team are occurring in the town they once called Porkopolis.

Robert Wyss, Professor of Journalism, University of Connecticut

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A brief history of presidential lethargy



A television set turned on in the West Wing of the White House.
AP Photo/Susan Walsh

Stacy A. Cordery, Iowa State University

No one doubts the job of president of the United States is stressful and demanding. The chief executive deserves downtime.

But how much is enough, and when is it too much?

These questions came into focus after Axios’ release of President Donald Trump’s schedule. The hours blocked off for nebulous “executive time” seem, to many critics, disproportionate to the number of scheduled working hours.

While Trump’s workdays may ultimately prove to be shorter than those of past presidents, he’s not the first to face criticism. For every president praised for his work ethic, there’s one disparaged for sleeping on the job.

Teddy Roosevelt, locomotive president

Before Theodore Roosevelt ascended to the presidency in 1901, the question of how hard a president toiled was of little concern to Americans.

Except in times of national crisis, his predecessors neither labored under the same expectations, nor faced the same level of popular scrutiny. Since the country’s founding, Congress had been the main engine for identifying national problems and outlining legislative solutions. Congressmen were generally more accessible to journalists than the president was.

Teddy Roosevelt’s activist approach to governing shifted the public’s expectations for the president.
Library of Congress

But when Roosevelt shifted the balance of power from Congress to the White House, he created the expectation that an activist president, consumed by affairs of state, would work endlessly in the best interests of the people.

Roosevelt, whom Sen. Joseph Foraker called a “steam engine in trousers,” personified the hard-working chief executive. He filled his days with official functions and unofficial gatherings. He imposed his personality on policy and stamped the presidency firmly on the nation’s consciousness.

Taft had a tough act to follow

His successor, William Howard Taft, suffered by comparison. While it’s fair to observe that nearly anyone would have looked like a slacker compared with Roosevelt, it didn’t help that Taft weighed 300 pounds, which his contemporaries equated with laziness.

Taft’s girth only added to the perception that he lacked Roosevelt’s vigor.
Library of Congress

Taft helped neither his cause nor his image when he snored through meetings, at evening entertainments and, as author Jeffrey Rosen noted, “even while standing at public events.” Watching Taft’s eyelids close, Sen. James Watson said to him, “Mr. President, you are the largest audience I ever put entirely to sleep.”

An early biographer called Taft “slow-moving, easy-going if not lazy” with “a placid nature.” Others have suggested that Taft’s obesity caused sleep apnea and daytime drowsiness, a finding not inconsistent with historian Lewis L. Gould’s conclusion that Taft was capable of work “at an intense pace” and “a high rate of efficiency.”

It seems that Taft could work quickly, but in short bursts.

Coolidge the snoozer

Other presidents were more intentional about their daytime sleeping. Calvin Coolidge’s penchant for hourlong naps after lunch earned him amused scorn from contemporaries. But when he missed his nap, he fell asleep at afternoon meetings. He even napped on vacation. Tourists stared in amazement as the president, blissfully unaware, swayed in a hammock on his front porch in Vermont.

This, for many Republicans, wasn’t a problem: The Republican Party of the 1920s was averse to an activist federal government, so the fact that Coolidge wasn’t seen as a hard-charging, incessantly busy president was fine.

Biographer Amity Shlaes wrote that “Coolidge made a virtue of inaction” while simultaneously exhibiting “a ferocious discipline in work.” Political scientist Robert Gilbert argued that after Coolidge’s son died during his first year as president, Coolidge’s “affinity for sleep became more extreme.” Grief, according to Gilbert, explained his growing penchant for slumbering, which expanded into a pre-lunch nap, a two- to four-hour post-lunch snooze and 11 hours of shut-eye nightly.

For Reagan, the jury’s out

Ronald Reagan may have had a tendency to nod off.

“I have left orders to be awakened at any time in case of a national emergency – even if I’m in a cabinet meeting,” he joked. Word got out that he napped daily, and historian Michael Schaller wrote in 1994 that Reagan’s staff “released a false daily schedule that showed him working long hours,” labeling his afternoon nap “personal staff time.” But some family members denied that he napped in the White House.

Journalists were divided. Some found him “lazy, passive, stupid or even senile” and “intellectually lazy … without a constant curiosity,” while others claimed he was “a hard worker,” who put in long days and worked over lunch. Perhaps age played a role in Reagan’s naps – if they happened at all.

Clinton crams in the hours

One president not prone to napping was Bill Clinton. Frustrated that he could not find time to think, Clinton ordered a formal study of how he spent his days. His ideal was four hours in the afternoon “to talk to people, to read, to do whatever.” Sometimes he got half that much.

Two years later, a second study found that, during Clinton’s 50-hour workweek, “regularly scheduled meetings” took up 29 percent of his time, “public events, etc.” made up 36 percent of his workday, while “thinking time – phone & office work” constituted 35 percent of his day. Unlike presidents whose somnolence drew sneers, Clinton was disparaged for working too much and driving his staff to exhaustion with all-nighters.

Partisanship at the heart of criticism?

The work of being president of the United States never ends. There is always more to be done. Personal time may be a myth, as whatever the president reads, watches or does can almost certainly be applied to some aspect of the job.

Trump’s “executive time” could be a rational response to the demands of the job or life circumstances. Trump, for example, seems to get only four or five hours of sleep a night, suggesting he has more waking hours to tackle his daily duties than the rest of us.

But, as with his predecessors, the appearance of taking time away from running the country will garner criticism. Though they can sometimes catch 40 winks, presidents can seldom catch a break.

Stacy A. Cordery, Professor of History, Iowa State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

