Why Televised Hearings Mattered During Watergate But May Not Today

John Dean

 

I started a continuing legal education program with John Dean in 2011. We have done more than 150 programs across the nation since then. 

 

Our first program was about obstruction of justice and how Dean, as Nixon’s White House Counsel, navigated the stormy waters when he turned on the president and became history’s most important whistleblower. Unlike the current whistleblower, Dean had been involved in the cover-up, but ultimately decided he had to end the criminal activity in the White House, with no assurance of anonymity and with the almost certain expectation that he was blowing himself up in the process.

 

Dean was placed in the witness protection program but became one of the most recognized figures of his time. When he testified before the Senate Select Committee for the entire third week of June 1973, all three networks carried his testimony, gavel-to-gavel. John Lennon and Yoko Ono showed up to watch Dean. Almost all of America tuned in, and some reports estimated there were 80 million viewers. In our history we have had only a handful of such mega-TV events: the Kennedy assassination weekend, the Apollo landing on the moon, and the attacks of 9/11 come to mind.

 

Yet it wasn’t until a year after Dean testified that Nixon resigned. The process of Nixon’s takedown was slow but steady, a phenomenon that many historians refer to as a “drip, drip, drip.” Nixon’s credibility kept absorbing one body blow after another: first the Senate hearings, then the discovery of the taping system, the firing of Special Prosecutor Archibald Cox, the discovery of a gap in the tapes, the indictment of top Administration officials, and finally, the Supreme Court ruling in late July 1974 that the tapes had to be turned over.

 

One tape dealt the knockout blow. It became known as the “smoking gun tape,” because it showed the president a week after the break-in, on June 23, 1972, ordering his chief of staff to call in the CIA and instruct them to tell the FBI to end its investigation into Watergate, as it might uncover CIA operations. This both killed Nixon’s lie that he knew nothing of the cover-up and displayed the kind of abuse of presidential power that the founders worried about when they agreed to insert an impeachment outlet in Article II of the Constitution. The president was seeking to use his office for his own personal political protection.

 

In the current situation, the “smoking gun” has already been produced in the form of the partial transcript of President Trump’s July conversation with President Zelensky. Despite what Trump says, it shows him bargaining for political dirt on his adversary through the misuse of his presidential powers.

 

The whistleblower today has been backed up by others who had direct knowledge, making his or her account now superfluous. John Dean had no such back-up from others; he had to wait a year for his testimony to be fully corroborated by the tapes themselves.

 

Because the nation also had to wait almost a year to determine who to believe, there was time for Dean’s testimony in June 1973 to take hold and sink in. Nixon had won reelection by a landslide in November 1972 and his approval rating was nearing 70% in January 1973, when he kept his promise to end America’s involvement in the Vietnam War. But with the burglars’ trial before Judge John Sirica in January 1973 and the unanimous Senate vote to investigate campaign activities from the 1972 election on February 7, 1973, things began to spiral.

 

In April 1973 when Nixon fired his top advisors and attorney general, Dean was also let go. Nixon’s approval rating fell to 48%. Then John Dean testified in June and Nixon’s popularity fell as low as 31% by August. Nonetheless, despite the widespread opinion that Nixon was somehow culpable in the break-in, only 26% thought he should be impeached and forced to resign; 61% did not. The calls for impeachment and removal didn’t reach a clear majority until the “smoking gun tape” was produced in late July 1974. Then the number rose to 57%. By that time Nixon’s approval rating had sunk to 24%.

 

What this tells us is that the televised hearings started the path downward in June 1973, though it took a year of continuing scandal to wipe out Nixon’s support. Importantly, Dean’s testimony that he thought he might have been taped in one instance led Senate investigators to Alexander Butterfield and the revelation that the taping system existed. It was the fight for the tapes that confounded Nixon and his defenders and ultimately led to his resignation.

 

The current impeachment inquiry will be Watergate in reverse. The whistleblower has already surfaced the “smoking gun” transcript. Others have already verified the whistleblower’s allegations. The televised hearings, while they will add to public understanding of what happened, are coming after years of scandal and revelations about Mr. Trump. Between the expected impeachment by the House at year-end and the trial in the Senate early next winter, there will be little time to let the pot boil as it did in Watergate. And it is doubtful, given the depth of the partisan divide and media bias and agendas, that the polls on impeachment and removal will shift a great deal with the televised hearings.

 

The public knows what Trump did. They generally know it was wrong. Trump supporters simply don’t care. Even with the attempt to solicit foreign interference in our elections all but confirmed, the President’s followers are unmoved.

 

Perhaps I will be proved wrong, but I think the televised hearings will not significantly move the needle of public opinion as happened during Watergate. The House will impeach. The Senate will not convict. And those Republicans who stand behind the President will have to answer to voters when their time comes. So will President Trump.

Muslims Should Reject Indian Supreme Court's Land Offer

India has just set an example of how a nation can retreat into darkness, with its Supreme Court ruling on Saturday that a Hindu temple be built to honor the fictitious god Ram on the disputed site of a historic 400-year-old Muslim mosque, named after the first Mughal emperor.

 

The top court agreed with Hindus that a structure existed under the mosque, i.e., the Babri Masjid was not built on vacant land. It decided to give Muslims land elsewhere to construct a mosque, a consolation prize that seems akin to giving the United States back to the Indians and resettling the Americans in Hawaii. 

 

Indian Muslims should reject the land offer and not build any mosque at all in protest. To defy Hindu extremists, they should start holding weekly Friday congregations under open skies whenever possible.

 

They also should forcefully assert their equal rights under India's constitution, but must shun violence. Mahatma Gandhi's non-violence creed is the most powerful weapon civilians ever invented. 

 

The verdict capped years of legal wrangling, which reached India's highest court in 2011 when Hindus appealed a lower court order that the site be shared by the two religions. In 2018, a three-judge panel declined to send the case to a constitution bench, prompting the chief justice to form a body to decide who owns the property.

 

The five-member bench, headed by Chief Justice Ranjan Gogoi, ordered the government on Saturday to set up within three months a trust to manage the site on behalf of Ram Lalla Virajman, the child deity worshiped by Hindus at the site in Ayodhya, Uttar Pradesh.

 

Now the Hindu ultra-nationalist government will form a board to erect the Ram temple where the Babri Masjid once stood. This action ends India's secular tradition by patronizing one religion at the expense of another. India, a nation of 1.3 billion people with 200 million Muslims, is bound by its constitution to treat all religions equally.

 

The verdict is faulty because it ordered the government to construct a temple, even though the court admitted there is no evidence that one ever existed at the site. The court overstepped its jurisdiction, too; its task is to decide ownership. What will be done with the property is no business of the court — that is for the owner to decide.

 

Hindus, including Prime Minister Narendra Modi's supporters, claim that Emperor Babur, whose dynasty ruled India from 1526 to 1857, built the mosque on top of a temple at the birthplace of Ram, a character in the Ramayana, an epic written by the sage Valmiki around 200 BCE. An archaeological survey found no evidence of this, only that an unspecified structure existed before the mosque. 

 

Court Errs on Evidence

In deciding the case in favor of Hindus, the justices accepted the infant Ram as a perpetual minor under law, an ill-conceived legal doctrine without precedent outside India. It is beyond a modern man's imagination how learned jurists of an ancient civilization can accord legal status to a fictitious deity. Only the dark mind of India seems to be at work.

 

This judgment has irreparably damaged the Supreme Court and undermined minority confidence in the judiciary. The court made a mistake by deciding the case based on whether Hindus worshiped at the site, which has been used as a mosque since 1528. It ignored the main instrument of ownership — the title to the land. 

 

The court said “there is clear evidence to indicate that the worship by the Hindus in the outer courtyard continued unimpeded in spite of the setting up of a grill-brick wall in 1857,” which means the Hindus were in possession of a part of the land and hence have a valid ownership claim. 

 

By this logic, Muslims in the United States can rightfully claim ownership of many churches, where they held weekly Friday prayers. No one in the United States can claim ownership of a house just by virtue of using it for some time.

 

The court contradicted itself in saying that titles can’t be decided on faith when it actually ruled in favor of Hindus because of the evidence that Hindus continuously worshiped at the site for a long period. 

 

The pendulum of justice cannot swing according to political wisdom or pragmatism. Justice upholds legal principles. The court should have given the land to its valid title holder. Even better would have been for the apex court to stay above the fray and send the case back to the lower court to settle the ownership dispute.

 

Like the U.S. Supreme Court, the Supreme Court of India should be minimally interventionist, hearing only the cases that involve serious constitutional issues. Indian justices decide as many as 700 cases a year, against 70 by their U.S. peers.

 

The unanimous landmark verdict is pure politics, and it has voided respect for history, which India's founding leaders treasured as mythical cobras protect their pots of gold. This ruling relegates India's Muslims to perpetual discrimination based on religion, an idea Hindu ultra-nationalist guru V. D. Savarkar espoused a century ago.

 

Secularism Fails in India

Muslims prayed at the mosque for generations until 1949, when Hindu activists placed idols of Ram inside the complex. The mosque was demolished in 1992 by Hindu mobs, triggering nationwide religious violence that killed about 2,000 people, most of them Muslims.

 

This destruction of the mosque highlighted the failure of secularism in India, and divided the country along religious lines, giving politicians an opportunity to appeal to base instincts of Hindu masses to win elections.

 

Modi's political organization, the Bharatiya Janata Party, which wishes to create a Muslim-free India, vowed decades ago to build the temple at Ayodhya. Behind this movement has been the Vishwa Hindu Parishad, a militant umbrella group with a tax-exempt affiliate in the United States, which works to reconvert Muslims and Christians to Hinduism.

 

The verdict gave a major victory to the 69-year-old Modi, who was blamed for the Muslim massacre in 2002 and has been under fire since August for scrapping special autonomy for Muslim-majority Kashmir, a picturesque Himalayan region.

 

Since Modi came to power in 2014, India has passed laws against Muslims, and several states are planning to deny government jobs to people with more than two children. The Muslim birth rate exceeds the Hindu rate, and India has a phobia that Muslims will outnumber Hindus and reestablish the Muslim empire. Muslim history has been removed from school textbooks. Modi is considering a law to grant refugee status to everybody but Muslims.

 

With the verdict, India is now a Hindu nation, which means that Muslims are free to leave if they so choose, or stay as subservient to Hindus. This is how Hindus want to avenge their humiliation under Muslim rule for a thousand years.

 

The court decision will have a chilling effect on Muslims and intensify simmering Hindu-Muslim tensions inside India and beyond. Moderate Muslims in Bangladesh and Pakistan, for example, will struggle to cite India as a model for fighting extremism. Muslims will instead point to the extremism of Buddhists and Hindus, as well as Christians, as an existential threat to their religion and identity.

The Whistleblower Should Remain Anonymous

Frank Wills

 

On the night of June 17, 1972, security guard Frank Wills noticed a piece of duct tape on a door in the Watergate complex in Washington, D.C. The tape was probably placed there by tenants moving in or out earlier in the day. Wills removed the tape and resumed his rounds. Half an hour later he returned and saw a new piece of tape holding open the latch. Wills walked to a telephone in the lobby and called the police. He accompanied the police as they searched room by room for the intruders until they found five burglars in the offices of the Democratic National Committee. 

 

Without Wills, the Watergate break-in might never have been discovered, and the scandal that brought down President Richard Nixon two years later might never have happened. His simple act of noticing the tape changed the course of American history. The 24-year-old native of Savannah, Georgia became a celebrity, and to many a hero, almost overnight. Wills played himself in the film version of “All the President’s Men.” He was interviewed on talk shows and the national news and featured in newspapers and magazines. 

 

For a time Frank Wills was a household name, yet the spotlight eventually went out, leaving Wills to wrestle with the aftereffects of what for him was a toxic celebrity. When the cameras went away Wills drifted from job to job, moved often, and failed at selling Dick Gregory’s fad diet products. He finally settled in South Carolina to take care of his mother after she suffered a stroke. The former security guard was twice arrested for shoplifting small items. In his occasional reappearances in the media, he expressed some bitterness that he did not receive more credit for uncovering Watergate. Frank Wills died of a brain tumor in 2000 at age 52. 

 

Forty-seven years after Wills noticed the tape, a whistleblower has exposed a far bigger scandal. He or she reported that White House officials attempted to extort Ukraine by withholding congressionally approved military aid in exchange for a public investigation into the imaginary corruption of a political rival. The whistleblower reported this to the proper authorities under the law created to encourage and protect such reporting. Lawmakers enacted these protections to ensure that employees do not have to sacrifice their careers to do the right thing. 

  

Frank Wills and the whistleblower exposed the highest level of crimes committed against the American people in very different ways, Wills inadvertently and the whistleblower deliberately. Without them, neither scandal would probably ever have come to light. Wills’ involvement ended with the phone call to the police, and the whistleblower’s involvement ended with the report to his or her superiors. There was never a need to call Wills as a witness in the Watergate or impeachment proceedings. There is now no need to call the whistleblower to Congress after the allegations of the report have been repeatedly verified by witnesses, the White House’s own documents, and public statements from the president and the chief of staff. 

 

The whistleblower can and should take a quiet, private pride in their actions and continue in their career of public service unmolested. But, of course, it is never that simple under the current administration. Rather than defend itself against the obvious crimes it committed, the administration has attacked the whistleblower as a liar, a political opponent, a traitor, and whatever else pops into its head or its tweets. The administration is trying to expose the identity of the whistleblower, an act as unnecessary as it is illegal. It would destroy the whistleblower’s life as intentionally as Wills’ life was destroyed carelessly. What happened to Frank Wills could happen to the whistleblower. It is imperative that his or her identity be forever protected. We do not need to destroy his or her life with the double-edged sword of celebrity.

 

HNN Hot Topics: Veterans Day

Click HERE for our most recent articles on Veterans Day. Click HERE for our articles on World War I.

Veterans Day, Ninety-Five Years On by Adam Hochschild and Joe Sacco

The enduring folly of the Battle of the Somme.

NOVEMBER 11, 2013

Veterans Day in Ireland by Jason R. Myers

For one thing, it's not Veterans Day, it's Remembrance Day. For another, it's not an official holiday, even though some 200,000 Irishmen fought in World War I.

NOVEMBER 11, 2013

Prepare to Welcome Our Troops Home from Afghanistan by Vaughn Davis Bornet

America's longest war will soon be over.

NOVEMBER 11, 2013

This Veterans Day, Beware the Dangers of Robot War by William Astore

This Veterans Day, we need to turn away from the false promise of robot weaponry.

NOVEMBER 12, 2012

Veterans Day is a Time for Love for One's Country by Vaughn Davis Bornet

What can be said on Veterans Day 2011 that has not been said repeatedly over our years of remembering war and that final peace?

NOVEMBER 11, 2011

This Veterans Day, Let's Reflect on the D.C. War Memorial by Jeffrey S. Reznick

We should celebrate the newly-restored District of Columbia War Memorial.

NOVEMBER 7, 2011

Remembering Generosity and Commitment this Veterans Day by William Astore

Let's remember that America's veterans have often exhibited remarkable generosity of spirit and awe-inspiring levels of commitment.

NOVEMBER 11, 2010

Honoring Indian Veterans This Veterans Day by Ed Hooper

More than 44,000 Indians served in World War II.

NOVEMBER 7, 2010

Keeping Veterans Day Alive by Ed Hooper

Veterans Day celebrations are in retrenchment all over the country.

NOVEMBER 1, 2009

This Veterans Day Let's Hear from the Troops Themselves by Robert E. Bonner

This year's Veterans Day comes in the wake of fierce political campaigning over which policies best serve the interest of U.S. soldiers.

NOVEMBER 10, 2006

The History Behind a Recently Defaced French Holocaust Memorial

 

On 21 October a Holocaust memorial in the French city of Lyon was vandalized. The memorial plaque at 12 rue Sainte-Catherine in the central part of town contained the names of the 86 Jews arrested at that address on 9 February 1943. It was the largest single roundup of Jews in the city. Most of those arrested were subsequently murdered in Auschwitz and Sobibor. On the plaque, black paint was used to cross out their names. 

 

Sadly, this is only the latest such act of defacement in Europe and the US this year. Still, there is something especially ironic about this instance. The victims of this roundup had already been effaced, in a sense--forgotten after the war, then manipulated during the 1987 trial of Klaus Barbie, the Gestapo chief whose signature was on the Nazi records concerning the roundup, and finally ignored by the passage of time as it changed the once picturesque alleyway in central Lyon.

 

The Germans occupied Lyon with the rest of southern France in November 1942, but as of February 1943, the Gestapo in Lyon had arrested very few Jews. The German police were understaffed and underinformed, and they had to contend with French resistance networks in Lyon, which formed a critical hub of underground activity. Jews, meanwhile, were hard to find. Most (even those who had arrived from Eastern Europe) spoke French, and the trade in false identification papers became more vigorous with time. As one Gestapo officer put it, France was “an accursed country” because “you cannot tell a Jew from a non-Jew.” 

 

12 rue Sainte-Catherine housed the Lyon headquarters of the Union générale des israélites de France (UGIF), the umbrella Jewish organization created at the behest of the German and French authorities. It continued the social welfare activities of prewar Jewish charitable organizations. 9 February 1943 was a distribution day. Jews arrived seeking everything from ration cards to medicines to advice on reaching and crossing the Swiss border. They were easy targets. Gestapo agents with military escort entered the UGIF offices and simply waited for Jews to show up. Eighty-six Jews were arrested. Others were warned away by the few who escaped the trap. Klaus Barbie reported to his superiors that he had bagged a critical resistance network. It was an early version of Barbie’s later tall tales of counter-intelligence expertise. But though the Gestapo fortuitously arrested a handful of young men and women involved in the Zionist underground, Barbie knew nothing of them or their network. None were even interrogated before their deportation.

 

The roundup was largely forgotten afterwards. Most of the arrestees were gassed, the few who escaped the roundup hid successfully, and Gestapo crimes in Lyon and its environs over the next year and a half became bloodier, ranging from torture to summary shootings to mass reprisal killings. With the liberation of France in August 1944, it was these most recent atrocities, locally carried out and largely against resisters, that were adjudicated. The French military trial of Barbie in 1952, in which Barbie was tried in absentia, centered on the bloody April 1944 German counter-insurgency campaign in the Jura region. A subsequent trial of Gestapo officials in 1954, in which Barbie was again tried in absentia, covered a range of crimes including the mass shooting of resisters in Saint-Genis-Laval and the mass killings at Bron airfield just before the German retreat. There was no mention of the rue Sainte-Catherine roundup. 

 

Barbie, meanwhile, forged a new life for himself and his family in Bolivia, where the US Army Counter-Intelligence Corps had sent him in 1951 following four years of surely substandard work for the Americans. He was not discovered until 1972, not deported to France until 1983, and not tried in Lyon until 1987. “New” crimes now had to be found. Barbie was to be tried not for ordinary war crimes, but for crimes against humanity, which the French government made imprescriptible in 1964. Moreover, crimes already adjudicated in the 1950s could not be retried. Barbie had indeed committed several crimes that fit the legalities of the 1987 trial, the details having been researched by Nazi hunters Serge and Beate Klarsfeld in the Gestapo records collected and held by the Centre de documentation juive contemporaine (CDJC) in Paris. The roundup at rue Sainte-Catherine was one of these crimes. 

 

The matter was open and shut. Relevant Gestapo records with Barbie’s signature were submitted into evidence, and early in the trial, escapees and the lone living survivor from the roundup testified as to how the Gestapo arrested them or their loved ones. But a cruel wrinkle emerged by trial’s end. Barbie’s defense attorney was Jacques Vergès, a controversial and outspoken celebrity attorney known for anti-colonial activism. His clients in the 1950s included members of Algeria’s National Liberation Front standing before French military courts. In the 1960s, Vergès’s anti-colonialism developed into a more pronounced anti-Zionism and anti-Semitism. He spun conspiracy theories concerning the Rothschilds’ supposed behind-the-scenes machinations before the Six-Day War, and he wrote polemics defending Palestinian terrorists who, he said, were stigmatized by imperialists backed by Jewish money. Now Vergès argued that the Barbie trial was a political setup by Zionists who aimed to obscure what he called similar Israeli crimes in the Middle East. The Zionists, Vergès insisted to one French magazine, “are always holding this [type of] trial. They rehash them so as always to appear as victims.” The Algerian press concurred that the Holocaust was “the Jewish Olympic flame which maintains global financial power imposed by the media.” 

 

In his three-day summation at the end of Barbie’s trial, Vergès wove an elaborate theory. Why, he asked, was the rue Sainte-Catherine roundup only raised “out of the mud” in the 1980s? The answer, he said, was so that the Jews could manipulate its history. Barbie had reported in February 1943 that he had arrested resisters, and indeed, Vergès falsely argued, East European Jews at the rue Sainte-Catherine on 9 February had links to the Allies through Switzerland. Barbie had thus engaged in legitimate military counter-espionage. But the roundup, Vergès professed to reveal, was actually occasioned by the UGIF itself. The Union was “a collaborationist body” of bourgeois French Jews who, like Vichy itself, viewed the deportation of poor working-class Jews as politically desirable. “It was the directors of the UGIF themselves,” Vergès said, “who lured the families to the headquarters … under the pretext of providing aid.” The court exhibits, Vergès continued, distorted the true picture. Records submitted into evidence were handpicked at the CDJC by his nemesis Serge Klarsfeld, while the UGIF records that revealed the “truth” lay hidden in the same Jewish archive, which, he said, secretly maintained “a quasi-monopoly of information.” “Only the UGIF records, in possession of the CDJC, would have been able to get at the bottom [of] this roundup…. The archives at least partially place the responsibility on Jewish notables in the deportation of their brothers. Must we cover this responsibility or transfer it to Lieutenant Barbie? The question … looms in the conscience.”

 

In fact, the UGIF records at the CDJC had been open to researchers for many years. Vergès plagiarized much of his summation from journalist Maurice Rajfus’s deeply flawed book on the UGIF, published nearly a decade earlier, based on some of the long-available UGIF files. The French dailies did not doubt Barbie’s fundamental guilt. Still, having turned out in force to hear Vergès’s fiery summation after having skipped most of the trial including the testimony from the roundup’s survivors, some reporters were almost willfully bamboozled. No journalist followed up with the CDJC itself to see if records were really hidden. Instead Lyon Figaro assessed Vergès’s summation on rue Sainte-Catherine as “sometimes confused” but also “sometimes brilliant.” The Paris dailies were fascinated by the theme of “Jewish collaboration” and they liberally quoted Vergès’s one-liner that “the Jewish community … had its traitors.” 

 

Though the court found Barbie guilty on all counts, the charges of Jewish coverups hung in the air. On the anticolonial and anti-Semitic left, Tunisian-born writer and filmmaker Said Ould Khelifa claimed that the Barbie verdict perpetuated the “western tendency toward the hierarchization of inhumanity.” Focusing on the rue Sainte-Catherine roundup, Khelifa wrote that Vergès had delivered to the Jews a “terrible revelation.” The court, Khelifa insisted, had simply not seen all of the evidence. In the years ahead, French Holocaust deniers routinely referred to UGIF collaboration with Vichy as a disclaimer of Jewish victimhood, and in the 1997-98 trial of former Vichy official Maurice Papon, the defense falsely charged that the UGIF had drawn up Papon’s deportation lists.

 

The plaque at 12 rue Sainte-Catherine, accurately listing each of the roundup’s victims under the symbol of a Star of David, was installed in 2011 by the Association of the Sons and Daughters of the Jewish Deportees of France. But it is a Holocaust memorial that, even before its defacement in October 2019, had been increasingly, and gloomily, out of place. The upper floors where terrified Jews were arrested in February 1943 have housed a barely reputable late night establishment named the Sauna Club des Terreaux since before Barbie’s trial in 1987. Gritty bars and take-out stands flank the building on either side. Few if any passers-by notice the plaque, the names on which were forgotten after the war, manipulated by anti-Semites thereafter, and now have finally been, quite literally, crossed out.

The Boer War: The Opening Act in a Violent Century

 

"When is a war not a war? When it is carried on by methods of barbarism in South Africa." 

-Henry Campbell-Bannerman, Liberal MP (later British PM), 1901

 

When gold was discovered in South Africa in 1884, many were ecstatic. Paul Kruger, President of the Boer republic of the Transvaal, did not share the enthusiasm: “This gold will cause our country to be soaked in blood.” Indeed, the old Afrikaner would be proved right. Thousands of fortune-seekers from across Europe descended on his humble nation, turning a rough mining encampment into the city of Johannesburg almost overnight. The Boers looked upon the influx of foreign miners and businessmen, “uitlanders” in Afrikaans, with fear and disgust.

 

The Republic of the Transvaal and its sister, the Orange Free State, had been set up by the descendants of Dutch settlers who had trekked north in the early 19th century to escape British rule. Called Boers, from the Dutch word for farmer, this community had developed a unique culture during the 200 years since first arriving in South Africa. They were deeply insular, religiously conservative, and fiercely independent. In the 1870s, the grasping hands of the British Empire reached and annexed the Transvaal. When conflict broke out in 1881, the Boers fought fiercely and reclaimed their independence. 

 

The peace after the First Boer War was always shaky. Britain had certainly not relinquished its designs on South Africa’s natural resources. The growing uitlander population was also a source of rising tension. These foreigners, many of them British, were becoming wealthy and increasingly demanding political power in the Transvaal. The uitlanders received encouragement from British arch-imperialist Cecil Rhodes, founder of De Beers, as well as the British Colonial Secretary Joseph Chamberlain. Both men believed that incorporating the Boer Republics into the British Empire was inevitable. In 1895, Rhodes funded the Jameson Raid, an ill-fated mission to seize the Transvaal. While the British government officially disavowed the raid, many in London had tacitly supported it. Anglo-Boer relations reached a new low and war appeared inevitable. In 1899, the British government forced the matter by issuing an ultimatum demanding full rights for the uitlanders. Knowing full well that the Boers would refuse, Britain had already sent troops to South Africa.

 

Britain was the wealthiest nation on earth and possessed an empire upon which the sun never set. Since the defeat of Napoleon, the 19th century had been a nearly unbroken procession of British progress and expansion. At the outbreak of the Second Boer War (called the Boer War hereafter), London was awash in excitement. It would hardly be a war at all. The chief worry of the British soldiers was that the fighting would be over before they arrived. The determined Boers would see to it that the British had all the fighting they could handle and then some.

 

Rather than the expected easy British victory, the war began with disastrous British defeats on all fronts. In three battles, the British suffered nearly 3,000 casualties. The London press dubbed it “Black Week,” and the Empire was sent into an uproar. The Boers also besieged several important British settlements. In the field, Boer leaders repeatedly surprised the British forces with their superior mobility and better knowledge of the local terrain. Rather than facing the British directly, the Boers used hit-and-run tactics to disrupt British supply lines.

 

The aging Queen Victoria spoke for her empire after Black Week when she defiantly announced: “we are not interested in the possibilities of defeat; they do not exist.” Britain re-doubled its efforts, ultimately sending nearly half a million troops from across the Empire to overwhelm the total force of 50,000 Boer commandos. In early 1900, this overwhelming influx of men and materiel decisively turned the tide. The cities of Kimberley, Mafeking, and Ladysmith, which had been besieged by the Boers, were soon liberated. The British offensive then advanced on Pretoria and Bloemfontein, capitals of the Boer Republics.

 

After the capitals fell and the main Boer forces were defeated, many, including the British commanders, believed the war was over. The British even announced the re-annexation of the Transvaal. However, the Boers refused to surrender. Their governments continued to operate on the run, and bands of Boer commandos embarked on a guerrilla campaign. 

 

Britain’s response to the Boer insurgency was swift and brutal. British military leaders ordered the destruction of Boer farms and homesteads and the internment of Boer civilians. The roundup soon encompassed over 100,000 Boers, mostly women and children, in a series of concentration camps across South Africa. As the British focused on pacifying the country, they paid scant attention to their captives, who began to die of starvation and disease at horrifying rates. By the time the British forced the Boers to surrender in May 1902, over 20,000 women and children had perished.

 

Outside South Africa, the Boer War has been largely forgotten amidst the sea of 20th century horrors. However, the Boer War provided an uncanny preview of 20th century warfare. The killing power of modern weaponry was on full display, upending centuries of military theory. The stubborn Boer insurgency provided a guide for later asymmetric conflicts. The British responded to resistance by extending the boundaries of the war to the entire Boer population. The doctrine of total war rationalized the wanton destruction of civilian property. The awful suffering imposed sparked global outrage and inspired a powerful antiwar movement in Britain itself. 

 

The Boer War also shaped the careers of several towering figures. War correspondent Winston Churchill’s daring escape from Boer captivity made him a household name. Attempting to demonstrate India’s vital role in the Empire, Mahatma Gandhi organized a volunteer ambulance corps. Future South African Prime Minister Jan Smuts led a series of audacious assaults on the British Cape Colony. Reporter Sol Plaatje, who later founded the African National Congress, witnessed the racism of both the British and the Boers. Their voices provide eloquent accounts of the 20th century’s first conflict.

 

Infernal Machines

The 19th century witnessed tremendous advances in military science that fundamentally changed the nature of warfare. Explosives developed by Alfred Nobel and others made the cannonball of Napoleon’s day seem almost quaint. Hiram Maxim’s machine gun, a water-cooled weapon, could fire a remarkable 600 rounds per minute. Until the Boer War, European colonial powers were content to use these devastating new weapons primarily against poorly armed local populations. Many European leaders believed these weapons would not be used in “civilized” warfare. Instead, they stubbornly relied on outdated military doctrines such as the gallant frontal charge.

 

For the British high command, the Boer War was a rude awakening. Their Boer foes had the most recent quick-firing rifles, machine guns, and artillery to boot. At the war’s outset, British troops marched in close formation and aggressively charged into battle. Invariably, they were slaughtered by the Boers. Sol Plaatje reported with amazement, “they [the British] stroll about in a heavy volley far more recklessly than we walk through a shower of rain.” The combination of outdated tactics and general arrogance led to the disasters of Black Week and cost the British commander his job.

 

By setting two well-armed foes against each other, the Boer War provided a first glimpse into the changing role of man in war. Previously, individual virtues such as valor and determination could change the outcome of a battle. Now these human attributes were increasingly subordinated to the awesome killing power of modern machinery. The valiant frontal assault would become a suicide charge against machine guns. Courage would count little against the Lyddite shell, which was said to kill nearly everything within a 50-yard radius. War began to lose its luster when it became less about individual bravery and more about the impersonal killing power of machines. All the signs of this terrible evolution of war were present on the battlefields of South Africa. However, some in Europe clung to their old romantic notions. Had they learned from the Boer War, perhaps some of the outright butchery of WWI would have been avoided. 

 

No Safe Place

By September 1900, the British had captured over 15,000 Boer commandos. They controlled all the major cities and had put the Boer governments to flight. Hundreds of thousands of British troops were stationed across South Africa. With their main armies defeated, the Boers organized a well-coordinated guerilla campaign. 

 

The Boer insurgency provided a new template for effective asymmetric warfare. Their commandos infiltrated their home areas, where they relied on local knowledge and partisan support. The commando units were remarkably non-hierarchical, giving each great autonomy in identifying British weaknesses. Commandos were typically expert marksmen and were motivated by the fervor that comes from defending one’s homeland. An impressed Churchill described them as: “thousands of independent riflemen, thinking for themselves, possessed of beautiful weapons, led with skill… moving like the wind, and supported by iron constitutions.”

 

The British soon realized that their control in the Boer territories extended only as far as the sights of their rifles. During 1901, the British repeatedly offered peace, but the Boer leadership’s hard core of “bitter-enders” refused. Boer commanders Christiaan de Wet, Louis Botha, and Koos de la Rey continued to effectively harass British settlements, infrastructure, and businesses. Smuts led an extended raid into Cape Colony, sparking panic among the British subjects. These attacks made it impossible for the British to restore economic productivity and social order in South Africa. For all its military might, Britain found that defeating an insurgency was far more difficult than winning on the battlefield. America would learn a similar lesson in the jungles of Vietnam and the deserts of Iraq. 

 

Total War

Throughout history, civilians had often suffered the direct and indirect effects of war including violence, looting, displacement, and famine. What was unique in the Boer War was that a modern Western nation targeted an entire civilian population. Using their superior industrial power, the British vigorously pursued a doctrine of total war and turned the entire country into a warzone. Under this doctrine, anything that could aid the Boer guerillas must be destroyed. 

 

The consequences were devastating. As historian Martin Bossenbroek explains, orders were given to burn the farms of Boer commandos. These farm burnings “often…were not reprisals for sabotage but random acts of destruction,” wreaking economic havoc on the civilian population. This indiscriminate campaign surely violated the 1899 Hague Convention forbidding “collective punishment.” 

 

The civilian situation deteriorated further when Lord Kitchener took command of the British forces. Determined to strangle the insurgency by any means necessary, Kitchener constructed what Bossenbroek describes as an “immense metal web” throughout South Africa. Kitchener’s web included hundreds of military blockhouses and dozens of civilian internment camps. 

 

While earlier conflicts had used internment or concentration camps, the scale employed in South Africa was unprecedented. The network of camps soon swelled to contain nearly 100,000 Boer civilians, mostly women and children. Africans caught up in the conflict were also interned in significant numbers. The British military authorities responsible for the camps had put little thought into the welfare of the internees. As a result, conditions in the camps were appalling. Deaths from starvation and disease spread with terrifying speed. By October 1901, some camps experienced death rates exceeding 30% per month.

 

Many Boers bitterly questioned whether British policies sought the annihilation of the Afrikaner people. Historian and Member of Parliament Thomas Pakenham argues that Kitchener did not desire the deaths of women and children in the camps, rather “he was simply not interested” in their fates. In Kitchener’s single-minded quest for victory he had “uproot[ed] a whole nation.” 

 

Ultimately, total war brought victory. The Boers were worn down and demoralized by the suffering of their people. As Deneys Reitz, a young Boer commando recalled, his troop was reduced to “starving, ragged men, clad in skins or sacking, their bodies covered with sores.” Not only did independence now seem impossible, but continuing the war now threatened the very existence of the Boers. Kitchener’s triumph showed the brutal effectiveness of making the civilian population a target of military operations. In WWII, the German Blitz and the Allied firebombings similarly attempted to break an opposing nation’s will to resist. 

 

Antiwar Activism

British policies in South Africa did not escape the world’s notice. From the beginning, many saw Britain as the grasping, bullying aggressor. When the Boer delegation arrived in Europe for the 1900 World’s Fair, they received a riotous ovation. In America, Teddy Roosevelt expressed deep sympathy for the Boers. However, as Bossenbroek notes, such feelings did not translate into material support. Nations recognized Britain’s naval dominance and did not wish to antagonize the Empire by supporting the Boers’ hopeless cause.

 

Within Britain, the Boer War helped create the first modern anti-war movement. The conflict cost over 2.5 million pounds per month (nearly 400 million dollars per month in 2019 terms). The main beneficiaries seemed to be arms dealers and the wealthy mining houses. For many reformers, a seemingly interminable faraway war was an outrageous expense while Britons at home lacked adequate nutrition, healthcare, and education.

 

While economic considerations surely influenced some anti-war voices, the humanitarian issue truly captured British hearts. One remarkable woman, Emily Hobhouse, was responsible for alerting the British people to the horrors in South Africa. She spent months investigating the camp conditions, and what she found utterly shocked her. Not only did the camps lack sufficient food, clean water, and medicine, but internees whose male relatives remained in commandos were punished with starvation rations. Hobhouse declared: “I call this camp system a wholesale cruelty…to keep these Camps going is murder to the children.”

 

Despite pressure from British authorities, Hobhouse shared a detailed report of her findings. The public outcry was swift. Henry Campbell-Bannerman, who led the Liberal opposition, deplored the “methods of barbarism in South Africa.” A young David Lloyd George went even further, calling British actions “a policy of extermination.” His fervent opposition to the war burnished his growing political reputation. Under increasing criticism, the Conservative government agreed to send a commission to South Africa. Led by the suffragist Millicent Fawcett, the commission confirmed Hobhouse’s assertions and demanded immediate policy changes. The military relinquished control of the concentration camps to British colonial administrators, and the death rates began to plummet. The episode demonstrated that democratic politicians now needed to consider the humanitarian consequences of their actions. Unfortunately, the masses retained significant moral blind spots and governments simply worked harder to cover up human rights abuses. Nonetheless, the popular campaign against the outrages in South Africa marked a watershed in anti-war activism.

 

An Enduring Legacy

 

The Boer War reverberated throughout the British Empire. Global sympathy for the Boers showed London how resented the Empire was. Other nations appeared all too eager to take advantage of any further signs of British weakness. Although Britain remained the dominant world power, its days of “splendid isolation” were numbered. In 1902, Britain concluded a treaty with Japan to secure their Pacific holdings against European rivals. In 1904, the Entente Cordiale ended centuries of animosity between Britain and France. By signing an agreement with France’s ally Russia in 1907, Britain protected its claims in Afghanistan, Iran, and its crown jewel, India. With this final deal, the Triple Entente was born. 

 

The Boer War also revealed the grime of poverty below the veneer of Victorian splendor. Embarrassingly, many potential British recruits were rejected because they were too poorly nourished. The richest nation in the world could not even feed its people. Such revelations motivated Liberal efforts to create the basic forms of social welfare. 

 

In South Africa, the war sowed the seeds of apartheid. The peace concluded at Vereeniging offered exceptionally lenient terms to the Boers and pledged millions of pounds to rebuild the nation. This arrangement left the Boers with political control across much of South Africa. Considering white Boer dominance to be preferable to African sovereignty, the British soon reconciled with their bitter foes. In 1906, the Boers were granted significant legal autonomy, and in 1910, the colonies joined to become the Union of South Africa, a self-governing dominion. 

 

Many of the “bitter-enders” were still unhappy with any degree of British authority. Winston Churchill believed this opposition was based on “the abiding fear and hatred of the movement that seeks to place the native on a level with the white man.” Indeed, to the Boers’ racialized worldview, even Britain’s tepid endorsement of African legal rights was anathema. Just before WWI, former Boer commander Barry Hertzog founded the National Party, which fiercely defended Afrikaner culture and white supremacy. Although an opportunistic 1914 Afrikaner uprising was suppressed, the Afrikaner nationalists never stopped trying to slip the British yoke. During the next few decades, the ruling Afrikaner minority systematically stripped black Africans of their rights and pushed for greater separation from Britain. Leaders like Jan Smuts attempted to maintain unity, but in the chaos after WWII, the right-wing nationalists won out. The National Party’s victory in 1948 enabled the final construction of the apartheid state. 

 

The British accommodation with the Boers betrayed Britain’s non-white allies. In exchange for supporting the Empire, Indians and Africans had been promised legal and political equality. Before the war, Gandhi had believed “if I demanded rights as a British citizen, it was also my duty, as such, to participate in the defense of the British Empire.” After the war, he expressed the disappointment of many, “learn your lessons, if you wish to, from the Boer War. Those who have been enemies of that [British] empire a few years ago, have now become friends.”

 

Africans felt similarly betrayed. Sol Plaatje described the racist ways the British had mistreated their African allies. During the siege at Mafeking, Africans were given the lowest rations and ultimately were forced from the city to reduce the number of mouths to feed. A British administrator described the widespread African discontent well: “they received a rude awakening. They found the country was not theirs; that we had not fought to give it to them, and most of all that the owners went back and still owned the farms.” For Gandhi, Plaatje, and others, British duplicity forced them to acknowledge that true equality could never be obtained within the Empire. The struggle for equality would become a struggle for independence. 

 

There is something darkly poetic in the timing of the Boer War. It offered a grim preview of warfare and the social conditions that would shake the world during the 20th century. The devastating power of modern weaponry and the challenges of defeating an insurgency would force a fundamental reevaluation of military strategy. Lines between civilians and combatants would be increasingly trampled. As a result, the suffering of innocents would reach an unprecedented scale. The Boer War was the first spring of these deadly flowers of modern war.

Veterans Day Reflections on Pseudo Patriotism and Half-Baked History

Patriotism has been defined many ways, but I prefer Martha Nussbaum’s in her Political Emotions: Why Love Matters for Justice: “A strong emotion taking the nation as its object . . . . It is a form of love.” It “can play . . . an essential role in creating a decent society, in which, indeed, liberty and justice are available to all.” Her examples of great patriotic leaders include not only George Washington and Abraham Lincoln, but also Martin Luther King Jr., an adherent of non-violence and breaker of unjust laws. Such patriotism does not demean other nations, but wishes them well. It is not self-satisfied, but values critical dissent and realizes much work is still necessary if one’s country is to live up to its highest ideals.

Valuing critical dissent and realizing more work needs doing are essential to true patriots. Pseudo patriots, however, wish only to emphasize a country’s heroes and heroics, and minimize or ignore its villains and villainy. In response to critics, they reply “love it or leave it,” believing love does not criticize. They think that historians should help inculcate patriotism, but they want only half-baked history. 

They share the viewpoint of right-wing columnist Jarrett Stepman. In his The War on History: The Conspiracy to Rewrite America's Past (2019), he begins by stating, “An informed patriotism is what we want. . . . Is the essence of our civilization—our culture, our mores, our history—fundamentally good and worth preserving, or is it rotten at its root?” 

His “informed patriotism” suggests a false dichotomy—”fundamentally good and worth preserving” or “rotten at its root.” But history is not so simple. Historians’ main allegiance should be to truth-telling in all its fully-baked complexity. Granted, we all have our biases and see history through the lens of our own interests. But our main job is not to glorify a country—or an ethnic group, or a gender, or a particular person—but to tell the truth, warts and all. Like all countries, the past of the USA has its glorious, sublime moments, but also its nasty, disgraceful ones. Neither can be ignored. 

These reflections are especially appropriate in light of Donald Trump’s continuing insistence that he is attempting to “Make America Great Again.” For the slogan appeals to Americans who wish to ignore past U. S. sins, especially the racism that killed and brutalized so many Native Americans and blacks, including slaves. 

 

In a 2017 article entitled “Who Are We?”, conservative columnist Ross Douthat suggested that many Trump supporters preferred “the older narrative” of U.S. history, the one that glorified Columbus, the Pilgrims, the Founding Fathers, Lewis and Clark, and Davy Crockett, the one that emphasized the melting pot (not multiculturalism), and the U.S. Christian tradition (not separation of church and state, and certainly not any secularist thinking). 

In response to Affirmative Action and the “Black-Lives-Matter” movement, many Trump supporters, especially older white men, see themselves as today’s true victims. As one Trump supporter claimed, “White lives matter, too, you know.” Such individuals believe Trump can help them “get their country back,” a country that would once again be dominated by white, male Christians, a country where history is taught so as to glorify American exceptionalism and not “tear it down” by harping on any perceived past flaws.

 

As columnist Roger Cohen has pointed out, "I want my country back” is not just a U.S. sentiment. “It is the universal cry of the global wave of rightist reaction.” It helps fuel anti-immigration, anti-Islamic, and anti-semitic feelings in various countries.

 

In the late 1980s I witnessed a similar reaction in the Soviet Union. These were the days of Gorbachev’s glasnost (openness). It was wonderful to see Soviet citizens’ hunger to learn their true history after they had been denied it for so many decades. Many of the crimes of Lenin and Stalin were exposed, but defenders of the two men fought back, and the battle over their history and reputations continues today in Russia, where Lenin and Stalin remain more popular than other Soviet leaders including Gorbachev. Rather than face up to the truths of the crimes of Lenin and Stalin, President Vladimir Putin attempts to manipulate Russian history for his own benefit. Like many of Trump supporters unwilling to acknowledge the USA’s past injustices, many of Putin’s followers refuse to acknowledge all the sins of Lenin and Stalin.

 

In 2015, Poland elected a right-wing government that “promised to make the country’s past great again” and combat a “pedagogy of shame.” In 2018 it made it a crime to blame Poles for any Nazi atrocities. The government hoped to stifle any tales of anti-Semitic Poles collaborating with Hitler’s forces. This year it has interfered with a museum devoted to the history of Polish Jews. The museum’s offense? Exhibiting evidence of Polish anti-Semitism.

 

Mention of Nazi atrocities recalls, of course, the Holocaust and all the other evils perpetrated by Hitler and the Germans who followed him. Earlier this year, Susan Neiman’s Learning from the Germans: Race and the Memory of Evil appeared. One of its chapters deals with “The Sins of the Fathers,” i.e., the German fathers of the Nazi era, but more than one hundred pages deal with what Neiman labels “Southern Discomfort.” She is referring to the “discomfort” of U. S. slavery and racism. Having grown up in the U. S. south, she believes that the slavery and racism that existed there has not been adequately recognized by the U.S. public. The aim of her book is to indicate what we might learn about dealing with those sins from the way Germans have acknowledged and dealt with all the evils that emanated from the Nazi era.

 

On the occasion of 2019’s Veterans Day, the controversy over The New York Times (NYT) 1619 Project illustrates well the contested relationship of history, past sins, and patriotism. The project’s aim is “to reframe the country’s history, understanding 1619 [when slaves were brought to Virginia] as our true founding, and placing the consequences of slavery and the contributions of black Americans at the very center of the story we tell ourselves about who we are.”

 

But many pseudo patriots objected. As Adam Bruno wrote on this site a few months ago, such objections typify those who wish to weave “unadulterated patriotism into the center of the country’s historical quilt—a move that conceals the broader ups and downs of the past.” 

 

He contrasts the project’s critics, including conservative outlets and figures such as National Review, Larry Schweikart (author of A Patriot's History of the United States), and former Speaker of the House of Representatives Newt Gingrich, with historians like Andrew Hartman, whose A War for the Soul of America declared that “those aspects of American history that shined an unfavorable light on the nation, such as slavery, were ignored or explained away [by conservative media] as aberrations.”

 

Gingrich, who received a Ph.D. in history from Tulane University, typifies the half-baked historical approach. He tweeted that “the NY Times 1619 Project should make its slogan ‘All the Propaganda we want to brainwash you with.’” In her justly praised These Truths: A History of the United States, Jill Lepore refers to Gingrich’s 1996 book To Renew America as “a fantasy, useful to his politics, but useless as history—heedless of difference and violence and the struggle for justice. It also undermined and belittled the American experiment, making it less bold, less daring, less interesting, less violent, a daffy, reassuring bedtime story instead of a stirring, terrifying, inspiring, troubling, earth-shaking epic.” Yet, she acknowledged, “that fairy tale spoke to the earnest yearnings and political despair of Americans who joined the Tea Party, and who rallied behind Donald Trump’s promise to ‘make America great again.’”

 

Another half-baked historian is former Fox commentator Bill O’Reilly. In a 2017 article Andrew Bacevich, a retired military officer and reputable historian, bemoaned that O’Reilly was “America’s Best-Selling Historian” because he was “no more able . . . to write history” than was Trump “to run a government.”  

 

The problem with the type of history written by Gingrich and O’Reilly is not that they interpret it differently from historians like Lepore, who happens to be more progressive in her politics. In his second Annual Message to Congress, Abraham Lincoln stated “We must disenthrall ourselves, and then we shall save our country.” Lepore begins her U.S. history by quoting these words. But the history of Gingrich, O’Reilly, and other half-baked historians fails to disenthrall itself from a pseudo patriotism that belittles historians who see both sides of U.S. history, the positive and the negative. 

 

A favorite metaphor of U.S. politicians is to refer to their country as a “city upon a hill,” meaning a shining beacon for the world. John Kennedy, Ronald Reagan, and Barack Obama all spoke of it, with Reagan most strongly suggesting that the ideal had been achieved. In 2016, Republican Mitt Romney, his party’s 2012 presidential candidate, warned that Donald Trump had “neither the temperament nor the judgment to be president, and his personal qualities would mean that America would cease to be a shining city on a hill.”

 

Many pseudo patriots and half-baked historians believe that by trampling out all the nay-sayers and “fake news” and by restoring white Christian supremacy, Trump can restore the “shining city on a hill,” can “Make America Great Again.” True patriots and true historians know, however, as Martin Luther King, Jr. did, that the city on the hill is still a work in progress, a work that is in peril under a president many historians consider our worst ever. 

The Overlooked Aftermath of the Chernobyl Nuclear Disaster

 

April 26, 1986. Ukrainian Republic of the Soviet Union. The Number 4 reactor at the Chernobyl nuclear power plant exploded. The blast propelled a massive amount of radioactive material into the atmosphere. This fallout covered a wide area of what is now Ukraine and Belarus, and western Russia. Soviet officials put the death toll at no more than 54 people.

 

Within weeks, the Soviet government declared that the radioactive fallout posed no danger to human health, and it offered reassurance to affected citizens as it distributed numerous manuals with recommendations on continuing to live in the contaminated regions. 

 

Eventually international agencies such as the United Nations also minimized the human health and environmental aftereffects of the Chernobyl explosion. Those who complained of problems from nuclear contamination were labeled “radiophobic.”

 

Acclaimed historian Professor Kate Brown embarked on an unrivaled journey of scholarly investigation to learn more about the aftermath of this devastating nuclear disaster. In her impassioned and lively new book, Manual for Survival: A Chernobyl Guide to the Future (WW Norton), she recounts her findings based on extensive archival research, travels in the “Zone of Alienation” and beyond, and numerous interviews of scientists, officials, factory workers, farmers, health care professionals, radiation monitors, and others. 

 

In her exploration, Professor Brown found evidence of extensive medical and environmental damage from radioactivity in Ukraine, Belarus, and beyond. She also unraveled an international effort to minimize public awareness about the dangers posed by nuclear power, nuclear testing, and nuclear weapons research. She traces similar efforts to downplay damage from radioactive contamination since the 1945 atomic bombings of Hiroshima and Nagasaki, and warns of the dangers that nuclear radiation presents after more than seven decades of nuclear weapons and energy.

 

Based on her investigation, Professor Brown learned of dramatic increases in cancer, birth defects, and other medical problems linked to Chernobyl. As documented in archives and as reported to her by scientists and other professionals, she found that tens of thousands of people—not a few dozen—died as the result of radiation from the massive nuclear explosion. She also describes ongoing medical and environmental problems that persist in the aftermath of the disaster. And, as clean energy initiatives often prescribe nuclear energy as an alternative to carbon-based fuels, Professor Brown calls for careful consideration of what happens when technology fails and what we are left with in the wake of nuclear disasters. Her book raises profound environmental concerns based on careful investigation in the vein of Rachel Carson’s iconic volume Silent Spring.

 

Kate Brown is currently a professor in the Science, Technology and Society Program at the Massachusetts Institute of Technology. She is renowned for research that illuminates the convergence of history, science, technology, and bio-politics. She has written three other award-winning books: A Biography of No Place: From Ethnic Borderland to Soviet Heartland; Dispatches from Dystopia: Histories of Places Not Yet Forgotten; and Plutopia: Nuclear Families in Atomic Cities and the Great Soviet and American Plutonium Disasters. Plutopia earned many awards, including the American Historical Association’s Albert J. Beveridge and John H. Dunning Prizes for the best book in American history; the George Perkins Marsh Prize from the American Society for Environmental History; the Ellis W. Hawley Prize from the Organization of American Historians (OAH); the Wayne S. Vucinich Book Prize of the Association for Slavic, East European, and Eurasian Studies; and the Robert G. Athearn Prize from the Western History Association.

 

Professor Brown’s teaching and research are also widely recognized. For example, she has received numerous fellowships and, in 2017, she was awarded the Berlin Prize by the American Academy in Berlin. Her current research focuses on the history of “plant people”: indigenes, peasants, and scientists who understood long before others that plants communicate, have sensory capacities, and possess the capacity for memory and intelligence.

 

Professor Brown generously responded by telephone to questions about her work as a historian and her groundbreaking new book, Manual for Survival.

 

Robin Lindley: Congratulations on your groundbreaking book on the Chernobyl disaster, Professor Brown. You are a recognized expert in Soviet and Russian history. What sparked your interest in this field? Was there something in your family background or in your childhood that drew you to this history?

 

Professor Kate Brown: I don't have any Slavic heritage or anything that I know of. But I remember I went to a movie called Red Dawn about the Soviets attacking Colorado. And fortunately, the Coloradans had guns and they could defend themselves. It’s a stupid movie and I recognized it to be a cult movie or propaganda, but I was upset that the kids in the movie theater were cheering every time a Communist was killed. And I went home and I was complaining to my parents about it. My mom was there smoking a cigarette and she says, “Well, do something about it. Study Russian and change the world.” And I decided, well dammit, I'll just do that.

 

The very next week I signed up for classes in everything Russian. Russian history, Russian grammar, and Russian literature mostly in translation. My aim was to go to Russia and see if it really was an Evil Empire. I guess I’ve always liked to know things for myself rather than relying on someone else's knowledge.

 

Robin Lindley: And then you traveled to Russia.

 

Professor Kate Brown: Yes. By 1987, I had enough Russian grammar and language to study in Leningrad as an exchange student. Just then all kinds of interesting things were starting to happen between Gorbachev and the United States. After that, I just kept going. I was part of a crowd of Westerners who worked in the USSR at the end of that polity. Gorbachev liberalized visas and politics, which made it easier to spend time in the Soviet Union and carry out joint programs.

 

Robin Lindley: Did you also work as a journalist?

 

Professor Kate Brown: Yes, I did. When I was in Seattle in a graduate history program at the University of Washington, I initially didn't have funding for my studies, so I worked through a work study program. I worked for KCTS, the public television station, on their weekly news magazine. And then I worked at KUOW, a National Public Radio station where I was a beat reporter. The job wasn’t complicated. I’d get to work at eight in the morning. They'd say, Boeing's on strike or there is a problem over water rights at the Snoqualmie Falls. And I'd go off and I'd get some interviews and I'd have my story on the radio by 4:00 PM, whatever the story. That was a real crash course both in figuring out how to get a lot of information really fast and how to organize material into a news story. I learned to stick a microphone in people's faces. And I learned to tell a story with a narrative arc. And then I met some people at the TV station who went on to make documentary films. They hired me as a researcher and scriptwriter for their documentaries. 

 

And in 1992 I ended up in Munich working for Radio Free Europe. And then I went to Moscow and I did some stories for Radio Free Europe from there in the fall of 1992.

 

I have that kind of experience, and I enjoyed it. But then I thought I wanted to do longer form journalism and I wanted to write my own books. I didn't want to just write a story that's on in the course of a day. And I wanted more control over the stories I told and how I told them, as opposed to the rigid format, whether it was TV or in short form journalism. So, I chose a career in academia, which meant I would have a smaller audience but I could have more autonomy in what I wrote.

 

Robin Lindley: And you pursued grad school and earned a doctorate in history, but you wanted to write for scholars and the general public.

 

Professor Kate Brown: Academia is full of all kinds of wonderful ideas and fantastic research-driven, creative work. But sometimes academic writing turns off popular readers. And so that was one other missing part for me. Was it possible to do nuanced, creative research and then tell about it in a way that's compelling and can reach any kind of high-school level reader? That's always been my mission. So I wrote my dissertation as a first-person travelogue. I got into some trouble for it because dissertations are usually more formally written, but I was stubborn and finally my advisors just said do whatever you want.

 

Robin Lindley: Did that desire to put your own brand on scholarship bring you to history graduate school?

 

Professor Kate Brown: I didn't think of it so much as a brand, but as a way to liberate myself from the constraints that I saw imposed on grad students and academics, and we often put these constraints on ourselves.

 

Robin Lindley: Your journalism background served you well. Your writing is very engaging and accessible. I believe you have described yourself as a partisan historian. In Plutopia, your book on the plutonium cities of Hanford, Washington, and Ozersk, USSR, you included information from non-expert people you interviewed who actually lived in the areas you wrote about in addition to your archival research. You also broke from most scholarly writing with first-person reporting on your research.

 

Professor Kate Brown: When you work in the archives, you get kind of a sketch or an outline of what real life is like in whatever period you're studying. And, when you go to a place, especially if it's going to a place for recent history, you can see what it looks like. You can see how the physical world is also an actor in your stories: the way the rivers flow, how the soils soak up water. 

 

You can read the archival record, but it really helps to get the fine grain detail by going to a place and then talking to people. People can really clue you into their local knowledge that is so important. And they know things that experts don't know. They know things that you only get a glimmer of working in the archives. 

 

I don't just take people's word for it. After talking to people, I can go back to the archives and try to cross-check what they tell me. Often, I have a whole new understanding of an event after hearing firsthand accounts. 

 

Robin Lindley: And you also did extensive interviews for your new book on Chernobyl.

 

Professor Kate Brown: With Chernobyl, I did a good number of oral histories. What I found in the archives is that the officials were having arguments among themselves. Some doctors and scientists who studied the accident were reporting major health problems. But experts in nuclear medicine who were in Moscow, Vienna, Paris, or New York were saying that, with the kinds of emissions and the kinds of estimated doses that they calculated people received, they didn’t expect major health problems. They would explain the rise in the frequency of disease by saying that these people were anxious, or they drank too much, or they had a poor diet and a poor economic situation. They basically devised an alternative narrative to explain those health problems, though I didn’t see hard evidence of drinking or a rise in anxiety. So, what helped, I think, is that I would just go talk to people and get their stories, and confirm one version or another of the oral histories with the material in the archives. But then I still wasn't sure.

 

So, in this project, I took yet another step and I enrolled myself as a participant observer with two biologists who were the only two scientists I could find who regularly worked in the Chernobyl Zone twice a year since 2000. They are like clockwork arriving in the Chernobyl Zone in June and in September. They use the contaminated Chernobyl Zone as a natural experiment, a massive field lab.  I started going along with them, and I learned a lot. I learned forensic methods to detect radioactivity in the natural environment as I traveled with them. Later, outside the Zone, when I went to Chernobyl-contaminated areas where people continued to live, I could see evidence of damage in the environment using techniques that I had learned from the biologists.

 

And that was a third way to cross check the story, which I knew was going to be controversial. I was really intent on verifying the stories I was getting. And, as I was talking to people, I figured I could use science also. People lie and archives lie, but maybe trees don't. 

 

Robin Lindley: Thanks for describing your approach to research. Did your research for Plutopia on those plutonium cities spark your book on Chernobyl?

 

Professor Kate Brown: For sure it did. I felt like this was a bit of a sequel for Plutopia. I started out with a very different set of questions for Plutopia, but I kept running into these farmers, whether they were in Siberia or in Eastern Washington, who had very similar stories to tell me about their health problems. And I knew that they weren't talking to each other and they didn't share a common language. 

 

I tried to do as much research as I could in archives, but those cities were both military sites. The American government wasn't terribly curious about what happened off site of the nuclear reservation. And the Russians kept some studies of people living down river and downwind of the plant who were exposed, but those studies were off limits to me as a researcher. So I figured Chernobyl might be a good place to look and try to get more about that health story because it was a civilian site and it happened later. 

 

I walked into the archives in Kyiv (Kiev) one day. I asked what they had from the Ministry of Health on the medical consequences of the Chernobyl disaster. They said that was a censored topic during the Soviet period and I would not find anything. I asked to look anyway. Sure enough, it took three seconds to find a whole document collection entitled “The Medical Consequences of the Chernobyl Catastrophe.” Big bound collections. I started reading them and I realized that the archivists didn't know about these files. They discouraged me not because they were trying to deceive me, but because nobody else had ever pulled them before.

 

Robin Lindley: It surprised me those files had been untouched until you came in.

 

Professor Kate Brown: Yes. And over and over again, I had that experience in Minsk and Zhytomyr, Gomel and Mogilev, being the first to check out the files. With two research assistants, we found files down to the county level. In sum we found thousands of records that described, in one way or another, environmental exposures and health problems.

 

I am also convinced that I again came across untapped archives in the Belorussian Academy of Sciences. The Belorussian government was doing a great job of ignoring the contamination story and pretending Chernobyl didn’t exist, but scientists at the Academy had privately gotten very worried and they started their own case control studies on several topics, mostly related to children's health and the health of pregnant women. And those studies are really convincing. They had all the relevant data, such as dose estimates. I guess that's when I decided I believed what are called the alarmist stories.

 

Robin Lindley: What were some of your major findings on the medical and environmental consequences of the Chernobyl catastrophe? The official death toll was about 50 people but you learned that radiation illness probably contributed to tens of thousands of deaths.     

 

Professor Kate Brown: Yes. The official death toll most often cited in the big publications is that 33 to 54 people died from the Chernobyl explosion, but that’s just from the acute effects of radioactivity. Those were firemen and plant operators who were exposed massively during and right after the accident, and most of them died within the next couple of months in one hospital in Moscow.

 

But I found the death toll was much higher. I found that not 300, the official count, were hospitalized after the accident for Chernobyl exposures, but 40,000, and 11,000 of them were kids hospitalized for exposures in the summer after the accident.

 

The Ukrainian government gave compensation to 35,000 women whose husbands died from causes related to radiation. Now that number is limited. It includes just men who had documented exposures. It doesn't include children or women or babies. And that's the number just for Ukraine, which received the least radioactive fallout, while Belarus received far more.

 

We tried really hard to get some kind of count for Belarus and Russia about fatalities from Chernobyl, but there simply is no kind of official count. So, 35,000 is the lowest possible number. On the thirtieth anniversary at the Chernobyl visitor center, the official tour guide said that the death toll was at least 150,000 in Ukraine alone.

 

Robin Lindley: How again did you come up with the figure for men who died?

 

Professor Kate Brown: About 35,000 wives received compensation because their husbands died from a Chernobyl-related radiation illness. That means that these men did some kind of work in which they were monitored so they had a film badge or some other dose estimate. Their doses were recorded or reconstructed, and then their illnesses were on a list of illnesses that were attributed to Chernobyl contamination. They died leaving their widows some income as compensation. So that's how that number was created. 

 

Robin Lindley: What evidence did you find on birth defects?

 

Professor Kate Brown: There's all kinds of evidence in the book. The evidence I had was a study here and a study there and observations here and there. But the one study that's been done that fits standardized Western protocols was by Wladimir Wertelecki, who teaches at the University of South Alabama. He carried out a study in the northern province of Ukraine. He found that there was a six times higher rate of neural tube birth defects (a category that includes spina bifida and anencephaly) in people who live in those northern regions. He also found elevated rates of cesium in the bodies of people in that northern Rivne Region.

 

Anencephaly and spina bifida are also big problems in Eastern Washington [the site of Hanford]. In 2010 the State of Washington became alarmed because there was a 10 times higher than expected number of babies with anencephaly in Eastern Washington, especially in the three counties around Hanford. They did a study and wrote a report that you can get online. As best I know, this little epidemic is not over and the numbers continue to be high. The Washington State epidemiologist wrote in this report that they don't know what's causing these defects. He said they looked into all kinds of things such as nitrates and pesticides and genetic factors and radiation from Hanford.

 

They reported that they were told by the Department of Energy that radioactivity does not leave the Hanford site. If you know anything about Hanford, you know that’s a statement that only a very gullible person would believe. So that's largely a silent, unexplored topic, but one we see in areas where people have been exposed to radioactivity. 

 

Robin Lindley: What a tragedy for those families. Another issue that's related to the physical health consequences of the Chernobyl disaster of course is the mental health of citizens after the catastrophe itself. I think you mentioned cases of posttraumatic stress disorder and just the general stress of living in that situation. 

 

Professor Kate Brown: The United Nations’ bodies first said that the health problems were from the fear of radiation. But some researchers and scientists find that real neurological damage caused by exposure to radioactivity can cause emotional disorders. There are also people who work in microbiology who have found that when the microbiome in your gut is disordered, damaged by some toxin, whether it's a chemical toxin or a radioactive contaminant, that can trigger emotional problems, as the gut serves as a sort of second emotional brain. A lot of how we feel every day has to do with our microbiome and our gut.

 

These cases suggest a lot of unanswered questions. A purpose of the book is to urge citizens to ask their leaders and public health officials to get more curious about the long-term effects of chronic low doses of radioactivity. We know a lot about high doses of radioactivity and human health, but researchers repeat that they know next to nothing about low doses. We know about high doses from the Hiroshima studies. We don't know about these low-dose effects because we have never really studied people who live in those conditions. 

 

Robin Lindley: Your book serves as a call for further research. What were some of the environmental consequences of the disaster that struck you? You mention harm to animals and plants and even decreased pollination.

 

Professor Kate Brown: What was really striking was when I went from the Ministry of Health records to the State Committee for Industrial Agriculture records, I saw that the people in the Soviet Union did their best to monitor food supplies and levels of radioactivity in soils, water, and air. And when they found high levels of radioactivity, they went in with bulldozers and they scraped away the topsoil and dumped it far away from the villages. And they scrubbed down surfaces and asphalt and buildings with chemical solvents to try to remove any radioactivity.

 

They could get these villages to a level of making them livable, but then they would come back two weeks later, and the radiation levels would be nearly as high. And they realized that radioactive isotopes could mimic minerals that plants and animals need to survive, moving from soils, air, and water into the plants, then into the animals. And then, because humans sustain themselves on plants and animals, they take in these materials and bring them to their villages. And as they go into the villages with their shoes or their tractor wheels, they bring in dust and dirt from the forest and the field. And all those contaminants gather in human population points.

 

So the exposures for humans were consistently from ingesting contaminants. Once you ingest radioactive isotopes, the natural biological barrier of your skin and your body no longer helps, and beta and alpha particles penetrate your skin. Once they're inside your body, they can do a lot of damage to lungs, hearts, various organs, and inside the joints. They wreak havoc on bone marrow. 

 

But before these acute problems, people have subacute problems. And interestingly enough, we don't much care about those.  Few journalists have asked me how many people had digestive tract disorders or respiratory problems. Mostly, they want to know more about cancers and deaths and birth defects—the acute problems. But subacute problems mount in a body. A person may have that one chronic disease, but maybe two or three subacute problems. A family would have several people with chronic health problems. They're still alive and not in the death toll, but their lives are shorter and far more painful. They are not able to be productive as members of the community in terms of work and a creative life. 

 

None of us would wish this kind of medical history on our own families and communities. And that's I think something that we don't statistically track because we have failed to ask this question.

 

Robin Lindley: The damage to the immune system must be serious.

 

Professor Kate Brown: Yes.

 

Robin Lindley: One of the themes of your book is the international effort to minimize evidence about the medical and environmental consequences of the Chernobyl disaster. You also note that there's a long history, even in the United States, of covering up problems with radiation illness. That goes back to the 1945 bombings of Hiroshima and Nagasaki, when General Groves refused to disclose the effects of radiation. You recount a history of similar cover-up efforts since then.

 

Professor Kate Brown: Unfortunately, we have a real track record in the United States of minimizing the record of radiation exposure and illness. We have a long-term life span study of the survivors of the Japan bombings, but that study doesn't take into account radioactive fallout. It estimates that the dose survivors received was that of one big x-ray that lasted a second. But the other exposures, the Chernobyl-like exposures, with people living in these environments who take in radioactive contaminants by ingesting them in the air in their lungs or through the food cycle, were never considered as part of that study. There were also exposures of Marshall Islanders and people from the Nevada test sites. We really don’t know what happened in those cases for a lack of curiosity.

 

Finding out about radiation is a real threat [to nuclear power advocates]. In 1987, a group of health physicists, specialists in nuclear medicine, had a convention in Columbia, Maryland. A lawyer from the Department of Energy addressed them.  He said that after Chernobyl, the biggest threat to the nuclear industry was lawsuits. He announced that they were going to break out into small groups with lawyers from the Department of Justice to train them on how to become expert witnesses in defending the US government against lawsuits. These scientists then served as “objective” witnesses in lawsuits where Americans took corporations to court for their exposures in the production and testing of nuclear weapons. It comes as no surprise that few won those lawsuits.

 

Other nuclear powers including Great Britain, France, and Russia, were facing similar lawsuits. If industry scientists could say that Chernobyl was the world's worst nuclear accident and only 33 or 54 people died, then those lawsuits could and indeed did go away. 

 

Robin Lindley: When you were traveling through the Zone of Alienation and when you were finding information that a lot of people probably didn't want you to have, were you threatened? Did you feel that your safety was endangered at all?

 

Professor Kate Brown: No, I didn't. Since I published the book there have been a couple of people who are industry scientists and a guy who runs two pro-nuclear NGOs, and they make a living by promoting the nuclear industry. They’ve been attacking me and my book but they're not disinterested parties. Other than that, I haven't really endured any hardships. 

 

Robin Lindley: I'm glad. With the KGB involved and a series of cover-ups, your book reads like a thriller. It’s a compelling scholarly exposé with popular appeal on what happened after Chernobyl.

 

Many witnesses you spoke with were women who were close to the ground level in areas of contamination--the sort of people you wanted to hear from who'd lived through this experience. They included doctors, teachers, and women who worked in the wool and leather factories, among others. Their contribution to your book was impressive.

 

Professor Kate Brown: Yes. Well, women are the ones who take care of the kinship networks. They're the ones who normally, especially in Soviet society, take care of family members when someone is sick. And women are also the ones who staffed hospitals. Being a doctor in the Soviet Union was a low-paying job usually left to women. Men were researchers who worked in institutes and universities. So, it was women who noticed these trends in poor health, and they're the ones who are there in the book. The women were the ones sitting around in the waiting rooms, exchanging information there day after day for hours, and starting to see trends.

 

Robin Lindley: Thanks for that personal insight from your travels. What was the political fallout of the Chernobyl disaster in terms of the future of Mikhail Gorbachev and the fall of the Soviet Union? 

 

Professor Kate Brown: Gorbachev said at some point after the fall that the main cause of the collapse of the Soviet Union was Chernobyl, but I'm not sure Gorbachev is the most reliable person to ask on this point because everybody else in the former Soviet Union blames him for the collapse. So it makes sense that he was looking for outside factors to deflect attention away from himself. But, as I worked through archives, I took note of the incredible resources that the Soviet government spent to try to deal with this disaster, from cleaning up this huge territory to putting a sarcophagus over the reactor itself. In Ukraine alone, they sent out 9,000 medical staff to look at everybody they could find who might have been exposed to contaminants. They dealt with the medical fallout, and set up studies of the ecology and the human health problems. And on and on.

 

Chernobyl was a huge drain at a time when the Soviet Union was experiencing a collapse in oil prices and oil exports, the main source of hard currency revenue. And so Chernobyl was certainly a confounding factor. 

 

And then they kept this all under wraps and they weren’t honest with people. When, in 1989, they finally published the first maps of radioactivity, showing the high levels in places where people had been living for three and a half years, residents were furious. People poured out to the streets in June 1989. There were marches and strikes and pilgrimages and protests, and new people started to run for office.

 

Robin Lindley: You recount some of the history of previous nuclear accidents in the Soviet Union. Wasn’t the Soviet military using nuclear weapons to stop forest fires, or is that story apocryphal?

 

Professor Kate Brown: The story I report in my book, which we have from archival sources and one eyewitness, was that there was a gas fire in a well when someone digging tapped into underground flows of gas, and that caused a fire in the ground. They couldn’t extinguish it. They tried for a year to put out the fire this way and that, and finally a team came from a closed military establishment in Russia and they dug down 200 meters, right next to the gas fire, and they dropped a nuclear bomb in there and blew it up, expecting to spill a big mound of dirt on top of the gas fire and just snuff it out. But what happened instead is that somehow the blast went sideways, horizontally, and not onto the gas fire. And then the plume from the nuclear bomb went up through the gas well and made a huge column a mile into the sky from the explosion. And then fallout rained down. They had to evacuate Russian soldiers and villagers nearby. It was not far from Kharkiv.

 

That was in the 1970s. And that was in the same year that a nuclear explosion for civilian purposes became an experiment that went disastrously wrong. And there are lots of incidents like that. 

 

They had 104 accidents at the Chernobyl plant in the five years before the big accident in 1986. It was a tottering enterprise to run. Lots can go wrong and lots apparently did. 

 

Robin Lindley: What have you learned about the nuclear accident in northern Russia this past August, where there was an explosion perhaps involving a missile experiment?

 

Professor Kate Brown: I only know what we all read in the papers. I've been reading a little bit in the Russian papers and they don’t have much more than what's in the English papers but this seems like a case of press the replay button from Chernobyl with denials that it happened. And then, seven Russian scientists died. That's significant. There’s a lot of secrecy about the situation. It doesn't appear to have created anywhere near the levels of radioactivity and fallout as Chernobyl. They were trying to develop some kind of weapon, but we don't know exactly what. There's some speculation about a weapon for a nuclear submarine or some kind of missile. So it's unclear. 

 

Robin Lindley: Congratulations on your new role at MIT as a professor in an interdisciplinary program.

 

Professor Kate Brown: I’m teaching in a program in science, technology and society. At MIT we train future scientists and future engineers so it’s a wonderful place to think about science and to talk with students about not just creating beautiful machines, but also thinking about how they will be used in the worst and the best of all possible situations. It’s an exceptional opportunity. 

 

Robin Lindley: Are you continuing your research on Russia and on nuclear issues?

 

Professor Kate Brown: No. I thought I'd move on from that. I feel like I’d just start repeating myself. Now I'm interested in what I call “plant people.” Western scientists have now validated the notion that plants have distributed intelligence and communicate with one another and across species. I thought to myself, peasants have known that for hundreds of years. So I am going back in time and looking at people who had these insights. I want to know what else they have known that we have missed.

 

Robin Lindley: I’d like to conclude by asking you how you decided on the title of your book about the aftermath of Chernobyl: Manual for Survival: A Chernobyl Guide to the Future? 

 

Professor Kate Brown: I found in the archives all kinds of [post Chernobyl disaster] manuals for how to live on a radioactive landscape. There was a manual for doctors who treated exposed patients, another manual for the meat packing industry, one on how to deal with high and low levels of radioactivity in farm crops, manuals for the dairy industry, the leather industry, and the wool industry, and manuals for the farmers who were going to live there.

 

The manuals were to reassure citizens, and said we've checked the radiation in your population point and everything's fine. No need to worry. There are just a few things you need to keep in mind. Take all your topsoil and remove it and bury it somewhere far from your village. Don't eat any berries or mushrooms. In fact, it’s better not to enter the forest at all. They go on and on like that. Clearly everything was not fine. 

 

That's where I got the idea of the manual. I decided to call it Manual for Survival because I considered the people who lived there to be survival experts. This place had suffered through the revolution, the Russian civil war, the First World War, the Second World War, and famine, and purges. They'd seen it all. And then, they tried to make it better by building a nuclear power plant to bring cheap energy to the villages. And then it blew up. 

 

So they'd seen all the calamities the twentieth century had to offer. And I thought, as we talk about coming threats because of the ecological crisis, that it might be good to know something about how to survive a severe ecological crisis. And so that's what I was looking for: the everyday heroes.

 

I did find lots and lots of people who resisted the bosses who told them to fudge the numbers or to overlook troubling facts or not report radioactivity in the water or the land. And these people stood up to power and said, No, I'm not going to do that. I don't care what you do to me, but I'm going to do what's right. And I found that extremely inspiring. 

 

Nobody got shot and they weren't throwing people in jail for resistance. Some people got docked in pay and other people faced more demands on the job or were demoted. But they continued. So it was possible to be courageous and they actually did a great deal of good. I was purposely looking for that story. 

 

Robin Lindley: Thank you for your thoughtful responses, Professor Brown, and congratulations on your new book and your new position at MIT. I wish you the best.

It’s Been 32 Years since the Conclusion of the INF Treaty Yet Arms Control Is Still Vital

Gorbachev and Reagan sign the INF Treaty

 

 

In August, the United States withdrew from the landmark INF Treaty of 1987 due to the Russian Federation’s continuing violation of the treaty and Vladimir Putin’s reckless deployment of the Russian 9M729 cruise missile. Another crucial arms control treaty, the New START agreement, is set to expire in early 2021. Recently, George Shultz and Mikhail Gorbachev called on American and Russian decision makers to preserve the INF Treaty. (1)

 

More than thirty years ago, Shultz and Gorbachev stepped forward with President Reagan to change history’s direction. Reagan and Gorbachev signed the INF Treaty at their historic Washington Summit on December 8, 1987. The unprecedented agreement eliminated all US and Russian ground-launched missiles with ranges between 500 and 5,500 kilometers. The two countries destroyed a total of 2,692 ballistic and cruise missiles by the treaty’s deadline of June 1, 1991, with verification measures that were previously unimaginable.

 

The INF Treaty had enormous impact: It lowered the threat of nuclear war in Europe substantially and paved the way for negotiations on tactical nuclear and chemical weapons, as well as negotiations on conventional forces in Europe. The INF Treaty facilitated the peaceful end of the Cold War – it was the cornerstone of Euro-Atlantic security in a time awash with bold changes. Never before during the Cold War had Europeans been able to experience life largely free of the fear of nuclear war. Today, the demise of the INF Treaty removes a pillar of global security.

 

How did Reagan and Gorbachev manage to develop mutual trust? Both reconceptualised the notion of “security”: Reagan believed that nuclear weapons, not Soviet communism, were the main enemy. The United States and the Soviet Union needed to work together to rid the world of these arsenals. Reagan and Gorbachev sought to move from mutually assured destruction to mutually assured survival, and the leaders were united in their desire to reduce and ultimately to eliminate nuclear weapons.

 

They created an upward spiral of trust by creating positive experiences with each other. Reagan’s and Gorbachev’s key to success was mutual engagement in as many ways as possible and to move forward in steps. NATO’s policy was another key to success: NATO’s combination of deterrence and arms control made the conclusion of the INF Treaty possible. Strength and diplomacy went together. George Shultz highlighted the relevance of NATO’s dual-track strategy: “If you go to a negotiation and you do not have any strength, you are going to get your head handed to you. On the other hand, the willingness to negotiate builds strength because you are using it for a constructive purpose. If it is strength with no objective to be gained, it loses its meaning. […] These are not alternative ways of going about things." (2) 

 

The INF Treaty could not have been achieved without the support of US allies. NATO’s 1979 dual-track decision reflected the parallelism between strength and diplomacy. NATO offered the Warsaw Pact a mutual limitation of medium-range and intermediate-range ballistic missiles, combined with the threat that in the event of disagreement NATO would deploy additional weapons in Western Europe. Deployment and negotiations were intertwined. NATO’s aim had always been to abolish Intermediate Nuclear Forces entirely. The Alliance championed the so-called zero option as the ideal outcome of the US-Soviet INF negotiations because it would remove all the INF missiles rather than simply controlling their growth in balanced ways. NATO’s solidarity gave the United States negotiating leverage. In the end, Reagan’s and Gorbachev’s political leadership was decisive. Both negotiated in good faith and were instrumental in forging agreement. Both had the political will to hammer out solutions to outstanding issues. Gorbachev, in particular, had to overcome resistance in the Politburo and the Soviet military.

 

This kind of political willingness is absent today. Are we in for another round of INF missile deployments in Europe? It is imperative to avoid a situation where we might have no arms control and no mutual verification at all. Against the backdrop of the Trump Administration’s loathing of arms control, it will be up to the European NATO allies to conceptualize a new arms-control framework for the post-INF world. Recently, at the 2019 Munich Security Conference, German Chancellor Angela Merkel raised the idea of incorporating China into a global INF arrangement. Russia’s apparent interest in the deployment of INF weapons in Asia might give Washington a new chance to engage with the Kremlin: Both Russia and the United States have a mutual interest with regard to China’s inclusion in a new INF negotiating framework.

 

(1) See George P. Shultz, “We Must Preserve This Nuclear Treaty,” The New York Times, 25 October 2018; and Mikhail Gorbachev, “A New Nuclear Arms Race Has Begun,” The New York Times, 25 October 2018.

(2) Interview between James E. Goodby and George P. Shultz, The Foreign Service Journal, December 2016, see http://www.afsa.org/groundbreaking-diplomacy-interview-george-shultz, accessed 22 December 2017.

Echoes of the 1969 Vietnam War Protests 50 Years Later

Moratorium Protests in D.C., 1969

 

“Old men forget; yet all shall be forgot, But he'll remember, with advantages, What feats he did that day,” predicted Shakespeare’s Henry V.  

 

Well, I’m not so sure that mine was much of a feat, that pleasant evening of October 15, 1969.  But just being in Manhattan, marching toward St. Patrick’s Cathedral, was in itself risky business…that is, if you were a member of the American military.  And so we were, my friend Johnny and I.  

 

Of course, we wore our civvies.  All the same, we stood out.  Fresh from the Coast Guard’s boot camp in Cape May, New Jersey, just a month earlier, we still had nothing but stubble on our scalps… this when “Hair” was the hottest ticket on Broadway. Neither of us owned a civilian cap, so, yes, indeed, we stood out.  

 

What would have happened to us, had we been spotted by a Coastie officer or senior petty officer stationed with us on Governors Island in New York harbor is hard to say.  At a minimum, I suppose, I could have kissed goodbye my berth in the February 1970 Officer Candidate School class.  Worst-case scenario: our commanding officer, who was in profile the spitting image of the American eagle, was fond of warning us that a cutter left from the island for Vietnam almost weekly.  He could make sure the next to sail found room for any one of us who didn’t properly appreciate our cushy place in the personnel department. 

 

That would have been very bad luck, indeed, especially considering I had joined the Coast Guard to avoid being drafted into the Army or Marine Corps. So perhaps being in Manhattan, candle in hand, with the tens of thousands who turned out that night, was something of a feat.

 

The Big Apple gathering was only one of many occurring across the country that night.  Boston’s was the largest, as Senator George McGovern was the big draw.  A Rhodes Scholar named William Jefferson Clinton organized one in England, a fact that would cause him a bit of trouble in 1992.  But he assured voters he hadn’t even inhaled and skated past accusations about his Oxford antics to slide into the White House.

 

Feat or no, as active-duty military, Johnny and I added to the noteworthy diversity of the Moratorium.  The organizers consciously chose to take to the streets, rather than focusing on college campuses, so that John and Jane Q. Public might participate. And participate they did.  In my view, October of ’69 marked a turning point in opposition to the war.

 

The turn had been a long time --- and 45,000 young American lives --- in coming. In 1964, when I was still in high school, journalists first began speculating that U.S. “advisors” in South Vietnam were actually engaging in combat.  August 2, 1964, a month before I began my senior year, was the day the USS Maddox claimed to have been attacked by North Vietnamese torpedo boats. Though the exchange of fire left no American casualties, it got LBJ a Congressional blank check (the Tonkin Gulf Resolution).

 

During the next four years, while I enjoyed my 2-S (college student) deferment and majored in fraternity at Franklin & Marshall College, the twin themes in Vietnam were escalation of American troops on the ground and American bombs from the air. Frustrated and hated by many, Lyndon Baines Johnson pulled himself from the ’68 race. More Hamlet than Henry V, Vice President Humphrey dithered his way to defeat, handing Richard M. Nixon the most remarkable come-back story in American political history.

 

Nixon claimed during the campaign to have a secret plan to extricate us from the war.  A college senior by then, I dearly hoped his plan would kick in before I lost my deferment. But by the time I received my reclassification to 1-A (prime meat, if you will) in June of ‘69, thousands more American boys had died during Tricky Dick’s first half-year in office.  I called my local draft board and asked if I could appeal my reclassification on the basis of a graduate fellowship to Penn State.  

 

“Sure, you can appeal, son,” came the elderly reply down the telephone line.

 

“What will that do for me?” I followed up.

 

“You’ll be inducted in August instead of July,” she responded.

 

Thus, my four-year enlistment into the ranks of Charlie Gulf.

 

And, thus, following nine weeks of basic training, a week of liberty, and hardly a month in my Governors Island duty station, I was walking down Fifth Avenue to the Cathedral, where the speeches took place.  And I was in the company of men and women from all walks of life and economic circumstances.

 

The October events were dwarfed a month later by the Moratorium-on-steroids in Washington.  Outdoing its October predecessors, the November 15th event drew a half million souls, including many bright lights from the entertainment industry.  Two days prior to the Saturday extravaganza, at least 40,000 made a Thursday evening “march against death” that dragged into Friday, making the Moratorium something of a three-day phenomenon.

 

The Moratoria (for the benefit of any Latin scholars reading this) were inspiring, hopeful, even exhilarating… and utterly useless, so far as I could tell then or can remember now.

 

Nixon, reportedly infuriated but presenting a façade of indifference, allowed the war to drag on.  I was stationed in Cleveland on Lake Erie, when 25 miles away in May of 1970 National Guardsmen shot students at Kent State University.  Campuses across the country erupted in turmoil and rage.  Two years later, Nixon won reelection by a landslide against the same McGovern who had captivated the crowd in Boston three years earlier.  I guess Nixon was right about a silent majority.

 

Old men forget?  No…as I watch events unfolding in Washington today, I recall the river of candles flowing up Fifth Avenue fifty years ago. And I recall the words of another writer, a Frenchman, I think: “The more things change, the more they remain the same.”  (Don’t ask for that in French. Remember, I majored in fraternity and got D’s in French.)

Americans Were Too Stodgy for Art Nouveau

 

Art nouveau was the modern art movement that America almost entirely missed.  Not only is it absent from our streets, it is largely missing from our museums, a void that a traveling exhibit, Designing the New: Charles Rennie Mackintosh and the Glasgow Style, now at the Walters Museum in Baltimore, is seeking to repair.  Nevertheless, a traveling exhibit and the fact that we had Louis Comfort Tiffany is small consolation when set beside the gloriously curved and tiled art nouveau buildings that sprouted like exotic flowers in the streets of Vienna, Brussels, Glasgow, and Darmstadt, cultivated by the European architects of the Gay Nineties. Americans who wanted something other than classical styles for their late nineteenth-century buildings chose Romanesque Revival for public buildings, and Queen Anne, or Arts and Crafts Movement-influenced Craftsman, Shingle, or Stick styles for homes.  As I said, we were a stodgy people.

 

Art nouveau did not flourish everywhere; it bloomed in Buenos Aires but not in London, Chicago, San Francisco or Montreal.  It blossomed in wealthy, industrial Glasgow, but not in similarly wealthy, industrial Manchester for reasons that seem rooted in the serendipity of individual genius: the genius of Charles Rennie Mackintosh, Margaret Macdonald Mackintosh (who married Charles), Margaret’s sister Frances Macdonald, Frances’s husband, Herbert MacNair, and the dozen or so others who were part of the Glasgow School, which flourished until art nouveau went out of fashion as the First World War began. Americans can now have a glimpse of the style without crossing the Atlantic in the traveling exhibit that opened at the Walters in October and will go to Nashville, Chicago and the new Museum of the American Arts and Crafts Movement scheduled to open this winter in St. Petersburg, Florida.

 

The Arts and Crafts movement flourished in the same decades as art nouveau, but it sold better in America.  Arts and Crafts was deeply reactionary, a reaction against industrialization and mass production that sought to recover the beauty and authenticity of the past with hand-crafted objects inspired by Europe’s medieval and folk art that tended toward the clunky.  The contrast with the Glasgow School and other art nouveau movements could not have been greater.  The nouveau artists also revisited the motifs and forms of folk, Celtic, Gothic, and Japanese art, but used them to create a modernist movement, a new art for the cutting-edge patrons of the fin de siecle.  The sinuously botanical nouveau designs and buildings, in which the curves of women and flowers intertwine, were shockingly, radically modern.

 

Americans and Brits weren’t up for art that was nouveau; they were enraptured by Arts and Crafts, building mansions and neighborhoods of charmingly retro houses and filling them with hand-crafted evocations of the medieval and peasant world.  Today you can buy an Arts and Crafts home in almost any of the older American cities or Victorian resort towns, but we have nothing to compare with the art nouveau homes and apartment buildings that enthrall visitors to Paris and Prague. No American house can be put in a category with Victor Horta’s Tassel House in Brussels, the interior of Munich’s Villa Stuck, the Vienna Secession building, or the entrances that Hector Guimard created for the Paris Metro (although France kindly condescended to give a Guimard Metro entrance to Montreal and another to the National Gallery in Washington).  We did have Laurelton Hall, the home Louis Comfort Tiffany built for himself on Long Island.  It was torn down after a fire struck the abandoned mansion, but bits and pieces of it can be seen at the Morse Museum in Winter Park, Florida and the Metropolitan Museum in New York.  Tiffany notwithstanding, we are left with the fact that our forebears were too stodgy for the first great modern architectural movement.

 

The Walters is offering Americans a chance to redeem our reputation for rejecting the new (or at least the nouveau) by giving us the opportunity to see the art movement our great-great-grandparents shunned.  It is only a glimpse; no whole Rennie Mackintosh room has been sent across the Atlantic.  To see one you need to fly to Scotland, to Glasgow where there are several, or to Dundee, where the Oak Room from Glasgow’s Ingram Street Tearoom has been installed at the Victoria & Albert’s dazzling new northernmost outpost.  What the traveling exhibition now at the Walters offers is more of a tasting menu: Margaret Macdonald Mackintosh’s show-stopping May Queen mixed-media mural, splendidly styled bookbindings, and images of Rennie Mackintosh buildings.

 

Mackintosh buildings are a marvel.  A contemporary of Frank Lloyd Wright, with whom he is often compared, Mackintosh, like Wright, started on the bottom rung of an architecture firm, cautiously introducing innovative design elements to commissions on which he worked.  His career began to take off in the 1890s when he met Kate Cranston: female entrepreneur. Miss Cranston, as she chose to be known, made a fortune running tearooms in Glasgow, dining establishments where respectable Victorian ladies might meet friends and enjoy a meal in a public space without male escort and without marking themselves as the sort of loose women who dined in public establishments. The freedom to drink a cup of tea with a female friend in a public space was a tiny step forward in women’s liberation.  Miss Cranston continued to expand her tearoom empire after she married in 1892, but the partnership for which she is remembered is the one between Miss Cranston’s Tearooms and Charles Rennie and Margaret Macdonald Mackintosh.  It began when Rennie Mackintosh designed the wall murals for her Buchanan Street Tearooms in 1896.  Charles and Margaret were soon designing whole rooms, and then filling entire buildings with elegantly designed public dining and entertainment rooms for Miss Cranston’s hospitality empire.  They also designed the interior of Cranston’s Glasgow home, “Hous’hill.” 

 

The exhibition devotes a good deal of space to videos of Rennie Mackintosh buildings shot from drones.  While these enable visitors to see architectural details ordinarily visible only to pigeons, they fail to offer museum-goers a sense of why these buildings excite people, something that good architectural photography can do.  Mackintosh, like Wright, is known for creating entire buildings, down to the furniture and decoration of the interior.  But, for better and worse, Mackintosh lacked Wright’s boldness in non-artistic matters.  Mackintosh was not notorious for lying, cheating on employers, exploiting employees, or abandoning wives and children. He was not a serial creator of scandals.  Mackintosh’s designs were as boldly innovative as Wright’s, but he did not have what it would have taken to push his career beyond the small number of patrons willing to commission him in conservative Glasgow.  But, oh, the beauty of those Rennie Mackintosh interiors. 

 

The opening gallery of the exhibit at the Walters is a model of exhibition design. You are immediately faced with an elegant Rennie Mackintosh chair, taller than I am and delicately slender, set on a pedestal in a small space with text and images outlining the careers of the Glasgow School.  Past the chair, at the far end of a long gallery, you see a wall of Glasgow School posters, elegantly elongated female figures that embody the art nouveau moment.  Between the chair and the posters, visitors are given the context within which to understand the Glasgow School, influenced by the art of China, Japan, Persia and the Ottoman lands, but a fundamentally European movement that represented a radical break with the conventions of Victorian design.  Unfortunately, the rooms that follow fail to convey a sense of why people were—and are—excited by Rennie Mackintosh.  And here the curators have my sympathy.

 

Rennie Mackintosh did not design glorious stand-alone objects like a Tiffany lamp or Lalique dragonfly brooch—he designed spaces.  A Mackintosh chair needs its room around it, because it is the totality of the rooms, the impact of the whole that makes you fall in love. 

 

I am an American, and for that and other reasons I have seen many, many Frank Lloyd Wright buildings.  And my thoughts on exploring one always run along the same track.  Initially delighted at the forms, the materials, the patterns, light, and perfection, sooner or later I begin to think, “How very beautiful. How awkward. I can’t imagine living here.”  The Guggenheim, a gallery built for the display of art, is an exceedingly difficult space in which to display art.  It has, for this reason, not been widely replicated. Fallingwater, Wright’s masterpiece in the Allegheny foothills of Pennsylvania, is beautiful, and damp, and inconveniently laid out, and chilly. The perfect dining room in Chicago’s perfect Robie House seats six people, exactly six people.  It would be impossible for the owners to invite three or eight people to dinner, inconceivable that the owners should give birth to a fifth child.  

 

A number of Frank Lloyd Wright houses now hire themselves out as short-stay rentals, which is about the amount of time someone might enjoy living in one. But when I stood in the light-filled living room of Rennie Mackintosh’s Hill House in a Glasgow suburb on a grey morning in a light drizzle, I thought: Imagine how wonderful it would be to be able to sit here, in this luminous and comfortable and exquisitely beautiful room, every day for a lifetime.

 

Designing the New: Charles Rennie Mackintosh and the Glasgow Style runs at the Walters through January 5, 2020. It will travel to the Frist Art Museum, Nashville, June 26 to September 27, 2020; the Museum of the American Arts and Crafts Movement, St. Petersburg, Florida, October 29, 2020 to January 24, 2021; and the Richard H. Driehaus Museum, Chicago, February 27 to May 23, 2021.

 

Designing the New: Charles Rennie Mackintosh and the Glasgow Style is a touring exhibition co-organized by Glasgow Museums and the American Federation of Arts. The exhibition comprises works from the collections of Glasgow City Council (Museums and Collections), with loans from Scottish collections and private lenders. Support for the US national tour is provided by the Dr. Lee MacCormick Edwards Charitable Foundation.

 

 

 

What Should We Do When a Ruler is Mentally Unstable?

David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using science to see our species as we really are (Oxford University Press, 2018).

F. Scott Fitzgerald is said to have commented to Ernest Hemingway that “The rich are different from you and me,” to which Hemingway replied “Yes, they have more money.” Many people similarly assume that national leaders are different from you and me not just because they generally have (a lot) more money, but also because their mental health is supposed to be more stable. 

 

And yet, history is full of political leaders whose mental stability was questionable at best, and others who were undoubtedly nut-cases. Roman emperor Caligula is infamous for sexual excess, for having people killed for his personal amusement, and other pathologies. Charles VI of medieval France became convinced he was made of glass and was terrified that at any moment he might break. Mad King Ludwig II of Bavaria suffered from what now appears to have been Pick’s disease, a frontotemporal dementia with similarities to early-onset Alzheimer’s, along with schizotypal personality disorder. King George III of England evidently suffered from logorrhea, an uncontrollable need to speak and write to a degree that often became incomprehensible, as well as apparent bipolar disorder. And that is only a very limited sample. We can safely conclude that occupying a position of great political responsibility is no guarantee against mental illness.

 

“Only part of us is sane,” wrote Rebecca West. “Only part of us loves pleasure and the longer day of happiness, wants to live to our nineties and die in peace ...” It requires no arcane wisdom to realize that chronic mental illness is not the only source of “crazy” behavior: people often act out of misperceptions, anger, despair, insanity, stubbornness, revenge, pride, and/or dogmatic conviction – particularly when under threat. Moreover, in certain situations—as when either side is convinced that war is inevitable or under pressure to avoid losing face — an irrational act, including a lethal one, may appear appropriate, even unavoidable. When he ordered the attack on Pearl Harbor, the Japanese Defense Minister observed that “Sometimes it is necessary to close one’s eyes and jump off the Kiyomizu Temple” (a renowned suicide spot). During World War I, Kaiser Wilhelm wrote in the margin of a government document that “Even if we are destroyed, England at least will lose India.” While in his bunker, during the final days of World War II, Adolf Hitler ordered what he hoped would be the total destruction of Germany, because he felt that its people had “failed” him.  

Both Woodrow Wilson and Dwight Eisenhower suffered serious strokes while president of the United States. Boris Yeltsin, president of the Russian Federation from 1991 to 1999, was a known alcoholic who became incoherent and disoriented when on a binge. Nothing is known about what contingency plans, if any, were established within the Kremlin to handle potential war crises had they arisen during the Yeltsin period. Richard Nixon also drank heavily, especially under the stress of the Watergate crisis, which ultimately led to his resignation. During that time, Defense Secretary James Schlesinger took the extraordinary step of insisting that he be notified of any orders from the president that concerned nuclear weapons before they were passed down the command chain.

 

Presumably, Schlesinger – and by some accounts, Henry Kissinger, who was then serving as National Security Adviser – would have inserted themselves, unconstitutionally, to prevent war, especially nuclear war, if Nixon had ordered it. As civilians, neither Yeltsin nor Nixon, when incapacitated by alcohol, would have been permitted to drive a car; yet they had full governmental authority to start a nuclear war by themselves. 

 

During his time in office, Donald Trump has been the first US president considered, simply by virtue of his own personal traits, to be a national security threat. He is renowned for repetitive lying, possibly to a pathological extent, and concerns have constantly been raised about his mental stability, impulsiveness, narcissism, sociopathy, and many other traits that, in the opinion of many mental health professionals, would clearly disqualify him from numerous military and government posts … but not the presidency. The fact that the current impeachment inquiry has driven him to acts and utterances that are increasingly bizarre and unhinged, and that by law he has sole authority to order the use of nuclear weapons, should be of the greatest concern to everyone, regardless of their politics.

The Electoral College: How the Founders Cheated You of Your Vote

 

The U.S. Supreme Court will soon rule on a “college” cheating case, whose outcome will shape the future of America’s presidency and republic. Failure to rule decisively will leave the Electoral College and presidential selection vulnerable to Russian, Chinese, and other foreign government hackers and garden-variety corruption.

 

The Court will not judge an ordinary college, of course. Its decision will lay down rules for America’s Electoral College, whose votes in December 2020 and every four years thereafter will determine the next President of the United States and shape America’s democracy. Weeks of debate by America’s Founders failed to set any rules at the 1787 Constitutional Convention in Philadelphia—a failure that led to “cheating” at the College ever since.

 

The question before the Court is whether Electoral College electors must vote according to the preferences of voters who elected them or whether they may ignore voter preferences and cast votes as they see fit. Twenty-one states allow electors to ignore voter preferences, opening those electors to corruption. 

 

Indeed, “faithless electors” have thwarted voter will in five presidential elections, dating back to 1824. The ten who did so in 2016 were not enough to reverse the Clinton-Trump election results, but a defection of that size would have reversed the 2000 presidential election, in which George W. Bush defeated Al Gore by only five electoral votes. A Supreme Court decision to allow electors to ignore voter choices would open the way for a coup d’etat by a despot with control of the nation’s military.

 

The question of how to choose a president plunged the Constitutional Convention into near-chaos for weeks in the summer of 1787. Almost every delegate offered some plan whose flaws always offset its advantages. Indeed, the original proposal for an Electoral College provoked Virginia planter George Mason to rage that foreign interests would benefit from “the corruptibility of the men chosen.”

 

Chaired by George Washington, the Convention convened in mid-May 1787, with Virginia Governor Edmund Randolph proposing that “a National Executive be…chosen by the National Legislature [i.e., Congress].” Contrary arguments immediately plunged the Convention into bitter debate. Making matters worse, delegates had sealed the Pennsylvania State House (now Independence Hall) windows to keep proceedings secret, and Philadelphia’s blistering summer heat sent tempers rising with the temperature.

 

To Randolph’s motion for Congress to appoint the National Executive, Philadelphia’s Gouverneur Morris snapped that the Executive would become “a mere creature of that body…. Usurpation and tyranny on the part of the legislature will be the consequence.” Morris demanded that “citizens of the U.S.” elect the President.

“If the people should elect, they will… prefer some man of distinguished character…of continental reputation. If the Legislature elect, it will be the work of intrigue, of cabal, and of faction. It will be like the election of the pope by a conclave of cardinals; real merit will rarely be the title to the appointment.”

The aging Benjamin Franklin agreed that failure to permit the people to choose the chief magistrate was “contrary to republican principles. In free governments, the rulers are the servants, and the people their superiors and sovereigns.”

 

Made up largely of America’s wealthiest men, the Convention rejected Franklin’s suggestion. Connecticut’s Roger Sherman claimed the country was too large for a popular vote: “The people will never be sufficiently informed.” South Carolina planter Charles Pinckney agreed, warning that the three most populous states would combine to elect the president and thwart the will of all other states. Nine states agreed and voted down popular elections. 

 

Chaos ensued, with delegates trying to out-shout each other: One called for the House of Representatives to select the President; another for the Senate to do so. “State governors…” cried another; “state legislatures…” “A lottery….”

 

Washington stood and demanded order!

 

Maryland lawyer Luther Martin then proposed that electors be appointed by legislatures of each state to select the president. North Carolina’s Hugh Williamson sprang to his feet and objected. Electors “would certainly not be men of the first nor even of the second grade in the states,” because senators, representatives and other federal office holders could not serve as electors. Such undistinguished men, he insisted, would be easy prey for domestic and foreign corruptors. The Convention agreed and rejected Martin’s proposal.

 

Williamson then proposed giving executive power to three men, each “taken from three districts into which the states should be divided.”  A North-South civil war over slavery, he reasoned, would be inevitable if a northerner won executive power over the South, and vice versa.  The Convention, however, believed his proposal would thwart union and itself make civil war inevitable, and rejected it.

 

Virginia Governor Randolph charged that all proposals under consideration posed “the danger of monarchy” and “civil war.”  The executive “will be in possession of the sword: Make him too weak, the legislature will usurp his powers. Make him too strong, he will usurp on the legislature…. A civil war will ensue, and the commander of the victorious army will be the despot of America.”

 

As arguing intensified, Washington again barked for order and warned delegates to settle their differences: “There are seeds of discontent in every part of the union ready to produce disorders, if…the present Convention should not be able to devise…a more vigorous and energetic government.” 

 

But Elbridge Gerry of Massachusetts gave up: “We seem to be entirely at a loss on this head.”

 

On August 31, most other delegates also admitted defeat and turned the problem over to a committee of eleven—one member from each participating state—to produce a binding resolution. With six slave-state delegates in the majority, the committee gave each state government power to appoint, directly or by popular vote, “a number of electors equal to the whole number of senators and representatives to which the state may be entitled in Congress.” Those electors would choose the President and Vice President.

 

Delegates from the North were furious. Three-fifths of the non-voting slaves would inflate the population that determined the number of each southern state’s representatives in Congress, allowing a relatively small number of southern freeholders—the wealthy white plantation and property owners who owned most of the land and slaves in the South—to elect a disproportionately large number of electors to the Electoral College. 

 

The northern delegates, however, recognized that rejection of southern demands would end chances for a constitution and union. Virginia was America’s largest, most populous, and wealthiest state—the essential core of any union.

 

On September 12, 1787, the Electoral College was born. In the sixty years that followed George Washington’s election as America’s first president, the South would elect nine of the first twelve presidents, who filled the office for more than forty-eight of those years, until the eve of the Civil War.

 

The Constitutional Convention left each state free to decide how it would select its electors and whether or not electors would have to cast ballots according to voter preferences. To this day, no federal law prevents electors from disregarding preferences of those who appointed or elected them. In the 230 years since creation of the Electoral College, 214 “faithless electors” have disregarded voter preferences in 19 of 58 presidential elections. The College has elected five presidents who lost the popular vote. No “faithless electors” have ever been prosecuted.

 

Although Benjamin Franklin signed the Constitution, he was not enthusiastic: “I confess that I do not entirely approve this Constitution at present, but I am not sure I shall never approve it.” Washington agreed, conceding imperfections, but citing Article V permitting future amendments to remedy defects.

 

Washington, however, lived in a nation of 4 million people and 13 states, with fewer than 50,000 eligible voters. He did not—and could not—envision his nation exploding into an empire stretching from the Atlantic Ocean across one-fourth of the planet’s circumference, midway into the Pacific. He did not—and could not—envision a nation of more than 325 million people in 50 states, each with conflicting and often irreconcilable interests.

 

Despite perennial demands to replace the Electoral College with popular elections, every Congress has refused. Although twenty-nine states and the District of Columbia have imposed laws against faithless electors, sanctions range from only token fines to mere nullification of their ballots. Sometime in the coming months, however, the U.S. Supreme Court will rule on whether electors in the Electoral College must obey voters. If it fails to rule or if it rules to free electors of obligations to voters, it will open the Electoral College to foreign government hackers and both foreign and domestic corruption that may end democracy in America as we know it.

 

[Quotations from the Constitutional Convention in Philadelphia in this article may be found in The Records of the Federal Convention of 1787, Edited by Max Farrand (New Haven, CT: Yale University Press, 4 vols., 1966), II:63-70 (Friday, July 20, 1787) and Notes of Debates in the Federal Convention of 1787 Reported by James Madison (New York, NY: W. W. Norton & Co., 1987), 331-336.] 

 

Who Will Be America's Brutus?

The Death of Julius Caesar by Vincenzo Camuccini

 

Brutus, hero or traitor? George Washington or Judas Iscariot?

 

The inventors of the American political system knew stories of the Roman Republic well. They were aware that ambition and the hunger for power and glory led to its demise.  Over time, Republican Rome was replaced by an empire headed by autocrats and dictators.  As the founders modeled their government, in part, on the Roman Republic, the fear that the new nation might follow the same path motivated the framers to build into the system a variety of checks and balances designed to make power counteract power and ambition counteract ambition.  Their great fear, “tyrannophobia,” led them to limit the powers of the presidency and to bind the office to the rule of law.

 

Today, that system is under threat by a president who is doing violence to the rule of law and democratic norms. What can we learn from Roman history that might help us confront a runaway presidency in our age?

 

Fear of Caesar’s growing power led many Romans to rise up in opposition to the budding emperor.  While Rome maintained the paraphernalia of republican structures, real power was held by one man: Caesar.  Brutus, one of the most respected men in Rome and a close friend to Caesar, joined Cato in coming up against Caesar’s unrelenting ambition.  Caesar’s disdain for the rule of law, coupled with his highly skilled manipulation of public opinion and the military, eventually led to his being declared “dictator” for life.  The Republic began to dissolve, only to be replaced by empire.

 

A number of dissatisfied Senators persuaded Brutus that the rise of Caesar’s power meant the death of the republic. Torn within, Brutus eventually decided that the future of the republic was indeed in jeopardy, and that he would have to betray his friend. A band of Senate dissidents assassinated Caesar, and during the deed, Brutus was alleged to have said sic semper tyrannis (thus ever to tyrants).

 

The Framers lauded the efforts of Brutus, Cato, and other defenders of the republic who opposed the rise of imperial power in the form of Caesar and those emperors who followed him. Brutus and Cato became American symbols of liberty, much celebrated for their defense of republican government.  In 1778, General George Washington even had Joseph Addison’s play CATO, A TRAGEDY IN 5 ACTS performed for the troops at Valley Forge. In the early American republic, the names Brutus and Cato were often used as pseudonyms when writing about public affairs to gain gravitas with the reader. 

 

Brutus was close to Caesar, thus his conundrum. Side with a friend, or with the republic? Brutus agonized over the decision. Was murder of a friend noble or a betrayal? Selfless or selfish? At the conclusion of Shakespeare’s Julius Caesar, the Bard has Antony, Brutus’ rival, say:

                  This was the noblest Roman of them all.

                  All the conspirators save only he

                  Did that they did in envy of great Caesar.

                  He only in a general honest thought

                  And common good to all made one of them.  

                  His life was gentle, and the elements

                  So mixed in him that nature might stand up

                  And say to all the world “This was a man.”

                  (5.5.67–74)

 

Of course, the result of the assassination of Caesar was a bloody civil war, and in the end Octavius (Augustus) took over as dictator as the republic all but disappeared. In this, we are left to ask, as Marjorie Garber does in SHAKESPEARE AFTER ALL:

…as Cassius and Brutus are soon to learn, they have killed the wrong Caesar. They have killed the private man, the one of flesh and blood. But the public man, the myth, lives on, after his death, and after theirs, and long after Shakespeare’s. “Julius Caesar, thou art mighty yet.” …The plebeians are a swayable chorus, a malleable, responsive audience to be played upon by the clever actor…” (p.419)

 

Caesar was dead, but the idea of a Caesar was very real and powerful, and lived on.  The people bowed to Caesar, honored and worshiped him, celebrated his accomplishments.  They did not rise to defend the republic; they rose to cheer for their savior.  And Cassius, a leader of the rebellion against Caesar, noted of the people:

                  And why should Caesar be a tyrant then?

                  Poor man! I know he would not be a wolf

       But that he sees the Romans are but sheep. 

       He were no lion were not Romans hinds.

                  (1.3.104–107)

 

Is a republic worth saving if the people are sheep?  Was Rome merely biding time, waiting for its Caesar?

 

Today, we ask, who will be our Brutus? Who will save us from the threat to the republic posed by Donald J. Trump? In Trump we have a president who daily violates the norms of democratic behavior, the rule of law, the system of checks and balances. Have we a Caesar in our midst, imperial and imperious? Who will figuratively kill the beast that is doing such violence to the American system?  And is Trump the real problem, or is it Trumpism? After all, a segment of the public willingly follows behind the Pied Piper. Are we sheep who do not deserve a republic? Truly, the fault (dear Brutus) may not be in the stars but in ourselves, that we hunger to be led, to be dominated, to follow the strong leader.

 

At the end of the Constitutional Convention in Philadelphia, a woman asked Ben Franklin, “What is it, Dr. Franklin, a republic or a monarchy?” to which Franklin famously replied, “A republic, if you can keep it.” Keep it indeed. It is not up to the president; it is up to us. Do we deserve a republic, and can we do the hard work to keep it?

The World War I Battle That Didn't End with Armistice Day: Hunger

 

Even after the Armistice of November 11, 1918, ended World War One, American soldiers were carrying out heroic missions. Lieutenant Orville C. Bell and officers of the American Relief Administration (ARA) saved civilians in Montenegro from starvation.

The Balkan country was occupied by Austria during the war. The fighting destroyed food production and the occupying forces seized all the supplies. That left villages in Montenegro with little or no food. The Armistice could not stop the enemy of hunger, and a ferocious winter was setting in. Lt. Bell and the ARA faced a daunting task in Montenegro, a mountainous country with no railway operating after the war. How would they get food to the starving population? Would more innocent people become the war's casualties? While trying to reach one district in March 1919, Lt. Bell wrote, "It is completely isolated because of the impassable mountains whose trail has been blocked by snow for some months and because of the broken bridge on its one good road...."

Many of us take for granted the vast networks of transportation that move food from farms and factories into our communities. But in a war zone such systems break down. In Montenegro, the breakdown and its effects were extreme. Bell wrote that "many of the people were eating grass. For the last four months the death rate has been enormous." The New York Times reported that people in Montenegro had given up hope. But Lt. Bell and the ARA had different ideas.

Bell and his team carried out an extraordinary mission and transported the food through the mountains and over broken bridges. They used cableways to cross where bridges were down, and animals to move the food. It worked, and the soldiers successfully distributed the food throughout Montenegro. In Bell's report, from the Hoover Library at Stanford, he writes, "To get into the country from the top of the mountains wagons, burros and pack horses are used. They are carrying their loads of flour, lard, pork and milk to all parts of the little country in spite of snow storms that have closed many of the passes…."

Bell and his team did the impossible and brought food over the harsh terrain to the hungry. While everyone likes deliveries, just imagine the overwhelming sense of relief and joy the people in these isolated communities must have felt to see food arrive after months of almost none. The Times quoted Lt. Bell: "They have stopped digging graves and are, instead, planting their crops for this year's harvest." The American Relief Administration performed these heroic deeds throughout Europe after World War I, saving millions from hunger and starvation.

 

One of the forgotten consequences of war is food shortage and hunger. It can take years for agriculture to recover from war damage. Even when a conflict ends, severe hunger continues to be a threat. The worst effect is malnutrition, which can stunt children's growth and development or claim their lives.

 

The wars today in Syria, Yemen, South Sudan, and the Central African Republic are also major hunger emergencies. Humanitarian organizations are carrying out heroic missions in these countries, yet hunger relief efforts there often lack funding. The UN World Food Program, Save the Children, Catholic Relief Services, Action Against Hunger, and UNICEF need more resources to save lives. This can be achieved if we set our minds to it.

 

Like the heroic officers of the ARA, we should do everything we can to feed the hungry in war-torn and impoverished nations. No obstacle was going to stand in the way of their mission to bring food to the poor and vulnerable. The ARA officers give us an example of a world we want to live in, where no one suffers hunger and despair. We will respond to the cries for help and feed the hungry. 

 

What If Donald Trump Resigned?

 

Editor's note: This article was originally published in May 2019. Given the recent impeachment inquiry and the article's increased relevance, it is being republished.

 

Two thirds of the American public (give or take a little) now believe that it is time for our President to stop being President. Trump should no longer have the power to take us to war on a whim or to ruin the careers of our leaders. 

 

There has been endless, somewhat idle  discussion of “Impeachment”  in Congress. It hasn’t proven, so far at least, to be the answer to our dilemma.  There has developed considerable agreement that a case for an Exodus needs to be made—and soon.  While that case can be made (by lawyers, by partisans, by the impatient, and by those who take our foreign affairs exceptionally seriously), there is a plain truth: we’re getting nowhere. 

 

Tempers have risen as the convoluted months have passed. Countless speeches have been made urging change—and not just in favor of immediate action.  There are among us political party members who pause, consider, maybe show some sadness, and dwell a bit drearily on the theme:  “Yes, I know he really has to go.  But we’re getting nowhere.”

 

I have slowly arrived at a point of view.  Oh, I’ve done what I can:  I’ve written three substantial articles that unreservedly  attack President Donald J. Trump’s performance in office.  It was a pleasure to write, then read, them—if frustrating.  To the extent there has been a reaction, it has been favorable enough, but mostly ineffective.  “Yes,” vast numbers say, “he does have to go.” 

 

If we agree pretty much on the need for Trump’s departure, the time is very much at hand to ask, essentially, What does he think about it?  What does he want, mid-term in the White House? Does he think there has  been enough roughhousing, yelling, defiance, repudiation of  important leaders at times and for reasons that are bound to be embarrassing?  Persecution, really rudeness, to the Press? Could it be that our peerless leader is agreeable to returning himself to a variety of estates and golf courses?

 

Thinking about his “situation” and the unpleasant circumstances that are slowly developing for us and for him, it does seem to this observer that a moment of crisis is approaching.  What, then, has become the Path I see to some kind of solution?

 

Since writing the initial draft of this article our good Nation has sent an aircraft carrier squadron to the Persian Gulf as an all too obvious threat to the Iranian government.  This aggressive action has been taken entirely on the initiative of the one who has other choices!  Military engagement is not the option that will bring him a true and lasting  sense of well being.  He need not suffer legal confrontations, speech and rebuttal, partisan challenges, and never ending indignities to family members (deserved or not).  As the days drag on it is so very apparent there is a tenable solution:

 

The Honorable leader of the executive branch of the United States should RESIGN at a very early opportunity. The President should not drag his feet until the Situation gets too hot to handle.

 

Yes, the owner of “the Trump estate,” that husband of a lovely lady, parent of stalwart children, and regular commuter to Mar-a-Lago and traveler to random places worldwide in government airplanes, should once and for all  take the terrible pressure off his mind and his health by JUST DEPARTING.

 

When President Richard Nixon finally decided the time had come, he wrote a one line notification of what he was doing.  It sufficed then.  But noticeably more than that is needed now. The President will want to offer his point of view to Posterity!  Believe it or not, we the Public will be receptive to thinking and weighing his final point of view.

 

 I have thought about it.  Here is a tentative draft resignation that I think might serve presidential needs and history as well:

 

“I am today resigning the position of President of the United States, effective at the time of transmitting this letter to the Congress.  The never ending turmoil surrounding daily and weekly events is beginning to be a considerable strain on my  well-being.  I fear that it will affect my physical condition before too long. 

 

“The position I have been occupying is one of never ending, constant responsibility. It has had its rewards, for me and members of my family. I feel I have served my Country well.

 

 “I could continue—waging the never ending political battles that so entrance those for whom such political activity is a lifetime activity.  But I am increasingly aware that Life has other rewards in store for me—provided I treat it with careful regard. 

 

“As I say goodbye, I trust that observers will weigh with proper regard the several aspects of my presidency—partisan or not—and arrive at a balanced verdict on my shortened career as President.

 

“I wish my successors well.  Overall,  I am quite certain that my impact on the Presidency of the United States has been positive.”

 

DONALD J. TRUMP                                   

 

The letter above, drafted clear across the Nation cautiously and respectfully (yet still a Draft),  is the best I can offer for consideration at this point in time. It should not bear my name.   “Draft Letter for consideration” is intended as a title and should suffice.

 

I am suggesting this avenue as a possible way—sometime in the near future--to bring an end to the several  crises into which  our beloved Country has gradually worked itself,  and to avoid any and all wars which may ominously be waiting out there!   Our Leader will write his own letter, of course—and by no means do I expect it will be more than a tiny bit  influenced by my ordinary citizen’s prose—if indeed that. (I have no illusions that my prose will be the words finally chosen!)

 

Do be of good faith, fellow citizens of whatever persuasion.  We must avoid additional unpleasantness—and far worse!  Keep calm on the domestic front, and by all means be patient.  Rise above partisanship.  Let’s meet our Leader halfway on the course I suggest which, if taken, may  just be the direction to improving the future of all Americans.

The Ghosts of Founders Past

 

Halloween held a ghoulish surprise for American President Donald Trump and British Prime Minister Boris Johnson. As a result of their hubris, they were both forced to watch their longstanding dreams of unbridled executive authority die in the ditch of rule of law.

 

As a very scary October for global democracy came to an end, actions by the British House of Commons kept the UK in the European Union and largely prevented the possibility of an economically catastrophic No Deal Brexit, while the American House of Representatives passed the most fair-minded, non-partisan, and forward-looking rules for an Impeachment inquiry in American history. Despite these legislative triumphs, it is far too early to declare victory over the coordinated, Russian-backed, neo-populist authoritarian coup attempts, which have over the last few years sought to capture the epicenters of global power. Nonetheless, it is also quite necessary for historically minded citizens to look back with immense pride at what our elected representatives have achieved over the past two months.

 

We also need to celebrate the robustness of our institutions and the extreme prescience of our founding fathers. They constructed the checks and balances of our Anglo-Saxon institutions for exactly the kind of circumstances we are now living in. When these institutions were created, they were unique in world history. They bucked the global trend throughout the Early Modern and Enlightenment periods towards greater and greater centralization of power.

 

Two-hundred and thirty years ago, in 1789, the United States’ House of Representatives and the Senate held their first sessions in Manhattan, weeks before George Washington was inaugurated as America's first President. This symbolism was important. Our founders had articulated in the newly ratified Constitution and Bill of Rights, that individuals were to be spared from arbitrary authority and that Congress was to have sovereignty over the domains of legislation, taxation, Declarations of War, and assuring that the President of the United States did not conspire with foreign powers against the interests of the United States or seek to use his office for personal gain.

 

Fascinatingly, the United States Congress first sat exactly one hundred years after the constitutional upheaval of the British Glorious Revolution (1688) and the ensuing English Bill of Rights (1689). These monumental developments established Parliamentary Sovereignty in Westminster and imported a new Protestant Royal House from The Netherlands -- a monarchy which had expressly consented to having their powers constrained even before receiving the throne. The American founding fathers were extremely aware of British precedent. They sought to adopt what they thought best about British constitutional and parliamentary practice into the new Republic.  

 

Dispersion of powers and the supremacy of the elected legislative branch over the executive branch was the key Anglo-Saxon innovation. The founders on both sides of the Atlantic knew that circumstances would change, requiring flexibility in how the Congress and Parliament deployed their sovereign prerogatives. As such they only sketched a broad framework relating to issues like impeachment or the precise relationships between Parliament and Government, allowing future legislators and jurists to fill in the details as circumstance would require.

 

Today we see that these precedents, conventions, and institutions are being upheld.  In the UK, the Benn Act, the Letwin Amendment, and the rejection of the Government’s Brexit Bill timetable -- and in the U.S., the Whistleblower Complaint, the House Intelligence committee subpoenas, and the Oct 31 ground rules for the Impeachment inquiry -- are exactly the kinds of legislation, amendments, and legal frameworks that the founders wanted our sovereign parliaments to be able to devise to fit evolving circumstances, when they dreamed up our forms of governance over two centuries ago.

 

The sovereignty of our parliaments is a mirror image of the freedoms we have as individuals that are encapsulated in the Anglo-American Bills of Rights.

 

We were truly on the cusp of losing our personal freedoms. Had we been living in the younger and more centralized democratic systems of continental Europe or the even newer democracies in the former French and Spanish colonial spheres, the existing structures and precedents would have made it much harder to check an Executive keen on overreach. Those systems are what Samuel Huntington referred to in his Political Order in Changing Societies as ‘modern’ (i.e. more centralized), with the executive being the most important branch of government, whereas he referred to Anglo-Saxon political systems as ‘Tudor’ (i.e. having power more dispersed and with the legislative branch as primus inter pares). Huntington correctly diagnosed that each new technological advance in communication and human organization had made an ever greater and more arbitrary centralization of authority possible. Huntington also grasped that what technology had made possible, new political forces and ideologies would arise to try to implement.

 

Miraculously, our founders anticipated developments like this, grasping the fundamental truth articulated by Huntington that advancements in communications technology make centralization of authority increasingly likely. Our founders sought to devise institutions that would buttress what they saw as our uniquely Anglo-Saxon cultural legacy of individual freedom by creating a sovereign legislature with robust oversight powers to check aspiring tyrants, long before they could usurp unfettered power.

 

On All Hallows’ Eve 2019, the Ghosts of Founders Past were reveling in their uncanny prescience by exercising their all too righteous revenge.

The Internet at 50: The future and “dissolving containers”

 

This is the fourth article in a series reflecting on the Internet at 50. For the first article on the four developments that created the world wide web, click here. For the second on the dot com bubble burst, click here. For the third on the night the Internet was born, click here.

 

In 50 years, the internet has grown from a basic experiment that connected two computers in 1969 into the pervasive communications tool we use today.  And as described by Harlan Lebo, author of 100 Days: How Four Events in 1969 Shaped America (Amazon, Barnes & Noble), the only thing that will remain constant about the internet is its unceasing state of change.

 

 

____________________

 

There is a story about the internet – actually a true incident – that vividly illustrates how swiftly events can evolve in the online world.

 

The story goes like this: in 2003, two historians who studied the dot-com industry in the wake of the bubble burst decided to write a book about the future of the internet. They talked to a publisher, who told them, “if you can make a case for transformation to come in the internet, I’ll buy your idea.”

 

But the historians could see nothing on the near horizon that seemed like major transformation to come. So they gave up their idea for a book, thinking that the internet, as far as they could see, would thrive – but without major variation – for the foreseeable future. 

 

Instead, during the next five years came Facebook (2004), YouTube (2005), Twitter (2006), the iPhone (2007), and the Android operating system (2008) – five developments among many that would, yet again, profoundly reshape the internet in the American experience.

 

* * * * * * * *

 

Facebook, YouTube, Twitter, and other social networking applications created unprecedented opportunities for online interaction – how users communicate and present themselves to the world.  They have opened new gateways to communication and expression for billions of users worldwide. Yet at the same time, the three services have become lightning rods for controversy about privacy, personal intrusion, and political confrontation that says much about the shifting and conflicting nature of the internet and its role in the American experience.

 

On their own, Facebook, YouTube, and Twitter had potential as broad services for communication and social contact.  But their value skyrocketed with the arrival of the principal breakthrough digital tool of the 2000s: the smartphone.

 

One can imagine, 50 years ago, just how magical the prediction of a smartphone would have seemed: a replacement for a conventional telephone, computer, flashlight, wallet photo insert, credit cards, calendar, books, audio and video players, and magazines – all in a device that also offers internet access and fits in a pocket.  A single smartphone today has far more processing power than all of the computers at NASA’s Johnson Space Center 50 years ago that managed the Apollo missions to the moon.

 

In 2019, more than 2.7 billion smartphones – either Apple’s iPhones or an even larger number of devices employing Google’s Android system – are used worldwide, now as a seemingly-permanent physical fixture in the internet landscape. The smartphone has become the current manifestation of the power and influence of the internet in the lives of its users.

But the smartphone is already beginning to advance beyond its current stature, as it represents only one example of how the internet and the tools that use its capabilities will continue to transform. 

 

* * * * * * * *

 

Fifty years after the internet was born, where is the technology going in its middle age? First comes recognizing that such questions no longer refer to the internet as a single entity. The expansion of the internet now encompasses the broad range of hardware and software defined as “digital technology,” as the online realm grows, changes, and then changes again.

 

Advances to come in digital technology have been foreshadowed by shifts that are already occurring.  Among the developments with the greatest impact on users are “dissolving containers” – a term digital strategist Brad Berens uses to describe how a physical item for using content (such as a DVD) gives way to intangible files, or to new devices that are more sophisticated, specialized, or smaller.

 

“The vinyl record album became the CD, which became the mp3 file,” said Berens, principal at Big Digital Idea Consulting. “Photographs that were previously shot with film and printed on paper are now jpgs that are captured on a memory card and then stored in the cloud.

“Similarly, the desktop computer shrank into the laptop, which became a tablet, and now many people increasingly use their smartphone as a routine alternative to a larger computer.”

 

And as technology dissolves into other digital devices, trends suggest that the object we currently call a “smartphone” represents only the current capabilities – and not the future – for how we create digital connections.

 

“Most of us consider the smartphone to be a primary communication tool in our lives,” said Berens.  “But the smartphone as we now know it may soon dissolve into a different kind of device that is even more comprehensive in function than our current phones.

 

“For example, we’ve already seen the smartwatch replace some functions of the smartphone,” Berens said. “The smartphone may dissolve until it is merely a tiny processor in your pocket that controls several sensors for communication that we carry or wear.

 

“By then,” said Berens, “the physical form of the ‘phone’ will no longer be relevant.”

 

As the smartphone changes, that evolution will be typical of the broader progress to come in how online communication and information gathering will expand around individuals – by some predictions filling a person’s physical environment.

 

* * * * * * * *

 

“Why do we need to have a physical device we hold in our hand to communicate just because they aren’t physically near us?” said Marcus Weldon, president of Bell Labs and chief technology officer of Nokia. “It won’t be long before we will not just connect people more intuitively, but we will actually connect everything – your environment, buildings, cities – so that we can actually optimize you in your world.

 

“The future won’t be about Fitbits or smartwatches,” said Weldon, “but about adding, embedding, or even ingesting sensors on – or in – everything.”

 

As a result, said Weldon, “We will be able to eliminate mundane tasks, live more complete lives, be more productive, and do more creative things. I think that's a very interesting new reality to hope for – human beings perfectly assisted and augmented by machines.”  

That type of monitoring would also include voice recognition that would identify an individual at any location. 

 

“Computing and digital technology should be useful to me wherever I go,” said Leonard Kleinrock, the UCLA computer scientist whose lab was the home for the first connection between computers 50 years ago that is recognized as the birth of the internet.  “I should be able to reach out to any computer without a privacy breakdown, so it recognizes me and allows me to use it.”

 

But if continuous monitoring seems like a troubling specter of “Big Brother,” designers emphasize that control of information is key.

 

“It won’t be a creepy, ‘Big Brother’ way of gathering information,” said Weldon, “but rather done in a way that users are in control of their own information – a friendly ‘Little Brother’ or ‘Sister’ – a supportive personal assistant. That’s the world we’re moving towards.”

 

If the idea of continuous monitoring seems far in the future, note that the technology in some forms has already arrived: for instance, auto insurers such as Allstate, Progressive, State Farm, and others have created software and monitors that are embedded in a car or activated with a smartphone app that transfers data to the company to judge driving performance. 

 

The reward for using the software – and driving safely – is lower insurance rates. The opposite, of course, also applies: rates go up when a pre-determined number of rules about speeding or other infractions are broken. 

 

“It’s easy to see the extremes of continuous monitoring,” said Berens. “As a positive, for example, a sensor that monitors diabetes that sends information to your doctor and activates the release of insulin in your body is part of a benevolent network of support that keeps track of you, takes care of you, and gives you objective choices about your life. 

 

“However, the dark perspective is that continuous monitoring seems like a constant observation of your behavior and violating your privacy at every moment, as larger forces beyond your knowledge are monitoring everything you do.”

 

But monitoring with digital technology is already viewed by some as a positive. An early example occurred in July 2017: employees at Three Square Market, a technology company in Wisconsin, were given the opportunity to have a tiny chip the size of a grain of rice implanted between their thumb and index finger.  The benefit of the chip was that users no longer needed a staff identification at work, and they could buy lunch from the company cafeteria without cash or credit cards. Fifty of the company’s 80 employees agreed to the implant.

 

However, not everyone is so enthusiastic: the Center for the Digital Future at USC Annenberg followed up Three Square Market’s project by looking at the issue in a survey of a national audience, asking, “if a digital chip could be put into your finger that is painless, invisible, and removable, and that allows you to eliminate all keys, IDs, boarding passes, credit cards, passports, and all possibilities of fraud, would you consider it?” More than half said they would probably or definitely not consider it. However, 19 percent said they would probably or definitely consider an implant.

 

* * * * * * * *

 

Exploring the future of the internet – or rather digital technology – also requires new definitions for long-standing ideas. For instance, in 1970, when a parent asked a 13-year-old in the next room what he was doing and the reply was, “I’m watching television,” that response meant he was likely watching a weekly program on one of the three television networks, or perhaps one of the few local channels. Ask the same question in 1990, and “I’m watching television” meant not only the television networks, but dozens of cable channels accessed through a set top box. 

But by 2010, the definition of that reply had altered dramatically: “I’m watching television” could describe television networks, or cable, or subscription services such as Netflix, or millions of programs on YouTube; today, add a rapidly-expanding number of streaming services to those options.

 

The evolving definitions in the American experience of the digital world take on even greater importance when exploring how they affect individuals and the relationships in their lives.

 

“The internet is changing the fabric of our social relationships,” said anthropologist Genevieve Bell. “We connect with each other differently because of social media, but we also define those connections differently.”

 

For instance, how do we define a friend – and how is that definition shifting because of the internet and digital technology? 

 

“Social media has dramatically expanded our connections to other people, but they have also redefined how we perceive friendship,” said Berens. “In terms of how we develop our relationships, our definition of ‘friend’ is vitally important.” 

 

Views about friendships – and the perceived roles of those friends – can have profound emotional effects when the internet is part of the mix. As just one example, several studies have shown that merely viewing social media such as Facebook can lead to depression, because when users see the momentary emotional peak experiences posted by others – parties, trips, family events – those highlights make the less-exciting routines of viewers’ own lives seem diminished by comparison.

 

“On Facebook,” said Jeffrey Cole, director of the Center for the Digital Future, “everybody appears to be having a better life than we are.”

 

* * * * * * * *

 

Such issues about the role of the internet in the American experience will continue to emerge, as will the wide pendulum swings between the unlimited possibilities and the dark corners in the digital realm. After all, the same online tools that educate millions across the globe or let friends swap chocolate cake recipes also serve as a conduit for plans to produce 3D-printable guns or to deliver messages of hate.

 

“We need to be mindful,” said futurist Rishad Tobaccowala, “that the technology that was a key part of Barack Obama’s election strategy also helped the Russians influence the next campaign that elected Trump.”

 

The internet has long been on a path of constant reinvention, with flux being the sole constant. The biggest question of all is: where will digital technology go next? 

 

“Nobody saw the internet coming as we know it today,” Kleinrock said almost a half-century after the events in his lab that sparked online technology. “Fifty years ago, no one considered the idea of search engines, or websites. And when they came, they were surprises, and explosive. We created this tool called the internet that is constantly shocking us with surprises. It will continue to surprise us.”

Roundup Top 10!  

The Taproot Remains: On the Life and Legacy of Ernest J. Gaines

by Matthew Teutsch

"Gaines...wrote about the people he knew. The land he knew. Their struggles. Their joys. Their lives."

 

Presidential Candidates Crave the Spotlight. 200 Years Ago That Was Taboo.

by David Botti

For a century, presidential candidates were discouraged from openly campaigning - lest they appear power hungry like the British king America revolted against. Here's why that all changed.

 

 

Why Popeyes markets its chicken sandwich to African Americans

by Marcia Chatelain

Popeyes has long cultivated a black customer base — which has positive and negative ramifications.

 

 

The History Behind the Guy Fawkes Masks and Protest

by Sara Barrett

All around the world, protesters wear Guy Fawkes masks to conceal their identity in service of a cause.

 

 

How Richard Nixon captured white rage — and laid the groundwork for Donald Trump

by Scott Laderman

Fifty years ago, Nixon gave us the “silent majority.” Today, Trump proudly declares himself its standard-bearer.

 

 

The Problem With How We Teach History

by Rachel Burstein

Students are still often building up to what they have been told is true, rather than finding truth on their own.

 

 

Why a 1972 Northern Ireland murder matters so much to historians

by Donald M. Beaudette and Laura Weinstein

A recent trial is an example of when historical truth and legal accountability diverge.

 

 

One Big Thing the Dems Get Wrong About Warren

by John F. Harris

The political establishment loves the center. But it’s the radicals who end up writing history.

 

 

Remembering the Ad Hoc Committee for Handicapped Access (AHCHA): Against Erasure of Disability History at the University of Chicago

by Steph Ban

"The irony of placing a reminder of disability history in a stairwell does not escape me nor does it surprise me."

 

 

 

How Einstein Became the First Science Superstar

by Ron Cowen

A century ago, astronomers proved the general theory of relativity — and made him a global household name.

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173544
The History Briefing on Presidential Booing: How Historians Contextualized the News

When President Trump and Melania Trump sat behind home plate for Game 5 of the World Series at Nationals Park, they were met with vehement booing from the crowd. Some in the crowd chanted “Lock him up!”, which is both a play on “Lock her up!”, a common chant heard at Trump rallies aimed at Hillary Clinton, and a reference to the ongoing impeachment proceedings (the rules of which the House voted on last week). The following day, political pundits were eager to debate whether the crowd’s behavior marked the end of an era of political civility. Joe Scarborough and Mika Brzezinski, hosts of the MSNBC show Morning Joe, condemned the crowd’s actions by calling them “un-American” and insinuating that they were stooping to Trump’s level. Several historians and journalists wrote about the history of booing presidents to contextualize the Nats fans’ actions and the ensuing backlash.

 

Lawrence Glickman, a historian from Cornell University, tweeted Monday morning, “Well, what do you know. There's actually an American tradition of booing presidents at baseball games.” Glickman included a link to an article from The Kane Republican reporting that President Harry Truman was booed at a Yankees-Senators game in April 1951. One spectator at that game went so far as to shout “Where’s MacArthur?” By citing this example, Glickman suggested that, contrary to Joe Scarborough's sentiments, booing presidents is an American tradition. It is important for people to understand that this event is not an anomaly, nor is it a Trump-specific phenomenon. Glickman’s tweet shows Americans that this action was not as unique as they, along with many news outlets, might have originally thought. 

 

 

Well, what do you know. There's actually an American tradition of booing presidents at baseball games. https://t.co/jU63skkz3L — Lawrence Glickman (@LarryGlickman) October 28, 2019

Journalist Matt Bonesteel traced this behavior back even further in his article for The Washington Post. Bonesteel pointed out that President Herbert Hoover was also booed at the World Series in 1931. In the middle of the Great Depression and Prohibition, the crowd reportedly yelled “We want beer!” at President Hoover as he left the stadium. Bonesteel additionally mentioned that George H.W. Bush was booed at the 1992 All-Star Game and George W. Bush was booed at a Nationals game in 2008.

 

Historian Kevin Kruse also tweeted about the Bush incidents, and added that President Obama was booed at the All-Star Game in 2009.

And other presidents have been booed at baseball games, in case everyone suddenly forgot. If I'm remembering correctly, Obama was booed at the All-Star Game in 2009, GWBush was booed at a Nationals game in 2008, and GHWBush was famously booed at the 1992 All-Star Game: pic.twitter.com/spZjEj1vqh — Kevin M. Kruse (@KevinMKruse) October 28, 2019

MSNBC correspondent Steve Kornacki tweeted a video of NASCAR fans booing Bill Clinton at a race in 1992, where protestors flew a banner that read “No Draft Dodger for President.” 

Labor Day weekend 1992: With a "No Draft Dodger For President" banner flying overhead, Democratic presidential nominee Bill Clinton is loudly booed by tens of thousands of NASCAR fans pic.twitter.com/wgJAWOfSqL

— Steve Kornacki (@SteveKornacki) October 28, 2019

In all of these examples, booing presidents is “an American tradition,” or, in the words of Heather Cox Richardson, an American history professor at Boston College, “Presidents get heckled; it goes with the turf.”

In an opinion piece for NBC News, political strategist Richard Galen argued booing Trump was admirable. According to Galen, the inaction of Whigs allowed antebellum southern Democrats to expand slavery. Such atrocities are “what happens when good men and women do nothing.” To Galen, history shows Americans that they have a moral imperative to express their disapproval of Trump however they can, even if that means booing him at a baseball game.

 

History has shown that the crowd from last Sunday night’s game was no less civil than sports fans of the last 90 years. Presidents on both sides of the aisle have been, and will most likely continue to be, subject to boos from a rambunctious crowd. The story may have made for great TV, but with history as our lens, we can see that this might not have been the watershed moment the media portrayed it as.

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173459
The History Briefing on the Assassination of ISIS Founder Abu Bakr al-Baghdadi: How Historians Have Discussed Recent News

Last Sunday, October 27th, President Trump announced in a televised news conference that ISIS founder and leader Abu Bakr al-Baghdadi was killed in a US special operations mission. The news came amidst turmoil in the Middle East after Trump pulled US troops out of Northern Syria and Turkey quickly invaded Syria last month. To understand this complex situation, historians contextualized Baghdadi’s death, the history of ISIS, and the history of militant and terrorist leadership more broadly. By looking to the past, we can gain a better comprehension of the effect Baghdadi’s death will likely have. 

 

Max Abrahms, a professor of political science at Northeastern University, authored an op-ed for Newsweek titled “Baghdadi’s Death Does Not Matter.” Abrahms uses the "rules" for militant leaders developed in his book Rules for Rebels: The Science of Victory in Military History to argue that Baghdadi was an ineffective leader. Abrahms studied hundreds of militant groups throughout world history and created three rules that he says smart militant leaders follow: understand that not all forms of violence are equal for furthering political goals, prevent rank-and-file members from harming civilians, and avoid blame for terrorist attacks by low-level members. In his opinion piece, Abrahms reviews Baghdadi’s actions as the leader of ISIS and concludes that Baghdadi failed to follow these rules and was thus an inept leader. Particularly, he finds that Baghdadi failed to realize the value of limiting his followers’ violence against civilians—something that most skilled militant leaders understand. Breaking these rules, Abrahms argues, was a detriment to ISIS, driving away both other militant groups and the local population. Abrahms points out that more fighters have been leaving ISIS than joining it. Further, Baghdadi’s excessively violent approach has motivated the largest anti-terrorism coalition in history. Baghdadi’s approach has made ISIS a highly prioritized target, all while driving away potential allies and recruits. This is why Abrahms predicts Baghdadi's death will not be a great loss to ISIS, as Abrahms believes Baghdadi’s approach did more harm than good to the organization. Abrahms’ particular focus on the strategies of militant leaders makes his input on Baghdadi’s leadership, and the vacuum the terrorist leader will leave, especially valuable. 

 

Greg Barton, a professor of Global Islamic Politics at the Alfred Deakin Institute, makes a more optimistic but still measured prediction of the impact of Baghdadi’s death in an article for The Conversation. Barton examines the history of ISIS, noting that from its beginning it has been a hybrid movement of religious fundamentalists and experienced Baathist military figures. Baghdadi, Barton notes, was a strong leader because his role as a religious figure and his background as an Islamic scholar earned him credibility as the leader of a new caliphate. So Baghdadi may be hard to replace, says Barton, but there are other powerful influences at work within ISIS. Barton connects the rise of ISIS with the de-Baathification project that occurred after the US invasion of Iraq in 2003. Many Sunni military leaders were ousted from the Iraqi government in a short time, which proved to be a great opportunity for ISIS recruitment. Many key figures in the organization are ex-officers in the Iraqi military and intelligence agencies. Though the loss of Baghdadi is a significant blow to ISIS, the hidden core of ISIS leadership still remains intact. Barton predicts that Baghdadi’s death has no chance of being the end of ISIS, but it could provide an opportunity to slow its resurgence. Professor Barton believes the key to optimizing the damage to ISIS is cutting down emerging leaders as they rise to prominence. And this, he believes, is largely contingent on whether President Trump sticks to his decision to pull US troops out of Syria. Professor Barton’s contribution is particularly important as he looks to the genesis of ISIS as an organization to determine the impact the death of its founder may have going forward.

 

Max Boot, a military historian and best-selling author, explores the limits of ‘decapitation’ strategies—the killing of a terrorist movement’s leader—in an opinion piece for the Washington Post. Boot explains that the death of a group’s leader is most detrimental when the group has weak organization and depends largely on a cult of personality. Otherwise, terrorist groups have shown themselves to be entirely capable of bouncing back from the death of a leader. Boot gives several examples of failed decapitation strategies from recent history. For example, Israeli Defense Forces killed Hezbollah’s general secretary Abbas al-Musawi in 1992, but Hezbollah is stronger now than ever before under his successor. Boot fears that, with ISIS already bolstering itself for a comeback in Iraq and Syria, Trump’s recent decision to pull American troops out of Syria will give the group a perfect opportunity to resurge. To Boot, the removal of US troops, and the instability this could bring to the region, is a far more important factor in the fight against ISIS than the death of al-Baghdadi.

 

Rebecca Frankel, who authored War Dogs: Tales of Canine History, Heroism, and Love, examined the headline from a very different angle in a Retropolis article written by Washington Post reporter Alex Horton. A Belgian Malinois dog named Conan helped special forces operatives in their pursuit of Baghdadi. The president hailed the pup as a hero and announced Conan will visit the White House. Frankel details the long history of war dogs, present in some capacity since at least the Civil War. The use of dogs in war has become more extensive since then; they are now a vital resource for locating bombs and for aiding special operations, as Conan did. In fact, a dog named Cairo helped Navy SEALs take down Osama bin Laden. The subject of war dogs is not entirely cheerful, however, as Frankel notes there are issues with the maltreatment of dogs in service, as well as issues for retired dogs, such as PTSD and the difficulty of finding them good homes. 

 

In the fight against ISIS, the military and political situation is complex and ever-changing. Historians disagree about how much damage Baghdadi's death will do to ISIS, but it is clear that the fight against ISIS is ongoing and crucial to restoring stability to the region. 

 

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173461
The History Briefing on Syria and the Kurds: How Historians Have Covered this Top Story

Last month, the White House announced a full withdrawal of American troops from Northern Syria, blindsiding the public and U.S. officials alike. As journalists report on the resulting chaos—Turkey invaded Syria and Kurdish forces have allied with Russia—historians helped the public understand why this all unfolded and what it means for the future of the region and its people.

 

President Trump justified his removal of American forces from Syria by claiming that the Kurds were historically bad allies. At an October 9 press conference, Trump stated the Kurds “didn’t help us in the Second World War. They didn’t help us with Normandy.” In a separate press conference, Trump claimed “One historian said [the Kurds and Turkey have] been fighting for hundreds of years.” Trump believes these conflicts are “not the kind of things that you settle the way that we would like to see it settled.”

 

In a Washington Post op-ed, Sarah Wagner, an associate professor of anthropology at George Washington University, criticized Trump’s statement that the Kurds did not aid American troops in World War II. For Wagner, US presidents often reference military casualties to instill the idea that the United States is “a nation worth dying for” and to bolster nationalism. Referencing military sacrifice can recast history, legitimize a policy decision, or chart a future course for the country. Wagner analyzed President Ronald Reagan’s “Boys of Pointe du Hoc” speech at the 40th anniversary of D-Day. In that speech, delivered a week after the burial of the Vietnam War’s unknown soldier in Arlington National Cemetery, Reagan called for the restoration of “moral order to the world” and connected the military sacrifice of Vietnam and Normandy. The public and press hailed Reagan’s speech for helping to heal the cultural wounds of the Vietnam War. Wagner argues Trump is abusing the emotional power of Normandy to deny the Kurdish people needed aid. Overall, a president can use history to intentionally distract the public from current issues and to reinforce future U.S. policy.

 

Mustafa Akyol, a senior fellow at the Cato Institute, dissected the President's claim that the Kurds and Turkey have fought for hundreds of years in an op-ed for the New York Times. In the early 16th century, the Kurds were caught in the middle of a war between the Ottoman Empire and the Shiite Safavid Empire of Persia. The Kurds eventually chose to side with the Ottomans, and for the next four centuries the Kurds lived amongst Turks, Arabs, Bosnians, Armenians, Greeks and Jews. When the Ottoman Empire was dissolved after World War I, Turkish leader Mustafa Kemal Ataturk solidified the nation-state of the Turkish republic in 1923. While the Ottoman Empire was multiethnic, Turkey promoted a narrow nationalism that excluded groups such as the Kurds. The Kurds’ protests of such discrimination were violently suppressed by Turkey. Based on this history, Akyol believes a solution to the continued conflict can only be reached if Turkey adopts respect for the Kurds and all citizens.

 

Other historians contextualized the history of the United States abandoning the Kurds. Derek Chollet, the executive vice president of the German Marshall Fund, along with Itai Barsade, a program assistant at the German Marshall Fund, argue in a recent Washington Post op-ed that the U.S. has previously faced international criticism for worsening a Kurdish humanitarian crisis. In April 1991, Saddam Hussein and the Iraqi military encroached into a Kurdish region. The U.S. had just abandoned the region after successfully ousting Hussein’s forces from Kuwait. German Chancellor Helmut Kohl called upon U.S. President George H.W. Bush to come to the assistance of the Kurds. After international pressure, Bush ordered Air Force supply drops to the Kurds and condemned Iraq in the United Nations. Chollet and Barsade argue that the U.S. must now recognize its mistake, protect its security interests, and alleviate Kurdish suffering.

 

Finally, Charles Thépaut, a visiting fellow at the Washington Institute for Near East Policy, discussed the recent history of U.S. involvement in Syria in The Hill. The U.S. committed fewer than 2,000 troops to Syria beginning in 2015, tasked specifically with counterterrorism. As part of a global coalition, counterterrorism efforts successfully took back territories previously seized by ISIS. American support boosted security in the region. Now, Thépaut worries Europe cannot bear the burden of stabilizing the region alone. Counterterrorism will be increasingly difficult without the U.S. present. U.S. troops in northern Syria would be a “limited and extremely effective investment.”

 

 

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173340
The Latest Polls Are Worrying Me

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

An NYT article on Monday by Nate Silver, the guru of understanding polls, frightened me: he said that Trump could win in 2020. The main evidence is a poll showing Trump running neck-and-neck with the leading Democrats in certain arbitrarily selected “key” states, with every race very close. I have been relying for my sanity on Trump’s approval rating in many polls, which is down near 40%. How could he beat anybody? Maybe I need to revise my understanding of approval. Perhaps enough people disapprove of Trump but are willing to vote for him anyway.

 

I don’t think that polls a year ahead of time mean much about what will happen, especially if they are close. There are other polls, though, which cause me more anxiety about the state of our nation. Whether Trump wins next year or not, the views of the minority of Americans who populate his “base” are troubling.

 

In the most recent head-to-head polls, white working-class respondents preferred Trump by 25%, the same margin as in 2016 against Clinton. College-educated whites gave the nod to Democrats, but only by 6–10%. Those white working-class Americans, who are as a whole less interested in real political information, still like Trump, after all he has done. A large minority of college-educated whites, who have been trained to understand that information, still pick Trump.

 

Some aspects of the general approval polls offer more, and maybe more unpleasant, information. The tracking of Trump’s approval ratings by the Washington Post and ABC News during his whole presidency shows both remarkable stability and, maybe, the beginning of a serious downturn in the wake of impeachment, as his approval among Republicans dropped to 74%. There will be lots more points on this graph, so little weight should be put on this one. More notable is the lack of change among his most fervent supporters, those who “strongly approve” of him, a share that has stayed between 60% and 70%.

 

Those squishy approval polls are now being directed at the state level to help predict the 2020 decision in the Electoral College. It’s possible to be heartened by the latest poll, just before impeachment began. More approve of him than disapprove in only 17 states, mostly states in the South or next to it. More disapprove than approve in all the states that the NYT article isolated as “key”. One year ago, his approval ratings were positive in 24 states.

 

Fox News reported that its two polls this October showed that more people want Trump to be impeached and removed than oppose it, 50% to 41%. But here’s the scary part. Among those who oppose impeachment, 57% say new evidence cannot change their minds. That adds up to about one quarter of those surveyed. They probably all belong to the 39% who think the whole impeachment inquiry is “bogus.”

 

Recent polls show that Republicans who are regular viewers of Fox News are much more likely to be hard-core Trump supporters. Over half of Republicans who support the president and watch Fox News say there is “virtually nothing he could do to make them stop supporting him.” A different poll two years ago showed the same thing. In August 2017, 61% of those who approved of Trump said they couldn’t think of anything he could do that would make them disapprove of his job as president.

 

For me, the scariest segment of the American electorate has been identified by some political scientists as “chaos-seekers”. They are so disaffected from our political system that they want to undermine it, even destroy it. They use social media to share outlandish stories, such as the “Pizzagate” rumor, the false story about Obama’s birth, and Alex Jones’ lies about the Sandy Hook Elementary School shooting being faked. They don’t necessarily believe them to be true. “For the core group, hostile political rumors are simply a tool to create havoc.” The political scientists conducted thousands of interviews, and found a significant minority of people who agreed with statements like the following: “I fantasize about a natural disaster wiping out most of humanity such that a small group of people can start all over.” “I think society should be burned to the ground.” “Sometimes I just feel like destroying beautiful things.” These people tended to be Trump supporters.

 

A study from 2018 found that Trump supporters tended to “take a belligerent, combative approach toward people they find threatening.” The kind of authoritarianism that Trump’s most fervent supporters embody is “the wish to support a strong and determined authority who will 'crush evil and take us back to our true path.’” The Guardian recently listed 52 Trump supporters who have carried out or threatened acts of violence since his campaign began. Since Congressman Adam Schiff became the leader of the House impeachment inquiry, he has been subject to violent threats on social media that often approvingly quote Trump’s comments about him.

 

A video portraying Trump shooting his critics inside a church was played at a conference for his supporters at Trump’s National Doral Miami resort in early October. The pastor who gave the benediction at the 2016 Republican National Convention told the crowd there, “We’ve come to declare war!” Conference-goers roared back: “War! War!”

 

Trump himself uses careful language to encourage violent supporters and threaten wider violence against his opponents. In an interview in March, he said, “I can tell you I have the support of the police, the support of the military, the support of the Bikers for Trump—I have the tough people, but they don’t play it tough—until they go to a certain point, and then it would be very bad, very bad.”

 

Are we near that point now? What if Trump loses the election?

 

Today, the only important polls will be taken: the elections in Virginia, Kentucky, and Mississippi. They will provide hints about 2020. Then we have only 12 months left to worry.

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/blog/154275
The First World War through a camera lens: from the soldier’s snapshot to the memorial photograph

A photo of World War I soldiers taken with a "Vest Pocket" Kodak

 

The Great War was the first conflict that would be comprehensively documented by amateur photographers. While professional lenses had captured scenes from the Crimean War and the American Civil War, this was the first time that large numbers of serving men took cameras into the frontlines and made a visual record of their own experiences. 

 

Camera technology had advanced quickly in the years prior to the First World War and photography had shifted from a specialised interest to a mass-market craze. Kodak had launched its small, light, easy-to-use ‘Vest Pocket’ camera in 1912 and, two years on, this model sold in Britain for thirty shillings. While that was more than a week’s wage for many manual workers, it was within the budget of most men who would serve as officers in the First World War. As soon as war broke out, camera manufacturers targeted their advertising squarely at that market. 

 

The Vest Pocket Kodak was now sold as the ‘Soldier’s Kodak’ in Britain, and advertising urged, ‘Treat yourself to a Soldier’s Kodak to-day, and start to use it before you go to the front.’ Promotion pushed the idea that photography was ‘in’, and wouldn’t every man heading off to war want to take the latest model with him? ‘Hundreds of them are in use at the front to-day,’ adverts claimed. ‘So get one now and make sure of bringing back a priceless picture record of your share in the Great War.’ The sales pitch ramped up as Christmas 1914 approached and families of soldiers were told what a thoughtful and practical gift a camera would make (‘Must be light – not to add weight to kit – small – to slip into a tunic pocket – strong – to withstand rough usage – simple – to require no skill.’) While around 5,500 Vest Pocket Kodaks had sold in 1914, in 1915 the number soared to 28,000. It’s estimated that one in five British officers had a camera in his kit.

 

The press would encourage this trend too. A new type of popular daily newspaper had emerged in Britain in the years before the First World War, which made extensive use of photography. These papers now wanted to publish images of the war, and with press photographers denied access to the conflict zone, they appealed directly to their readers in the forces.

 

Soldier photographers would capture the day-to-day routine of military life (their pictures were often comradely and comic), but also significant moments of history. In January 1915 several newspapers published photographs of British and German troops ‘fraternising’ on Christmas Day 1914. The military authorities had tried to play down these incidents, but with the emergence of photographic evidence, anecdotes were corroborated and public interest swelled. A letter submitted to the press along with the photographs described how ‘a crowd of some 100 Tommies of each nationality held a regular mothers’ meeting between the trenches.’ It was hardly the sort of story that the authorities wanted the public to be reading, but it heightened the newspapers’ appetite.

 

In January 1915 the Daily Mirror announced that it had put aside a fund of £5,000 to pay for ‘Pictures of the War’. It was now offering a top reward of £1,000 for the ‘most interesting snapshot’ – an enormous sum of money, equivalent to around US $130,000 today. But within days, the Daily Sketch and Daily Mail had followed suit. Thousands of submissions duly poured in, and through the spring of 1915 the three papers regularly published ‘true’ pictures direct from the front. In reality, soldier photographers rarely actually made much money from having their pictures published in the press, but there was pride to be had here too, and the urge to share images and earn acknowledgement for them long predates camera phones and Instagram.

 

By March 1915 the military authorities had had enough. There was a fear that intelligence could be picked up by the enemy, but also concern about how these uncensored images might influence the public mood. A War Office Instruction stated: ‘The taking of photographs is not permitted and the sending of films through the post is prohibited.’ Servicemen were told to send their cameras home and henceforth anyone found taking photographs could be arrested. Many men reluctantly complied – but by no means all. Tolerance seems to have varied according to the attitude of the commanding officer, and with many of the keenest photographers actually being officers, they found ways to continue.

 

Just as amateur photography was being proscribed, so cameras were finding a new purpose in the war. As well as friendships, photography would now capture loss. With casualty lists lengthening, and repatriation of bodies ruled out, some system was needed to record and maintain cemeteries. The Graves Registration Commission was established in March 1915 to perform this role and, from May of that year, it also offered a photographic service. In response to requests from bereaved families, the Commission would provide a photograph of a grave. 

 

Letters from the public to the Commission indicate the importance of this service. ‘Sir, I received the photos, of the grave of Corporal ---. I thank you very kindly for sending them, and I think they are very nice; it gives us an idea of what the grave is like – that is all I shall ever see of it. Thanking you again. From his loving mother.’ For some families this photograph would be all they ever had because, as fronts shifted later in the war, cemeteries were destroyed and remains lost. Some 170,000 photographs of graves would ultimately be taken and dispatched. 

 

But by 1920 funding for this service was running out, and it was announced that no further requests for photographs could be accepted. Demand wasn’t diminishing, though - in 1922 the Commission was still receiving an average of 200 letters per day from bereaved families - and so charities, associations and commercial photographers stepped in to meet this need. From the early 1920s a number of companies, most of them established and staffed by ex-servicemen, were photographing war graves. 

 

Many veterans returned to France and Flanders through the 1920s and 1930s. They retraced their steps, sought out familiar landmarks, and once again shared their photographs with the newspapers. These images seem rather distant from the Tommy snapshots of the first year of the war; the men in these pictures point at weed-covered ruins, and graves, and their expressions are solemn – but, as in 1915, there’s still a sense of them wanting to capture a picture record of their share in the war. 

 

Photograph albums would be consigned to attics in the years that followed, hidden away along with the memories, but many have emerged in recent times, and their images have been published in the press and online. In 1915 one British newspaper observed: ‘We read in pictures now-a-days, see in pictures, learn in pictures, remember in pictures.’ It’s even truer one hundred years on, and photographs from the First World War have never been more widely shared.

 

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173470
This Election Day, Remember That Local Politics Matter

In 1831, following decades of tumultuous French politics, Alexis de Tocqueville toured the United States to understand why democracy had flourished here, while it had failed in France. Ultimately, Tocqueville concluded that the strength of American democracy rested not in the separation of powers or in our Constitution, but in local politics. Local government, he wrote, was “the life and mainspring of American liberty.” 

This week, Americans will have the opportunity to vote for the municipal officials who run cities and towns. Unlike national politics, where most people rely on Democratic and Republican leaders to define the “right” answer, local issues often cut across contemporary party lines. Following local politics offers the chance to shed the partisan blinders we all wear and to think through issues from the ground up. This fact makes local politics an incredible resource for helping children understand what politics is and why it matters.

 

Engaging with local politics is so easy, even a four-year-old can do it. A few weeks ago, my four-year-old noticed the proliferation of lawn signs in the run-up to our town’s municipal election on November 5th (which happens to be her birthday). She asked me, “What does that sign say?” I told her, “Bryan Barash for City Council.” We walked a couple more blocks and she asked me, “What does that sign say?” “Emily Norton for City Council,” I told her. She started counting who had more signs.

 

After enthusiastically keeping score for a while, she asked the next obvious question: “Why are they putting up signs?”

 

“We’re going to choose who’s in charge of our city,” I told her. “We’re going to vote for one of them. They tell us what they want Newton to be like, and we pick whoever we think has the best ideas.”

 

“Who do you think has the best ideas?” she asked.

 

“One of the main things they disagree about,” I told her, “is how many new houses we should build in Newton. What do you think? Should we build more houses?”

 

She was quiet for a moment. “There was just a big storm in Florida, so maybe if we build new houses, people could come live here.”

 

“But if we build more houses,” I pushed back, “there might be more traffic and Newton would get more crowded. You might have to wait longer for a turn on the swing at the playground.” She was quiet.

 

A few minutes passed and I started to wonder if I had gone too far. But then she spoke up again. “I don’t care if there’s more traffic,” she told me. “More houses means more new friends, so I think we should build more houses.”

With just a little support, she grasped the issue, weighed the costs and benefits, and took a stand about the kind of place she wanted to live. At a time when it feels like there’s nothing good about politics, and young people question whether democracy is worth it at all, this conversation with my daughter reminded me of its promise.

 

Almost two hundred years ago, Tocqueville articulated the promise of an American democracy grounded in local, communal life. In the 1830s, of course, the reality of American democracy remained far from Tocqueville’s utopian vision. Laws deprived huge numbers of Americans of their right to vote and failed to protect their liberty. In many cases, local politicians wielded their power to oppress minorities. Even today, the promise of a country whose destiny is shaped by all the people who inhabit it remains to be achieved.

 

But despite these challenges, the dream is worth taking seriously. To Tocqueville, it didn’t matter whether people were motivated by a sense of public service or by the self-interest of not wanting to have a public road cut across their backyard. The strength of American democracy was that citizens have the power to advance their vision of American life in the public square, regardless of why they choose to exercise that power. 

 

This week, Americans will go to the polls. In most local elections, turnout is low. With attention-grabbing headlines from Washington, it’s easy to forget that we make many of the most important decisions that shape everyday life at the local level: how should children be educated? What kinds of housing can be built in our cities and towns? These elections represent an untapped opportunity to talk to young people everywhere about the issues that shape their lives locally.  

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173462
Harriet Appropriately and Aptly Honors Harriet Tubman

 

Harriet Tubman was an iconic figure on the fabled Underground Railroad, a long line of safe houses, or railroad stops, on which tens of thousands of runaway slaves from the southern states reached freedom in the North or in Canada, with a lot of help from black and white friends. She is a lower-case legend, a candidate for immortality on our twenty-dollar bill and a treasure of history prior to the Civil War.

 

Her story is being told in a new film, Harriet, which opened Friday and traces her life and her escape from slavery in Maryland, an escape on foot that covered 100 miles. It is an emotional and gripping story that sheds new light not only on Harriet, but also on the Underground Railroad and women’s history.

 

It is the kind of story that could have been overdone or undercooked, but in the able hands of director Kasi Lemmons, it turned out just fine. It is not only an enjoyable and inspirational film, but one that covers a lot of pre- and post-Civil War history as well as some about the war itself. You learn a lot of history watching it and earn a new appreciation for the brave people, north and south, black and white, on land and on the sea, who, together, helped nearly 100,000 slaves (best guess) escape slavery and live out their lives in freedom.

 

The pace of the story is slow at first, abominably slow, but once Harriet starts to work with the Underground Railroad, with headquarters in Philadelphia, under the direction of William Still, who wrote a book about the railroad, the tale picks up both speed and drama.

 

How accurate is the history of the movie?

 

Well…

 

Harriet Tubman, who lived to be 91, was a hardworking, productive, bold “conductor” on the railroad, but she was probably not the gun-toting combination of Joan of Arc and Annie Oakley, as the film suggests. She might have been hated by the son of her owner, but he certainly did not loathe her the way the character in the movie did, and certainly did not pursue her halfway across the country with a rifle.

 

In the movie, there are numerous dramatic moments that enrich the legend, but may not be wholly accurate. Tubman did carry a pistol for protection, but did she really shoot some people? Did she actually run 100 miles, non-stop, to Philadelphia, as the film suggests? The scene of her wading across a river, in darkness, symphonic music in the background, is preposterous, but it sure made for a great moment.

 

I thought the most interesting part of the movie was Harriet’s growth from a simple runaway slave to a confident, hard-ass leader of the railroad, a born leader who rescued over 70 slaves and brought them north to freedom, risking her life on every single trip. She was also a woman who never stopped her anti-slavery work, even during the Civil War. Harriet, who said she heard voices from heaven, was the Energizer Bunny of her era. 

 

Director Lemmons does good work in not just telling Harriet’s individual story of courage. She covers the entire Underground Railroad story, taking pains to point out, thank you, that thousands of runaways escaped on ships bound from Southern ports to New York and Boston (where dockside police in both ports conveniently turned their backs to runaways going down the gangplank so they could tell their bosses that they did not “see” any runaways disembarking the ship). She also covers the start of the Civil War and the mission of the Union Army on which Harriet served as a spy and armed scout.

 

Harriet enjoyed a good life after the war living on the grounds of Secretary of State William Seward’s home in Auburn, New York, and became a mini-legend by the time she died.

 

Cynthia Erivo, a Broadway veteran, is nothing short of sensational as Harriet. She endures enormous physical and emotional trials, faces down slave catchers and overcomes adversity. She is dramatic, she is emotional. There is a scene early in the film when her husband tells her, after she has been gone for a year up North, that he has taken up with another woman. Harriet cries and then bends over in agony. Erivo is forceful and admirable. It is a star turn, to be sure. 

 

Other good performances in the film are by Leslie Odom Jr. as William Still, Clarke Peters as Harriet’s father, Zachary Momoh as her husband, Janelle Monae as friend Marie Buchanan and Joe Alwyn (very impressive) as her monstrous master Gideon.

 

People will compare this to stories such as Beloved and Twelve Years a Slave. They should not. Harriet stands on its own as the tale of a brave slave who ran off to freedom and turned around to help others do the same.

 

This is not an African American’s story or a woman’s story. It is an American story. 

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173502
A History of Why Trump Abandoned the Kurds

An-Nasir Salah ad-Din Yusuf ibn Ayyub/ Saladin

 

Perdition’s antechamber is the circle known as Limbo, and according to the poet Dante Alighieri in the fourteenth century, this bucolic place was for the repose of those righteous pagans who lived before the incarnation of Christ. While the Jewish prophets and patriarchs had been liberated by Christ after his crucifixion, the pre-Christian pagans who lived righteously were forever to dwell in this not-quite-heaven. Dante makes a temporal exception, however, allowing a few Muslims who were born after Christ’s life. Perhaps the most surprising of these inclusions is a general who defeated the Christian monarch Richard I during the Third Crusade of the twelfth century. To honor this Islamic military genius was as close to ecumenism as was possible for Dante. Because of his qualities of virtue, charity, chivalry, and equanimity, the general appeared in Canto IV with just a single line: “And sole apart retired, the Soldan fierce.” 

 

An-Nasir Salah ad-Din Yusuf ibn Ayyub has long been known to Europeans as Saladin. Religious studies scholar Karen Armstrong wrote in Holy War: The Crusades and Their Impact in Today’s World that “he was revered by both East and West and was the only Muslim hero to be given a Western version of his name by his admirers in Europe.” In victory, Saladin was not cruel but compassionate, not gloating but generous, not intolerant but humane. For centuries, these perceptions of Saladin’s character have also been imparted onto the people he was born amongst, a branch of the Indo-European linguistic tree speaking a language closely related to Persian. They were scattered across the Near East with significant populations in the countries that would become Iraq, Syria, and Turkey. Saladin’s people are the Kurds.

 

With perhaps 45 million Kurds spread in a diaspora across those nations, with small communities in Europe and North America, the Kurds are, alongside the Romani, arguably one of the biggest groups of technically stateless people in the world. Largely secular Muslims, the Kurds are diverse in their faiths, including Sunni, Shia, and Sufi Muslims, Zoroastrians, Christians of a variety of denominations, and the indigenous religion of Yazidism that has connections to the Gnosticism of late antiquity. Frequently the targets of oppression in their host countries, the Kurds have been periodically marked for official persecution as nations have banned their language and folkways, and committed ethnic cleansing and genocide. 

 

Like many indigenous peoples, the Kurds were the orphans of colonialism. By the provisions of the Treaty of Sèvres signed by the Allies at the conclusion of World War I, the Kurds were promised a homeland carved from the former confines of the Ottoman Empire. The provision was part of the same trend towards national self-determination embodied by the Balfour Declaration that ultimately led to the establishment of Israel. Three years later, that promise was broken by the Treaty of Lausanne. As in Africa and Asia, Western colonial powers imposed arbitrary borders, dividing ethnic and linguistic groups from each other, with ramifications that reverberate today. The result was that the Kurds’ traditional homeland was distributed through several other countries, effectively nullifying (for a time) aspirations of Kurdish sovereignty. Historian Michael Eppel explains in A People Without a State: The Kurds from the Rise of Islam to the Dawn of Nationalism that the “populations of Kurdistan – the Kurds and Armenians – are the descendants of ancient residents of the area who mingled with the waves of conquerors and immigrants who settled there and became part of the population.” Both Armenians and Kurds, not coincidentally, suffered from Turkish aggression and Western indifference. 

 

For the past century the Kurds have valiantly fought for the establishment of a state, free from Syrian, Iraqi, Iranian, or Turkish persecution. In those attempts they have often allied with Western powers, particularly the United States, but repeatedly they’ve been betrayed, as Europeans and Americans are content to let the Kurds wait in Limbo. With Donald Trump’s promise to Turkish President Recep Tayyip Erdoğan to withdraw American troops from northern Syria, tacitly condoning the subsequent Turkish invasion of what had been a Kurdish autonomous zone, the betrayal of our allies has reached a new nadir. Unlike past abandonments, this decision bears no strategic benefit to Washington (even as Vice President Mike Pence has ham-handedly tried to manage what seemed to be Trump’s impetuous writing of foreign policy on-the-fly). That’s without even mentioning the truly bizarre, infantile, and embarrassing letter Trump wrote to Erdoğan released by the White House. 

 

Much has been written about the Kurds in the last two weeks following the United States’ abandonment of one of our most loyal allies in the Middle East, but as Dante’s example proffers, the West has long been content to let the Kurds dwell in Limbo rather than Paradise; pleased to take their support when it’s useful and to turn its back when it becomes strategic to do so. This is why it’s so important to enumerate precisely the ways in which the United States has frequently turned its back on the Kurds despite their loyalty. We must also consider the implications of Trump’s abandonment of them now – an action that, as thoughtless as it might seem, actually betrays more nefarious intentions than simply bolstering the president’s Istanbul real-estate portfolio. 

 

In The Great Betrayal: How America Abandoned the Kurds and Lost the Middle East, international studies scholar David Phillips details the United States’ long, disreputable history of abandoning our ally. While Trump’s recent perfidy is perhaps the most galling of disloyalties, Phillips said that Washington has “betrayed the Kurds in Iraq by failing to support their goal of independence,” as various U.S. administrations saw more utility in placating Ankara, Damascus, and Baghdad than honoring commitments to our ally. In 1975 the United States cut military funding to the Kurds, allowing Iraqi dictator Saddam Hussein to attack; by the time of the Reagan administration, the U.S. looked the other way as Hussein, who was also embroiled in war against our adversary Iran, launched chemical gas attacks against Kurdish civilians. By the first Gulf War, President George H.W. Bush implored the Kurds to rise up against Hussein, which they bravely did, only to have the United States abandon its commitments and support, which led to more slaughter.  

 

More recently, the Kurds have been able to carve out a semi-autonomous Kurdistan in northern Syria, joining the Iraqi Kurdistan that was made possible by an Allied no-fly zone in previous decades. Born out of the chaos of the Syrian Civil War, the Kurdish regions of that nation have been instrumental in the fight against, and the victory over, the theocratic and brutal Islamic State. As ISIS spread like a noxious gas across the Levant and the Near East, it has been Kurdish fighters, Kurdish blood, and Kurdish lives that reclaimed land inch by inch, mile by mile. This was a brutal war against a self-declared “Caliphate,” an existential struggle against a fundamentally fascistic form of fundamentalism. 

 

Even in this we had a responsibility: as journalist James Verini describes in They Will Have to Die Now: Mosul and the Fall of the Caliphate, Iraqis, Syrians, and Kurds were “living and dying in a new and blacker war, a war with a foe at whose core was a death cult… a war that nevertheless would not be happening, at least not in this way, if not for the American war that preceded it.” In fighting against ISIS, Kurdish paramilitaries (particularly their fearsome divisions of women fighters) have been the decisive factor in victory. Now that Trump has greenlit Erdoğan’s invasion, Turkey has already committed war crimes against the Kurds, the region is destabilized, other reactionary forces such as Russian President Vladimir Putin have been empowered, and ISIS is resurgent. 

 

There have been denunciations across a shockingly wide swath of the political spectrum. Among the Republicans, who despite Trump’s continued assaults on the Constitution remain steadfastly loyal to the aspiring authoritarian, there was a brief respite of sanity as figures like Senate Majority Leader Mitch McConnell and South Carolina Senator Lindsey Graham criticized the move as fundamentally jeopardizing the national security of the United States. Writing in the Washington Post, McConnell called Trump’s decision a “strategic nightmare,” while evangelical power-broker Pat Robertson, recalling how the Kurds were instrumental in defending Christians from ISIS, said that the president’s actions had left him “absolutely appalled.” While it could be said that Republicans have never found a war that they didn’t like, Trump’s betrayal has similarly angered virtually all major Democratic officials, with former Vice President and current presidential candidate Joe Biden’s contention that “Trump sold [the Kurds] out” serving as a representative example of the sentiment. 

 

Lest this be interpreted as the foreign policy wings of both parties simply doing what the foreign policy wings of both parties do, it’s worth considering that some of the most vociferous condemnations of our Syrian pullout actually come from the far left. Writing for Jacobin, Djene Bajalan and Michael Brooks argue that an American withdrawal from northern Syria “would not be a blow to U.S. imperialism,” but would rather provide an opportunity for Turkey to “destroy [Kurdish] radical democratic dreams.” Few issues would seem to unite Pat Robertson and Jacobin, but it’s actually Bajalan and Brooks who explain the why of Trump’s betrayal. The Kurds were not simply allies in our fight against ISIS, for that was a pragmatic relationship born from military necessity; they are also among the exponents of and experimenters in one of the most radical and egalitarian political arrangements currently being enacted in the world, during an era when democratic values are seemingly in retreat. 

 

In the Syrian region of Rojava, the Kurds haven’t just been able to win victory against ISIS; they’ve also been able to construct a nascent state whose values are almost diametrically opposed to the Caliphate in every conceivable way. Based around the radical political thought of American philosopher Murray Bookchin and the theories of Kurdistan Workers’ Party founder Abdullah Öcalan, Rojava has become a bulwark of progressive organization. Marcel Cartier writes in Serkeftin: An Account of the Rojava Revolution (the title means “Victory” in Kurdish) that in northern Syria there is “a feeling, a spirit, the life and soul of a revolution.” Among the Kurds there has been the establishment of a radical state committed to complete gender equality, to ecological stewardship, to multiethnic democracy, all organized around “Democratic Confederalism.” With authoritarianism ascendant from Moscow to Washington, Beijing to Brasilia, the example of the Kurds has become a beacon for those who fear the eclipse of democracy. 

 

And that’s why Trump has abandoned the Kurds, not because of a hotel in Turkey, but because Rojava’s example has to be sacrificed in the modern-day “Great Game” of alternating cooperation and competition between neo-colonial leaders in Turkey, Syria, Russia, and the United States. Trump’s disdain for the Kurds isn’t in spite of their establishment of direct democracy in Rojava, it’s because of it. The Kurds have become a cause among the international left, as the Spanish Civil War was during the 1930s. Writing again in Jacobin, Rosa Burç and Fouad Oveisy argue that “Rojava, the site of a remarkable peoples’ revolution, is on the brink of colonization and extermination. The international left must stand against it.” That the Kurds are associated with socialism, anti-fascism, and radical democracy isn’t incidental to Trump’s abandonment of them – it’s the reason for it. In that larger sense, such an abomination isn’t merely disloyalty to an individual group of people, it’s disloyalty to the very idea and promise of democracy. It’s worth remembering that when Dante designed his inferno, he placed traitors in the ninth and last circle, as far from noble Saladin as they could be. 

Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173468
A Baseball Musical that Needs Some Relief Pitching

 

Last weekend, the Washington Nationals won Washington’s first World Series since 1924, clinching the title in Texas, and in New Jersey Last Days of Summer, a baseball musical set in the early 1940s, opened. Hooray for the Nationals and their sparkling pitching and prodigious hitting. And hooray for Last Days of Summer, even though the new musical has a lot of errors and needs some good relief pitching.

 

Last Days of Summer, written by Steve Kluger and based on his 1998 novel, is the story of players from the New York Giants’ 1940 team, which toiled in New York alongside the Yankees and Dodgers, and of a little kid who loved the game. The play opened last weekend at the George Street Playhouse in New Brunswick, N.J.

 

The kid, Joey Margolis, meets Charlie Banks, a star on the Giants team, just as the kid’s dad has left his family and raced off with another woman. Joey, desperate for parental love, makes Charlie his new “dad” and starts to spend a lot of time with him.

 

It is suddenly time for Joey’s Bar Mitzvah and he gets Banks to help him prepare for the event. They need time to do that, time that ballplaying Charlie does not have. So, he takes the kid with him on the team’s road trip so they can study.

 

That’s basically the plot of Last Days of Summer. The first act is as wobbly as a knuckleball. In the second act, in which there is far less baseball than in the first, the story is much better developed. All is thrown into chaos when World War II begins and Charlie and other players, such as his friend Stuke, join the military. The story and plot both improve as the fantasy of baseball drifts away and the soldier boys get into trouble as do the folks back home.

 

Last Days of Summer, ably directed by Jeff Calhoun, is a nice war story about life in New York in the 1940s, a dreamy world of baseball and the beach that is disrupted by the war. It is a warm, cozy baseball star and little kid tale (oh, there have been so many of them). It works as a story about history. It has fine acting by a gifted ensemble, handsome sets and sharp choreography. 

 

The play needs help, though, a bit of relief pitching in the ninth inning to hold it together.

 

First, the music, written by playwright Kluger and Jason Howland, all sounds the same. There are no songs that you hum going out of the theater and, while the tunes help tell the story, they leave you unimpressed. Not a day at the old ballpark at all.

 

The story itself is a bit forced at first. Name the number of ballplayers who helped a kid prepare for a Bar Mitzvah. Even though it gets much better in the second act, it still needs more sharpness and better character development. It is also very predictable. As a scene begins, you know how it is going to end. The ending is very schmaltzy. The presence of a thirty-something Joey, who narrates the story of himself as a ten-year-old kid, is very awkward because you keep getting him mixed up with his younger self.

 

If this play is about the Giants baseball team in 1940, there should be more baseball history. While the kid loved his Giants, they were a terrible team that year. The Giants went 72-80 and finished in a lowly sixth place in the National League. They did have some bright spots, such as pitcher Carl Hubbell and slugger Mel Ott. None of this is mentioned in the play, and it should have been. What is the history of New York without the history of baseball?

 

The play is staged at the brand new, sparkling New Brunswick Performing Arts Center, the new home of the theater group and other artistic ensembles. The Johnson Theater, where Last Days… is staged, is gorgeous. The complex has a second, smaller stage, the Laurents Theater, for other plays.

 

Director Calhoun gets fine work from Julian Emile Lerner as the energetic and adorable Joey (kid), Danny Binstock as Joey (father), Christine Pedi as the aunt, Parker Weathersbee as Craig Nakamura, Don Stephenson as the Rabbi, Will Burton as ballplayer and buddy Stuke and Bobby Conte Thornton as Charlie Banks.

 

Let’s hope the play improves over time. Hey, the Washington Nationals did.

 

PRODUCTION: The play is produced by the George Street Playhouse. Sets: Beowulf Boritt, Costumes: Loren Shaw, Lighting: Ken Billington, Sound: Brian Ronan, Choreography: Paul McGill. The play is directed by Jeff Calhoun. It runs through November 10.

China and the NBA: How to Lose Friends and Alienate People

 

The Chinese folk saying “lift a rock only to drop it on one’s own feet”, or its English equivalent, “to shoot oneself in the foot”, perfectly describes the self-defeating inclinations of dictatorships. And nothing shows such inclinations better than China’s effort to intimidate America’s National Basketball Association.

 

The dispute began when the Houston Rockets general manager Daryl Morey tweeted (and quickly deleted) support for the pro-democracy protesters in Hong Kong: ‘Fight for Freedom, Stand with Hong Kong.’ The response was quick. China’s government blacklisted the Rockets, ordered the state-run television networks to cancel broadcasts of two NBA preseason games, and instructed Chinese companies to suspend their sponsorship and licensing agreements with the NBA.

 

As the NBA’s largest international market, China expected the league to fall back into line, apologize for offending the Chinese Communist Party (CCP), and pledge never to repeat the mistake. And, initially, the NBA did just that. ‘We feel greatly disappointed at [Morey’s] inappropriate speech, which is regrettable’, the league said in a statement. ‘We take respecting Chinese history and culture as a serious matter.’

 

But the attempt to kowtow to China sparked outrage among US lawmakers, who accused the NBA of choosing money over human rights. “No one should implement a gag rule on Americans speaking out for freedom”, Senate Minority Leader Charles Schumer tweeted. The NBA threw Morey “under the bus” to protect its market access, Senator Marco Rubio added.

 

Under pressure, NBA commissioner Adam Silver then seemed to change the league’s position. In an interview with a Japanese news outlet, he said “Morey is supported in terms of his ability to exercise his freedom of expression.”

 

In the end, it was China that had to back down. The authorities allowed a previously scheduled NBA game to be played in Shanghai—to the cheers of thousands of Chinese fans—and ordered state media to play down the controversy. The lesson should be clear: bullying is a guaranteed way to lose friends and alienate people in the West.

 

China may be a lucrative market for the NBA, which has reaped millions of dollars in revenue through broadcasting and merchandise-licensing deals in the country. But the NBA is also a very valuable friend to China. China’s relationship with the league is one of the great successes in its cultural and commercial relations with the United States, and a powerful example of Sino-American ‘sports diplomacy’.

 

Such diplomacy has a legendary history in Sino-American relations. During the 1971 World Table Tennis Championships in Japan, the US player Glenn Cowan boarded a shuttle bus with the Chinese national team. Rather than avoid him, as the Chinese team had been instructed to do, its top player, Zhuang Zedong, started a conversation with the American (through an interpreter). The two players even exchanged gifts, an act of goodwill that garnered significant positive media attention.

 

Recognizing the diplomatic opportunity, Chairman Mao Tse Tung invited the US team for an all-expense-paid visit to China. The heavily documented trip—which included tours of important sites, exhibition ping pong matches, and even an audience with Chinese Premier Zhou Enlai—opened the way for the two governments to begin back-channel communications, and eventually to normalize bilateral relations.

 

Mao and U.S. President Richard Nixon did not squander the opportunity that sports diplomacy presented. But, by picking a fight with the NBA, Chinese President Xi Jinping’s government could well have. At a time when Sino-American relations are deteriorating, that’s the last thing China needs.

 

To some extent, China’s miscalculated response probably stemmed from hubris. The government had effortlessly bullied some of the world’s largest and best-known corporations into submission after they offended its delicate political sensitivities. Apple and Marriott International had listed Hong Kong and Taiwan, which Beijing claims as Chinese territory, as separate countries. Cathay Pacific Airways, Hong Kong’s flagship airline, had failed to prohibit its employees from participating in pro-democracy protests; the Cathay employees who did join the protesters were fired, probably under pressure from Beijing.

 

China has used similar tactics to pressure Western governments into complying with its will. For example, it terminated high-level exchanges and ended business dealings with France, Germany, and England when they hosted the Dalai Lama.

 

Similarly, after the Nobel Peace Prize was awarded to the Chinese dissident Liu Xiaobo in 2010, China ended salmon imports from Norway (even though the Norwegian government has no influence over the Nobel committee’s decisions). China ended up getting its way in most of these showdowns, with Western actors showing remorse and trying to regain China’s favor.

 

But hubris is only part of the story. Chinese officials have strong incentives to demonstrate their loyalty to the regime, even at the cost of strategic objectives. The resulting modus operandi—called “ning zuo wu you”, which loosely translates to ‘rather left than right’—influences most official calculations. The decision to intimidate the NBA was most likely made by a party apparatchik eager to win favor with CCP superiors.

 

With intimidation hard-wired into the Chinese system, such self-defeating actions are likely to continue, and to cost the CCP dearly. The more friends China turns into enemies, whether out of hubris or ingrained instinct, the easier it will be for the United States to muster a broad coalition to contain China’s power and ambitions. At that point, the Chinese bully’s favorite strategy for defending its interests will be even less effective.

Refugees in an Age of Immigration Restrictionism

Sisters Olga (left) and Lilly (right) Cushneroff

 

Each new stage in the Trump administration’s handling of refugees and immigrants, whether family separation, a Muslim ban, indiscriminate deportations, a requirement to first seek asylum elsewhere, or other “tough” policies enacted over the past three years, invites comparisons to past policies. Usually that means talking about the Obama years, or maybe the 1986 immigration reforms.  But it’s worth looking back further, to the restrictionist era of the 1920s and 30s.  

 

In 1931, infant Lilly Cushneroff, her toddler sister Olga, their parents, and several others illegally crossed the Mexico-U.S. border near San Ysidro. She and her family then lived as undocumented immigrants in southern California, along with thousands of others who had followed similar journeys across the globe from the defunct Russian empire to the United States. Immigrants usually resided unbothered in one or more other nations for months or years and then came to the U.S. illegally. Most were underskilled workers and many benefitted from unemployment programs.  Yet, at the height of American immigration restrictionism, they were embraced and protected by the U.S. government.  

 

The era of restrictionism (so-called because it restricted the flow of immigration) has long been treated as an anomaly in U.S. history: a relatively short period of intense hostility toward immigrants. While that description is inadequate, it’s notable that restrictionists of the 1920s and 30s also made exceptions for certain groups of immigrants that were recognized as refugees. The quota system established by the 1924 National Origins Act preferred immigrants from Northern Europe (see Trump’s request for more people from Norway), and banned Asian immigration. But beyond that, Congress sought to protect undocumented migrants who sought asylum.  

 

The Act of June 8, 1934 created, for the first time, a distinction between refugees and other immigrants to the U.S. The quotas had not stopped immigration, but they had made immigration from much of the world illegal. They also criminalized international migration just as the world faced an unprecedented refugee crisis in the wake of World War I and related upheavals.

 

The impact of restrictionist policies was tempered by the fact that migrants continued to enter the U.S. anyway, crossing porous northern and southern borders or landing in a port and simply staying without permission. Even in rare instances when the State Department sought deportation, political turmoil abroad usually prevented it. That changed in 1933, when FDR announced he would recognize the Soviet government. Diplomatic relations meant that Soviet citizens with no legal status in the U.S. could possibly be deported. Congress quickly moved to protect them.

 

The Act of June 8 was supported across the spectrum. Conservatives liked the idea of protecting anti-Communist “White Russians,” and liberals seized the opportunity to extend protections to both religious and political refugees. The law’s purpose was narrow, but written broadly enough to allow some other undocumented immigrants the chance to apply for legal status as refugees.  

 

Thousands of Jews also had fled the USSR or neighboring states formerly within the Russian empire. Openly anti-Semitic members of Congress prevented most Jewish asylum seekers from taking advantage, but those who had entered the US before June 1933 usually benefitted from the new law.  Armenians fleeing Turkey comprised the other major group of unintended beneficiaries.  Many immigration officials objected to their inclusion, often on clearly racist grounds, but ultimately they were accepted under the terms of the law.

 

Almost all of those protected by the 1934 act had broken U.S. laws.  Crucially, immigration officials in Washington determined that entering the U.S. illegally, falsifying official documents, or lying to immigration officers was not only inconsequential, it was to be expected of refugees fleeing for their lives. Passage through Mexico, Canada, Cuba, China, or any other nation was treated similarly – the refugees sought asylum in the U.S.A., and rightly so, and the U.S. government would not force them to seek asylum elsewhere. Though some families, including Lilly and Olga’s, were initially told by agents in the field that they would be separated because of differing individual legal situations (Lilly was born in Mexico, and therefore not a Russian refugee), the INS determined they would be dealt with as a unit, and offered asylum together.   

 

When we compare the current wave of hostility toward immigrants to the restrictionist era, we must also note the efforts then made to make the system as humane as possible. The nuances of that earlier period remind us that we don’t have to choose so starkly between no immigration and no borders, and even if support for refugees currently cannot transcend the political divide, there still may be ways to reach agreement when it comes to helping asylum seekers.

Whistleblower or Spy? What the History of Cold War Espionage Can Teach US

 

President Donald Trump has called the source for the author of the whistleblower complaint “close to a spy,” and alluded to the harsh treatment supposedly meted out to spies in an earlier era. Many observers were appropriately wary of the president’s definition, as whistleblowers have a unique history in our system – and legal protections – that go back to the nation’s founding.   

 

In fact, Trump is not totally wrong in embracing a somewhat more expansive definition of espionage than our common media stereotypes might suggest.   The modern history of espionage, to the extent that it can be known, provides something of a guide in assessing Trump’s accusation, as well as his proposed punishment.

 

Since the dawn of the Cold War and the rapid expansion of the CIA’s covert operations program (alongside the work of other agencies charged with surveillance and espionage work, including especially the National Security Agency), spies have taken on a vast range of assignments and assumed many different roles. Codebreakers, high-altitude pilots, tunnel-diggers, sound engineers, counter-surveillance and counterespionage specialists – all of these have played crucial roles in America’s modern spy wars.

 

Of course, covert operatives, of the sort we typically associate with espionage, have been the unsung heroes of America’s efforts to glean vital information about its real or potential adversaries. These women and men have little in common with James Bond or Jason Bourne; instead, they are masters of the psychology of persuasion, of assumed identities, and above all of transferring information from source to headquarters.  In their essential book Spycraft, Robert Wallace and H. Keith Melton write that “for a spy the greatest danger usually is not stealing a secret, but passing it to his handler … Without the means to transfer information securely between agent and handler, espionage could not exist.”

 

Indeed, when we think about the role of the spy, the transit of information – from an often unstable, even dangerous source (whether a U-2 spy plane, the Berlin Tunnel, or a mole operating deep inside the Kremlin) to government officials who can interpret and deploy this information for strategic purposes – seems to be the defining feature.

 

Trump’s claim that the whistleblower committed espionage puts us in uncharted territory. The interesting twist here is that it’s not the purported spy who is serving as a conduit of information in conjunction with a foreign regime – that conduit, of course, is the president himself. Instead, the whistleblower/spy conducted her or his operation entirely within the scope of the executive branch of the U.S. government and followed established protocols, which in this instance involved sending a letter to the Inspector General of the Intelligence Community (ICIG). Whatever we might think of the whistleblower’s motives, he or she has sought identification as a whistleblower – precisely the kind of exposure a spy would never seek.

 

Trump’s suggested punishment for the whistleblower/”spy”—execution—raised justifiable concern from many political commentators. In the contemporary era, the U.S. has executed only two spies, Ethel and Julius Rosenberg, who were American citizens accused of transferring atomic secrets to the Soviet Union.  That sentence outraged millions of Americans and continues to stir controversy.  The Soviets, in contrast, executed many of their own citizens accused of collaboration with foreign espionage services, typically deep inside Lefortovo Prison, the KGB dungeon in Moscow.   Lest we consider that as a model, we should recall that the Soviet Union’s treatment of prisoners and dissidents received so much international condemnation, it led to the creation of Helsinki Watch in 1978, a key moment in the modern human rights movement.

 

As the Cold War progressed, both sides adopted a very different practice for dealing with captured spies on enemy soil. An unmasked covert operative was termed persona non grata and promptly ejected (“PNG’d”) from the host country – generally ruining a promising undercover career and depriving a spy service of precious human talent. Many spies on both sides of the Iron Curtain met this fate, but were fortunately able to return home to productive senior roles inside their respective spy services. Given this history, therefore, is it possible to imagine that our whistleblower/spy might be sent home to his or her original agency, and in fact get a promotion? Clearly, the spying analogy has led us to something of a dead end.

 

The world of spies is one of deceptions and defections; the whistleblower, in contrast, stepped into a sharply defined role in sending the complaint letter to the Inspector General. It’s not difficult to understand why the president and his defenders wish to paint the whistleblower in an unflattering light, but they will find no support for their cause in the history of espionage.

Churchill and Stalin: Comrades-in-Arms during World War Two

 

The alliance of Britain, the United States and the Soviet Union during World War II is often presented as a fragile necessity. The alliance was forced into existence by Hitler and fell apart as soon as Nazi Germany was defeated. But neither the formation of what Churchill later called the Grand Alliance nor its collapse was inevitable. The Grand Alliance was willed into existence by its leaders and then sustained through four years of total war. It was one of the most successful alliances in history. 

 

When the Grand Alliance emerged following the US and Soviet entry into the war in 1941, it was not clear that such an unlikely coalition could survive the vicissitudes of war. The three countries had very different socio-political systems and there was a long history of ideological conflict between Soviet communism and western liberal democracy. Within western states there were anti-communists hostile to alliance with the old ideological enemy, while on the Soviet side there were deep suspicions of western capitalist leaders. The Grand Alliance also had to deal with Hitler’s efforts to sow seeds of doubt by spreading rumours that each of the allies was negotiating a separate peace with the Germans.

 

There were significant internal tensions during the coalition’s early years, mainly because most of the fighting was being done by the Red Army, while the British and Americans fought on the margins of the conflict. But increasing amounts of western material aid did reach the USSR beginning in 1943, and in June 1944 the western allies invaded northern France – an operation Moscow had been demanding since July 1941. 

 

The Grand Alliance overcame these difficulties because of the leadership of the so-called Big Three: Winston Churchill, Franklin Delano Roosevelt and Joseph Stalin – leaders prepared to put aside ideological differences in the interests of a greater cause.  The social background, personalities, politics, leadership styles and working methods of the Big Three were diverse. But they had one important trait in common: they were men of long political experience who placed a high premium on personal relations with each other. 

 

Personal contact between the three leaders – at meetings, through correspondence and via intermediaries – convinced them they could work together and trust each other. At times that trust and friendship was strained but difficulties were overcome and differences resolved through compromises that respected honour and protected vital interests. The Grand Alliance as it developed during the war is unimaginable without this personal bond between Churchill, Roosevelt and Stalin.

 

Power in the Grand Alliance lay with Roosevelt and Stalin. As Churchill famously said, it was the Red Army that tore the guts out of Hitler’s war machine, while it was American industrial might and manpower that tipped the balance of forces decisively in the allies’ favour. But the beating heart of the Grand Alliance was Churchill’s relationship with Stalin. As US ambassador Averell Harriman recalled, while Stalin admired and respected Roosevelt and praised him as a “great man for war and a great man for peace”, Churchill he toasted as “my comrade-in-arms.”

 

Stalin’s relations with Churchill were fragile but intimate and intense. Churchill was a mercurial personality, and his dealings with Stalin were volatile. He had a history of militant anti-Bolshevism and was unapologetic about it. Yet Stalin, a dedicated communist, wanted Churchill to win the 1945 British general election and was shocked when Churchill lost and did not return to the Potsdam summit after going home for the counting of the votes.

 

During the war the two men conducted a 500-message correspondence (two-thirds of the messages were Churchill’s) and Churchill travelled twice to Moscow – in August 1942 and October 1944 - for crucial bilateral meetings with Stalin. Famously, during their October bilateral the two men divided central and southern Europe into percentage-based British and Soviet spheres of influence. 

 

At Yalta in 1945 the Big Three proclaimed their commitment to a peacetime Grand Alliance that would prevent war and provide peace, security and prosperity for all states – a goal reaffirmed by the Potsdam summit and at the founding conference of the United Nations in San Francisco. After the war the collaboration continued. Major Nazi war criminals were tried at Nuremberg and convicted of crimes against humanity and of conspiracy to wage aggressive war. A peace conference was convened in Paris in summer 1946, and in 1947 peace treaties were signed with the Nazis’ wartime allies – Bulgaria, Finland, Hungary, Italy and Romania. 

 

Even when Stalin clashed publicly with Churchill following the now ex-Premier’s “Iron Curtain” speech in Fulton, Missouri, in March 1946, the two men never lost their affection for each other. When Field Marshal Montgomery visited Moscow in 1947 Stalin took the opportunity to give Monty a message for Churchill saying that he had the happiest memories of working with Britain’s great war leader. Churchill responded: “I always look back on our comradeship together, when so much was at stake, and you can always count on me where the safety of Russia and the fame of its armies are concerned . . . Your life is not only precious to your country, which you saved, but to the friendship between Soviet Russia and the English-speaking world.”

 

It is commonly assumed the cold war was inevitable, that once Hitler was defeated the conflicting interests and ideologies of the Soviet Union and the western powers inexorably drove the two sides apart. Despite his reputation as an early cold warrior, that was not Churchill’s view: the main message of his iron curtain speech was the need for a good understanding with Russia. When he returned to power in Britain in 1951 it was as a peacemaker and an advocate of détente with the USSR. Jaw-jaw is always better than war-war, he said.

 

Nor was the cold war Stalin’s choice. Throughout the war the Soviet dictator stressed the long-term common interests – economic, political and military - of the partners in the Grand Alliance. An avid reader of historical works, Stalin told Churchill and Roosevelt at Yalta that “in the history of diplomacy I know of no such close alliance of three Great Powers as this.”

 

In spring 1947 the inter-allied Council of Foreign Ministers met in Moscow to negotiate Germany’s future. By the end of the year, however, negotiations about a German peace treaty had collapsed and the Grand Alliance was in the later stages of its disintegration. The failure of the Grand Alliance led to the cold war and to decades of division, conflict, and rivalry between the Soviet Union and its erstwhile western allies.

 

In the end the story of the Grand Alliance and its denouement in the cold war is quite simple. During the war its leaders chose to ally against a common enemy and then to carry the coalition forward into peacetime political collaboration. After the war different choices were made – including some by Stalin and Churchill – to pursue the separate, as opposed to the common, interests of the Grand Alliance. The result was the cold war. 

 

The first set of choices saved the world from Hitler and the Nazis. The second set of choices plunged the world into decades of a potentially catastrophic conflict, whose vast nuclear arsenals continue to pose an existential threat to humanity.

Lebanon’s Chance for Change

Protests in Lebanon, October 2019

 

In a major city in Lebanon, thousands of people take to the streets to protest the regime, driven by a mix of economic grievances, concern about corrupt sectarian elites, and opposition to international intervention. The security forces are out on the street, ostensibly neutral, supporting neither the people nor the government. Violence breaks out, straining the long-standing confessional balance that holds the country together. Ultimately, the army steps in to restore order, with a greater or lesser degree of success. 

 

Such events are frequent in Lebanese history, particularly prior to internal crises. In 1958, protesters and armed groups took to the streets to oppose Lebanese President Camille Chamoun’s efforts to force Lebanon into the Western camp in the Cold War, as well as to engineer a second presidential term in violation of the country’s constitution. When a coup in Iraq and a simultaneous crisis in Jordan made the Eisenhower administration nervous, it sent 14,000 Marines, an intervention Chamoun had desperately wanted. The conflict was only resolved when a respected army commander, Fouad Chehab, took over as president, offering a way out of the confrontation.

 

In the early 1970s, there were frequent protests against the government due to complaints about corruption and uneven development. Many demonstrators also advocated freedom of action for Palestinian militant groups based in the country, who aimed to wage war on Israel. Palestinian strikes into Israel and terror attacks around the world prompted Israeli bombings and raids on Lebanon, producing a cycle of violence that alienated much of the country’s Christian population, which along with other groups began military training and acquiring arms. The army’s inability to keep the peace between these groups led to the outbreak of a 15-year civil war, from which Lebanon is still recovering.

 

It is tempting to see the same potential for instability in the latest rounds of protests in Lebanon, which began last month. But this time, there may be something truly different. Two weeks ago, prompted by proposals for new taxes on tobacco and VoIP services like WhatsApp, tens of thousands of Lebanese of all sects took to the streets in virtually every major city. While the protesters of the 1950s and 1970s were unable to separate themselves from sectarian interests and regional politics, the majority of demonstrators today are set on rejecting the country’s sectarian leadership as a whole. 

 

Lebanon faces a huge variety of challenges, from unbalanced budgets, widespread unemployment, a lack of foreign exchange reserves, and insufficient fuel resources, to wildfires burning in parts of the country’s wooded areas. Beyond this, there are deeper problems of atrocious transportation infrastructure (crossing Beirut at rush hour can take hours), insufficient electricity production (rolling blackouts are the norm), and inadequate garbage disposal that has led to literal mountains of trash piling up throughout the country. At the heart of all these issues lies endemic corruption. (And that’s not even mentioning the presence of Hezbollah, an Iranian-backed proxy with a massive military force.) There is no single solution, or even a set of solutions, that could address the country’s challenges quickly. 

 

Just as these issues reflect long neglect and misgovernance, the protests are the culmination of many years of activism. Lebanon’s liberal laws on media and freedom of expression make it an ideal place for grass-roots organizations to develop. But while civil society organizations have proliferated since the end of the civil war, they often have difficulty bringing about meaningful change. In 2015-16, in response to an escalating garbage crisis, a series of protests under the banner “You Stink” (Tala3 Ri7tak) took place, but failed to appeal to a popular base or unite under a clear series of demands. The current protests have avoided the first pitfall, but there is a risk of the second. 

 

The single overwhelming call that unites protesters is the slogan of the so-called Arab Spring of 2011: The people want the downfall of the regime. But what this slogan means varies. Most demonstrators would like to see the usual politicians, most of whom are militia leaders and their family members or those who benefited from foreign patronage, gone from the scene or in prison. Beyond this, there is little agreement. A committee of organizations called for the current government to step down, and for a new government of “persons not derived from the leading class” to take power. Protesters have also demanded legal and policy measures, from the prosecution of known corrupt politicians and officials to the lifting of bank secrecy laws and the allowing of civil marriage (religious communal laws prevent the unions of many star-crossed Lebanese lovers of different sectarian backgrounds).

 

The protesters won their first concrete victory on Tuesday, when Prime Minister Saad Hariri resigned. Lebanese President Michel Aoun has asked Hariri to stay on as the head of a temporary government until a new one can be formed. Whether there is a caretaker government or Hariri continues on, at least some reforms are likely to be put on the table to try to get the protesters off the streets. This can’t come too soon for Lebanon’s economy. The country’s central bank (whose leader, Riad Salame, is one of the protesters’ targets) appears to be under pressure to devalue its currency, a move that could be disastrous for the country’s finances.

 

Still, it’s not clear that a government of new faces with a few changed policies would get the protesters off the streets. Lebanon’s semi-democratic political system is based on sectarian rather than popular representation, with the parliamentary seats split on an even basis between Christian and Muslim Deputies, and a long tradition in which the top government positions are distributed on the basis of sect. Moreover, the process of selecting representatives discourages competition. One study suggested that up to 70% of parliamentary seats are essentially predetermined. Unless these rules are replaced, the old faces are unlikely to go away. But there are many proposals on how to achieve this, and it will take time to select one. 

 

Even after Hariri’s resignation, resistance to change is firm. Soon after protests broke out, Hezbollah Secretary General Hassan Nasrallah announced his party’s opposition to any change in government, which includes representatives from his party and takes a permissive attitude towards the party’s militia wing. A few days later, he issued a clear warning (and veiled threat) that the protests risk igniting a civil war. Supporters of Hezbollah and its partner Amal, whose leader Nabih Berri has been particularly criticized for corruption, have even attacked protesters in isolated incidents. There have also been clashes in deeply divided Christian areas between supporters of Lebanese President Michel Aoun, allied with Hezbollah, and the Lebanese Forces party leader Samir Geagea. Ironically, the country’s oft-feuding leaders may still end up finding common cause in dividing the protesters before they achieve real reform.

 

Despite these obstacles, the protesters remain largely unified and strong as they enter their second week. As now former Prime Minister Hariri admitted during a press conference, the protests have “broken all barriers,” including that of “blind sectarian loyalty.” If he is right, then there is a true chance that Lebanon can break out of the sectarian trap it finds itself in. But fear and violence still have the potential to knock the protests off course. Neither the protesters nor the protested know where this revolution is headed, but the situation is unlikely to return to where it was before. 

Citibank: Exploiting the Past, Condemning the Future

 

In 2011, Citigroup published a 300-page 200th anniversary commemoration Celebrating the Past, Defining the Future. I recently saw a copy on the customer desk at my local branch in Brooklyn and could not resist reading and reviewing the book. I passed up purchasing a copy on Amazon for $849, but my university library was able to secure a copy through an inter-library loan. 

 

I am interested in Citigroup and its history because of my own research on New York City’s relationship with slavery, the slave trade, and the sale of slave-produced commodities (New York And Slavery: Time To Teach The Truth, SUNY Press 2008, and New York’s Grand Emancipation Jubilee, SUNY Press 2018) and because of the role international banking played in the 2008 financial crisis. The federal government rescued Citigroup from potential collapse by covering over $300 billion in risky assets and transferring $20 billion to the bank.

 

According to CEO Vikram Pandit in the book’s foreword, “In the summer of 1812, a dozen New York merchants came together to form the bank that would be known as Citi. They pooled their capital, shared ideas, and financed new ventures. Their principles guided the bank’s principles; their success fueled the bank’s success. From the beginning, they backed bold projects that improved and connected the world” (9). The book’s introduction elaborates on this theme, claiming that throughout its history Citibank was motivated by concern for innovation, people, values, and clients (13). Pandit was definitely half right. The principles of the founders did direct the bank’s actions, but despite the book’s heft, glossy photographs, and celebratory tone, Citibank and its forebears definitely did not strive to make the world a better place.

 

Chapter 1 describes “a local bank with national ambitions” between 1812 and 1890. The City Bank of New York was chartered by the State Assembly in 1812 and was made a federal depository in 1814. The chapter explains that in its early years the bank “mainly served the commercial interests of the merchants who owned it.” In 1837 it nearly collapsed along with the national economic system in a financial panic precipitated after President Jackson and the Democratic Party refused to recharter the national Bank of the United States. The City Bank of New York survived the panic because John Jacob Astor, the wealthiest man in the country at the time, deposited substantial funds in the bank and installed Moses Taylor, “an importer of Cuban sugar,” on the bank’s Board of Directors. 

 

Extended sections describe Taylor’s role in transforming the bank into a national financial institution with international connections. Taylor is lauded as a “commodity specialist” who focused on Latin America and was able to “broaden City Bank’s client base” (36-47). The “commodity” that Taylor specialized in was sugar produced in Cuba by enslaved Africans smuggled into the Spanish colony in violation of international bans on the trans-Atlantic slave trade. Not only did Taylor broker Cuba’s sugar exports but he and the bank served as an “investment advisor for Cubans and sent samples of bylaws, reports, rules, and regulations to help them set up companies or commercial associations” (37). Taylor and City Bank became the “middleman” for plantation owners, arranging for them to purchase equipment for “railroads, ferries, lighthouses, steam engines and machinery for sugar mills” (37).  As a full-service banker, Taylor arranged for the education of the children of the Cuban planters and slaveholders when they were in the United States and shopping trips for their wives. In response and gratitude, the Cuban planters laundered profits from slave-produced commodities by investing approximately $3 million in the United States, worth almost $300 million today, employing Taylor as an agent.

 

Taylor later served as the bank’s President from 1855 until his death in 1882. During this period Taylor used sugar profits to help the bank survive a series of economic downturns and to purchase mines, railroads, and real estate. All of the companies that Taylor invested in were then required to use City Bank for financial transactions. 

 

In the crucial months leading up to the American Civil War, Taylor promoted a compromise between the North and South that would leave slavery in place. During the war, Taylor was able to negotiate a federal charter for the bank, which became National City Bank of New York, in exchange for loans to finance the war effort. After the war, Taylor became a member of Tammany Hall, served on a commission that whitewashed the corrupt Tweed Ring that controlled New York City politics, and invested with Tweed in a series of business ventures. When Moses Taylor died in 1882, he was heavily invested in the company that would become Con Edison and was one of the wealthiest men of the 19th century. His estate was reportedly worth $70 million, or about $1.6 billion in today’s dollars.

 

Other Citibank “heroes” had similarly tainted careers. The commemoration celebrates Charles and James Stillman, father and son cotton-brokers. Stillman senior is identified as a “long-time National City Bank client.” He made his initial fortune profiteering during the Mexican-American War and expanded it by smuggling contraband cotton out of Brownsville, Texas, and across the Rio Grande during the Civil War. His business partner and later his son James were Citibank directors, and James eventually became bank president. James started out as a cotton broker after the Civil War, shifted into railroad investment, and then partnered with Rockefeller and Standard Oil as a Robber Baron promoting monopoly power in the United States. Maintaining his father’s interests in Mexico, during the 1870s James brokered a civil war that overthrew the Mexican government. Stillman also recruited former U.S. Treasury Department officials to work for Citibank, ensuring the bank’s increased wealth and political influence. Eventually Citibank dominated U.S. trade with Latin America and banking systems there, including Haiti, where Citibank lobbied for a 1915 invasion by U.S. Marines that led to a twenty-year occupation (see Peter Hudson’s Bankers and Empire).

 

Next to step up in leadership was Frank Vanderlip, who helped finance the U.S. war with Spain and the annexation of the Philippines, and then pushed the bank to invest in the archipelago. Charles Mitchell, Citibank chairman at the start of the Great Depression, was interrogated by Congress for suspicious banking practices, forced to resign, and then indicted for income tax evasion, although the bank itself was not charged.

 

Over the years, the First National City Bank changed its name a number of times, but not its seedy business practices. In the 1970s it helped force austerity on a financially troubled New York City after the bank precipitated the crisis by divesting from city securities and refusing to back new financial instruments. In response, municipal unions withdrew millions of dollars in deposits from the bank. Citibank was also forced to pay fines levied by the state for charging consumers usurious interest rates. 

 

Citibank was a vocal opponent of corporate boycotts of apartheid South Africa and one of the largest lenders to both the South African government and private corporations there. In the 1990s, Citigroup was implicated in money-laundering schemes with Mexican drug cartels and was investigated by Congress. In 2002, Citigroup was involved in one of the biggest corporate scandals in United States history when it was accused of helping Enron disguise debt and agreed to pay $101 million to settle charges relating to the Enron fraud case. During the 2008 financial crisis, a New York Times investigation uncovered evidence that Citigroup recklessly purchased toxic subprime mortgages. In 2012 Citigroup was one of five mortgage companies that agreed to pay a combined $25 billion to resolve allegations of loan and foreclosure abuses.

 

More recently, Citibank’s Mexican affiliate, Banamex USA, was discovered laundering drug money and illegally shipping it back to Mexico. As part of a legal settlement, Banamex USA “admitted to criminal violations by willfully failing to maintain an effective anti-money-laundering” compliance program and Citigroup agreed to pay $97.4 million in fines. In 2015, Citigroup was forced to pay another multi-million dollar fine because of other “oversight” lapses at Banamex USA.

 

In May 2016, government prosecutors in New York’s Eastern District announced that Citigroup was being investigated over ties to alleged bribery and corruption at FIFA, the international soccer federation. Traffic Group, a conspirator in the bribery charges, made wire payments totaling $11 million from a Citibank account in Miami. In February 2017, Citigroup paid an “administrative penalty” of 69.5 million rand ($13 million) in South Africa for colluding to fix prices and manipulate financial markets. One almost-good thing: after years of financing coal production, Citigroup claimed in 2015 that it would shift its focus toward environmental sustainability. Of course, in 2017 it was still the number one banker for coal power in the United States.

 

Moses Taylor and the bank’s other pioneers clearly were role models for current Citigroup executives, but not quite in the way Celebrating the Past, Defining the Future portrayed them.

As The Trump Administration Demonstrates, We Have Undervalued Virtues And Values

 

In a 2018 New York Times (NYT) article, an anonymous Trump administration official wrote, “The root of the problem is the president’s amorality. Anyone who works with him knows he is not moored to any discernible first principles that guide his decision making.” (A few months earlier, I made a similar point in an HNN essay entitled “Unlike Obama, Trump Has No Moral Compass.”)

 

In a more recent NYT piece, “Our Republic Is under Attack From the President,” a retired U. S. admiral wrote that President Trump shows no interest in the virtues and values that “have sustained this nation for the past 243 years. . . . And if this president doesn’t understand their importance, if this president doesn’t demonstrate the leadership that America needs, both domestically and abroad, then it is time for a new person in the Oval Office—Republican, Democrat or independent—the sooner, the better.”

 

The problem, however, is not just Trump. It is also that, unlike our Founding Fathers and some later thinkers and leaders, we have ignored the need for proper virtues and values (see below) to sustain a healthy society. 

 

In her The De-Moralization of Society (1994), conservative historian Gertrude Himmelfarb made a major distinction between the two “v” words. She argued that “it was not until the present century that morality became so thoroughly relativized and subjectified that virtues ceased to be ‘virtues’ and became ‘values.’” But the difference  need not delay us here. Both “v” words deal with ethical first principles. 

 

Typical of historians’ analysis of early American concern with virtue are the views of Clinton Rossiter and James Kloppenberg. The former wrote that “on no point in the whole range of political theory were Americans more thoroughly in accord” than that free government necessitated “a virtuous people.” He quotes various Founding Fathers like Samuel Adams, who stated, “We may look up to Armies for our Defence, but Virtue is our best Security. It is not possible that any State should long remain free, where Virtue is not supremely honored.” Kloppenberg also emphasized the prominence “virtue” played in the minds of Founding Fathers. In a 1987 essay, “The Virtues of Liberalism,” he quotes such “fathers” as James Madison, who in a 1788 speech stated: “I go on this great republican principle, that the people will have virtue and intelligence to select men of virtue and wisdom. Is there no virtue among us? If there be not, we are in a wretched situation.” 

 

In the early nineteenth century, under Andrew Jackson’s presidency (1829-1837) and some other presidents, “the meaning of virtue lost its earlier religious, civic, and ethical significance and became a label for bourgeois propriety or feminine purity.” At least according to Kloppenberg. In his Uncertain Victory: Social Democracy and Progressivism in European and American Thought, 1870-1920 (1988), however, he argues that the progressives of the Progressive Era reemphasized the importance of virtue: “Without the virtue that democratic citizenship requires, these progressives argued, democratic government becomes simply an institutionalized scramble for private advantage.”

 

Although concern with civic virtues and values was not evidenced as much in the period between that of the Founding Fathers and the Progressive Era as it was in those two time spans, it was not an ignored subject. Ralph Waldo Emerson (1803-1882), who, the famous literary critic Harold Bloom believed in 2004, “remains the central figure in American culture,” often wrote of virtue. According to one analysis, in 1200-odd pages of his most famous essays, “Emerson uses the word virtue (or virtuous) 146 times.”

 

The most outstanding president of Emerson’s time, Abraham Lincoln (b. 1809), shared his concern for moral behavior, as did many abolitionists. Doris Kearns Goodwin’s Leadership: In Turbulent Times focuses on Lincoln, Theodore Roosevelt (a president of the Progressive Era), and two later progressives, Franklin Roosevelt (FDR) and Lyndon Johnson (LBJ). She writes that the four presidents were “at their formidable best, when guided by a sense of moral purpose, they were able to channel their ambitions and summon their talents to enlarge the opportunities and lives of others.” 

 

But following LBJ’s actions in behalf of civil rights and the impoverished—and the death of Martin Luther King, Jr. in 1968—we have lived in a half century in which virtues and values have become an increasingly contentious issue.

 

Liberals or progressives began to cringe when they heard the words values or virtues because conservatives often spouted them. Like Ronald Reagan when he pledged to appoint only “family values” judges. Or Supreme Court Justice Antonin Scalia when he wrote that “the purpose of constitutional guarantees . . . is precisely to prevent the law from reflecting certain changes in original values.” Or George W. Bush, who at the 2000 Republican convention spoke about “conservative values and conservative ideas.” (All three examples cited in Lepore’s These Truths.)   

 

In her memoirs, Margaret Thatcher, the British Conservative prime minister of the 1980s, praised conservatives who wrote of virtues, such as the American Michael Novak. She was impressed by his emphasis on democratic capitalism as a system that “encouraged a range of virtues.” William Bennett, Secretary of Education under President Reagan, edited The Book of Virtues: A Treasury of Great Moral Stories (1993), illustrating such virtues as self-discipline, compassion, responsibility, courage, perseverance, honesty, loyalty, and faith. 

 

One leftist, Todd Gitlin, former president of the radical Students for a Democratic Society (SDS) in the 1960s, bemoaned the Left’s decreasing concern with values. In 1995 he wrote that “the Left . . . once stood for universal values,” but had retreated from such a stance in favor of an increasing emphasis on identity politics.

 

Historian Jill Lepore has quoted Gitlin and (like him) has written that “the center would not hold.” And she added that “the heated rhetoric of the gun rights and antiabortion movements fanned rage among extremists. And a new kind of [identity] politics came to characterize those on both the Left and the Right. . . . By the 1980s, influenced by the psychology and popular culture of trauma, the Left had abandoned solidarity across difference in favor of the meditation on and expression of suffering, a politics of feeling and resentment, of self and sensitivity.” She also believes that in recent decades the Internet and social media have worsened any commitment to common values, exacerbating “the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

 

But one major Democratic politician has emphasized values—Barack Obama. Before being elected president, he devoted a 27-page chapter to the topic in his The Audacity of Hope. In it he wrote, “I think that Democrats are wrong to run away from a debate about values,” and that the question of values should be at “the heart of our politics, the cornerstone of any meaningful debate about budgets and projects, regulations and policies.” Some of the values he advocated were empathy, honesty, fairness, self-reliance, humility, kindness, courtesy, and compassion, as well as wisdom, which implies the ability to prioritize such values in order to best work for the common good. 

 

In their defense of reason, science, open-mindedness, and tolerance, however, leftists have been suspicious of religious dogmatism, and with it of self-identified proponents of “Christian values.” But Obama is right. Democrats have gone too far. Values should be “the heart” of their politics. What is so bad about Trump is that he has no values other than inflating and aggrandizing his own ego. As NYT columnist Nicholas Kristof recently wrote: “It would be difficult to imagine a president more at odds with Jesus’ message than Trump, a serial philanderer and liar who has persecuted refugees, divided families, exploited the poor and allegedly committed sexual assaults.” 

 

At the heart of opposition to Trumpism should be a reassertion of proper values. The type of values Obama has mentioned in various writings and speeches including empathy, compassion, rationality, humility, tolerance, and open-mindedness. The type of values Pope Francis has called for in his writings on the flaws of capitalism and environmental responsibility, and in remarks condemning Trump’s “cruel” immigration policies  and indifference to the poor.  And the type of values so powerfully preached by the Reverend Martin Luther King, Jr., who thought that Gandhi was “probably the first person in history to lift the love ethic of Jesus above mere interaction between individuals to a powerful effective social force on a large scale.”

 

The Age of the Commando: How a Group of WWII Heroes Rewrote the Rules of War

 

An undeclared war is being fought by American Special Operations Forces (SOF) across the globe, at a cost to US taxpayers of tens of billions of dollars a year. While some commentators condemn the deployments as illegal and counter-productive, others argue that SOF help to keep America safe and are comparatively cheap. After all, the Iraq War alone cost at least $1 trillion (some estimates are double that) and claimed the lives of more than 4,400 US servicemen.

 

The rapid increase in the use of SOF began with the so-called “War on Terror,” in the wake of 9/11. At first, SOF were deployed alongside friendly indigenous forces and US conventional ground forces in Iraq and Afghanistan. But since the withdrawal of the bulk of US ground forces from Iraq in 2010, SOF and drone strikes have been the US military’s weapons of choice. In May 2011, for example, President Barack Obama authorized Operation Neptune Spear, a US Navy SEAL Team 6 mission that located and killed al-Qaeda leader Osama bin Laden. By 2016, America’s most elite Special Forces – including Navy SEALS, Delta Force and Army Green Berets – were deployed in 138 countries, or 70 percent of the world’s total.

 

Retired Lieutenant General Charles Cleveland, the commander of the US Army’s Special Operations Command (SOCOM) from 2012 to 2015, summed up the Special Forces role as conducting kill or capture missions and training local allies. “SOF is at its best,” he insisted, “when its indigenous and direct-action capabilities work in support of each other. Beyond Afghanistan and Iraq and ongoing CT [counterterrorism] efforts elsewhere, SOF continues to work with partner nations in counterinsurgency and counterdrug efforts in Asia, Latin America, and Africa.”

 

Manpower has not been a problem. Since 9/11, the number of SOF operatives has more than doubled from 33,000 to 70,000, while SOCOM’s annual budget of $12.6 billion is four times what it was in 2001. Even before 9/11, most military powers were using special operations units as a force multiplier in conventional conflicts, as well as for more specific tasks such as anti-terrorism, hostage rescue, deep reconnaissance, sabotage, and kill or capture missions. But the sudden expansion of SOCOM is the result of two factors: the rise of terrorism in the 1990s, culminating in the al-Qaeda attacks on the World Trade Towers in 2001, and the takeover of large chunks of Syria and Iraq by ISIS more recently; and, more importantly, the fact that the political, human and financial costs of the ill-fated war in Iraq have persuaded successive US political leaders – Obama chief among them – that surgical strikes by SOF and drones are a far more cost effective way of dealing with America’s foes than the bludgeon of conventional forces.

 

A similar calculation underpinned the creation of North America’s first special operations unit – known as the First Special Service Force (“the Force”) – in World War 2. The Force was the brainchild of eccentric British inventor Geoffrey Pyke who, in March 1942 (and with the Allies’ fortunes at their lowest ebb), persuaded British Prime Minister Winston Churchill to raise a small, elite para-ski unit for a top-secret mission behind enemy lines. If Pyke’s scheme came off, said Churchill, “never in the history of human conflict would so many have been immobilized by so few.”

 

But lacking the necessary resources, Churchill offered the scheme to US Army Chief of Staff George C. Marshall, who was visiting the UK to coordinate Allied strategy. Marshall accepted the offer and, with President Franklin D. Roosevelt’s backing, appointed a young Operations officer to form the new unit with tough young recruits – lumberjacks, forest rangers, hunters, northwoodsmen, game wardens, and explorers – from both sides of the US–Canadian border. After completing one of the toughest military training regimes ever devised – including skiing, rock climbing and parachuting – the recruits were sent to southern Italy to capture a crucial Nazi mountain stronghold that stood as a bulwark against the advance of the US Fifth Army, a seemingly impregnable fort that had defeated all previous Allied attempts to take it.

 

By scaling 200 feet high cliffs in rain and darkness, the Forcemen were able to outflank the German defenders and capture the position in under two hours, a feat that theater commander Dwight D. Eisenhower struggled to comprehend, writing, “I have never understood how, encumbered by their equipment, they were able to do it.”

 

It was the first of the Force’s many stunning victories. From December 1943 to the same month a year later, the men fought in 20 battles and always achieved their objective. For every Forcemen who fell in battle or was taken prisoner, his comrades killed at least 25 of the enemy and captured 235. They were awarded more than 250 gallantry medals and 1,214 Purple Hearts. “I can testify,” wrote one veteran war correspondent, “to their spectacular power and efficiency, their marvelous morale and their never-failing spirit of attack.”

 

They were the original force multipliers who, in the words of Senate Majority Leader Mitch McConnell, “helped save a continent in chaos,” and their modern descendants in SOF still honor their achievements. Are today’s SOF in the same league? And is there a danger that SOF’s preeminence will undermine the readiness and capability of America’s conventional forces, as some commentators believe? It’s hard to say. But what is not in doubt is that the modern SOF are just as bold and professional as their World War Two forebears, while their relatively low casualties – 10 of the 13 US soldiers killed in Afghanistan in 2018 were SOF – are a welcome change from the thousands of bodybags sent back from Iraq. 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173475 https://historynewsnetwork.org/article/173475 0
A Holocaust Survivor Tells His Remarkable True Story of Courage and Survival at Auschwitz

 

I was awoken suddenly from my short sleep to loud banging and shouts of “Raus schnell!” We were on the top bunk. Father jumped down first, then me, and then finally my uncle. The Kapos ordered us out of the barracks. It was a beautiful, sunny morning and I found myself looking at hundreds of barracks, thousands of emaciated people behind barbed-wire fences, and dozens of guard towers where SS soldiers manned machine guns and searchlights. Nearby, there were four huge chimneys belching angry red flames and smoke. The smell of burning flesh that had first overwhelmed me when I exited the cattle car still permeated the air. I could not fathom the immense size of this place, and I thought that we must be in a large industrial area. My father told me to move fast if I heard the Kapo’s orders, because otherwise they would beat me.

 

Tables were set up in front of our barracks, and two men sat at each one. They ordered us to come to the table nearest us in single file. Again, my father went first, I was next, and my uncle was last. The first man asked my name, my place and date of birth, what languages I spoke, my height and weight, and the color of my hair. The next man tattooed a number on my left arm: A-9892. My father’s number was A-9891 and my uncle’s was A-9893. Wherever we went, I was always between them; they were my guardian angels.

 

Nearby, there were piles of striped pants, jackets, and caps. I was handed one of each and put them on, but they didn’t fit well. We had no socks or underwear, no belt or suspenders to hold up our pants. From a pile of unattended dirty rags, my father managed to find a pair of trousers, and with his teeth and fingers he ripped off strips of material that he twisted into belts for me, my uncle, and himself. We stripped more pieces of cloth and wrapped them around our feet in place of socks. We also kept a small piece to use as a wipe in lieu of toilet paper. My father and uncle were inventive, and they taught me how to survive under these horrific conditions.

 

Once I put on these striped prisoner’s clothes, I felt like I was no longer a human being, only a number. On two strips of white material, prisoner workers stamped a Star of David with my number. As we proceeded down the line, they used needle and thread to stitch one strip on the front left side of the jacket and the other on the back. Different groups had different triangles: political prisoners got a red triangle (with a P for Polish or F for French, etc.), Roma people had a brown triangle, Jehovah’s Witnesses a violet triangle, homosexuals a pink triangle, habitual criminals a green triangle, and so-called asocials a black triangle. Out of all these groups, we Jews were on the lowest rung of the ladder in the camp hierarchy.

Soon, two prisoners arrived carrying a large canister of hot tea, my first food or drink in days. They gave us metal dishes, lined us up, and portioned out the tea. It tasted quite different from what I was used to at home. My father asked these men if we would see our families that day. They laughed at him, pointed to one of the chimneys spewing flames, and asked, “Where did you come from?”

 

My father replied, “We arrived from Hungary in the middle of the night.”

 

The prisoner said, “It’s 1944 and you don’t know what this place is all about? Your families have gone up through the chimney.” This was camp vernacular to describe being gassed and cremated.

 

At that moment, I’m sure Father realized that my mother and the rest of our family had been murdered soon after our arrival, but it took me a few days to understand the processes of this killing machine. Until I learned more about the existence of the gas chambers, I assumed that they had been burned alive. I was devastated, but I was under such threat at every moment that I could not dwell on the loss of my family during the day. I could think only of work, food, and physical survival. My father and uncle never spoke of the deaths, so when I thought of my family while I lay in my bunk at night, I was alone with my grief. In truth, it was easier to exist in a state of denial than to face this horrible reality.

 

After the prison workers tattooed our numbers on our left arms and inscribed them on our clothing, they lined us up once more. An officer yelled out, “Doctors and lawyers, raise your hands!” Those who did were ordered to step out of the formation and were taken away. Next, he asked for farmers. Many of us raised our hands. My father knew from his time in the labor battalions that working on a farm would give us access to potatoes, turnips, or beets. The guards selected a hundred men, including the three of us.

 

I was hungry, thirsty, and completely shocked by how my life was changing minute by minute and hour by hour. Everything about this place was threatening and filled me with fear, and now we were told that we were going to a different camp. I wondered if the new camp would be similar to Birkenau.

 

The guards marched us several kilometers down the road to Auschwitz I. En route, we passed a group of women with shaved heads and striped dresses; they were harnessed to a huge cement roller that they pulled to grade the road. Some had dilapidated shoes, some wooden clogs or sandals, and some were barefoot. The SS women guards were whipping them and yelling, “Schnell! Faster, you damn Jews!” The soles of the feet of the women without shoes had been ripped to shreds, and the rocks they walked upon were covered with their blood. The SS women were large and bursting out of their uniforms, and the contrast between them and their skeletal prisoners was striking. I couldn’t help wondering if we would be treated the same way.

 

Copyright Harper Collins Inc. Republished from By Chance Alone: A Remarkable True Story of Courage and Survival at Auschwitz with the permission of the publisher. 

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173473 https://historynewsnetwork.org/article/173473 0
Congressional Courage, the D.C. Slave Trade, and Moral Politics in Washington

 

Along a bustling stretch of Independence Avenue Southwest in Washington, D.C., two modest informational panels adjacent to the sidewalk have marked, since January 2017, the site of the Yellow House, a former slave jail in the heart of the nation’s capital. There, enslaved people and abducted free blacks such as Solomon Northup were held as prisoners as they waited to be sold to the Deep South. Abolitionists were outraged that the slave trade was conducted mere blocks from the halls of Congress and turned to their elected representatives for redress. Political cynics may argue that moral bankruptcy is a long-time fixture in Washington, but at a crucial moment of national division, congressional leadership confronted the ethical embarrassment of the D.C. slave trade and eradicated it.

 

By the 1830s, Washington City had emerged as a leading depot for the domestic slave trade that sent enslaved men, women, and children from the Chesapeake to locations farther south and west. Several D.C.-based slave traders incarcerated bondpeople from Maryland and Virginia in private jails in the nation’s capital prior to shipping them, by land or by sea, to New Orleans and other points in the Deep South, where they brought higher prices.

 

Starting in 1836, the most famous slave jail in Washington City belonged to trader William H. Williams. Dubbed the Yellow House, Williams’ compound consumed most if not all of the rectangular block between Seventh and Eighth streets and between B Street and Maryland Avenue, immediately south of the National Mall. It once stood on the block now occupied by the Federal Aviation Administration’s Orville Wright Federal Building, across the street from the Smithsonian’s Hirshhorn Museum.

 

The structures that today dominate the National Mall—the Smithsonian castle and museums; monuments to presidents and war memorials—did not exist when Williams commenced operations at the Yellow House. Thus D.C. travelers oriented themselves geographically around the towering, three-story, plastered brick Yellow House. Surrounded by ten- or twelve-foot walls, partly shrouded by trees, Williams’ slave pen stood as a prominent landmark readily visible from the U.S. Capitol.

 

The arresting sight of enslaved people, chained two by two in long lines known as coffles and forcibly marched from the Yellow House and Washington’s other slave jails, down Maryland Avenue, and across the Long Bridge over the Potomac, elicited frequent complaint. By the 1840s, the practice was so controversial that the transport of enslaved captives increasingly took place under cover of darkness or via horse-drawn coach to avoid offending public sensibilities.

 

Many opposed not only the visibility of the slave trade but its very existence: they wanted the trade abolished altogether in Washington, D.C. In the mid-1830s, abolitionists ramped up efforts to extinguish the traffic. To advance their agenda, tens of thousands of American citizens seeking to terminate the D.C. slave trade petitioned Congress. Perhaps, the abolitionists mused, they could count on congressional leaders to display the requisite strength, courage, and moral fortitude to stand on principle against the evils of human bondage.

 

From 1836 to 1844, Congress toiled under the self-imposed “gag rule” that precluded discussion of petitions dealing with the politically divisive matter of slavery. But by 1850, a proposal to end the D.C. slave trade was included in measures that together composed the Compromise of 1850. If enacted, the proposed abolition of the D.C. slave trade would have a uniquely devastating impact on William H. Williams because, after the retrocession of Alexandria back to Virginia in 1846, his Yellow House remained the busiest slave jail in the District by far.

 

Members of Congress divided sharply over the proposed bill to end the trade. Those representing slaveholding interests in the Deep South opposed any such measure. Senator Jefferson Davis criticized the bill, noting that Williams’ slave jail “is a comfortable looking house.” Davis thought it “Rather a boarding-house in its aspect than a prison.” With a “spacious yard” and a “growth of poplar trees” surrounding it, the Mississippi senator concluded cheerily that Williams’ slave pen “look[ed] as little like a jail as any residence in the city of Washington.”

In reality, Solomon Northup and other inmates of the Yellow House described it as a site of physical and emotional terror.         

 

Whereas Davis refused to acknowledge basic truths about the Yellow House, other congressmen understood the unspeakable horrors that the enslaved suffered behind its heavy iron gates. Representative Joshua R. Giddings of Ohio, an avowed abolitionist, personally visited Williams’ slave jail in an attempt to retrieve one of the many free black individuals kidnapped and imprisoned there prior to their transport south for illegal sale. As Giddings recollected, the formidable structure’s “gloomy walls . . . retained all the horrid barbarity of the darker ages.”

 

Between proslavery and abolitionist members of Congress were those such as Kentucky senator Henry Clay, an elder statesman with more than forty years of experience in Washington. “I have never visited one of these depots,” Clay confessed of the District’s slave jails. He nevertheless opposed them because, despite owning enslaved people and representing a slave state, he recognized the Yellow House and other similar establishments as “nothing more nor less than private jails, subject to no inspection of public authority, under the exclusive control of those who erect them.” Inside, “the owner of the jail” was law, imposing “police regulations” over the enslaved prisoners as he saw fit, absent any external oversight. No proslavery radical, Clay found the traffic in bondpeople in the nation’s capital objectionable and concurred with the notion that it brought “some degree of odium on the District.”

 

It fell to Clay to shepherd the bill abolishing the D.C. slave trade through the U.S. Senate. It was no easy task. Rumblings out of South Carolina in 1850 raised the prospect of secession and civil war. Already, sectional tensions over slavery threatened to rend the country in two. As the architect of the Compromise of 1850, Clay saw the abolition of the slave trade in Washington as the northern counterpoint to the proposed fugitive slave bill, which was intended for the exclusive benefit of southern enslavers. He also understood that the compromise offered different measures so offensive to each portion of the Union that it could never pass if all of its components were bundled together as an omnibus bill. Clay therefore disaggregated the compromise into its constituent parts and held separate votes on its individual provisions. Through this process, the act abolishing the slave trade in Washington, D.C., cleared the Senate by a 33-19 margin. Predictably, all opposition came from slave-state senators. Only Clay, fellow Kentuckian Joseph R. Underwood, Thomas Hart Benton of Missouri, and Sam Houston of Texas voted in favor despite representing slave states. So passed the last piece of the Compromise of 1850.

 

The law that eradicated the slave trade in Washington, D.C., dealt William H. Williams’ operations a fatal blow. By the time it went into effect on January 1, 1851, some Washington slave dealers had relocated across the Potomac to Alexandria to continue operations lawfully there. No evidence survives to indicate that Williams did the same; rather, the change in the law prompted his retirement from slave trading after some twenty years in the business.

 

In 1854, Henry Wilson, who would soon represent Massachusetts in the U.S. Senate, visited the site where the Yellow House had long stood. Within four years of Congress outlawing the D.C. slave trade, the Washington cityscape was cleansed of Williams’ slave pen. An enterprising businessman was utilizing the former site of the slave jail to engage in an immeasurably more pleasant trade. As Wilson recorded, “flowers were blooming where the slave once sighed,” and a sign advertised “Flowers for sale and bo[u]quets made ‘to order.’”

 

Congressional action had inspired positive change. The ugliest humanity had to offer yielded to beauty, an auspicious sign for Henry Wilson and his fellow abolitionists. “I hope,” Wilson prayed, “but a few years more shall pass until every spot wherever the groans of human bondage are heard shall be a garden in which the blossoms of freedom shall make glad the eye, and the accents of hope delight the ear.” Ultimately, however, that grand vision would not be effected by members of Congress pursuing the morally correct course but by a civil war that cost three-quarters of a million lives.

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173469 https://historynewsnetwork.org/article/173469 0
Elbridge Gerry’s Monster Salamander that Swallows Votes

As Americans prepare to vote in local and state elections on Election Day, tens of thousands--even millions--will find their votes chewed, swallowed, and discarded by a monstrous “salamander”—the two-hundred-year-old creation of Founding Father Elbridge Gerry of Massachusetts.

 

Gerry created the metaphorical salamander to reshape voting districts and ensure his own election and re-election and that of loyal political officeholders. The son of a wealthy merchant in Marblehead, Mass., Gerry used his salamander in the 1812 national election, when his friend James Madison sought re-election to the presidency and asked Gerry to run for vice president.

 

To ensure his victory, Gerry slick-talked a majority of his state’s legislators into redrawing the state’s voting-district boundaries. By extending the borders of one district to absorb large numbers of opposition voters from neighboring districts, the redistricting left the remaining districts with voter majorities that favored Gerry and ensured his election as America’s fifth vice president.

 

A Boston Centinel cartoonist drew a caricature of what he called the state’s “gerrymandered” districts, with the overpopulated district that opposed Gerry depicted as a monstrous salamander.

 

A Boston Centinel cartoon from March 1812 shows then-Governor Elbridge Gerry’s creation of a new salamander-shaped political district that he had “gerrymandered” to favor his political party.

 

Gerry’s “salamander” undermined what most Americans believed had been one goal of the American Revolution—elimination of England’s pocket boroughs and rotten boroughs that gave a handful of English noblemen outsized voting control of Britain’s Parliament. Gerry, however, had not served in the military during the Revolutionary War. Like many signers of the Declaration of Independence, Gerry had supported independence to protect his family’s wealth against British taxation—not to give Americans universal voting privileges.

 

When the war erupted in Boston, Gerry smuggled food supplies into the city to offset British Army efforts to starve Bostonians for opposing British rule, but he earned handsome profits from the sale of those goods—as did other American merchants such as John Hancock and Robert Morris. Few, if any, thought ill of profiting from the war. When polemicist Thomas Paine objected in Congress, Philadelphia merchant Robert Morris retorted, “By becoming a delegate [in the Continental Congress]…I did not relinquish my right of forming mercantile connections.”

 

With American independence, Gerry refused to sign the Constitution, arguing that executive power over the army gave the president the potential to become a despot. Married and father of thirteen children by then, he told Congress a President with control of a standing army was like “a standing penis: An excellent assurance of domestic tranquility, but a dangerous temptation to foreign adventure.”

 

With ratification of the Constitution, Gerry ran for and won election to America’s First Congress, where he joined Virginia’s James Madison in winning passage of a Bill of Rights that limited federal government powers to curtail certain individual rights, such as free speech, free press, and the right of assembly.

 

After two terms in Congress, Gerry served as a diplomat in Paris before becoming governor of his state and then vice president. He died in November 1814, leaving as his principal legacy the powerful and dangerous political weapon of “gerrymandering.”

 

For more than a century since the Civil War, each political party in almost every state has used and continues to use gerrymandering to strip millions of their voting power. Leaders in every southern state—Democrats and Republicans alike—gerrymandered for the sole purpose of depriving African-Americans of political influence in local, state, and federal elections. Beyond racial exclusion in local and state elections, the distortion of the popular will has reached the presidency itself: twice in recent decades the Electoral College, whose state-by-state, winner-take-all allocation operates independently of district lines, has sent candidates with fewer votes than their opponents to the White House. Donald J. Trump lost the popular vote by an astounding total of more than three million votes, yet won 304 of the 538 electoral votes and the presidency of the United States.

 

In state after state across America, each political party with a legislative majority is now trying to gerrymander to perpetuate its political power. Only a court decision blocked the most recent effort by North Carolina Republicans, and it was not too long ago that voters across the South needed more than five years of massive popular uprisings, rioting, and the lives of men, women, and children to lop off the head of Gerry’s monstrous salamander. 

 

A kind of gerrymandering also occurs naturally, without the machinations of scheming politicians. The depopulation of farm areas in many states has given the remaining landowners in such districts far more voting clout than voters in heavily populated cities.

 

The beast will continue to regenerate, therefore, until states or the federal government kill it with legislation that sets a minimum population, as a percentage of state population, that a district must contain to qualify as an election district.

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173464 https://historynewsnetwork.org/article/173464 0
What If Mike Pence is the 2020 Republican Presidential Nominee?

Ronald L. Feinman is the author of “Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama” (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

 

Could the House vote to impeach Donald Trump by the end of the year? The tumult over the Ukraine telephone conversation between Trump and Ukrainian President Volodymyr Zelensky led Speaker of the House Nancy Pelosi to launch a formal impeachment inquiry. The impeachment trial, if it comes, will take place after Thanksgiving, if not later. While it seems unlikely at this point, if Trump were removed from office or resigned, Vice President Mike Pence would become president with less than a year remaining in the present presidential term.

 

The latest point in a presidential term at which a president has left office came in 1963. After John F. Kennedy was assassinated on November 22, Lyndon B. Johnson became president with slightly less than a year until Election Day 1964 and approximately one year and two months left in JFK’s term.

 

As the 1964 presidential election approached, LBJ’s only challenger for the Democratic Party nomination was Alabama Governor George Wallace. Wallace had become a nationally known and controversial figure after he opposed the admission of two African American students to the University of Alabama in June 1963. He was unable to put a dent in Johnson’s primary campaign, however.

 

The only other potential obstacle to LBJ’s presidential campaign was Attorney General Robert F. Kennedy, who remained in the cabinet until the summer of 1964. RFK wished to be Johnson’s vice presidential running mate, but Johnson had “bad blood” with RFK dating from the beginning of the JFK presidency. LBJ did not want RFK to have any influence in his full-term bid, and so he chose Minnesota Senator Hubert Humphrey as his running mate instead. In the election, Johnson received an all-time high of 61.1 percent of the vote and 486 electoral votes. He defeated Senator Barry Goldwater of Arizona by winning 44 of 50 states.

 

After Warren G. Harding died on August 2, 1923, his successor became president with the second least amount of time left in a presidential term. Calvin Coolidge became president with about nineteen months left until the next inauguration, and about fifteen months to Election Day 1924. 

 

Coolidge faced the opposition of progressive California Senator Hiram Johnson, who competed in a number of primaries but won only in South Dakota. Progressive Wisconsin Senator Robert La Follette, Sr. ran a vigorous third-party campaign as the revived Progressive Party nominee, winning his home state and 16.6 percent of the total national vote. Ultimately, Coolidge easily defeated his two opponents, La Follette and Democratic presidential nominee John W. Davis, by winning 54 percent of the vote.

 

If Trump is removed from office, Mike Pence would likely become president with less time left in his predecessor’s term than any successor in American history. To gauge Pence’s potential chances in 2020, President Gerald Ford’s experience succeeding Richard Nixon, who resigned in August 1974 after the Supreme Court ordered him to hand over the Watergate tapes in United States v. Nixon, may be the more relevant precedent. Ford became president with nearly two and a half years left in Nixon’s term, a full year more than Calvin Coolidge had after Warren G. Harding’s death and fifteen and a half months more than Lyndon B. Johnson had after John F. Kennedy’s death. Even so, the effect on the Republican Party and on Ford himself was extremely detrimental, owing to the Watergate scandal and Ford’s decision to pardon Nixon a month into his presidency and two months before the 1974 midterm election.

 

This contributed to the Democratic Party gaining 49 seats in the House of Representatives, securing a two-thirds majority in the 94th Congress. The Democrats also gained four seats in the US Senate, for a total of 60, making the political situation very tough for Ford during the remaining two years of the term. The Nixon pardon and the bad economy undermined him and led to his defeat for a full term in the Oval Office in 1976.

 

It is seemingly a long shot that Trump will be removed from office, as only Senator Mitt Romney has hinted he would support such an action, and 20 or more Republicans would need to vote for removal in the US Senate. But there clearly are others who might join them, making Senate removal at least conceivable. As more evidence comes out and discontent grows with Trump’s Syrian policy and his insults and character assassination of everyone imaginable, it is not beyond the realm of possibility that an untenable situation could develop, one that would make conservatives in the Republican Party prefer a person closer to their hearts and views: Vice President Mike Pence.

 

Therefore, it’s worth considering what might happen if Pence became president and tried to run for a full term as President while defending his connections to an ousted Trump. 

 

Would anyone in the Republican Party challenge President Pence in primaries or caucuses, assuming any contests could still be entered and registration deadlines had not already passed? Would a John Kasich, Jon Huntsman, Mitt Romney, or another former contender for the presidency enter the race?

 

Would anyone attempt to make the nomination a convention struggle in August 2020 at the Republican National Convention in Charlotte, North Carolina, something that has not occurred in decades?

 

And how would this affect the Democratic Presidential nomination battle which would be in full throttle, especially in February and March 2020 when a majority of the scheduled primaries and caucuses will take place?

 

Would this scenario favor an establishment candidate, such as Joe Biden; or a more leftist candidate, such as Bernie Sanders or Elizabeth Warren; or a fresh face from the moderate wing, such as Pete Buttigieg, Kamala Harris, Amy Klobuchar, or Cory Booker? Or would it lead others to announce their candidacies, such as Hillary Clinton or Michael Bloomberg?

 

Could a third party or independent candidate further complicate the political field, such as Independent Justin Amash running as a Libertarian? 

 

This is all uncharted territory, and creates the possibility of total chaos in an election year, potentially greater than in 1968.

 

So we could be on the way to an election year like no other since the Civil War, the Great Depression, and the tumult around the Vietnam War, and no one can possibly predict who will be inaugurated President on January 20, 2021.

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/blog/154272 https://historynewsnetwork.org/blog/154272 0
The Total Eclipse That Helped Prove Einstein's Theory of Relativity  

When Albert Einstein published the first draft of his general relativity theory in 1911, it predicted that light would bend as it passed through the gravitational field of a massive object. To verify his calculations, he needed two things: astronomers and a total eclipse of the sun. If astronomers journeyed to the narrow eclipse track—it’s called the path of totality and is less than a hundred miles wide—where the sun’s light is completely blocked by the moon, they might photograph starlight as it passed our closest star. That is, if all aspects of the expedition went as planned. Over the course of a decade, seven daring astronomers would attempt to test Einstein’s theory. They were faced with tremendous obstacles even when things went well. Planning an expedition was not an easy task.

 

The century’s first total eclipse, on May 28, 1900, was a perfect example of the work that goes into accommodating a visiting expedition. This eclipse shadow crossed the face of the earth like a current of electricity, connecting a string of human beings who waited along a path more than 10,000 miles long. Its track of totality was narrow, only 50 miles wide. But its length was expansive in that it reached many places that would provide good viewing around the globe. As the moon’s shadow, or umbra, moved eastward at approximately 2,200 miles per hour, it was observed by scientists in Mexico, the United States, Portugal, Spain, Algeria, Tunisia, Libya, and Egypt before the shadow ended in the Red Sea. In the United States, its path swept across several southern states, stretching 925 miles from New Orleans to the coast of Virginia. There would not be another total eclipse viewable in the country for eighteen years, and the press jumped on it.

 

Two of the towns in the southern states that lay in the eclipse path were Washington, Georgia, and Wadesboro, North Carolina. Their locations, far from city lights, would provide good sites for expeditions to set up. It was a major event in these sleepy towns when astronomers descended like locusts in 1900. They may not have been the big theatrical stars of the day, but as distinguished scientists, they were close to it. Even before the teams appeared in person, the excitement had begun. Local buildings were given fresh coats of paint. Lawns were manicured. Cafés and shops stocked up on supplies, anticipating a windfall in sales. Town committees and social groups planned afternoon teas and evening lectures for the esteemed guests. There would also be farewell picnics and barbecues when it was all over. This meant gallons of ice tea made, dozens of peach pies baked ahead, and enough skillet cornbread and barbecued pork to feed a small army.

 

Hotel rooms were reserved, and private residences made available to rent. Journalists and photographers arrived and found lodging. Often, the mountains of astronomical equipment would be shipped ahead, accompanied by a team member to oversee its safety. Local boys and men were then hired to unload the many railroad cars. Crates and boxes would be packed into wagons pulled by horses or mules and transported to the campsites. The folks living in the path of totality prepared as well as they could. Then they waited for the “clippers,” as they called them, to appear in person.

 

A dozen of the most prestigious American observatories sent teams to these southern locations. The cast of characters would read like a who’s who of famed pioneers in the field of astronomy, both the old guard and the new. When the astronomers arrived, the serious work began. Masons would begin erecting the brick piers, and carpenters would build the wooden platforms to hold the telescopes. The canvas huts and awnings that would protect such expensive equipment from rain and sun would need erecting. Volunteers were chosen to act as security guards, standing watch at the huts during the night. Helpers who were needed on eclipse day to assist in scientific tasks such as handling the stopwatch and calling the time might be found among local merchants, blacksmiths, and farmers. They would be rehearsed in their tasks for two or three days before the eclipse. Sketch artists who could capture the image of the corona either came with the expeditions or were found locally. By eclipse day, telescopes forty feet high would be pointing at the sky, like carnival rides at a state fair.

 

Nature was kind on May 28, 1900. On the day of the eclipse, there would be clear blue skies along most of the path from New Orleans to the Atlantic Ocean. The streets of Washington, Georgia, were packed tight with people, many reminding each other, “Don’t look with your bare eyes!” On out-of-the-way hilltops and in open fields, the astronomers kept close watch on the crescent’s length. Their numerous telescopes and cameras were ready. Business owners locked up shop and went with their employees into the streets. The many sketch artists, including “five young ladies from town,” stepped up to their boards, excited. Once totality began, some artists would sketch just what they saw with their naked eyes. Others would peer into the eyepieces of telescopes as they drew.

 

At 8:10 a.m., as the spectators waited, breathless, the moon began its ascent on the sun. As viewed in Washington, the full eclipse lasted for one minute and twenty-five seconds. That was long enough to disturb the natural world. Flocks of purple martins and swallows flew in circles overhead as cicadas and insects rattled in confusion. The Macon Telegraph commented on this disruption as the umbra fell over the countryside: “The cows that were being driven in small groups to the pastures near the city stopped in the streets and tried to turn back. Chickens and fowls cackled and cawed, denoting their alarm. The ignorant and superstitious dropped on their knees and prayed to be forgiven.” In the town of Washington, the crowd cheered wildly as the darkness receded and the sun began its reappearance.

 

Traveling at such a phenomenal speed, the eclipse shadow passed over South Carolina and raced on to North Carolina. The streets in Wadesboro were also flooded with spectators, an excursion train from Charlotte having arrived earlier with hundreds of people. Most of the scientists were dispersed, set up and waiting in individual encampments. Despite the early hour, the temperature was almost seventy degrees. Although the “Yanks” had been warned of the southern heat even in late May, several would collapse and need treatment by local doctors. As the full eclipse began, people watched from sidewalks or climbed to the rooftops of buildings, holding in their hands pieces of smoked glass or darkened binoculars. A smattering of small and inexpensive telescopes owned by schoolchildren were pointed at the sky. The total eclipse was over in less than ninety seconds when a sliver of yellow sun began to emerge. The shadow had already sped on to Norfolk, Virginia. There, it would dash across the Atlantic to reach the people eagerly waiting in Europe and Africa.

 

Scattered along the path of totality to view the eclipse of 1900 were five astronomers whose careers would later be linked to a man named Einstein, who was yet to graduate from the Swiss Polytechnic Institute in Zurich. In Thomaston, another tiny town in Georgia, from the Lick Observatory in California, were William Campbell and Charles Perrine. In Portugal, from the Royal Observatory at Greenwich, were Frank Dyson and Charles Davidson. And standing on the rooftop of the Hôtel de la Régence in Algiers was Andrew Crommelin, also from Greenwich. Three more men who would have important roles to play were elsewhere: Erwin Freundlich, a fifteen-year-old German schoolboy who was dreaming of becoming a shipbuilder; Arthur Eddington, an English college student who had just that year discovered a love for physics; and Edwin Cottingham, who was busy in his clock shop in the tiny village of Thrapston, England.

 

A dozen years down the road from the 1900 eclipse, Freundlich would send Perrine a letter in which he would refer to a Professor Einstein who had just conceived of a new theory that needed verification. This letter would set the ball rolling, and that theory would change all their lives. It would also mean forsaking the classical physics they had learned as students. But this chain of events was still more than a decade away. It was now a brand-new world that had just entered into a brand-new century. From that day in 1900 to the eclipse in 1919 that would make Einstein world-famous, there would be twelve more total solar eclipses. Their paths of totality would touch on all seven continents and cross all oceans. These marvels of nature still had much to teach about the secrets of the universe. The astronomers were learning. It would be a matter of the right eclipse, the right place, and the right time.

 

This article has been adapted from Proving Einstein Right: The Daring Expeditions that Changed How We Look at the Universe by Sylvester James “Jim” Gates, Jr. and Cathie Pelletier. Copyright © 2019. Available from PublicAffairs, an imprint of Perseus Books, LLC, a subsidiary of Hachette Book Group, Inc.

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173408 https://historynewsnetwork.org/article/173408 0
William Loren Katz: Teacher, Author, Editor and Activist (1927-2019)

 

William Loren Katz, historian who championed the marginalized, died on October 25, 2019. He was 92 years old. Bill, who is survived by Dr. Laurie Lehman, his wife and partner of 36 years, was the father of Naomi and Michael, proud grandfather of Maya, and lifelong friend of Dr. Virginia Shipley. He is also survived by a legacy that comprises the forty American history books that he authored including Black Indians, The Black West [a revised edition was published this year], and Breaking the Chains: African-American Slave Resistance. As the series general editor, Bill oversaw the publication of over 200 edited volumes from The American Negro: History and Literature, and The Anti-Slavery Crusade in America series published by the New York Times and Arno Press.

 

I “met” Bill three times and it wasn’t until the third time that I realized who he was. As a graduate student in the 1970s, I was aware that William L. Katz was the general editor of The American Negro: History and Literature series. As a high school United States history teacher in the 1980s, an important source was an edited collection of primary source documents, Eyewitness, The Negro in American History by the same William L. Katz. Later, my wife worked with Bill’s wife. After a visit to their Greenwich Village apartment and seeing Bill’s bookshelves, I realized my friend was The William L. Katz. Bill was always generous with his time and helped me with research for the New York and Slavery Complicity and Resistance curriculum and two books.

 

Bill was born in New York City and entered the United States Navy in 1944 at the age of 17, right after graduating from high school, so he could join the fight against Fascism. After the war he used his GI Bill benefits to study history at Syracuse University (BA History, 1950) and education at New York University (MA Secondary Education, 1952). He then taught in New York City and State public schools for fourteen years. As a historian and educator, Bill was a consultant on numerous projects and worked with the U.S. Senate, the British House of Commons, the Smithsonian Institution, and a number of school districts.

 

Bill never separated his work as a historian from teaching and activism. He was especially proud of his anti-Apartheid activism in the 1980s, his work with WBAI-FM radio, and his support for Black Lives Matter. Bill co-authored a picture history about the Abraham Lincoln Brigade for young adult readers and was a strong supporter of their veteran organizations. He wrote books about Black Cowboys and Black Indians and advocated for the rights of indigenous people, which led to him receiving the White Dove Imani Peace Award from the White Dove-Imani-Rainbow Lodge of Ohio. Bill received a lifetime achievement award from the Institute of African American Affairs of New York University and in 2012, a National Underground Railroad to Freedom Award from the National Park Service.

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173436 https://historynewsnetwork.org/article/173436 0
The Internet at 50: The Night the Internet Was Born

Above: The laboratory’s logbook from the night the Internet was born

 

This is the third article in a series reflecting on the Internet at 50. For the first article on the four developments that created the world wide web, click here. For the second on the dot com bubble burst, click here

 

On October 29, 1969, computers at UCLA and the Stanford Research Institute were connected in the first tentative experiment that would later be recognized as the birth of the internet. But as Harlan Lebo, author of 100 Days: How Four Events in 1969 Shaped America (Amazon, Barnes & Noble), points out, 50 years ago there was no expectation of where the achievement would lead, and the real impact would not begin to be understood for almost 20 years.

 

____________________

 

“Here is a question: how many revolutions do you know that you can tell the exact minute when the revolution began?”

                                                – Leonard Kleinrock, UCLA

 

 

UCLA 

October 29, 1969

9:30 p.m. 

 

Boelter Hall is a nondescript but pleasant enough brick building on the UCLA campus, framed by California olive trees and bordered by the grass-lined walkway known as the Court of Sciences in the south section of the university. 

 

During the day, Boelter Hall teems with engineering students. But in the evening, with most student housing far across the campus, the building descends into quiet solitude – an ideal setting for work that requires time and focus.

 

October 29 was a perfect night to change the world.

 

Charley Kline, a graduate student in the engineering school’s computer science department, viewed the late hours as a time to work in Boelter free of distractions.

 

“I was a tech guy who liked to program at all hours,” Kline said, “and it was much easier for me to stay focused in the middle of the night.”

 

But “to program” in 1969 meant something vastly different from what it means for today’s UCLA engineering student, pecking away on a two-pound laptop while sitting in a coffee shop in Westwood a few blocks from campus. For Kline, and for a generation of students in the young field of computer science, a PC of any size or price was almost a decade away; “to program” meant working in an on-campus laboratory, “computers” were room-filling systems, and “keyboards” usually meant cumbersome stand-alone terminals shrouded in sheet steel.

 

That night, Kline’s project would be relatively simple, and involved testing a new system that had been installed at UCLA for almost two months: an experimental project funded by the federal government to create links between computers in locations across the country.

 

Simple – if it worked.

 

For Kline, such assignments were departures from the traditional education that most computer science students pursued in the 1960s. Although some opportunities in computing involved government or academic systems that were used for scientific research and calculation, in that era, jobs in computer science generally meant support for large systems that served as giant calculators and billing machines for banking and other industries. 

 

But Kline sought different types of opportunities.

 

“I was interested in exploring the problems that were emerging in a world where computers worked independently with some success, but had a great deal of trouble communicating with each other,” Kline recalled. “Our goal was to determine how to make them talk.” 

 

Kline found an opportunity for that mission in the laboratory of Leonard Kleinrock, who at 35 was already recognized for developing a mathematical theory of the methods to create communication pathways between computers at different locations.

 

In 1969, Kleinrock’s principal project to validate his theoretical discoveries was to participate in building an experimental network, a system that would, according to the July 3 press release, “for the first time, link together computers of different makes and using different machine languages.”

 

“As of now, computer networks are still in their infancy,” Kleinrock explained in the release, “but as they grow up and become more sophisticated, we will probably see the spread of ‘computer utilities,’ which like present electric and telephone utilities, will service individual homes and offices across the country.”

 

Creation of the network, reported the release, “represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.”

Almost 50 years later, Kleinrock recalled, “In simplest terms, we were trying to shift the thinking from everyone using a large stand-alone computer, to a linked network that could exchange information.”

 

But for the moment, functional networks were still to come; first came learning how to create practical connections between computers, with a goal of linking systems at universities, government agencies, and scientific institutions so they could communicate and exchange information. To start, four computers would serve as the foundation of the system: machines at UCLA, the Stanford Research Institute, UC Santa Barbara, and the University of Utah. 

 

* * * * * * * *

 

At UCLA, just receiving the delivery of the computer in 1967 became a logistical headache; the computer – a Sigma 7 built by Scientific Data Systems of Santa Monica – was eight feet wide and almost six feet tall, so cumbersome that no elevator in Boelter Hall could accommodate it. To move the Sigma 7 into the building required a forklift on the loading dock at the back side of the building to raise the plastic-wrapped computer to the third floor, where a section of railing had been ripped out to allow the equipment to slide through.

 

In the lab, the Sigma 7 was connected to a Honeywell DDP-516, a “mini-computer” (merely the size of a refrigerator). The Honeywell was chosen not only for its price and performance, but also for its rugged structure built to military specifications; to demonstrate to visitors the computer’s physical strength, Kleinrock would pound on the cabinet with his fist.

 

The Honeywell was equipped with additional technology created by the consulting firm of Bolt, Beranek, and Newman, a Cambridge, Massachusetts company that added the parts to transform the computer into an “Interface Message Processor,” better known as an IMP (this was the first device now called a “router”). 

 

When attached to the Sigma 7, the IMP would – everyone hoped – serve as an all-purpose gateway that would link computers in many locations, built by separate manufacturers, created for a range of purposes, and all using different types of programming languages. It would be an ambitious project.

 

The first IMP – today still identified with the tag that marked it as node #1 in the national network to come – had been delivered to UCLA on August 30; three days later, Kleinrock’s team successfully linked the IMP to the Sigma 7. With each IMP requiring a month to construct, the second was ready late in September. On October 1, it was delivered to the Stanford Research Institute in Menlo Park, 350 miles north of UCLA. Later, the third and fourth IMPs would be sent to UC Santa Barbara and the University of Utah, completing the equipment for the quartet of computers that would be the start of the new network.

 

The next step was to encourage the machines to talk, listen, and respond.

 

Around 9 pm, Kline walked across the Court of Sciences to the entrance of Boelter, and then downstairs to room 3420, the home to the UCLA Network Measurement Laboratory, where the new computers had been shoehorned through the door.

 

Kline sat down at the industrial-metal desk next to his terminal, picked up the phone, and dialed a number in Menlo Park.

 

* * * * * * * *

 

At the Stanford Research Institute, Bill Duvall was waiting for Kline’s call. Duvall, at 25, was working full-time at the institute. A nonprofit research organization in Menlo Park that was spun off from Stanford University in 1946, the Institute (now known as SRI) had been established to serve as a “center of innovation” – an organization-for-hire that conducted research, designed products, and developed plans for civic agencies and private industry.

 

For Kline and Duvall, the task that night was clear.

 

“Our goal was to test the capability of the UCLA machine to log in to the computer at SRI,” said Kline. 

 

It would have seemed a simple experiment, but in practice, the process was much more complicated.  That night they would try it.

 

* * * * * * * *

 

At 9:30 p.m., Kline and Duvall, each on a telephone headset, powered up their equipment and activated their experimental operating systems that would allow Kline to connect.

Just after 9:30, Kline typed a letter.

 

"The first letter I typed was an L," Kline said. On Duvall’s terminal, the “L” appeared. 

 

Kline tried again; he typed the “O.”

 

“I got the ‘O,’” Duvall reported.

 

But that was all; the computer at SRI overloaded and the connection crashed. Two letters was as far as they got.

 

But two letters were enough. For at least a moment, the connection had worked. The first communication between the computers was "LO” – an inadvertent, almost-biblical declaration of the beginning of a new age.

 

“We couldn’t have planned a more powerful, more succinct, more prophetic message,” Kleinrock remembered.

 

Duvall was able to quickly fix the problem, and an hour later the two machines were again connected, with Kline successfully logging in. At 10:30 p.m., Kline duly recorded the moment by writing the result in the laboratory’s logbook (see image at top of article), and then went home to bed.

 

Duvall did not see the need for festivities either. He stopped by a local hangout for a burger and a beer.

 

“It was no celebration,” Duvall said. “I was hungry.”

 

* * * * * * * *

 

Kline and Duvall did not mark the moment because they did not realize they had a reason to celebrate. The pair viewed their work that night as simply another step in what they knew would become a long and complex series of technological events. 

 

But leading to what?  The link between computers at UCLA and SRI was never intended to become the indispensable technology used daily by billions. Kline, Duvall, Kleinrock, and hundreds of other computer scientists developing 1960s technology had hopes of building a system that would, perhaps at most, connect computers so they could easily exchange information and allow their users to communicate with each other.  

 

At a time when the first personal computers would not appear until the late 1970s, and public access to the internet would not be available for almost 15 years after that, a future filled with billions of websites, online shopping, social media, and instant global access to information was not even the remotest practical consideration.  Such miracles were being pondered only as fanciful theory – and on a much more limited scale – by a handful of visionaries.

 

The birth of the internet – if one technological link in a long chain can be described as a “birth” – had no emotion associated with it; there were no ticker-tape parades, no drama of Thomas Edison watching the first light bulb burn while he contemplated the enormity of what he had done. But the connection achieved on October 29, 1969 was, if nothing else, the starting point of a journey leading to technology that not only succeeded in its original objective, but would evolve into a phenomenon for communication beyond anyone’s most outrageous expectations.

Like all great technological achievements, the internet as we know it exists thanks to a serendipitous intersection of events, people, and inspiration. Over the decades since, some of those combinations would thrive and change the most fundamental activities of work, play, and human interaction; others would fail spectacularly.

 

Perhaps the most astonishing issue yet to come about the internet would be the extraordinary story of its emergence, as it progressed from a fragile connection between two computers in 1969 with a modest intended function into the most pervasive communications tool of its age – possibly of any age – affecting everything we do, everything we say, and everything we achieve. 

The internet serves as an instrument for soaring to creative heights, and spotlights troubling questions about the lowest forms of human depravity.  It produces unprecedented opportunities for social interaction while raising deep questions about personal privacy and national security. And because of the internet, perhaps more than any other human advancement, the world is now a much different place than before it arrived, and continues to be reshaped as the technology evolves.

 

But on October 29, 1969, all of that was years in the future.  If there was a single moment that would define the start of the technology that would become the internet – this was it: the future was born. 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173413 https://historynewsnetwork.org/article/173413 0
Roundup Top 10!

A Belated Recognition of Genocide by the House

by Samantha Power

For too long, Turkey bullied America into silence. Not anymore.

 

Impeachment Wasn’t Always This Fair

by Buckner F. Melton, Jr.

For more than half of the country’s history, potential impeachment defendants had wildly different rights from the ones they have today.

 

 

The ‘Deep State’ Exists to Battle People Like Trump

by Margaret O’Mara

A merit-based system for hiring federal employees was created in reaction to the rampant corruption of the Gilded Age.

 

 

What we get wrong about Ben Franklin’s ‘a republic, if you can keep it’

by Zara Anishanslin

Erasing the women of the founding era makes it harder to see women as leaders today.

 

 

A Racist Attack Shows How Whiteness Evolves

by Nell Irvin Painter

An assault at a New Jersey high school football game had an unexpected cast of characters.

 

 

The United States Overthrew Iran’s Last Democratic Leader

by Roham Alvandi and Mark J. Gasiorowski

Despite a campaign of historical revisionism in Washington, the archival record makes clear that the U.S. government was the key actor in the 1953 coup that ousted Mohammad Mosaddeq—not the Iranian clergy.

 

 

What Tenured Faculty Could Do, if They Cared About Adjuncts

by Herb Childress

Here are 11 things they can do right now that would make a difference.

 

 

What the Dismantling of the Berlin Wall Means 30 Years Later

by James Carroll

As the 30th anniversary of the end of the Cold War approaches, it should be obvious that there’s been a refusal in the United States to reckon with a decades-long set of conflagrations in the Greater Middle East as the inevitable consequence of that first American invasion in 1990.

 

 

Recalling Purple Hands protests of 1969 on Halloween

by Marc Stein

Halloween has long been one of the queerest of holidays, but on October 31, 1969, San Francisco LGBT activists found new ways to confront their terrifying fears of media misrepresentations and police violence.

 

 

Career Diversity and the Crisis of Grad Student Mental Health

by Erin Leigh Inama, Sarah Stoller, and James Vernon

The myth of the academy as a meritocracy that rewards the smartest and most talented often generates anxiety and depression.

 

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173460 https://historynewsnetwork.org/article/173460 0
Native Americans, government authorities, and reproductive politics

 

In the 1970s, doctors in the United States sterilized an estimated 25 to 42 percent of Native American women of childbearing age, some as young as 15. Even the lower estimate—one quarter of Native women—is a whopping statistic. The sterilizations, subsidized by the federal government and often undertaken without consent or under great duress, marked the culmination of a long history of efforts by federal and local authorities to manage the reproductive lives of Native families, explains Brianna Theobald, an assistant professor of history at the University of Rochester, in her new book, Reproduction on the Reservation: Pregnancy, Childbirth, and Colonialism in the Long Twentieth Century (University of North Carolina Press). 

In this interview, Dr. Theobald elaborates on this troubling history.

 

What was life like on Native reservations in their first few decades?

 I focus mostly on reservations in the West. Conditions on these reservations were extremely difficult in the late 19th century. Men were not allowed to hunt, and government rations were inadequate. At Crow there was an incredible demographic loss—deaths really—a result of not being able to move around and not being able to do what they had done traditionally for sustenance. They were confined to smaller spaces, which led to the rapid spread of disease. All of this had an effect on women’s health: women of childbearing age were particularly vulnerable to tuberculosis. At the same time, because of this demographic decline, there was a tremendous urgency among Native people to reproduce. So you had fewer women bearing a greater reproductive burden and you can see the outcomes in reduced infant and maternal welfare: greater sickness and mortality.

 

Why is it important, as you argue, to see reservations as colonial spaces?

These are spaces where the federal government has primary and ultimate authority. What I’m talking about here is settler colonialism—when the colonizer, in this case the Europeans and later Americans, come to stay. Their objective is to replicate the societies they left behind. For that, above all, they need land. And to get land, Native peoples had to disappear, in one way or another, for these resources to become available. Historically, this attempted elimination occurred in different ways at different times—through ethnic cleansing and forced removal, sometimes through massacres, and then by the 19th century through cultural assimilation.

 

How was cultural assimilation, in this context, a form of colonialism?

The federal government’s assimilation agenda was an attempt to transform Native peoples into American citizens by forcing them to discard all markers of “Indianness,” adopt English and Western practices, and convert to Christianity. A lot of that assimilation agenda centered on gender, family, and the home. In the late 19th century, federal authorities, missionaries, and social reformers deemed it really important for Native people to identify as nuclear family units, led by a male head of household. That required marginalizing the extended family, which was essential to Native family structures, and it decreased women’s power within the home, within the family, and within the community. The federal government wasn’t entirely successful in this effort, but the objective was clear.

You argue that colonial politics have always been, and remain, reproductive politics. How so?

Efforts to alter and control Native women’s reproductive practices were integral to federal policies that at first glance might seem to have little to do with pregnancy or childbirth. Their reproductive experiences were affected by policies ranging from the allotment of tribal land to the relocation program following World War II. More generally, the federal government assumed greater control over Native reproduction over time.

In the 1910s, the government tried to get Native women to give birth with government physicians in hospitals, to move away from midwives and bring childbirth under the purview of the federal government instead. By the midcentury, when childbirth really had moved into hospitals, the federal government had a tremendous amount of control over where women gave birth, with whom they gave birth, the family planning options available to them, and so forth.

 

How else did federal authorities interfere in the family lives of Native Americans?

 The forerunner of the Bureau of Indian Affairs, the Office of Indian Affairs, had all these different employees on the reservation. Some were supposed to teach the men to farm, field matrons were supposed to go into women’s homes and teach them the art of domesticity, there were teachers, doctors, nurses. At the Crow Reservation in the late 19th century, I found that the superintendent’s directive to all these different employees was basically to watch what was going on and to report back: report any pregnancies—to curb abortion but also to know paternity, to know if this was out of wedlock, and if so, to pressure a legal Christian marriage. And if a woman had had several births out of wedlock, to determine if punishment might be in order. This surveillance was concerned with women’s reproductive lives, but also with knowing if a woman had left her husband, which in Crow society would have been fine, but was very much frowned upon and sometimes punished by the federal authorities.

What precipitated the mass sterilizations in the 1970s?

The Family Planning Services and Population Research Act of 1970 subsidized sterilizations for Medicaid and Indian Health Service patients. Many Native people received their healthcare through the IHS. We know that after its passage, sterilization rates on many reservations increased. On the Navajo Reservation, for example, they doubled between 1972 and 1978. That doesn’t mean that all these procedures were performed coercively—some women saw it as their best family planning option, given their circumstances—but we do know that the subsidization of the procedure as well as the increased legitimacy of sterilization as a form of birth control at the time facilitated coercive use of the technology.

 

Native American resistance is a major theme of your book. How have Native women, in particular, resisted these incursions over time?

Until midcentury there was tremendous resistance in some areas to the acceptance of government physicians. The women might tell the field nurse, “yes, I’ll come to the hospital” and then wouldn’t. It’s actually quite funny to read the documentary record. These field nurses would write in their reports that they were frustrated that women who were visibly pregnant would just lie to their faces and say, “no, I’m not”—in an effort to maintain reproductive self-determination, to keep reproduction in these gendered and generational networks where they believed it belonged.

In the 1930s, Susie Yellowtail, a Crow woman, took up midwifery because of her dissatisfaction with her own birthing experience at a government hospital.

Another example was the establishment of the American Indian Movement, or AIM, in 1968 in Minneapolis. This was an intertribal group that was very committed to rejecting the assimilationist pressures of the preceding decades—instead focusing on cultural revitalization and the defense of Native sovereignty, Native treaty rights. AIM is just one of several militant groups that became associated with what’s called the Red Power movement.

In the 1970s, Native activism and resistance became very visible, more widespread, and ultimately coordinated nationally and internationally. That’s when Native women really started to organize independently. They formed Women of All Red Nations, WARN—the group that especially took on these sterilization abuses. Under pressure, the U.S. General Accounting Office (now the Government Accountability Office) investigated the issue in 1976. It released a report, which actually stopped short of saying that government divisions performed sterilizations coercively, but it did raise a number of concerns regarding the consent process. In the aftermath of this report, amidst Native activism, and also activism by African-American and Latina women, the Department of Health, Education, and Welfare adopted new regulations that offered some tangible protections for women, which went into effect in 1979.

 

Where are Native women today? You write that some hospitals on reservations have been forced to limit services or have closed due to chronic underfunding and staffing shortages—forcing some women in labor to travel an hour or two to the nearest hospital to deliver, which is unsafe.

 There are at least two strands. First, there’s a movement now among Native women who do not want a medicalized birthing experience in any hospital, and who are trying to create alternatives that seem more culturally appropriate to them, and which they view as an enactment of their bodily autonomy and sovereignty. As a result we see pockets of a resurgence of Native midwifery, and Native doulas.

At the same time, there are other Native women who are very upset, for various reasons, that they can no longer give birth at a government hospital on the reservation.

I see these two movements as quite complementary, in terms of the reproductive justice agenda, in that women should have some control over the circumstances under which they give birth. It’s important to note that the Native maternal mortality rate continues to outpace that of white women, for a variety of reasons that are squarely rooted in the colonial history.

 

This interview originally appeared on the University of Rochester's website. It is republished with permission.

US Militarism, Having Provoked ISIL into Being, Kills Cult Leader Baghdadi

“Abu Bakr al-Baghdadi,” the nom de guerre of the notorious Iraqi terrorist Ibrahim al-Samarrai, is dead, killed in a US special forces raid on his compound in Syria’s northwest Idlib province. Declared a “caliph” in 2014, this minor cultist helped turn the Fertile Crescent into a charnel house. From a village near the mostly Sunni Arab city of Samarra, he barely graduated from high school and, in secular, socialist Iraq, was shunted to the less desirable Islamic University of Baghdad, where he studied some Islamic subjects, likely at a low level and mainly by rote.

Contrary to what the Western obituaries say, he was not “an Islamic scholar.” He may have given some sermons in a small local mosque.

Baghdadi was in US custody in Iraq in 2004, and in the prison camp he made some of the contacts that later formed the ISIL terrorist organization. The US at any one time held 25,000 Iraqi young men in camps on suspicion of trying to resist the US military occupation, brutalizing and further radicalizing them. Some, as at Abu Ghraib, were tortured and humiliated.

ISIL did not arise organically from Islam. There was little violent religious extremism among Sunnis in Iraq before the United States invaded in 2003. Iraq had had a secular, socialist ideology and its government refused to put Islam in the constitution as the religion of state. Bush installed a Shiite sectarian government allied with Iran and pushed Iraq’s Sunni Arabs into such despair that some of them turned to “al-Qaeda in Mesopotamia,” which morphed into the “Islamic State of Iraq,” and then after the Syrian revolution of 2011 became the “Islamic State of Iraq and the Levant.” Had there been no American invasion and occupation of Iraq, there would have been no ISIL. US warmongering has sown dragons’ teeth throughout the Middle East.

Al-Baghdadi and his movement did enormous damage to the image of Islam in the world, and committed genocide against Muslims, as the term is defined in the Rome Statute that established the International Criminal Court. One of the insidious ways al-Baghdadi and his fellow cultists worked was to trick the Western media, where many journalists know nothing serious about Islam or the Middle East.

Just calling his organization the “Islamic State” was one way of trolling everyone. Baghdadi’s beliefs and practices were so far outside the mainstream of normative Sunni and Shiite Islam that they might as well have come from Mars.

As I have noted before, it would be as though a guerrilla group in Mexico or Colombia declared that their name was The Vatican. And if the press fell for it, then every time the group massacred villagers, they’d be obliged to report that “Today the Vatican massacred 63 with machetes in the highlands.”

Since Western journalists actually know what the Vatican is and what Roman Catholicism is, they wouldn’t fall for this trick.

But some journalists in prominent outlets reported silly things like that ISIL is “very, very Islamic.” Yes, and the Ku Klux Klan is “very, very Christian.” People reply that ISIL erected a state over some 5 million people. Well, the Ku Klux Klan ran Indiana.

Nor was his ramshackle terrorist statelet a “caliphate.” People in the Muslim world, in urbane cities like Cairo and Beirut, laughed at his pretensions, calling Baghdadi the “Rolex Caliph” after he showed up at the pulpit sporting pricey bling.

Violent cults like ISIL grow out of social conditions. They tell you nothing about the character of the religion out of which they emerge. In East Africa, the Lord’s Resistance Army has terrorized Uganda and its neighbors, coming out of local interpretations of Christianity. In Japan, Aum Shinrikyo put sarin gas in the Tokyo subway in 1995, sickening thousands (they were trying to commit mass murder). They are a Buddhist offshoot and were trying to provoke the advent of the next Buddha. Buddhists are often appalled to hear this, and protest that Aum Shinrikyo is not Buddhism. Right. And ISIL is not Islam in exactly that sense.

On June 14, 2014, ISIL massacred 1700 Shiite cadets of the Iraqi military whom they had taken captive, in the most horrible ways. 

Or there was the Jordanian pilot whom they shot down, captured, and set afire in a cage.

A. J. Arberry translated the chapter of Muhammad in the Qur’an, 47:4, concerning prisoners of war this way: “tie fast the bonds; then set them free, either by grace or ransom, till the war lays down its loads. So it shall be; and if God had willed, He would have avenged Himself upon them; but that He may try some of you by means of others. And those who are slain in the way of God, He will not send their works astray.”

That is, once the enemy is taken prisoner, they should either be simply released, or they should be ransomed back to their colleagues as a sort of war indemnity, and this should be done while the war is still going on.

Later Muslim tales from 130 to 300 years and more after the Prophet Muhammad’s death tell all sorts of war stories about early Islam. But the Qur’an is our only primary source for very early normative Islam, and its attitude toward prisoners of war is quite clear. Wealthy Roman generals of that era (the 500s and 600s) were also known to pay a ransom for soldiers they had captured (see John Malalas).

Or there was ISIL’s genocide against the Izadi Kurds of the Sinjar region. They are non-Muslim but believe in God, following a religion influenced by ancient Iranian beliefs. 

The Qur’an 2:62 promises paradise to “whoever believes in God and the Last Day and does good deeds.” Izadi Kurds have been living in Northern Iraq for centuries. Like all minorities, they often had a hard row to hoe. But they weren’t ethnically cleansed by the Ottoman sultans, who also claimed to be caliphs or vicars of Muhammad.

I discuss the values of the Qur’an, the Muslim scripture, in my new book, Muhammad: Prophet of Peace amid the Clash of Empires.

The ISIL cult is not gone, and extremism continues to have a purchase in the Fertile Crescent. Trump refused to spend tens of millions of dollars appropriated to rehabilitate the Raqqa region of Syria after ISIL was defeated there. People in eastern Syria were brutalized and then suffered enormous damage as the US bombed their towns and villages to defeat ISIL, which had taken them over. They have nothing. Is it wise to leave them twisting in the wind?

New Jim Mattis Memoir Avoids Criticizing Trump, Instead Lashes Bush, Obama, and Others

 

Secretary of Defense Jim Mattis resigned from the Trump administration ten months ago. As a leadership scholar and historian, I was often asked what I thought of Mattis’s dramatic departure. I answered that it appeared to be a laudable resignation in protest, based on differing national security priorities, and that we would learn more after Mattis’s memoir was published. His book, Call Sign Chaos, was released in September.

 

Unfortunately, Mattis writes little about the Trump administration’s national security policies. He does include a copy of his resignation letter, which makes clear his disagreements with the president on the severity of the Russian and Chinese threats and on the imperative of America’s military alliances. But Mattis offers no insights on the challenges of working for Trump. Instead he writes, “I’m old fashioned. I don’t write about sitting presidents.” For this morally uncourageous decision, the former Marine Corps General has rightly taken flak. He has chosen to conceal valuable information from the American people who soon will be deciding whether to retain Trump for another four years. 

 

While Mattis refrains from directly criticizing Trump, the retired general is quick to censure many of his former civilian and military superiors. Army General Tommy Franks, who led military forces in the 2001 campaign in Afghanistan and the 2003 invasion of Iraq, takes heavy fire from Mattis, who at the time of 9/11 was commander of the 1st Marine Expeditionary Brigade. Mattis disparages Franks for often giving “vague guidance” and ambiguous orders, for being overly cautious and “trapped by outdated thinking,” and for utilizing “unsound reasoning.” In Afghanistan, this led to “the gravest error of the war,” allowing Osama bin Laden to escape into the Tora Bora Mountains. In a final lob, Mattis criticizes Franks for not deploying “enough troops” to stabilize Iraq after the end of major combat operations.

 

Mattis is equally critical of Bush appointee Paul Bremer who, as director of the Coalition Provisional Authority, supplanted Franks as “the most powerful man in Iraq.” According to Mattis, Bremer provided “no specific guidance” to the military and also made a disastrous unilateral decision to demobilize the Iraqi Army and to ban most members of the Baath Party from the new government.

 

Mattis does not spare the rod when discussing George W. Bush and Iraq, including the president’s continued support of Bremer. Mattis even admits to being “stunned” and “disoriented” by Bush’s initial decision to invade Iraq. The war was unwise, he thought, because the United States had already adequately “boxed in” Saddam Hussein. Further, Hussein actually worked “to our strategic advantage” in the region by counter-balancing Iran. Mattis believed that Bush’s goal of establishing a truly independent Iraqi democratic government was “idealistic and tragically misplaced.”

 

Mattis is especially critical of his civilian superiors when, despite his advice to the contrary, they insisted on a “reckless” and bloody conventional assault on Fallujah which led to “exploding” violence across the country. Then, just as Mattis was approaching victory in Fallujah, the leadership gave him a “harebrained” order to halt the offensive, allowing the enemy to consolidate its position. “The impact of such incoherence at the theater and national command levels,” Mattis writes, “cannot be overstated.” America’s civilian leaders “were spinning in a circle, without a strategic compass.”

       

By 2010, Mattis had become a four-star general, and President Barack Obama made him commander of CENTCOM, overseeing all U.S. operations in the Middle East and Central Asia, including the ongoing wars in Iraq and Afghanistan. Mattis soon determined that Obama, like Bush, had not adequately defined “the policy end states and the strategies that connect our military activities to those end states.”

 

Nevertheless, after seven years of war, Iraq had finally obtained a fragile stability, raising the question of whether the United States should maintain a residual force there. Mattis lobbied for a contingent of 18,000 troops to sustain the hard-fought gains. But wanting to end the war, Obama rejected that counsel and withdrew all U.S. forces at the close of 2011. Shortly thereafter, as Mattis feared, the Iraqi civil war resurfaced and later ISIS zealots swept across western Iraq and into Syria. The withdrawal of U.S. forces, Mattis writes, was a “catastrophic” decision by a president “ignoring reality.”

 

Mattis offers equally harsh judgements of Obama’s policies in Afghanistan and Syria. While supportive of Obama’s decision to deploy more troops to Afghanistan, he was appalled by the president’s public announcement that he would recall U.S. forces eighteen months later. “Unless you want to lose,” Mattis writes, “you don’t tell an enemy when you are done fighting, and you don’t set an exit unrelated to the situation on the ground.” 

 

Similarly, Mattis supported Obama’s firm warning to Syrian leader Bashar al-Assad not to use chemical weapons against his own people. But when Assad did, Obama did not retaliate militarily and this “sophomoric” decision left Mattis “deeply disturbed.” 

 

Mattis’s stark criticisms of his former bosses are often justified, and to his credit he readily acknowledges his own failures as a commander. Call Sign Chaos is in essence a leadership primer, one that effectively underscores how important it is for senior subordinates to “lead from below” by offering independent judgments to their superiors. On multiple occasions Mattis admits his own failure to persuade his military and civilian bosses on the best courses of action. Whether it was Afghanistan, Iraq, or Syria, his “efforts to influence American policy decisions had fallen short.”

 

If Obama, Bush, and Trump were not up to Mattis’s leadership standards, who then is his exemplar? George H.W. Bush. He was a president who “backed up his words” after telling Saddam Hussein that his 1990 occupation of Kuwait “will not stand.” Bush was a leader who effectively rallied domestic and international support for war, who provided his generals with the forces they needed, and who avoided “overreach” after having achieved his objective of liberating Kuwait.     

 

Jim Mattis has a reputation as a blunt talker, and Call Sign Chaos provides ample evidence of his frank and critical judgments. As a military officer and defense secretary, he swore his allegiance to the American people, not to any single chief executive. Now is not the time to demonstrate “old fashioned” loyalty to a sitting president. With consequential elections forthcoming, now is the time for Mattis to tell voters the unvarnished truth about Donald Trump.

These American Political Bachelors Were Known as ‘Siamese Twins’ During the Antebellum Era

 

 

Thomas J. Balcerski is an Assistant Professor of History at Eastern Connecticut State University and the author of Bosom Friends: The Intimate World of James Buchanan and William Rufus King, an original dual biography. He received his PhD in history from Cornell University in 2014. 

 

 

Why did neither James Buchanan nor William Rufus King ever marry? They were lifelong bachelors. And how unusual were they for not getting married?

 

Bachelorhood was not uncommon during the 19th century, particularly the early part of that period. In fact, during the 19th century, the prevalence of bachelorhood grew, such that, in a place like New York City, it had reached an all-time high of perhaps 12% of the male population. 

 

Bachelorhood as a phenomenon was very much part of the culture of the period. When you look at the United States Congress during this period, particularly the US Senate, I ran an analysis between the years 1790 and 1860 to estimate just how many men were unmarried, and I determined the number to be about 7%, with perhaps as many as 4% being lifelong bachelors. On the whole, bachelors were a known part of politics. The question becomes partly a biographical one: Why did each man marry or not? And I say that because of that percentage of bachelors, many would go on to marry after their service in the Congress had ended. James Buchanan and William Rufus King, in contrast, never married and this does set them apart from others with whom they served and lived in a boardinghouse, and even politically for the period, they were the most prominent bachelor politicians of the 19th century.

 

Another word about why each did not marry: I do think both men were very ambitious. For bachelors, the decision to marry or not was very much based on how it might affect one's future course, whether it be in one's own career or personal life. For political bachelors, there was an additional calculation of whether or not marriage would impede or enhance one's political standing, and in those cases, Buchanan and King calculated that they could achieve political success without marriage. That is not to say that either man did not engage in courtship, or at least purport to engage in courtship, but it is to say that at a certain point in their political careers, they had learned to identify as bachelors and to take the good and the bad that came with it.

 

How did the historian Elizabeth Ellet come to remember the two men as ‘the Siamese twins’ and how have they been remembered since?

 

Elizabeth Ellet was a writer, a historian, and a contemporary of many of the politicians of the 19th century. She personally knew several of the former first ladies of the period, and was at least on corresponding terms with them. One whom she corresponded with: Julia Gardiner Tyler, who was the second wife of President John Tyler. Note: John Tyler was widowed while in office as President and remarried to a much younger Julia Gardiner. In the 1860s, when Elizabeth Ellet was preparing a history of what she called The Court Circles of the Republic, she reached out to Mrs. Julia Tyler and asked for her reminiscences about her two years as the First Lady and wife to President John Tyler. It's from that correspondence that I find the phrase "the Siamese twins" enters into historical memory, by way of a First Lady looking back at the boardinghouse culture of the period. That's why I began the book, with Ellet's recollection as a way to suggest both how memories are made and how historians have played a part in characterizing the relationship of Buchanan and King.

 

I did want to add that in addition to Julia Tyler, Elizabeth Ellet wrote to Harriet Lane Johnston, who was the niece of President James Buchanan and who served as his First Lady. Additionally, she wrote to Cornelia Van Ness Roosevelt who, as I discuss in the book, plays a critical role in bringing together King and Buchanan at a moment when they had separated upon King’s departure to France in 1844. Mrs. Ellet certainly plays a role in my story and I think it's a fitting one, to begin with a contemporary historian looking back at the period before the U.S. Civil War.

 

Your book argues that Buchanan and King’s relationship conformed to an observable pattern of intimate male friendships prevalent in the first half of the nineteenth century, meaning they were not America’s first gay Vice President and President. How did you arrive at this argument?

 

That's a great question in that it connects the argument I'm making about friendship with questions of sexuality, and particularly, modern understandings of it. I think, though, that in your question there is something of a conflation that I don't quite want to make. And that is to say that intimate male friendship is a construct that is observable in the 19th century. The word ‘intimacy’ is the key word here. I'll begin with that.

 

Intimacy is a word of the period. I find it in the correspondence of politicians, particularly people serving in the Congress. What it connoted was a closeness, not so much in a physical sense as in a political, and to a lesser extent, personal sense. We get this phrase — personal and political friendship.

 

For James Buchanan, he distinguishes these two things. When they combine, however, that is when intimacy seems to enter into the equation for him. In parsing his correspondence, I tried to see about whom he spoke of as an intimate friend—and there were several besides King. King, of course, is in that category and so it's not to say that there wasn't a more deeply intimate friendship compared to some of the others, but it does fit into a pattern, not just of Buchanan, but as I say, across the 19th century. The notion of sexuality and the notion that they are or are not gay does take us down a different path, and one that I think is what may draw readers to the book, and for that, I'm grateful. 

 

At the same time, in the introduction of the book, I talk about my assessment. The evidence does not permit a definitive rendering of their relationship as anything other than platonic, but it also allows me to suggest that we move beyond the question of modern terms related to sexual identity and orientation. We can and we should, as historians and biographers do, since interested readers want to know as much as we can about the outlooks and attitudes of these two men. 

 

For that, in my own mind, I've come to see Buchanan as someone actually quite different from King. I see them as almost on opposite ends of that spectrum, with Buchanan more clearly conforming to romantic courtship with women, and King not as much. In their relationship, then, I see much more of a one-way desire, attraction, and longing, that of William Rufus King for James Buchanan, than I see in return. For that reason alone, they should not be figured as a gay couple, and even if one wants to use the orientations of today to understand either or both men, I don't think it necessarily answers the question of their relationship as much as it perhaps can be a satisfying exercise in understanding historical sexuality.

 

Can you talk about the period of their active friendship, split into two discrete phases, and spanning more than 18 years, as the most successful example of a domestic political partnership in American history?

 

Yes, and to begin with that notion, 'domestic', I think it's important to remind readers how boardinghouses and boardinghouse culture worked in the 19th century. Before the period in which Congress met more or less continually, with short recesses, a Congress was typically in session for only three stretches of its two-year term, so that members would spend less than a year of the term in Washington. The seasonal nature of Congress, therefore, required a different kind of residential pattern than what we see today in Washington, D.C. Members instead shared temporary establishments or boardinghouses or messes. They lived there, they often took meals there, and in the process became, well, domestically intimate with one another.

 

We see that across multiple boardinghouse patterns, not just Buchanan and King. Buchanan and King, though, were somewhat different than the typical boardinghouse group. For one thing, they shared the same Democratic party affiliation, but for another, they came from different sections of the country. In my study of the boardinghouses of the 19th century, it's a rare thing to find men from different parts of the country living together for so long a period.

 

They might come in and out for one or two sessions of the Congress. When you look at Buchanan and King, and you realize that they lived together for 10 years in Washington, D.C.—something else then was at work than mere convenience. That's when we come to the point that they self-consciously lived with one another because they were bachelors. My findings suggest that they thought of themselves, to use their words, as a "bachelor's mess," and they brought in other bachelors with whom to live. During that 10-year period, it wasn't just Buchanan and King, it was actually a rotating cast of characters, most of whom were unmarried. 

 

The second period of their friendship—you are right to call it two discrete phases—is after they are no longer living together, and in some ways, the second period is more poignant but more important. Poignant in that they lose the intimacy of their friendship. They are now separated and no longer living together. Important and significant in that they both begin a rise to national power which ends with their election to high office. There is a lesson here, too, that for Buchanan and King, each man did need to leave the other in order to eventually achieve success on his own. They tried and failed to be on the same ticket for President and Vice President. Only separately, then, were they able to obtain that office.

 

As you wrote, Buchanan and King were different: they had differing socioeconomic statuses and were of opposite political parties. But they also had some personal factors in common.

 

To start with, it is unusual that two men from different parts of the country and who began with different political views would ultimately run on the same ticket. To understand why, we must first remember that Buchanan and King were born during what's called the First Party System, which consisted of, on the one hand, the Federalist party, and on the other, the Democratic-Republican party. Buchanan was a Federalist. King was a Democratic-Republican.

 

Buchanan, therefore, had certain views about banking, the tariff, and the war that were the opposite of King's. When Buchanan was first elected to the US House of Representatives, he was a Federalist. King, a Democratic-Republican. Politically, they came from different backgrounds; personally, too, they came from different socioeconomic standings and cultural backgrounds. Buchanan was born fairly poor in a log cabin. His family was involved as merchants and traders along a route in what is today Mercersburg, Pennsylvania. He was one of many children, the first of them to go to college, and did not have much in the way of material wealth. He made it on his own as a lawyer in the city of Lancaster, which was the capital of Pennsylvania before it moved to Harrisburg.

 

William Rufus King was born into a large slaveholding family that was prosperous and that grew agricultural products like wheat, corn, and cowpeas in North Carolina. He inherited land and slaves from his father when he came of age, and he too went to college and studied law but had an easier go of it and got into politics at an early age as a result. He was a Democratic-Republican, so he supported the War of 1812. He stood against the banking system and against tariffs, which generally protected manufacturing concerns in the north at the expense of agricultural concerns in the south.

 

The great change in politics during this period can be summed up in one person: Andrew Jackson. Buchanan realized the force of nature that was Jackson and shifted his political allegiance towards him. King naturally merged into that direction as well. Buchanan probably, of the two, moved further in his political principles than King, but they both fell into line as Jacksonians. By the time, therefore, each comes to the US Senate—Buchanan comes later than King—the Jacksonian orthodoxy was fairly well-established. They more-or-less would follow it for the rest of their careers. 

 

One man was from Pennsylvania, part of the North, which was moving away from slavery. There were still enslaved people in Pennsylvania after Buchanan was born; emancipation laws would go into effect gradually. King will hold to the slaveholding system for his whole life, and in fact, it becomes the key issue on which Buchanan must bend for a political alliance with William Rufus King to work. In time, through his interactions with Southerners such as King, Buchanan will come to see the political value of protecting the slave system.

 

You wrote that at “the heart of the friendship of Buchanan and King lay a solemn pact, developed in the boardinghouses of Jacksonian Washington, that the Union must not split over the question of slavery.” How did the two men cultivate this pact?

 

I still stand by that statement and yet the evidence for it is more circumstantial than direct. Let me try to give some examples of how I came to that conclusion. The first gets back to the very nature of a boardinghouse friendship. Buchanan, it turns out, was the only Northerner in the boardinghouse called the bachelor’s mess. All the other men who came into it—unmarried men—were Southerners. Indirectly, you see that Buchanan is the one who has to adjust himself culturally to his messmates' belief system. That's one piece of evidence. The second piece comes from Buchanan's votes and his speeches in the US Senate during this period.

 

We find evidence of him, whenever possible, supporting repression or gagging discussion about the issue of slavery, and Buchanan is famously the author of the United States Senate gag rule forbidding any petition that would call for the end of slavery in the south. He put his money where his mouth is, you might say, politically. The third way we see it is just in how as President, and even before that in retirement, he lives the life of a cultivated southern gentleman at his estate in Lancaster called Wheatland. Then, as President, we see it in his policies through his support of the Dred Scott case, his support of the Lecompton Constitution which permitted slavery in the Kansas territory, and arguably by his less than hardline stance in the face of Southern secession in the winter of 1860 and 1861. 

 

Finally, too, Buchanan preferred the company of Southerners while President, and this is what's fascinating for me: One of his favorite people to have in his White House was none other than King's niece, Catherine Ellis. We see evidence of her at multiple points of Buchanan's term as President, including in a portrait that was painted to commemorate President Buchanan's visit with the Prince of Wales, Albert Edward, to Mount Vernon to see the tomb of George Washington. It's been little noticed but Harriet Lane, his niece—and First Lady—stands directly next to Catherine, King's niece.

 

The term ‘bosom friendship’ meant a particular kind of domestic intimacy common in the eighteenth and nineteenth centuries. Are there any other early American phrases that have caught your attention during the making of this book?

 

You mentioned ‘Siamese twins’ earlier, and initially, I had thought that that would make a more apt title for my book, in that Elizabeth Ellet’s evocation of Buchanan and King as Siamese twins seems so nicely to capture the political meaning of their friendship. Why I chose ‘bosom friends’, and why I think it stands out among the various phrases and euphemisms used in the 19th century, is that its connotation was more positive. It allowed for something of that personal nature to be understood in their friendship. It was, of course, political, but people understood that they had a personal connection as well.

 

We find the phrase being used about a number of political duos of the period, including some of Buchanan and King's greatest enemies and critics. Towards that end, I would add to that list of phrases, such as Siamese twins and bosom friends, a few others that had entirely to do with political gossip from the period; words that were used to describe Buchanan and King at various stages. I'm thinking here of the phrase 'Aunt Nancy' or 'Aunt Fancy' or 'Miss Nancy'. All three of them show up directly in correspondence and newspapers of the period, talking about either Buchanan or King.

 

Deciphering gossip actually became a big part of my project, trying to think about the kinds of insults that were used. When we think about our own political times and how gossip and insult work, it's not a foreign concept to us. What's perhaps more interesting here is that this series of 'aunt' insults has a gendered implication that tends to belittle and feminize the two men. For that reason, historians of the period have been attracted to them. They've often pointed to Andrew Jackson using the phrase 'Aunt Nancy' to describe Buchanan and King as somehow definitive proof of a sexual relationship. I always point out in return that it’s just one piece of what historians call ‘a grammar of political combat’, and that Buchanan and King, for their part, also used such gossip and language to talk about their political opponents.

 

It is a world that requires us to dial back a bit our modern assumptions about what words mean, and instead try to think about what nineteenth-century people meant by them.

 

Many factors contributed to the lasting power of their relationship, but as you describe “its ultimate longevity may be attributed to another factor altogether: their nieces, Harriet Lane Johnston and Catherine Margaret Ellis.”

 

It's important to remember what happens after a great man dies, and that a President of the United States in the 19th century was not given any special treatment or government attention as he would be today. Like it or not, at some point, the papers and letters and artifacts from the presidency of Donald Trump will end up under National Archives and Records Administration control, as has been the case since the time of Herbert Hoover. Before President Hoover, no President was given that federal support and apparatus. We find that with someone like Buchanan, his personal papers then became the obligation of his familial descendants.

 

Harriet Lane Johnston was an incredible lady and a dogged advocate for her uncle during the remainder of her life, and she will help to establish the collection that ends up at the Historical Society of Pennsylvania, as well as at the Library of Congress, both of which today are very important to understanding the life of James Buchanan. By contrast, Vice Presidents receive even less attention and support, and there are some US Vice Presidents about whom we know almost nothing because their papers do not survive. 

 

William Rufus King is in that category to a degree. His letters to other people show up in personal papers throughout the country, but the letters that he received are fewer than you might expect. There are possible explanations as to why Buchanan's correspondence is so voluminous, and why King's is not, but at baseline, it has to do with the circumstances of their families' lives after their respective deaths. Lancaster, Pennsylvania avoided the scourge of the Civil War. Selma, Alabama, was the site of one of the war's last major battles, in 1865. We know from the history of the battle of Selma that King's plantation was raided and valuable articles were destroyed, and that it is quite likely that personal papers were destroyed in the process.

 

However, there are other explanations for the asymmetry in their papers, and I talk about that in the book. What I want to stress here is that these two nieces were incredibly proud of having been at their uncles’ sides during their lifetimes and were even more doggedly devoted to preserving their legacies.

 

There is a myth—a persistent myth—that the two nieces, by prearrangement, destroyed their uncles' correspondence. My book helps to dispel that myth. If anything, it's because of these two nieces that we have surviving correspondence of their uncles. More than that, the correspondence between them reveals a different kind of intimate friendship: that between two women who shared the rare experience of being at the scene of high political power in the 19th century, something women were seldom exposed to during that period. As such, they maintained a lifelong friendship as well.

 

From a historian's research standpoint, when you approach archives, you encounter many letters. Are you hoping to gather as many primary sources as possible to draw from, or are you sizing up the collection? How do you size up and scope a collection in order to execute a book, dissertation, or project?

 

I took the approach that anything written by William Rufus King was valuable, and because we unfortunately do not have all that much, it proved to be the case. James Buchanan, on the other hand, is a different story. I spent much of the last several years reading as many of his letters as possible, and even so, I have to admit, I didn't get through them all. We are fortunate that archivists and librarians have prepared detailed finding aids which allow us to understand the ebb and flow of the collection, but more importantly, for Buchanan, we are fortunate (and I was extremely so) to be able to work from the historiography: past histories and biographies. I'd like to point out that in Buchanan's case, Philip Klein wrote the definitive biography of James Buchanan, published in 1963.

 

I had a chance also (and this is something I might give as a piece of advice) to look at historian Philip Klein's own personal papers, which upon his death in the 1990s were donated to the Pennsylvania State University library, where he had worked during his career. Reading Klein's notes and papers was an incredible experience. It made me realize that if Klein covered it, he did so accurately, and that part of what I hoped to do was build upon Klein: to see areas he might have missed, and to find new sources that have come to light since the 1960s when his book was published. I also wanted to use the interpretive framework that I was bringing from the 21st century, which he didn't have in the mid-twentieth century.

 

To your question, yes, it is sometimes valuable to be selective when looking in an archive, but as a biographer—and in my case a dual biographer—everything is on the table. We need to be open to as much as possible, following sources where they lead us.

From Flappers to Generation Z: The Cycles of Generational Hostility

 

On Fox News this past August, Sean Hannity bemoaned the apparent decline in patriotism among millennials: “That’s what young people think?”, as if all young people share a singular ideology. This clip demonstrates the all-too-familiar conflict between the Baby Boomer generation (Hannity was born in 1961) and millennials. This animosity, though pervasive in today’s media, is merely the current iteration of a pattern of thought and behavior that recurred throughout the 20th century.

 

Generational conflict, often described as a generation gap, arises when two different demographics collide because one (the younger) has established a value system fundamentally different from that of the other (the older). This phenomenon typically coincides with the emergence of an adolescent group whose ideologies and behaviors shock their parents.

 

This friction first reared its head during the 1920s with the rise of the flappers. A Wall Street Journal article from 2007 notes that the mass-production of cars revolutionized the adolescent experience by giving teens and young adults the independence and the agency to behave as they wished. This behavior often included drinking (during Prohibition) and what was then considered radical sexual activity. Parents were beginning to feel a loss of control over their seemingly wild children. In 1922, a mother was quoted in the New York Times saying, “Middle-aged people, who remember the old-fashioned girl, must interfere in the young people’s affairs and help to restore them to wholesome standards”. This idea of “the old-fashioned girl” exemplifies the mindset of adults who criticize their successors. Young people's actions are considered a rejection of traditional values and a deterioration of the American moral compass which the older generation had helped to build and now must work to rebuild. Even Charlotte Perkins Gilman, a famed first-wave feminist, criticized the flapper generation. In 1995, educational historian Margaret Smith Crocco noted that “Gilman came to disparage the vices of the new feminists of the twenties who mimicked men's behavior”. Born in 1860, Gilman was in her sixties during the Roaring Twenties and was thus firmly a part of the older, more cynical generation.

 

The next notable generation to draw a great deal of disapproval from its predecessors was the youth of the ’50s and ’60s. These two decades saw the proliferation of the beatniks and the hippies. In an interview with Margaret Mead, poet and beatnik legend Allen Ginsberg defined the word beatnik as “a word of insult usually applied to people interested in the arts”. Ginsberg said that this insult was propagated by the media to portray his generation as “a vulgarity”, and he refused to subscribe to that definition. Margaret Mead mentioned that “we do love to name generations, and we’ve been naming them for quite a long time now”.

 

This bitterness toward the beats would soon encompass the hippies of the late ’60s and early ’70s. In a clip from the History Channel segment Ask Steve, resident historian Steve Gillon explains that the Greatest Generation “had an ethic of self-denial”, while their children, the Boomers, “had a sense of self-fulfilment”. These two approaches to life were incompatible, and thus sparked strife between the two groups. For example, in 1967 a CBS News special aired called “The Hippie Temptation”. Journalist Harry Reasoner said that hippies “tend to approach work as the rest of us do sport”. Throughout this story Reasoner maintains a very condescending tone: he describes hippie ideology as “style without content”, he says that hippies “make you uncomfortable”, and he calls their actions “grotesqueries”. Reasoner was born in 1923, which places him on the cusp of the Greatest Generation and the Silent Generation. His sentiments echo those of his peers, who saw hippies as lazy delinquents, a claim which in 2019 is extremely ironic, given that Boomers cite the same evidence to describe millennials.

 

Older generations have always disapproved of the actions of their children, and will most likely continue to do so. However, these criticisms are often based on sweeping generalizations about an entire generation. In the aforementioned Ask Steve clip, Steve Gillon also makes the crucial point that Boomers were a more nuanced group than conventional wisdom would have us believe. In an article from The New Yorker, historian and critic Louis Menand urges us to detach the Baby Boomers from the social upheaval of the 1960’s, as they were too young to be the ones instigating that cultural revolution. He points out that most prominent figures associated with Civil Rights, Students for a Democratic Society, the antiwar movement, the women’s liberation movement, and even notable writers, artists and musicians were all born before 1946 (including Allen Ginsberg, who was born in 1926). Echoing Gillon, Menand writes “the fraction of any generation that engages in radical or countercultural behavior is always very small”.

 

When generations are discussed in such broad terms, whether to hurl criticisms or to attribute accomplishments, the variety and complexity of individual experiences are erased. This is especially true as our population becomes increasingly racially and economically diverse. The year in which one was born does not determine one’s identity. This is true whether someone was born in 1910 or 1990. Despite that fact, every generation from the Greatest Generation to millennials has been criticized, watered down, and blamed for perceived societal problems. We can already see the roots of this cycle take hold in the discourse around the newest generation – Gen-Z. A search of the words “Generation Z” on the New York Times' website yields over 10,000 results.

 

Karl Mannheim, a foundational thinker of generational theory, defined generations as ‘cohorts’ of people who coalesce around a shared experience in their childhood. When I try to apply that methodology to my own life, I find myself at a crossroads. I was born in 1999. I don’t remember 9/11. Or dial-up internet. Or a time before the internet at all. But I do remember a time before iPhones. And Obama’s first inauguration. And the 2008 Recession. Am I a millennial? Or am I Gen-Z? The label I choose to adopt will determine which case of intergenerational conflict I am associated with, but it will not change my experiences. If this pattern of each generation criticizing the next persists, the best way to deal with its inherent flaws is self-awareness. Allen Ginsberg was self-aware when he realized that newspapers and magazines were exploiting the beats to sell more copies. Millennials and Gen-Z should embrace that same spirit and ignore whatever narrative is placed upon them, because they know how stratified and idiosyncratic their lives truly are.

Understanding America's History Of Gun Control

 

Receiving a breaking news alert about a mass shooting in the United States is no longer shocking, but anxiously anticipated. As gun violence and mass shootings grow ever more frequent, asking the question “Did you hear about that shooting?” is now often met with the question “which one?” Americans have become shockingly desensitized to the constant violence caused by guns, and the significance of a mass shooting is appraised by the number of casualties inflicted, rather than the frequency at which these shootings occur.

 

While all Americans mourn the lives lost, politicians remain divided about the best solution. Those who support gun control attribute the recent increase in mass shootings to the lack of federal regulations on the sale of guns. By contrast, those who advocate for gun rights argue that they have the right to be armed under the Second Amendment and that the removal of weapons from American citizens would be unconstitutional. However, gun control in America has not always been a polarized, uphill battle. Historically, support for gun control has been largely influenced by how gun violence and gun ownership have affected and subsequently shaped our political and social spheres. Understanding the turbulent history of gun control in the United States can explain why Americans cannot agree on gun control legislation when it is most needed.

 

The origins of federal gun control date to 1934, when the violence associated with crime boss Al Capone prompted Congress to pass the National Firearms Act, which taxed and required the registration of certain classes of firearms, such as machine guns, as a way to regulate who owned the deadliest weapons in the country. About 15 years later, the first lone-gunman mass shooting occurred in Camden, New Jersey: Howard Unruh killed 13 people in his neighborhood in 1949, bringing large-scale awareness to gun violence in the United States. But it was the assassinations of President John F. Kennedy in 1963 and Rev. Martin Luther King Jr. in 1968 that sparked unprecedented public support for gun control at that time. These events catalyzed the Gun Control Act of 1968, which became law on October 22, 1968. The federal law prohibited mail-order gun sales, banned convicted felons, drug users, and those found “mentally incompetent” from owning a gun, and raised the minimum age for purchasing a handgun from a licensed dealer to 21. Upon signing the Gun Control Act of 1968, President Lyndon B. Johnson stated: “We have been through a great deal of anguish these last few months and these last few years - too much anguish to forget so quickly… We have made much progress--but not nearly enough.” 

 

However, President Johnson’s call to add more regulations on gun sales in America did not come to pass. The rising crime rates in the 1960s generated widespread concern about violence in the United States. Many believe this fear of crime was compounded by racialized fears of black people with guns. For many, this intensified the perceived need to obtain a firearm for personal protection. As concerns for personal safety escalated, the National Rifle Association utilized the Second Amendment and its large political influence to lobby against previously established gun control policy.

 

Surprisingly, the NRA was not always opposed to gun control legislation; in fact, it supported early gun control measures, including the Gun Control Act of 1968. But as more and more individual Americans began buying firearms, the NRA lobbied to equate gun ownership with American freedom, interpreting the Second Amendment as guaranteeing every citizen an individual right to bear arms. The NRA tried to convince the public that owning a gun was more than just a way to ensure personal safety: it was patriotic and a constitutional right. This shift in the appeal of gun ownership, which stemmed from the NRA’s social and political influence, led to the 1986 Firearm Owners’ Protection Act. Ultimately, this law rescinded much of the regulation established in the Gun Control Act of 1968 and prohibited the federal government from maintaining a national registry of gun owners. This dramatic transformation in gun control legislation, combined with the growing patriotic sentiment surrounding gun ownership, coalesced into the polarized gun control issue the United States has come to recognize.

 

Since the 1986 Firearm Owners’ Protection Act, the NRA has fought hard to keep gun control to a bare minimum and has continued to promote this patriotic culture within communities that advocate for gun rights. It is important to note that in recent mass shootings, the majority of the weapons used were purchased legally, despite several gunmen having documented mental health issues or criminal histories. Although 89 percent of Americans support expanding federal background checks and all 19 Democratic candidates running for the 2020 presidency support an assault weapons ban, Congress is seemingly unable, or unwilling, to pass gun control legislation.

 

Unfortunately, gun control in the United States is a topic that is both painfully familiar and extremely taboo. It has permeated our everyday lives, yet its history and complexity go undiscussed at the dinner table. But we need sufficient gun control legislation now. What is strikingly clear is that our current gun control legislation is simply inadequate at protecting Americans: it allows those who want to hurt or kill large numbers of people as quickly as possible to obtain the means to do so legally and with ease. Our history demonstrates that gun control legislation is not impossible to achieve, but it also warns that how we frame gun ownership shapes political outcomes. When we glorify guns, it is harder to pass new gun control legislation. We must learn from these historical trends and keep them in mind as we take steps toward implementing effective gun control today.

https://historynewsnetwork.org/article/173342
Climate Change Should Make Us Reckon with Our Long History of Wind Energy

 

Greta Thunberg stood in front of over 100 global leaders at the U.N. climate summit on September 23rd and accused them of sacrificing our planet’s future. Her palpable rage resonated with world leaders and concerned citizens alike. French President Emmanuel Macron responded in support less than an hour later and German chancellor Angela Merkel met with Thunberg that same day. The video of her speech was widely shared on social media. She lamented that “for more than 30 years, the science has been crystal clear. How dare you continue to look away and come here and say you’re doing enough when the politics and solutions needed are still nowhere in sight?”

 

Climate change is often framed as an impending, ever-growing wave that never crashes. It’s formless; it will happen sometime in this century and we’re supposed to do something about it, but not many people really know what. There’s a hodge-podge of solutions thrown around at Democratic debates and global summits – a carbon tax, sea-level rise evacuation, sustainable agriculture, rejoining the Paris Climate Accord – but like the much-talked-about Green New Deal, plans are not so clear-cut that people can easily rally around them.

 

However, there is one form of green energy that humans have harnessed to power society since early recorded history: wind. As we debate what to do about climate change, we should recognize that green solutions like wind turbines are not new ideas and are a feasible energy source.

 

The science may have been crystal clear more than 30 years ago, but the use of wind as an easy, free source of energy started spinning long before that. Wind energy was first put to work in the cradle of civilization, in ancient Egypt: as early as 5000 B.C., sails caught the wind to propel boats up and down the Nile River. Wind power was soon put to mechanical use as well. Wind wheels in modern-day Iran were grinding grain and pumping water by the 9th century. Wind power eventually spread to Europe, where the Netherlands adapted windmills to drain lakes and marshes along the Rhine River Delta. Early manufacturers of electricity-generating wind turbines realized, however, that there was no economically viable method of energy storage. Until some way to capture the energy was found, wind-generated electricity was simply not practical.

 

Until the late 19th century, windmills were used primarily for mechanical work such as grinding grain and draining wetlands. Beginning in 1887, though, prototypical wind turbines were developed in Scotland by Professor James Blyth of Anderson’s College in Glasgow, and in Cleveland, Ohio, by Charles F. Brush. These were the first wind turbines used to produce electricity. Brush also addressed the energy-storage problem: his 12-kW turbine charged 408 batteries in the cellar of his mansion. At the 1893 World’s Columbian Exposition in Chicago, manufacturers touted their new designs to the rest of the world.

 

In the early 20th century, wind became a much more popular energy source. In the 1920s and 30s, the Midwest became the wind energy hub of the United States: farms were dotted with wind turbines, which became a major source of power for the region. The Soviet Union initiated utility-scale wind turbine development in 1931. In 1941, the first megawatt-sized turbine was connected to the grid and put into use in Vermont. The turbine, manufactured by the S. Morgan Smith Company, remained unmatched: no turbine of comparable size would come online for another 40 years.

 

Wind energy became more feasible as designs became more cost-effective and productive. To improve their designs, manufacturers looked for inspiration in the fast-growing aviation industry: in the US, increasingly large wind generators drew heavily on airplane propellers and monoplane wings. Development of turbines in Europe increased dramatically between 1935 and 1970, led by Denmark with contributions from France, Germany, and the UK. These efforts showed that large-scale production and implementation of wind turbines was possible. In the second half of the 20th century, the demand for turbines began to materialize.

 

Wind energy first developed as an industry in the late 20th century, in Denmark and the United States. Since then, wind turbines have taken very different paths in the two countries, but they do have one thing in common: whenever resources and fossil fuels have been in short supply, as they were during both World Wars and the 1970s oil crisis, demand for wind energy has increased and the industry has grown.

 

The 1970s marked a major turning point for wind energy. The United States and the rest of the globe were experiencing the effects of oil shortages, which sparked interest in alternative energy sources, specifically wind. The US government supported and funded the research and production of large wind turbines. By the end of the 1980s, thousands of wind turbines had been installed in California, supported by federal and state laws that encouraged the use of renewable sources of energy.

 

The 1970s showed that relative cost matters: wind can take over when oil is expensive. Greta Thunberg, in her speech to the U.N., spoke to this economic side of climate change. She said, “We are in the middle of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth.” That has been the case throughout wind power’s history: countries have consistently opted for the cheaper source of energy, and usually that has meant fossil fuels.

 

Fossil fuels might not be the cheaper option much longer. Not only will fossil fuels eventually be depleted; our reliance on them should come to an end before that even happens. Nonrenewable resources are cheap now because of new fracking methods, but is it also because we don’t see a feasible alternative? As we transition away from fossil fuels, wind energy should be recognized as exactly that alternative. The claim is not an exaggeration. In his book Drawdown: The Most Comprehensive Plan Ever Proposed to Reverse Global Warming, Paul Hawken writes, “In the United States, the wind energy potential of just three states – Kansas, North Dakota, and Texas – would be sufficient to meet electricity demand from coast to coast.” That statistic is staggering, and it is a sign of hope that the developments of the last 50 years have made wind energy ever more appealing.

 

History has shown that when we cannot get our energy from fossil fuels, we turn to wind. Its roots go all the way back to early civilization, and as an energy source it has enormous potential. Currently, fossil fuel reserves supply all the energy we need, but in 30 years that won’t matter. Because we are already seeing the effects of climate change, wind energy will become vital in the next 20 to 30 years. We should embrace it.

https://historynewsnetwork.org/article/173399
The Civil War's Unforgiving Final Year and How It Changed the War’s Legacy

 

A few years ago, I wrote a biography of Stonewall Jackson called Rebel Yell, which, in addition to tracking his life, chronicled the first two years of the American Civil War. Jackson fought in the war’s earliest battles, and later in some of its biggest: Second Manassas, Antietam, Fredericksburg, Chancellorsville. These were violent, bloody affairs, made all the more horrifying because neither the North nor the South thought the war would be so long and destructive.

 

But as horrific as the war turned out to be, what happened in the first few years seems almost innocent compared to what happened in the last year. I don’t mean to diminish the human sacrifice, but the opening of the conflict was, as some contemporaneous historians observed, a “bandbox” war compared to what came later. Men and boys marched off to war with light in their eyes and hope in their hearts and bands playing in their town squares. Even when they died horribly of wounds or sickness, the idea of glory persisted. Hope and optimism still somehow trumped hatred and despondency.

 

As I researched my new book about the war’s final year, Hymns of the Republic, I was absolutely struck by how hard, cruel, and bitter the war had become, and, just as important, how hard, bitter, and cruel its participants had become.

 

What happened in the war’s final, brutal phase is what I call the Lee Paradox.

 

Though Robert E. Lee was not the sole reason the North could not beat the South—the war was a big place, with many theaters and many players—he was inarguably the main reason. For two years he had tied the Union in knots, politically as well as militarily. He had made fools of its generals. At the beginning of the war’s last year, Ulysses S. Grant had taken charge of the Union armies and particularly of the Army of the Potomac, whose mission was to destroy Lee. To the dismay of people in the North, Grant utterly failed to do that. Sixty-five thousand Union casualties in 2 months in Virginia testified to that. Lee would not be beaten. And so the war dragged on.

 

But Lee’s—and the South’s—ability to survive came at a ghastly price. The more Lee won—or at least did not lose—the more the South itself was destroyed. Survival meant destruction. This is the Lee Paradox. The collapsing Confederacy was steadily taking down everything and everyone with it. Two-thirds of all Southern wealth had vanished, along with 40 percent of its livestock, half of its farm machinery, and 25 percent of all white men between the ages of 25 and 40.

 

The worst part of Lee’s success was the hard war against civilians that it unleashed in the form of devastatingly destructive marches by Union generals William Tecumseh Sherman and Philip Sheridan. Their targets were not armies. After the fall of Atlanta in September 1864, Sherman was not much interested in armies. He wanted to break the South’s unbreakable will, and so the war was conducted mostly against civilians and their assets. Sherman’s march through Georgia in the fall of 1864 was horrendously destructive of all productive assets, from cotton gins to crops and barns and railroads. It was exceeded in destruction and horror only by what Sherman’s army did in South Carolina, which made the march to the sea in Georgia seem almost kindly by comparison. In the Shenandoah Valley that same fall, Phil Sheridan’s troopers fanned out behind his infantry and burned everything they could get their hands on except, technically, houses, though they burned plenty of those, too. Sheridan’s burning campaign also gave impetus to a brutal guerrilla war in Virginia, in which commanders like John Singleton Mosby and George Armstrong Custer engaged in retaliatory killings of captive soldiers. This was bitterness on a scale unseen in the war.

 

Enhancing this turn toward hatred was the presence in the Union army of 180,000 black troops, 10 percent of its total strength, more than 60 percent of whom had recently been slaves. Confederate soldiers hated them with a passion, targeting them for slaughter in battles and giving them no quarter when captured.

 

What was happening in the fall and winter of 1864-1865 was perhaps best summed up by Sherman himself, the poster boy and ideologue of the hard new war. He wanted, as he put it, “to make [the South’s] inhabitants feel that war and individual ruin are synonymous.”

 

Perhaps not lost on Robert E. Lee—this was the final irony of the Paradox—was the fact that he was personally ruined by his own success. By the end of the war he had lost his family’s three large estates—including Arlington House (where Arlington National Cemetery is today)—all of his productive assets (including the people he once enslaved), and all of his personal money and investments. While he and the Army of Northern Virginia fought hard to the end, his shattered family became refugees, virtual paupers. The South, whose armies surrendered in the spring of 1865, was merely a shell of itself, so hollow and substanceless that it would take much of it the better part of a century to dig out. The bitterness and hatred would never be forgotten.

https://historynewsnetwork.org/article/173401
Throwing Away the “Electability” Argument

Shirley Chisholm reviews political statistics, 1965

 

As the Democratic primary heats up, political commentators continue to analyze candidates through the same lens: electability. Many political analysts ask, could a woman be electable? Articles such as “Democrats are prioritizing ‘electability’ in 2020. That’s a coded term,” “Has Elizabeth Warren broken the electability ceiling?,” and “Could a woman beat Trump? Democrats worry – and hope” analyze electability and the many angles with which we look at female and minority candidates.

 

Many authors focus on Elizabeth Warren, now the perceived frontrunner in the race. Although she has risen in the polls, people continue to assert she isn’t electable. For example, the LA Times recently published an op-ed entitled “Electability matters. Nominating Elizabeth Warren would be a mistake.” This media angle has two problems. First, it ignores the lasting impact politicians and activists have made from beyond the White House. Second, history suggests that candidates who are not white men are more electable than we think.

 

Women and minorities ran for the highest office in the land long before Hillary Clinton and Barack Obama battled each other for the 2008 Democratic presidential nomination. Shirley Chisholm, the first African American woman elected to Congress, ran for president in 1972 and made history as the first black person to seek the presidential nomination of one of the two major parties.

 

Under the slogan “Unbought and Unbossed,” Chisholm ran as an unabashed advocate for poor, inner-city residents. In a display of the tension within the Democratic Party, her opponents included the infamous segregationist George Wallace. Many people were wary of her campaign, believing that a black woman could not defeat Nixon in the ’72 election. Chisholm attracted strong support from black women but failed to garner much support from black men or white women, many of whom backed Sen. George McGovern because he was viewed as more likely to defeat Nixon.

 

She won only the Democratic primary in New Jersey, but she was the first African American or woman to win a presidential primary. Few people ever took her campaign seriously, focusing instead on Senator George McGovern because of his perceived electability.

 

Although she was not the nominee, Chisholm has had a political influence that stretches to this day. She tried to use her coalition and 152 primary delegates to negotiate a party platform that emphasized and recognized the rights of women, black Americans, and Native people, but Sen. McGovern had such a large delegate lead in 1972 that he did not acquiesce to her demands.

 

After her loss in the 1972 primaries, Chisholm continued to make an enormous impact on politics and American culture by opening the door for other black and female candidates to run for president. Rev. Jesse Jackson picked up where Chisholm left off, running presidential campaigns in 1984 and 1988.

 

Shirley Chisholm’s 1972 run has also inspired today’s candidates, notably Senator Kamala Harris, who named Chisholm as her political hero in a recent New York Times interview. Sen. Harris’s 2020 campaign logos are a nod to Chisholm’s ’72 run.

 

Beyond running for president, African American women like Fannie Lou Hamer have made an indelible mark on American society by fighting for civil rights. After attending a meeting led by civil rights activists from the Student Non-Violent Coordinating Committee and the Southern Christian Leadership Conference, Hamer became a fierce fighter for black voting rights. It took until 1962 for Hamer, who was born in 1917, to learn that black people could register and vote. She went on to create the Mississippi Freedom Democratic Party (MFDP) to challenge the local Democratic Party and its efforts to block black participation, and she helped organize Freedom Summer, which brought college students to the Deep South to fight for civil rights. In a recent article for Time, historian Keisha Blain writes that Hamer’s words offer much-needed guidance, direction, and determination: faith without action is dead. Until her death in 1977, Fannie Lou Hamer fought to expand suffrage and secure civil rights for all.

 

Second, there is no historical basis for the idea that women and minority candidates aren’t electable.

 

In 2016, Hillary Clinton won the popular vote by a nearly three million vote margin, despite losing the electoral college.

 

In the 2008 Democratic primary, voters perceived Clinton as more electable; in one primary poll, 39% of respondents said Barack Obama couldn’t win. Yet Obama expanded the Democratic map and won states like North Carolina, Virginia, and even Indiana – states Democrats had not carried in decades. In 2008 and 2012, Barack Obama defeated establishment candidates John McCain and Mitt Romney and became the first African American president.

 

In 2018, 23 of the 41 House seats flipped by Democrats were won by women.

 

In 2018, two of the seats Democrats picked up in the Senate were won by women, one of whom is openly bisexual.

 

In 2018, four of the seven Gubernatorial seats that Democrats flipped were won by women.

 

Further, a recent CBS poll suggested that 59% of Democratic voters prefer a female candidate, while 60% prefer a black candidate as opposed to a white candidate.

 

There is no empirical evidence to suggest that a woman can’t win in any particular state. Two midwestern states that Hillary Clinton lost in 2016, Kansas and Michigan, elected female governors in 2018, and many other women in the region were responsible for flipping Republican House seats.

 

With the 2020 campaign already underway, many qualified female and minority candidates remain on the Democratic debate stage, raising substantial money and even polling well. For example, since December 2018, Elizabeth Warren’s average poll standing has increased by a whopping 21.1 percentage points. A recent Fox News poll has Warren 10 points ahead of Donald Trump in a head-to-head matchup. No other candidate has come close to that.

 

The United States has had its fair share of unorthodox presidential candidates, and yes, some of them, like self-professed clairvoyant and psychic Victoria Woodhull, may not meet the mark for being a qualified candidate.  But we must stop defining “electability” in terms of race and gender. Let’s finally throw this tired electability argument into the trash.

 

https://historynewsnetwork.org/article/173339
The Widening Gap Between the Super-Rich and Other Americans

 

Despite the upbeat words from America’s billionaire president about the “economic miracle” he has produced, economic inequality in the United States is on the rise.

 

In August 2019, the Economic Policy Institute reported that, in 2018, the average pay of CEOs at America’s 350 top firms hit $17.2 million―an increase, when adjusted for inflation, of 1,007.5 percent since 1978. By contrast, the typical worker’s wage, adjusted for inflation, grew by only 11.9 percent over this 40-year period. In 1965, the ratio of CEO-to-worker pay stood at 20-to-1; by 2018 (when CEOs received another hefty pay raise and workers took a 0.2 percent pay cut), it had reached 278-to-1.

 

An AFL-CIO study released in June 2019 had similar findings. Examining compensation at Standard & Poor’s 500 companies, the labor federation reported that average CEO pay in 2018 had increased by $5.2 million over the preceding 10 years, producing an average CEO-to-worker pay ratio of 287-to-1.

 

These figures, of course, are only averages, and at numerous major corporations, the economic gap between boss and worker is much greater.  According to the AFL-CIO, the CEO-to-worker pay ratio at Walmart (America’s largest private employer) is 1,076 to 1, at Walt Disney Company 1,424-to-1, at McDonald’s 2,124-to-1, and at Gap 3,566-to-1.  At 49 S&P 500 firms, noted an Institute for Policy Studies report, half the work force―that is, 3.7 million employees―received wages below the official U.S. poverty line for a family of four.

 

Thus, despite the soaring incomes of top corporate executives and other wealthy Americans, the median household income in the United States grew by only 0.2 percent during 2018―a marked slowdown from the three previous years. Commenting on U.S. wage stagnation, Sam Pizzigati, co-editor of inequality.org, observed that “average Americans have spent this entire century on a treadmill getting nowhere fast. The nation’s median―most typical―households pocketed 2.3 percent fewer real dollars in 2018 than they earned in 2000.”

 

Although President Donald Trump has claimed that “inequality is down,” federal data released this year show that, in 2018, the nation’s income inequality reached the highest level since the U.S. Census Bureau began measuring it five decades before.

 

U.S. economic inequality is even greater in terms of wealth.  During the Democratic presidential debate in late June 2019, Senator Bernie Sanders reminded Americans that just three U.S. billionaires (Jeff Bezos, Bill Gates, and Warren Buffett) possessed as much wealth as half the people in the United States combined.  And the three richest U.S. families―the Waltons (owners of Walmart), the Mars candy family, and the Koch family (owners of a vast fossil fuel conglomerate)―possessed a combined fortune ($348.7 billion), which is 4 million times the wealth of the median U.S. family.

 

Although the median net worth of U.S. households has declined (after adjusting for inflation) since the late 1990s, the fortunes of the wealthy have skyrocketed.  The American billionaires sharing their ostensible wisdom at the World Economic Forum in Davos at the beginning of 2019 made enormous gains in wealth over the previous decade. They included Jamie Dimon (275 percent), Rupert Murdoch (472 percent), Stephen Schwarzman (486 percent), Marc Benioff (823 percent), and Mark Zuckerberg (1,853 percent).

 

According to computations made by Forbes in October 2019, the ten wealthiest Americans (with riches ranging from $53 billion to $107.5 billion each) had combined wealth of $697 billion―or an average of $69.7 billion each.  Assuming that, henceforth, they had no further income and had limitless longevity, they could each spend a million dollars a day for approximately 191 years.
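
A quick back-of-the-envelope check of that spending figure (my own arithmetic, not part of the Forbes computation):

\[
\frac{\$69.7\ \text{billion each}}{\$1\ \text{million per day}} = 69{,}700\ \text{days} \approx \frac{69{,}700}{365}\ \text{years} \approx 191\ \text{years}.
\]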

 

Most other Americans possess far fewer economic resources. In 2018, 38.1 million Americans lived below the U.S. government’s official poverty threshold, including many people working multiple jobs. Another 93.6 million Americans lived close to poverty, bringing the total of impoverished and near-impoverished people to roughly 40 percent of the U.S. population.

Naturally, economic deprivation has serious consequences. According to the U.S. Department of Agriculture, 14.3 million American households have difficulty providing enough food for their families. Low-income families are also plagued by inadequate education, alcohol and substance abuse, and poor housing, health, and life expectancy. The U.S. Government Accountability Office reported in September 2019 that poor Americans die at an earlier age than rich ones. Indeed, as reported in 2019, life expectancy in the United States has declined for three consecutive years, the first such stretch in a century. Suicide rates, which closely correlate with poverty, have increased by 33 percent since 1999. Even what is left of the dwindling middle class faces the crippling costs of health care, college education, and debt payments.

 

This situation bears no resemblance to that of America’s ultra-wealthy, who, in addition to pouring money into the campaign coffers of politicians who safeguard and expand their fortunes, continue to make purchases like one multi-billionaire’s acquisition of a $238 million Manhattan penthouse―a supplement to his two floors at the Waldorf Astoria hotel in Chicago ($30 million), Miami Beach penthouse ($60 million), Chicago penthouse ($59 million), and additional apartment in Manhattan ($40 million). Other recent purchases by the ultra-rich include a $100 million, 305-foot “super-yacht” (complete with helipad and IMAX theater), private jet planes ($65 million), and (of course) gold toilet paper.

 

The latest attraction for America’s ultra-affluent is Manhattan’s 131-floor Central Park Tower, which, when completed, will become the tallest, most expensive residential building in the United States. It will feature 179 luxury condos ranging in price from $6.9 million to $95 million and a seven-story Nordstrom flagship store with six restaurants, plus three floors of “amenity space” (dubbed the Central Park Club) spanning 50,000 square feet, with an outdoor terrace, pools, a wellness center, and a massive ballroom. The immense height of the structure will underscore the vast power of the super-rich, as well as enable them to avoid noticing the many “losers” left behind on the teeming streets below.

https://historynewsnetwork.org/article/173406
What the Most Influential Text on Cannibalism Can Teach Us About Studying History

 

In the 1970s, a student at Stony Brook University asked his anthropology professor, William Arens, why he “lectured on kinship, politics and economics instead of more interesting things like witchcraft, fieldwork experiences and cannibalism.” Arens listened to the student, reevaluated what he taught, and “consequently...turned to the study of man-eaters.”

 

As Arens researched popular accounts of cannibalism, he discovered a disturbing trend: there was not a single shred of compelling evidence that humans ever practiced the ritualistic eating of human flesh. Could conquistadors, for instance, have concocted stories of Native American cannibalism in order to justify their conquests of the “heathens”? Arens presented the idea to colleagues, who promptly told him “to concern [himself] with more serious scholarship.” This further stimulated Arens’s curiosity. 

 

Unable to locate reliable textual sources, Arens put a notice in the Newsletter of the American Anthropological Association asking if anyone had eyewitness knowledge of ritualistic cannibalism. Arens received four responses, each a dead end. With his suspicions seemingly confirmed, the project picked up steam, and soon Oxford University Press accepted his manuscript. The resulting 1979 monograph, The Man-Eating Myth, is the most influential text ever written on cannibalism.

 

The reason why The Man-Eating Myth rejuvenated a whole field of study, the reason why it inspired a deluge of articles, theses, dissertations, and books, and the reason why it changed the way scholars approach sources that contain anthropophagy is the book’s shocking thesis. Arens argued that ritualistic cannibalism had never been observed or documented by anyone at any time. Instead, all recorded instances of cannibalism that he studied (save survival cannibalism à la the Donner Party, or “antisocial behavior” in the vein of Jeffrey Dahmer) were fabricated by whites in their quest to barbarize and brutalize those they intended to colonize. As Arens summarizes, “Excluding survival conditions, I have been unable to uncover adequate documentation of cannibalism as a custom in any form for any society. Rumors, suspicions, fears and accusations abound, but no satisfactory first-hand accounts.”

 

This article addresses six problematic aspects of Arens’s The Man-Eating Myth, the most pronounced being that, contrary to the book’s thesis, ritualistic cannibalism does exist as a cultural practice across the globe. Even though the majority of this piece is a strong critique, I attempt to avoid the complete slash-and-burn style often seen in discussions of this controversial book. In addition to pointing out the work’s errors, my analysis highlights some good that came from The Man-Eating Myth, such as how Arens made authors far more accountable and careful when discussing the sensitive topic of anthropophagy, and the importance of not completely writing off colonizers’ viewpoints simply because they are colonizers. In essence, this article argues that although Arens’s denial of ritualistic cannibalism is totally irresponsible, his book remains essential to modern studies of cannibalism.

 

***

 

Arens released his monograph at an opportune time. In the 1970s, the Ivory Tower began thoroughly rejecting traditional, colonially biased versions of history. Pioneering scholarship such as Edward Said’s Orientalism and Michel Foucault’s Discipline and Punish preceded Arens’s work and dovetailed with his finding that colonizers controlled the gazes of outsiders for their own gain. Further making the environment ripe for The Man-Eating Myth, anthropologist Michael Harner published a controversial article in 1977 suggesting that the Aztecs maintained their empire through cannibalism. Harner argued that because of rapid population growth and the absence of large sustainable herbivores (buffalo or deer), the Aztecs had to rely on cannibalism to satisfy their protein requirements. Arens’s refutation of ritualistic cannibalism provided a provocative counter to Harner’s arguments. Oxford University Press expected a hit. 

 

The initial reception of The Man-Eating Myth was positive. William McGrew, a psychologist from the University of Stirling, proclaimed that “if [Arens’s] idea sounds preposterous, the reader might pause to reflect on how recently it was in Europe and America that witchcraft was taken very seriously indeed.” Khalid Hasan, in Third World Quarterly, echoed the praise: “In a brilliant and well-documented work Arens scrutinizes the available anthropological and popular literature on cannibalism and establishes that no concrete evidence exists about the practice.”

But after a wave of positive reviews, a torrent of negative reviews flooded in—each more vicious than the last. “The difficulty with the book,” contended James Springer in Anthropological Quarterly, “is that Arens is almost certainly wrong.” “There is so little regard for accuracy,” quipped Shirley Lindenbaum, “that one wonders whether the book was in fact ever intended for a scholarly audience.” To explain the backlash and to have all the main critiques assembled in a single location for future researchers, I will concisely describe the six major issues with The Man-Eating Myth. 

 

Issue #1: Purposefully Unattainable Criteria for Cannibalism

The provocative point of Arens’s argument is that he could not find any valid sources documenting cannibalism. Therefore, if a single historical source meets his strict criteria, his sweeping claim crumbles. With this in mind, Arens sets his criteria for a legitimate sighting of cannibalism at a nonsensical level: an eyewitness account from an academically trained anthropologist. This effectively nullifies every observation of cannibalism prior to the twentieth century. As one scholar incredulously responds, “It is difficult to assume, as [Arens] does, that all explorers, conquistadors, missionaries, traders, and colonizers—as well as many historians and journalists—have inaccurately, and perhaps dishonestly, represented instances of cannibalism they claimed to have witnessed, and for which physical evidence has been found.”

 

Issue #2: Excessive Denigration

Arens’s thesis rests upon the backs of easily demonized historical actors such as Christopher Columbus and Hernán Cortés. These figures perfectly fit the model Arens has created: their accounts are outrageous, and they had everything to gain from propagating the assertion that Indians practiced cannibalism. Arens then treats credible sources as if they had the same dark intentions and motivations as Cortés and Columbus. This allows Arens to reason that if a source came from a colonizer, its descriptions must be false. For an example of the ad hominem employed, consider Arens’s passage on Hans Staden, a German shipwrecked on the coast of Brazil:

 

[Staden] curiously informs the reader that “the savages had not the art of counting beyond five.” Consequently, they often have to resort to their fingers and toes. In those instances when higher mathematics are involved extra hands and feet are called in to assist in the enumeration. What the author is attempting to convey in this simple way with this addendum is that the Tupinamba lack culture in the sense of basic intellectual abilities. The inability to count is to him supportive documentation for the idea that these savages would resort to cannibalism. 

 

As anthropologist Donald Forsyth explains in an article countering Arens’s assertions, “Staden’s statement concerning Tupinamba enumeration is correct….ancient Tupi had no terms for numbers beyond four. Larger numbers were expressed in circumlocutions, often involving fingers and toes.” Staden reported what he saw, but Arens puts thoughts in Staden’s head and twists the testimony to fit his needs.

 

Issue #3: Cannibalism is Not Inherently Evil

Arens believes that cannibalism goes against “the strongest and most elementary social constraints.” As a result, The Man-Eating Myth is written with the mindset that cannibalism is inherently aberrant or evil behavior. This ignores the fact that cannibalism functions as a positive act in some cultures. The Amahuaca Indians of the Amazon, for example, consumed the ash of their dead to “appease the spirit of the deceased.” Neglecting to do so could result in the deceased being stuck in this world “caus[ing] trouble, [and] hanging around wanting to kill someone.” Among the Wari’ of the western Amazon, cannibalism likewise “was considered to be the most respectful way to treat a human body [after death].” For the Amahuacas and the Wari’, endocannibalism is an affectionate act; to not practice it is cruel and immoral. Arens is unable to inhabit this cultural relativism; for Arens, all forms of cannibalism are evil. As Christopher Robert Hallpike expounds in his 2017 article on The Man-Eating Myth, “Arens’s unwillingness to believe in the very possibility of cannibalism as an institution appears, in fact, to be his own ethnocentric Western prejudice.”

 

Issue #4: Arens Refused to Look Deeply at European Culture

The fourth problem is closely related—Arens continually looked outward for cultures that practiced cannibalism rather than inward. Had he taken a closer look at Europeans, he would have found a wonderfully well-documented tradition of customary cannibalism. 

 

During the Renaissance, at the same time explorers wrote of cannibalistic orgies in the New World, consumers in the Old World—steeped in a culture of ailments, elixirs, and tinctures—ritually consumed human flesh as medicine. One practice was deemed savage; the other, enlightened. As author Bess Lovejoy writes in an introduction to the European flesh market, “many recipes relied on sympathetic magic: powdered blood helps bleeding, human fat helps bruising, skulls help with migraines or dizziness.” This sort of cannibalism had a different face. It was “scientific” and consequently easier to overlook, as Arens did. The irony, of course, is thick. Even as Europeans scorned cannibalism, their own culture revered it.

 

Issue #5: Archaeological Evidence of Ritualistic Cannibalism Exists

In spite of Arens’s assertion that “the rarity of the [archaeological] finds...does not permit the conclusion that the material evidence ever points to cannibalism as a cultural pattern in either gustatory or ritual form,” archaeological evidence for cannibalism is now robust. 

 

Before The Man-Eating Myth, there existed only a rickety list of criteria for osteological proof of cannibalism. Since the publication of Arens’s thesis, archaeologists have revamped that list and set a stricter standard. Osteological indicators of cannibalism include “pot polish,” or bones smoothed from rubbing against the sides of clay boiling pots; cut marks analogous to those on processed animal bones; and a pattern of bone being cut, then broken, then burned (harvested, prepared, and cooked). With such osteological indicators, archaeologists have identified cannibalism in the American Southwest, in Neolithic France, and in prehistoric Ethiopia. And in 1999, a new technique was developed to further solidify evidence of cannibalism in our past: the detection of digested myoglobin, a human muscle protein, in fossilized feces.

 

Although the archaeological evidence of cannibalism is robust, the archaeological evidence of ritualistic cannibalism was less than clear-cut. That is key because Arens does not deny “rare [and] isolated instances of prehistoric beings who engaged in survival cannibalism.” Instead, he denies “cannibalism as a cultural pattern.”

 

In 1993, archaeologists made a major theoretical advance by presenting strong archaeological evidence of customary cannibalism in the American Southwest. A husband-and-wife team, Christy and Jacqueline Turner, analyzed hundreds of sites in the Anasazi cultural region over the span of decades and found that sites with strong evidence of cannibalism were not randomly distributed. Instead, the sites were located exclusively within the Anasazi culture area—none in the surrounding regions, despite those regions having “more severe winters [which] should have produced some cannibalized assemblages if starvation had been the primary cause.” Moreover, survival cannibalism could not explain why the bodies uncovered by Turner and Turner were so battered and beaten, the markings indicating torture-like trauma. With starvation cannibalism ruled out, customary cannibalism became the strongly favored inference. Turner and Turner solidified this inference by turning to the historical record and showing that this outbreak of cannibalism was likely spurred by the spread of Aztec culture in the form of immigrants flowing north and following a “warrior-cultist tradition.”

 

Determining cultural cannibalism through archaeological means is a burdensome and difficult task, and it had not been conclusively done prior to The Man-Eating Myth. Arens overstates his case in arguing that no archaeological evidence of customary cannibalism exists. Yet in 1979, his assertion was technically correct.

 

Issue #6: Arens’s Limited Source Base

The Man-Eating Myth attacks reports of cannibalism among Africans, early man, Polynesians, the Indians of the American Southwest, the Iroquois, the Caribs, the Aztecs, the Tupinambás, and the peoples of the New Guinea Highlands. With such a broad range of peoples, Arens is unable to give a nuanced analysis of each group’s supposed cannibalism: he devotes a paltry twelve pages to each community.

 

To clarify, the number of pages devoted to a topic is not fully indicative of a work’s quality, and the problem of being overly broad is inevitable given the scope of Arens’s book—a problem that Arens himself acknowledges. Arens decided to focus on “the most popular and best-documented case studies of cannibalism.” Therein lies the issue. Arens relies principally on the “popular and best-documented” cases of cannibalism as his sources; numerous primary sources, secondary sources, and other cultural histories are bypassed. Cannibalism, by The Man-Eating Myth’s own correct assertion, is an incredibly dangerous label. When discussing the subject, a comprehensive review of the surrounding literature is required: a review Arens neglected.

*** 

The New Yorker wrote that The Man-Eating Myth “is a model of disciplined and fair argument.” The six aforementioned problems show that The Man-Eating Myth is instead a model of imprecision and sharp sophistry. As one scholar aptly puts it, “If anthropologists don’t want to believe in evidence for regularly-practiced, culturally-sanctioned cannibalism it is because they are purposefully avoiding the evidence.”

The Second Part That Is Usually Forgotten

Writers usually end there—they bash the book and call it a day. This is a mistake. Academics are so frenzied by the scent of scholarly blood that they have ignored insightful aspects of Arens’s work.

 

To begin, colonizers do in fact use accusations of cannibalism as a tool to claim what is not theirs. In my own studies of the Karankawa Indians of Texas, I have found that Anglo-American settlers regularly used rumors of these Native peoples’ cannibalism to justify wanton murder. In one vivid instance, Anglo-Americans supposedly stumbled upon some Karankawas cannibalizing a colonist’s young child. “The Indians were so completely absorbed in their diabolical and hellish orgie, as to be oblivious to their surroundings, and were taken by surprise.” The colonizers massacred all of the Karankawas except “a squaw and her two small children,” but after the whites “consulted a little while...they decided it was best to exterminate such a race” and proceeded to murder the three remaining survivors. Dismissing Arens’s book dismisses this reality. The accusation of cannibalism is a powerful mechanism for casting undesirables as worthy of extermination.

 

Moreover, Arens’s assertion that “anthropology has not maintained the usual standards of documentation and intellectual rigor expected when other topics are being considered” hits the nail on the head. Before The Man-Eating Myth, scholarship tended to imply that all Native peoples practiced cannibalism. Now scholars are far more careful in their approach to the subject.

 

In a scathing review, one writer stated that The Man-Eating Myth “does not advance our knowledge of cannibalism.” The opposite is true. Prior to the book’s publication, cannibalism had been grossly understudied, which is one reason Arens found so little scholarly evidence when examining cases of it. After publication, the controversy surrounding The Man-Eating Myth grew so severe that scholars from an assortment of fields jump-started research on anthropophagy to disprove its thesis. Essentially, Arens’s book cannibalized itself. The reaction it prompted caused its own undoing. 

 

Historians today can learn a great deal from The Man-Eating Myth’s saga. The most profound takeaway is that when we focus too heavily on a single perspective while ignoring others, dangerously flawed history is bound to be produced. By dismissing colonizers’ accounts as propaganda and zeroing in only on the perspective of the oppressed, Arens came to an unsound conclusion and denied deeply meaningful cultural practices. Yet, in the end, his book has done more than any other to inform us about an erroneously maligned cultural practice.

https://historynewsnetwork.org/article/173404
A Brief History of the Fox

Photo by Adele Brand

 

No one will ever know where the first Homo sapiens laid eyes upon a living fox, or how the two species perceived each other. As pre-history continues, our fossils and theirs begin to overlap in paleontological sites, a silent testimony to forest meetings that have passed into the veil of unwritten time. But 16,000 years ago, when Palaeolithic painters were drawing steppe bison in the Spanish cave of Altamira, a woman of unknown name died in what is now Jordan, in a site called ‘Uyun al-Hammam. Her body was laid among flint and ground stone, and a red fox was carefully placed beside her ribs, resting with her for eternity on a bed of ochre.

 

We cannot perceive the meaning. Was this a pet, or an animal kept for its ceremonial significance? The care in the joint burial is believed to suggest some emotional link between human and fox, beyond that shown to wildlife perceived as food or clothing. It has been speculated that these pre-Natufian people coexisted with foxes that were at least half domesticated. Perhaps they scavenged rubbish on the edge of camps, along with the earliest dogs. Perhaps the behaviour so often complained about in London is more ancient than we think.

 

In any case, it is clear that foxes held a strong cultural significance for the later peoples of the Levant. They are commonly found in human graves in Kfar Hahoresh (modern Israel), dated to around 8,600 years ago, while stone carvings of foxes with thick brushes adorn the pillars of Göbekli Tepe in Turkey, believed to be the world’s oldest temple. In Mesolithic Britain, humans who hunted deer by the shore of extinct Lake Flixton – in the North Yorkshire archaeological site of Star Carr – must have been aware of their small red neighbours. Bones from two foxes have been found at this ancient settlement, along with those of Britain’s first known domestic dogs, but there is no indication of what role, if any, canids played in their culture.

 

Later, as humanity discovered the joy of story-telling, foxes joined the cast. The oral literature of native Americans occasionally opts for a fox as a trickster, albeit a potentially handy one; according to one Apache legend, it was Fox who stole fire from the fireflies and introduced it to Earth. It is across the Pacific in Japan, however, that fox folklore reaches its most astounding heights. Kitsune – the revered fox of Japanese myth, poetry and traditional belief – has existed in human thoughts for many centuries. It even makes an appearance in what may be the world’s oldest novel: Japan’s eleventh-century epic The Tale of Genji, where a human character debates whether the figure by a tree is a woman or a shapeshifting vulpine. Kitsune delight, deceive and confuse in countless other legends; while the theme of pretending to be an attractive woman is frequent, other tales relive how they mislead travellers by lighting ghost fires at night, assume the form of cedar trees, or even become the guardian angels of samurai. Today, anime writers continue the kitsune tradition.

 

Back in Europe, by Roman times the uneasy relationship between foxes and agriculture had woven itself into religious rituals – in the festival of Cerealia, for example, live foxes were released into the Circus Maximus with burning torches tied to their tails. Seven hundred years later, Aesop’s tales also provide a nod to fox interactions with farmers, and – to a lesser extent – with their neighbouring wildlife. My favourite Aesop fable features a wolf taking a fox to court for theft; given the vast quantity of wolf-killed carrion that real foxes consume, it seems vaguely reasonable.

 

Old English literature picks up similar themes. The Fox and the Wolf, a rhyming poem from the thirteenth century, stars a fox who helps himself to some chickens and then tricks a wolf into taking the blame:

 

A fox went out of the wood

Hungered so that to him was woe 

He ne was never in no way 

Hungered before half so greatly. 

He ne held neither way nor street

For to him (it) was loathsome men to meet 

To him (it) were more pleasing meet one hen 

Than half a hundred women.

He went quickly all the way 

Until he saw a wall.

Within the wall was a house.

The fox was thither very eager (to go) 

For he intended his hunger quench 

Either with food or with drink

 

And so it continues, with the hungry fox trapping himself in a well before deceiving a wolf named Sigrim into taking his place. Ironically, this poem was written about the same time that the wolf’s howl was finally falling silent in southern Britain.

 

Did the fox notice the disappearance of its distant relative? Perhaps, unconsciously. As shown in Białowieża and elsewhere, the wolf was a provider as well as rival, a powerful force in the wildwood whose absence has changed these islands as much as a spoke missing from a wheel. Some species have sharply increased, and others have probably declined.

 

Yet civilisation has done more than simply rip out culturally troublesome natives while boosting deer and grouse for hunting. We have a long history of acquiring new, useful species and releasing them; in Britain alone, that includes rabbits from Spain, fallow deer from Persia, sheep from Mesopotamia, hens from south-east Asia, and cats from Africa. Our trading ships accidentally added black rats from India and house mice from the Middle East, while American grey squirrels, Japanese sika deer and even Australian red-necked wallabies joined our countryside from zoos. We have persuaded ourselves that the six million sheep of Scotland are part of the ‘natural’ scene, but the Highland ecosystem evolved with none. Even the Scottish red deer population of 300,000 is far higher than in the time of the wolf. These changing grazing pressures affect the rodents and berries that foxes eat, and near-total deforestation has altered their territory sizes and feeding habits.

 

In North America, the prairie ecosystem has been largely dismantled; gone are the vast herds of bison, pronghorn and elk, and here to stay are European cattle and Old World dogs, as well as an increasing population of white-tailed deer. The rise and fall of farming has seen forests destroyed and recovered in the Appalachians and elsewhere, while accidental introductions of tree pests such as the gypsy moth – and indeed Dutch elm disease and European ash dieback – chip away at the structure of ecosystems. We have suppressed fire in forests built around it, or occasionally encouraged it through permitting the build-up of flammable material or carelessly dropping cigarettes. We have allowed invasive plants to hitch a lift on our trains, trucks and shoes. We have even changed the soil by introducing earthworms to forests outside their native range. In short, vast swathes of the globe now carry a human footprint.

 

In a flash of geological time, we have rewritten the fox’s wildwood, in ways both graphic and subtle. We have added, taken away, replanted and concreted.  

And the fox that once played its natural dodgems with the rest of the natural web will inevitably interact with the components of the new urbanized world that we have designed without ecological aforethought.

 

The fox is not an intruder into our world.

 

We have simply laid our modern ambitions over the landscape it already knew.

 

From THE HIDDEN WORLD OF THE FOX by Adele Brand Copyright © 2019 by Adele Brand. Reprinted by permission of William Morrow, an imprint of HarperCollins Publishers.

 

https://historynewsnetwork.org/article/173409
Why Originalism Should Apply to Impeachment

 

I have never been a supporter of originalism. Until recently, I was also skeptical that impeachment was the right way to handle Donald Trump’s presidency. But I have reevaluated my position in light of President Trump’s egregious phone call with the Ukrainian president back in July, in which he requested a favor (dirt on one of his political opponents) in return for releasing military aid to that beleaguered nation. 

 

Originalism is the idea that “the Constitution should be interpreted in accordance with its original meaning—that is, the meaning at the time of its enactment,” according to the Center for the Study of Originalism at the University of San Diego. Though not exclusively, conservatives tend to be the doctrine’s main supporters. My reasons for opposing originalism were laid out in an article published by the History News Network in July of 2018. In part, relying solely on the meaning of the Constitution in 1787 ignores the changes that have taken place over the past 232 years. Also, the Framers themselves did not agree on the meaning of much of what they wrote. One need look no further than the debates over the constitutionality of the Bank of the United States that took place in 1790 and 1791. Alexander Hamilton, who had proposed the bank, thought it passed constitutional muster. James Madison, often referred to as the father of the Constitution, did not. 

 

Yet in one area there appeared to be a great deal of agreement at the Constitutional Convention: impeachment. The men who attended the convention had recently fought a revolution against the overbearing, centralized power of the British empire and the abuses of the King and his ministers. “Many of the Framers had signed the Declaration of Independence, whose bill of particulars against King George III modeled what we would view as articles of Impeachment,” constitutional scholars Laurence Tribe and Joshua Matz have written.   

 

The ideology of the Framers was grounded in classical republicanism.  At its core, classical republicanism was the idea that individuals sacrifice their private interest in order to advance the public interest. This was especially important for elected officials. The Framers feared that politicians would act in a corrupt manner, advancing their own interests at the expense of the public interest. This fear was especially acute with regard to executive power because of the colonists’ experiences in the 1760s and 1770s, when royal governors appointed by the King often dissolved local colonial assemblies whose decisions they disliked, and vetoed bills on a regular basis. In response, the governments formed in the states after independence tended to limit the power of governors. Pennsylvania went the furthest in this regard, eliminating the office of governor and replacing it with a twelve-member executive council. The Articles of Confederation did not even include an executive officer among their provisions.

 

By the time the Framers attended the Constitutional Convention, the problems of government without a strong executive had become obvious. A number of the Framers wanted to provide what Hamilton called “energy in the executive,” which he considered, as he wrote in Federalist 70, “a leading character in the definition of good government.” Yet there was also concern that a powerful presidency, if occupied by an unscrupulous man, could endanger the republic. Their fears were largely assuaged by the assumption that Washington would be the first president. “The first man put at the helm will be a good one. Nobody knows what sort may come afterwards,” Benjamin Franklin observed.  Pierce Butler of South Carolina wrote in a 1788 letter that “many of the members cast their eyes towards General Washington as President, and shaped their ideas of the powers to be given to a President by their opinions of his virtue.” But what about the people who would follow him? This was where the power of impeachment entered the picture.

 

The idea of impeachment had first arisen on June 2, but an extensive debate over the issue occurred on July 19 and 20. Gouverneur Morris of Pennsylvania, among others, was opposed, since impeachment would endanger a president’s independence and the separation of powers. Morris’s objections were addressed by a large group of delegates, led by two Virginians, George Mason and James Madison. Mason thought that impeachment was essential to ensure the integrity of elections, and that the rule of law should apply to everyone. “Shall any man be above justice?” Mason asked. He was concerned that a “man who has practiced corruption & by that means procured his appointment” would “escape punishment” in the absence of the power of impeachment. Concerns about foreign interference in elections were raised by Madison, who feared a president “might betray his trust to foreign powers” or “pervert his administration in a scheme of peculation [embezzlement] or oppression.” By the end of the debate, Morris changed his mind and supported the power of impeachment. 

 

That was where matters stood until September 8, when the Convention needed to settle at last on what would constitute the grounds for impeachment. Morris indicated that the reasons for impeachment should “be enumerated & defined.” But the delegates struggled with an appropriate definition. Words like “malpractice” or “maladministration” were suggested but were too vague; such a term, Madison warned, “will be equivalent to a tenure during pleasure of the Senate.” Presidents should not be impeached because they were unpopular or incompetent, nor for policy differences. Finally, Mason recommended that “high crimes and misdemeanors” be added to bribery and treason as grounds for impeachment.

 

The phrase “high crimes and misdemeanors” seems vague to us in the twenty-first century. “An impeachable offense is whatever a majority of the House of Representatives considers it to be at a given moment in history,” Gerald Ford once said. While there is some truth to Ford’s observation, a better approach is to interpret high crimes and misdemeanors within the republican ideology of the period and the Framers’ overall debate over impeachment. They thought that the power of impeachment should be reserved for abuses of power, especially those that involved elections, foreign interference, and actions that place personal interest above the public good. As Hamilton wrote in Federalist 65, impeachable acts are “POLITICAL, as they relate chiefly to injuries done immediately to the society itself.” The historian Jeffrey Engel has written that the Framers had “a shared understanding of the phrase” and succinctly describes “impeachable offenses [as] those perpetrated with sinister intent to harm the republic for personal gain.” Tribe and Matz write that high crimes and misdemeanors may not involve a crime at all, but rather “corruption, betrayal, or an abuse of power that subverts core tenets of the US governmental system…that risk grave injury to the nation.”

 

Impeachment should be a last resort, the nuclear option, reserved for when all other methods of checking the abusive power of a president have proven inadequate. It is not something to be celebrated, but rather to be used solemnly to protect American democracy. As Hamilton wrote, the process of impeachment can be divisive, agitating “the passions of the whole community… [dividing] it into parties more or less friendly or inimical to the accused.” Trump’s actions reflect a person who either does not know right from wrong or who believes he is above the law. He was willing to coerce a foreign government in order to discredit a political opponent. If left unchecked, he will continue to repeat actions like this (we already know he has attempted to pressure Australia), which will undermine the integrity of the 2020 election. As the August 12, 2019 whistleblower report indicates, “the President of the United States is using the power of his office to solicit interference from a foreign country in the 2020 U.S. election” in order to advance his personal interests. This falls squarely within the reasons why the Framers included the power of impeachment in the Constitution. In this case, the fears of the Founders, and our fears, line up perfectly.

 

One would think that the most fervent supporters of originalism would also support the impeachment effort, but this is not the case, at least not yet. If impeachment is not warranted under these circumstances, then when? Still, it is not too late for otherwise good people to support upholding the rule of law, the Constitution, and the original intent of the Framers. Even if there are not enough senators to ultimately remove the president, the addition of Republican support would lend the entire process bipartisanship, help better inform the people of Trump’s nefarious actions, and lead to his removal at the ballot box in 2020.

1949: A Crucial Year for America, Russia, China and the World

Mao beside Joseph Stalin at a ceremony celebrating Stalin's 71st birthday in Moscow in December 1949

 

Recently, I attended a birthday party for my nephew, who was born in 1949. Reflecting on the year of his birth, I was struck by how significant 1949 was in American, Russian and Chinese history. It’s valuable to revisit that year from a distance of 70 years and to reflect on how global events intertwine with personal histories. Three events of global importance came to mind.

 

First, on April 4, 1949, in Washington, D.C., the United States joined Canada and ten European countries in forming—for better or worse—the North Atlantic Treaty Organization (NATO). This was a military alliance whose purpose was to thwart a presumed Soviet invasion of Western Europe.

 

For the first time in American history, the U.S., going against the advice of George Washington and Thomas Jefferson, joined a “permanent” and “entangling” military alliance. As a result, U.S. troops would be permanently stationed in Europe during peacetime—also a first in American history.

 

One of the members of NATO was Turkey. The ultimate result of this for both me and my nephew was that in 1972 we both lived in Turkey. He was in the U.S. Air Force, stationed at its base in Adana, Turkey. At the same time, I was living in Istanbul and teaching U.S. foreign policy for the University of Maryland on a Turkish army base twenty miles west of Istanbul and at a U.S. air station at Karamursel, Turkey, on the southern shore of the Sea of Marmara.

 

Second, in late August 1949, the Soviet Union exploded its first atomic bomb. America’s nuclear monopoly, which had lasted for four years after the bombings of Hiroshima and Nagasaki, was over. The U.S. and the Soviet Union now raced to increase their nuclear stockpiles. The era of MAD (Mutually Assured Destruction) had begun.

 

This rivalry came to a climax during those frightening “thirteen days” in October 1962 known as the Cuban Missile Crisis, when both Moscow and Washington peered into the horrifying abyss of total mutual annihilation. Fortunately, both stepped back before going over that tragic cliff. The crisis was settled in large part when President John F. Kennedy secretly agreed to pull U.S. nuclear missiles out of Turkey.

 

At the time, I was an undergraduate and did not fully appreciate the danger the world was then in. The full sense of the vulnerability of the human species did not strike home until I began teaching classes on the subject in 1986, which I continued to do for the next twenty years.

 

Third, on October 1, 1949, Mao Tse Tung, Chairman of the Chinese Communist Party, stood on a balcony overlooking the Gate of Heavenly Peace and Tiananmen Square in Beijing and announced the entry of the People’s Republic of China into international affairs. All of Asia and the West would never be the same again.

 

After this announcement, Mao proclaimed, “China has stood up.” In saying this, Mao reminded his fellow citizens of China’s “Century of Humiliation” (1842–1949), suffered at the hands of Western nations and Japan. The events of that period are seared into the historical consciousness of the Chinese people.

 

If one looks at China’s present place in global affairs, it is readily apparent that, seventy years after Mao made this statement, China is indeed standing up. It is no exaggeration to say that China is standing tall, and the economic and political shadow it casts across the globe is astounding. Today, for the first time since the reign of King George III of England in the 18th century, the U.S. and the West are being seriously challenged by a non-Western, non-Christian and non-white civilization.

 

The time is not far off when China may well be standing taller, economically, than the United States. President Barack Obama once referred to Sino-American relations as the most important determinant of whether life in the twenty-first century will be more peaceful than in the previous one.

 

The truth of Obama’s observation has dawned on me only recently. Since my retirement from the University at Albany, my life has brought me to the Asia, Japan and Korea Societies in New York. There, I have had the singular pleasure of learning about events in Asia from such august instructors as Kevin Rudd, President of the Asia Society Policy Institute and 26th Prime Minister of Australia, and Orville Schell, an impressive China scholar who directs the Asia Society’s Center on U.S.-China Relations. They, and the esteemed experts at the Japan and Korea Societies, are filling in the large gaps of ignorance in the education of this rank amateur in Asian history and affairs.

 

So, the world has turned over many times since 1949. And the next seventy years will, I’m sure, to quote Mao Tse Tung, be “interesting times.” We can all agree that, given these three historical events, 1949 was a most significant year.

What Can Historians Do? An Update from Historians for Peace and Democracy

 

In May 2019 Historians for Peace and Democracy (H-PAD) convened a national strategy meeting of over 60 historians at Columbia University to discuss how we, as historians, could confront the current political crisis in the United States. Since that time, H-PAD has been engaged in integrating new members into the organization, setting up new working groups, and producing educational materials for public distribution. 

 

Participants at the May meeting formed four working groups: Empire, Immigration, K-12, and Palestine. Each group has since formulated a mission statement. The Empire working group “will seek to connect with historians as citizens, educators, and scholars. A subcommittee of the working group, under the leadership of Molly Nolan, will prepare curricular materials on the topic of the US Empire. A second subcommittee, coordinated by Rusti Eisenberg and Prasannan Parthasarathi, will mobilize historians across the country to pressure Congress on issues of war, militarism and foreign policy.”

 

The Immigration working group “works to educate scholars, activists, and the general public about the history and current reality of immigrants in the Americas, most especially in the United States. It seeks to mobilize people to take action in support of immigrants’ human rights and for a just, humane immigration policy. We condemn all U.S. government policies that fail to welcome asylum seekers and refugees to the United States.”

 

The Immigration group plans to present a resolution at the January 2020 AHA meeting condemning affiliations between institutions of higher education and the detention/deportation apparatus embodied by ICE and Border Patrol. In order to be placed on the agenda, the resolution needs 100 AHA members to sign on by October 30. Please read the resolution and sign on. And if you are at the AHA meeting, please attend the business meeting to state your opinion.

 

The goal of the K-12 organizing committee is “to create and build partnerships between K-12 educators, academics and historians in order to affect the way in which history/social studies is taught and learned in the schools. In addition, the committee wishes to break down the artificial barriers that exist among and between K-12 teachers, history (and other relevant) departments and schools of education.”  

 

The Palestine working group states that “we have come together to work to educate ourselves, the profession, and the general public to counter the misinformation and censorship of ideas by opening space for critical debate rooted in scholarly expertise. We will work together to challenge the Israeli occupation of Palestine, and U.S. support for Israel. We seek to mobilize people to take action to counter censorship and suppression of academic freedom in universities in the U.S.” 

 

If you are interested in joining any of the groups, please contact the coordinators. They are Molly Nolan and Prasannan Parthasarathi for the Empire working group, Alex Aviña and Margaret Power for the Immigration working group, Alan Singer and Barbara Winslow for K-12, and Leena Dallasheh and Robyn Spencer for the Palestine working group.

 

Making history accessible to all remains our fundamental priority. One reason why U.S. policymakers and corporations can get away with their crimes is that most members of the U.S. public lack the relevant historical knowledge that would enable them to see through the deception. In the past two years our public history team has released short “broadsides” on historical topics of interest to the general public. In the last few months we have published four new broadsides: “Puerto Rico: A U.S. Colony in the Caribbean”; “Our Debt to Central American Refugees”; and the two-part “Why the United States Is Not a True Democracy.” These, and all of our previous broadsides, can be found on our website.

 

In order to reach new audiences in the age of social media, H-PAD has also begun producing an exciting video series called Liberating History. These videos consist of short, accessible interviews that place current events in historical context. To date we have produced two episodes. In Episode 1, titled “Trump Administration Policy in the Middle East: A Cruel Continuity,” Irene Gendzier explains the historical roots of Trump’s policies toward Iran, Saudi Arabia, Israel/Palestine, and the rest of the region. 

 

In Episode 2, “The Structure of Punishment: Crack and the Rise of Mass Incarceration,” Donna Murch traces the historical origins of the U.S. “war on drugs” and the system of mass incarceration that accompanies it, focusing on the racist and hypocritical policing of Black crack cocaine users. All videos can be accessed on our website and on our new YouTube channel.

 

If you would like more information about H-PAD, please visit our website or email us. There are many ways to be involved, for instance by writing a broadside, participating in a working group, or donating to help us produce more videos. We welcome everyone who agrees with our mission, whether professional historians, K-12 teachers, or non-historians with an interest in putting history at the service of movements for peace, democracy, and justice. 

“The Uplift of All” Through Nonviolent Direct Action

 

Palo Alto, California

 

“There is no limit to extending our services to our neighbors across State-made frontiers. God never made those frontiers.” – Mahatma Gandhi

 

“All men [and women] are caught in an inescapable network of mutuality, tied in a single garment of destiny.” – Martin Luther King, Jr.

 

From street battles in Hong Kong to the battlefields of Syria to sweltering refugee camps in Asia, the Middle East, Africa, and Central America, from “every hill and molehill,” as Martin Luther King would say, people demand freedom and want to know: how do we get it? Mahatma Gandhi and King had no monopoly on answers to that question. Sometimes people have no choice, King said, but to defend themselves, and past movements cannot be replicated. Nonetheless, Gandhi and King did offer a critical, philosophical, strategic and tactical road map for creating mass struggles for change. It is called nonviolent direct action.

 

Prompted in part by the 150th anniversary of Gandhi’s birth in 1869, a diverse gathering of hundreds met at Stanford University in mid-October for a conference titled “The Uplift of All: Gandhi, King, and the Global Struggle for Freedom and Justice.” Initiated by Dr. Clayborne Carson, Director of the Martin Luther King, Jr., Research and Education Institute, this conference connected us to the Gandhi-King Global Initiative. It includes leaders from both India and the U.S. in an effort “to build an international network of institutions, organizations, and activists committed to the nonviolent struggle for human rights.” Gandhi and King’s broad concerns about war, poverty, environmental crises, and human rights denials often go unspoken, so our gathering sought to bring their full teachings to light, to blend learnings from past nonviolent freedom and justice movements, and to apply them to the twenty-first century. 

 

Highlights included Dr. Carson’s public interviews with descendants of Gandhi, Ela Gandhi and Rajmohan Gandhi, whose warm handshake and gentle manner made me feel a direct connection to the Mahatma. Juanita Chavez and Anthony Chavez, descendants of farm worker organizer Cesar Chavez, illustrated how the legacy of nonviolence organizing lives on among Latino/as in California. Martin Luther King III reminded us that, more than ever, humankind faces a choice between “nonviolence and nonexistence.” His daughter Yolanda Renee, who at age 9 had thrilled thousands protesting the murder of students in Parkland, Florida, at the March for Our Lives rally in the nation’s capital last year, affirmed, “I have a dream: this should be a gun-free world.”

 

Conference participants discussed gun violence, racism and misogyny, war, the global environmental catastrophe, and other crises. A high point of discussion came from Rev. James M. Lawson, Jr., a treasured link to King’s nonviolence philosophy and practical organizing. At ninety-one years of age, Lawson has trained people in nonviolent direct action in civil rights, labor, peace, civil and immigrant rights and other liberation movements from the mid-1950s to the present, highlighting the nonviolent quest to radically reconstruct society without adding more harm. He darkly declared that a “mix of mean spirits” accumulated over generations threatens our existence and that of the planet, but laid out specific examples of how Gandhi’s satyagraha, or “Love in action,” “spirit force,” “soul force,” can create “a force more powerful” than the oppressive forces that dominate the world. He told us that activism by itself is not enough and that movements often fail because they lack a step-by-step organizing framework. Gandhi focused on ten steps, King on six steps, while Lawson has boiled nonviolent direct action down to four steps. In his model, in-depth, dedicated organizing campaigns often begin with small groups at the local level that create a groundswell for national movements. That first step, which he calls “preparation for nonviolent struggle,” requires a focus on analyzing issues, deciding who makes decisions, and locating the levers of power that can make your opponents say yes when they want to say no. Preparation also includes training in nonviolence discipline and sacrifice that masses can follow when confronting those in power, who always have more weapons. Mary King, Director of the Lawson Institute, directed us to sources on its website, and linked nonviolence efforts back to the 1960s Student Nonviolent Coordinating Committee.

 

In Lawson’s framework, dedicated organizers need a clear plan for direct action with an exit strategy, whereby agreements can be reached and implemented through negotiations to obtain measurable, attainable victories. Without an end point, movements die. To get there, nonviolence leaves room for reconciliation, without which societies do not move forward. Lawson invokes the force of life we are given at birth, which sustains us, in order to make nonviolence “a force more powerful.” If his view seems overly optimistic or even naïve, he asks us to range back through history to see that “waging nonviolence” does in fact work and can create the lasting change that violence fails to produce, because violence does not resolve underlying issues.

 

Nonviolent struggle links you backward and forward to generations of people who have changed the world, and it provides a personal link to others that can sustain a life of activism. “When you help others, you see your doubts in yourself melt away,” the death penalty opponent Sister Helen Prejean told us. Lawson’s brother Phillip offered his experiences as a Methodist minister working in black and brown communities facing poverty and violence in the Midwest and the San Francisco Bay Area. He felt that putting his life in the service of others, though dangerous, led to a meaningful life. He urged us to create relationships with the poor and the oppressed in whatever ways we can. We are born to die, but the “art of dying” well is grounded in a practice of love that allows you to discover the full sharing of life. “We are not in control, yet life calls on us in various ways to say either yes or no” to action, he said.

 

As we left the gathering that weekend, I could not sum up what it all meant, but Phillip Lawson’s parting words echoed in my head: “Let loose of the things that bind you and enter into new relationships and each day experience a newness that bubbles up inside of you.” Whatever the outcome and whatever the hardships, “choose life, wonderful and joyous.”

 

 

Did Jefferson Think Humans Occupied a Privileged Position in the Cosmos?

 

Philosopher and mathematician Blaise Pascal writes in his Pensées: “In the end, what is man in nature?  A nothing compared to the infinite, and everything compared to the nothing, a midpoint between nothing and everything, infinitely removed from understanding the extremes:  the end of things and their principle are hopelessly hidden from him in an impenetrable secret.” The sentiment was not an existential gaffe, motivated by utter human despair. It was a sincere attempt by a deeply religious illuminato to understand man’s place in the cosmos, whose secrets were increasingly becoming known to investigators.

 

The supreme triumph would come, after Pascal’s death, with publication of Isaac Newton’s Principia Mathematica in 1687. Newton, who built on the findings of men such as Brahe, Kepler, and Galileo, discovered laws that were deemed applicable to all bodies far and wide, and the cosmos, from the time of early Greek antiquity with Anaximander and Democritus, had been thought to be of gargantuan proportion, if not infinite in extent. In Ptolemy’s (second century A.D.) words, “The earth has the ratio of a point to the heavens”—a sentiment iterated in 1543 by Copernicus, who merely replaced the position of the earth with that of the sun.

 

Newton’s discoveries set off an array of reactions by other illuminati of the day.

 

The French mathematician Pierre Simon Laplace—noting that all bodies, humans being bodies too, were subsumable under the law of gravitational attraction—thought that humans held no special status in the cosmos. A man who leaps from a steep precipice will invariably fall in the same manner as a stone dropped from the same precipice. Laplace famously responded when asked about God’s role in the cosmos: “I have no need of that hypothesis.”

 

On the other hand, philosopher Baruch Spinoza said that God was the infinite and necessarily existing substance of the universe. “By God I understand a being absolutely infinite, i.e., a substance consisting of an infinity of attributes, of which each one expresses an eternal and infinite essence.” Spinoza’s God was the universe.

 

Still others, having marveled at the nomological status of the cosmos revealed by Newton and working backward to a cause of that structure, saw God as the Demiourgos or Craftsman of the cosmos. David Hume was among them. He wrote through the mouth of Philo in Dialogues Concerning Natural Religion: “That the works of nature bear a great analogy to the productions of art is evident…. Here, then, the existence of a DEITY is plainly ascertained by reason.” There was the proviso, however, that the cosmos, as a crafted thing, was prodigiously unlike, in size and complexity, anything humanly crafted.

 

Just where did humans fit in the wondrous and gargantuan cosmos for Thomas Jefferson? Were they, as Pascal had said, mere nothings in comparison to everything and everything in comparison with nothing?

 

Jefferson, for the most part, followed the Humean path.

 

In his essay “On Suicide,” Hume says: “The providence of the deity appears not immediately in any operation, but governs every thing by those general and immutable laws, which have been established from the beginning of time. All events, in one sense, may be pronounced the action of the almighty: They proceed from those powers, with which he has endowed his creatures.” Even the actions of humans are god-caused, as the human faculties are as much the workmanship of deity as are the laws of motion and gravitation. “When the passions play, when the judgment dictates, when the limbs obey; this is all the operation of God; and upon these animate principles, as well as upon the inanimate, has he established the government of the universe.” Moreover, no event is unimportant to deity, “who takes in, at one glance, the most distant regions of the space and remotest periods of time,” and all events are subject to the “general laws that govern the universe.” There is no room for free human agency.

 

Hume’s deity is manifest through observation of the cosmos, but his activity is indirect rather than direct. As with the ancient Stoics, event is linked with event by causal concatenations such that no event is arbitrary. Deity is responsible for all bodies in the universe, their patterns of behavior, and the laws fixing such patterns. In that regard, “all events … may be pronounced the action of the almighty.” If there is a small break in the fixity of events, that breach is subtle—undetectable to human perception. Given deity’s equal attention to all details, we can grasp the sense of Hume’s intriguing comment later in the same essay, “The life of man is of no greater importance to the universe than that of an oyster”: it is not meant pejoratively, but as a verbal dislodgement of the anthropocentric biases humans might have concerning deity and the cosmos. The universe is a vast and complex network of events, causally concatenated, and humans, like oysters, are parts of that network.

 

Like Hume’s, Jefferson’s cosmos is no accident. It gives unmistakable evidence of design—cause linked with effect. He writes in a letter to John Adams (11 Apr. 1823) in a manner similar to Hume: “I hold (without appeal to revelation) that when we take a view of the Universe, in it’s parts general or particular, it is impossible for the human mind not to percieve [sic] and feel a conviction of design, consummate skill, and indefinite power in every atom of it’s composition.” The heavenly bodies are exactly held in course by “the balance of centrifugal and centripetal forces.” Given the earth’s structure—with the proper proportion of land, water, and air, and with minerals, vegetables, and animals perfectly organized and of numerous uses—“it is impossible … for the human mind not to believe that there is, in all this, design, cause and effect, up to an ultimate cause, a fabricator of all things from matter and motion, their preserver and regulator while permitted to exist in their present forms, and their regenerator into new and other forms.” In addition, there is evidence of superintendency. Old stars evanesce; new stars are born. Some races of animals have become extinct: “were there no restoring power, all existences might extinguish successively, one by one, until all should be reduced to a shapeless chaos.” So obvious is the design of the cosmos that for every atheist—atheism for Jefferson being inconsistent with morality—there are one million believers.

 

The causal structure literally bespeaks a creator that works not ex nihilo, but from matter and motion. Deity preserves, regulates, and regenerates. The regeneration is best attributable not to imprescriptible divine intervention in the regularity of nature—Jefferson’s eschewal of thaumaturgy in the construction of his two bibles is evidence of that—but to prescribed changes in things, written into the laws of nature, and dispositions of material things. Thus, superintendency works in the manner of a thermostat, once installed in a house, regulating the temperature of that house. The notions of “perceive” and “feel”—derived from Destutt de Tracy’s epistemology, and consistent with Lord Kames’ notion of intuitive perception—demonstrate that the argument is not analogical. On close inspection, one forms an immediate sensory impression, not unlike the impression one forms of a morally correct action, given the sanction of the moral sense.

 

Jefferson’s cosmos is substantially similar to that of Hume, though Hume—“the comparison of the universe to a machine of human contrivances is so obvious and natural, and is justified by so many instances of order and design in nature, that it must immediately strike all unprejudiced apprehensions, and procure universal approbation”—seems to argue for deity from analogy, not direct perception.

 

Finally, like Hume’s deity, Jefferson’s God is immune to supplication. Jefferson writes to Miles King (26 Sept. 1814): “[Deity] has formed us moral agents. Not that, in the perfection of his state, he can feel pain or pleasure from any thing we may do: he is far above our power: but that we may promote the happiness of those with whom he has placed us in society, by acting honestly towards all, benevolently to those who fall within our way, respecting sacredly their rights bodily and mental, and cherishing especially their freedom of conscience, as we value our own.” Though deity is immune to supplication, humans have been fashioned by God to be responsible for their own happiness.

 

So, did Jefferson think humans occupied a privileged position in the cosmos?

 

To answer that question, we must give some account of Jefferson’s view of humans’ moral sense. Whereas Hume argues that morality is a matter of sentiment, Jefferson describes morality as due to a god-granted sensory faculty. He writes to John Adams (14 Oct. 1816): “I believe that it is instinct, and innate, that the moral sense is as much a part of our constitution as that of feeling, seeing, or hearing; as a wise creator must have seen to be necessary in an animal destined to live in society: that every human mind feels pleasure in doing good to another.” Jefferson, in a letter to Peter Carr (19 Aug. 1785), compares the moral sense to a limb, whose functionality is perfected with proper use or is debilitated through overuse or underuse. The notion of proper use, for the sake of moral accountability, is critical, and here the analogies with a sensory organ or a limb invite different ways of cashing out proper use. Furthermore, the statement that it has been implanted “in an animal destined to live in society”—and this is a point that other scholars have failed to notice—does nothing to privilege humans in their rank among other animals. It makes it plausible that Jefferson believed other social animals—vertebrates such as bats, crows, elephants, dolphins, horses, and lions, and invertebrates such as ants, bees, termites, and wasps—have, because of their sociability, a similar, species-specific sense of morality.

That interpretation, however, creates a gargantuan difficulty. Jefferson’s writings are suffused with references to humans seemingly having a privileged position in the cosmos. Most of those references, however, are merely the result of political posture or epistolary politeness. Other references are consistent with Hume’s statement that oysters (and other living things) are the equals of humans when it comes to cosmic significance, because all are significant parts of a massive cosmic network of events, causally linked. There is much to be gained by reading Jefferson in a Humean light. How we think of who we are in large part determines who we are.

Citations Are a Metaphor for Erasure in American History

This past week, educators, politicians, and activists debated whether Americans should celebrate Columbus Day or Indigenous People's Day. Is the second Monday of October a day to commemorate the famous explorer or one to remember the people who endured imperial violence at his hands?

 

As a Black female historian and writer, I think this debate is directly connected to a key issue that historians routinely grapple with: citations. In many ways, the issue of citations (or the lack thereof) is a fitting metaphor for the erasure of Black people, people of color and women in American history. Citations determine how some voices are documented and remembered and how some go missing from the historical narrative.

 

Recently, many historians and others expressed concern about an article in the Washington Post which initially neglected to cite my book, The Weeping Time: Memory and the Largest Slave Auction in American History (it was later corrected). On the one hand, it was very refreshing and encouraging to see the Washington Post discuss such an important historical event and period, but the oversight of my work raised important issues beyond the article itself. My book, published by Cambridge University Press two years ago, took the same angle as the proposed project of the authors profiled in the Washington Post article. I tracked down 15% of the original 429 men, women and babies who were sold at that fateful sale on March 2-3, 1859. Others had previously written about "the Weeping Time," including Catherine Clinton, Daina Ramey Berry, the late Malcolm Bell, landscape architect Kwesi DeGraft-Hanson and public official and Darien resident Griffin Lotson, but my book was the first full-length scholarly monograph to track the descendants of the auction to the present day. I also highlighted the research some descendants have done on their own history.

 

I embarked on this ten-year research project (which is ongoing) to give voice to the enslaved instead of centering the perspective of white slaveholders. Using contemporary newspaper accounts, slave narratives, census records, and birth and death certificates, as well as oral history accounts, I centered the voices of those who were most adversely impacted by this auction: the enslaved. At the opening of the book, I wanted to take readers on a journey back to 19th-century antebellum America--to the scene of the crime, if you will--to the auction block that routinely separated families. I told the story of cotton hand Jeffrey and rice hand Dorcas, who were engaged to be married but were then ripped apart.

 

I wanted readers to see and hear what they experienced. There can be nothing sensational about the raw experience of losing a loved one. There is no drama that needs to be added to a scene in which a person on whom you have pinned your hopes, dreams, and future progeny is no longer a part of your story. I wanted readers to see and experience the horror of that separation--a separation that some of the families at our border are experiencing today.

 

Additionally, this research revealed the remarkable resilience of African American families. This was particularly evident in the 15% sample that I followed from the auction to the present day through their descendants. The Civil War broke out only two years after the auction, and some of those who had been sold joined the fight for their own freedom. When the war was over, they then fought to exercise the full fruits of freedom: obtaining an education, diversified work opportunities, and voting rights. Exercising their rights also included buying land and making it profitable, as Karen Bell's excellent award-winning book, Claiming Freedom: Race, Kinship and Land in 19th Century Georgia, makes so abundantly clear. Many chose to formally marry, since marriage as an institution beyond jumping the broom had not been universally available to the enslaved, as Tera W. Hunter brings to light in her book, Bound in Wedlock.

 

Today, a number of these descendants are productive members of their community. They are public officials, educators, cultural interpreters, business people and students. That said, the historical record reveals evidence of early deaths, land grabs and discrimination. Life for many became a series of migrations and weeping times--North and West in search of a better life or away from the terror of the KKK.

 

But overall, I found that their story did not end on the auction block. I was struck by how easily this and other untold stories of the African American experience have slipped out of the collective memory of America. Some voices are heard and others muted.

 

This issue pertains not just to what historical narratives we learn and remember, but also to who writes this history. For a long time, the Western historical field was dominated by men, in particular white men, yet today more and more women and people of color are historians. 

 

Against all odds, the indomitable historian and civil rights icon Mary Frances Berry and the influential Nell Painter challenged the largely one-dimensional perspective of the field and paved the way for many of us. Berry is still blazing trails with her latest work, History Teaches Us to Resist. Deborah Gray White's pathbreaking Ar'n't I a Woman, about the unique experience of Black female slaves, almost did not get published. Now it is a standard in the field. The work of these women and so many more pioneers--including the incomparable Toni Morrison and Paule Marshall--has enriched the historical record by telling stories that otherwise would not have been told.

 

Still, great challenges remain.

 

It is for this reason that there was a considerable reaction on social media to what many saw as my erasure in the Washington Post article. Henry Louis Gates Jr. and James Swanson have apologized for this oversight, and the reporter, Michael E. Ruane, has since added a line that acknowledged my work. Although this erasure in the article has been corrected, it raises the very issue of erasure of the Black experience and especially the Black female experience, including Black female scholars.

 

My hope is that this incident reminds us that we don't need more "Christopher Columbus moments," in the words of Tulane historian Roseanne Adderley. No more "discovering" places that someone else already occupies. And though we cannot rewrite history--Christopher Columbus did come to the Americas in 1492--we should not write him out of history. Inspired by the best aspects of the civil rights and feminist movements, we should not write anyone else out of history either.

 

There was also a ray of hope in the midst of this debate. Many interested parties gave voice to their concerns. In so doing, they affirmed the importance of citations and acknowledgment. Women and men of all different backgrounds--Black, White, Asian, Latino, South American--demonstrated their investment in African Diaspora Studies in World History. There were also many in the general public who voiced concerns, showing a larger interest in history and scholarship. In times like these, we all need to be students of history who know that the stories we tell about our past deeply affect how we live, work and relate to each other in the present.

 

No one does anything of worth alone.  We cite our sources because the work of others has made our work possible. We are building an edifice and each of us has a brick to lay. But more than that, especially for those of us who are people of color and/or who are women who have stood for so long at the margins of society, we are adding our voices to a literal and intellectual marketplace which once saw us only as commodities. 

 

Yet as the debate over Columbus/Indigenous Peoples Day comes to a close this year, we can affirm that we are commodities no more. 

 

All our stories matter. All our voices matter.

 

 

We Must Not Grow Numb To The Yazidi Genocide

Yazidis on the mountain of Sinjar, Iraqi–Syrian border, 1920s

 

A few weeks ago, I had an opportunity to meet with six members of the Yazidi community, including journalists and activists. I must admit that although I knew about, and was very shaken when I first learned of, the horrifying attack by ISIS on the Yazidi community in Sinjar, Iraq, listening to their account of what actually happened still stunned me beyond description. What was just as shocking and deeply disturbing is the fact that the international community has remained largely numb to the genocide perpetrated by ISIS against this uniquely peaceful and caring community beginning in August 2014. More than five years later, the Yazidis still suffer from the horror of those tragic events, which will continue to haunt them as long as there are no specific plans by the Iraqi government and the international community to bring their plight to a humanely satisfactory conclusion.

To put the story of the Yazidis in context, a brief background is in order. The name Yazidi comes from the Middle Persian Yazad, which simply means “divine being.” They share many aspects of Christianity and Islam. Their supreme being is known as Xwedê, who is beyond worldly affairs and is not prayed to directly. They have their own language and culture, and their centuries-old religion is among the oldest monotheistic pre-Abrahamic faiths. Despite centuries of persecution, the Yazidis have never forsaken their faith, which only attests to their remarkable sense of identity and strength of character. They are not well known globally; their resources are limited, and their political influence is negligible.

During the 2014 assault on Sinjar, ISIS targeted the Yazidis with mass killing and forcible transfer. It is estimated that between 5,000 and 10,000 men and boys were executed in the immediate aftermath of the attack, which is tantamount to genocide. Nearly 7,000 women and children were abducted, sold as slaves, and subjected to torture and sexual violence, while hundreds of thousands fled. More than 70 mass graves have been discovered in the region. Moreover, ISIS obliterated the agricultural resources of many rural communities, destroying wells, orchards, and infrastructure to prevent the Yazidis’ return to their homeland. What is especially worrisome is that ISIS sleeper cells remain, ready to strike again, especially now in the aftermath of the Turkish invasion of Kurdish territory in Syria, which led to the release or escape of thousands of ISIS prisoners. ISIS’ objective in the area was, and still is, the extermination of the Yazidis, among other ethnic and religious groups.

The group most affected by ISIS’ unfathomable atrocities is young Yazidi boys and girls, who are suffering from untreated psychological trauma further aggravated by continuing hopelessness and despair. It is estimated that more than 300,000 Yazidis are still living in emergency camps under terrible conditions that remain just as bad as they were during their first few weeks of exile. Particularly painful is the fact that hundreds of freed Yazidi women now face the stigma of bearing children fathered by ISIS fighters as a result of rape while in captivity. These women face banishment from their home community, and their children born of non-Yazidi fathers will grow up with no sense of belonging, as they have no place among them. They will become increasingly vulnerable and easy prey for recruitment by extremist groups.
Back in Sinjar, however, sporadic violence continues, reconstruction work has slowed considerably, and drinking water for schools and hospitals is scarce. One of the biggest problems is the presence of the Popular Mobilization Unit (PMU), an Iraqi government-sanctioned paramilitary force backed by Iran. The force is predominantly Shiite and as dangerous as ISIS, and the Yazidis in the area are terrified that if the PMU ends up taking over the entire area, their condition, and ultimately their fate, will be even worse. Maria Fantappie, the senior advisor on Iraq at the International Crisis Group, said that “Despite having been freed from ISIS presence… the region de facto remains an occupied district where competing Iraqi and foreign agendas play out by coopting Yazidis into rival armed groups.”

Despite their outcry for help and their continued suffering, the international community is not providing the financial aid needed to rebuild the Yazidis’ homes and villages. They try desperately to make their case known to whomever they can talk to, especially passing journalists who can spread the word about their unbearable condition. There are those who suggest that the Yazidis would be better off moving to other countries because Iraq is not safe and much of their land is occupied by various militias, which obviously is not the answer. Most Yazidis want to go back to the homeland where they have lived and died for millennia. The 2018 Nobel Peace Prize co-winner Nadia Murad, who was recognized for her efforts to end sexual violence, stated: “We suffered but didn’t give up. We were not helped and rescued when ISIS attacked, but I hope this recognition means that the international community will help us recover from this genocide and will prevent such attacks against other communities like us in the future.”

A number of measures must be taken immediately to prevent further displacement and deprivation of the kind the Yazidis have endured so painfully. The media campaign that began at the fifth anniversary—#DoNotForgetUs—designed to draw international attention, has thus far produced limited results. Nevertheless, this campaign must continue and be supported by the EU and the US. Western powers, along with the Iraqi government, should initially provide a relatively small amount of $250 million, which can go a long way toward beginning the process of rehabilitation. The US should assume greater responsibility for the plight of the Yazidis, as the rise of ISIS is a horrific byproduct of the ill-fated Iraq war. The US, with the support of European powers, should push the UN to establish better monitoring for early warning signs of impending atrocities and to preserve evidence of the Yazidis’ genocide, which will be critically important for prosecuting ISIS fighters for their unspeakable crimes. In addition, mass graves should be exhumed by a special UN contingent to identify some of the victims, and the UN should push for the creation of an international tribunal to put high-ranking ISIS commanders on trial and charge them with crimes against humanity.

As long as ISIS and other militias continue to operate in the area, the Iraqi government should fund the recruitment of locals to form a regional security force. Such a local force will have a vested interest in protecting its own areas, and it should be bolstered by a fairly small US and EU military force based in Sinjar to protect the Yazidis and allow their exiles to return home.
The Iraqi government must also bear a special responsibility to restore some normalcy by appropriating the funding necessary to rebuild schools, hospitals, and infrastructure, and to return services to Sinjar. Having suffered atrocities of such magnitude with little outside help coming to their rescue, the Yazidis, for good reason, no longer trust anyone; they feel betrayed and abandoned to the mercy of ISIS. It is critical, then, to begin a process of reconciliation that would allow trust to be nurtured, which can gradually be realized if the above measures are in fact implemented in good faith.

The international community cannot grow numb to genocide, as it will continue to haunt us only with greater force. The Yazidis have paid the ultimate price, and no other ethnic group should be subjected to the same fate by any perpetrator acting with impunity and met with apathy from the international community.

Roundup Top 10!  

So you want to talk about lynching? Understand this first.

by Michele Norris

Let’s face it as the terror and the terrorism it was.

 

The Constitution isn’t the cure for President Trump. It is the cause.

by Shira Lurie

Democrats point to President Trump’s violations of the Constitution, but the document’s undemocratic foundations enable him.

 

 

Killing Me Softly with Militarism

by William J. Astore

Besides TV shows, movies, and commercials, there are many signs of the increasing embrace of militarized values and attitudes in this country. The result: the acceptance of a military in places where it shouldn’t be, one that’s over-celebrated, over-hyped, and given far too much money and cultural authority, while becoming virtually immune to serious criticism.

 

 

Climate Change Will Cost Us Even More Than We Think

by Naomi Oreskes and Nicholas Stern

Economists greatly underestimate the price tag on harsher weather and higher seas. Why is that?

 

 

The Enduring Power of Anticapitalism in American Politics

by Jamelle Bouie

From Debs to Sanders to Ocasio-Cortez, an ideal persists.

 

 

Why China should recognize that dissent can be patriotic

by Charlotte Brooks

History suggests that narrowly defining Chinese identity will backfire.

 

 

Trump’s Increasingly Weird Attempts to Compare Himself to Lincoln

by Sidney Blumenthal

Time and again, Trump has compared himself favorably to the sixteenth President, boasting, for example, that his poll numbers are higher—although, of course, there were no polls in the nineteenth century.

 

 

Pelosi Has History and the Constitution at Her Back

by Caroline Fredrickson

When the president runs amok, the House has a congressional duty to step in and provide oversight.

 

 

Have you heard of the catastrophic men theory of history? Step forward Boris Johnson...

by Nick Cohen

Self-interested and reckless leadership defines too much of our past – and present.


 

The 10 most misleading American historical sites

by James Loewen

Historical plaques are often anything but informative. Here are some of the worst offenders.

The Internet At 50: How the Dot-Com Bubble Burst

 

This is the second article in a series reflecting on the Internet at 50. For the first article, click here

 

As the new millennium began, greed, ignorance, and misplaced hopes within the tech world nearly destroyed the financial potential of the internet. But as described by Harlan Lebo, author of 100 Days: How Four Events in 1969 Shaped America (Amazon, Barnes & Noble), the real message that emerged after the dot-com bubble burst had even more important implications for the role of the internet as an enduring global force.

 

* * * * * * * *

 

“When will the Internet Bubble burst? For scores of 'Net upstarts, that unpleasant popping sound is likely to be heard before the end of this year.”

                                                – Jack Willoughby, Barron’s, March 2000

 

* * * * * * * *

 

 

It was too good to last.

 

By the late 1990s, the internet had evolved beyond anything that the pioneers of digital technology could have imagined 30 years earlier.  From the first crude connections that had linked computers for academics and government agencies, the internet had blossomed into a dynamic and wildly-popular technology for a rapidly-growing public audience. 

 

And with that popularity, the internet became a river of investment opportunity and potential profits for dot-com developers and entrepreneurs. 

 

The formation of online companies – quickly dubbed “dot-coms” – became the business trend of the decade. With the almost-daily unveiling of new dot-com enterprises, multi-million-dollar investment deals, and even bigger stock offerings, the prospects for a new era of internet-based business never looked brighter.

 

From the mid-1990s until 2000, investing in budding dot-coms was the wildest of rides, expanding within an aura of wealth, power, and optimism that had become the hallmarks of the go-go internet world.

 

Lavish spending on marketing reached a high-profile peak on January 30, 2000, when 14 dot-com companies each paid more than $2 million to advertise during Super Bowl XXXIV – inspiring the game to be called the “Dot.com Super Bowl.”

 

But behind the extravagant spending and flashy deals festered a problem – a simple, disaster-provoking problem: for the most part, neither the new dot-com companies nor the investors who bought into them had the slightest idea what they were doing. 

 

* * * * * * * *

 

Much of the “growth” of new internet companies was a façade, an industry fed by novelty and perceived investment potential – but in most cases without planning or financial evidence to back up the talk. Hard-boiled financiers threw common sense out the window, investing in companies that, with even a moment of consideration, would have been viewed as the most absurd folly.

 

In retrospect, investment mistakes are always crystal-clear, but even so, the depth of the miscalculations in the late 1990s now seems unfathomable.

 

“Investors desperately, desperately wanted the dot-coms to succeed,” said Jeffrey Cole, director of the Center for the Digital Future at USC Annenberg. “Company management offered promises about the potential for their startups, and backers had expectations that had nothing to do with reality.

 

“The dot-com bubble,” Cole said, “was business plans written on the backs of napkins.”

 

The problem for many of the start-up companies was demonstrated in a single question from editor Rich Karlgaard to a young vice-president of “business development” at a start-up. When Karlgaard asked if the dot-com was profitable, the executive said, “We’re a pre-revenue company.”

 

In 2000, the bubble burst.

 

* * * * * * * *

 

What pin had pricked the surface? Some warnings had been coming from calmer voices, but the reckless types viewed the alerts as unwelcome noise. With legions of companies operating with no rational business plans for short-term survival – let alone long-term success – and most roaring ahead with a “grow big, grow fast” mentality, the collapse was inevitable. 

 

On March 10, the prices of dot-com stocks peaked – the slide began.

 

An indisputable alarm came on March 20, 2000, when Barron’s, the weekly financial magazine, splashed its cover with drawings of mounds of cash on fire behind the headline “Burning Fast.” The issue featured a study of more than 200 internet firms, with the publication’s analysis of “which ones could go up in flames, and when.” 

 

“When will the Internet Bubble burst?” asked columnist Jack Willoughby in his column titled “Burning Up” that preceded the study. “For scores of 'Net upstarts, that unpleasant popping sound is likely to be heard before the end of this year.” 

 

Barron’s followed up the original story three months later, this time with “burn rates” for internet companies that were blazing through their cash at the end of 1999; by the time the list appeared in Barron’s, the problems were much worse. At the top of the list of companies draining their reserves were such now-forgotten names as Netzee, CDnow, Boo, Beenz, eToys, Flooz, Kozmo, and Netivation; none would survive. For many other dot-coms as well, the cash from investors was beginning to run out. 

 

By April 6, dot-com stocks had lost nearly $1 trillion in stock value.

 

* * * * * * * *

 

The consequences of the bubble’s burst dragged on for several years – the worst of them in 2000 and 2001 – as a growing list of dot-com companies floundered under the weight of too-high expectations and too-low revenue. 

 

The fate of two companies in particular tells much of the story of the business misjudgments and the misplaced investor enthusiasm that created the dot-com collapse. Perhaps the most high-visibility example of the peak and downfall was Pets.com, which called itself “a new breed of pet store.” 

 

Pets.com debuted in February 1999 – with financing from some of the premier venture capital firms – selling a full line of supplies for America’s pet owners. Marketing for Pets.com was backed by plenty of traditional print advertising, but it was the company’s mascot, a sock puppet of a ragged-eared dog, that appeared in dozens of television commercials and became the company’s high-profile face to the public.

 

The puppet (voiced by comedian Michael Ian Black) became instantly popular with a celebrity presence that extended far beyond corporate marketing: the puppet was “interviewed” on talk shows, and had his own giant helium balloon in the 1999 Macy’s Thanksgiving Day parade. 

 

But within months, the puppet would become the poster child for the entire meltdown.

 

Even with such a high-visibility position in retailing, Pets.com was never a sustainable enterprise. The company lost money almost every time a purchase was made, as it sold millions of dollars’ worth of products for as little as one-third of their cost in the hopes that customers could be converted to high-margin buying. 

 

In spring 2000, Pets.com spent $17 million on sales and marketing, at the same time bringing in half that much in revenue. By autumn, the company was spending $158 for each customer it acquired.

 

(Perhaps the Pets.com leadership should have heeded the words of their own mascot; among the puppet’s many antics in commercials, it could often be heard singing the first line from the song, “Spinning Wheel,” by Blood, Sweat, and Tears: “what goes up, must come down….”)

 

Later, many would ask: why had investors sunk so much money (literally) into the company?

 

“Perhaps venture capitalists should have been leery of Pets,” wrote tech columnist Mike Tarsala, “since even off-line retailers barely make any margin on pet food – the company's staple seller. The money came rolling in anyway.”

 

The company’s strategy could not last; less than a year after the puppet balloon floated through Manhattan, on November 9, 2000, Pets.com stopped taking orders, and the company laid off most of its 320 employees. In June 2008, CNET named Pets.com as one of history’s greatest dot-com disasters.

 

The demise of Pets.com may have been a high-profile debacle, but other meltdowns were even more costly, including several that showed just how unaware dot-com investors could be – even when alerted to problems. 

 

Possibly the worst of all was Webvan.com, the grocery delivery service, which opened in 1996, operated by a team of executives – not one of whom had management experience in the supermarket industry.

 

When Webvan stock went on sale in November 1999 – in spite of public notices that the company had already lost more than $65 million for the year and warnings of losses for “the foreseeable future” – it sold for 65 percent over its initial offering price.

 

With huge expenses – at one point committing $1 billion for construction of distribution centers and delivery trucks – Webvan expanded too quickly, its costs far outstripping its revenue by millions, then hundreds of millions. The prospects for attracting customers were unrealistic and the returns were low; on July 8, 2001, the company website carried the notice, "We're sorry. Our store is temporarily unavailable while it is being updated. It will be available again soon." 

 

The next morning, 2,000 Webvan employees were laid off, and the company closed – less than two years after its initial stock offering. Overall, the company lost $830 million – reportedly the largest of the dot-com disasters.

 

* * * * * * * *

 

But many of the more responsible dot-coms survived the bubble relatively unscathed, including eBay, Priceline, Craigslist, Monster, WebMD, and others that still thrive today. All were companies that had not over-promised and had not over-expanded, and each had something that almost all of the failed dot-coms had lacked: a thoughtful business model based on solid financial planning and realistic projections.

 

After the bubble, there were some well-earned opportunities for “I-told-you-sos.” In 1999, superstar investor Warren Buffett had warned early investors – those whose stock had risen based on unreasonable expectations – to get out before the end came. 

 

"After a heady experience of that kind," Buffett said of the gains in previous years, "normally sensible people drift into behavior akin to that of Cinderella at the ball. They know that overstaying the festivities...will eventually bring on pumpkins and mice." 

 

Buffett – whose purchases of companies in 2000 did not include a single technology firm – was pummeled by critics for his seeming lack of vision. But in 2001, with his investments intact, he looked back on the fallout, saying, “The fact is that a bubble market has allowed the creation of bubble companies – entities designed more with an eye to making money off investors rather than for them.”

 

When the dot-com dust had cleared, the results were gruesome: by 2004, more than half of new dot-coms – hundreds of companies – had failed. About $5 trillion in stock value was lost. Hundreds of cocky start-up executives who through initial stock offerings had been made instant millionaires – on paper at least – found themselves penniless. 

 

And thousands of employees – some estimates as high as 85,000 – confident that they had joined exciting and viable ventures, were abruptly on the street. The ripple effects also damaged the value of other successful dot-coms, and of computer and software companies as well.

 

Perhaps worse – but understandable given the financial debacle – investors temporarily lost faith in new dot-com investments, whether they were sustainable or not: in 1999, 107 start-ups doubled their stock value on the first day; in 2000, the number dropped to 67; by 2001, the number was zero.

 

Of the 14 dot-coms that advertised during the 2000 Super Bowl, five were gone in less than a year. For the next Super Bowl, E-Trade, a company that survived the bubble, produced a commercial that showed a chimp riding a horse through a ghost town of defunct dot-coms. The ad ended with a single line: “Invest Wisely.”

 

* * * * * * * *

 

As a cautionary tale and a business school lesson about irrational investor expectations, no modern example proved better than the dot-com bubble. But even more telling about the role of online technology in the American experience was the perception that emerged after the disaster – a hope, to some – that the internet was going to wither, if not disappear completely.

 

“After the bubble burst,” said Cole, “it was amazing to see how many people in industry assumed that the collapse meant the end of the internet itself.”

 

“We had been studying the internet since the early 90s,” Cole remembered, “and at meetings I would be asked, ‘now that this internet thing is over, what are you going to do now?’ They assumed that when the bubble burst, the usefulness of the internet had ended – and as a result they wouldn’t have to relearn how the business world works. 

 

“And I wasn’t just hearing this view from leadership in retail – it was journalists, advertising executives, and people in other fields as well. 

 

“But we knew,” Cole said, “that in spite of the bubble burst, a failure of the internet could not be farther from the truth.”

 

Those who watched the online world could see that not only was ‘the internet thing’ still relevant, but it was more popular than ever. 

 

Even while the dot-com debacle festered as daily news between 1999 and 2002, internet use did not decline at all – in fact, it continued to increase. By 2001, at the peak of the crash, more than 70 percent of Americans were internet users, and they were spending an increasing amount of time online every day, both at home and at work.

 

Even after the collapse of many dot-com retailers, the number of Americans who bought online grew as well; by 2001, half of internet users had also become internet buyers – and continued to buy online. 

 

In spite of the burst of the dot-com bubble, the message was clear: America had no intention of giving up on the internet.  

 

 

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173359 https://historynewsnetwork.org/article/173359 0
1600s Verona: Romeo, Juliet, and A Romantic Tragedy for the Ages

 

MATCH.COM  (circa 1604)

JULIET CAPULET

I’m 14 (going on 40) and they all say I need to make better choices in men. I am single. I’m looking for a man my age. I like men who get into swordfights and love talking to girls hanging off balconies.

I’m jubilant and carefree and avoid family strife (OMG, what a family I’ve got!). Love to travel. I love walking on the beach over at the Italian Riviera, a glass of wine in my hand, the Mediterranean breeze in my hair, listening to Baroque music, or sailing in yon gondola in Venice, looking up at the stars.

They say I’m witty and a good dancer, but a bit star-crossed. I’ll know my true love when I meet him, and it could be you!

Or, as I tell all my boyfriends, parting is such sweet sorrow.

Can’t write more – gotta get to the drug store.

Juliecap23@Verona.org

    

Who has not seen a version of William Shakespeare’s Romeo and Juliet – somewhere, somehow? Thousands of productions of the play have been staged since 1600. There has been a Broadway musical, West Side Story, that later became a hit movie; it was so popular that it is coming back to Broadway again in December. There have been silent movies of Romeo and Juliet and many more sound productions over the years. Even superstar actor Leonardo DiCaprio jumped into the Romeo and Juliet phenomenon, playing Romeo in a modern, urban movie version of the story. There has even been a ballet, by Prokofiev, that is currently touring the country.

 

Yet another production of the play opened Saturday at the Shakespeare Theatre of New Jersey, at Drew University, in Madison, N.J. It is superb. Director Ian Belknap has not only staged a towering romantic tragedy, but underscored all of the drama of life in Verona and the rivalry between two powerful families, the Montagues and the Capulets. He has brought out the best in Romeo and Juliet, those famed star-crossed lovers, but also the very worst in their families. All of this makes for a taut drama that swirls through old Verona as the young lovers swirl in each other’s arms, unaware of the huge storm that is brewing around them and unaware, too, of the hatred of the families and their iron resolve to quash any union between them.

 

There is still family hate in the world, four hundred years after Shakespeare’s play. The best example of that in America is the Hatfields and McCoys, of West Virginia and Kentucky, whose post-Civil War animosity towards each other resulted in numerous deaths and executions. For what?

 

Families – people – hate each other today and for no legitimate reason. It isn’t just racial or class hatred, as director Belknap writes in the play’s program, but deep, deep senseless hate.

 

That hate envelops Romeo and Juliet and slowly starts to strangle them.

 

Romeo is a young, vibrant, dashing teen who falls in love with 14-year-old Juliet at a party. They know their families will oppose their union but do not care because they are smitten with each other.

 

It all takes a terrible, bloody turn when Romeo, a Montague, slays Tybalt, a Capulet, on a city street in a duel. That sets off the firestorm. The Capulets want his head and the Prince banishes him from the city. He gets help from his father, who puts together a plan for an escape for Romeo and his love.

 

Belknap draws out Shakespeare’s characters, highlighting all of their triumphs and tragedies and paints a marvelous portrait of them on the Drew University stage. He scores well with Romeo and Juliet but gets a truly remarkable performance from Aedin Moloney as Juliet’s nurse. She is bouncy and joyous at the start of the play and then drowns in angst as the drama hurtles towards its sad conclusion. 

    

Everybody knows the conclusion of the play. It is tragic and pointless.

 

At the end of the play, you sit back with a fine understanding of history too, because Shakespeare, who wrote several plays about Italian life (Two Gentlemen of Verona and Othello among them), tells you a lot about life in Italy in the late 1500s, a time of artistic and cultural triumph. You learn about arranged marriages and youthful unions (Juliet, at just 14, was not considered too young for marriage in that era).

 

Director Belknap has done a splendid job of bringing the play to life for modern audiences, highlighting all of its nuances that link Romeo and Juliet to contemporary life. He has added a lot of tension to it, too, a tension that turns into an emotional blaze at the end.

 

The director gets fine performances from a talented cast. Aedin Moloney is wonderful as the nurse. Miranda Rizzolo, although tentative at times, is a fine Juliet. Other performers who do good work are Joshua David Robinson as Mercutio, Isaac Hickox Young as Benvolio, Torsten Johnson as Tybalt, Matt Sullivan as Friar Lawrence, Mark Elliot Wilson as Lord Capulet, and Michael Dale as Lord Montague and Friar John.

 

The star of the show is Keshav Moodliar as Romeo. He captures the play the moment he glides on to the stage. His early humorous demeanor changes as his life tumbles out of his control, starting with the murder of Tybalt. At the end he is a devoted lover to his Juliet. In the fabled balcony scene with her, he is just dazzling as he sinks to the ground and then sprawls out on it, singing his memorable praises of his true love.

 

Together, the ensemble has created a winning Romeo and Juliet. Many of the men are superb swordsmen, too. What’s a Shakespeare play without a good swordfight, right?

 

This Romeo and Juliet is a triumph. What light through yonder window breaks? It is this production.

 

PRODUCTION: The play is produced by the Shakespeare Theatre of New Jersey. Scenic Design:  Lee Savage, Costumes: Paul Canada, Lighting: Michael Giannitti, Sound: Fabian Obispo, Fight Director: Rick Sordelet. The play is directed by Ian Belknap. It runs through November 17.

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173396 https://historynewsnetwork.org/article/173396 0
Trump and the Divine Rights of Kings

 

Some ten days after the execution of King Charles I, a pamphlet with the Greek title Eikon Basilike (which means “Royal Image”) appeared in the book stalls of London, attributed to the dead monarch and purporting to be a diary of his imprisonment. Within the book was an allegorical depiction of Charles: the king on penitential knee (nonetheless wearing the elaborate, regal clothes appropriate to his stature) while a ray of divine light penetrated his skull, generating for the monarch a vision of a shining crown. Eikon Basilike’s engraver William Marshall included an explanatory poem with the image, telling of the “boist’rous Windes and rageing waves/So triumph I. And shine more bright/In sad Affliction’s darksom night.” For Charles, a man who despite his tremendous authority was still checked by the ancient rights invested in the legislative branch, it was apparently very hard to be king.

 

Nobody feels sorrier for themselves than a monarch who discovers that their divine right is an illusion; nobody is more liable to lash out and project the blame for their predicament. For royalist defenders of the king, Charles had been unfairly assaulted by his Parliamentarian enemies; as a ruler gifted with the divine right of kings, all legislative and ecclesiastical prerogative was his, and as such his fall from grace could never be attributed to his own ineptitude or authoritarianism. When he (or his ghostwriter) announced “I would rather choose to wear a crown of thorns with my Saviour, than to exchange that of gold,” I see no reason not to read that sentiment as genuine; though when he follows up the subject of golden crowns by emphasizing that they and all that they represent are “due to me,” his political theory is clear. It turned out he was disastrously wrong about that.

 

In matters of syntax, grammar, punctuation, diction, and spelling, readers of Eikon Basilike will not necessarily see much rhetorical similarity between a sentence like the “aspersion which some men cast upon that action, as if I had designed by force to assault the House and Commons, and invade their privileges is so false, that as God best knows, I had no such intent” and a tweet which read “A Total Scam, by the Do Nothing Democrats. For the good of the Country, the Wirch Hunt (sic) should end now!” When it comes to intent and meaning, however, there’s a lot of overlap. Despite Charles’ erudition, there is a similar sense of aggrievement at being asked to do something that each man doesn’t want to do. Both ran into some trouble with their legislative branch, and both men similarly questioned the vested rights of those respective bodies to act as counterbalance to executive authority. And they’re both angry at being questioned about it.

 

Historical comparison is a fickle and ambiguous gambit; I don’t want to belabor the similarities beyond comprehension. There are cultural, social, and political differences that are so profound that it would be an act of intellectual malpractice for me to claim that 2019 bears too much similarity to 1649. In more superficial attributes concerning temperament, forbearance, and faith, there’s little that’s similar about the two. A reading of Eikon Basilike demonstrates that as unconvincing and self-serving as Charles’ theological arguments may be, they were genuine; a reading of Trump’s Twitter feed shows him to be a man of seemingly limitless non-faith (even while his evangelical supporters pretend otherwise).

 

But in another sense the two men do share a certain philosophy of power, whereby that which is invested in the head of state is seemingly limitless and always justified by the simple fact that they’re the ones who are wielding it. For Charles, this was religiously justified – the monarch was touched by God and so was allowed authority over other men. Trump’s reasoning bears more similarity to the fascist rhetorical trope of conflating the leader with some amorphous, ambiguous, faceless sense of “The People” (even while a majority of Americans now support impeachment and removal from office). Nonetheless, the conclusion is the same – nothing that the leader does can be illegal simply because the leader is the one doing it.

 

While Charles’ writing (or that of whoever actually wrote Eikon Basilike) is certainly more sophisticated than Trump’s, the overweening sense of wounded pride from an autocrat spurned is present in the language of both. Defending himself against the observation that he had violated the rights of Parliament, Charles emphasizes that this claim “is so false.” Evocations of screeching “Fake News!” from 350 years ago, perhaps. Historian Michael Braddick writes in God’s Fury, England’s Fire: A New History of the English Civil Wars that the pamphlet attributed to Charles was “by far the greatest propaganda success following the regicide, calling forth anxious… rival histories.” Confusion sowed by Trump and his defenders serves a similar purpose: to craft an alternative history in real time. Lest I be accused of being the historian who tends to see their own field of study in whatever the headlines for that day are, I should emphasize that it was actually Trump’s defenders who first implicitly made the comparison to Charles, and as such a parsing of what those similarities are is helpful at this dangerous moment.

 

Sitting next to the vampiric Rudy Giuliani on FOX News’ The Ingraham Angle, Trump surrogate and attorney Joseph diGenova claimed with supreme self-seriousness that “This is regicide by another name, fake impeachment.” It’s helpful to note that impeachment isn’t regicide; it’s not execution, it’s not imprisonment, it’s not even necessarily being removed from your job. It’s an investigation and congressional censure; realistically the most Trump has to fear is being fired (unless some of his past shady dealings still being investigated by the Southern District of New York have him more worried about imprisonment). What’s illustrative about the histrionics implicit in the word “regicide,” however, is what they tell us about how Trump and his supporters see the president. If impeachment is “regicide,” then the conclusion must be that Trump is a king. DiGenova’s use of the word implies that he sees no problem with interpreting Trump as a king, only a problem with those who would dare to question that authority. That this is a deeply worrying way of understanding the executive goes without saying.

 

“Regicide” is not a word you often hear self-applied in American political discourse. The U.S. founders in many ways worked in the tradition of their English antecedents, and consequently were attuned to the theory and rhetoric of that previous century. In this manner they were inheritors of a virulently anti-monarchical politics. In a nation where, despite our many hypocrisies on the actual deployment of such power, we historically blanch at anything that seems too symbolically regal when applied to the office of the presidency, there is something sinister in diGenova’s language. What’s clear is that Trump and his surrogates understand the position as implying complete agency and complete authority over the other branches of government, and that their understanding of power in that manner has more to do with Charles’ divine right than it does with the United States’ Constitution.

 

Trump’s defenders are brazenly putting forth a counter-narrative of American law, one that is explicitly anti-American. Counsel to the President, attorney Pat Cipollone, wrote in an October 8th letter to Speaker of the House Nancy Pelosi and three other congressional committee heads that the impeachment hearings are “contrary to the Constitution of the United States and all past bipartisan precedent.” A strange claim in a nation that has impeached two presidents before, and was on the way to impeaching another, and where Article 1, Section 2, Clause 5 of the actual Constitution itself explicitly states that “The House of Representatives… shall have the sole Power of Impeachment.”

 

There’s a certain smug style in liberal politics which would (and has) claimed that Cipollone’s gambit demonstrates an unfamiliarity with the Constitution, that Trump and his supporters are simply too stupid to understand what’s in the document. I’d venture that that’s a misinterpretation, and a dangerous one, because what Cipollone is actually doing in that letter is abandoning the dictates of the Constitution by redefining the Constitution out of existence. When Cipollone says that something is “unconstitutional,” that should not be taken literally; he is simply using it as a synonym for “something that my boss doesn’t agree with.” When diGenova says “regicide,” he means “any process which questions Trump’s authority.” Such claims aren’t being made in this way because these men are stupid; they’re being made in this way because these men are conniving, disingenuous, manipulative, and incredibly threatening to the politics of a free republic.

 

An argument could be made that what we’re witnessing is one of those periodic, dialectical flare ups that have occurred in the Anglophone world since Charles’ day (albeit at a hopefully much smaller scale). These are conflicts over who should have more power: a representative legislature or a unitary executive.  Demographer Kevin Phillips argued (not without some disciplinary controversy) in 1998’s The Cousins’ War: Religion, Politics, Civil Warfare and the Triumph of Anglo-America that certain events from the English and American revolutions through the American Civil War need to be read as part of the same conflict over questions of power and authority. He writes that the “English Civil War is the necessary starting point, not just for a piece of Britain’s history but for America’s. This is where the events and alignments leading up to the American Revolution began.” 

 

According to Phillips, each of those three conflicts followed a certain Manichean script: in the English Revolution of the seventeenth century there were Parliamentarians who defended their rights against an absolute monarch; there was a similar dynamic in the American Revolution. The American Civil War represented the latest iteration of a democratizing political force as the Union fought to expand republican values against the aristocratic Confederacy. In The Cousins’ War, this dialectic explains Anglophone political history: the periodic pitting of an authoritarian, aristocratic class against nobler, democratizing movements.

 

Read as such, Trump’s arguments shouldn’t be understood as just disingenuous or misinformed (though they certainly can be those things), but primarily as a restatement of that old lie about the divine right of kings, and as a repudiation of the legislative authority that goes back to the Magna Carta. In Trump, we have an inhabitant of the Oval Office more similar to King George III than George Washington, a man in the shadow of Jefferson Davis rather than Abraham Lincoln. DiGenova and Cipollone’s claims share with Eikon Basilike the sense of aggrievement and privilege; they also share in some sense the same political theory regarding the rights of kings. It’s encouraging to remember that in the past wars enumerated by Phillips, the authoritarian side ultimately lost. Yet, after each of those victories there was significant backsliding towards an undemocratic status quo.

 

Something to keep in mind as we hope, prepare, and plan for Trump’s impeachment: an assault on this president is less radical and perhaps not as important as an assault on the very idea of the presidency. Executive power is fundamentally authoritarian regardless of who wields it; as Astra Taylor writes in Democracy May Not Exist but We’ll Miss it When it’s Gone, “The forces of oligarchy have been enabled, in part, by our tendency to accept a highly proscribed notion of democracy, one that limits popular power to the field of electoral politics, ignoring the other institutions and structures… that shape people’s lives.” Taylor argues that “This is a mistake.” 

 

Following the downfall of Trump, we should neither desire nor countenance any restoration, any return of a simple status quo. From the eclipse of this authoritarian moment we can perhaps dream of more egalitarian, more emancipatory, more democratic arrangements. Such was the desire of the poet John Milton, who eight months after Eikon Basilike wrote his rejoinder Eikonoklastes. Milton enthuses that “We expect therefore something more, that must distinguish free Government from slavish.” His revolution ultimately failed, but ours doesn’t necessarily have to. Any “Resistance” that only imagines the downfall of Trump doesn’t deserve the name, for we must dream bigger than the deposition of kings. Impeachment is a necessity, but what is required is a reorganization of our politics, our economics, and our culture to ensure that future tyrants will not lead in his stead. We require a politics which is finally commensurate with a nation of free women and men. 

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173351 https://historynewsnetwork.org/article/173351 0
Kurdish Stalingrad: The Origins of the US-YPG Battle Synergy

 

Back in September 2014, an invasion by ISIS and subsequent months of battle reduced the Syrian city of Kobane – once thriving with bustling markets where civilians would gather to buy vegetables and exchange gossip – to a desolate wasteland. For months thereafter the empty, bomb-blackened streets, lined with the wreckage of pockmarked buildings and burned-out cars, served as a poignant reminder of the heavy toll the Kurds of Kobane had paid in their resistance to the jihadi invaders. But little by little the city came back to life as many of Kobane’s proud residents returned home, cleaned the streets, reopened shops, and did all they could to prompt a return to normalcy. By January 2017, from the ruins of Kobane emerged a new falafel shop with a curious name – Trump Restaurant. 

 

A Syrian Kurd named Walid Shekhi decided to open Trump Restaurant in central Kobane because he wanted to show his appreciation for the United States’ role in rescuing his cherished hometown from ISIS’s barbaric atavism. It did not matter to Mr. Shekhi that Kobane was liberated under the Obama administration’s watch; Trump had already been elected by the time he opened the restaurant, and he had no interest in the United States’ domestic political landscape. “We Kurds love the United States, so we love Donald Trump,” he said. “That’s why I named my restaurant after him.”

 

Mr. Shekhi’s love for the United States is neither anomalous nor insignificant, as the story of the liberation of his hometown also happens to be the origin story of the American-Kurdish battle synergy that ultimately deprived the ISIS terrorists of their territorial caliphate in Syria. Kobane quickly became a symbol of Kurdish resistance and, after liberating the embattled city from the jihadists’ tyrannical grip in early 2015, the Kurdish People’s Protection Units (YPG) would serve as the tip of the spear in America’s war on ISIS – carrying out crucial ground operations with the help of US-provided weapons, ammunition replenishments, training and logistical support, as well as coordinated airstrikes.

 

That is, until Trump’s abrupt announcement earlier this month of an ill-planned withdrawal of U.S. troops from Syria, which paved the way for a Turkish cross-border military operation targeting Kurdish-controlled areas. NATO ally Turkey, which makes no distinction between the YPG and the Kurdistan Workers’ Party (PKK) – a terrorist organization that has waged an on-again, off-again insurgency against the Turkish state since the 1980s – now threatens the very existence of the Kurds as it continues its widely condemned incursion across its southern border into Kurdish territory.

 

The Kurdish YPG fighters were left to confront the second largest military in NATO without US support. As Kurdish civilians flee to safety from places like Kobane and YPG fighters launch a futile attempt to repel the better resourced and militarily superior Turkish forces, the desperate Kurds appear to be striking a deal with Syrian President Bashar al-Assad and Russian President Vladimir Putin that will provide them much-needed protection. Russian troops have already begun occupying abandoned American outposts and, as of October 18, images emerged of Assad-backed forces arriving in the Kobane area, triumphantly holding up images of Assad and replacing Kurdish flags with Syrian flags.

 

After so much success fighting alongside the United States, not to mention vital intelligence obtained on how Americans overtly and covertly conduct unconventional warfare, it is unfortunate to see the Kurds left with no choice but to turn toward America’s geopolitical adversaries for help. 

 

Trump never misses an opportunity to take credit for defeating ISIS – which, of course, is not actually defeated – recently claiming: “We were the ones that took care of [ISIS], specifically me because I’m the one that gave the order.” Such grandiose claims obfuscate the reality that it was the Kurds on the ground doing the majority of fighting and dying against ISIS. The American president would be wise to familiarize himself with the recent history that led to the small falafel shop in central Kobane that bears his name.

 

The Story of Kobane

 

In the fall of 2014, after conquering one third of Iraq and Syria with astonishing ease, ISIS set its sights on the inconspicuous border town of Kobane. The city’s conquest would have important strategic implications, as ISIS would gain control of a large uninterrupted section of the Turkish border, allowing the terrorists to expand supply routes and open the floodgates to thousands of fanatic militants. But Kobane’s defenders, a brave contingent of outnumbered, outgunned Kurdish fighters from the YPG, refused to cower in fear. As al-Baghdadi’s genocidal terrorist army surged toward them, YPG spokesman Polat Can explained their willingness to die for Kobane: “We will resist to our last drop of blood together… If necessary we will repeat the Stalingrad resistance.”

 

But as the Kurds fortified their positions and dug in to defend their hometown, ISIS’s strategic and tactical military prowess began to show. ISIS fighters systematically surrounded the city and methodically probed the outer lines of the besieged defenders from the west in the town of Jarabulus, the south near Sarrin, and the east near Tal Abyad – rapidly advancing on all three fronts and tightening the proverbial noose around the neck of Kobane. Tragically, the Kurds, whose national motto is “no friends but the mountains,” found themselves isolated and abandoned by an uncaring world.

 

Support came in late September, however, when the US-led coalition responded to the Kurds’ pleas for assistance and began launching merciless precision strikes on ISIS targets that had been identified on the ground by US-trained Kurdish air controllers. While the airstrikes slowed ISIS’s crushing offensive, the jihadi militants adjusted by setting infrastructure aflame to obscure the American air armada’s vision with towers of black smoke. ISIS then managed to press forward and it was not long until the infamous Black Banner was planted on a building in southern Kobane, marking the terrorists’ official penetration of the border town. Kurdish forces then declared the city a military area; all civilians were asked to leave immediately. Those who stayed prepared for a fight to the death. 

 

Once inside a city composed of narrow, meandering streets and winding alleys, ISIS’s reliance on brute force and heavy weaponry such as tanks proved to be more of a burden than an advantage. Equipped with an intimate familiarity with the terrain of Kobane, the YPG soldiers moved like ghosts as they bedeviled their fanatical foes with creative defense tactics such as ambushes and traps. But the waves of ISIS fighters never stopped coming, and the Kurds were soon running low on weapons and ammunition. Impressed by their extraordinary resilience, the US military decided to intensify its support and airdropped much-needed weapons, ammunition, and medical supplies to the Kurds. American support breathed new life into the Kurds, who were suddenly ready to fight on with even greater speed and intensity.

 

By late October, as US air power cleared their way by engulfing ISIS positions in a storm of explosive rain, approximately 150 Iraqi peshmerga troops crossed the Turkish border into Syria to help their ethnic brothers and sisters liberate Kobane. Also at this time, YPG forces were further boosted by an influx of as many as 200 battle-hardened Syrian Arab rebels from the Free Syrian Army (FSA), an amalgamation of Arab Sunni rebel groups who, at this time, were known more for their opposition to the Assad regime. With the United States continuing its vital air support, the American-Kurdish-Arab troika conducted relentless joint operations against ISIS until the jihadi invaders had no choice but to retreat. The Kurds spent the next couple of months recapturing building after building, street after street, and village after village. 

 

By January 2015, ISIS officially acknowledged for the first time since the group rose to power that it had been defeated. In a video released by the pro-ISIS Aamaq News Agency, ISIS fighters cited US-coalition airstrikes as the primary reason for the defeat and downplayed the role of the Kurds, whom they referred to as “rats.” As the Kurds picked through the rubble of Kobane and assessed the damage incurred in battle, they reveled in their victory. “It is great to have beaten Daesh,” explained a Kurdish fighter from the YPG, using the Arabic acronym for ISIS. “But it would not have been possible without America and the peshmerga.”

 

Remember Kobane

 

The liberation of Kobane vividly illustrates not only the Kurds’ ability to repel ISIS, but also the remarkable synergy between the United States and the YPG. This special relationship was ultimately what brought down the ISIS caliphate and has since been vital in ensuring the enduring defeat of ISIS. The YPG has been working directly with U.S. Special Operations forces in mop-up operations in northeastern Syria – gathering intelligence, tracking ISIS movements, disrupting its networks, and targeting its leadership as the jihadists revert to underground insurgency mode.

 

The Kurds have served reliably for five years as America’s primary boots-on-the-ground ally in Syria when it comes to the bloody battle against ISIS, having lost approximately 11,000 lives in the process. They simply deserve better than abandonment in the face of a Turkish threat. 

One thing is for sure: everyone at Trump Restaurant in Kobane will be counting on the American leader to reverse course and continue his support. It would really be a shame to see the name of the falafel shop changed to Putin Restaurant or, even worse, destroyed entirely by Turkish-backed invaders. 

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173347 https://historynewsnetwork.org/article/173347 0
A Democratic GUT-Check: A Grand Unifying Theory of Democratic Victories

 

Democrats are justifiably nervous about the 2020 election. A strong economy and stubbornly loyal Republican base render an otherwise vulnerable incumbent into a perilous opponent. A brawling counter-puncher, Donald Trump’s political spirit animal might very well be the wily and oddly vicious raccoon. When cornered, raccoons attack their predator’s eyes. Once the predator is blinded, they penetrate the chest wall, collapse the lungs, and infiltrate the abdominal cavity. Septic peritonitis and massive organ failure ensue, followed by death. Politically trapped on countless occasions, Trump has veritably raccooned our institutions, norms, and the body politic. Our national septic shock proves the adage of The Wire’s Omar Little: “[When] you come at the King, you best not miss.”

 

America simply cannot afford a second Trump term. Democrats, however, get one shot at nominating a candidate capable of assembling a coalition and driving turnout sufficient to defeat the president and drive a nail into the Trumpist coffin. Happily, a spate of polls reveal that Democratic primary voters prize electability more than issue agreement. Pragmatic and hungry to defeat Trump, Democrats should understand “electability” in its fullest historical sense. The preceding century of Democratic presidential politicking reveals that “electability” is not milquetoast, split the difference centrism. Historically, Democrats win when they adhere to a grand unifying theory (GUT). According to this principle, Democrats win when they nominate a political cipher and a cultural chameleon who possesses a preternatural charisma that can appeal to and energize a diverse set of voters.

 

The political party of the underdog and of ethnic, racial, and social minorities has always lacked the cultural cohesion that the Republicans possess. Consequently, successful Democratic nominees have been ideologically vague, comfortable in a variety of cultural settings, and exceptionally charismatic. In the first three quarters of the twentieth century, the party’s primary fault line lay between its rural, white Protestant and ethnically diverse urban wings. Unable to close that divide, rural America’s champion, William Jennings Bryan, lost three presidential races: 1896, 1900, and 1908. As Bryan poised for a fourth bite at the apple in 1912, party bosses intervened and accidentally discovered a template for the future in Woodrow Wilson.

 

A mere two years into the first political office of his entire life, Wilson entered the presidential fray. Before this, both parties had almost always tapped party elders or retired generals as their presidential nominee. Being neither, Wilson instead enjoyed the unique biography and blank slate necessary to unify a fractious party. Virginia-born, he was Southern and agrarian enough to satisfy rural voters. His stint as governor of New Jersey meant he was not a typical Solid South politico. Finally, his newfound Progressivism put him in accord with the educated middle class and erstwhile populists. A master political orator, Wilson used his charisma to bind a diverse coalition to him. He was a political Rorschach test: rural and urban Democrats, Progressives, and old-time Populists all saw what they wanted in him. Facing a divided GOP in a fractured four-way race, Wilson took 435 out of 531 electoral votes in a landslide victory.

 

A generation hence, Wilson’s precarious coalition had come apart. In three consecutive presidential races, 1920-1928, Democrats had earned more than forty percent of the vote just once. Convalescing from polio, Franklin Roosevelt had avoided the foul taint and culture wars of the Democrats’ wilderness years. A compromise candidate, Roosevelt was palatable to Southerners and rural voters thanks to his rural, upstate New York background and his adopted home of Warm Springs, Georgia. Likewise, his Northern upbringing and Progressive leadership in New York rendered him acceptable to urban Democrats. Campaigning on an ill-defined New Deal platform, FDR avoided unnecessary offense to party constituencies and took the White House in 1932. His uncanny charm and gift for building an intimate connection with voters via his radio Fireside Chats resulted in the New Deal coalition. Comprised of white Southerners, farmers, the urban North, African Americans, and liberal intellectuals, the coalition endured for half a century; the GUT, nevertheless, remained necessary to maintain this ungainly assemblage.

 

In the decades following FDR, Harry Truman and LBJ were the sole exceptions to the GUT. In these cases, the exceptions prove the rule. Assuming the presidency upon the deaths of FDR and JFK, Truman and Johnson each won election in their own right. Rural Democrats who lacked public charisma and were saddled with long records on national issues, the duo earned the ire of pieces and parts of the coalition’s diverse constituencies. Their unpopularity led both to refuse a run for a second full term and opened the way for the election of GOP successors.

 

The Truman & Johnson example reveal just how much postwar Democrats struggled to keep their diverse coalition together. Understanding this, party leaders looked to the 1960 election with concern. With LBJ too Southern, Hubert Humphrey too liberal, Adlai Stevenson too much the loser and all saddled with long records, Democrats searched for a political Goldilocks. Equipped with an ambiguous ideology, few legislative accomplishments, and charisma to burn, JFK fit the cipher (and GUT) bill. Sixteen years later, the social issues of race, crime, and the culture wars had split the party yet again. It was left to a Bob Dylan-quoting, Sunday School-teaching, nuclear-engineer-cum-peanut-farmer to bridge these divides. With feet, big toes, and a pinky in every nook and cranny of the Roosevelt coalition, the obscure one-term governor of Georgia, Jimmy Carter, would be the final Democrat to bring New York and Mississippi into the same column.

 

With the Roosevelt coalition undone, Democrats were left with the so-called McGovern coalition. A multiracial, multiethnic, cross-class assemblage of African Americans, Latinos, women, college students, professionals, and economically populist working-class whites, this collection of misfit toys presented familiar challenges. A product of the rural South and the Ivy League, Bill Clinton used his gubernatorial service to avoid the sticky wicket of controversial national issues. Whip-smart, elite-educated, and charismatic, he delivered wonky explications of policy in a Southern drawl, and his moderate stance on social issues enabled him to speak to a multiplicity of audiences.

 

Like Clinton, Barack Obama also inhabited and felt at home in a variety of cultural worlds. The product of a Kansas-born mother and a Kenyan father, raised in Indonesia and Hawaii, he instinctively knew how to speak to diverse audiences. Moreover, his thin national resume meant he avoided the political crevasses that crisscross his party. Equipped with charm and electrifying rhetorical gifts, Obama embodies the GUT. Indeed, the GUT is the lone thread that connects a white supremacist, Woodrow Wilson, to Barack Obama, and to most every Democratic president of the twentieth century. As the GUT reveals, “electability” lies not so much in centrist policy as it does in coalescing and energizing a diverse, majority coalition.

 

To be sure, successful Democratic nominees have proffered mainstream, center-left policy proposals. Maximalist policy proposals and ideological rigidity do not unite diverse coalitions. But a Democratic GUT-check reveals that “electability” is not simply a checklist of centrist policy proposals. For those searching for the most viable Democratic challenger, history shows that the candidate with a thin national resume, charisma, and an aptitude for navigating a variety of cultural contexts possesses the formula for victory.

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173350 https://historynewsnetwork.org/article/173350 0
Melania Trump Just Restarted a 100-Year-Old Political Controversy: The White House Tennis Court

 

On Tuesday October 8, with impeachment speculation swirling and increasingly disturbing reports coming out of Syria, First Lady Melania Trump broke through the noise to share some good news: Ground was being broken for the construction of a new tennis pavilion at the White House.    

 

The 1,200 square foot pavilion, we learned, will replace a small, lattice-covered bathroom structure currently on the site.  The White House tennis court itself, in its current location for the last 40 years and retrofitted most recently with a basketball hoop and court lines for Barack Obama, will remain mostly untouched.

 

“It is my hope that this private space will function as a place to gather and spend leisure time for First Families,” the First Lady said in a statement.  She also clarified that this “Legacy Project” would be funded entirely with private donations.  Like the First Ladies that preceded her, Melania intended to leave the White House a better place than she found it.     

 

Not surprisingly, the announcement caused the Twittersphere to lose its collective mind.  

 

@TonyPosnanski’s tweet captured the general mood of those that responded to @FLOTUS.  

 

“Thousands of our allies are being attacked because we abandoned them because of your husband, your husband is attacking the Constitution, and your husband is bullying Americans, but congrats on the new tennis court. Seriously, [expletive] you.  You are an embarrassment.”

 

The response to the announcement became as much a story as the project itself.  “Melania Trump Trolled Over Her ‘Legacy Piece’: Does the White House Need an Entire Tennis Court Pavilion?” Newsweek asked.

 

If the Trump White House is looking for advice on how to handle the blowback—It’s just a tennis pavilion; We just wanted to make the White House grounds a bit more beautiful—perhaps it should look back to the administration of Theodore Roosevelt.  After all, it was TR who brought tennis to the White House grounds in the first place.   

 

Shortly after ascending to the Presidency following the assassination of William McKinley in 1901, Roosevelt requested funds from Congress to overhaul the White House.  The executive mansion needed the attention; problems ranged from exposed wiring (a fire hazard) to cramped office space.

 

As part of the renovation, the landscaping crew—working under Edith Roosevelt’s watchful eye—installed the first White House tennis court.  They placed it immediately adjacent to the President’s executive office, on the spot where the Oval Office sits today.  

 

As the court neared completion, both the Washington Post and New York Times picked up on the story.  The “President’s Children to Have a Model Playground Adjoining his Office,” the Post reported.  

 

Not everyone approved. Roosevelt’s critics seized upon the White House’s new tennis court as a sign that the President was out of touch with the average American.  One Tennessee newspaper, for example, suggested that the nation could hardly afford to keep Roosevelt in the White House.  “The White House has been enlarged at an expense of $500,000,” the paper wrote, “a $2,500 tennis-court has been built for his children, and the living expenses have been about triple.”  The paper called for an end to “this carnival of graft and extravagance.”

 

A debate about the court ensued.  Roosevelt’s steadiest literary supporter, Outlook, argued in defense of the tennis court.  “Is the President Extravagant?”  No.  “It is true that there is a tennis-court on the White House grounds, but it cost less than … the greenhouses under the previous administrations.”  

 

The editors of Outlook put forth an early form of life-balance counseling.  “We think there can be no serious objection on the part of any decent American to the President playing tennis with his children, and it is impossible for them to play tennis except on the White House grounds.”  The Republicans liked the Outlook article so much, they entered it into the Congressional record. 

 

Puck, a devilishly satirical publication, questioned what exactly would transpire on the White House tennis court.  “The mutter of conspiracy is heard,” Puck editorialized.  Perhaps the White House tennis court simply provided cover for other activities.  “Who questions the happy outcome of conference or confab, the parties of which have previously lobbed and smashed, volleyed and served together on a common level the smooth delightful level of the White House tennis court?”     

 

For Roosevelt, however, the White House Tennis Court eventually went from being a political liability to an asset.  

 

Stories of Roosevelt’s long, sweaty matches—sometimes against unprepared foreign dignitaries—came to bolster his reputation as a purveyor of “The Strenuous Life.”  The group of 30 or so regulars at the court became known as Roosevelt’s “Tennis Cabinet.”  

 

The fact that Roosevelt went public at times about his struggle to keep his weight under control, and thus felt compelled to fit tennis (or boxing or hiking) into his schedule, also resonated in a nation struggling with the realities of urbanization and industrialization.  

 

While Melania declared her tennis pavilion a gift to future White House inhabitants, it seems likely that TR’s tennis court came about as a spousal nudge from Edith.  Edith was concerned about her husband’s growing waistline.  Life in the White House, Roosevelt admitted as the tennis court was being finished, “has been very conducive to me getting fat.”  Edith certainly noticed.  Once complete, the court, just steps from the President’s desk, served as a reminder to TR to get out and exercise.   

 

Effort trumped expertise on TR’s White House tennis court.  “My impression is that father didn’t play a great game, but played very hard,” Roosevelt’s always-candid daughter Alice explained.  Or as another observer put it: “He played tennis vigorously on the White House courts, though he never became very expert, there being no danger at any time of the President’s entering the National Tennis Tournament at Newport.”

 

As Roosevelt’s administration neared its end, the narrative regarding the White House tennis court took on an exuberantly positive tone.  No journalist portrayed Roosevelt as a tennis snob playing on his own taxpayer-provided court; rather the press fought amongst itself to see who could best capture the image of a President of the United States valiantly competing on the court even though he had a country to run.  The President finds time for exercise, the thinking went, thus so should you.  

 

After Roosevelt left the White House, William H. Taft took over and oversaw the bulldozing of TR’s court in order to make room for further West Wing improvements.  Taft cared little about the change; he preferred golf to tennis anyhow.  Landscape architects configured a new court into the south lawn area of the grounds. The court was moved several times before taking its current position.  In 1989, President George H.W. Bush signed off on improvements to the tennis court that – until Obama retrofitted it for basketball – made the court what it is today.

 

The tennis court snagged other victims along the way.  It was on the tennis court, so the story has long gone, that Calvin Coolidge Jr. got a blister that then became infected, leading to the teenager’s death from blood poisoning in 1924.

 

For Jimmy Carter, the White House tennis court became a symbol of a weak, distracted, micro-managing President.  Late in his term, a White House insider wrote a tell-all accusing Carter of personally managing all requests to use the tennis court.  Carter denied the story, but it stuck.  

 

At a press conference on April 30, 1979, after talking about energy conservation, and the Soviet threat, and strategic arms limitations, Carter tried to put the tennis court issue to bed: 

 

“The White House tennis court: I have never personally monitored who used or did not use the White House tennis court. I have let my secretary, Susan Clough, receive requests from members of the White House staff who wanted to use the tennis court at certain times, so that more than one person would not want to use the same tennis court simultaneously, unless they were either on opposite sides of the net or engaged in a doubles contest.”

 

Needless to say, the non-denial denial did nothing to help Carter’s image.  

 

The lesson in all of this?  Beware of the White House tennis court.  Or more directly: Presidents, be wary of associating with country club sports during times of political crisis.  

 

As Theodore Roosevelt explained it: “I myself play tennis, but the game is a little more familiar; besides you never saw a photograph of me playing tennis.”

 

Perhaps just for someone like President Donald Trump, Roosevelt expounded even further.  “I am careful about that,” Roosevelt said of publicity regarding his connections to sports.  “Photographs on horseback, yes; tennis, no.  And golf is fatal.”    

 

 

 

]]>
Tue, 12 Nov 2019 23:26:09 +0000 https://historynewsnetwork.org/article/173344 https://historynewsnetwork.org/article/173344 0
To Stop The Rise of Nationalism, We Must Remember High-Tech’s Role in Supporting Past and Present Nationalistic Agendas

 

As the 74th UN General Assembly winds down, many commentators are discussing the world-wide rise of nationalism fueled by strongmen leading countries like China, Brazil, India, Turkey, the Philippines, and the United States. Often absent is a discussion of high-tech’s role in supporting nationalistic agendas. Since the dawn of the digital age, nationalism has relied on digital technology.

 

Nationalism turns on a simple question: Who does, and who does not, belong within a nation? Those who belong have the luxury of safety and security. Those who do not belong face the burden of barred entry or targeted removal. Around the turn of the twentieth century, punched card technology was created to count populations and determine who was in a nation; it was quickly extended to determine who actually belonged.

 

In 1928, the Eugenics Record Office at Cold Spring Harbor, New York, under director Charles Davenport, embarked on a study to identify mixed-race individuals on the island of Jamaica for forced sterilization or other population control measures. Eugenics, a pseudo-science pursuing a mythically “pure” stock of human beings, sought to cull from humanity individuals who did not conform to its Nordic ideal, through measures ranging from forced sterilization to death. Eugenicists loathed mixed-race individuals as pollutants of the human gene pool.

 

The Jamaica Study required massive amounts of data to be collected, processed, and reported.  IBM, recently renamed and led by Thomas J. Watson, had just what Davenport needed. IBM engineers worked with the ERO to design a punch card format for collecting the information on racial characteristics. They also worked out the details of adjusting sorters, tabulators, and printers to provide the ERO with the output required. Eugenicists worldwide celebrated the Jamaica Study’s success thanks to the support of IBM.(1)

A few years after the Jamaica Study, using remarkably similar punched card formats, Watson offered IBM’s technology to the Third Reich, automating major aspects of Hitler’s war machine — including Luftwaffe bombing runs, train schedules for carrying Jews to camps, and the measures by which Jews were apprehended and exterminated. (2) 

In recognition of IBM's extraordinary service, Hitler created a medal decorated with swastikas and awarded it to Watson in 1937. Although Watson returned the medal in 1940, IBM's support of Hitler's regime did not end. (IBM has neither acknowledged its role in the Holocaust nor disputed historical accounts of it.)

After Nazi Germany's defeat, IBM turned to South Africa, where for decades the company provided computer technology to help classify and segregate the country's population, producing the passbooks and the database designs used for the separation and brutal subjugation of black South Africans. (3) 

 

Later, in the aftermath of 9/11, the New York City Police Department created a massive closed-circuit television surveillance center with feeds from thousands of cameras placed around the city. IBM secretly used NYPD camera footage of thousands of unknowing New Yorkers to refine its facial recognition software to search for and identify people by “hair color, facial hair and skin tone." (4)

 

But IBM is no longer alone. Major high-tech firms now support nationalism, often under the guise of public safety and national security. Facial recognition has supplanted punched cards and passbooks as the technology of choice for determining who does and who does not belong within a nation.

 

In September 2019, Never Again Action, a Jewish peace group, marched from a Holocaust memorial in Boston to Amazon headquarters in Cambridge, Massachusetts, demanding that Amazon cease supplying facial recognition technology for use at US borders and citing IBM's involvement in the Holocaust. 

 

Platforms like Facebook and Twitter can now identify and virtually remove individuals and groups from a nation, as the Russian Internet Research Agency (IRA) demonstrated in 2016. (5) The Pew Research Center reported that black voter turnout declined sharply in the 2016 presidential election for the first time in twenty years. More troubling, voter turnout increased among millennials, with the exception of black millennials, targeted by the IRA, whose turnout actually decreased by nearly 6 percent.(6) Strongmen understand that social media is the new means of media manipulation and population control, and they use it effectively and aggressively in support of their nationalistic agendas.

 

Digital technology slips under the radar of public awareness. When companies that profited from Hitler's regime were hauled before US courts and international tribunals, IBM escaped detection and prosecution. It is fairly easy to understand how Ford's vehicles might assist Germany's war effort; it is much harder to see the harm in a company making punch cards and the equipment to read them. Those who do understand have a responsibility to raise their voices against technology in support of nationalism, or risk a coming dystopian future.

 

 

********

 

(1) Edwin Black, War Against the Weak (Washington, DC: Dialog Press, 2012), 292.

(2) See Edwin Black, IBM and the Holocaust (Washington, DC: Dialog Press, 2001).

(3) See, for example, Michael Kwet, “Apartheid in the Shadows: The USA, IBM and South Africa’s Digital Police State,” CounterPunch, May 3, 2017 and Balintulo v. Ford Motors Co., IBM, General Motors Corp, No. 14–4104 (2nd Cir. July 27, 2015).

(4) George Joseph and Kenneth Lipp, "IBM Used NYPD Surveillance Footage to Develop Technology That Lets Police Search by Skin Color," The Intercept, September 6, 2018, https://theintercept.com/2018/09/06/nypd-surveillance-camera-skin-tone-search/.

(5) United States of America v. Internet Research Agency, et al. (Washington, DC: Department of Justice, February 16, 2018), 18, para. 46.

(6) Jens Manuel Krogstad and Mark Hugo Lopez, “Black Voter Turnout Fell in 2016, Even as a Record Number of Americans Cast Ballots,” Pew Research Center, May 12, 2017, http://www.pewresearch.org/fact-tank/2017/05/12/black-voter-turnout-fell-in-2016-even-as-a-record-number-of-americans-cast-ballots.

A History of Influencing Presidential Children to Change Policy

 

Over the past couple of years, the press has frequently reported on the children of influential U.S. politicians and officials. Much of this coverage has gone far beyond popular curiosity about the education of former U.S. President Barack Obama's daughters or about how current U.S. President Donald Trump has relied heavily on his daughter Ivanka Trump to represent his interests abroad.

 

Rather, recent concerns center on how foreign, and often corrupt, governments have attempted to influence U.S. politics through connections to these children. During the 2016 presidential campaign, individuals connected to Vladimir Putin's regime in Russia met in Trump Tower with Trump's son Donald Trump, Jr., and son-in-law Jared Kushner. The apparent cover-up of the meeting by Trump's associates and the president himself led to further allegations of corruption as part of Robert Mueller's investigation into foreign influence in the 2016 presidential election. Now, the president and his allies, most notably his personal lawyer and former New York City mayor Rudy Giuliani, have alleged that former Vice President Joe Biden once intervened in Ukrainian politics to terminate an investigation into a corrupt energy company that had hired Biden's son, Hunter.

 

Despite Americans' boasts of a government immune to external intervention, the truth is that other entities have always attempted to intervene in U.S. politics for their own benefit. Foreign regimes and multinational corporations have repeatedly hired former congressional officials, public relations firms, and others in the hope of navigating an ever-growing federal bureaucracy and gaining financial benefits from a country with global reach and interests.

 

In fact, one of the figures most adept at manipulating U.S. politics to his advantage was Dominican dictator Rafael Trujillo. From the 1930s until his assassination in 1961, Trujillo owed much of his power to his knowledge of the ins and outs of the U.S. government. After encouraging the massacre of thousands of Haitians in 1937, which dovetailed with escalating international fears about the rise of fascism, the Caribbean despot coordinated a massive campaign to restore his image in the United States: recruiting Jewish refugees to contrast his reign with that of Adolf Hitler, blunting criticism of his domestic politics by comparing his prejudice against Haitians to anti-black racism in the United States, and other efforts best described by Eric Paul Roorda.

 

Less well known, though, is how Trujillo targeted U.S. officials' children in the late 1940s in hopes of securing favorable treatment. After the Second World War, the State Department, under Assistant Secretary of State for American Affairs Spruille Braden, sought to distance the U.S. government from the dictator. When the U.S. government selected Ralph Ackerman as its new ambassador to the Dominican Republic in 1948, Trujillo immediately hoped to avoid the past years' frustrations and ingratiate himself with the new official. To do this, his ambassador in Washington, Luis Thomen, set his sights on Ackerman's son.

 

In a letter to Trujillo in July 1948, Thomen explained that the son happened to be an engineer working in Peru for Bolton & Lucas, a firm whose history with the Dominican regime included securing favorable contracts and munitions purchases. During an earlier conversation, Ackerman had mentioned that his son hoped to find a job closer to home, preferably in the United States. Here, Thomen saw a diplomatic opportunity. "Perhaps later," Thomen wrote his jefe back in the Caribbean, "you could offer an opportunity in our country to this young engineer." What Ackerman likely understood as a simple exchange of pleasantries, customary in first meetings between foreign officials, Thomen quickly seized upon as a possible means to influence the new U.S. ambassador and shape U.S. foreign policy, even without any specifically outlined quid pro quo.

 

Even more illuminating was how Trujillo hoped to manipulate Michigan Republican Arthur Vandenberg, chairman of the Senate Foreign Relations Committee, in the late 1940s. Holding one of the most powerful positions in Congress, Vandenberg was known for his integrity and adherence to principle. Because of this, Trujillo's officials had never been able to gain any undue influence over the senator. Consequently, the despot hoped to find an alternative route.

 

To do this, Trujillo put a doctor, William Morgan, on his payroll. Morgan served the dictator as both an official and an unofficial lobbyist. The doctor attended prominent diplomatic functions and appeared at golf tournaments, hosted by one of Trujillo's law firms and featuring U.S. congresspersons, that were designed to portray the despot as a reliable U.S. ally. And as Vandenberg's personal friend and physician, Morgan became Trujillo's hoped-for connection to capture the senator's support. It was the doctor, too, who targeted Vandenberg's only son.

 

In July 1948, at the same time that Thomen suggested employing a U.S. ambassador's child, Morgan reached out to Trujillo about a similar venture. The doctor happened to have an "intimate friend" who had spent his "last fifteen years" involved in his father's "electoral campaigns." Now the man was rather "tired" of U.S. politics and interested in heading to the Dominican Republic to try his hand at business. The man, not surprisingly, happened to be Arthur Vandenberg, Jr.

 

There is currently no evidence that Trujillo’s officials succeeded in realizing these plots. After all, Ackerman was never implicated in any corruption, and Vandenberg passed away in 1951 without any claims of impropriety.

 

Still, such maneuvers by this Caribbean dictator, and by corrupt regimes since, reveal how other governments conduct their politics and understand U.S. foreign relations. Trujillo and his officials hoped to circumvent established procedures and policies by going after the children of U.S. ambassadors and congresspersons. As the United States continues to expand its presence throughout the world and confronts such regimes, whether in Russia or Ukraine, it will keep finding corrupt entities desperate to manipulate U.S. politics to their advantage, with influential individuals' children caught in the middle.

The Greatest Danger in the Kurdish Crisis

The greatest danger in the Kurdish crisis is not the tremendous loss of life, as tragic and shameful as that may be; in so volatile a region of the world, such loss is part and parcel of an ongoing tragedy. Nor is it the fact that Kurds are repeatedly used by one neighboring country against another as a convenient, self-sustaining guerrilla force; the world community has been numbed to that. It is not the U.S. betrayal of their cause either; that has happened seven times before. The greatest danger in the Kurdish crisis is Turkey's attempt to break up an otherwise contiguous Kurdistan, straddling Turkey, Iran, Iraq, and Syria, into Bantustans à la Israel and South Africa.

 

If the heavy-handedness of the Turkish military is any indication, Erdogan seems to be bent on the ethnic cleansing and relocation of Kurds in northern Syria and resettling their lands along the Turkish border with Arab Syrian refugees currently residing in Turkey.

 

In this clash of cultures, the encroachment on Kurdish lands in northern Syria through the Arabization of settlements amounts to outright ethnic cleansing. If left unchecked, it will most likely be repeated on the Iraqi side of the Turkey-Iraq border in due course.

 

In the larger scheme of things, the Iranian Kurds will be left at the mercy of Persian chauvinism, and Kurds in Turkey will be conveniently rebranded as 'mountain Turks,' eliminating any chance for the 35 to 40 million Kurds ever to have a nation-state of their own, since the demographic ground will have shifted beneath them.

 

There was hope in the early years of the Erdogan administration that the Turkish government would show a human face toward the Kurds. When Fethullah Gülen's grassroots Hizmet movement energized the Adalet ve Kalkınma Partisi (AKP), or Justice and Development Party, to victory, Hizmet's advocacy of inclusivity without coercion became a cornerstone of Erdogan's administration. Restrictions on Kurdish cultural expression that had been in place since Atatürk's time were relaxed. But since the falling-out between Hizmet and the AKP in 2013, not only have the humanistic aspects of the AKP administration been replaced by a more dogmatic attitude in matters of religion, but Erdogan's liberal stance toward the Kurds has also turned authoritarian.

 

With Turkey's economic slowdown and the growing challenge from the Kemalist center-left, evidenced by its recent victory in Istanbul's mayoral election, Erdogan's incursion into Syria is as much a diversionary tactic as it is grandstanding to Turkish nationalism. Erdogan probably believes that crushing the Kurds will also take the wind out of the sails of Kemalist ultra-nationalists.

 

Erdogan's calculated risk may pay off in Turkish politics, but in the politics of the Muslim world his popularity has sunk considerably. Erdogan once stood out as the last beacon of hope for progressive Muslims, who saw in him proof of Islam's compatibility with modernity, progressive democracy, and economic development; he is no longer seen that way. His policies now seem more in line with China's ethnic cleansing and cultural indoctrination of the Uighurs in Xinjiang, Modi's incursion into Kashmir, Netanyahu's aggressive Jewish settlement of Palestine, Putin's suppression of Chechnya, and the policies of other right-wing nationalist and fascist rulers who have occupied the world political stage of late. Even in military terms, Turkey's bloody incursions into the land of the Syrian Kurds approximate those of the Saudis in Yemen.  

 

The White House's retraction of its earlier 'green light' statement, followed by threats of economic sanctions against Turkey, brings to mind the baiting of Saddam Hussein, who was lured into invading his southern neighbor Kuwait in 1990. As intriguing as that analogy sounds, this scenario may not play out the same way, for the following reasons:

  • Weapons sales to Turkey would be too lucrative for arms dealers to ignore. They would most likely find enough loopholes in the economic sanctions to get around them. The Kurds do not tilt the scales for any Western power in this regard.
  • Domestically, with the Gülen movement out of the equation, Erdogan has no choice but to dress his religious dogmatism in a nationalist mantle.
  • Most importantly, Erdogan's trump card is Turkey's military prowess as a NATO member, which makes the country invulnerable, invincible, and indispensable.
Can a 1960s-like Counterculture Emerge?

 

In the 1960s, if you opposed racism or American killing in Vietnam, there was a counterculture to support you. Music, films, TV, clothing, hairstyles, social thinking, speech—a whole web of interrelated phenomena existed to help you oppose the dominant culture, “the system,” or the “establishment.”

 

Today we have just as much reason to protest as did the 1960s dissidents. The Trump presidency, our climate crisis, and our senseless gun violence are alone enough to fuel a whole counterculture of outrage. But where are our balladeers like Bob Dylan and Joan Baez, our concerts like Woodstock, our plays and films like Hair and The Graduate?

 

We are social creatures, and most of us are followers rather than leaders. Like fish in water, we need a sustaining element to surround us. We need a counterculture, or an opposing culture or way of life, to embolden our emotions and imaginations. 

 

The counterculture of the 1960s was not perfect. It had its unthinking followers, its biases, its over-generalizations—like “Don’t Trust Anyone over 30”—yet it provided a strong alternative to the dominant consumer culture. Why did it disappear and where today is any new counterculture?

 

Mainly, the counterculture of the 1960s died because it lacked deep and sustaining roots. It was fueled by college students, civil rights activism, and opposition to the war in Vietnam. But students graduated and were absorbed into the "system," into the tentacles of corporate America, where countercultural values and "hippie" styles were unwelcome. Martin Luther King Jr. was killed in 1968, depriving the civil rights cause of its most powerful leader. (That year also brought the assassination of Robert Kennedy and the election of Richard Nixon.) Finally, American troops were withdrawn from Vietnam and the military draft ended in the early 1970s.  

 

What followed in the 1970s and 1980s was the disappearance of the 1960s counterculture and the absorption of many of its former adherents into the “system.” In his Bobos In Paradise: The New Upper Class and How They Got There (2000), conservative columnist David Brooks wrote: “We’re by now all familiar with modern-day executives who have moved from S.D.S. [a radical student organization that flourished in the 1960s] to C.E.O. . . . Indeed, sometimes you get the impression the Free Speech Movement [begun 1964 at the University of California, Berkeley] produced more corporate executives than Harvard Business School.” 

 

In his The Culture of Narcissism (1978), historian Christopher Lasch identified a new type of culture that had arisen. It stressed self-awareness. But, unlike the counterculture of the 1960s, it did not oppose the capitalist consumer culture of its day; rather, it meshed with it, goading "the masses into an unappeasable appetite not only for goods but for new experiences and personal fulfillment."

 

Many of the former youth protesters of the 1960s participated in this “mass consumption,” as the growing consumer culture sold mass entertainment in new formats (including for music, films, and books) to young adults. 

 

The last quarter century has brought little relief from our culture of consumption and narcissism. One of the period's most notable changes has been the expansion of the Internet and social media. In her highly acclaimed These Truths: A History of the United States, historian Jill Lepore stated that "blogging, posting, and tweeting, artifacts of a new culture of narcissism," became commonplace. 

 

Although some have argued that the Internet has promoted a greater sense of community, Lepore is far from alone in emphasizing its encouragement of narcissism. In 2017, for example, Newsweek stressed it in a piece entitled "Is Rampant Narcissism Undermining American Democracy?" Moreover, the previous year Americans chose for their president perhaps the most narcissistic and materialistic man ever to hold the office: Donald Trump.

 

Many Americans, however, oppose Trump. What prevents the emergence among them of a new counterculture to oppose him and the crass materialism he represents? 

 

For one thing, the college students of today are vastly different from those of the 1960s, and so too is higher education. It is much more expensive; more students incur large debts to pay for it; and a much lower percentage of students major in the humanities. Meanwhile, our consumer culture continues to prompt aspirations to earn a "good salary," and there are scant ways of doing so outside our dominant corporate culture. We need money not only for food, cars, and houses, but also for the things many younger people want: computers, Internet services, cell phones, and an increasing variety of leisure activities. 

 

Moreover, stimulants like the 1960s civil rights struggles and opposition to the Vietnam War (and the draft) are gone. They can no longer galvanize young people. Yet, there remains one great hope, one phenomenon that could help a new counterculture burst forth—our present environmental crisis. Regardless of 2020 political results, this new birth could occur. 

 

Such a counterculture could develop out of seeds planted by individuals like the German/English economist and environmentalist E. F. Schumacher and Kentucky writer Wendell Berry, plus two more recent seed-planters, 350.org founder Bill McKibben and Pope Francis. Moreover, now in 2019 there are signs that such seeds are beginning to sprout. 

 

Schumacher was a hero to many of those influenced by the original protest movement of the 1960s. Indeed, one of them, Theodore Roszak, who wrote The Making of a Counter Culture (1969), also wrote the introduction to Schumacher's popular 1973 work Small Is Beautiful.  

 

In it and other works of the 1970s, Schumacher criticized modern industrial societies for “incessantly stimulating greed, envy, and avarice,” for preparing people to become “efficient servants, machines, ‘systems,’ and bureaucracies,” and for driving the world toward an environmental crisis. Instead of focusing education on career preparation in such societies, he believed it should help us answer questions like “What is our purpose in life?” and “What are our ethical obligations?” 

 

Four years after Schumacher's death in 1977, Wendell Berry gave the first annual E. F. Schumacher Lecture. In it he praised Schumacher's adherence to spiritual values. In 1983, in his essay "Two Economies," Berry quoted Schumacher's belief, expressed in "Buddhist Economics," that the aim of such an economics should be "to obtain the maximum of well-being with the minimum of consumption." In his 2012 Jefferson Lecture, Berry suggested that our corporate capitalist consumer culture remained dominant and heavily implicated in our present climate crisis.

 

The 2009 Schumacher Lecture was delivered by Bill McKibben, one of the USA's "most important environmental activists." That same year, along with Berry, he protested at a coal-fired power plant near Capitol Hill in Washington, D.C. The previous December, the two men had sent out a letter noting the global-warming danger of continued reliance on coal: the "only hope of getting our atmosphere back to a safe level . . . lies in stopping the use of coal to generate electricity." In September 2019, McKibben's 350.org co-organized a massive global climate strike involving 4 million people in 163 countries.  

 

In 2015, McKibben lavishly praised Pope Francis's environmental encyclical and ended his essay by writing, "This marks the first time that a person of great authority in our global culture has fully recognized the scale and depth of our crisis, and the consequent necessary rethinking of what it means to be human."

 

In the encyclical itself, the pope stated that "the problem is that we still lack the culture needed to confront this crisis" and that there is an "urgent need for us to move forward in a bold cultural revolution." And like other critics of modern narcissism, he bemoaned "today's self-centred culture of instant gratification" and "extreme consumerism."

 

Thus, Schumacher, Berry, McKibben, and Pope Francis all share the essential view that today’s consumer culture needs to be replaced by one featuring, in the pope’s words, a “spirituality [that] can motivate us to a more passionate concern for the protection of our world.”

 

The cultural critic Raymond Williams once wrote that “a culture, while it is being lived, is always in part unknown, in part unrealized.” Hence, we may not yet realize that the four individuals mentioned above have been seed-sowers for an emerging new countercultural movement. But there are some promising signs.

 

Regarding music, one recent article notes: “2019 has been a year of youth climate strikes and record-setting heatwaves, and—probably not coincidentally—it’s also the year that pop music stars started speaking about climate change en masse. . . . Now we’re seeing actual, quality pop music talking about the climate crisis from artists like Billie Eilish, Lana Del Rey, and (he claims) Lil Nas X.”  

 

In film, Paul Schrader's First Reformed (2017) was a first-rate exploration of the effects of climate change on an environmental activist and his minister (Ethan Hawke). 

 

Literature has produced more numerous examples of climate concern. Berry, author of novels, poems, and essays, has long written about the health of the land and the planet, and in recent decades about climate change (see, e.g., here and here). It is difficult to think of any other major American cultural figure who has championed for so long the kind of values needed for a countercultural challenge to today's consumer culture.  

 

Published in 2000, prolific fiction writer T. C. Boyle's A Friend of the Earth depicts the world of 2025-26: "Global warming. I remember the time when people debated not only the fact of it but the consequence. . . . [Now] it's like leaving your car in the parking lot in the sun all day with the windows rolled up and then climbing in and discovering they've been sealed shut—and the doors too. . . . That's how it is."

 

In 2018, Amazon published a collection of seven climate-fiction (cli-fi) stories by major writers. The series was entitled "Warmer." In 2019, Amitav Ghosh came out with his new novel Gun Island, whose plot centers on global warming and its foolish denial. In September 2019 another well-known novelist, Jonathan Franzen, wrote about the coming of a "climate apocalypse." 

 

Among poets, Carl Dennis wrote the amazingly perceptive "The Greenhouse Effect" as early as 1985. More recently, the influential Poetry Foundation has gathered together "environmental poetry [that] explores the complicated connections between people and nature, often written by poets who . . . are serving as witnesses to climate change while bringing attention to important environmental issues and advocating for preservation and conservation." The Foundation's collection also includes essays on ecopoetry, an important new trend dealing with climate change and other environmental topics. 

 

In the early 1960s, Bob Dylan (winner of the 2016 Nobel Prize in Literature) composed and sang "The Times They Are A-Changin'." It began: 

 

Come gather 'round, people
Wherever you roam
And admit that the waters
Around you have grown . . . .