History News Network - Front Page
Wed, 08 Jul 2020 00:07:48 +0000
https://hnn.org/site/feed

Minor Temporary Changes to HNN's Schedule

Greetings to the HNN community. I hope everyone is enjoying this most unusual summer in whatever safe and socially responsible ways you can. 

I will be doing that at the end of this week and early next--taking a few days off to spend time outdoors. 

Readers can expect to see a slate of new op-ed articles posted to the HNN homepage as usual on Sunday, July 12. HNN's triweekly newsletters will be released in slimmed-down form on Friday, Monday and Wednesday, and News Editor Chelsea Connolly will be posting items to the Breaking News and Historians in the News sections. 


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176331 https://historynewsnetwork.org/article/176331 0
A Renaming Everyone Can Get Behind

For a decade at least, Washington, D.C., has been stuck in ugly political gridlock. As a step toward renewed bipartisanship, I offer this modest proposal. 


At the turn of the last century, Republicans engaged in a wave of memorials and renamings for President Ronald Reagan. In 1998, a Republican Congress passed a bill requiring renaming Washington National Airport for Reagan. The airport authority and many D.C. residents pointed out that it was already named for one president, but "Ronald Reagan Washington National Airport" went into effect nevertheless. 


After George W. Bush entered the White House in 2001, the renaming went on in earnest. Historians know that memorials in the U.S. have often sprouted in waves. Union monuments began to go up immediately after the Civil War. Most Confederate memorials were dedicated much later, in the period 1890 to 1940. Why? Because victors usually put up memorials, and in about 1890, the Confederacy — or more accurately, since it was a new generation, neo-Confederates — won the Civil War. And, Republicans argued, had not Reagan similarly won the Cold War?


A year or so after the breakup of the Soviet Union, I heard an interview about it with Eduard Shevardnadze, who had been Foreign Minister of the U.S.S.R. Asked if Ronald Reagan deserved partial credit in some way for the downfall of Communism and the breakup of the U.S.S.R., he was momentarily struck dumb. Clearly he had never thought of that hypothesis. Having considered it, he rejected it out of hand, citing more basic economic, societal, and ideological contradictions within the Communist system. 


But this made no difference to Republicans. Years ago Walt Kelly had mocked such thinking in his comic strip Pogo, in a scene in which Albert Alligator, claiming some political mantle at the time, took credit for the weather, a fine sunny day. "Why not?" he protested. "It happened during my administration, didn't it?"


The resulting mania for memorializing Reagan thus reflected a political rather than historical judgment. Historically, Ronald Reagan surely ranks no higher than the third best Republican president of the twentieth century, well below Teddy Roosevelt and Dwight Eisenhower. No matter. Grover Norquist, leader of the Ronald Reagan Legacy Project and even more famous for his no-tax-increase pledge, called for a monument to Reagan in each of America's 3,067 counties and on the national mall in Washington, D.C.; his face on the $10 bill, replacing Alexander Hamilton's; and perhaps his profile added to Mount Rushmore! "Or we could have our own mountain," suggested Norquist.  


An article by Greg Kaza in National Review called Mt. McKinley a "precedent" for renaming some other peak for Ronald Reagan. Of course, more recently McKinley has given way to Denali, its aboriginal name, but at the time the example made sense. 


I have a suggestion for a Mount Reagan that I think will never get renamed for someone else.


Each of our United States has by definition its highest point. The highest point in Reagan's home state of California is already named, of course, for Josiah Dwight Whitney, who founded the California Geological Survey. So is the highest point in Reagan's native state, Illinois, 1,235' high Charles Mound. In fact, the highest point in every state is already named, even Florida's Britton Hill, a mere 345' above sea level — except Delaware's.


Indeed, Delaware's tallest spot was misknown until recently. It was thought to be marked by a National Geodetic Survey azimuth on Ebright Road between Brandywine and Brandywood in far northern Delaware. The Ebright Azimuth turns out not to be the highest point in Delaware, however. The highest point in Delaware, at 451' a full two feet above the Ebright Azimuth, is in a mobile home park some 300 yards west. It is "the elevation in front of the first trailer," according to William S. Schenck of the Delaware Geological Survey. 


Of course, Ronald Reagan had nothing particularly to do with Delaware. But then he had nothing to do with aviation, either, except for smashing the air traffic controllers' union, which didn't stop Republicans from renaming Washington National Airport Ronald Reagan Washington National Airport. 


And, like McKinley with Spain, Reagan did win a war — with Grenada. So perhaps he does deserve to have a mountain named for him. Delaware's tallest spot — “Mount Reagan” — is perfectly appropriate. It matches exactly the size of the war Ronald Reagan won. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/blog/154370 https://historynewsnetwork.org/blog/154370 0
We’re All Historians Now 

July 4, 2020:  A friend writes, “I am troubled by what I see coming from the Left: a demand for ideological purity. You?” This was my answer.

 Happy 4th!

 How wonderful we are having a debate on this day.  It’s very American!

There’s karma at play in Woodrow Wilson’s becoming a victim of ideological politics.  He was an ideologue.  He came to ruin when he rigidly rejected compromises over the Versailles Treaty.  (Rigidity in his case may have been aggravated by his stroke.)

I have no problem with Princeton’s scrubbing his name off their international institute.  His name no longer inspires idealism in the young and it’s the young who are enrolled in the school.  They deserve a school named after someone who will inspire them.

I am also in favor of removing Confederate statues, or at least reimagining them.  They represent a white racist moment from this country's darkest days.  The only reason they were put up was that white Southerners wanted to vindicate the reestablishment of their control following the end of Reconstruction. They are divisive symbols and in many cases bad art. (See the statue of Forrest, the KKK's first Grand Wizard.)  They can go.

We should also rename all those army bases named for Confederates.  The only reason those traitors to the cause of liberty were honored in this way was that Southern politicians in Congress, favored by the old seniority system, held the chairmanships of the military affairs committees, itself a sign of the stultifying grip aging racist Southern Democrats held on power because the South was effectively a one-party state. There's no reason to honor traitors.

That said, I think the impulse to cleanse, destroy, and remove smacks of a Maoist spirit of fanaticism.  It's dangerous and has clearly gotten out of hand. It rests on the simple-minded and ahistorical idea that we should only honor individuals from the past who think like we do.  How silly!  An example: Twenty years ago hardly anybody believed gay people should be able to get married. Today, most people think they should.  Does this mean we should pull down any statues we have of people who went on the record years ago against gay marriage?  Of course not. Times change and so do our moral outlooks.  

A professor on Twitter this week recalled that he used to ask his students if they would have opposed slavery had they lived in the South.  All said yes.  How ridiculous.  We believe what we believe because of the culture in which we are raised.  Those Southerners who favored slavery favored it for the same reason Americans twenty years ago disfavored gay marriage. 

The impulse to render a moral judgment on the past is forgivable and understandable.  We naturally want to stand with people who share our values.   But we are not made into saints by virtue of having done so.  And yet many seem to think they are.  So they huff and they puff against Jefferson and Washington in the fallacious belief that this makes them pure.  It does not.  

These men were flawed — as we all are.  It is not for their flaws that we honor them, but for their achievements, which were many.  

And were we to begin to take down THEIR statues our country would be the poorer for it.  We are not united by a common ancestry, unlike many European countries.  We are united by our common ideals, the civic religion of America, which is grounded in those famous words of Jefferson:  “All men are created equal.”  Take away Jefferson and you risk upending the narrative on which this country is based.  

To be sure, the narrative keeps evolving.  Our narrative is not frozen in time.  Women, blacks, and gays, to name just three groups, have been added to the story of America, and of course I’m in favor of that. What I don’t favor is taking a sledgehammer to the old narrative and demolishing it.  Our people, like every other people on earth, need a story.

So I oppose pulling down statues of Jefferson and Washington while also wanting us to put up statues of slaves, women and gays.  

This is a conversation historians have been having since the sixties.  I'm delighted the rest of the country is now joining the debate. One of the key points historians have been making all these years is that a distinction needs to be made between history and memory.  History is complicated. Memory isn't history. It's simple. So when we put up a statue we are saying: Here is what we choose to remember.  Here is something we think worthy of remembering. But it's not history.  History is the story of how we got here and what people in the past believed and did.

During the Revolution Ben Franklin said something like:  we are all politicians now.  He might have added, we are all historians now, too.  Much as we’d like to avoid the hard demands history puts on us, we can’t sidestep the task.   This means coming to terms with the ugly currents that have heretofore been ignored or downplayed.  I find this bracing! 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/blog/154368 https://historynewsnetwork.org/blog/154368 0
This Independence Day, We Need a Patriotism Index




Independence Day is a day for patriotic reaffirmation and renewal, is it not? With continuing and growing controversy over immigrants and immigration reform, police brutality, racial discrimination and unrest, and even the inequalities of pandemic vulnerability and treatment, it seems inevitable that an abundance of patriot talk and an overabundance of political judging will be an enduring feature of the political and social landscape in the months leading up to the next election.


What better time, then, to reestablish the patriotic bona fides of everyone – EVERYONE – in this country? What better time to determine who legitimately deserves to be here enjoying what is rightfully ours? Considering the subjective nature of patriotism, to say nothing of the sanctimonious, accusatory tenor of most patriot talk, it seems only fitting that we now develop a long-overdue Patriotism Index to distinguish real Americans from the many poseurs and potential enemies who lurk among us.


The first thing we need to do is come up with a scoring scheme. We can begin by acknowledging that patriotism is about love of country. But what, exactly, do we mean by country? Is it the land itself: spacious skies, amber waves of grain, purple mountains’ majesty, fruited plains, redwood forests, gulfstream waters? For scoring purposes, there isn’t much in this definition that we can use, unless we give negative points to tree-hugging conservationists and environmentalists, who naturally tend to be seditious anyway.


A more fruitful avenue might be to think of country as the people – as in “We the People.” There’s promise here. For starters, the whiter, more Christian, and more heterosexual you are, the more points you deserve. If you’re swarthy, you get negative points. Likewise for immigrants. The farther removed you are from being an immigrant, the more points you should get – unless you’re Native American, in which case you get no points.


Or maybe country is the constellation of ideas that reflect who we are (or who we think and say we are). If we opt for rhetoric over reality in our scoring, we would award points for advocates of compassion, tolerance, dissent, generosity, idealism, altruism, and of course freedom and equality.


On the other hand, if we opt for reality over rhetoric, points would go to practitioners of arrogance, hypocrisy, dogmatism, bellicosity, materialism, moralism, and selfishness. My preference is for this latter approach.


What about the love part of the equation? How would we actually determine love of country? There seem to be three possibilities: by deeds, words, or symbols. Since we live in the postmodern media age, we would want to weight symbolism more heavily than words, and words more heavily than deeds.


We could have a symbolic point scale that reflects our use of the American flag for all manner of muscle flexing, saber rattling, and drum beating. Wrapping oneself in the flag would be worth the most points, followed by waving the flag, followed by merely displaying it (on your house, body, clothes, or vehicle). Doing what the flag actually stands for – protesting social injustice, for example, or criticizing government excess, ineptitude, or secrecy – would receive minimal (if any) points. And, of course, don’t forget those MAGA caps and t-shirts.


We also could have a rhetorical point scale. Unrestrained patriot talk would be worth lots of points – regularly reciting the Pledge of Allegiance, say, or enthusiastically singing the National Anthem, “God Bless America,” or “America, the Beautiful,” or repetitiously mouthing platitudes about Making America Great Again. No points for the Golden Rule, the Universal Declaration of Human Rights, “Give Peace a Chance,” or “Imagine.”


Strident warrior talk – especially red-faced, vein-distending, temple-pounding references to hunting down and destroying evildoers, invading rogue states, strengthening defense, “dominating the battlespace” of domestic protesters, or employing “overwhelming military force” against “anarchic rioters” perpetrating an “orgy of violence” – also would be worth big points.


And, since hatred of other countries and people is one of the purest reflections of love for one’s own country, various forms of jingoistic outrage and hate speech directed at suspicious or inferior beliefs, races, and regimes – China, say, or Venezuela, or, depending on the day of the week, North Korea, or even an occasional swipe at Russia to cover our tracks – would garner additional points.


Finally, there would be a scale of deeds that would distinguish loyal from disloyal acts and take motives and intentions into account. Conformity, compliance, and obedience would be weighted heavily; non-conformity, heresy, and dissent not at all. Sacrifice – especially the unintended loss of privilege and prestige – would receive big points. Service to country would be weighted more heavily than service to humanity. Special points would be awarded for intentional refusal to abet or enable weak, unfit interlopers and parasites – like the migrant caravans swarming the southern border – who undeservedly seek to appropriate American plenty and protection.


Once we are all assigned documentable patriotism scores, we will have in place a rigorous method for purifying ourselves and sanctifying our claims to greatness. Then it will be clear that the words inscribed on the Statue of Liberty – “Give me your tired, your poor,/Your huddled masses yearning to breathe free” – are sentimental bunk, and that the words that best speak for America today are those of the great superpatriot Barry Goldwater: “Extremism in the defense of liberty is no vice!”


(Take note, dear reader, if you have taken this proposal literally, rather than as irony – and, scarier still, if you have found the idea attractive – you’re part of the problem, not part of the solution. Be afraid, be very afraid.) 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176309 https://historynewsnetwork.org/article/176309 0
This Independence Day, Celebrate and Carry On America's Past Generosity

Ration Line, Oslo 1942


Under the cover of darkness in Nazi-occupied Norway, a package is brought to a woman whose husband had been taken prisoner. The box is filled with food, clothing and medicine for this mother of six children. Whoever dropped it off has long since disappeared into the night. But at that moment no one appreciates the stranger more than this mother. "I shall never forget when a stranger brought me a large package. At last, only by tears could I express my gratitude," she would write.

During World War II, many packages were delivered throughout Norway by mysterious couriers. It was part of a secret underground network dedicated to bringing food "illegally" to those in desperate need. This was aid delivered to those most oppressed by the Nazi German occupiers. The Nazis routinely plundered the country of resources, leaving Norwegians, especially families of those who resisted, to suffer. The Nazis placed limits on the amount of supplies that could come into Norway, making the secret relief operation quite necessary.

Imagine the joy of Norwegians living in poverty under occupation upon receiving one of these boxed miracles. Messages of thanks were collected by the pastors involved. One recipient wrote, "This I shall never forget; it was like a gift from Heaven." Another Norwegian wrote, "I am staying with my sister, who is suffering from tuberculosis, and now I can give her the food she needs."

Where did these life-saving boxes come from? They were prepared in Sweden and stored close to the border with Norway. Pastors who had fled Norway once the Germans took over were among the main organizers of this secret humanitarian relief. The American Legation in Stockholm and others provided support in Sweden for assembling the food, clothing and medicine into packages. The boxes were then hidden in wagons or trains to avoid detection by Nazi patrols once they crossed into occupied Norway. Brave volunteers made the daring runs into the Norwegian countryside. Some boxes were moved by boat. To be caught by the Nazis meant certain jail or worse. This is why the work was done under the cover of night. Pastors, nurses and other trustworthy citizens made the final deliveries to impoverished Norwegians.

But there was another element to this help for Norway. It can be traced all the way back to America, to the National War Fund, which collected donations to finance the daring missions. The charity American Relief for Norway, which was part of the War Fund, thrived because of the generosity of Americans. These funds were forwarded to Sweden to help power the underground humanitarian movement. Norwegian Pastor Sophus Norburg wrote, "On behalf of the suffering people of my country, who have benefited by this help, I send heartfelt thanks to American Relief for Norway, whose financial support has made this relief activity possible."

As we celebrate America's birthday on July 4th, let's remember how our nation's generosity has aided many hungry and oppressed people around the world. We must never lose that caring spirit which makes our nation a symbol of hope everywhere. This American humanitarian tradition is sacred and must be carried forward. We have a special bond with all poor people around the world who are struggling to get food and other basics. We will not forget them in their time of greatest need. That is the American way of giving. Actions we take at home today can lead to life-saving food tomorrow for a victim of war, disaster or poverty in a land far away. This charitable spirit is something truly special we can celebrate on Independence Day and every day. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176308 https://historynewsnetwork.org/article/176308 0
Madison’s Sorrow Goes Beyond Pompeo's Dinners

Photo Gage Skidmore, CC BY-SA 3.0




Shortly after Secretary of State Mike Pompeo had the inspector general of the State Department fired, reports surfaced that Pompeo has been quietly hosting dinners at State for the wealthy and well-connected on the taxpayer’s dime.   


In the grand scheme of the abuses of power by the Trump administration, this is small beans.  More irksome is that Pompeo and his wife pompously called the affairs “Madison Dinners,” referring to how James Madison, not only the famed architect of the U.S. Constitution but also a secretary of state and president, liked to invite foreign diplomats for dinner. Pompeo and his wife, Susan, went so far as to use a template for the invitations based on James and Dolley Madison’s invitations.  


Looking down from heaven, the Founding Fathers join many contemporary Americans in dismay, indignation and, yes, sorrow over what has befallen the republic during the presidency of Donald J. Trump.  About the only indignity the country has not suffered is Trump gold plating the White House and emblazoning his name across the West Wing.  A buffoon, bully, and demagogue, Trump is both an elected authoritarian and a leader incredibly inept at exercising executive power in a coherent, meaningful way. 


Even the most competent and effective governments have had trouble taming and stamping out the COVID-19 pandemic. But the most incompetent nation states have been completely overwhelmed, and Trump and his administration have failed in such mammoth proportions that nearly 125,000 Americans have died in just five months, more than double the toll of the Vietnam War, in which 58,318 Americans perished. 


While the coronavirus kills thousands, Trump and his cronies continue to undermine constitutional democracy in myriad ways, large and small. Attorney General William Barr’s Department of Justice issuing a motion to dismiss the Michael Flynn case represents a major attack on the rule of law; the petite corruption of Pompeo’s Madison Dinners is a minor affront to public ethics. 


Viewed from 30,000 feet, the real story is not President Trump and his ability to command attention 24/7/365 nor the misdeeds of administration officials. The real story is the attack on democracy. The most important story of this era—one that we neglect at our peril—is the Republican Party’s abandonment of conservatism and fierce embrace of illiberal reaction.  


We face a political crisis because our ideological consensus—perhaps the key factor in our political success—has been shattered in a way not seen since the Civil War. Our dilemma is not a single election; it is a malignant turn by one of the two major parties. The hard-right zealots who control the Republican Party are driven by a hatred of modern liberals—such as Franklin Roosevelt, Barack Obama, and Hillary Clinton—and a hostility toward the core liberal beliefs that have steered American politics since 1776. For nearly 250 years, America has been governed by a liberal-conservative consensus forged by John Locke, Thomas Jefferson, James Madison, Alexander Hamilton, George Washington, and Thomas Paine. Now extremists shred the Constitution and have launched a reactionary counterrevolution against the founders. 


Neither traditionally conservative nor European fascist in nature, the Republican right can best be understood as a radical movement to remake America in the image of ungenerous and unlovely ideals. Unlike traditional liberals and conservatives (both adherents to Locke’s liberalism), reactionaries either explicitly or implicitly accept and encourage public policies that enshrine and institutionalize core illiberal values—privilege, hierarchy, inequality, and exclusion—particularly for white males and their families. These are the foundational values of the feudalism that the founders fled.  


America is no longer divided into liberals and conservatives. While there are millions of traditional conservatives in the electorate, the base and elected leadership of the Republican Party is deeply reactionary. Put simply, you cannot be a conservative if you do not defend the Constitution. 


Madisonian democracy works when the two major parties share similar values while disagreeing about public policy. In the past quarter century, two illiberal impulses—racism and the insatiable desire for power by the ultra rich—have captured the soul of the Republican Party. Leaving behind the values and virtues endorsed by the founders and the likes of Abraham Lincoln, Dwight Eisenhower, and George H.W. Bush, the GOP marches steadily rightward, displaying its worst aspects while deserting an honorable tradition. 


For more than thirty years, the Karl Roves and Rush Limbaughs of the GOP happily taught the right to hate Washington; it should come as no surprise then that the Tea Party/Trump generation of Republicans have no respect for Madisonian democracy and the delicate checks and balances of the American constitutional system.  Madisonian politics is a wonderful system of government when the players abide by the norms of compromise and Lockean liberalism. But it also can become a system of politics in which a determined minority sows chaos and creates ongoing dysfunction if it so chooses. 


The political contest is now between liberals and illiberal reactionaries. And without conservatives debating liberals in an intelligent, engaged manner to address the nation’s problems, Madisonian democracy breaks down. There is no way to bridge the stark philosophical and policy differences that exist between liberals and reactionaries. The gap is too wide, the distrust too deep.  


In the multiparty systems that populate Europe, the solution would be easy. Republicans would split into two separate parties—classic conservatives in one and reactionaries in the other. Yet, because of our winner-take-all elections, the multiparty option does not exist in the United States. We are stuck with two major parties and when one party abandons the broad liberal-conservative center, the system stops working. Putting the shoe on the other foot, how would conservatives feel if suddenly liberals became Marxist-Leninists? 


Madison’s sorrow is not the lavish dinners hosted by the Pompeos at the State Department. No, Madison’s sorrow—and our political pandemic—is the breakdown of democracy in America. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176302 https://historynewsnetwork.org/article/176302 0
The Case for a National Liberation Day Holiday on December 6

The House of Representatives Passes the 13th Amendment, January 31, 1865



On December 6, 1865, the Thirteenth Amendment to the U.S. Constitution was ratified and became the law of the land. That amendment enshrined into law that “neither slavery nor involuntary servitude…shall exist within the United States…”

Prior to that seminal date in American history, slavery continued to exist in parts of the United States even after the end of the Civil War, including in states which had never joined the Confederacy.  President Lincoln’s Emancipation Proclamation had been a temporary executive order which Lincoln legally justified only as a ‘fit and necessary war measure’ within his powers of Commander-in-Chief in order to cripple the Confederacy’s rebellion, and purported only to free slaves in the Confederacy.  Issued on September 22, 1862, the proclamation provided that it would only take effect in those states currently in rebellion which did not re-enter the union prior to January 1, 1863. In those states which did decide to re-enter prior to January 1, 1863, the Proclamation was clear that its order of emancipation would have no effect and slavery could continue. 

After the surrender of Robert E. Lee at Appomattox on April 9, 1865, slavery was still fully legal and practiced in parts of the United States.  

On June 19, 1865, a full two months after Lee’s surrender, a little-known Union general ventured into Texas, the most remote of the Confederate states, and let it be known that, as Texas was one of the states which had failed to re-enter the Union prior to January 1, 1863, Lincoln’s Emancipation Proclamation would be enforced, at least until further notice and all hostilities ceased. Many of those in bondage had never even heard of the Emancipation Proclamation prior to that time.  It therefore remains a significant date in Texas history, despite the fact that on that date slavery continued to legally exist in other states.  

As late as December 5th, 1865, long after Lee’s surrender at Appomattox, and long after June 19, 1865, slavery was still legal and practiced in some of the states which had not engaged in rebellion, including Kentucky and Delaware. Only a Constitutional Amendment could end slavery in those states. Although Congress passed the 13th Amendment on January 31, 1865 over the vociferous objections of Democrats in Congress, it was not until December 6, 1865 that the amendment was ratified and became the law of the land. This was the date on which slavery was finally abolished by national law, and Kentucky and Delaware were legally required to free the slaves within their states. 

Today the question becomes which of these dates is the most significant and most worthy of celebration as a national holiday. Freedom was not enshrined across the land on September 22nd, January 1st, April 9th, or even January 31st. It was enshrined only when the Democrats in Congress were defeated and the 13th Amendment was ratified on December 6, 1865. Lincoln’s Emancipation Proclamation, based purely on military powers, is certainly significant in that it laid the groundwork for future legal emancipation. June 19th is also a significant date in Texas history, as it was a day that slaves in Texas were finally told of Lincoln’s military order of emancipation which emancipated some slaves in some states. But surely the most significant day is December 6, 1865, the day that slavery was legally abolished across the entire country, and not just in parts of the country. It is the date that should hereafter be celebrated as a national holiday and known as “Liberation Day”. 

Lest the movement for such a national holiday on December 6th be resisted on grounds that federal employees already have too many holidays, a modest suggestion is here proposed—namely, that Columbus Day might make way for a more significant day of celebration. Columbus, though a venturesome explorer, was perhaps the luckiest, finding the American continent by accident based on a faulty calculation of the circumference of the world, and, once finding it, proceeding to enslave those he found there. There are reasons America got its name not from Columbus but from a later explorer, Amerigo Vespucci. While some might regard Columbus Day as a celebration of Italian heritage, even that heritage is called into question by modern historians, some of whom believe Columbus may have been a Portuguese nobleman who adopted his name when he moved to Spain. In any case, the significance of Columbus Day may best be reflected today when mystified customers stand in long lines at the post office, only to discover when the doors are supposed to open that it is closed for a federal holiday they had forgotten about. Surely December 6, once made a national holiday, would not be so forgotten, and would be the cause of national celebrations on “Liberation Day.”

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176304 https://historynewsnetwork.org/article/176304 0
Abolition Movement Historian Ethan Kytle Discusses Confederate Monuments and Teaching Younger Students about Slavery



On June 24, the City of Charleston, South Carolina, removed a giant statue of the 19th-century pro-slavery leader John C. Calhoun that had dominated the historic downtown for more than a century.

City officials had previously resisted dismantling the giant statue, even after the 2015 tragedy when nine members of the city’s Emanuel African Methodist Episcopal Church were shot to death by white supremacist Dylann Roof.

Professor Ethan Kytle is co-author, with Blain Roberts, of Denmark Vesey’s Garden: Slavery and Memory in the Cradle of the Confederacy, which detailed the role of the many statues honoring the Confederacy in Charleston. He is also the author of Romantic Reformers and the Antislavery Struggle in the Civil War Era and is a professor of history at California State University, Fresno.


Q. Professor Kytle, give us a brief background on the Calhoun statue. Why is it so tall (110 feet)?

The Calhoun statue that was recently removed was installed in 1896. As my co-author, Blain Roberts, and I document in our book, Denmark Vesey’s Garden, this statue is Charleston’s second monument to the proslavery politician. The Ladies’ Calhoun Monument Association (LCMA) erected the first one in 1887. But the group took that monument down eight years later, citing aesthetic shortcomings. We have every reason to believe, however, that black Charlestonians’ ridicule and vandalism of the original Calhoun monument also played a role in the decision to take it down.

One explanation for the remarkable height of the second Calhoun monument, then, is that the LCMA wanted to protect it from the sort of defacement that the first monument suffered. Yet despite the second statue’s lofty perch, African Americans and other critics of Calhoun repeatedly vandalized the base.

Q. Why has the Calhoun monument lasted so long when other cities have removed many Confederate statues?

In recent years, the chief barrier to removing the Calhoun Monument was thought to be South Carolina’s Heritage Act, passed in 2000, which protects war monuments as well as memorials on public land. But Charleston’s legal team ultimately determined that this state law does not apply to the Calhoun statue, which, after all, is not a war monument [Calhoun died in 1850, though his pro-slavery politics and articulation of the doctrine of nullification were crucial to the Confederate project—ed.] and sits on private land that is leased to the city.

More generally, however, I am not sure that it is accurate to say that the second Calhoun monument lasted much longer than other Confederate and white supremacist statues. It stood for over one hundred years, but so have many Confederate memorials, most of which were also put up in the late nineteenth and early twentieth centuries—as conservative white southerners crafted their segregationist culture.

According to University of Alabama historian Hilary Green’s new project, which maps the removal of Confederate monuments and memorials, only a handful of these tributes were taken down before 2015. 

But things changed decisively that summer following the Mother Emanuel Church massacre, which took place just one block from the Calhoun monument. This tragedy proved to be a watershed in America’s long debate about the place of Confederate symbols in our commemorative landscape.

African Americans have critiqued and vandalized Confederate memorials for generations. It was only after the murder of the Emanuel Nine by a Confederate-flag waving white supremacist, however, that much of the rest of the country began to listen to them and to see the flag and Confederate monuments as the symbols of hate and oppression that they are. 

The Emanuel massacre, in other words, unleashed a public, African American-led campaign against Confederate symbols that is transforming our country for the better. Since then dozens of Confederate flags, monuments, plaques, and other white supremacist memorials, including the second Calhoun monument, have been removed across the country and beyond. 

Q. In recent weeks, dozens of Confederate statues have been removed, either officially or by protestors. What should take their place? Should we erect statues of African American leaders? Can you point to a successful replacement?

First off, I don’t think it is fair for a scholar like me to tell a community what sort of monuments it should put up. This should be a local decision—and one that takes into account the perspectives of the entire community, which was not the case with Confederate monuments.

But I certainly support the installation of statues of African American leaders as well as leaders of other underrepresented groups, including Native Americans, Asian Americans, Latinos, women, and others. And I would like to see more public memorials highlighting our nation’s history of discrimination based on race, gender, and sexual orientation as well. It is worth noting that this process has already begun in Charleston where a number of statues honoring African-American leaders have been erected in the past decade. 

Q. You have written extensively about the history of the antislavery struggle. What is different about the recent protests?  

One difference is that 19th century antislavery activists didn’t spend time working to remove white supremacist monuments because the United States landscape was not yet filled with them. 

But abolitionists did target other symbols of racial oppression. In 1854, for instance, William Lloyd Garrison dramatically burned copies of the Fugitive Slave Law and the Constitution at a Fourth of July rally. 

On a broader level, I would emphasize the similarities rather than the differences between today’s protestors and pioneering abolitionists such as Garrison, Harriet Tubman, and Frederick Douglass. Both groups targeted systemic racism and both faced off against local, state, and national authorities that were complicit in this injustice.

Q. You teach college students at a major public university. How much do they understand about slavery and its legacy? How could we improve teaching about slavery in the K-12 grades? For example, in California, state history is taught in the 4th grade; should that include a discussion of slavery? 

At Fresno State, I teach classes on the history of slavery, the Civil War, and Reconstruction. Most of my students are fascinated by these topics and tell me that they are only addressed in passing at the K-12 level. 

Since many of our history majors end up as social studies teachers themselves, I often ask them how they would teach these topics differently. We talk, for example, about whether it is appropriate to expose young students—say, 4th graders—to the brutalities of slavery. Nearly all of them agree that we shouldn’t sugarcoat our past in the classroom, even for our youngest students. Fortunately, organizations such as the Southern Poverty Law Center have started highlighting this shortcoming in our history curriculum and producing materials that help K-12 teachers redress it.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176310 https://historynewsnetwork.org/article/176310 0
New Novel "The Collaborator" Explores the Moral Ambiguities of a Holocaust Rescuer

Cover Image courtesy HarperCollins 2020


The Collaborator is based on a true story, and as soon as I heard it, a shiver ran down my spine. I recognised that shiver: it meant that I had come across something so extraordinary that I had to write about it. 

In 1944, during the darkest days of the Holocaust, when most of the Jews of Europe had already been murdered, and only the Hungarian Jews remained, a Hungarian Jew had the audacity to confront the dreaded Adolf Eichmann, who had recently arrived in Budapest with one mission: to send the Jews of Hungary to Auschwitz and achieve Hitler’s goal of Jewish genocide. 

I could already visualise that chilling scene: a powerless man who dared to enter Eichmann’s lair, and looked into the cold eyes of the fanatical Nazi who held the fate of almost a million Jews in his hands. 

Against all odds, he rescued over 1500 Jews from certain death by securing a train which took them to Switzerland. After the war, he migrated to Israel where he should have lived happily ever after, basking in the glory of his remarkable feat. But that’s not what happened. Instead, he was accused of being a Nazi collaborator, and no one could have predicted the tragic consequences that followed. 

This true story contained the ingredients that have always fascinated me: dramatic historical events, World War II, and the Holocaust. 

As Ernest Hemingway once observed, war gives us the opportunity to explore the best and worst in human nature, and I’ve always been fascinated by the way ordinary people behave when they are caught up in traumatic situations and their courage is pushed to the limit. 

 I had to research the true story and the historical background, but for me, researching is addictive. I love following the clues and discovering new facts, and feel tempted to keep on delving, but finally the moment of truth arrives when I have to stop researching and start writing.

Part of my research involved trips to Budapest and Israel. My visits enabled me to walk in the footsteps of the man who inspired The Collaborator, and to evoke the atmosphere of the time. In Australia, I tracked down Hungarian-born migrants whose parents or relatives had been on the rescue train. Talking to them gave me valuable insights into the man who saved them. 

As these events took place in 1944, I certainly didn’t expect to come across anyone who had actually been on that train, so I was astonished when my editor, who has no connection with the Jewish or Hungarian communities, called to tell me that she had just met a woman who had travelled on that rescue train as a child! 

I couldn’t wait to talk to this woman, and when I did, we discovered another coincidence:  it turned out that we knew each other from school! Although she was only four during the train journey, she remembered that her baby sister was born on that train. 

To gain more understanding of these issues, I read history books, biographies, and memoirs about the man who rescued so many people, but what intrigued me even more than his achievement was the controversy that raged around him, and continues to rage to this day. 

While some writers regarded him as a hero, others reviled him as a shady character, a self-serving Nazi collaborator. So, I became even more intrigued. 

I’ve always been fascinated by moral ambiguity, that grey area between adoration and condemnation, and this story was a perfect example. The more I thought about it, the more perplexed I became by the ethical issues it raised. 

Do we have the right to judge the actions of people in life and death situations? Are we honour-bound to keep promises, no matter to whom they were made, and in what situations? Can a man be a hero and a collaborator?

As I was writing a novel, I had to fictionalize the main character to have the freedom to interweave history and imagination. You could say that my challenge was to turn this history into mystery, because there is a huge mystery at the heart of my novel. It’s a love story too, a tale of passion and obsession, of vengeance and betrayal, but also of the healing power of truth, and the redemptive power of forgiveness. 

I knew I’d have to weave another strand into this story and invent fictional characters to create a plot. I remember sitting at my computer one summer’s day, staring at that depressingly blank screen, wondering how to begin, when suddenly I felt a chill. 

I was in a decrepit little room in Tel-Aviv, with the snow falling outside, while inside, a man with a thirst for vengeance sipped black tea through a sugar cube and penned poisonous words about a man he loathed. I had just met the character whose pamphlet set the whole tragedy in motion. 

Next, I watched a young woman walk onto my screen. Annika was an angry Australian journalist who was disenchanted with her job and disconnected from her heritage and her Hungarian background. 

I heard her arguing with Marika, her Hungarian-born grandmother who resembled some Holocaust survivors of that generation that I have met over the years. 

I sensed that Marika’s critical attitude to her grand-daughter, and her antagonism towards the man who saved her life during the Holocaust, would provide the key to the mystery. But how this would work out, I didn’t yet know.

It seems to me that writing a novel is a magical process in which fiction is created by tapping into the subconscious mind. People often ask whether I have the plot worked out before I write the novel. I don’t, and not knowing keeps me on the edge of my chair from the moment I start writing until I finish. Why did Annika’s grandmother say she never wanted to hear her rescuer’s name again? Annika’s search for the truth, which takes her to Budapest and later to Israel, forms the plot of The Collaborator, and reveals the secret that has poisoned the lives of three generations of women. 

My fascination with the man whose life inspired my novel increased as I wrote The Collaborator, which illustrates my conviction that the world is filled with unbelievable lives, and fiction reminds us that they are real.


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176301 https://historynewsnetwork.org/article/176301 0
While Monuments are Being Removed, a Historian Asks Questions

Photo Wikimedia Commons, Attribution-Share Alike 4.0 International




Following the lynchings of Black men and women and as part of the resulting “Black Lives Matter” and “I Can’t Breathe” protests, statues of those who associated themselves with the Confederate cause are increasingly coming down across the United States.


I support these moves. People have a right to walk around their neighborhood park without being terrorized by iconography devoted to people who denied their ancestors human rights. Knowing that these statues were generally built during the era of segregation is important. They were built specifically to honor and to perpetuate white supremacy. It was a way to memorialize racist “lost cause” rhetoric, not to recognize the past.


Confederate monuments in the United States are also often hiding in plain sight. Take Texas as an example.


In addition to the statues of Stephen F. Austin present across the state of Texas, consider other items sacred in Texas mythology. The Alamo? The San Jacinto Monument? While people are taught to see these as shrines of freedom that honor the Texas Revolution, they are all effectively Confederate monuments inasmuch as they represent the movement for white supremacy that drove Texans to rebel first against Mexico and then against the United States. Contrary to popular thought (at least in Texas), the Texas Revolution was not exactly about rebelling against a repressive Mexican government. In reality, Mexico was against slavery. The white residents living--largely illegally--in Texas desperately wanted to maintain slavery because they saw Africans as inferior. Moreover, almost all had arrived in Texas mere years or months prior to the Revolution with the specific goal of helping take Texas for the US. The San Jacinto Monument represents “freedom” to Texas Nationalists. But to non-white people, the glorified San Jacinto Monument represents part of what denied them recognition as humans. If these sacred places represent freedom, it's a very narrow freedom defined by dishonesty and unfreedom.


Names can enshrine white supremacy, too. The namesake of the Texas state capital, Stephen F. Austin, did more than any other individual to establish slavery in Texas. Due to Austin’s groundwork and activism for enslavement, Texas went from having 450 enslaved Black people in 1825 to 58,000 in 1850 and 275,000 in 1864. Other monuments to racism include streets such as “Plantation” and “Dixie,” mere miles from my house.


What about the name of our nation’s capital, Washington DC? George Washington certainly didn’t help promote freedom. What about the name “White House” and the fact that enslaved labor built it? The “President’s House” or the “Executive Mansion” was first named the “White House” in 1901 after President Theodore Roosevelt faced large-scale backlash after having dinner with Booker T. Washington. Should this building, should this city also be renamed? 


People also shouldn’t have to live in cities and on streets; shouldn’t have to attend lectures in libraries and auditoriums; shouldn’t have to work in buildings whose names honor their oppression. And, indeed, if the history taught in public schools weren’t so censored (in other words, erased), there would certainly be even more demands for dismantling monuments and for re-naming campaigns.


How far will things go? Will we re-examine public artifacts that perpetuate ableism? Classism? Homophobia? Sexism? These too are oppressive. We can’t battle racism alone. Audre Lorde fought against forces that wanted her to prioritize either her Black or lesbian identity, and instead believed, “There is no hierarchy of oppression.”


From another perspective, I worry that people opposed to more intersectional, inclusive, truthful histories will continue their efforts to hijack conversations with cries of “we can’t erase history!” Protesters sometimes complicate their anti-racism activism and lose supporters, as happened when they toppled the statue of the abolitionist and Union soldier Hans Christian Heg. A statue of Ulysses S. Grant was torn down too; he took stands against the Ku Klux Klan in the 1870s. Protesters need to know exactly who they are taking a stand against and why. (And, yes, I understand the ambivalent part that property destruction has played in civil rights movements and in garnering attention for change.) 


Freedom is a constant battle. During a time when fascism is a growing threat from Republicans, I sometimes think it is better to focus more attention toward immediate issues related to the upcoming elections this fall. Regardless of the timeline, will cities do the necessary follow-up work to revisit their own histories and then to make long-term plans for more complete reconciliations? Or, will cities wait for protesters to acquiesce and then continue going about their business in ways that maintain what bell hooks has appropriately termed the Imperialist White Supremacist Capitalist (and one could add Heteronormative Ableist Theistic) Patriarchy?


Race-based enslavement was widely considered “a positive good” two centuries ago. Homophobia was publicly pervasive two decades ago. Values change. What's the boundary between judging people who lived over a century ago by our contemporary values and recognizing that everyone has flaws? Given present standards of equity, few figures from the past could have a town named after them in 2020. That’s the point—debating statues and memorials is complicated, but it forces consideration of what ideas like “freedom” mean. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176307 https://historynewsnetwork.org/article/176307 0
But Why Is America Exceptional?

Alexis de Tocqueville, 1849, by Honoré Daumier. National Gallery of Art.



The debate about American exceptionalism is not really about whether America is exceptional (unique), but about whether it should be. Virtually everyone who encounters American culture can tell that Americans are different – they have their own sports, a fondness for guns, a fear of government that is unmatched in other democracies (and even many dictatorships), an antagonism to socialism and an attachment to religion that likewise stand out in the Western world, a peculiar political system, a preoccupation with race, a distinctive approach to criminal justice, a materialist and consumerist ethos, and a pop culture that is easily identifiable as American (from rock ‘n roll, jazz, soul, hip hop, and country music to Hollywood blockbusters, westerns, comic books, and talk radio).


This is why Europeans look at America and scratch their heads.   While all countries have their idiosyncrasies – Greece is different from Denmark, and France is different from Romania – they all see the United States as downright weird.   America shares this sense of itself as a special or odd country – the broad spectrum of Americans stretching left to right from Bill Maher to Bill O'Reilly agrees that the United States is unlike other countries.   The difference between Bill Maher and Bill O’Reilly is that Maher laments this and hopes that the United States joins the path of other Western liberal democracies, whereas O'Reilly celebrates America’s special path and hopes it endures.   So while Americans disagree on whether this country is better or worse for its distinctive features, they broadly agree that it is distinctive – an exception, a special case – among its industrialized counterparts.


The meaningful debate, therefore, is not whether the United States is different, but when and why it became different. Indeed, the most prominent theme that students the world over learn about early America is the formation of American identity – how American society became distinctively American, featuring uniquely American manners, philosophical and political sensibilities, religiosity, and sociology. Most historians hold that this happened early and slowly. They argue that European settlers encountered an exceptional physical and social environment in America. These historians point to America’s extraordinary geography, topography, climate, flora, and fauna, but also its singular social environment – its racial, ethnic, and religious diversity, its cheap land and high wages, slavery, and the absence of a legal aristocracy. These scholars suggest that this exceptional physical and social environment gradually reshaped the English settlers’ cultural traits. The colonies thus drifted steadily away from their English cultural roots, until they finally separated completely in 1776.


The idea that the American environment reshaped the culture of European settlers makes anthropological sense. After all, we expect peoples living in different environments – hot versus cold, mountainous versus flat, urban versus rural – to differ from one another culturally. But there is a contrary explanation for the formation of American culture and identity that makes as much sense, if not more. It rejects the claim that American social, religious, ethical, and political sensibilities were new – made in America and by America. It instead tells a story of an old English culture that was preserved in America as the mother country evolved and changed. According to this narrative, America’s sudden separation from Britain in 1776 allowed settlers to retain old English practices and beliefs that were later swept away by modernity in Britain. 


This debate manifests itself in virtually every aspect of early American history. Did America’s regional accents form in America over time, as Anglo-Americans mixed with other ethnicities in different parts of America, or were these accents regional British accents that settlers brought to America from the old country? Likewise, is the French-Canadian accent a bastardization of “proper” French, or is it a linguistic time capsule that preserved the sound of seventeenth-century French? Was the settlers’ religious experience uniquely American or an extension of English developments? Did America change English family structure, or did American families conform to traditional English patterns? Were American political habits products of life in America, or of the settlers’ English heritage? Such questions shape scholarship in fields as varied as technology, science, agriculture, military tactics, race, economic development, gender, literature, architecture, and fashion.


What all these questions lead to is the American Revolution – did Americans launch their rebellion because they had become increasingly different and distant from their mother country?   Or did they rebel because they were English, and thus infused with conventionally-English fears about governmental power?


American historians largely explain the Revolution as a product of English settlers transforming into Americans.   But the settlers themselves saw their political resistance not as a product of uniquely-American sensibilities that they had acquired by living in America, but as a product of their English heritage.   Indeed, they did not exhibit or perceive a growing sense of difference or distance from England during the seventeenth and eighteenth centuries.  They referred to themselves as English (or British), they did not express disapproval of monarchy, they took patriotic pride in Britain’s accomplishments on the world stage, and they saw themselves as integral components of a transatlantic British civilization.   Moreover, even as they resisted Parliament’s imperial policies in the 1760s and ‘70s, settlers understood and explained their resistance as conventionally English – they understood themselves not as agents of change, but as upholding traditional English practices and liberties.  They saw themselves as the successors of those who had launched England’s Civil War (1642) and Glorious Revolution (1688).  Indeed, the settlers’ beliefs regarding self-government, law, and arbitrary power – the animating ideas of the American Revolution – were mainstream beliefs in Britain, just as they were in the colonies. 


Just as early Protestants saw themselves not as innovators, but as upholding and preserving the ancient Christian faith, so too did Revolutionary Americans.   They were carrying the standard of an old English political culture that had lost its way.   They were trying to preserve an old established order, rather than create a new order, or a new system of government. 


Alexis de Tocqueville observed in 1832 that “the American is the Englishman left to himself.” He saw Americans’ cultural and political traits as old English traits. Political separation from Britain allowed these traits to remain preserved in America, like a bug in amber, as they were whittled away in the old country. The American Revolution – not America – created the distinctiveness that some laud as exceptional and others lament as simply odd.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176255 https://historynewsnetwork.org/article/176255 0
Annexation Will Be the (Formal) Beginning of Apartheid and the End of Zionism

Demonstration against Israeli annexation of the West Bank, Rabin Square, Tel Aviv-Yaffo, June 6, 2020

Photo Yair Talmor, CC BY-SA 4.0




Fourteen years ago, during the height of the Iraq war, former President Jimmy Carter published a short volume on Mideast affairs, entitled Palestine: Peace not Apartheid. The book was met with broad and vehement outrage. The Anti-Defamation League ran an ad campaign accusing Carter of anti-Semitism. Ethan Bronner, writing in the New York Times Book Review on January 7, 2007, called the charge a distortion, but noted that the ADL’s “biggest complaint against the book — a legitimate one — is the word ‘apartheid’ in the title, with its false echo of the racist policies of the old South Africa.” 


Carter was not the first or the last person to raise the analogy to South African apartheid in commenting on the situation in Israel-Palestine. But the comparison has always evoked responses spanning the spectrum of denunciation to apoplectic rage from those sympathetic to Israel, especially in the American Jewish community. Apartheid South Africa was a paragon of systemic evil and flashpoint of international political activism. The very idea that Israel could merit being juxtaposed with such clear injustice has been anathema in mainstream American Jewish discourse (and in the center of American political life more generally).  


But the refusal to tolerate any comparison of Israel to apartheid South Africa elides some of the most important complexities of Zionism as both an ideal and historical movement. Like many American Jews I consider myself a Zionist, and have since a very early age. However, the plan of the Netanyahu government to annex large portions of the West Bank forces me (and will force many others) to re-examine a commitment to Zionism in the light of history and principle. 


Lost in much of the debate about Mideast affairs, especially here in the United States, is any critical perspective on what it means to call Israel a “Jewish state.” It is always controversial, for example, when someone calls the United States a “Christian nation.” Why then is the identity of Israel as a “Jewish state” seemingly so taken for granted in most precincts of American politics? 


The answer lies in American perceptions (sometimes latent) of Zionism’s content and history. For most of Israel’s secular supporters (myself included), the concept of Israel as a Jewish state has not violated scruples against ethnic nationalism or theocracy because the “Jewish” identity and mission of Israel are understood as narrowly circumscribed. Israel is not “Jewish” in compelling its citizens to adhere to any particular aspects of Jewish faith, or in granting special privileges to some citizens on the basis of Jewish ethnicity, or in fulfilling particular “national” aspirations of the Jewish people (e.g. the reconstitution of the Kingdom of David or the reconstruction of Solomon’s Temple). 

The only legitimate agenda that could define Israel as a “Jewish state,” from this mainstream liberal perspective, is to provide a safe haven for Jews against anti-Semitism. Because the horrors of the Holocaust proved that anti-Semitism is a uniquely destructive threat, it was justifiable to establish one sovereign member of the community of nations in which a majority of citizens were Jews—an identity not constructed in racial, ethnic, or religious terms, but defined narrowly as “those subject to anti-Semitism.” Such an accommodation could only be fair, however, if all of the non-Jewish citizens of this state were given the same rights and privileges as the majority.


This has been the prevailing understanding of the Zionist underpinnings of Israel’s existence among secular Americans (for Christian Zionists, who in the US are quite numerous, the picture is different), and it has generally been corroborated by Israeli leaders’ own representations of their state’s history and mission. This was because the establishment of Israeli sovereignty and the organization of its military were largely overseen by the Labor Party under the leadership of David Ben-Gurion (1886-1973), which likewise enjoyed the control of Israel’s government for almost the first three decades of the nation’s existence. 


Among the early theorists of Zionism, Labor took the doctrine of Theodor Herzl (1860-1904) as canonical.  Herzl wanted his theoretical state to protect the lives, dignity, and rights of Jews, but he scoffed at the idea that anything about his “Jewish state” should be culturally or religiously Jewish. His second Zionist writing, the novel Altneuland (“Old-New Land”), is set in a Palestinian utopia where Jews and Arabs live together in harmony and as equal citizens of a Jewish-majority state, and in which the protagonists struggle against a wicked rabbi who is trying to turn the “Jewish state” into a theocracy.


Though this Herzlian ideal delineates the “center left” of Israeli politics today, it was much more right wing, in relative terms, during the first half of the 20th century, when some of the most prominent spokespersons of Zionism were opposed to the notion of a “Jewish state” altogether. Albert Einstein (1879-1955), undoubtedly the most visible international celebrity who endorsed the Zionist cause, considered all forms of nationalism toxic and obsolete. He embraced a form of “cultural Zionism” in which Jews could develop a safe “homeland” in Palestine without need for an established state structure. Martin Buber (1878-1965), a founding faculty member of Hebrew University, and Henrietta Szold (1860-1945), the founder of Hadassah, advocated the formation of a “binational state” that would be shared by Jews and Arabs equally (and that would obviate the need of any form of “partition” or “two-state solution”). 


A key inflection point was reached in 1948. Labor had enjoyed the allegiance of a majority of the Jews who established the “Yishuv” (the pre-independence Palestinian Jewish governing authority), and its military wing, the Haganah, became the backbone of the Israeli Defense Forces. Thus once partition and (Israeli) independence were achieved in the face of violent opposition, Ben-Gurion and his Labor partisans effectively had control of the Zionist agenda, and could define the project for the world at large. Leaders like Einstein were then confronted with a choice. They could either embrace Herzl’s vision of a “Jewish” state that was secular, egalitarian, and democratic, but nonetheless fully militarized and sovereign, or repudiate Zionism altogether. Einstein chose the former, and remained a staunch supporter of the new Israeli nation until his death.


We are arguably approaching another such inflection point now. The Herzlian ideal at the core of Israel’s national “persona” has been vexed since the state’s founding, but especially since 1967. Though the 20% of Israel’s population that is non-Jewish ostensibly shares equal rights and privileges with their Jewish compatriots, in practical terms the efficacy of their franchise has always been limited. No non-Jewish member of the Knesset, for example, has ever held a major cabinet post. In the most recent election, in which Benny Gantz’s “Blue and White” party garnered a narrow plurality of Knesset seats, his efforts to form a government were undermined by the fact that inviting the “Joint List” (a group that includes Arab political parties) to join a coalition would have caused many of his Jewish partisans and allies to mutiny. The occupation of East Jerusalem and the West Bank is an even greater affront to Herzlian ideals. In these territories under Israeli jurisdiction, Arabs are technically “stateless persons,” systemically denied the rights of fully sovereign citizenship.


Liberal secular Zionists like me, who embrace Herzl’s principles, have typically reconciled ourselves to the practical inequities of Israeli state and society by reference to geopolitics.  War and the threat of war have distorted the realization of Herzl’s vision. A just and equitable Zion has waited upon the achievement of a two-state solution. Once a fully sovereign Palestine becomes a peaceful neighbor of Israel and the Occupation ends, Israel will finally be free to function as a nation that is “Jewish,” but (more importantly) fully secular, democratic, and egalitarian (viz., one in which all citizens, Jews and non-Jews alike, enjoy the same rights, privileges, and civic power proportional to their numbers).


Many have persistently dismissed that hope as a “pipe dream.” Whatever the case may have been in the past, the notion of some future “two-state solution” will be indefensible if and when the Netanyahu government executes its annexation plan. The logic of the annexation plan is clear: expand Israeli territory (to include land currently settled by Jews who have taken up residence in the occupied West Bank) without increasing the Arab population of Israel enough to change the demographic balance of the electorate. 


The annexation plan would leave the Palestinian Authority in charge of isolated pockets of territory that could never, in practical terms, be cobbled together to form the socio-economic foundation of a viable state, effectively foreclosing the possibility of a two-state solution. It would render the status quo permanent in principle, relegating the Arab population of Israel-Palestine to a condition indistinguishable from apartheid. While the annexation plan might not be overtly “racist” in concept, it very deliberately dispenses power to distinct groups along ethnic lines. Arabs will be accorded different rights and powers as individuals on the basis of their residence on one or the other side of arbitrary lines that they have no hand in drawing. Since the lines will be drawn unilaterally and reviewed in the future by an authority that will be kept (by the very act of redrawing the map) in the control of a Jewish majority, all non-Jewish residents will be relegated to a permanent state of second-class citizenship or non-citizenship. That is the essential logic of apartheid.


Moreover, the “non-racist” character of this new apartheid “Greater Israel” would be dubious at best. Because Herzl’s vision is so central to the current historical self-image of Israeli society, Benjamin Netanyahu has been forced to pay lip service to that ideal, calling Herzl “our modern Moses” in various venues. But, as his published remarks make clear, Netanyahu’s own principles are much more shaped by the ideas of Vladimir (“Ze’ev”) Jabotinsky (1880-1940), Herzl’s critic and a vehement opponent of Labor Zionism. Jabotinsky’s followers formed the Likud Party that Netanyahu leads (Netanyahu’s father, the historian Benzion Netanyahu, was a pall-bearer at Jabotinsky’s funeral). 


At a speech given in memoriam to Jabotinsky last year, Netanyahu said, “We are constantly defending ourselves…This is the iron wall.” The last phrase alluded overtly to a famous essay by Jabotinsky, in which he laid out the basic principles of his “Revisionist Zionism”:


“Voluntary reconciliation between the Palestinian Arabs and us is absolutely out of the question, whether now or in a foreseeable future…Our colonization should either stop or continue against the will of the native population. And this is why it may continue and develop only under the protection of a force independent of the local population – an iron wall, through which the local population cannot break” (“On the Iron Wall” [1923], quoted from Kaplan and Penslar, eds., The Origins of Israel, 1882-1948, pp. 258-262).


Jabotinsky viewed Zionism as a colonial project constructed in explicitly racial terms, pitting the “Jewish race” against the Arabs in the same way that the “British race” set itself to dominate India or Nigeria. Netanyahu would no doubt disavow the overtly racist principles of Jabotinsky’s ideology, but his invocation of an “iron wall through which the local population cannot break” is grotesque. The annexation plan is an extension of Jabotinsky’s vision of a Greater Israel that would include not only the West Bank, East Jerusalem and Gaza, but parts of Lebanon and Jordan. The idea that Jewish sovereignty could be extended so expansively without some form of ethnic cleansing (a goal Jabotinsky himself repeatedly denied) or apartheid is absurd. Netanyahu’s plan may not be as ambitious as Jabotinsky’s as a matter of degree, but in kind it is likewise apartheid.


In the same way that “cultural Zionists” like Einstein were forced to choose between an embrace of Herzlian Zionism and a repudiation of Zionism altogether in 1948, practical realities in the wake of implementation of the Netanyahu annexation plan will confront the majority of today’s Zionists with a stark choice. Once annexation is formalized, all Israeli appeals to the legacy and principles of Theodor Herzl will be reduced to empty window dressing. A post-annexation Israel will, for all intents and purposes, embody the illiberal Revisionist Zionism of Ze’ev Jabotinsky. 


Any conscientious individual who cherishes liberal ideals and opposes apartheid will, at that point, be forced to repudiate Zionism. This, of course, does not entail abandoning the people of Israel (both Jews and non-Jews) or rejecting solidarity with the larger Jewish community. One may still cherish the goals for which Zionism was founded (the dignity, welfare, and security of the Jewish people) while acknowledging that those goals can no longer be achieved through Zionist means. 


If “Zionism” survives in any form after annexation, it will much more closely resemble that of Martin Buber and Henrietta Szold than that of Theodor Herzl and David Ben-Gurion. Once a two-state solution and the preservation of a “Jewish state” is no longer possible, the only just and practical outcome will be the unification of (pre-1967) Israel with the West Bank, East Jerusalem, and Gaza Strip into a single sovereign nation in which all residents enjoy the privileges of birthright citizenship. What form and name such a nation would take, and what policies it would embrace (e.g. which communities, Jewish and/or Arab, would enjoy a “right of return”) are not knowable in advance. But what is certain is that such a state would serve both the cause of basic fairness and the interests of the international Jewish community better than the illiberal Zionist project upon which Benjamin Netanyahu seems determined to embark.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176257 https://historynewsnetwork.org/article/176257 0
How Britain and Churchill Repelled the Nazi War Machine (1940-1941)




On June 4, 1940, Britain’s new Prime Minister Winston Churchill offered only defiance to the Third Reich, declaring to an enraptured House of Commons: “we shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender.” Two months later, Hermann Göring, the commander of the German Luftwaffe, planned a series of aerial attacks over the English Channel to obliterate the Royal Air Force (RAF) and force Britain to capitulate ignominiously.  As Nazi blitzkriegs had already overwhelmed the Low Countries (Belgium, Luxembourg and the Netherlands) and eliminated France from the war, Hitler – ever confident and vengeful – began organizing a rally in Berlin at the end of August to celebrate his forthcoming victory over Great Britain.  

The event would never take place. 

In mid-August, the Luftwaffe struck several British targets.  To the surprise and consternation of the invaders, however, the RAF outdueled the Nazi pilots and shot down forty-five German aircraft; the RAF lost only thirteen planes.  Two days later, Göring unleashed approximately 2,000 fighters and bombers upon the northern and southern coasts of England. Rather than recoil from the onslaught, the RAF repelled the attack, inflicted losses on the enemy and remained intact to fight another day. In the House of Commons on 20 August, Churchill eloquently summed up the brave contribution of the RAF, stating, “Never in the field of human conflict was so much owed by so many to so few.” (p. 157-182)

How did the British people manage to ward off a cross-Channel invasion and defend their island against the superior might of a ruthless enemy through 1940-41?  In The Splendid and the Vile: A Saga of Churchill, Family and Defiance during The Blitz (2020), Erik Larson – the author of the immensely popular The Devil in the White City (2003) – offers a compelling account of how British courage, sacrifice and tenacity in the face of annihilation resulted in victory for Britain, Europe and the world. 

The Price of Liberty: July 1940 – May 1941

Through vivid prose and detailed research, Larson captures the seismic struggle of the British people during the Battle of Britain (July-October 1940) and The Blitz (September 1940 – May 1941).  On September 7-8, German fighters and bombers covered the skies, crossed over the coastline and dropped bombs on London day and night.  Fires burned out of control, hundreds of corpses lined the streets and more than one thousand injured civilians cried for help.  Unable to gain sufficient air superiority to launch Operation Sea Lion – the Nazi plan for a full-scale land invasion – Hitler and Göring turned to total war to break British resistance. (p. 213-215) By targeting “working-class neighborhoods” and cultural landmarks in the capital, the German high command sought to demoralize Britain with relentless waves of destruction. (p. 285)

In mid-November 1940, the Luftwaffe carried out an all-out assault on Coventry – a city northwest of London.  Hour after hour into the night and until dawn, German planes unloaded scores of incendiary bombs and set the town ablaze.  As shards of glass fell to the ground, “Bodies arrived at a makeshift morgue at a rate of up to sixty per hour…so mangled that they were unrecognizable.” (p. 294-295).  Nearly six hundred perished during the Nazi demolition.  Month after month, German bombs rained down upon British cities and towns – piercing nights with devastation, dust and dire conditions. In mid-April 1941, the Luftwaffe leveled parts of London and took more than 1,100 lives in a single raid. (p.430) Three weeks later, an even larger-scale nocturnal operation consisting of “505 bombers carrying 7,000 incendiaries and 718 tons of high explosive bombs” damaged high-profile targets, including the Tower of London, Westminster Abbey, part of the British Museum and “the tower that housed Big Ben.”(p.469-473) How did the Britons manage the seemingly endless siege?

Life & Living during the Battle of Britain & The Blitz, 1940-41

Writing in short chapters, Larson provides a plethora of fascinating details to vividly illustrate how people adapted and organized to cope with stress, strain and shortages.  Contrary to a popular misconception, most Britons took cover from Luftwaffe bombs either in the safest and sturdiest room of their homes or in designated air raid shelters rather than descending into Underground stations.  Dilapidated shelters often welcomed frightened men, women and children with dampness, putrid odors, grime, little to no heat in the winter months, an insufficient number of restrooms and, in some cases, lice.  Upon inspecting several shelters, Clementine Churchill, the wife of Winston Churchill, noted the absence of a particularly critical element necessary to British survival – tea kettles. (p. 313-316)

For most Britons, tea offered a crucial, brief and ritualized respite from wartime anxiety.  The simple acts of steeping tea leaves, stirring and sometimes pouring cream into a cup served as a balm to the soul and provided comfort in a familial and national tradition.  Of course, others, including Winston Churchill, leaned on copious amounts of alcohol. Tea and alcohol combatted the dread of negotiating dark, crater-filled streets and living without electricity, tampons or sufficient food amid the prospect of dying from an explosion or being buried alive under a collapsed building. (p. 240-246) Through uncertainty, danger and hardship, the nation refused to succumb to despair. 

Winston Churchill: A Model of Leadership in a Time of Crisis

Over the course of the monograph, Larson effectively captures the breadth and brilliance of Churchill’s prime ministership during the darkest days of the war by judiciously weaving his speeches around anecdotes and chronicles of events.  The narrative makes clear that Churchill won both the war and the admiration of millions across the world because he possessed four critical elements of leadership.  

First, Churchill led with honesty.  Rather than attempting to assuage fears of a Nazi takeover or offer an expedient panacea, the prime minister leveled with the British public and refused to minimize the threat or the sacrifices that victory would require, gravely pronouncing “I have nothing to offer but blood, toil, tears and sweat.” (p. 32-33) 

Second, Churchill evinced an uncanny ability to inspire his peers and the nation by remaining true to his ideals and principles.   As France faltered against the Nazi blitzkrieg in late May 1940, the prime minister conducted a private session with his top ministers and confessed to entertaining the prospect of pursuing diplomacy with Hitler to ward off a future invasion.  After immediately dismissing the idea, Churchill told his elite audience “If this long island story of ours is to end at last, let it end only when each of us lies choking in his own blood on the ground.” (p. 57) Upon hearing his impassioned plea against tyranny at all costs, the initially astonished officials rushed to his side to pledge their support and praise his resolve. The battle for the hearts and minds of his own government had been won. 

Third, Churchill crafted a coherent strategy for victory.  While shaving one morning, the prime minister turned to his son Randolph and suddenly exclaimed “I think I see my way through.”  Randolph then queried “Do you mean that we can avoid defeat…Or beat the bastards?  Well, I’m all for it, but I don’t see how you can do it.”  In response, Randolph’s father offered only seven words: “I shall drag the United States in.” (p. 24-25) While robustly enlarging the RAF, bolstering anti-aircraft defenses, ordering the bombing of Berlin to boost British morale and establishing a formidable intelligence-gathering network, Churchill steadily forged a relationship with U.S. President Franklin Roosevelt through official diplomacy and personal correspondence for the purpose of winning American aid.  Deeming the survival of Britain essential to national and international security, Roosevelt skillfully outmaneuvered isolationists in Congress and supplied Churchill with destroyers in the autumn of 1940 and war materiel under the Lend-Lease Act in March 1941.  These lifelines proved crucial until the United States formally entered the war one day after the bombing of Pearl Harbor by the Empire of Japan (December 7, 1941).  

Last but not least, Churchill demonstrated both the integrity of his character and his mission to protect the lives and liberties of the free world by displaying genuine, heartfelt empathy for the suffering of the British people and all those afflicted by fascist aggression.  While inspecting the damage from a Luftwaffe raid in the East End of London in September 1940 amid flames and broken bodies beneath piles of rubble, tears streamed down Churchill’s face.  An elderly woman who witnessed his profound expression, told onlookers “‘You see, he really cares; he’s crying.’” In response to another woman who screamed “When are we going to bomb Berlin, Winnie?,” the prime minister “shook his fist and snarled, ‘You leave that to me.’” (p.217-218) In fact, Churchill cried in public on numerous occasions and summoned equal amounts of courage and compassion in exerting intrepid leadership.  


In The Splendid and the Vile, Erik Larson has not only produced an engaging and timely portrait of the perilous period when Britain stood alone against Nazi Germany but has also illuminated how tragedy and loss can be turned into triumph and justice through steadfast determination and solidarity of purpose.  As such, his new volume speaks volumes to the past, the present and the future of humanity.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176303 https://historynewsnetwork.org/article/176303 0
Of Steamboats and Fireworks: The Great Mississippi River Race of 1870



The 4th of July of 2020 will be different from those of the past. Masks will be worn. Distances will be kept. Crowds will be smaller. Hand sanitizer will be shared in abundance. The situation will likely lead to nostalgia and a desire for some form of distraction. An event that occurred exactly 150 years ago to the day stands out and could serve this purpose.


On the 4th of July of 1870, a 1,200-mile race between two steamboats on the Mississippi River was decided. The race attracted global interest. Reporters wrote about it. Gamblers wagered on it. People gathered and cheered for it. Currier & Ives immortalized it.


The race tested both man and machine. The captains were considered the best on the river. They also enjoyed a long-standing feud. Captain Tom Leathers of the Natchez knew the river well. The trip from New Orleans to St. Louis was his run. Captain John Cannon of the Robert E. Lee ran from New Orleans to Louisville. The stretch of the Mississippi River north of its confluence with the Ohio was unknown to him.


The two steamboats were of similar dimensions and build—long and narrow. Either would stretch from goal line to goal line on a football field, but at less than 50 feet in width, at least four steamboats would be required to cover the entire playing surface. 


Each steamboat had four decks of similar design. Cargo and engine works lay closest to the water. The tiny pilot house was perched on top. The passenger suites and salons were on the second deck. The smaller crew quarters comprised the third deck. A pair of smokestacks on each boat stretched upwards of 100 feet into the air, belching black smoke and embers. Each had two paddle wheels, set about three-quarters of the way back from the bow, that stood four stories high. 


The Natchez was newer and was engineered for speed. The Robert E. Lee was akin to a floating luxury hotel. It would seem logical that the Natchez should win with both design and experience on its side. 


Still, the Robert E. Lee arrived in St. Louis first. A crowd of 75,000 people was waiting. Its trip upriver was a record-setting 3 days, 18 hours and 14 minutes. The Natchez crossed the finish line 6 hours later. 


While the gap seems substantial, the Robert E. Lee did not catapult ahead until the vessels were within 200 miles of St. Louis. Indeed, they were about 30 minutes apart when a heavy fog fell onto the river. Thirty minutes was nothing when one considered the delays that were imposed by the mechanical failures—which happened with regularity—and the likelihood that the vessels would get caught on the shifting sandbars of the river. 


The Robert E. Lee continued slowly and cleared the fog in the early hours of July 4th. In contrast, Captain Leathers ordered the Natchez to be tied up until the fog cleared. He probably assumed Captain Cannon had ordered the same. By the time the fog cleared 6 hours later, the race was all but over. 


There would be no rematch. The heyday of steamboats was drawing to a close as the network of railroads criss-crossed the country. One coal-fired, steam-powered engineering marvel gave way to another. 


Jack Rudolph offers a compelling play-by-play of the events in an article entitled “Going for the Horns,” which appeared in American Heritage in 1980. The account makes for an excellent story for those stretched out on blankets awaiting the arrival of darkness and looking for some distraction on the 150th anniversary of this great race.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176305 https://historynewsnetwork.org/article/176305 0
Will Capitalist Consumer Culture Absorb Another Generation of Protest?

photo public domain, courtesy Wikipedia

I just read conservative columnist Ross Douthat’s article “The Second Defeat of Bernie Sanders,” and it made me sad. What Douthat thinks may be occurring as a result of all the protests stemming from the knee-on-neck killing of George Floyd is that Sanders “may be losing the battle for the future of the left.” Douthat sees the anti-racist protests “earning support from just about every major corporate and cultural institution,” and they are doing so because these protests do not threaten corporation-dominated capitalism the way Bernie Sanders’s campaign, for the last four years, has scared the profiteers of our existing capitalism, a capitalism that has produced the likes of Big Oil’s misleading propaganda regarding climate change and Purdue Pharma’s marketing of OxyContin, which put profits before lives and helped produce the opioid crisis.

Douthat admits that the “current wave of protests will have unpredictable consequences.” Moreover, any results that reduce racism will be a major positive step forward. And yet… Bernie has stood for a major challenge to capitalism in the age of Trump. This challenge should not be absorbed the way the consumer capitalism of the 1970s swallowed up the aspirations of many of the 1960s protesters--like a large ocean predator gulping up smaller fishes. 

In his Bobos In Paradise: The New Upper Class and How They Got There (2000), conservative columnist David Brooks wrote: “We’re by now all familiar with modern-day executives who have moved from S.D.S. [a radical student organization that flourished in the 1960s] to C.E.O. . . . Indeed, sometimes you get the impression the Free Speech Movement [begun in 1964 at the University of California, Berkeley] produced more corporate executives than Harvard Business School.” 

In his The Culture of Narcissism (1978), historian Christopher Lasch identified a new type of culture that had arisen. It stressed self-awareness. But, unlike the counterculture of the 1960s, it did not oppose the capitalist consumer culture of its day, but rather meshed with it, goading “the masses into an unappeasable appetite not only for goods but for new experiences and personal fulfillment.”

Many of the former youth protesters of the 1960s participated in this “mass consumption,” as the growing consumer culture sold mass entertainment in new formats (including for music, films, and books) to young adults. 

Recent decades have brought little relief from our culture of consumption and narcissism. One of the period’s most notable changes has been the expansion of the Internet and social media. In These Truths: A History of the United States, historian Jill Lepore states that “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. In 2016, we Americans chose for our president perhaps the most narcissistic and materialistic man to ever hold the office—Donald Trump.

In an email in late June 2020, Sanders urged his followers to keep seeking “an economic system based on the principles of justice, not greed”; to “make health care a human right and not a jobs benefit”; and to “create millions of good jobs by implementing a Green New Deal as we lead the world in combating climate change.” These are the type of goals that appealed to many of Bernie’s supporters. They saw him as a candidate who would combat some of the abuses of U. S. capitalism, abuses that have led a majority of Democrats, especially younger ones, to look more favorably on the type of democratic socialism that has appeared in Western Europe than on the capitalism dominating in Trumpian America. 

Capitalist deficiencies have long been apparent. Almost a half century ago the British economist and environmentalist E. F. Schumacher, like Sanders a democratic socialist, presaged Bernie by writing that capitalism failed to adequately consider “the availability of basic resources and, alternatively or additionally, the capacity of the environment to cope with the degree of interference implied.” By advertising and marketing, it also encouraged a “frenzy of greed and . . . an orgy of envy,” and “the cultivation and expansion of needs is the antithesis of wisdom.” Moreover, by ignoring wisdom, humans were in danger of building up “a monster economy, which destroys the world.”

But the capitalism observed by Schumacher had been the dominant economy in the USA throughout the twentieth century. From the 1890s forward, as historian William Leach has written:

American corporate business, in league with key institutions, began the transformation of American society into a society preoccupied with consumption, with comfort and bodily well-being, with luxury, spending, and acquisition, with more goods this year than last, more next year than this. American consumer capitalism produced a culture almost violently hostile to the past and to tradition, a future-oriented culture of desire that confused the good life with goods. It was a culture that first appeared as an alternative culture . . . and then unfolded to become the reigning culture in the United States.

Many Americans, including U. S. presidents, perceived that ever-increasing consumption was necessary for the nation’s prosperity, though few would state the case as bluntly as one marketing consultant of the mid-1950s, who declared: “Our enormously productive economy . . . demands that we make consumption our way of life, that we convert the buying and use of goods into rituals, that we seek our spiritual satisfactions, our ego satisfactions, in consumption. . . . We need things consumed, burned up, worn out, replaced, and discarded at an ever increasing rate.”

A 2020 critique of our present capitalism comes from Anne Case and Angus Deaton’s recently published book Deaths of Despair and the Future of Capitalism. Its authors write “Capitalism does not have to work as it does in America today. It does not need to be abolished, but it should be redirected to work in the public interest.” They believe (as does Bernie) that the “healthcare industry . . . is a cancer at the heart of the economy, one that has widely metastasized, bringing down wages, destroying good jobs, and making it harder and harder for state and federal governments to afford what their constituents need. Public purpose and the wellbeing of ordinary people are being subordinated to the private gain of the already well-off. None of this would be possible without the acquiescence—and sometimes enthusiastic participation—of the politicians who are supposed to act in the interest of the public.” 

The authors also criticize “the rising economic and political power of corporations,” which has allowed them “to gain at the expense of ordinary people, consumers, and particularly workers. At its worst, this power has allowed some pharmaceutical companies, protected by government licensing, to make billions of dollars from sales of addictive opioids that were falsely peddled as safe, profiting by destroying lives. More generally, the American healthcare system is a leading example of an institution that, under political protection, redistributes income upward to hospitals, physicians, device makers, and pharmaceutical companies while delivering among the worst health outcomes of any rich country.”

Other critics of modern-day U. S. capitalism include Pope Francis, who has called it “unjust at its root,” and Nobel Prize-winning economist Joseph Stiglitz. In his People, Power and Profits: Progressive Capitalism for an Age of Discontent (2019), Stiglitz argues for a more “progressive capitalism,” one that would recognize a “moral or transcendental ethic” and seek not just profits, but also the common good.   

In a 2016 talk Sanders raised the question, “How moral is our economy?” In his 2020 campaign he indicated that our present Trumpian economy, which favors the rich, furthers inequality, and despoils our environment, is immoral. 

Douthat suggests that Bernie’s criticism of capitalism is “losing the battle for the future of the left” because modern-day U. S. capitalism is indifferent to the protesters’ “purge of Confederate monuments,” and because corporations can respond to criticisms of police brutality by condemning racism without adopting the type of “transcendental ethic” that Stiglitz, Pope Francis, and Sanders are urging. 

In an early 2020 HNN op-ed Gary Dorrien wrote that “Sanders lines up with FDR, Martin Luther King Jr., and Catholic social teaching in believing that real freedom includes economic security.” And, indeed, Martin Luther King, among others, perceived the necessity, as Michelle Alexander has written, of “openly critiquing an economic system that will fund war and will reward greed, hand over fist, but will not pay workers a living wage,” and he ignored “all those who told him to just stay in his lane, just stick to talking about civil rights.” In one of his better speeches, at New York’s Riverside Church in 1967, he said, “we must rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered.”

Douthat may be right. Present-day protests may make corporations less racist without affecting their essential nature. They were able to digest, and even profit from, the hippies and protesters of the 1960s and early 1970s--after all, at some point young adult protesters had to earn money and where else were they going to do that except within the U. S. capitalist economy?

Nevertheless, protesters and leftists in 2020 would be mistaken to forget or fail to heed Bernie’s criticism of today’s corporate capitalism. Like that of MLK, Stiglitz, and Pope Francis, it contains much wisdom. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176306 https://historynewsnetwork.org/article/176306 0
The Youngest History-Makers in the U.S. Senate Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015).  A paperback edition is now available.


The US Constitution sets the minimum age to serve in the US Senate at 30 years.  Very few senators have taken office at the minimum age, but a few of them have made history as significant figures.

Two of these senators were selected by state legislatures in the early 19th century, without attention to their precise age, and actually came to the Senate before turning 30, while five others, elected by the voters of their states after ratification of the 17th Amendment in 1913, took office at the minimum age of 30.

The youngest senator ever to serve was John Henry Eaton of Tennessee, who entered the Senate at age 28 years, 4 months, and 29 days and served from 1818-1829.  He was a strong supporter of Andrew Jackson, having served with him in the War of 1812, including the Battle of New Orleans, and was a strong critic of John C. Calhoun and his opposition to the protective tariff. When Eaton was named Secretary of War (1829-1831) under Jackson, controversy erupted over the fact that Eaton had married Peggy Timberlake very soon after her husband died. The ensuing sex scandal, known as the “Petticoat Affair,” riled Jackson, created bad blood with Calhoun and his wife, who accused the Eatons of unseemly behavior, and helped lead to the Nullification Crisis over the protective tariff in 1832-1833.  It was the first known sex scandal in American presidential history.  Eaton later served as Florida Territorial Governor from 1834-1836, and as US Minister to Spain from 1836-1840.

Henry Clay, arguably the most famous US Senator of all time, served in the Senate from Kentucky for a total of 15 years, over four periods: 1806-1807, 1810-1811, 1831-1842, and 1849-1852.  When first in the Senate, he was about four and a half months short of the legal age of 30. He became the youngest Speaker of the House of Representatives when he was six weeks short of age 34, serving in that body from 1811-1821 and from 1823-1825, most of that time as its leader.  He also served as Secretary of State under President John Quincy Adams from 1825-1829, and he lost three presidential races, in 1824 to John Quincy Adams, in 1832 to Andrew Jackson, and in 1844 to James K. Polk, while being considered a serious contender several other times.  Clay was a leader of the Congressional group known as the War Hawks, which helped push America into the War of 1812 against Great Britain.  He promoted the “American System”: a strong federal government, a strong national bank, a high protective tariff, and federally sponsored internal improvements.

Clay also became known as the “Great Compromiser” for promoting the Missouri Compromise of 1820 over the issue of slavery expansion; the compromise that resolved the Nullification Crisis of 1833, preventing civil war over the protective tariff dispute between President Andrew Jackson and former Vice President John C. Calhoun; and the Compromise of 1850 (negotiated with Daniel Webster and Stephen Douglas), averting civil war once again.  Clay was also one of the founders and promoters of the Whig Party as the opposition to Jacksonian Democracy.  In 1957, the Senate chose Clay as one of the five most significant members in its history, and a 1982 poll of scholars ranked him in a tie with Wisconsin Progressive Republican Senator Robert LaFollette, Sr. as the most influential senator of all time.

In the modern era of the US Senate, five senators who took office at age 30 were significant in the history of that body.  The first was Robert M. LaFollette, Jr., son of the famous “Fighting Bob” LaFollette of Wisconsin, himself one of the five senators honored in 1957 as the greatest in the body’s history, and a Progressive candidate for President in 1924.  Upon his father’s death in June 1925, the younger LaFollette succeeded him by election at age 30 and approximately eight months, and served for the next 21 and a half years (1925-1947), until he was defeated in the Republican primary by the infamous Joseph McCarthy.  

“Young Bob” became an acknowledged leader of the progressive wing of the Republican Party, as his father had been, and with his younger brother, Philip LaFollette (who in 1931 became the youngest governor yet elected in Wisconsin), formed the Progressive Party of Wisconsin in the 1930s.  LaFollette Jr. supported much of the New Deal, as demonstrated in this author’s book, “Twilight of Progressivism: The Western Republican Senators and the New Deal” (Johns Hopkins University Press, 1981).  But he turned against Franklin D. Roosevelt on foreign policy and became a leader of the isolationist bloc in Congress.

Rush D. Holt, Sr. of West Virginia was elected to the Senate in 1934 at age 29 and five months, and had to wait until June 1935 to take his seat at the minimum required age of 30. He served one term of five and a half years, proclaiming himself a spokesman for the common man and a critic of privately owned utility corporations.  Although he began as a strong supporter of Franklin D. Roosevelt and the New Deal, Holt, at heart more a traditional populist than a New Deal liberal, rapidly became a conservative critic, ranked by one scholarly estimate as the third-most conservative Democratic senator between the New Deal and the end of the 20th century. 

He became much more newsworthy for his strong isolationist stands on American foreign policy in the late 1930s. He was a spokesman for the America First Committee in 1940, after having supported the Neutrality Acts of the mid-1930s and opposed membership in the League of Nations, the Reciprocal Trade Agreements, naval expansion legislation, and the Selective Service Act.  His controversial, outspoken rhetoric led to his defeat in the 1940 Democratic primary, where he finished a poor third.  He sought election to the Senate again, but his national career was over.  His son, Rush D. Holt, Jr., served in the House of Representatives as a Democrat from New Jersey from 1999-2015.

Senator Russell Long of Louisiana, son of the famous and infamous “Kingfish,” Governor and Senator Huey Long, entered the Senate at age 30 years and almost two months and served a total of 38 years, from 1949-1987.  He chaired the Senate Finance Committee for 15 years, and thanks to his seniority and his commitment to the elderly, the disabled, the working poor, and the middle class, he came to be regarded with respect by his fellow senators.  

He had a major role in much of the Great Society legislation under President Lyndon B. Johnson in the 1960s, including Medicare, and had a major impact on all tax legislation for decades.  However, his Achilles heel was his regular opposition to civil rights, including his vote against the Civil Rights Act of 1964, although he moderated that opposition in later years.  He was also a major critic of the Earl Warren-led Supreme Court in the wake of the pathbreaking school integration case, Brown v. Board of Education, in 1954.  But despite these perceived negatives, his influence was massive.

Senator Edward M. (Ted) Kennedy of Massachusetts came to the Senate in November 1962, at 30 years and about eight months, and served a total of 47 years and eight and a half months until his death in August 2009, making him the fourth-longest-serving US Senator in American history.  Part of the Kennedy political dynasty, he was the brother of President John F. Kennedy and of Attorney General and Senator Robert F. Kennedy, and he sought the presidency unsuccessfully against Jimmy Carter in 1980.  Long expected to follow his brothers to the presidency, he gave up that pursuit in his last three decades and instead became admired as the “Lion of the Senate,” respected by fellow Democrats and by Republicans across the aisle, often working on legislation with such Republican leaders as John McCain of Arizona and Orrin Hatch of Utah.  His major commitments were health care reform, immigration reform, civil rights, gun regulation, and social justice at home and abroad.  

At times highly controversial, he was also acknowledged as the voice and conscience of American progressivism and as a strong and effective speaker and debater.  He and his Senate staff authored about 2,500 bills, of which more than 300 were enacted into law, and cosponsored another 550 bills that became law.  Any listing of outstanding US Senators would place him in the top ten of modern times.  His bipartisanship did not stop the opposition from often portraying him as a polarizing figure, but much of that was political posturing; many who bitterly opposed him in debate on the Senate floor respected him deeply.  His battles against Supreme Court nominees of Republican presidents made him highly controversial, as did his stands on foreign policy issues, including Vietnam, Afghanistan, Northern Ireland, and Israel.  His strong efforts on the environment and on gay and transgender rights also made him notable and seen as highly principled.  Few senators have had the impact of Kennedy, and his death left a void in the Senate that proved hard to fill.

Finally, Senator Joseph Biden of Delaware was elected to the Senate in November 1972, shortly before his 30th birthday late that month. Biden took the Senate oath at age 30 and about seven weeks, but at a time of great personal tragedy: his first wife and daughter were killed in a traffic accident a month after his election, and his two sons were severely injured.  He thought of giving up his Senate seat, but senior members of the chamber convinced him to take the oath and helped him emotionally through the horrible adversity while he spent as much time as he could with his two sons as they recovered.  He would later marry his second wife, Jill, and have a daughter with her, and would go on to one of the longest periods of service in the US Senate, 36 years from 1973 to 2009, leaving as the 18th-longest-serving senator. He, too, was seen as a strong and effective speaker and debater.

Had Biden remained in the Senate, he might today be approaching the longest service of any senator in history.  But he was called upon by Barack Obama to serve as the 47th Vice President of the United States from 2009-2017, and he is regarded, along with Walter Mondale, who served under Jimmy Carter, as one of the two most active, engaged, and influential Vice Presidents.  The Obama-Biden team was seen by some supporters as a “bromance” between two unlikely friends, and it was assumed that Biden might run to succeed Obama in 2016, but the tragic death of his older son Beau in 2015 derailed such plans. In any case, there was no certainty that Biden would have been able to overcome Hillary Clinton for the nomination.  

But now, in 2020, Joe Biden is the Democratic nominee for President, with a record of accomplishments that is hard to match, including leadership of the Senate Foreign Relations Committee for four years and of the Senate Judiciary Committee for eight years. With such a long Senate record, Biden can be criticized for some policy positions, votes, and verbal gaffes, but he stands out as genuine, kind, generous, and decent, a person of true empathy and concern for others, rare in any politician.  He has extensive contacts with foreign government officials and knows how to work across the aisle, as he often did in the Senate and as Vice President, helping to smooth conflicts with his diplomatic style.  Biden is perceived as more of a moderate centrist than others in his party, such as his former 2020 primary opponents Vermont Senator Bernie Sanders and Massachusetts Senator Elizabeth Warren. The fact that he is seen as “less progressive” than they are seems to have promoted his commanding lead in public opinion polls for the Presidency as of the end of June 2020.  

Whether Joe Biden can go on to become the 46th President of the United States will be decided in the next four months. If he does, he will become the President who first held national office at a younger age than any of his predecessors, while also, ironically, being the oldest first-term President, at age 78 and two months on Inauguration Day 2021.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/blog/154367 https://historynewsnetwork.org/blog/154367 0
Ken Burns: Our Monuments are Representations of Myth, Not Fact  


As we consider what role monuments play in our culture, I’d ask us to listen to the words of James Baldwin from the film I made on the Statue of Liberty. That film, which aired on PBS in 1985, set out to understand the history of why that monument was created, as well as the symbolism and myth that have come to surround it.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176180 https://historynewsnetwork.org/article/176180 0
The Roundup Top Ten for July 3, 2020

Suspect Science: Today’s Anglo-American Eugenics

by Alexandra Fair

Time and time again, the Pioneer Fund subsidized research that advanced eugenic theories about racial difference and actively undermined racial equality.


Underwater: Global Warming to Flood the Former Ports of the Transatlantic Slave Trade

by Daniel B. Domingues da Silva

How will the inundation of historic seaports as climate change progresses affect historical memory of the Atlantic slave trade? 



A Monument to Our Shared Purpose

by Allen C. Guelzo and James Hankins

The Freedmen’s Memorial in Washington embodies not white supremacy, but African-American agency and cooperative struggle.



Democrats May Beat Trump in November and Still not Learn the Most Important Lesson from his Presidency

by Daniel Bessner

Democrats must not just defeat Trump; they must commit to fighting a culture of elite impunity that has enabled the rise of Trump and an unaccountable Republican Party. 



Makers of Living, Breathing History: The Material Culture of Homemade Facemasks

by Erika L. Briesacher

Material culture centers objects as historical documents that can be read like a text; whether highlighting the physical piece or searching for the biography behind it, this approach reveals complex sociocultural behavior.




The Confederates Loved America, and They’re Still Defining What Patriotism Means

by Richard Kreitner

For most of U.S. history, patriotism and white supremacy, the values supposedly embodied by the two flags, have hardly been at odds. Rather, they have been mutually constitutive and disturbingly aligned.



When France Extorted Haiti – the Greatest Heist in History

by Marlene Daut

Because the indemnity Haiti paid to France is the first and only time a formerly enslaved people were forced to compensate those who had once enslaved them, Haiti should be at the center of the global movement for reparations.



Before Stonewall, There Was a Bookstore

by Jim Downs

Networks of activists transformed Stonewall from an isolated event into a turning point in the struggle for gay power.



Racist Violence in Wilmington’s Past Echoes in Police Officer Recordings Today

by Crystal R. Sanders

Wilmington, North Carolina police officers who spoke eagerly about the chance to kill black protesters evoke the history of a violent white supremacist coup against the city's biracial government during Reconstruction.



Why We Owe Gay Marriage to an Early Trans Activist

by Eric Cervini

Why isn't Sylvia Rivera a household name?


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176298 https://historynewsnetwork.org/article/176298 0
The Slow Path to Police Reform in Northern Ireland

Crumlin Road Tunnel, Belfast. The tunnel connected a now-demolished courthouse with a now-empty jail, allowing suspected terrorists to be moved between the two buildings without exposure to the public. Photo: Don Beaudette



In the days following the death of George Floyd at the hands of a white Minneapolis police officer, political leaders and activists have been calling for comprehensive police reform, with some activists demanding that governments defund or abolish the police. As cities like Minneapolis struggle with the question of how to address racism within their police forces, it is worth examining how other communities across the world have confronted similar challenges in the recent past. 


Northern Ireland is a part of the United Kingdom with a population divided along ethno-religious lines between Protestant unionists who want to remain a part of the UK and Catholic nationalists who want to become part of the Republic of Ireland. Catholics living in Northern Ireland faced significant discrimination in areas of voting rights, housing, employment, and policing from the time of Northern Ireland’s creation throughout most of the 20th century. 


A Catholic civil rights movement, modeled on the example of the Black civil rights movement in the U.S., demanded the redress of these grievances starting in the late 1960s. Tensions between Catholics and Protestants came to a head in the summer of 1969, when the British Army was deployed on the streets of Northern Ireland to restore order in response to widespread rioting, during which the Royal Ulster Constabulary (RUC), the UK’s police force in Northern Ireland, manifestly failed to defend Catholic communities.


Over the next three decades, the RUC killed 44 Catholics, 26 of them civilians. The RUC’s actions against the Catholic community extended beyond their own violence into collusion with Loyalist paramilitary groups. One of the most notorious such crimes was the murder of the solicitor Pat Finucane as he ate dinner with his family on February 12, 1989. This murder was carried out by members of the Ulster Defense Association, likely acting on information obtained from members of the RUC. 


Sectarianism and collusion with paramilitaries created a situation in which much of the nationalist community saw the RUC as an occupying force without any legitimacy to police their community. Even those few Catholics who joined the RUC faced sectarianism and discrimination from their Protestant colleagues, and these attitudes also influenced the police force’s interactions with Catholic communities throughout Northern Ireland. 


As Northern Ireland’s peace process accelerated in the 1990s, the people of west Belfast had “no faith whatsoever in the RUC. As far as nationalists were concerned the RUC was simply another orange militia, which was set up to basically keep the nationalist population in their place,” according to a senior republican who spoke with one of the authors for an interview in May 2015. 


The Catholic community’s lack of faith in the RUC was reflected in the police force’s demographics. On the eve of the Good Friday Agreement of 1998, only about eight percent of the RUC was Catholic. But police reform was a key feature of peace in Northern Ireland. And the reforms were so successful that today nearly 90 percent of the population of Northern Ireland has confidence in law enforcement. 


The transformation of the RUC into the Police Service of Northern Ireland was a complex and multifaceted process. A core goal of this transformation was to create a “police service capable of attracting and sustaining support from the community as a whole” that would be “representative of the society it polices.” 


At the same time, reformers in Northern Ireland sought to fundamentally change the manner in which policing was conducted, particularly in nationalist communities. The PSNI was directed to adopt an approach that focused on de-militarized, community-based policing. Reforms in this area included the creation of neighborhood policing teams, a commitment to policing via foot patrol, the replacement of armored Land Rovers with ordinary police cars, and more direct engagement with grassroots restorative-justice organizations.


PSNI Superintendent for Service Delivery Bobby Singleton reflected on the significance of these changes in a 2015 interview with one of the authors. According to Singleton, “neighborhood policing has played a critical role in terms of … establishing, in some cases, relationships where they didn’t previously exist with the police. I think individuals in neighborhood policing teams have really toiled, through blood sweat and tears to really create trust and relationships which have allowed communities to put to one side some of the issues that they had with the police, and to try and work and cooperate with us.”


In Northern Ireland, the challenge of acceptable, effective, community-based policing was inextricably linked with the idea that all of Northern Ireland’s people should see themselves and their communities reflected in the police force. 


Increasing Catholic recruitment to the reformed PSNI was itself a controversial process. Sinn Féin, a nationalist political party with close links to the IRA, was highly critical of the legislation creating the PSNI and initially refused to endorse the new police service.  In 2001 Sinn Féin leader Gerry Adams warned potential Catholic recruits to the PSNI that they would “be accorded exactly the same treatment the republican movement accorded to the RUC.” 


Despite Adams’ ominous words, Catholics began to join the PSNI in larger numbers. Today, nearly one-third of all PSNI officers are from the Catholic community (as of the 2011 census, Catholics were roughly 45% of Northern Ireland’s total population). This change in the composition of Northern Ireland’s police force was achieved, in part, through a policy of “50-50 recruitment,” which aimed to gradually increase the proportion of Catholics in the PSNI over a period of 10 years as older Protestant officers retired. Fifty-fifty recruitment has since been brought to an end, but some have called for its return while others have opposed a resumption of the policy.


Recruitment of Catholics to the PSNI occurred alongside other important institutional changes. The Police (Northern Ireland) Act also introduced a series of changes in the name, symbols, and management of the police force. The Good Friday Agreement introduced a devolved government in which Catholics and Protestants were required to share power. In 2007, Sinn Féin finally endorsed the PSNI as part of an agreement to share power with their long-time Protestant rivals, the Democratic Unionist Party. These changes, along with the recruitment of Catholics, fostered high levels of community trust in the police among both Protestants and Catholics.


Based on the example of Northern Ireland, there is reason to argue that increasing the representation of people of color in police forces across the U.S. should be an important part of the broader agenda to address racism in those forces. The police forces of major U.S. cities are disproportionately white.


Creating a police force that better represents the ethnic and racial makeup of the community it serves, and deploying that force with demilitarized tactics focused on community engagement and restorative justice, will foster confidence in the police; a more productive relationship with the community could reduce police violence toward citizens, especially black citizens who face this violence in disproportionate numbers. This conclusion is consistent with the findings of research on the effects of police force diversity and police misconduct in other parts of the UK and the U.S.


To be sure, Northern Ireland still faces problems with sectarianism. There are still “peace walls” that divide Catholics from Protestants, and the power-sharing executive did not function for nearly three years from January 2017 until January 2020. Yet progress toward a society in which Protestants and Catholics receive equal treatment under the law has been tangible and enduring. Police reform has been a significant part of that change. 


On April 18, 2019, members of the fringe republican group called the New IRA murdered journalist Lyra McKee as they were attempting to shoot members of the PSNI amidst rioting in Londonderry/Derry. Sinn Féin’s official Twitter account sent out a joint statement that said, “We reiterate our support for the PSNI, who while carrying out their duties were also the target of last night’s attack. We call on anyone with any information to bring that forward and assist their inquiries.” 


In this remarkable statement, Sinn Féin signaled the depth of its confidence in the PSNI by simultaneously condemning republican violence and encouraging the nationalist community to cooperate with the police. This statement was issued 21 years after the Good Friday Agreement. Change was slow and sometimes painful, and the process encountered many obstacles, but it happened. The process will likely be similarly contentious in the U.S., but if Northern Ireland can do it, surely we can, too.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176195 https://historynewsnetwork.org/article/176195 0
Big Alex McKenzie and the Last Great Fraud of the Gilded Age




Gold! With the discovery of this treasure in bountiful quantities, the Alaska gold rush of 1900 became the maddest dash of its kind since the 49ers swarmed California a half century earlier. The gold fields of Cape Nome, jutting out into the Bering Sea, could be reached by steamer from Seattle in two weeks, icebergs permitting. Most tantalizing of all, Cape Nome’s gold could be easily found in the ruby-colored beach sands that stretched for many miles along the coast. “Few men become rich by slow economy,” a railroad flier proclaimed. “Fortunes are made by men of nerve and decision who take advantage of opportunities…WOULD YOU LIKE TO BE A MILLIONAIRE?”


But the idea that “few men become rich by slow economy” was not limited in its appeal to the honest toiler with shovel in sweaty palm. The titans of this time—the ‘Robber Barons’ of industry and finance—also cast a possessive eye on the gold bounty. 


The lure of Alaska’s riches gave rise to a brazen plot involving the outright capture of a federal district court in Alaska—the takeover of nearly all the mechanisms of law and law enforcement in Cape Nome. The mastermind was Alexander McKenzie—Big Alex, as his friends from the Dakotas called him. McKenzie, a former frontier sheriff, was a political boss: a mogul in the Republican Party, a maker of U.S. Senators, a man with tight connections to the Executive Mansion, as the White House was then called, and to the nation’s most powerful business magnates.


Naturally, McKenzie planned to give his friends a cut of his venture. This was, after all, the time in American life known as the Gilded Age, and the bosses operated like lords of the realm, dispensing and receiving favors as a matter of course. 


Nowadays, America is in the midst of what might be called a New Gilded Age. The era is marked, as was the original Gilded Age, by a “rigged game” (to borrow the phrase of Senator Elizabeth Warren of Massachusetts), a system of crony capitalism and crony politics. Most recently, the new coronavirus has exposed the entrenched inequities of today, with Senators selling stock to try to ‘front-run’ the crisis, big-wigs snagging tests ahead of ordinary folks, and physicians hoarding anti-virus medicines so they can write prescriptions for themselves and family members.


Human nature, no doubt, is often selfish. A gilded age, though, is distinguished by a kind of concentrated venality, as the craving for private gain at the expense of the public interest and trust emboldens grabs for power that become almost impossible to thwart.


Almost impossible—but not, history shows, entirely so.


McKenzie, possibly through the payment of bribes to members of the Senate, got his Alaska judge. He made off for Cape Nome and had the judge appoint him legal custodian of lucrative gold properties. With this position secured, the plan was to use the court to pry the mines away from their rightful owners and deposit the assets in a shell company controlled by McKenzie. The company would then issue stock on Wall Street, with Big Alex unloading his shares (and those he kept in a secret trust for his buddies) on an unsuspecting public.


It nearly worked. But the scheme, ultimately, was foiled by indignant miners, a muckraking press, and righteous judges on a federal appeals court in San Francisco. The judges grasped what McKenzie was up to, and when the boss defied their order to stop his plundering of the gold, they sent U.S. marshals to Cape Nome to arrest him. He was convicted of contempt of court and sentenced to jail time in Oakland. Resourceful as ever, he concocted a story about how he was near death from an incurable ailment. His friend, President William McKinley, intervened and freed him from captivity. But while Big Alex lived for decades afterward, he never made it back to Alaska.


McKenzie’s plot to corner Alaska’s gold proved to be the last great swindle of the original Gilded Age, as this seamy chapter in our national life gave way to what became known as the Progressive Era. America enacted the Seventeenth Amendment to provide for the direct popular election of Senators, so that bosses like Big Alex could no longer use their control of state legislatures to send pet choices to Washington.


Such reforms, of course, always fall short of perfection. Still, America seldom proves as malign or feckless as its critics portray it. A gilded age, whether of the nineteenth century or the twenty-first, dies not out of exhaustion, but because the people rise up and will its demise.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176190 https://historynewsnetwork.org/article/176190 0
Liberal Reform Threatens to Expand the Police Power--Just as it did in the Past

Photo: Marc Cooper, 2015, CC0


Calls to defund the police have grown in recent weeks following the nationwide outpouring of protest in response to the killing of George Floyd by Minneapolis Police officer Derek Chauvin. As the demands for the defunding of the police suggest, the goal of activists is not limited solely to removing or prosecuting so-called “bad apple” officers. Instead, they are centered on systemic changes to the very nature of policing in American society. 


Although pressure for defunding the police - and dismantling the police in the case of Minneapolis - has gained momentum, Democrats do not wholly support such measures. Most notably, Democratic presidential candidate Joe Biden has explicitly opposed proposals to defund the police. Not only does Biden oppose moves to address systemic problems of American policing, but he has also proposed expanding police funding. “I do not support defunding police,” Biden wrote in USA Today. “The better answer is to give police departments the resources they need to implement meaningful reforms, and to condition other federal dollars on completing those reforms.” Stating that he has “long been a firm believer in the power of community policing,” Biden has suggested providing police with an additional $300 million “to reinvigorate community policing in our country,” alongside a host of procedural reforms including a national use-of-force standard, increased use of body cameras, and diversified police departments.  


Yet Biden’s “real reforms” are part of a long history of liberal responses to police violence that further embed the police power into the liberal state. In the process, this liberal law-and-order, as I call it in my book Policing Los Angeles, led to the expansion of police power and contributed to the reliance on the police to contain the very fallout of social and economic inequality that activists are now rising up against. 


Liberal law-and-order was a hallmark of the twenty-year mayoral administration of Tom Bradley in Los Angeles, the city’s first African American mayor. Bradley, a 21-year veteran of the Los Angeles Police Department (LAPD), left the force in 1961 to enter a career in politics, becoming a city councilman in 1963. Bradley’s career as a councilman was marked by strong criticism of the LAPD. After the 1965 Watts uprising, Bradley called for reforms to address problems of racism within the department and for greater civilian oversight. Yet, instead of reducing the police power, during Bradley’s mayoral administration the LAPD became more militarized, more powerful, and more present in the daily life of the city’s residents of color. This led to the largest moment of urban unrest in American history in 1992.


Notwithstanding LAPD Chief William Parker’s opposition to the limited reforms spurred by the McCone Commission investigation of the 1965 Watts riots, Bradley continued to work toward reshaping the relationship between the police and residents of color. Most notably, Bradley was a strong proponent of police-community relations programs. While intended to enhance understanding between residents and the police, such reforms brought police into close contact with youth of color in particular and extended police authority into new arenas, such as support for the police to work with youth in music clubs and recreation programs.


When Bradley first ran for mayor in 1969, he extended the liberal approach by promising to pursue reforms intended to ensure the police treated residents fairly while, at the same time, making clear that the police would have the power and resources needed to keep the city safe. While he lost the 1969 election to the race-baiting Sam Yorty, Bradley ran again - and won - in 1973, when he campaigned on an even more explicitly liberal law-and-order platform. “The insane political division which somehow makes it ‘conservative’ to be against crime and ‘liberal’ to be for civil liberties,” Bradley told the Los Angeles Bar, “has to start coming apart.”


While in office, Bradley sought to increase oversight of the department by appointing new members to the Board of Police Commissioners, limit extravagant spending by the police, and promote community-relations programs. Tying federal funding from the Law Enforcement Assistance Administration (LEAA) to procedural reforms was one means by which Bradley hoped to exert greater control. Using the newly created Mayor’s Office of Criminal Justice Planning (MOCJP), Bradley streamlined criminal justice grant proposals and required the LAPD to submit proposals for state and federal law enforcement funding to the MOCJP - something similar to what Biden has proposed by linking federal money to lukewarm reforms. 


Bradley also engaged in all-too-familiar procedural reforms, such as changing the LAPD’s use of force policies and adding human relations training - the precursor to today’s implicit bias training - for officers. Such liberal law-and-order responses to police killings, however, undermined demands for greater police accountability and external oversight of the department and reinforced the department’s authority to set its own limits. 


These procedural reforms and mayoral oversight not only failed to rein in the LAPD but extended its authority for at least two reasons. 


First, police budgets derived overwhelmingly from local sources to which federal funding was at best supplemental. Bradley, for instance, fulfilled a campaign promise to ensure a well-funded police department (while also opposing some of the LAPD’s most egregious requests, such as for jets and submarines). In 1972, for example, the LAPD’s total operating cost (including pensions) was $198.5 million, which accounted for 35.5 percent of the city’s budget. By 1982, the department’s total operating cost increased to $525 million, or 34.9 percent of the city budget.


Second, reforms meant to increase understanding between the police and community did not alter the structure of police power in the city. For instance, Bradley cooperated with LAPD Chief Ed Davis to promote community-oriented policing, known as TEAM policing. However, TEAM policing did nothing to shift power from the police to the community. One study conducted in the early 1970s found, “responsiveness to citizen demands is being sacrificed to the objective of crime control.” Efforts to wed the police to communities, in other words, enhanced police power. The TEAM policing model, the same study found, was “an attempt at formal cooptation—participation without control.” Such findings continue to plague contemporary community relations programs as well. 


In short, Bradley’s vision of liberal law-and-order sought to remake the relationship between politicians and the police while leaving the relationship between the police and the people they policed largely intact. 


But the problem was not only one of liberal political support for the police. The police positioned themselves as an independent political entity within the liberal state. Under Chiefs of Police Ed Davis and Daryl Gates, the LAPD successfully resisted systemic changes that would have reduced their ability to discipline officers and capitalized on crises - that they in part produced - to expand their authority on the streets. 


During Bradley’s first term, fears of juvenile crime led to demands from the police for new capacity to deal with the crisis. In response, Bradley pledged to crack down on youth crime and to deploy the police to address school safety concerns. Strengthening the juvenile justice system, however, embedded the police more deeply in institutions where they had not been before. While some funding was used to develop gang prevention and intervention programs, the LAPD also cooperated with the Los Angeles Unified School District (LAUSD) to arrest truants around inner-city schools of color. In addition, the LAPD developed a program, known as the Data Disposition Coordination Project (DDCP), to track “at-risk” kids deemed either criminals or potential criminals. While the DDCP was short-lived, the LAPD extended the capacity to police “at-risk” kids with the establishment of its notorious anti-gang units, Community Resources Against Street Hoodlums, or CRASH. 


At no point did liberals - locally and nationally - rely more on the police to contain those dispossessed by social and economic inequality than during the 1980s war on drugs and gangs. Echoing the claims of Daryl Gates, Bradley often referred to kids as “thugs” and promised to root out gang and drug violence using whatever means necessary. Bradley and Gates - although often at odds - cooperated in this campaign by centralizing all elements of the city’s antidrug and antigang programs in ways that enabled the LAPD to marshal “resources anytime, anywhere and on any scale to effectively wage battle against street drug peddlers and gang narcotics traffickers.” Massive gang sweeps and militarized drug raids - many carried out by CRASH units and SWAT teams - added to the LAPD’s martial capacity and exacerbated tensions between communities of color and the police. 


When the video of the beating of Rodney King shocked the nation in 1991, Gates responded with an all-too-common claim. Gates denied that the beating reflected a systemic problem in the department, concluding, “This [incident] is an aberration.” Such “bad apples” defenses, which Biden has used, reinforced the connection between the police and the liberal state. Promising to treat systemic problems with more training, community policing, or procedural reform, liberal law-and-order had created the conditions for the explosion of protest following the acquittal of the officers involved in the King beating.


While the 1992 Los Angeles rebellion led to some structural reforms, most notably limiting the tenure of the chief of police, the LAPD continued to maintain its pride of place in the city’s political power structure. Commitments to community policing and a consent decree leading to federal oversight did not fundamentally change the deep intersection between policing and the liberal state. Community policing, in particular, had not solved the fundamental power imbalance between police and residents. As the People’s Budget LA has shown, the LAPD continues to receive 54 percent of the city’s general fund. 


Across the country the police also got more - more authority and resources under the guise of liberal reform. Through the 1990s, the federal government, at the behest of Bill Clinton and then-senator Biden, pushed through funding for 100,000 more police officers as part of the 1994 crime bill. Other funding went to expand community policing, training, and efforts to diversify police departments.


These liberal law-and-order reforms did not reduce police power or racist policing in Los Angeles and across the country. But they certainly expanded the power and authority of the police. Procedural reforms, such as those promoted by Biden, did not work in the past and will not work now. The history of liberal law-and-order reveals that procedural reforms implemented on top of a structure of policing that has been empowered to protect property and control “disorder” are not only doomed to fail but will produce the conditions for more protest and resistance. 


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176193
In the “Bramble” of Central Park, a Showdown Over Nature and Race



On a precious piece of the national landscape loaded with history, the birder and the dog walker faced off at twenty paces or so. With hands on their trusty pieces—one recording the scene to provide witness, one calling out for police backup—the two Coopers found themselves acting out a script that was centuries in the making. Their flashpoint lights up the story of how nature and race have been constructed in America, giving privileged access to some while turning others into eternal trespassers. 


The ground of their meeting was literally constructed: to meet the vision of Frederick Law Olmsted, in the 1850s workers hauled in 2.5 million cubic yards of stone and earth, adding in another 40,000 cubic yards of manure and compost before planting 270,000 trees and shrubs. To make way for what became Central Park, African Americans living in Seneca Village were unceremoniously evicted, and their gardens buried—as Roy Rosenzweig and Elizabeth Blackmar document in The Park and the People. For Olmsted, these people, their homes, and their relationship to nature didn’t matter: he wanted to create a Romantic countryside in the city, to provide solace, rejuvenation and spiritual cleansing to an urbanizing America through accessible retreats to “pure” nature. Though Olmsted earlier had publicized the horrors of slavery in the South, his Park was built on removal of African Americans whom he did not explicitly welcome back to his re-creation. Instead, it was generally represented as a place of natural healing and social mixing for white people only. 


In parallel with the removal of African Americans in Seneca Village, Native Americans were forcibly removed from their homelands to construct National Parks at Yellowstone, Yosemite and Mount Rainier (to name a few). Promoters portrayed these places as pristine, untouched by people and therefore virgin wilderness, and patted themselves on the back for setting these places aside as spiritual playgrounds for Americans who were getting out of touch with nature. The people who had been in touch with nature in these places—for thousands of years—were expelled to make way for the fantasyland. As scholar Dina Gilio-Whitaker, a citizen of the Colville Confederated Tribes, points out, “When environmentalists laud ‘America’s best idea’ and reiterate narratives about pristine national park environments, they are participating in the erasure of Indigenous peoples, thus replicating colonial patterns of white supremacy and settler privilege.” 

First emptied, national and urban nature parks alike were recreated as ersatz Edens and white spaces. Theodore Roosevelt thought wilderness parks would be great places for white males—softening in urban America—to restore and reinvigorate their manliness. Many environmentalists in the early 20th century were eugenicists like Madison Grant—author of the virulently racist tract The Passing of the Great Race (1916) and a patron of the American Museum of Natural History, where the colonialist statue of fellow traveler Roosevelt was just slated for removal.  Out of one side of his mouth Grant could decry the destruction of California redwood trees and out of the other declare that “the laws of nature require the obliteration of the unfit.” As Alexandra Stern explains in Eugenic Nation, for him “Saving the Redwoods meant more than just protecting a tree: it was a metaphor for preventing race suicide and defending the survival of white America.” Nature interpretation programs at National Parks were pioneered by Charles Goethe, a Sacramento eugenicist who would later be read and admired by Nazis. Later environmentalists—such as white nationalist Garrett Hardin, who wrote the influential essay “The Tragedy of the Commons”—blamed environmental destruction on the needs of a growing population, especially of people of color.  By contrast, many environmentalists, employing a circular logic of self-gratification, portrayed whites as the chosen people who could save themselves and become saviors themselves by revering and protecting nature. America’s environmentalism has a legacy of exclusion that is more a feature than a bug. 

Protecting “nature” has never been just about nature in America; it has also been a story about people, and deep-seated prejudice has colored the portrayal. America’s “good nature” was affiliated with cleanliness and whiteness, and contrasted with “bad nature”—dirtiness, a quality associated with the city, and also, relentlessly and malevolently in media of all sorts from the antebellum period to the present, with African Americans. The pure spaces of “good nature” have generally placed people of color in permanent exile, classifying them as defilers of the constructed nature’s purity or simply excluding them from pretty pictures of it.  As Carolyn Finney found in Black Faces, White Spaces: Reimagining the Relationship of African Americans to the Great Outdoors, the media have helped construct a “white wilderness.” A review of a ten-year period (1992-2001) of Outside Magazine showed that of 4,602 images that included people, only 103 were of African Americans; almost all of them were of black males running or playing basketball in an urban setting. 


African Americans, who had first been enslaved to wring profit from nature for the benefit of others, were in effect cut off from America’s celebrated landscapes. African Americans who connect with nature are often regarded as interlopers who are out of place. To enter “nature” was to cross a boundary of imagination, no less real than Jim Crow lines. As with all such boundary crossers, they have been perceived as agents of danger because they upset established and enforced order.


African Americans know from experience just how enforced that order is, often by the police working hand in glove with white civilian enablers and lookouts.  “Barbeque Beckys” are vigilant in every park and at every turn, ready to call in innocuous behavior framed as dangerous because it is being done by Black people. They create a powerful and pervasive force field, which keeps African Americans from being able to ramble freely in the streets or in the parks. To move about without harassment, Black people routinely, as Garnett Cadogan discloses in his essay “Walking while Black,” perform innocence and reassurance, acting out a “pantomime...to avoid the choreography of criminality.” Even so, like other people of color in America, he finds that he cannot wholly avoid having his movement arrested and his body taken into custody.


Pervasive racist representations have primed white people to look on black people outside with fear. D.W. Griffith’s “landmark” film Birth of a Nation (1915) pictured a lustful white man in blackface pursuing a white woman through the woods and up a mountain: rather than be defiled, she hurls herself off the cliff. This was cinematic ground zero for the trumped-up fear that black men—anywhere, but especially in woods or brambles—are a threat to the “purity” of white women and nature alike. The film’s weaponized fear detonates again and again. It set off the rebirth of the Ku Klux Klan and countless hate crimes and lynchings, which to this day, as LaToya Brackett powerfully reminds us, have been turned and turned again into terroristic spectacle and entertainment.


Christian Cooper explained that Amy Cooper, when she called the cops on “an African American man threatening my life,” in effect “pulled the pin on the race grenade and tried to lob it at me....to tap into a deep, deep dark vein of racism...that runs through this country and has for centuries.” She didn’t have to reach far for the weapon, for the memory of the white woman jogger who had been raped in 1989 in another wooded area of Central Park lies near the surface. Five African American and Latino youths were blamed, framed and locked up. Despite their official exoneration 13 years after the fact, the pull of the racist imagery is so powerful that the highest elected official in our land, refusing to accept their innocence, still sees them as inherently guilty for having taken a walk in the Park while a white woman ran. 


Since being called out for their legacy and continued practices of exclusion, environmental organizations and publications have worked to make amends—including various chapters of the Audubon Society. Still, as J. Drew Lanham has noted, African American birders are still few and far between. In his list of “9 Rules for the Black Birdwatcher,” rule number 1 is “Be prepared to be confused with the other black birder. Yes, there are only two of you at the bird festival....Yes, you will be called by his name at least half a dozen times by supposedly observant people who can distinguish gull molts in a blizzard.” After Ahmaud Arbery was killed for the crime of running while Black, Lanham added additional “revelations” to the list: “Roadrunners don’t get gunned down for jogging through neighborhoods, do they?” and “Hooded warblers are lucky. They can wear hoodies and no one asks questions or feels threatened. Vigilante ’Mericans don’t mobilize to make citizen’s arrest if they loiter in a strange shrub for too long.”


At great personal peril because of the structures that would deny him unfettered access to the environment, Christian Cooper has also ignored the “keep out of nature” signs to pursue his deep love of birds in Central Park’s Ramble. When his sister Melody at first referred to this place as “the Bramble” in her viral tweet, she evoked a deeper history of African Americans claiming American nature as their own rightful ground. In African American folklore, trickster Br’er Rabbit, when caught by Br’er Fox’s tar baby trap, has to think fast, using reverse psychology on the creature who would be his master: he pleads, “whatever you do, Br’er Fox, do not put me in the briar patch, the bramble. It’s the worst place for me.” So Br’er Fox throws him in, and Br’er Rabbit scrambles free. Until white Americans like Amy Cooper stop putting out tar baby traps, and recognize that everyone has a right to the Ramble, this nation will not be free of the racism inscribed in its cherished landscapes. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176191
Newest Born of Nations: European Nationalism and the Confederate States of America



The Confederacy has exploded into the news once again, as protestors seeking justice for African Americans topple Confederate statues and municipalities follow their lead in pledging to remove more.  These events have again been greeted by claims that the Confederacy was unrelated to slavery and therefore to contemporary problems of racism.  At the heart of this conflict is the question of what the Confederacy represented then and represents now. Historians who point to primary sources such as the Ordinances of Secession and Alexander Stephens’s “Cornerstone Speech” agree, along with Black Lives Matter protestors and much of the public, that slavery inspired secession and defined the Confederacy, though other Americans remain committed to a Lost Cause re-interpretation of the war that posited states’ rights, not slavery, inspired the Confederacy.


My new book, Newest Born of Nations:  European Nationalist Movements and the Making of the Confederacy, answers this critical question through an international perspective.  In Newest Born of Nations, I argue that white southerners in the Civil War era looked to contemporary European nationalist movements, such as the Revolutions of 1848 and Italian Risorgimento, observing aspiring nations as they sought independence and self-government from empires and monarchies.  White southerners used their analysis of these nationalist movements to compare the South to aspiring nations abroad, a process that allowed them to refine their vision of an ideal nation; to conceive of the South as a potential nation, distinct from the North and separate from the United States; and to justify secession and the creation of the Confederacy.  White southerners’ international perspectives on nationalism thus played a critical role in shaping and defining southern nationalism.


With these arguments, Newest Born of Nations corroborates the scholarly consensus that slavery fueled secession and defined the Confederacy.  White southerners developed multiple international perspectives that positioned the Confederacy differently relative to aspiring nations in Europe, but all of which used international comparisons to defend a desired vision of southern nationhood. In what I call the liberal secessionist international perspective, secessionists and Confederates claimed that secession and the creation of the Confederacy were legitimate because the southern nation followed in the footsteps of European nations in seeking liberal ideals such as self-government, national self-determination, and even republicanism.  These secessionists drew comparisons between the South and aspiring European nations, claiming that the white South was oppressed by abolitionism just as aspiring nations such as Ireland and Italy were oppressed by tyrannical empires such as Great Britain and Austria.  The Richmond Daily Dispatch, published by James A. Cowardin, was a leading proponent of this view.  Upon the election of Abraham Lincoln of the anti-slavery Republican Party, for example, the Dispatch wrote “This day the South comes under a dominion which has been forced upon her by the North; this day she begins a servitude as involuntary as that of Italy to Austria; this day inaugurates a foreign rule as distinct and complete as if we had been conquered by European bayonets, and annexed to the throne of some continental despot.” To these secessionists, any political victories by abolitionists would deny pro-slavery whites their self-government, just as tyrannical regimes abroad denied European nationalists their self-government.  


Fundamentally, then, these white southerners argued that they had a right to protect slavery, and that any threat to that supposed right to slavery constituted tyranny akin to that of a European empire.  Of course, the existence of slavery actually violated principles of self-government, and the desire to protect slavery similarly opposed rather than upheld the liberal values of self-government or republicanism.  Nonetheless, the liberal international perspective used international comparisons to claim that white southerners’ desire to protect slavery fit within the bounds of emerging nationalism in the mid-nineteenth century, and that the Confederacy was as legitimate as other new nations seeking independence.


The conservative secessionist international perspective was even more upfront in centering slavery at the heart of the Confederacy.  In the conservative international perspective, secessionists argued that secession and the Confederacy were legitimate because the Confederacy purified nationalism of what these secessionists claimed were destructive excesses of liberalism in aspiring nations in Europe.  Recognizing that most nationalist movements in Europe had failed, secessionists using the conservative international perspective claimed that these movements had failed because they sought not just national independence, but also greater social equality – a value obviously at odds with enslavement of human beings.  The Confederacy, they asserted, would stand as a corrective to this excess liberalism, and would use its conservatism, social hierarchy, and slavery to purify nationalism of the liberal impulses that had supposedly doomed it abroad.  Leonidas W. Spratt, editor of the Charleston Mercury, made this view clear in January of 1861, when he declared “if you shall elect slavery, avow it and affirm it . . . assert its right . . . to extension and to political recognition among the nations of the earth. If . . . you shall own slavery as the source of your authority . . . the work will be accomplished” and, further, “your Republic will not require the pruning process of another revolution; but poised upon its institutions, will move on to a career of greatness and of glory unapproached by any other nation in the world.”  To these Confederates, slavery was not just the defining element of the Confederacy; it was the legitimizing element that ensured the Confederacy would be the strongest nation the world had yet seen, with the purest implementation of nationalism.


Secessionists and Confederates thus used international perspectives as a convenient way to translate their concerns about protecting slavery into the international language of rights and nationalism.  The mid-nineteenth century Atlantic World was rife with questions about nationhood, citizenship, rights, and governance, and to secessionists, the Confederacy was simply the latest in a long line of aspiring nations seeking admission to the international family of nations.  Ultimately, contemporaries in the United States and in Europe largely rejected the Confederacy, both as a nation and as an equal to aspiring nations in Europe. In an intellectual and political atmosphere of growing abolitionism, a majority of Americans and Europeans recognized that a slavery-based nation did not echo the values of nationalism exhibited elsewhere.  Nonetheless, Confederates used international perspectives to defend slavery, and to try to legitimize their slavery-based nation.


Newest Born of Nations enhances our understanding of the Confederacy and the Civil War by reframing the American Civil War as part of the larger nineteenth-century age of revolutions and nationalism.  Far from an exclusively domestic conflict, the Civil War had profound implications for the evolving nineteenth-century Atlantic World ideas of freedom, rights, citizenship, and nationalism.  Confederate nationalism developed in and through this international context.  Critically, Newest Born of Nations reveals that, despite the contradiction between slavery and the liberal ideals that resonated throughout the mid-nineteenth century Atlantic World, white southerners attempted to claim slavery as a legitimate foundation for a self-governing republic.  Slavery was at the heart of the project of building an independent southern nation, and developing and deploying international perspectives was a key way that Confederates sought support and legitimacy for a nation built on slavery.


This internationalization of the making of the Confederacy reveals the development of southern nationalism to be a complicated, complex process, one fundamentally tied into the intellectual trends of the era, even as it opposed the growing trend of abolitionism and equality.  By placing secession, the Confederacy, and the American Civil War within this transnational context, Newest Born of Nations expands and complicates our understanding of the Confederacy, the Civil War, the age of nationalism and revolutions, and the nineteenth-century Atlantic World. 



Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176194
“A Very Different Story”: Marian Sims and Reconstruction



Last week, director and screenwriter John Ridley wrote an op-ed for the Los Angeles Times arguing that HBO should remove Gone with the Wind from its new HBO Max streaming service. The movie, he said, “glorifies the antebellum south,” “ignores the horrors of slavery, … perpetuates some of the most painful stereotypes of people of color,” and legitimates the Confederacy, which was based on “the ‘right’ to own, sell and buy human beings.”

The 1939 MGM movie was, of course, based on Margaret Mitchell’s 1936 novel of the same name. As offensive as most will find the movie’s images of African Americans, the book was considerably worse. Historian Nina Silber wrote in the Washington Post that “Mitchell’s best-selling book offered a classic ‘Lost Cause’ tale of crushed but resilient white Southerners, devoted black slaves and evil-minded Yankees. It traded heavily in racist descriptions and plot lines, from the ‘black apes’ committing ‘outrages on women’ to Mitchell’s reference to the character Mammy, her face ‘puckered in the sad bewilderment of an old ape.’ Ku Kluxers are the book’s heroes, helping restore order in the wake of racial chaos.”

The problem for Ridley and Silber (and many others—folks have been piling on Gone with the Wind lately) isn’t just that the movie and book had racist images; it’s that those images have been widely accepted as the truth. When people think of slavery, the Civil War, and Reconstruction, they often see them through the perspective of Gone with the Wind. Historians face a seemingly unending battle to convince people that that’s not the way it was.

But it didn’t have to be that way. Another book written at about the same time by another Georgia-born woman presented a very different view of the war and Reconstruction.

Marian McCamy was born in Dalton, Georgia, in 1899 (a year before Mitchell was born in Atlanta). She graduated from Agnes Scott College, taught history and French at Dalton High School, married lawyer Frank Sims, and moved to Charlotte, where she started writing, publishing stories in national magazines and seven novels, all reviewed in the pages of the New York Times.    

Most of her fiction was set in contemporary times, but Beyond Surrender (1942), her one foray into historical fiction, set her at odds with Margaret Mitchell’s view of the past.

The novel begins late in June 1865, as war-weary Denis Warden arrives home to Brook Haven, the family plantation in South Carolina. He has to deal with salvaging both Brook Haven and his personal life, all in the historical context of the days of Reconstruction.

The title can be read in two ways: as the story of a relationship after one surrenders, physically or emotionally, to another; and as a story set after the Confederacy surrendered in 1865. Sims intended both meanings. The publisher (Lippincott, in Philadelphia) also understood the book’s dual nature. One ad describes the main characters:

Denis Warden was forced to concede the South had been defeated at war—but war could never alter the rights and privileges to which he had been born. No! Not even if he had to go out and fight all over again.

Dolly [Helms, daughter of the merchant with whom Denis had to deal] knew about such rights and privileges only through hearsay, but she wanted them just as fiercely as Denis. And she could offer him the thing his body needed most.

Sharon [Long] should have married Denis. Theirs was a common background, a common tradition. And the war had given her a realistic approach to life which would have been a stabilizing influence on him.

Sara Warden [Denis’s mother] was a woman of acumen and intelligence—two things a lady was not supposed to have. It was she who kept Brook Haven from ruin during the four long years of war. Without her, Denis would have been irretrievably lost.

John Jernigan [a lawyer], who was to love Sara all his life, had not gone to war because he was a cripple. He knew how and why the war was lost—and how the peace might be lost as well. Had the South numbered more of his kind, Reconstruction might have been a very different story.

It looks like a soap opera! But look again at the description of lawyer John Jernigan. Drama rather than historical context drove Beyond Surrender, but in this work of historical fiction, the fiction does, of course, play out in a historical setting, and the setting was quite different from Mitchell’s story. Rather than writing about the alleged ignorance and intellectual apathy of the freedmen, Sims wrote that “Hordes of eager Negroes [were] trooping into the crude new temples of learning. . . ; there was pathos in the universal craving for ‘book learnin’’ as a key that would unlock all the mysteries and benefits of a new universe.” Luke, a former Brook Haven slave, showed a surprising intelligence and willingness to work after the war. The occupying U.S. troops were not monstrous oppressors; instead, they “have been pretty fair to both sides. Some of the commanding officers are brutes and fools, but a lot of ’em are decent.” The freedmen were “victims of circumstance—poor devils.” When asked if he would give black men the right to vote, John said “yes. And if I was all powerful I’d try to educate ’em a little and give it to the best ones anyhow, even without being forced to do it.”

The reviewer in the New York Times said that Beyond Surrender was “extraordinarily fair-minded.” A former northern abolitionist who moved South “is respected for his sincerity.” And perhaps most amazing, especially compared to that other book published a half dozen years earlier, “the continued devotion of Luke, the black foreman, is shown to be not a remnant of servility but, like his determination to have a farm of his own, the preferred responsibility of a free man.”

Sims was not an academic historian—much of her background reading was from Francis Butler Simkins and Robert H. Woody, South Carolina during Reconstruction [1932], one of the earliest revisionist works by white historians—but she knew that her book showed a new perspective. Shortly after the novel was published, she was invited to address the Women’s Club of Columbia, South Carolina. “I expected the study to be drudgery,” she said of the research that supported the book, “and perhaps it would have been if I had found only what I expected to find. Since I didn’t, it proved to be a fascinating voyage of discovery, a sort of paper-chase after truth, through the jungles of legend…. I believe that Reconstruction is the most generally misunderstood and misinterpreted era in American history…. I began research with a belief which is held by a vast majority of Southerners: that the war was … a picnic compared to Reconstruction, and that the North was directly responsible for all our suffering.”

In the novel, Denis, with his adherence to the older view of the war and Reconstruction, reminds us of Margaret Mitchell’s perspective, while John stands in for a new revisionist approach. As the book comes to an end with the aftermath of the contested presidential election of 1876, John and Denis talk about the freedmen’s future. “I was thinking,” John says, “of education and a decent chance to be decent, useful citizens.” Denis remains unconvinced. “You’ve done a big job in the face of big obstacles,” John tells him, “and I’m proud of you. I’m just trying to show you some things you wouldn’t be apt to see for yourself.”

Beyond Surrender was not nearly as popular as Gone with the Wind. But it reminds us today of a history that might have been. As the publisher said about John, “Had the South numbered more of his kind, Reconstruction might have been a very different story.”

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176189
Those We Abuse, We Loathe




I grew up white along the Mississippi River Delta in southeast Arkansas, one of the most racist regions of the country and site of the Elaine Race Massacre, arguably the most significant racial attack against African-Americans in our country’s history. As I recently watched renewed grief combust on the streets of most of our large American cities, I was reminded of a truism I learned early in my life: those we whites abuse, we in turn loathe.


How can we whites not acknowledge the cause of the black anger and frustration in response to the cavalier manner in which the lives of Ahmaud Arbery, Breonna Taylor, and George Floyd were taken? Who could deny that whites showed hardly any regard at all in the executions of these three African Americans? Indeed, we too often witness the face of indifference as black lives are casually taken away.


We now hear the call, the habitual drumbeat, once more for racial healing, but it is simply not possible to have a credible discourse on the subject of racial healing in the absence of a full admission that black lives are routinely, if not normally, treated as largely trivial by an important segment of American officialdom, including the police. How else can we account for the recurrence of white violence without consequences? 


So, how do we move from whites inflicting abuse to whites dismissing African-American lives? Implicit in the historical ability of whites to abuse blacks with impunity has been the related evil of seeing black lives as inferior. Reminders of the abuse one practices can be mortifying and agonizing, and can foster deep hatred, both for oneself as perpetrator and toward the target. In the extreme, abusers can seek the complete elimination of the object of their abuse, believing that the urge behind the abuse will simply disappear along with its victim. Yet this process rarely squelches the motive. The associated illusion that loathing exists only because its object deserves hatred has led to violent outbursts, reflected in lynchings and brutalizations such as the mutilation and murder of Emmett Till, if not the more recent deaths of numerous African-Americans. 


And yet, many whites recognize at some level that such hatefulness exists within white identity rather than being provoked by the Other. They know they can never free themselves from a damaged heritage, that legacy of prejudicial tradition and violence derived from American white racism, perpetrated against African-Americans and passed on from generation to generation. The traditions, the customs, the evil reflected in history against African-Americans are too ingrained and too intertwined with who American whites are for them to cleanse themselves.  


Over the seven decades of my life facing racial division and white subjugation of black people, I have witnessed time and again that whites have little capacity to cleanse themselves. The continuous harm whites have caused blacks, whether by abuse, loathing, or customary impulse, has built up so much scar tissue over the years and generations that it has become impossible for the perpetrators to reach their own hidden humanity through the layers of tragedy they imposed.  


And thus white abuse and loathing persisted, degrading the value of African-American lives. I cannot be convinced that American whites, for generations, did not comprehend the evil nature of African-American subjugation, but they could not help following the well-worn path, clearly marked and without ambiguity, that grandfathers and grandmothers, mothers and fathers had also followed, so that departure from tradition simply became a form of heresy. Over multiple generations, even as ancestors realized the evil nature of racial oppression, the availability of free or cheap labor and the allure of maintaining white supremacy as a way of life were much too appealing to oppose. 


So, let us proceed to endorse racial healing. It is only prudent and necessary to do so as a nation. However, at the same time, let us recognize that such an effort will be futile from the outset in the absence of an initial step that leads to an admission by this country that black lives have been systemically valued less than white lives by not only governmental institutions but the public at large. Only on a foundation of a realistic acknowledgment of the past and current status for black lives in this country can we expect that such adverse behavior, as we continue to experience, will not be repeated again and again and that black lives can be viewed by all Americans as equal in value to white lives.  

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176192
Pawns of History: The Poetics of Russian Revolutionary Politics

A meeting of Chernoe Znamia members in Minsk, 1906.


I don’t know about the others, but I was in awe of the tenacity, durability, and fearlessness of human thought, especially that thought within which—or rather, beneath which—there loomed something larger than thought, something primeval and incomprehensible, something that made it impossible for men not to act in a certain way, not to experience the urge for action so powerful that even death, were it to stand in its way, would appear powerless.

Those were the words by which Aleksander Arosev recalled a secret meeting he attended as a teenager alongside his best friend, the broad-shouldered yet soft-spoken Vyacheslav Skriabin, and several other prodigious students from their high school in the Russian countryside. Under the weak, white light of a kerosene lamp, they read and discussed illegal socialist literature. If the tsarist police were to discover them, they would most likely receive a one-way ticket to the icy outskirts of Siberia, where years of forced labor awaited. 

But the boys were not afraid of such a fate. Protected by their idols, Karl Marx and Friedrich Engels, whose portraits watched over them from along the walls, they studied on in their attempt to understand the world so that one day they might shape it in their image. Though Arosev had always felt it in his bones, he could hardly have believed that in only a few years’ time both his wildest dreams and greatest nightmares would become reality: that in 1938 he would meet his death at the hands of a comrade he had yet to meet, while his friend Skriabin—who had only just adopted the nickname ‘Molotov’—would serve as the commissar of a nation they had yet to create. 

In his book, Three Whys of the Russian Revolution, the conservative historian Richard Pipes questions the historical inevitability of ‘Red October,’ the Bolshevik usurpation of the Provisional Government, by reminding us that, according to Marx, the world’s first communist uprising could not possibly have taken place in a country as archaic as Russia, but only in an industrially advanced society like England or Germany, where the capitalist system had, by gradually eroding the cultural differences between different groups of workers, created a unified, class-conscious proletariat. 

Valid though his point may be, by dismissing the insurrection as a fundamentally trivial event, Pipes turns something of a blind eye to the elements of Russian culture that made such an event possible in the first place. Indeed, while its inevitability remains up for debate, there is sufficient reason to believe the ‘proletarian’ coup could not have occurred anywhere except in Russia—a nation which, during the final decades of the tsarist regime, had been transformed into an ideal breeding ground for radical activism.  

There was but one place a person in nineteenth-century Russia could take his or her ideas, and that place was not parliament—there was none until the creation of the Duma in 1905—but the printing press. Consequently, Russian literature became deeply infused with social and political thought. After the emancipation of the serfs in 1861, followed by the government’s laissez-faire attitude toward integrating them into the state, many authors took it upon themselves to figure out how people should live together and organize their society.  

Those who saw no wrong in the existing order made it past the censors with ease, while those who opposed the establishment were persecuted and penalized: Fyodor Dostoevsky was imprisoned for his involvement in a clandestine literary group, and Leo Tolstoy was excommunicated from the Orthodox Church for writing Resurrection. But no matter how hard the government tried to silence its critics, their work still found many admirers, the most passionate of whom belonged to a small, though steadily growing, class of ‘professional’ revolutionaries. These were the people who would eventually become known as the Communists, and if we are to understand the trajectory Russia followed on their behalf, the stories that influenced them might be as good a place to start as any.  

Given how intertwined literature and legislation were in the Russian Empire, it should come as no surprise that Arosev was far from the only revolutionary figure who cultivated an emotional attachment to the books he read. The Bolshevik Sergei Mitskevich, for example, described how his “eyes were opened” upon reading Ivan Turgenev’s socio-ideological novel Virgin Soil, and how it inspired him to read religiously in hope of discovering “the key to the understanding of reality.”

Similarly, Lenin biographer Robert Service points out that little ‘Volodya’ was quick to develop intimate imaginary relationships with his favorite writers, particularly the influential revolutionary author Nikolai Chernishevsky, and was said to have described himself, if only in closed company, as being “in love” with Marx. The significance of this attachment was twofold. First, it turned the writings of socialist thinkers into a kind of gospel presumed to provide an all-encompassing and irrefutable worldview. Second, it elevated the revolutionaries, the persecuted disciples of this repressed truth, to the status of prophets.

In its infancy, this kind of self-glorification tended to manifest itself in harmless, even pitiable forms. The Polish-born Bolshevik Feliks Kon, for instance, once likened himself to a “young knight determined to wake up a sleeping princess,” while the Russian Bolshevik Alexander Voronsky looked back on himself and his young comrades as “overconfident and full of peremptory fervor,” as they argued over “the commune, the land strips, and the relationship between the hero and the crowd.”

But once these imaginative children had grown into adults—adults with a high degree of political power at their disposal, no less—their elevated self-perception came to carry potentially severe consequences for their fellow countrymen. Most importantly, it led Lenin to proclaim that their revolution could not possibly succeed without the leadership of a so-called “revolutionary vanguard,” a paramilitary group consisting of trained revolutionaries who, by virtue of their profession, walked “along a precipitous and difficult path, holding each other firmly by the hand.” 

Opponents of the Soviets and their doctrines often suggest that the members of this vanguard drew on socialism’s humanitarian appeal in order to mask their selfish lust for power, because they knew very well that the only way in which men like them—born outside the aristocracy—could ever attain it was through revolution. While certainly in line with our age-old understanding of human nature, a closer inspection of the revolutionaries’ private writings renders this, too, difficult to believe. 

Indeed, to those same opponents, it may come as a shock to learn that many revolutionaries habitually doubted themselves and, as a result, took great pains to ensure they acted in the name of truth and truth alone. To decide their political allegiance, for example, Arosev and his companion Vyacheslav once debated each other on behalf of the Bolsheviks and their political rivals, the Socialist Revolutionaries, under the promise that both would side with the winner. 

Likewise, the Left Communist Valerian Osinsky, wary of passing trends in the academic world, studied for months on end, trying “very hard to give the Decembrists a non-Marxist explanation,” and did not join Lenin’s movement until he had officially failed to find one. Osinsky’s trials and tribulations are one reason why the Russian historian Yuri Slezkine, in his 2017 study The House of Government, calls the revolutionaries “preachers”: not to imply a strong connection between the ideas of Christianity and those of Marxism, but because the revolutionaries followed their ideology with religious devotion.

And yet, the revolutionary’s commitment to his cause did not, ultimately, consist of rationality alone. There was something larger, something “primeval,” as Arosev said, at play. For their involvement in covert organizations, many young Marxists found themselves, at one point in their lives, exiled to a tiny village in northern Siberia. Buried year-round beneath the snow and isolated from the rest of civilization, they were accompanied only by a few of their close friends and a handful of books. Reading from dawn till dusk, or dusk till dawn—it was hard to tell out on the tundra—they willfully lost their ability to tell fiction from reality. 

In these barren conditions, plagued by depression, the Russian Bolshevik Yakov Sverdlov doubled down on a maxim he had formulated in his adolescence: “I put books to the test of life, and life to the test of books.” At the same time, Feliks Kon began dreaming vividly of the day when “the world of slavery and untruth would sink into the abyss, and the bright sun of liberty would shine over the earth.” Aside from noting the poetic energy with which Kon writes about his political aspirations, just consider, for a moment, how similar his own dream sounds to that of Vera Pavlovna, the selfless heroine of Chernishevsky’s feverishly popular socialist novel, What is to be Done?:

“The day breaks in a splendor of joy that is all of nature filling the soul with light, warmth, fragrance, song, love and tenderness.”

And so, at the core of the revolutionary movement, we finally find not only an unwavering determination to regard art and reality as one and the same, but also an unflinching devotion to the insurmountable task of materializing the beauty, peace and harmony that can so easily exist on a page, yet so seldom shows itself in life. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176130
Will George Floyd’s Murder Be Trump’s Undoing?




Police brutality did not start with Derek Chauvin, and it probably will not end with him either. Give a man a gun, and the temptation to abuse his power is always there.


In the 1960s, my mother was the principal of a vocational high school in mid-town Manhattan. An obviously deranged homeless man wielding a screwdriver entered the school’s gym, and began shouting obscenities at the students. The man happened to be black. The school called the police, and a white motorcycle cop was first on the scene. Without any warning or conversation, the cop shot the man dead in front of 100 students. The cop’s defense, backed by the police union, which got him lawyered up, was that he felt threatened by the screwdriver. The NYPD disciplined the cop. He got a slap on the wrist, something like forfeiting eight vacation days.


The shot that killed the vagrant was hardly heard around the world. There were no cries of “black lives matter” in those days. It was the Vietnam era. No one called for defunding or demilitarizing the police, as they do now. There were no demonstrations. Only the sobering realization that human life is cheap and expendable.


Indeed, it wasn’t until 1993 that New York, after decades of false starts, instituted a Civilian Complaint Review Board, an independent oversight agency to review cases of police brutality and recommend appropriate discipline. 


This time it is different. Much has happened since the 1960s, but the racial divide that existed from the founding of the country is still too much with us. 


Earlier in our history, we had presidents who sought to bring us together and heal wounds “with malice toward none, and charity for all.” Lincoln presciently understood the hypocrisy of slavery and its stain on American values. Debating Stephen Douglas at Peoria in 1854, he argued: 

I hate it [slavery] because it deprives our republican example of its just influence in the world-enables the enemies of free institutions, with plausibility, to taunt us as hypocrites-causes the real friends of freedom to doubt our sincerity, and especially because it forces so many really good men among ourselves into an open war with the very fundamental principles of civil liberty... and insisting that there is no right principle of action but self-interest.


We took another swing at it in the 1960s, when we all thought that the Civil Rights movement would bring change. In March 1965, President Lyndon Johnson went before a joint session of Congress and embraced the cause of the protesters, pleading for the “dignity of man and the destiny of democracy.” Johnson movingly spoke of American values: “Our mission is at once the oldest and most basic of this country: to right wrong, to do justice, to serve man.” He declared that the effort of American Negroes “to secure for themselves the full blessings of American life” “must be our cause too…[I]t’s not just Negroes, but really it’s all of us, who must overcome the crippling legacy of bigotry and injustice. And we shall overcome!”


Today, we see massive protests around the world. The murder of George Floyd on a Minneapolis street has captured the global conscience. Thousands of protestors across six continents, from Asia to Europe to Australia, joined the protestors in New York, Los Angeles, Washington, and Philadelphia to decry racism and injustice.


The stakes have never been higher for the United States. Echoing Lincoln a century and a half later, Council on Foreign Relations President Richard Haass argues in an important Foreign Affairs piece that


Unless the United States is able to come together to address its persistent societal and political divides, global prospects for democracy may weaken, while friends and allies of the United States may rethink their decision to place their security in American hands, and competitors may dispense with some or all of their traditional caution.


We wonder why the flames of protest ignited today’s sensibility, and not before. Nobody marched on six continents in 2014 after Eric Garner was filmed being garroted by officers on Staten Island—indeed, hardly anyone marched in New York, either. 


The protests will eventually end, and it is unknown at this point what reforms will come of them. But have we really learned anything from the tragic murder of George Floyd? The Senate, 65 years after the lynching of Emmett Till, dithers over a bill that would make lynching a federal hate crime. Since Floyd’s death, prosecutors in Buffalo, New York announced felony assault charges against two cops who, without any provocation, shoved a 75-year-old protester to the ground. The elderly man, who hit his head in the fall, remains in serious condition in the hospital. Fifty-seven of the officers’ colleagues resigned in solidarity, and Trump defends the police, spouting a far-fetched conspiracy theory. Solidarity with what? Criminal conduct?


The times present a unique political and historical opportunity for the President of the United States to embrace the cause of his countrymen and try to heal the wounds. But rather than a leader who uses the “bully pulpit” to identify with the protesters, as Lincoln and Johnson did, we have a rogue president in office. Trump threatens to unleash vicious dogs on those who breach his security perimeter; repeats the trope of the Nixon era that “when the looting starts, the shooting starts;” uses tear gas and rubber bullets in an unprovoked attack on American citizens exercising their constitutional right of lawful assembly, so he can have a photo-op in front of a church he rarely, if ever, attends; invokes the Insurrection Act to threaten using the United States Army against citizens exercising their constitutional right of free assembly, while one of his favorite senators, Tom Cotton of Arkansas, a combat veteran, issues the clarion call to “send in the troops;” and states that proposed legislation removing “qualified immunity” for police officers is a non-starter. The doctrine of qualified immunity shields police officers from civil liability for conduct on the job unless they violate “clearly established” constitutional rights.


Undeniably, we have a president spewing nonsense in a political climate of “truth decay.” He continues to add to the more than 18,000 lies the Washington Post has tallied since he attained the nation’s highest office. 


I am a lawyer, not a metallurgist, but it is remarkable that throughout his dark career, nothing seems to stick to our Teflon president. But as his approval ratings continue to descend in the polls, Americans are increasingly disgusted with this “great divider.” Perhaps now, with the murder of George Floyd, the American people have come to realize that he is more sinning than sinned against. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176196
In This Election Year We Historians Need to Insist on Truth-telling



In May 2020, historian Jill Lepore began a new podcast seeking to answer the question "Who killed truth?" In her These Truths: A History of the United States (2019), she writes that “the work of the historian” includes being “the teller of truth.” And indeed what other task can be more important for us? Are we not society’s experts on telling the truth about the past? And, as H. Stuart Hughes once argued, that includes the recent past. “Tell the truth” should be as central to our mission, as “First, do no harm” is to doctors and nurses. 


Truth is especially important in this year of a crucial presidential election. Competing versions of the truth are already battling each other. And President Trump’s actions and inactions regarding the coronavirus pandemic and the protests following George Floyd’s murder are the main battlegrounds. 


In a New York Times op-ed of 13 June, conservative columnist Peter Wehner, a frequent critic of Trump, wrote that even before Trump became president, his goal “was to annihilate the distinction between truth and falsity . . . to overwhelm people with misinformation and disinformation.” 


Politicians are not especially known for truth telling. As Hannah Arendt wrote in 1967, “No one has ever doubted that truth and politics are on rather bad terms with each other.” But Trump has brought disrespect for truth to a whole new level, one that easily surpasses that of any previous president. Early in 2018, his fellow Republican Jeff Flake, the outgoing senator from Arizona, stated:


2017 was a year which saw the truth—objective, empirical, evidence-based truth—more battered and abused than any other in the history of our country, at the hands of the most powerful figure in our government. It was a year which saw the White House enshrine “alternative facts” into the American lexicon, as justification for what used to be known simply as good old-fashioned falsehoods. It was the year in which an unrelenting daily assault on the constitutionally protected free press was launched by that same White House, an assault that is as unprecedented as it is unwarranted. “The enemy of the people,” was what the president of the United States called the free press in 2017.


Later in 2018, Michiko Kakutani, in The Death of Truth: Notes on Falsehood in the Age of Trump, wrote of the “monumentally serious consequences of his [Trump’s] assault on truth.” At the beginning of June 2020, Donald Trump and His Assault on Truth: The President's Falsehoods, Misleading Claims and Flat-Out Lies, by several members of The Washington Post Fact Checker team, appeared on bookshelves. It declared, “Donald Trump, the most mendacious president in U.S. history . . . . [is] not known for one big lie—just a constant stream of exaggerated, invented, boastful, purposely outrageous, spiteful, inconsistent, dubious and false claims.” 

The book also insisted that Trump’s “pace of deception has quickened exponentially. He averaged about six [false or misleading] claims a day in 2017, nearly 16 a day in 2018 and more than 22 a day in 2019.” In 2020 the number continued to rise, reaching 19,128 by late May. Furthermore, Trump has reduced the capacity of many government agencies like the Environmental Protection Agency (EPA) to act based on science-based truths rather than political bias.


But rather than analyzing all of Trump’s falsehoods and his weakening of fact-based government operations, let’s concentrate on his responses to our ongoing pandemic and the continuing protests following the 25 May knee-on-the-neck killing of George Floyd. Following an examination of Trumpian truth-tramplings regarding those two 2020 events, we shall look further at the historian’s role in insisting on truth-telling. 

On 13 April 2020 Trump exploited the daily White House Coronavirus Task Force briefing to play a four-minute video featuring TV clips and text which praised his coronavirus responses. On that same day, the Republican National Committee (RNC) began running ads in over a dozen battleground states praising Trump's coronavirus leadership. The main theme of both the four-minute video and the ads was summed up by a few quotes from one of the ads. “Our nation in crisis, but through the uncertainty and fear, our president is a steady hand. Bold action. Strong leadership. Uniting America,” and “From the beginning, President Trump was decisive. Stopping travel from foreign nations, gathering our best and brightest, slowing the spread of COVID-19. President Trump will relaunch our economy and fight for the American worker. Helping a nation in need delivering unprecedented bipartisan relief.” 

About the 13-April video, CNN proclaimed, “TRUMP USES TASK FORCE BRIEFING TO TRY AND REWRITE HISTORY ON CORONAVIRUS RESPONSE.” Two of the network’s reporters, Erin Burnett and John King, indicated some of the ways the video presented a false narrative. And CNN’s Jim Acosta declared that it looked like it was made in China or North Korea.

Thus, it appears that competing views of Trump’s coronavirus response, competing histories of it, are going to bombard citizens all the way up to the November presidential election. The same is likely to occur regarding the Trumpian response to protests stemming from the killing of George Floyd. Can voters get this history right? Can they distinguish between truthful history and fake history? The outcomes of the November presidential and congressional elections are likely to hinge on this capability.

A Pew poll released May 28 does not provide great hope. Only 33 percent of Democrats and Democratic-leaning independents and 23 percent of Republicans and Republican-leaning independents felt “highly confident in their ability to check the accuracy of COVID-19 news and information.”

Regarding Trump’s coronavirus responses, The Atlantic’s “All the President’s Lies About the Coronavirus” (27 May 2020) and its promise of updating as needed provides a good overview. So too does Texas Congressman Lloyd Doggett’s up-to-date “Timeline of Trump’s Coronavirus Responses,” which includes such Trump gems as “We have it totally under control. . . . It’s going to be just fine” (22 January); “CDC and my Administration are doing a GREAT job of handling Coronavirus” (25 February); and “When we have a lot of cases, I don't look at that as a bad thing, I look at that as, in a certain respect, as being a good thing, . . . Because it means our testing is much better. I view it as a badge of honor, really, it's a badge of honor” (19 May, the day after U.S. coronavirus deaths passed 90,000). 

The most notable presidential response to the protests after George Floyd’s death was Trump’s short walk from the White House to St. John’s Church, where he arranged a photo op of himself holding up a Bible. To get to the church, Trump used militarized security forces, rubber bullets, and tear gas to disperse peaceful protesters. After such tactics were criticized in the press, the Trump campaign claimed the press distorted the Trump response. 

In general, Trump and his administration have attempted to link the protests to radical leftists, including a loose group of anti-fascist activists known as antifa. Trump’s most outrageous claim was that a 75-year-old man knocked to the ground by Buffalo police and hospitalized “could be an ANTIFA provocateur.” Yet, as a New York Times headline indicated, “Federal Arrests Show No Sign That Antifa Plotted Protests.”

Of course, Trump and his supporters often damn media that are critical of him by calling them “fake news.” But this frequent and careless labeling, even against such conservative publications as the Wall Street Journal, is hardly credible. As historians, we often emphasize that in seeking truth we should rely on reliable sources. Can anyone seriously claim that Trump is such a source?

In a post on HNN last month, Christine Adams and Nina Kushner wrote that “to hold Trump and the GOP accountable . . . will require a shared understanding of what constitutes truth. . . . This idea of truth based on reason and evidence is what supports almost all research from life-saving medical breakthroughs (such as the coronavirus vaccine we are nervously awaiting) to the development of the iPhone.  But it is not the monopoly of research. Rational evidence-based inquiry is the hallmark of journalism, the work of intelligence agencies, and even the legal system, however imperfectly.” Yet, the two historians realized, “it is exactly this understanding of the truth that is at risk.”

Historians’ stress on truth-telling goes way back. A half century ago, for example, David Hackett Fischer emphasized it in his book Historians’ Fallacies (1970): “Every true statement must be thrice true. It must be true to its evidence, true to itself, and true to other historical truths with which it is colligated. Moreover, a historian must not merely tell truths, but demonstrate their truthfulness as well.”

Fischer’s book remains valuable because he reminds us of all the errors we as historians can make, and have made; and he is absolutely correct in emphasizing the centrality of truth-telling to our profession. Seeking truth about past events must always remain our lodestar.

What former President Obama said in a July 2018 speech in South Africa is even more true today than it was two years ago. Still early in the Trump administration, Obama stated that “too much of politics today seems to reject the very concept of objective truth. People just make stuff up. . . . We see it in state-sponsored propaganda. . . . We see it in the promotion of anti-intellectualism and the rejection of science from leaders who find critical thinking and data somehow politically inconvenient.” 

Although Obama did not mention Trump, it was easy to discern that the former president believed his successor was encouraging truth trampling. And, unfortunately, Trump’s Republican Party was descending into the lying pit along with him. Donald Trump and His Assault on Truth cites a 2007 Associated Press–Yahoo poll which “found that 71 percent of Republicans said it was ‘extremely important’ for presidential candidates to be honest,” but in a 2018 Washington Post poll only 49 percent thought it was important, “22 points lower than in the poll a decade earlier.”

The party that once prided itself on emphasizing virtues such as honesty (see, e.g., William Bennett’s 1993 Book of Virtues) was apparently now having second thoughts about a value Bennett thought “was of pervasive human importance.” But Obama told his South African audience that “the denial of facts runs counter to democracy, it could be its undoing, which is why we must zealously protect independent media; and we have to guard against the tendency for social media to become purely a platform for spectacle, outrage, or disinformation; and we have to insist that our schools teach critical thinking to our young people.” In the spirit of that speech, where four times he repeated the phrase “history shows,” we can also add that historians need to continue insisting on the importance of truth-telling. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176187 https://historynewsnetwork.org/article/176187 0
Has Trump’s Popularity Reached a Tipping Point? Joe McCarthy's Fall May Give Clues



For more than three years, criticism of Donald Trump’s flawed presidency has been intense, yet his base of public support has remained solid. Commentators in the national media wonder, however, if Trump is beginning to lose his political grip. They point to public concern about his administration’s inadequate response to the pandemic, surging unemployment, and scathing criticism from generals and prominent officials who served in the White House. Opinion polls indicate some slippage in the president’s approval numbers. Might this be a tipping point, journalists ask? Could the President lose in November?

Much can happen between now and November 3. In the past Trump has pulled out of difficulty on numerous occasions, including the final weeks of the 2016 presidential election. But there are some intriguing similarities between Trump’s situation and that of Joseph McCarthy. Senator McCarthy seemed invincible at the beginning of 1954. In a matter of months, he fell from grace. Do Donald Trump’s recent difficulties suggest he, too, may experience a loss of public support? Trump’s situation resembles McCarthy’s in some ways, but there are also notable differences.

Joseph McCarthy, a U.S. senator from Wisconsin, rose to prominence in early 1950 by stoking fears about communism. His speech at an event for Republican women attracted considerable media attention. McCarthy claimed misleadingly that he had a list of 205 names of disloyal officials who were “still working and shaping the policy of the State Department.” Journalists and politicians identified McCarthy’s lies and misrepresentations, but the senator managed to keep them off-balance. McCarthy often announced new charges about communist influence in America, diverting attention from controversies related to his previous assertions.

When a conservative Democratic senator, Millard Tydings, headed an investigation of claims about communist influence, McCarthy attacked him. McCarthy’s staff promoted a doctored photo that falsely associated Tydings with an American communist leader, and McCarthy aided Tydings’s political opponent. Millard Tydings had been a popular lawmaker before McCarthy targeted him. Tydings suffered a huge election defeat in 1950. His experience demonstrated the perils of resistance to McCarthy. 

During a four-year period, Joseph McCarthy wielded extraordinary power. Like President Trump, he bullied and threatened opponents. Many of McCarthy’s fellow Republicans were troubled by his behavior, but they kept quiet. They understood that McCarthy’s aggressive tactics boosted the GOP’s political fortunes. Republicans also recognized the electoral power of McCarthy’s loyal followers.

Joseph McCarthy achieved broad public support largely because communism seemed to be expanding globally. In the years after World War II, the “Cold War” began. The Soviet Union tightened its grip on Eastern Europe, Mao Zedong’s communist revolution took control of mainland China, the Russians developed nuclear weapons, revelations indicated spies gave secrets to the Russians, and the Korean War dragged on without a settlement. “Reds” seemed to be making substantial gains. Americans wanted tough leaders who would stand up against communist aggression. McCarthy acted like the man of the hour.

In early 1954 Joe McCarthy’s popularity ratings were strong. A poll in January reported that 50% of Americans queried approved of him and only 29% disapproved. By late 1954, however, polls revealed a striking loss of support: in a November 1954 survey, 35% judged McCarthy favorably and 46% unfavorably. 

What caused the senator’s fast decline? McCarthy overreached, especially when he attacked the U.S. Army. Newscaster Edward R. Murrow’s television program delivered a scathing indictment of McCarthy’s tactics, and the Army-McCarthy hearings, also broadcast on national television, revealed McCarthy’s lies and abuses. Changing conditions also weakened the senator. Dwight D. Eisenhower, a war hero and popular Republican president, restored public confidence. Cold War tensions eased. The Soviet leader, Joseph Stalin, died in 1953, and a few months later fighting in the Korean War stopped. McCarthy’s tactic of stoking fear of communism lost much of its appeal in the changing political environment.

The similarities to Donald Trump’s situation are intriguing. President Trump has also attempted to frighten the public with scary claims about dangerous conspiracies and threats from radicals. Trump has pounced on Republicans and former administration officials who criticized him publicly. His fierce attacks showed he intends to smash anyone who betrays him. Also, as in the case of Joe McCarthy, new developments have stirred discontent with Trump’s leadership. Presently, Americans are anxious about the pandemic and the economy. They worry, too, about angry clashes over politics, culture, and race. American society appears dangerously divided. 

Trump’s critics, long frustrated by his hold on power, wonder if these changes indicate a tipping point has been reached. Is Trump in serious political trouble, they ask?

The president’s standing has been damaged because of recent events and his bungling leadership, but his influence may not decline as rapidly as McCarthy’s did in 1954. Trump might hold on and win the 2020 election. He is President of the United States, not a senator, like McCarthy. Trump controls the bully pulpit. He receives free media coverage daily and numerous opportunities to promote his candidacy. Furthermore, Republicans in federal, state, and local governments are well positioned to create obstacles to Democratic voters in the 2020 elections. Vladimir Putin’s agents are also expected to use social media to influence American opinion prior to the election.

Still, the history of McCarthy’s rise and fall is suggestive. It reveals that sometimes political power can be more fleeting than pundits realize. New developments beyond a leader’s control, as well as the individual’s controversial actions, can swiftly weaken public support. When a politician’s difficulties receive elevated attention in the national media, critics escalate their attacks. Others, long discontented, feel emboldened by the new signs of resistance. They pile on. Momentum for change builds rapidly.

Has Trump’s influence reached a tipping point? Or will his appeal to many voters, bolstered by incumbency in 2020, produce another election victory? It is too early to tell. But the record of Joseph McCarthy’s decline in 1954 shows that power can diminish swiftly when changing societal conditions and flawed leadership create a perfect political storm. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176198 https://historynewsnetwork.org/article/176198 0
Vannevar Bush: Franklin D. Roosevelt’s Indispensable Expert

Vannevar Bush with President Harry Truman, 1948


Present-day contempt for scientific expertise takes us back some eight decades, to the Second World War.

Enter Vannevar Bush, president of the Carnegie Institution of Washington, who gained a welcome audience with President Franklin D. Roosevelt.

The recipient of a joint doctorate in engineering from Harvard and MIT, Bush had exhaustive bona fides: professor of electrical engineering at MIT, later dean of engineering, and eventually vice-president. Bush is credited with developing the differential analyzer, a pioneering analog computer, among numerous other milestones that prepared the path for our present-day digital age. 

Bush proposed to President Roosevelt the harnessing of the nation’s scientific expertise to national defense requirements, intending to draw from university research centers as well as private enterprise. The immediate result was President Roosevelt’s Executive Order 8807 in June of 1941, authorizing formation of the Office of Scientific Research and Development. The president promptly designated Bush as the executive officer of OSRD, with ready access to the White House.

Simultaneously Bush, again at the request of the president, served on a top-secret committee focused on the development of the atomic bomb. In so doing he routinely interacted with General George C. Marshall (Army Chief of Staff), James Conant (president of Harvard University), and J. Robert Oppenheimer (director of the Los Alamos Laboratory in New Mexico, where much of the classified research and development occurred).

Bush filled the ranks of OSRD with the nation’s foremost scientists, mathematicians, engineers, and physicians in common cause.  He also engaged large numbers of experts drawn from private-sector enterprises such as Bell Labs, RCA, and Sperry Gyroscope, to name only a handful.

OSRD was purposefully civilian-controlled because of the inter-service rivalries, often internecine, that plagued defense research during the First World War. Ultimately OSRD would expend some $500 million (equal to roughly $7.2 billion in 2020). Key achievements included guided missiles, proximity fuzes, radar, and the battlefield-ready walkie-talkie. OSRD’s medical committee also exercised a catalytic role in developing, producing, and distributing penicillin, a painstaking process, which would greatly diminish battlefield fatalities in the D-Day invasion.

One further aspect regarding Vannevar Bush merits airing. During the 1930s he disdained President Roosevelt’s New Deal domestic legislation, regarding it as encroaching on private enterprise (his partisan affiliation, if any, remains unclear). This prompted him to advise the president, in the course of devising OSRD, that the agency should disband once the war concluded. That would occur in 1947. 

Notably, given the exigencies of global conflict, President Roosevelt's trust in Bush was not unique. He appointed two prominent Republicans to his wartime cabinet. Henry L. Stimson had previously served as Secretary of War to President William Howard Taft as well as Secretary of State to President Herbert Hoover. Frank Knox became Secretary of the Navy; not only was Knox the Republican vice-presidential nominee in 1936, but he had also composed a steady stream of editorials for his newspaper vilifying President Roosevelt’s domestic legislation. Moreover, the president designated Wendell Willkie, his Republican opponent in the election of 1940, to serve as his ambassador-without-portfolio to promote the Allied cause during the course of the war. The president also cultivated Senator Arthur H. Vandenberg, a Republican from Michigan, who would abandon his longstanding isolationist sensibilities to endorse an American role in establishing the United Nations.

Composing the foreword to OSRD’s official administrative history, published in 1948, Vannevar Bush cited what he regarded as its summative achievement: “New lessons in understanding and evaluation had to be learned by both the military and by the scientific community.” 

The postwar legacy of Vannevar Bush would culminate in 1950, when President Harry S. Truman signed legislation authorizing formation of the National Science Foundation. Bush believed the new agency, drawing upon the wartime role of OSRD, would advance the nation’s scientific and technological research. In all of this he looked askance, in the era of the Cold War, at the out-sized influence of the nation’s uniformed services in devising national science policy. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176129 https://historynewsnetwork.org/article/176129 0
Roundup Top Ten for June 26, 2020

The Confederacy Was an Antidemocratic, Centralized State

by Stephanie McCurry

Whatever way you look at it, it is impossible to turn this history and its leading figures into a part of American heritage. 


What Kind of Society Values Property Over Black Lives?

by Robin D.G. Kelley

"Let me offer a more productive question instead: What is the effect of obsessing over looting?"



How 1970s U.S. Immigration Policy Put Mexican Migrants at the Center of a System of Mass Expulsion

by Adam Goodman

"90% of the people pushed out of the country during the 20th century were Mexicans deported via a coercive, fast-track administrative process euphemistically referred to as 'voluntary departure,'" writes Adam Goodman.



Monuments to a Complicated Past

by Sean Wilentz

Unless we can outgrow the conception of history as a simplistic battle between darkness and light, we will be the captives of arrogant self-delusions and false innocence.



Donald Trump’s Message is Falling Flat Because it is Outdated

by A. K. Sandoval-Strausz

Our cities and suburbs look and feel very different than they did in the depths of the urban crisis in the late 1960s. That’s one reason Trump’s attempt to revive the law-and-order playbook of that era has fallen flat.



How to Stop the Cuts

by Sara Matthiesen

Historians and other faculty who want to protect their disciplines and their colleagues from budget cuts need to develop maps of power and how it operates in a university.



Police Say Deaths of Black People by Hanging are Suicides. Many Black People aren’t so Sure.

by Stacey Patton

Black people's suspicions that a number of recent hanging deaths were murders rather than suicides echoes a long history of concealing violence against black people by ruling it suicide. 



The Black Women Who Launched the Original Anti-Racist Reading List

by Ashley Dennis

Black women librarians have been important leaders in promoting books and publishing standards that encourage readers to recognize human dignity and reject racist stereotypes in children's literature.



Cancel the Fall College Football Season

by Victoria L. Jackson

For too long, instead of facilitating the intellectual advancement and economic empowerment of young Black men, college sports have helped make American universities another institution perpetuating the undervaluing of Black lives.



Martin Luther King’s Giant Triplets of Injustice

by Andrew Bacevich

Without addressing the fundamental evils of economic inequality and militarism American society will continue to fail to realize the promise of racial equality, as Martin Luther King warned in his 1967 speech at Riverside Church. 


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176178 https://historynewsnetwork.org/article/176178 0
Ghosts of Neshoba: Why Trump Can't Dog Whistle His Way Back to the White House

Photo Robfergusonjr, Wikimedia Commons, CC BY-SA 3.0


Editor's Note: This essay accurately quotes a Mississippi politician's speech in 1963, which includes a racial slur.


Last week we learned that someone with a deep knowledge of a certain kind of history seems to be advising Donald Trump’s reelection campaign. The campaign announced that Trump’s first post-COVID-19-lockdown campaign rally would take place on June 19, in Tulsa, Oklahoma. Immediately, commentators observed that Tulsa was the site of American history’s most harrowing racial pogrom in 1921, and that June 19 was Juneteenth, a holiday commemorating the day slaves in Texas learned they were free. The campaign decided to move the rally to the next day instead.


It was easy, perhaps, to imagine all this was a one-off coincidence, “snowflake” liberals manufacturing outrage out of thin air—until the announcement that Trump’s acceptance speech was to take place in Jacksonville, Florida, on August 27. It was soon pointed out that black people in Jacksonville know that date as “Ax Handle Saturday,” when participants in a 1960 NAACP demonstration were chased through downtown streets for beatings. Next, the campaign tweeted a frightening message about Trump World’s latest boogeyman, ANTIFA—illustrated with the same upside-down red triangle the Nazis forced socialists to wear.


“This campaign is nothing but a dog whistle,” a friend emailed me, referring to the phrase used to describe the longstanding right-wing practice of signifying solidarity with racists by sending signals on coded frequencies that non-racist voters won’t recognize. The comparison most frequently drawn is to the first major speech of Ronald Reagan’s presidential general election campaign, on August 2, 1980, at the Neshoba County Fair in Mississippi. 


There, in the 1950s and ‘60s, politicians competed to outdo each other with nasty imprecations at civil rights organizations like the NAACP—an acronym a politician named Paul Johnson said—in his successful 1963 run for governor—stood for “Niggers, Apes, Alligators, Coons, and Possums.” In 1964, the fair opened as planned on August 8, even though six days earlier the bodies of voting rights activists James Chaney, Andrew Goodman and Mickey Schwerner were discovered buried in an earthen dam a few miles away, assassinated by the Ku Klux Klan with the assistance of Sheriff Lawrence Rainey.


And yet Reagan raised the curtain of his campaign there with a speech in which he affirmed his support for “states rights,” the signature code phrase of Southern racist politicians going all the way back to John C. Calhoun. Three months later, he swept all the Southern states save Georgia against Peach State native son Jimmy Carter. 


In years since, conventional wisdom hardened: dog-whistles work. No wonder, forty years later, Team Trump is eager to sound them again.


But my work suggests a contrary lesson: it’s likely that this speech hurt Reagan more than it helped him. In 1964 Barry Goldwater, after voting against the landmark 1964 civil rights act, got 87 percent of Mississippi’s vote. In 1976 Gerald Ford, who first supported, then voted against civil rights legislation, and whose Republican Party by 1976 was widely understood as anti-civil rights, lost Mississippi by only two points. But in 1980 Ronald Reagan, who had opposed all the civil rights bills of the 1960s, barely won Mississippi, improving on Ford’s performance there by only one percentage point.


Why? Return to that hot August day in Mississippi. Listen to the speech, which is available on YouTube. Of the scores of Reagan speeches I’ve listened to, this is perhaps the most diffident performance I’ve heard. He took less than ten minutes, an unusually large portion of them devoted to stories and jokes, which went over much better than his most famous line. That line he rushed and muffled, as if he were nervous, far from the sort of demagogic bray you’d expect reading accounts of the event.


According to unpublished research by historian Marcus Witcher of the University of Central Arkansas, the infamous words were added at the last minute at the private suggestion of Reagan’s host, Congressman Trent Lott. That makes sense because, in fact, the phrase contradicted the campaign’s central messaging strategy. 


Documents I’ve studied reveal a campaign veritably obsessed with fighting the perception that Reagan’s conservative, anti-government politics camouflaged prejudice. A 1978 memo from Reagan advisor Peter Hannaford to Edwin Meese, for example, concerning “SUBJECT: ‘gay issues, revisited,’” observed “a very thin line to tread between getting the support of fundamentalists, on the one hand, and people who need to know how strongly RR is opposed to bigotry, on the other. The latter, in my opinion, are much more potent politically and in terms of swaying the opinions of others.”


That spirit carried forward through 1980. Reagan pollster Richard Wirthlin devised a strategy for Reagan to appear as often as possible before black audiences. “We weren’t expecting to pick up any black votes in New York,” an advisor later noted. “We just wanted to show moderates and liberals that Reagan wasn’t anti-black.”


The same message even held for white Mississippians. 1980 was well into the ascendancy of what historians like Matt Lassiter call “colorblind conservatism”: advocating policies that disparately impact African Americans, without appearing to do so. 


Indeed, one of this story’s strange ironies is that the campaign was originally planned to open before the Urban League—a venerable national civil rights organization. A scheduling problem resulted in the Neshoba speech coming first.


It’s not that the campaign did not seek to appeal to bigots; one 1979 Wirthlin memo said the most promising potential seam of Reagan voters was Democrats scoring highest on “authoritarianism—and a low score on egalitarianism.” And it wasn’t only Ronald Reagan’s general election campaign that opened in a racist epicenter. His nomination campaign did as well, in South Boston, site of vicious violence against integrating black students only five years earlier. Beside him on the podium was a politician named Albert “Dapper” O’Neil, an anti-integration leader famous for never going anywhere without a gun, and for his adamant support for South Africa’s apartheid government.


Next Reagan traveled to Cicero, the Chicago suburb so inhospitable to black people that Martin Luther King once gave up on a plan to march there for open housing after the Cook County sheriff told him it was “awfully close to a suicidal act.” Reagan’s briefing materials instructed him that a top issue of voter concern there was a “recent HEW [Department of Health, Education and Welfare] decision to force a busing program on the city of Chicago and surrounding suburbs.”


He delivered his standard speech in Cicero and “Southie,” mentioning nothing about busing, just government overreach in the abstract. The campaign was skillfully treading that “thin line” described by Hannaford. Had he also delivered his standard speech in Neshoba nine months later, he would have done so again. Instead, he stepped over the line, taking Trent Lott’s advice—and his endorsement of “states rights” turned the dog whistle into a train whistle. 


His diffident delivery suggests Reagan understood how risky this was. If so, his suspicions were correct. It was the immediate conclusion of voices across the political spectrum that this was a terrible blunder.  


Carter joined seven southern governors in demanding an apology, skillfully playing to Southern political tropes by portraying Reagan as an unwelcome carpetbagger. Andrew Young penned a moving essay for the Washington Post about stopping in Neshoba County during Martin Luther King’s 1966 March Against Fear. King described the lynching in his speech, concluding, “The murderers of Goodman, Chaney, and Schwerner are no doubt within range of my voice.” A voice rang out: “Yeah, damn right. We’re right behind you.”


The Reagan campaign had begun on its back foot—not least in Mississippi. Witcher unearthed a report on the ground from a party worker there: “Three weeks ago Reagan had a landslide victory in Mississippi. Today it is a tossup.” The state Republican chairman listed the campaign’s liabilities—writing “states’ rights flap” at the top. At Reagan headquarters, the fallout was so great that the campaign brought in a ringer from a major Washington lobbying shop to handle press relations.


And, just the opposite of Donald Trump of late, the campaign doubled down in its outreach efforts toward liberal constituencies. This was something Ronald Reagan was rather skilled at. After the candidate met with National Organization for Women president Eleanor Smeal to affirm that he supported feminist goals, she emerged to tell the New York Times that Reagan was “finally waking up to the fact that this issue is a lot hotter than he realized,” repeating campaign talking points that he had supported fourteen civil rights bills as governor.


Meanwhile, black surrogates hymned Reagan’s praises: “for too long the Democratic Party has taken black people for granted and lied to them about Republicans.” 


These efforts were so successful that, astonishingly, in the middle of October, none other than Ralph Abernathy stepped up to the pulpit of an African American church in Detroit, identified himself as “the man in whose arms Martin Luther King died,” and said, “I have been praying and I have been studying and today I have had an opportunity to come to a decision after a private meeting with Governor Reagan. And after we discussed certain issues, I am thoroughly convinced. . . . I endorse the candidacy of Ronald Reagan as the next president of the United States!”


And, of course, all the while, Reagan passionately plumped for policies that his voters full well understood would disadvantage African Americans. Thomas and Mary Byrne Edsall would later find that Carter received 93 percent of the vote of those who most enthusiastically supported efforts to improve conditions for black Americans. Reagan got 71 percent of the vote of those who most strongly opposed them. 


Ronald Reagan had the skill to recover from a lapse in discipline when it came to treading the fine line of race. Compare that to the current Republican aspirants. Juneteenth? “Ax Handle Saturday”? Nazi symbols? Shithole countries, Mexico sending their rapists? Memo to Donald J. Trump: you’re no Ronald Reagan. Dog whistling takes skill, and your racism is far too blatant to walk that fine line.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176063 https://historynewsnetwork.org/article/176063 0
The Great Upheaval of 1877 Sheds Light on Today’s Protests



One hundred and forty-three years ago the nation was shaken by a nationwide series of strikes almost amounting to a mass rebellion. Though there are clear and obvious differences between the issues, modes of collective action, and the participants of that upheaval and the multiracial protests of African-Americans, other people of color, and their white allies that have occurred over the past two weeks, the similarities are real enough to offer some perspective on present circumstances.


The issue that started the 1877 affair was not police brutality and institutional racism but economic inequality. The year 1877 was the low point of the 1873-1878 depression, which brought wage cuts of 10 to 30 percent, driving many workers and their families to the point of desperation. The strikes began when railroad workers in Martinsburg, West Virginia, walked off the job following a 10 percent cut on the Baltimore & Ohio Railroad. The strike spread west and soon engulfed transportation centers and major industrial cities including Pittsburgh, St. Louis, and Chicago.  


But, like the rallies and marches against the murder of George Floyd, what started out as a protest quickly escalated beyond the control of those who sought to lead it. It devolved into heterogeneous crowds with their own dynamics, sometimes resulting in violent clashes with authorities and property damage.   


In Pittsburgh, after it became clear that the Pittsburgh police and militia sympathized with the local crowds, authorities called in the militia from Philadelphia, sparking outrage and violence. After the outsiders fired on the protesters, causing twenty deaths, a diverse crowd tried to burn down the roundhouse into which the militia had fled and then burned and looted the rail yards.


In Chicago the railroad strike quickly escalated into a general strike for a 20 percent wage increase and the eight-hour day.  Mobile crowds of various occupations (or no occupation at all) traversed the industrial districts calling out employees to strike. To most of the press, they were lawless mobs--“ragamuffins, vagrants, and saloon bummers.” But a more accurate description was that they were “roaming committees of strikers” often joined by passersby and teenage boys out for adventure.  The press’s conflation of protesters expressing  serious social grievances with these hangers-on encouraged much of the public to dismiss the whole affair as a “riot.”


After the first day, male crowd members were joined by working class women with their dresses tucked up, sometimes carrying stockings filled with stones to use as weapons. When they were confronted by police and militia the results were bloody and tragic.  Despite the brutality of the police in the present protests, the forces of order in 1877 were far more merciless in suppressing the strikes and disorder. In Chicago, at least thirty working people lay dead after four days of clashes with police.  In New York, where those in authority acted on advice akin to Donald Trump’s call to “dominate” protesters, there were no strikes after police brutally attacked and dispersed a mass meeting on the first day of the strike.


Who were these crowd participants of 1877?  Though few African-Americans lived in northern industrial areas, in places where significant numbers of black people lived, notably Cincinnati, St. Louis, and Louisville, they participated in the upheaval with alacrity.  But the largest portion of strikers in the industrial cities were white immigrants, mostly Irish, German, Bohemian (Czech), and Scandinavian.  These men and women were at the bottom of the class and status orders of the new industrial society then taking shape in the urban North.  They were acutely conscious of themselves as a group set apart from respectable society, “looked down upon and despised” as one young striker put it.   


The anger they displayed at the police and militia closely parallels that of today’s protesters. In Chicago, Irish crowd members called the police “peelers,” a term of derision imported from Ireland, where the constabulary of Sir Robert Peel had enforced British dictates on the colonized Irish.  While police brutality was not an issue raised by the strikes, it did become a major issue afterward. In Chicago, a new mayor was elected in 1879 having explicitly promised to limit the use of the police to suppress strikes and break up the gatherings of Socialists. In New York, where the police were staunchly backed by leading businessmen then in control of city government, they ran rampant over the labor movement, making their brutality a major issue in the labor party’s Henry George campaign of 1886. In 1890, the unions of the city’s construction workers charged that the city’s police force “as a body is dishonest, brutal, even criminal.”


There is much we can learn from recalling the 1877 strikes. Throughout American history, white immigrant and native-born working people as well as black people have found it necessary to mount unorganized, spontaneous nationwide protests and have faced off against the police in angry, sometimes bloody encounters. During these episodes, the news media has seized on instances of rioting and opportunistic looting to dismiss the substantive demands of protesters. 


And then as well as now, indiscriminate police violence in dealing with protests has elevated the issue of police accountability into a major political issue.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176018
Witness Against the Beast



Donald Trump, June 1st, 2020.  Washington DC.  A large, powerful, white man, brandishing a bible like a billy club, in aggressive repudiation of all those who demand he acknowledge the death of George Floyd at the hands of Minneapolis police, and the endemic racism of contemporary America to which Floyd’s death and the deaths of countless others stand witness.  The image is unforgettable.  What does it mean for America’s evangelical Christians, to whom it is directed?

“I thought, look at my president!  He’s establishing the Lord’s kingdom in the world,” says Benjamin Horbowy, 37, of Tallahassee, Florida, in response to Trump’s June 1st photo-op.  “I believe this is a president who wears the full armor of God” (Guardian 3 June 2020).

I have no reason to doubt the sincerity of Mr. Horbowy’s beliefs.  Still, I ask him and those like him to examine their convictions very carefully.  Why?

There is another bible on display just a few blocks from the spot on which Trump stood, in the National Museum of African American History and Culture; a bible, unlike Trump’s, that is open not closed, worn and torn through long use, not pristine; a bible fragile with age, and sweat, and suffering.  It is Nat Turner’s bible.  Like Trump’s bible, it too stands for the power of evangelical Christianity, but it marshals that Christianity in defiance of profane white dominion rather than in defense of it.

Nat Turner was a Virginia slave who, in August 1831, led a bloody uprising that resulted in the deaths of 55 white and 44 black people.  Aside from the rebellion that bears his name, Nat Turner is best known for being unknowable, “the most famous, least-known person in American history.”  But this is not entirely true.  We know more of Turner than of virtually any other African American slave of his time, thanks to the remarkable Confessions of Nat Turner, a pamphlet composed by a local Virginia attorney, Thomas Ruffin Gray.  Gray based his pamphlet on an extended conversation he had with Turner in the days prior to Turner’s trial (for conspiring to rebel and making insurrection) and his inevitable execution, “hanged by the neck.” 

More than fifty years ago, at the height of the civil rights movement, William Styron published a fictional account of Turner and his rebellion that drew in part on Gray’s Confessions.  Styron wanted to make the man and his motives comprehensible.  But Styron had no real desire to understand Turner on any terms but his own.  The historical Turner, Styron believed, was “a religious maniac,” and this was a figure with whom he wished to have nothing to do.  Historians, too, have secularized Turner.  Eugene Genovese, the late historian of slavery, thought Turner deserved “an honored place in our history” because he had “led a slave revolt under extremely difficult conditions.”  But the same Genovese derided Turner as a madman, a religious fanatic, “who had no idea of where he was leading his men or what they would do when they got there.”

Through the scribbles of his “confessor,” Turner left us an invaluable account of himself – deeply Christian, deeply evangelical, moved to act by an extraordinary faith that drove his apocalyptic thinking all the way from the squalor of his life as a slave in Southampton County to Christ’s Crucifixion, and onward, to the Second Coming and the Last Judgment. 

This Turner, who spoke plainly about himself, without disguise, has been turned into a shrouded mystery because we refuse to pay attention to what he actually said.  But once we penetrate the fog of misdescription and incomprehension we find that Nat Turner’s defiance of the profane and degenerate regime of Southampton County slaveholders was rooted in the deep marrow of his evangelical religious ideation.  That deep marrow gave him the strength to endure, the knowledge to understand, and the weapons with which to fight.  It gave him the power to witness against the beast, to reveal in prophecy, and to seek redemption – for all.  

Nearly two centuries on, America’s white evangelicals find themselves allied with the successor of that profane slaveholder regime against which Turner fought – a successor regime of race hatred and white supremacy.  Like it or not, they are in alliance with the beast.  For the sake of their own souls, they would do well to reconsider.


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176071
After Those Cruel Wars Were Over: Lessons from Two Economic Recoveries



As the U.S. unemployment rate hovers around the awful level of 15 percent, it is tempting to see the current crisis as a second coming of the Great Depression. But this analogy may go too far, to the extent that it leads people to believe that unemployment will still be high and production low for many years to come; that because 2020 is like 1930, 2029 will be like 1939. If widely accepted, this narrative projecting a Great Depression of the 2020s may itself inhibit recovery, by discouraging spending and investment. 

A more optimistic and useful set of analogies, we believe, is provided by the experiences of the United States after the World Wars. In the latter half of 1918, the U.S. economy suffered two severe shocks. First there was the end of the Great War, which meant the demobilization of the armed forces and the termination of the production of munitions. And then, as we all now know, the U.S. was hit by the Spanish flu pandemic. An economic contraction began around August 1918, even before the end of the war and the start of the deadlier second wave of the flu.  But it ended and an expansion began in March of 1919, making the postwar recession one of the shortest since the Civil War. Then, in 1920, there was another sharp contraction, made worse by the Federal Reserve’s decision to adopt a tight money policy to squeeze out the ongoing inflation, a legacy of wartime monetary policies. But by mid-1921, the U.S. economy was growing again, and, despite some weaknesses and short downturns, enjoyed close to a decade of solid growth. The story was similar at the end of World War II. A contraction began in early 1945, even before VE Day, but it ended before the end of the year. Once again the economy began to expand after a relatively brief contraction. 

These are not isolated examples. Throughout history, nations and communities have demonstrated remarkable resilience in the wake of disasters, including those that caused extreme dislocation and mass death.  For example, Germany and Japan both made astonishing recoveries in the 1950s, despite the enormous destruction and loss of life they had suffered during World War II.  Remarkable post-crisis rebounds were also common in earlier eras.  Indeed, in the mid-nineteenth century, the great economist and philosopher John Stuart Mill remarked on “… what has so often excited wonder, the great rapidity with which countries recover from a state of devastation; the disappearance, in a short time, of all traces of the mischiefs done by earthquakes, floods, hurricanes, and the ravages of war. An enemy lays waste a country by fire and sword, and destroys or carries away nearly all the moveable wealth existing in it: all the inhabitants are ruined, and yet in a few years after, everything is much as it was before."

The reason, as Mill explained, was that much of the durable capital, and the “skill and knowledge of the people”—what we would now call the human capital—survived, so they could be set to work again after these crises. And the same will be true for us. Automobile companies will still know how to make automobiles, hairdressers and barbers will still know how to give haircuts, kindergarten teachers will still know how to educate young people, and surgeons and nurses will still know how to do hip replacements.  Moreover, pent-up demand will help speed recovery. People will continue to want automobiles, haircuts, and … many other things.

Although history suggests the likelihood of a relatively rapid recovery, this doesn’t mean, of course, that there is nothing government can do to promote it. For example, during World War II, Congress acted to correct the previous neglect of war veterans by enacting the G.I. Bill.  That legislation provided unemployment benefits, which, thanks to the brevity of the postwar recession, were used less than expected. But veterans did take great advantage of the G.I. Bill’s education benefits, which helped many veterans whose education had been disrupted by the war and, moreover, provided additional investments in human capital, above and beyond those available before the war. Today, with millions of our young women and men having their educations disrupted by the pandemic, governments can aid recovery and longer-run prosperity by making new investments in education and human capital.  These might be beneficially combined with additional investments in infrastructure, including those focused on environmental sustainability, which would further enhance the future prospects of our young people.

Another lesson from the world wars is that even though we have to spend enormous sums on crash industrial programs to overcome the crisis, we don’t need to turn a blind eye to corruption and profiteering. During World War I, special excess-profits taxes were imposed to deal with these problems, but they were too little and too late to prevent widespread public discontent with profiteering.  This was remembered by Congress and the leaders of the World War II mobilization, who created a strict regime of high income taxes, windfall profits taxes, price and profit controls, and audits and investigations, to minimize malfeasance and boost public morale.  Today, we may not require such a heavy hand of regulation as the one that prevailed during World War II, but there is plenty of room for Congress to do more to investigate and remedy current problems with profiteering, corruption, mismanagement, and inequality.  So far, the Trump Administration has resisted sharing information about billions of dollars’ worth of emergency loans and other pandemic-related public outlays. To boost transparency and public trust, Congress may need to do more to support a new version of something like the Truman Committee, which provided useful public oversight of the military-industrial mobilization for World War II.

Of course, it is possible that this crisis will prove to be harder to overcome than we expect. Vaccine programs may run into unforeseen difficulties; U.S. and world leaders may fail to make good policy choices; and pandemic disease may prove to be so unmanageable as to suppress employment and investment for many years to come. We don’t dismiss entirely this dangerous possibility of a Great Depression of the 2020s, but we suggest that history offers considerable hope that it can be avoided. Human ingenuity and resilience, if adequately supported by competent governance, should allow us to overcome this crisis in months, instead of many years. We should embrace a cautious optimism, while demanding that our leaders take steps to promote, rather than inhibit, recovery and long-run prosperity. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176017
Misremembering the Fall of France 80 Years Later (Part 2)

Part one of this article is here.


Calais, June 1940. Bundesarchiv, Bild 146-1971-042-08 / CC BY-SA 3.0 de


Any review of the military events of 1940 inevitably leads to some appraisal of the pre‑war condition. If resistance were actually intense, in will and in weapon, what might this suggest about a people said to have been so gutted by the experience of the First World War that all they wanted to do was enrich and amuse themselves? How does it square with contemporary reports that, contrary to the musing of latter‑day prophets, they went to war in September 1939 with confidence and determination? The American ambassador reckoned the nation's "self control and quiet courage" to be "far beyond the usual standard of the human race." Foreign journalists were struck by the fact that there were almost no incidents of reservists failing to report for duty. Janet Flanner judged the nation's morale "excellent" for being "intelligent, not emotional." If there was no enthusiasm for war, neither was there panic, nor presentiment of disaster ‑ which is why, when it came, one instant autopsy followed another in desperate bids to discover, after the fact, what had been missed before.


Had French intelligence overlooked the build-up of German arms under Hitler, or misunderstood how they would be used? No, it had monitored German rearmament since the 1920s, and was clear on the principles of what came to be called blitzkrieg. Had it misunderstood Hitler's intentions, or allowed successive French administrations to become complacent about the nation's security? No again. The warnings were legion and accelerating since 1936.


Had those administrations failed in their post‑Depression responsibilities by refusing to invest in the most modern instruments of war or in the industrial infrastructure needed to produce them? Again, the answer is no. Between January 1937 and September 1939, the French tank force had leaped from 162 machines to more than 2,200. The numbers of 25‑mm anti‑tank guns had risen from 1,800 to more than 2,600, while the arsenal of 75‑mm anti‑aircraft guns had tripled to nearly 400. In September 1938, with a monthly production of only 39 modern planes, the air command said war would mean its annihilation within a fortnight. By September 1939 monthly production was 285. By May 1940, French monthly production of modern combat aircraft had surpassed 600, only a whisker away from German production; and it is in those production figures that one finds an explanation for why the French air force actually doubled in size between the onset of war and the armistice ‑ this despite the loss of 2,000 planes during the six‑week campaign.

Was there then something inherently flawed in the character of the country's leaders, or something missing in their inner constitutions that made them ill‑suited for war‑time leadership? Not unless being decorated war veterans from 1914‑18 ‑ as more than two‑thirds of the French cabinet were ‑ somehow disqualified them for war‑time office. Edouard Daladier, Prime Minister between April 1938 and March 1940, was one such. As an article in the Free Press of 2 September 1939 reported, Daladier had already seen "about as much front‑line fighting... as any man could." His commander‑in‑chief, General Maurice Gamelin, also received high marks from the same paper. Correspondent Harold Moore told the paper's readers that Gamelin, another veteran of the First World War, had assembled what was reputed to be "the finest army in Europe."


How then to reconcile all that made France's defeat unlikely and unpredicted, with her undeniable collapse? At the outset it might be worth remarking that this defeat was not singular, as the Poles, the Danes, the Norwegians, the Dutch, the Belgians, even the English survivors from Dunkirk, would attest. As for the French themselves, the military and civilian leadership, there were certainly errors of judgment.


Some of them were long‑term and abiding. They underestimated the speed with which armored vehicles could negotiate the hilly, forested terrain of the Ardennes, an underestimation that left them content to install only light fortifications across that sector, and to deploy behind those defences only reserve infantry divisions of mainly middle‑aged troops. Those miscalculations, in turn, were magnified by the related strategy of rapidly advancing the mobile left flank into Belgium at the first sign of a German assault, a plan which ensured that some of their best forces ‑ including one of their three light mechanized divisions ‑ were moving away from France in one direction just before seven panzer divisions started moving toward France in the opposite direction.


Significant, too, was the fact that their rearmament program was slower than what the future proved was necessary, partly because the country had emerged later from the world economic Depression than most great powers, and partly because that bitter experience had inspired a commitment to fiscal prudence. Moreover, and contrary to the notion that the French were technologically backward, their acute appreciation of the speed of technological change actually encouraged delaying mass production of the most modern weapons ‑ tanks and fighter planes especially ‑ until a crisis seemed imminent and their deployment more likely to determine the war's outcome.


Related in various ways to all of the foregoing was the signal failure in May 1940 to comprehend quickly enough the lightning pace of the early campaign. Having for too long concentrated on maximizing the armor and armaments on their own tanks ‑ at the expense of speed and fuel range ‑ the French high command could not adjust in a matter of weeks to the distances that enemy armour could traverse, its course paved in advance by the destructive intrusions of the German assault bombers. Calculations of the enemy's capacity for reaching its targets with adequate fuel and infantry support were consistently out by hours, a half‑day, even a day. And related to this, in turn, was the interwar air command's too‑prolonged fascination with strategic bombers, and the attendant playing down of fighter aircraft and of the on‑field impact of dive bombers. They were not ignorant of any of these instruments of modern warfare. They knew, but blinded by the certainty that they were right, they had not understood.


The German victory remains a victory, and the French loss, a loss. But what has happened over the past 80 years, particularly over the past 30, has amounted to a slow and meticulous reappraisal of what actually happened in May‑June 1940. Gone are the days of titles such as The Unfought Battle (1968). Current scholarship is dismantling the allegations that have so long supplied the comic with his bag of satirical jibes at France and the French. Slowly, the image of 100,000 Frenchmen with hands in the air is being replaced by the image of 100,000 hospital beds, and half as many gravestones.



Much has been written on this stunning, and surprising, defeat, some of it resolutely focused on the ground and air operations, some of it using those operations only as prelude to much deeper explorations of why the Third Republic collapsed practically overnight. In the ten-year interim between the 70th and 80th anniversaries of the Fall, a significant number of works have appeared, many of which are interpretively consistent with what I have argued. All acknowledge the French fought valiantly in 1940. All acknowledge the staying power of the “Surrender Monkeys” label, the enduring myths of gutless France, the temptation to avert eyes from the national humiliation. But all acknowledge the six-week bloodbath, the casualties inflicted, as well as the casualties sustained. On both sides.

That would be true of Philip Nord’s France 1940: Defending the Republic (2015), of Hugh Schofield’s article “The WW2 soldiers France has forgotten” (2015), and of Richard Carswell’s The Fall of France in the Second World War (2019). True also of Charles de Laubier’s article “Les soldats morts pour la France en 1940 méritent une commémoration” (2015), Jean-Dominique Merchet’s textual reproduction of a 2015 address by Chef de Bataillon Huiban, entitled “Il est temps de réhabiliter le soldat de 1940,” and Dominique Lormier’s La bataille de France au jour le jour (2010), a successor to his Comme des lions Mai-juin 1940: Le Sacrifice héroïque de l’armée française (2005). And finally, we are about to have Rémi Dalisson’s Les soldats de 1940. Une génération sacrifiée (2020).



Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/175955
Viral Consequences

Photo Marti Johnson, Wikimedia Commons, CC BY-SA 4.0


People from all cultures, in every walk of life, rely on stability. As much as possible, we form patterns and habits that are oriented towards keeping things the same. Regularity is an integral element of survival.


Dramatic change holds inherent risks, and refraining from acting impulsively is usually a good instinct. When we take the time to consider our options and make sober decisions, we ensure that our actions carry due weight.


This tendency also means that we sometimes hold on to self-destructive habits or tolerate inequitable situations, individually and collectively. We can also see a new problem arising, or recognize a long-standing issue, yet fail to act because acting requires leaving behind a comfortable routine.


Often it takes converging events to demand change or justice. As we consider the future, the coronavirus pandemic and the response to the murder of George Floyd present critical issues related to each other.


We have had to rely on government honesty, support and protection through the first part of this year because of a virus. Decisions for our well-being have been made for us, and we have sacrificed freedoms at the insistence of our leaders. Then, right before our eyes, we witnessed a horrendous crime, reminding us graphically how law enforcement continues to abuse Black Americans. These same policemen were enforcing curfews for our safety, in the name of government, for months before this murder. 


In the wake of this crime, we have seen how some police officers have openly used violent and inappropriate tactics in the name of quelling disorder. Clearly, abusive policemen are in a minority, but events continue to prove the actions of many are supported by a system of social control that has failed to solve problems and is structured to suppress dissent.  This continues because there are political leaders who see peaceful protest as chaotic and criminal, only because it is a challenge to their preferred ordering of society. 


The demonstrated self-interested behavior of the administration in response to recent events has certainly made us aware of which leaders are committed to acting in the public interest—the whole public—and those who aren’t. 


The blatant partisanship of some political leaders, and its effect, is exposed on a daily basis. For example, President Trump reveals he has no interest in working on the social issues that still plague the United States, and arguably he has reinforced them. And his response to the pandemic shows his preference for economic stability and his own political survival over the need to face medical realities.


The Trump administration had also cut funding for pandemic preparedness and discounted the impact of the coronavirus as it began. 


Delay in tackling difficult issues or acknowledging danger, whether from lack of initiative, greed or prejudice, is outrageous and unforgivable for a leader holding the public trust. 


The president’s decision to stop U.S. funding of the World Health Organization has nothing to do with its effectiveness, and everything to do with his personal gripes and frustrations. And without facts or foundation, Trump continues to minimize the risks of prematurely ending the precautions we have taken to slow the spread of the virus. 


Trump represents a number of world leaders who focus on power and profit at all costs. They reveal their narrow self-interests and blindness to the common needs of our human family.


The current lack of qualified leadership should awaken us to another looming threat on the horizon. As with the unpreparedness for pandemics, and blindness to the violence, bias and repression that faces Black Americans, there is also little concern for the coming global crisis of an order that will have an unprecedented effect on everyone. 


The violence repressing protesters is only a small preview of what we’ll get if and when our society faces famine and drought, refugee migration, another pandemic and/or economic dislocation from an unprecedented climate crisis. Even though the United Nations has presented the undeniable connection between health, social justice, and ecological destruction, most governments, including ours, remain oblivious to this three-pronged threat.


Environmental degradation is no secret; it has been consistently predicted and described by a vast number of scientists who have been observing the planet’s condition over many decades.


Yet the pattern of ignoring key social issues and environmental crises until they become explosive and intractable goes on. 


Even as the virus continues to rage, President Trump attempts to reverse sound policies that emanated from research connecting pollution to serious disease. He has revealed himself to be inept and dangerous in handling every challenge of his administration, and along with a team of climate change deniers, he is allowing continuing degradation of the planet. 


A critical juncture has arrived.


If global warming and atmospheric pollution are not confronted immediately, they will not recede until the planet is unrecognizable. If we wait much longer, belated disaster plans or trillions of dollars will have zero effect in combatting future violent storms, severe droughts, and rising oceans. 


If gargantuan funding relief can be found in response to a pandemic, it is also possible to secure a world superfund to reverse global warming. That must happen very soon.


However, financing is secondary to the attitude change necessary to provoke the industrial and social transformations that will stop the continuing destruction of the Earth’s ecological foundations. 


As the current health threat recedes, even with our new perspective on contagious disease, there will be a natural temptation to fall back into a false sense of security and return to comforting patterns. Yet many people remain at risk of dying as the virus continues to spread.


And as the initial response to the killing of George Floyd subsides, even if there are changes to laws and attitudes that support the struggle of Black Americans, the battle is yet to be won against racists; their political interests will resist anything that represents significant reversals. 


However, without question, if the environmental policies and economics of the most powerful governments in the world remain unchanged, we will continue to virtually ignore the imminent danger, allowing the destruction of the planet’s ecosystem.


With this in mind, the heavy-handed police response to protesters presents a dire warning. Much of the planet’s political leadership, in league with its economic elite, will seek to maintain the economic status quo at any cost, and without concern for human degradation and suffering by any cause. Repressing all protest is a high priority. 


In the wake of the pandemic, the suppression of peaceful protest against routinized police violence toward Black Americans is an early symptom of the “new normal.” It is an intolerable syndrome that requires recognition and a powerful antidote.


Monarchies, theocracies, and dictatorships have initiated horrific events and wars in the distant and recent past. Yet the growing presence of an elite global oligarchy, integrated within democracies, determined to thwart ecological revitalization, is the most dangerous situation the planet has ever faced.  


And the greatest fear of the powerful forces unwilling to embrace the necessary environmental changes is huge uncontrollable public protests.


Over the course of history, in seeking basic needs and a peaceful life, humans have demonstrated flexibility and incredible ingenuity, particularly in times of crisis. It usually involves innovative thinking, and often, revolutionary adaptation.


Yet returning to the comforting stability of daily routines is possible only with a global awakening to the destructive spiral that has begun. We must immediately modify our relationship to the Earth and, as importantly, to each other, or the consequences will be dire.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176016 https://historynewsnetwork.org/article/176016 0
The Twisted History of Domestic Military Intervention

"The Battle of the Viaduct," 1877 Railroad Strike, Chicago.



In his controversial “Send in the Troops” New York Times op-ed, Senator Tom Cotton (Republican-Arkansas) misquoted the Constitution of the United States. New York Times editors, who are under fire for running the essay, either failed to fact-check the article or ignored errors. The Times now admits that the opinion piece “did not meet our standards,” but that does not explain why it was published. Cotton received both a bachelor’s and law degree at Harvard University where his undergraduate major was government, which raises questions about the quality of the education he received at the Ivy League institution. 


According to Cotton’s essay, the federal government has a constitutional duty to the states to “protect each of them from domestic violence.” He placed the phrase in quotes. However, what Article 4 Section 4 of the Constitution actually says is that “The United States shall guarantee to every State in this Union a Republican Form of Government, and shall protect each of them against Invasion; and on Application of the Legislature, or of the Executive (when the Legislature cannot be convened) against domestic Violence.” The federal government only has the constitutional authority to intervene in response to “domestic Violence” when requested by state governments. At a time when Americans are questioning the nation’s racist heritage, it is worth noting that Article 4 Section 2 is the passage that required the return of freedom-seekers who escaped enslavement to the persons who claimed to own them.


In his op-ed essay, Cotton defended Donald Trump’s assertion that under the Insurrection Act of 1807 the President had the unilateral right to dispatch federal troops to states and cities to quell domestic unrest. Cotton claimed “This venerable law, nearly as old as our republic itself, doesn’t amount to ‘martial law’ or the end of democracy, as some excitable critics, ignorant of both the law and our history, have comically suggested . . . Nor does it violate the Posse Comitatus Act, which constrains the military’s role in law enforcement but expressly excepts (sic) statutes such as the Insurrection Act.”


The use of federal troops in a law enforcement role has a twisted and often anti-working class and racist history. The Militia Act of 1792, the forerunner of the Insurrection Act of 1807, was written to advance genocidal policies against Native American nations. It authorized the deployment of federal troops to defeat native resistance to displacement by European American settlers moving into the Northwest Territory. After a series of American defeats by a confederation of native forces including the Miami, Lenape, Huron, and Shawnee, who were protecting their homelands, President Washington ordered the formation of a special federal army unit. Under the command of General Anthony Wayne, federal troops defeated the confederated tribes at the Battle of Fallen Timbers in 1794. In the Treaty of Greenville, the defeated tribes were forced to abandon most of what would become the state of Ohio. I still remember reading about Wayne’s “heroic exploits” as a second-grader in the 1950s in Anthony Wayne, Daring Boy, part of the Childhood of Famous Americans series.


The Militia Act was next used in 1794 against white frontier farmers in western Pennsylvania. Washington, urged on by Secretary of the Treasury Alexander Hamilton, dispatched federal troops to quell protests against a whiskey excise tax that favored large producers and discriminated against small farmers. A 12,000-member federal force was sent to western Pennsylvania to confront a rebel army that proved to be largely fictitious. A small group of farmers was arrested and sent to Philadelphia for trial on charges of treason. Two were convicted, and an apparently embarrassed Washington pardoned them both. This was merely the first instance in United States history of federal troops being used to throttle working-class protest.


The Insurrection Act of 1807 expanded Presidential authority to deploy federal troops to suppress domestic unrest. “[I]n all cases of insurrection, or obstruction to the laws, either of the United States, or of any individual state or territory, where it is lawful for the President of the United States to call forth the militia for the purpose of suppressing such insurrection, or of causing the laws to be duly executed, it shall be lawful for him to employ, for the same purposes, such part of the land or naval force of the United States, as shall be judged necessary, having first observed all the pre-requisites of the law in that respect.” The primary prerequisite for ordering federal troops into action was issuing a warning.


The 1807 law probably was a response by the Jefferson administration to fear of a revolt led by former Vice-President Aaron Burr in newly acquired western territories. Jefferson endorsed the legislation after Secretary of State James Madison argued that under existing law, “It does not appear that regular Troops can be employed, under any legal provision agst. Insurrections but only agst. expeditions having foreign Countries for the object.” Jefferson never used the new Presidential authority against Burr, but did invoke it in 1808 to block smuggling from Canada on Lake Champlain in upstate New York.


In the 1830s, President Andrew Jackson invoked the 1807 law twice. In 1831, at the request of the Governor of Virginia, federal troops were used to help suppress the Nat Turner slave rebellion. In 1834, Jackson used the law at the invitation of the governor of Maryland to break a strike by Irish immigrant workers on the Chesapeake and Ohio Canal. The law and federal troops were later used repeatedly to defeat workers striking for the right to organize unions, fairer pay, and safer conditions. In 1877, President Rutherford B. Hayes sent federal troops into Baltimore, Philadelphia, Pittsburgh, and Martinsburg, West Virginia to break the Great Railroad Strike. In a Presidential proclamation Hayes issued the required warning and announced that the request for federal assistance had come from the governor of West Virginia.


In 1894, President Grover Cleveland used the Insurrection Act to break the Pullman Strike and in 1914 President Woodrow Wilson used it against striking Colorado coal miners. In all three cases federal intervention was requested by the states and powerful business interests. Cleveland also sent federal troops into the Wyoming Territory in 1885 at the request of mining companies when workers affiliated with the Knights of Labor attacked Chinese contract laborers. In 1946, President Harry Truman used federal troops to break a strike by railway workers and in 1952, during the Korean War, he threatened to use his authority as Commander-in-Chief to seize, open, and operate U.S. steel mills when workers went out on strike.


Herbert Hoover notoriously used the Insurrection Act in 1932 to order federal troops to disperse World War I veterans who gathered in Washington DC during the Great Depression to demand a promised federal bonus for their wartime efforts. President Lyndon Johnson (1967) and President Richard Nixon (1971) also used the act to prevent First Amendment-guaranteed political protests in Washington DC, in both cases against the war in Vietnam. Franklin Roosevelt (1943), Johnson (1967-1968), and President George H.W. Bush (1992) used the act to police urban ghettos that exploded in violence in response to systemic racism and police abuse.


The Insurrection Act has been modified and reapproved by Congress a number of times. An 1861 revision authorized President Abraham Lincoln to use state militias and the United States armed forces to prevent Southern secession.  


“Whenever the President considers that unlawful obstructions, combinations, or assemblages, or rebellion against the authority of the United States, make it impracticable to enforce the laws of the United States in any State by the ordinary course of judicial proceedings, he may call into Federal service such of the militia of any State, and use such of the armed forces, as he considers necessary to enforce those laws or to suppress the rebellion” (July 29, 1861).


During the debate over the Ku Klux Klan Act of 1871, Congress rejected limits on federal authority that would prevent intervention to protect the rights of freedmen. President Ulysses Grant used the Insurrection Act three times: in October 1871, to combat Ku Klux Klan activity in South Carolina; in September 1872, to intervene in the Louisiana gubernatorial election; and in May 1874, to suppress armed battles between political factions following a disputed election.


However, as Southern states were “rehabilitated” and white rule reestablished in the former Confederacy, white Americans increasingly opposed federal military intervention in domestic affairs without specific state authorization. As part of the Compromise of 1877, President Hayes, who had no compunction about using the Insurrection Act against workers, signed the 1879 Posse Comitatus Act, restricting Presidential use of the Insurrection Act to protect freedmen in the South.


“From and after the passage of this act it shall not be lawful to employ any part of the Army of the United States, as a posse comitatus, or otherwise, for the purpose of executing the laws, except in such cases and under such circumstances as such employment of said force may be expressly authorized by the Constitution or by act of Congress; and no money appropriated by this act shall be used to pay any of the expenses incurred in the employment of any troops in violation of this section and any person willfully violating the provisions of this section shall be deemed guilty of a misdemeanor and on conviction thereof shall be punished by fine not exceeding ten thousand dollars or imprisonment not exceeding two years or by both such fine and imprisonment.” - Sec. 15 of chapter 263, of the Acts of the 2nd session of the 45th Congress


It is important to note that since World War II, two Presidents, one a Republican and one a Democrat, used the Insurrection Act without prior state requests to defend African American civil rights under the 14th amendment. President Dwight Eisenhower sent federal troops into Little Rock, Arkansas in 1957 to protect African American students integrating the local high school. President John F. Kennedy used federal troops to protect civil rights activists in Mississippi and Alabama. 


If Donald Trump followed the Eisenhower-Kennedy precedent, federal troops would be protecting peaceful, constitutionally protected marchers demonstrating against racism and police abuse in the United States. Unfortunately, that is not about to happen.


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176068 https://historynewsnetwork.org/article/176068 0
Peace is Temporary Without Trustworthy Leaders: Lessons from the Philadelphia Mutiny



We often look to history for guidance on the present, and we frequently call on the Founding Fathers for inspiration. But as we approach the June 21 anniversary of the Philadelphia Mutiny of 1783—when soldiers marched on the city, surrounded the Pennsylvania State House (today’s Independence Hall), and threatened Congress at the points of their bayonets—the lessons for our own time are only troubling. 

Because unlike the people of Revolutionary-era America, we have no equivalent to George Washington, a figure all sides could trust for leadership.

The Philadelphia Mutiny originated in a longstanding injustice: throughout the war, the army was seldom paid. By the early summer of 1783, an armistice was declared and only the final version of a treaty was needed to make it official—but the men in uniform still awaited payment. Even when the men began heading home in early June, thousands departed with empty pockets.

But not everyone. Some lucky Marylanders, for example, had been paid before mustering out, and as they marched south from their cantonment in New York they shared the news with the Pennsylvania troops they passed.

Infuriated by the unequal treatment, some 80 to 100 men left their posts in Lancaster and struck out for the City of Brotherly Love, where they joined more troops stationed in the city who were likewise unhappy at receiving no money for their service.

Rumors swirled that the men would loot the Bank of North America, and Congress implored Pennsylvania to deploy the militia. But state leaders demurred. People thronged the streets to cheer the mutineers, suggesting the militia might not turn out as ordered.

On June 21, the soldiers surrounded the state house. Inside, Congress gathered in the same first floor room where they had adopted the Declaration of Independence. But instead of producing a grand statement of American ideals, they wrangled with the Pennsylvania government, meeting upstairs, about how to make the mob go away.

That night, Congress sent a plea for George Washington, encamped in New York’s Hudson Valley, to send reinforcements. Then, the delegates voted to flee the city and reconvene across the river in Princeton.

Washington was incensed when he heard about the mutiny. “I cannot sufficiently express my surprise and indignation,” he wrote, “at the arrogance, the folly and the wickedness of the Mutineers.” Washington dispatched 1,500 troops who dispersed the malcontents and arrested the ringleaders. 

On the eve of the Revolution’s final Fourth of July, Congress lay prostrate in New Jersey, a sign that independence almost failed before it was fully achieved.

The story of the Philadelphia Mutiny offers scant comfort for us today. A show of force restored order, but that worked because the mutineers were armed soldiers committing a capital crime by going outside their commanders’ orders. They weren’t civilians protesting, as witnessed in the wake of George Floyd’s killing.

Once the mutiny was put down, peaceful demobilization was achieved largely thanks to Washington’s presence as a leader trusted by many soldiers and civilians alike. 

Renowned for his virtue, Washington was scrupulously deferential to civilian command of the army. He acknowledged the military’s proper place with large gestures, such as returning his commission as commander in chief to Congress once the peace with Britain was official. 

Yet, he was also master of the small, telling detail. Whenever he appeared before Congress, for example, he stood while the delegates sat and he called the body’s leader “Mr. President” or “your Excellency,” titles Washington himself would later enjoy.

At the same time, Washington advocated tirelessly for his men. He urged Congress and the states to provide more supplies. He remonstrated with politicians to deliver pay. He warned everyone not to overburden the army’s patience.

In an environment of intense mutual suspicion—soldiers accused civilians of stingy ingratitude while civilians saw the army as a threat to their liberty—Washington’s trustworthiness bound the two sides together.

And that’s where we have a major problem. We have no similar figure to rally around. Instead, we get division and confrontation and encouragement to continue fighting our political enemies.

Until we find leaders capable of building trust, people’s discontent, even if extinguished for the moment, will only smolder until bursting out again.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176074 https://historynewsnetwork.org/article/176074 0
Walter Mondale's Campaign Did Accomplish Something (Ask Joe Biden's Future Running Mate)


Vice President Joe Biden’s promise to choose a woman as his running mate was historic.  Never before had a major presidential candidate so defined his search.  Some suggest Biden should choose an African-American or Hispanic woman, an even more historic commitment. In any event, Biden’s running mate will be the first woman vice-presidential candidate with a strong chance of election.   She will be the product not simply of Biden’s decision but of a process that began 36 years ago when former Vice President Walter F. Mondale opened the presidential ticket to women and other previously-excluded minorities.  

When Mondale announced his selection of Rep. Geraldine Ferraro as his running mate on July 12, 1984, it marked the first time anyone other than a man was a major party national candidate in a general election.  Mondale’s decision was a milestone in giving women access to leadership roles.  But as Ferraro’s acceptance speech a week later recognized, Mondale’s decision had broader significance.  She began her remarks: “My name is Geraldine Ferraro.  I stand before you to proclaim tonight: America is the land where dreams can come true for all of us.”

Ferraro’s opening thought implicitly recognized that Mondale’s pioneering action was not simply in selecting a woman running mate but in conducting the first diverse selection process in American history.  Mondale chose Ferraro from a pool that included members of demographic groups traditionally excluded from national politics--women, African Americans, a Hispanic American, and a Jewish American among others.  Against considerable resistance, Mondale built a diverse pool of qualified options even though few from those groups then held or had recently held the positions from which vice-presidential candidates usually came—senators, high executive officials, governors or members of the House of Representatives.

Presidential politics had been exclusively for white Christian male politicians.   John F. Kennedy overcame anti-Catholic bias in 1960 to become the first Catholic elected to national office.  From 1964 to 1980, the “diversity question” was whether a ticket should include a Catholic, as the unsuccessful 1964 Republican (William Miller) and 1968 and 1972 Democratic (Edmund Muskie, Thomas F. Eagleton and Sargent Shriver) tickets did.  Sen. Margaret Chase Smith in 1964 and Reps. Shirley Chisholm and Patsy Mink in 1972 ran largely symbolic campaigns for their party’s presidential nomination.  Sen. George McGovern contacted his friend, Sen. Abe Ribicoff, a Jew, about possibly being his running mate, but Ribicoff, like many others, found that prospect unappealing.  Ambassador to the United Kingdom Anne Armstrong was among four finalists before President Gerald R. Ford chose Sen. Bob Dole as his 1976 running mate, but Ford thought the risk of choosing a woman was too great.  Rev. Jesse Jackson won primaries in Washington, D.C. and three southern states and attracted about 18% of the votes in 1984, but never had a chance at the nomination.  

Eligibility for the general election ticket seemed limited to white male Christians in 1984 because that demographic essentially monopolized the positions from which vice-presidential candidates were chosen.  Every first-time running mate on a major party ticket in the 20th century had previously been a senator, governor, or member of the House or held high national executive office (with the sole exception of newspaper publisher Frank Knox in 1936).  Yet in 1984 there were only two women in the Senate, both Republicans, no blacks or Hispanics, and two Asian-Americans from Hawaii.  The only Democratic woman governor was Kentucky’s just-elected Martha Layne Collins.  Few women (Patricia Roberts Harris, Shirley Hufstedler, Juanita Kreps) or black people (Harris, Andrew Young, Donald McHenry) had recently served in Cabinet-level positions in Democratic administrations, and only the controversial Young had held high electoral office.  Only 13 Democratic women and 21 African-Americans served in the House.  Most of the seven Jewish senators were Republicans.

Mondale’s vice-presidential search included women, African-American, Hispanic, and Jewish public officials and some white males.  In addition to Ferraro, a three-term member of the House of Representatives and chair of the Democratic Convention’s platform committee supported by House Speaker Tip O’Neill, Mondale interviewed San Francisco Mayor Dianne Feinstein, Collins, Mayors Tom Bradley (Los Angeles), Henry Cisneros (San Antonio), and Wilson Goode (Philadelphia) and Texas Sen. Lloyd Bentsen. Mondale’s main competitor, Sen. Gary Hart, was also considered, as was Gov. Michael Dukakis.  Other likely contenders, such as Sen. Dale Bumpers and Gov. Mario Cuomo, eliminated themselves.

Mondale was criticized because the women and minorities he interviewed lacked the credentials traditionally associated with national candidates.  None of the women or minorities were senators or former Cabinet members; only Collins was a governor and only Ferraro a member of the House.  But Mondale understood the fallacy of imposing conventional criteria since societal bias had denied such groups those opportunities.  Mondale thought some on his list had national talent.  Feinstein later became a senator from California and has been elected six times.  Bradley had narrowly lost for governor of California in 1982; he had led in the polls, but many who told pollsters they would support him apparently changed their minds in the voting booth (the phenomenon of white voters’ support for minority candidates in polls dramatically exceeding their actual vote has been termed the “Bradley Effect”—ed.). Cisneros later served in Bill Clinton’s Cabinet.  Dukakis and Bentsen were the 1988 Democratic ticket.

And there was Ferraro.  Polls indicated that Ferraro’s selection initially made the election competitive before a concerted Republican attack, largely directed at her husband’s business activities, tarnished her image.  Ferraro ran a credible campaign and held her own during the vice-presidential debate.

Mondale’s selection process heralded a new opening of American politics to previously disqualified constituencies.  Jackson was the runner-up in the 1988 Democratic presidential primaries.  During the next quarter century, members of those groups occasionally were considered for the second spot (Elizabeth Dole, 1988; Colin Powell, 1992; Christine Todd Whitman and Jeanne Shaheen, 2000), sometimes perfunctorily, until Joe Lieberman (2000) and Sarah Palin (2008) were unsuccessful running mates.

In 2008, the competition between Senators Barack Obama and Hillary Clinton for the Democratic presidential nomination involved two figures who dominated Democratic presidential politics from 2008-2016.  Talented women and minorities served in the Clinton, George W. Bush and Obama administrations.  Three of the four Supreme Court appointments Clinton and Obama made elevated women, one being the first Hispanic appointed to that tribunal.  From 1997 to 2013, four of the five secretaries of state were women or minorities.  

By 2020, 26 women (17 Democratic) served in the Senate and 9 (6 Democrats) as state governors.  Nearly one-quarter of the House of Representatives are women.  Nine of the 100 senators are African Americans, Asian Americans, or Hispanic Americans.  Six women qualified for presidential debates in 2020 as did members of racial minorities.  Biden’s potential selectees include two former rivals for the presidential nomination, Senators Kamala Harris and Elizabeth Warren, and their Senate colleagues Tammy Baldwin, Tammy Duckworth, and Maggie Hassan, Governors Michelle Lujan Grisham, Gina Raimondo, and Gretchen Whitmer, Representative Val Demings, former National Security Advisor Susan Rice, and Atlanta Mayor Keisha Lance Bottoms, among others.

If Biden’s running mate is successful, she will be the first woman elected to national office after 116 men won those positions in the first 58 elections.  That would be a historic moment, the first time the election to the second office would have produced news more significant than the presidential result.  Yet it will be only the latest chapter in the journey that began in 1984 when Mondale opened his vice-presidential process to persons from such traditionally excluded groups and chose Ferraro as his running mate.

copyright Joel K. Goldstein, 2020

Editor's note: this article was submitted and accepted for publication before Senator Amy Klobuchar's decision to withdraw from consideration. It has been updated to reflect this development.

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176069 https://historynewsnetwork.org/article/176069 0
Spike Lee’s Da 5 Bloods: How Bad is It?



“Awful” is not the most thoughtful way to begin a film review. But why mince words? The film’s “bloods” are four black Americans who have returned to Viet Nam to recover the remains of a fifth, their buddy Norman. If you took the storylines of Francis Ford Coppola’s 1979 Apocalypse Now out of Bloods – yes, they even go upriver to strains of Wagner’s Ride of the Valkyries! – there is little originality left for filmmaker Spike Lee to claim; even its subplot of the four’s search for a buried chest of gold bars is plagiarized from 1999’s forgettable Gulf War adventure Three Kings. 

Bloods trots out every caricature of Viet Nam War figures that you can imagine, beginning with American veterans of the war. The image of traumatized veterans was a Hollywood staple even before PTSD was canonized in the 1980 DSM. The utility of the victim-veteran as a metaphor for the America victimized by the war made it political and filmic catnip from Ronald Reagan’s “Morning in America” years to Donald Trump’s campaign to “Make America Great Again.” You might think Spike Lee could leave it alone but no: when fireworks are thrown outside a bar in Ho Chi Minh City, all five “hit the deck.” Later, one of them randomly claims “we all have PTSD,” while still later another says, “We’re all broken.” 

But Viet Nam veterans were not all broken. Most returned quietly to their workplaces and schools like veterans of any other generation. Many others, politicized and empowered by their wartime experience, joined the antiwar movement. One of those, Chuck Searcy, returned to Viet Nam and founded Project Renew, devoted to finding and defusing unexploded ordnance left behind by the American military in Quang Tri Province. Searcy is white. Maybe Lee has his bloods connecting with Searcy and redefining their mission—instead of searching for buried treasure, they search for buried landmines? Guess again.

Lee’s troop meets up with a group of backpackers in-country to do Project Renew-type work. The group’s leader is a whiter-than-white twenty-something woman, a scion of wealth reaped from what had been French Indochina—a riff from the 2001 Apocalypse Now Redux—who is irresistible to a son of one of the bloods who has (somehow, inexplicably) arrived in the jungle from Morehouse College. 

Bloods’ racial clichés are the film’s strongest through line. The bloods come across as foulmouthed and uninformed about the war as they probably were when they were sent to fight it; the veteran-father of the Morehouse College man is an absent presence in his son’s life; and with the exception of Otis, who takes time in Ho Chi Minh City to find the daughter he fathered back in the day, the bloods are as disdainful of the Vietnamese as many GIs were at the time. And the Vietnamese characters fit stereotypes too: some paramilitary guys looking like the hapless losers seen in most American movies (check); the one woman, a wartime prostitute (check). 

This awful movie is also dangerous. With one blood wearing a MAGA hat, the loss of the war is attributed to home-front betrayal—upon return, we were called “baby killers,” says one of the four. Republicans have been on a half-century campaign to avenge the treachery of the anti-war left, as they see it, and now is not the time to give credence to Trumpian revanchism as a promising course for Black Americans—or anyone.        

The one good reason for seeing Da 5 Bloods is to credential your advice to others that they don’t. 

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176072 https://historynewsnetwork.org/article/176072 0
Dear Vice President Biden: Bring Achievers Back to the White House



Dear Vice President Biden:

Congratulations on your successful campaign and your impending nomination.  The next few months, of course, will be arduous and you will confront endless personal attacks.  But the most difficult challenge will be planning on how to restore and revitalize a tragically hobbled nation.  

The situation before this year began was already dire:  the growing racism and anger permeating the country; unprecedented cruelty against immigrants and those seeking a better life for themselves and their families; crushing economic and health inequities; a debased civil discourse; smashed foreign alliances; and an unprecedented assault on the rule of law.  Events of the past few weeks further exacerbated a grim situation.

We are confronted with the greatest single challenge since at least the Great Depression.  Hundreds and even thousands are dying each day of a disease which was unknown months ago.  The heartbreak for the dying and their bereaved families is palpable.  Added to this is an economy wracked by soaring unemployment and even a scarcity of goods.  And now a shocking police war against peaceful protest is accompanied by violent acts by emboldened racist extremists.

A great leader demonstrates empathy and provides inspiration.  That will be needed as we grapple with the ongoing pandemic and, hopefully, a post-coronavirus world.  Few current political figures have shown as much decency as you, and your ability to relate to everyday Americans will be valuable as you and we move forward.

This health crisis has revealed the devastating result of a continued denigration of competence, expertise, and public service.  There is a vacuum of trained, experienced people to lead the executive branch, from the White House through the departments and agencies.

I spent several years researching one dinner, the unique event that President and Mrs. Kennedy held to honor forty-nine Nobel Prize winners at the White House in 1962.  Joining them at the dinner were even more great American scientists, writers, and scholars. This unique gathering celebrated the apogee of American accomplishment, and it came at a time of crisis, the height of the Cold War.

Those present that night had made enormous contributions to the United States and the world across the spectrum of human endeavor:  medicine, chemistry, physics, literature, and peace.  They selflessly developed medicines and technologies which literally transformed our society.  Some differed with the president, even stridently so.  Several of these distinguished leaders were immigrants fleeing tyranny in their native countries.

President Kennedy understood the importance of symbolism and the value of highlighting the achievements of these individuals.  In his remarks that night he underscored how they would prove to be an inspiration to young people throughout the hemisphere who would someday take their place at the forefront of knowledge and discovery.

We need to return to that spirit, and I hope that you will not only bring the greatest minds back into the government—especially those imbued with integrity—but also make scientists and thinkers welcome once again at the White House.

One other thought involving an earlier president who came to the office during a national crisis.  I also have been studying Franklin Roosevelt’s somewhat obscure speech at tiny Oglethorpe University six weeks before he was nominated for the first time in 1932.  FDR called for an expansion of the national government to meet the emergency; for a planned, national approach to move forward; and, most especially, to undertake “bold, persistent experimentation.”

He also said that day, “It is common sense to take a method and try it: If it fails, admit it frankly and try another.  But above all, try something.”  That speech set the strategy for the New Deal and for Roosevelt’s program for the 1930s.

If you take office in January 2021 you will be facing unprecedented difficulties and headwinds. But I hope that you will follow President Kennedy’s example and honor and encourage experience in searching for a renewed common effort to rebuild the country.  I also hope that you will emulate FDR’s simple yet profound example to be nimble, aggressive, and pragmatic.

This is all a tall task, but you can be the right person at a hinge point of history.  One final request, and more critical in recent days than ever before: “Bring us together.”

Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176073 https://historynewsnetwork.org/article/176073 0
Roundup Top Ten for June 19, 2020

The GOP Missed Its Chance To Embrace Martin Luther King Jr.

by Tim Galsworthy

Invoking a sanitized and selective memory of Dr. King enables politicians and voters to trumpet order and exhibit faux outrage at disorder, rather than face up to endemic racial inequalities.


The History of the “Riot” Report

by Jill Lepore

How government commissions became alibis for inaction.



The End of Black Politics

by Keeanga-Yamahtta Taylor

The 1960s generation of Black protest demanded a stronger presence in local government. The current protest movement recognizes that presence isn't enough; leaders must advance an agenda that serves their least advantaged constituents. 



Bail Funds are Having a Moment in 2020

by Melanie Newport

Activists have supported protesters by contributing to bail funds, but it's time to follow through on the longstanding call of social movement leaders to abolish cash bail as a symbol and symptom of unequal justice.



After World War II, Most ‘Ordinary Nazis’ Returned to Lives of Obscurity. The World Must Recover Their Stories Before It’s Too Late

by Daniel Lee

The act of recovering perpetrators’ voices sheds light on consent and conformity under the swastika, enabling us to ask new questions about responsibility, blame and manipulation.



A Statue Was Toppled. Can We Finally Talk About the British Empire?

by Gurminder K. Bhambra

Protesters who dumped Edward Colston's statue into Bristol harbor have forced a long-overdue discussion of how the British Empire conquered and governed in the past and set the stage for racial divisions in contemporary Britain. 



A Silver Lining for the Golden Arches in Black America

by Marcia Chatelain

McDonald’s has profited handily from its Black customers, while its presence in Black communities has led to a vexing set of circumstances for Black wealth and health.



A Short History of Black Women and Police Violence

by Keisha N. Blain

Despite, or perhaps because of, their own vulnerability to state-sanctioned violence, Black women have been key voices in the struggle to end it.



The Disgrace of Donald Trump

by Sean Wilentz

Trump wants to copy Richard Nixon's "law and order" appeals, but may end up echoing Herbert Hoover's violent crushing of the Bonus March movement.



Appalachian Hillsides as Black Ecologies: Housing, Memory, and The Sanctified Hill Disaster of 1972

by Jillean McCommons

The Sanctified Hill disaster exposed the vulnerability of Black people to climate events due to a combination of placement and neglect.


Wed, 08 Jul 2020 00:07:48 +0000 https://historynewsnetwork.org/article/176062 https://historynewsnetwork.org/article/176062 0