Archive for the ‘History’ Category

Has The Past Become Prologue Again?

Friday, March 24th, 2023

On 30 January 1933, the 85-year-old German hero of World War 1, President Paul von Hindenburg, appointed Nazi leader Adolf Hitler as Reich Chancellor, which was akin to being named Prime Minister. Hindenburg and his German Cabinet, many of whom shared Hitler’s Nationalist positions, thought they could control the loose-cannon Hitler better if he were in Government rather than out of it. Sort of like bringing the camel into the tent, where you hope he’ll spit out, rather than leaving him outside, where you know he’ll spit in.

Thirty-five days later, on 5 March 1933, a coalition of political parties led by the Nazis won the national parliamentary election.

Just as Hindenburg and his Cabinet thought they could control Hitler, so did his coalition party partners. They were all wrong. And, just like that, the 14-year Weimar Republic was dead.

Despite winning only 45% of the vote — 55% of the country having voted against them — the Nazis were now in charge, and within three months the coalition was a thing of the past, with every other political party in Germany having gone the way of the Woolly Mammoth. The Nazis, using what they called “coordination,” had banned them all.

Immediately, Hitler’s Storm Troopers, whose numbers had grown from 400,000 in 1932 to nearly 2 million in January of 1933 (they outnumbered the Jewish population by close to 4 to 1¹) amped up their brutal intimidation and persecution of Jews, Communists and homosexuals. According to the World Committee’s Brown Book, by the end of June they had murdered 43 Jews and severely beaten hundreds more, but the chroniclers point out these estimates are likely quite low.

The Prussian police force was the largest in Germany, and Hitler put Hermann Göring in charge of it. He immediately  populated it with unhinged Storm Troopers wearing police uniforms. They arrested anyone thought to be an “unreliable” German. This included Jews, members of the non-Nazi German press, intellectual elites, homosexuals, and more Jews. In fact, so many were arrested that the country’s prisons could not contain them all. The head of the SS, Heinrich Himmler, solved that problem. On 20 March, just two weeks after the Nazis’ election victory, he announced to the press that “a concentration camp for political prisoners” would be opened at Dachau, just outside Munich. It was to be Germany’s first concentration camp and set an ominous precedent. Two days later, four police trucks ferried 200 of the Nazis’ newly ordained “criminals” to their swell new digs. The citizens of Dachau watched them go by.

Three weeks later, to show they meant business, Himmler’s guards took four Jews out of their cells, brought them outside, stood them against a wall, and shot all four dead.

Dachau, however, was not an improvised solution to an overcrowding problem. As far back as 1921, Hitler had declared that when they came to power, the Nazis would imprison German Jews in concentration camps along the lines of those used by the British in the Boer war.

But the Nazis did much more in the first three months of the Third Reich than round up their version of the usual suspects. They also eviscerated higher education. On 7 April 1933, Hitler’s government, now empowered by the Enabling Act to legislate without the Reichstag, enacted the Law for the Restoration of the Professional Civil Service, which provided for the dismissal of “politically unreliable” state employees. This was a catch-all phrase for Jews, Communists, non-Aryans, as well as anyone who had had the temerity to criticize the Nazis. And since, unlike other countries, all colleges and universities were state-owned, that meant many of Germany’s best and brightest were now out of work and facing physical danger. This included 20 past or future Nobel laureates. Albert Einstein was one of them — Germany’s loss; Princeton’s gain. But the Nazis never cared.

And they did not stop with professors and scientists. On 10 May 1933, at the instigation of Minister of Propaganda Joseph Goebbels, German university students organized an “act against unGerman spirit” in nineteen university towns across the country. They compiled a list of “unGerman” books, seized them from all the libraries they could find, piled them up in public squares, and set them all alight. Goebbels joined the students at the Berlin burning, the biggest, telling them they were “doing the right thing in committing the evil spirit of the past to the flames.” One after another, books were thrown onto the funeral pyre of intellect.

We’re not burning books in America — yet. But we sure are banning them.

That is how it started in that most momentous of years, 1933, a year scholars have likened to the Jacobin Reign of Terror of 1793 and 1794 in France.

But in reality, the Nazis’ rise to power began with a quickly-put-down revolution in Munich immediately following the end of World War 1. Right up to the very end, the German military and the Kaiser had convinced the German people the country was winning the war. The Armistice signed on 11 November 1918 came as a huge shock, and the people felt they had been betrayed or, as one man put it, “Knifed in the back by the ruling class.” Then came the Treaty of Versailles with its draconian terms of surrender.

Out of the shock and humiliation of that defeat, a small group of radical, fanatical zealots began to slowly poison the soul of what, at that time, was the largest and most advanced country in Europe. In the 14 years of the Weimar Republic between the end of the war and 5 March 1933, the Nazis gradually unleashed a cultural revolution that eventually became an unstoppable national revolution — which ended 12 years later, deep in the ground of a Berlin bunker.

The Nazis did not come to power overnight, but the circumstances of the 1920s and early 1930s provided fertile ground for their eventual ascendancy. People wrote them off at the beginning. But an economic depression, tremendous bitterness over the perceived betrayal at the end of the war along with the humiliating terms of the Versailles Treaty, and one man of messianic and evil determination was all it took. And millions upon millions paid the price.

Americans knew what was happening in 1933 Germany. Our journalists covered it in detail, and our newspapers published what they wrote: the beatings, the discovery of Jews lying in gutters covered in blood, the book burnings. All of it. But we had our own problems back then, so nobody did a thing to help. Right here, it’s fair to ask: could anything have been done, by anyone, to reverse the unfolding terror? The behavior of the Nazis had been horrific, but the regime had been in power for only a few months. At the same time, the entire world was still in the midst of a global depression, and most countries looked upon what was happening in Germany as a German problem that Germans would fix. At that point, no one cared. Germans had done it to themselves and had walked into that biggest of bear traps with their eyes wide shut.

In America right now we are undergoing our own cultural revolution, and it has some of the same chaotic characteristics as the early 1920s in Germany. Of course it’s different, and we’ve built systems that we hope will withstand the current partisan fanaticism. But January 6th really happened, and it could have been catastrophically worse, just as Adolf Hitler’s Beer Hall Putsch really happened in November 1923, ten years prior to his coming to power. We might want to note that, while 335 of the January 6th insurrectionists have been sentenced to prison thus far, Hitler and his putsch cohorts also went to prison.

It’s what happened afterwards that made all the difference.

____________________

¹ According to the United States Holocaust Memorial Museum, there were approximately 523,000 Jews living in Germany in January 1933.

 

Time And Time Again, It’s Hubris That Does Us In

Wednesday, March 22nd, 2023

Trying to follow, much less get your head around, America’s ongoing culture wars, ridiculous partisanship, and all the bile spouted repeatedly by hypocritical politicians is like being at a Rappers Convention. It’s constant chaos.

In the middle of that rancid daily lunacy, we might be forgiven for missing a significant milestone: This week marks the 20th anniversary of the invasion of Iraq. Make that the second invasion.

In January 1991, during the presidency of George H. W. Bush, the U.S. went to war with Iraq to free the country of Kuwait (and all its oil), which Iraq had invaded and taken over in August, five months earlier. America had 33 allies in the venture including most Arab states. Iraq had no allies. Not one.

After American and British airpower destroyed more than 30% of the Iraqi military’s capability, the ground operation, Operation Desert Sabre, brilliantly planned and executed under the leadership of Gen. Norman Schwarzkopf, lasted all of 100 hours before Saddam Hussein was forced to accept a cease fire.

And that’s where it stopped. President Bush, General Schwarzkopf, and Chairman of the Joint Chiefs General Colin Powell decided not to continue on to Baghdad, which frustrated a lot of hawkish politicians.

It was the right decision. The war had been won, Kuwait freed, and Hussein humbled and forced to agree to international inspections to root out any weapons of mass destruction he might have stockpiled. Moving on to Baghdad would have mired the U.S. in a protracted slog, and the Arab allies would never have agreed, anyway. For that matter, it’s likely none of our allies would have agreed.

The 1991 invasion showed American leadership at its finest. The 2003 invasion, the second invasion, showed it at its worst, and we’ve been paying for it ever since.

President George W. Bush’s 2003 invasion demonstrated in vivid colors what hubris can do to otherwise rational people. After 9/11, our job was to capture Osama bin Laden and destroy al Qaeda. Nearly every country in the world was on our side. Then came the stupidity of Iraq.

Claiming without a shred of verified evidence that Iraq had stockpiles of weapons of mass destruction — it didn’t — the Administration invaded. Soon, we had taken over the country. We were the dog that caught the bus.

Tragically, Bush did not have his father’s wisdom (remember “Mission Accomplished”?), and he and his neocon associates believed they could conquer and rebuild Iraq in the image of America, just as Presidents Eisenhower, Kennedy, Johnson and Nixon had believed about Vietnam. All five of these presidents were catastrophically wrong.

Nearly three million U.S. soldiers deployed to Iraq and Afghanistan beginning in 2001. Twenty-five hundred are still in Iraq today. According to Brown University’s Watson Institute for International and Public Affairs, more than 7,000 of our troops had died in those wars by the end of 2019. Thousands more were wounded, many of those maimed for life.

Just as so many Americans now have experience fighting, dying and being wounded in the Middle East, I have experience in Southeast Asia, where more than 50,000 of my brothers in arms died in a hostile place.

Thirty-five years after I landed in Saigon and got to participate in one of America’s worst mistakes — until then — President Bush and his neocons dropped us into another awful, no-win position. He and men with names like Cheney, Wolfowitz, Perle, Kristol, et al., were blinded by the bright lights of “American exceptionalism.” Few, if any of them, had ever known a day of military service. They knew the right people and either had deferments, lots of them, or, like George Bush, were weekenders. Although it appears to have been fine for weekenders of Mr. Bush’s social and political status to skip those tiresome drills if they proved inconvenient.

A lifetime spent walking war’s sanitized sidelines, never hearing that unforgettable and very special sound a bullet makes as it whizzes past your ear,  may prevent one from appreciating the chaotic hell of war and from grasping how terrifying it really ought to be to rip men and women from the fabric of their families to face the horrifying prospect of fighting and dying in a strange land for a counterfeit cause.

The Iraq war has been a national nightmare, but what I always found most horrifying about it was that once we were in it nobody, least of all the egotists who tossed us into that deepest of pits, ever had any idea of how to get us out of it. It was exactly the same thing that happened to us in Vietnam. Vietnam, where lessons should have been learned, but weren’t. Instead, they were swept under the nation’s political rug for posterity to trip over. And it did.

In the pantheon of man-made catastrophes, our wars in Vietnam and Iraq have been monumental achievements.

Happy anniversary.

 

The Calendar, The Nuts, And A Long-Ago Time In A Faraway Place

Monday, March 20th, 2023

No political punching today. This year’s terrific NCAA tournament (my bracket was busted in about a minute and a half) has put me in a good frame of mind.

So, let me tell you a story.

Long ago, in a galaxy far, far away, a 23-year-old, newly minted, Infantry 2nd Lieutenant Airborne Ranger with my name spent a fair amount of time in a little woebegone country in Southeast Asia called Vietnam.

In Vietnam, I took command of a platoon of about 30 draftee soldiers, none of whom wanted to be there, and none of whom understood why they were. There were no college graduates among that lot, and a few never finished high school. They were America’s flotsam and jetsam, and they all knew it. I would grow to love every one of them.

Month after month, my guys and I patrolled the mountains in the north of South Vietnam, occasionally encountering North Vietnamese Regulars who were doing the same thing. Those were interesting meetings. We didn’t talk much when we met, but we did frequently have a somewhat frank exchange of views.

Our Platoon had some memorable moments in Vietnam, such as The March to the Sea, The Rescue, The Whistling Mortar Round, The Search for the Body That Turned Out to Be a Piece of Wood, and The Flying Flywheel. But those are stories for another day. For right now, for today, we’re telling the story of The Calendar and the Nuts.

Four months before the end of my first Vietnam tour, the Army promoted me to 1st Lieutenant, made me say “goodbye” to my guys, sent a Huey chopper to fly me out of the jungle, and gave me a staff job on Firebase Vegel in northern South Vietnam. A firebase was a temporary army camp built by the Corps of Engineers on top of a mountain. It supplied the troops in its area both logistically and militarily. And by “militarily” I mean weapons, ammunition, and helicopter gunship support and transportation. Firebases were cushier than the jungle, but often more dangerous, because they were stationary targets. That was made apparent to me a number of times in vivid ways.

My job on Vegel was to dream up crazy search and destroy operations for the “grunts” in the jungle, the crazier the better. I did my best to make them crazy enough to satisfy the Commanders, but not crazy enough to get our folks killed. It was not easy.

The Army of North Vietnam and their comrades in the south, the Viet Cong, were a determined foe. They were fighting with biblical devotion for a purpose they believed in — their country. They weren’t going anywhere until the war was over and they had won. We, on the other hand, were the political pawns, the shmucks who were there because we had to be, and none of us liked it all that much. Wasn’t our country. And all of us knew, with a fair degree of certainty, the date we were scheduled to go home — if we could stay alive long enough.

With two months to go in the country with the biggest mosquitos on earth, I began to get a bit anxious. I knew guys who had come to untimely ends with only a few days left, one, a good friend, within three hours of leaving. So, realizing I needed a diversion to take my mind off things, I decided to create one — my very own 60-day Short-Timer’s Calendar.

I confess that while deep in the jungle in the 1960s my admiration for and envy of Hugh Hefner knew no bounds. Consequently, my Short-Timer’s Calendar was the centerfold of the June 1970 Playboy magazine. To build the Calendar, I enlisted the aid of my Battalion Commander Bulldog Carter (that’s right, Bulldog), and Buck Kernan, my partner, who went on to become a Lieutenant General, like his father before him. The three of us divided the luscious photo into 60 puzzle-like areas counting down from 60 to one. The trajectory of the progression became increasingly lascivious.

Thereafter, we held a nightly, candle-lit ceremony in the bunker occupied by Buck and me.

⏺⏺⏺

But before I describe the ceremony, I have to tell you about the Macadamia nuts.

During Vietnam, soldiers who were unfortunate enough to find themselves in that hellhole were allowed a ten-day R&R (Rest and Relaxation) vacation, usually a little after the mid-point of their tour. Unmarried soldiers usually went either to Bangkok, Thailand, or Australia.

Most of the time the married folks went to Hawaii to meet their wives. So, when my turn came, I hopped a plane at Da Nang airbase halfway up the coast of the South China Sea and flew off to meet my wife, Marilyn, in Honolulu.

When we first checked into our hotel and got to our room, we discovered the hotel had left a small jar of macadamia nuts for us. Up until then I had never tasted a macadamia nut in my life, but once I tossed the first one down the gullet, I was hooked. I subsequently learned Hawaii is noted for its macadamia nuts. There’s even a Macadamia Nut Visitor Center somewhere on Oahu.

You may forgive me for saying that, with the exception of the inside of our hotel room and the balcony outside it on Waikiki Beach, the one with a beautiful view of Diamond Head up the coast, we never did see much of Hawaii for the first four or five days.

But toward the end of the R&R, after we’d come up for air, seen the sights, sampled the beach, and done the obligatory Don Ho nightclub show, we went to the PX (Post Exchange) at Schofield Barracks and bought a large jar of Macadamia nuts for me to take back to Vietnam. In Vietnam, little things became luxurious delicacies.

The next day, Marilyn and I boarded our separate planes, she to return to the civilized world of Massachusetts, and me to head back to something completely different.

⏺⏺⏺

Back to the ceremony.

The bunker assigned to Buck and me had a single bunk bed. There was only one bed, because Buck and I took 12-hour shifts in the Operations Center, where we prevented the dominos from falling and kept the world safe for democracy. One of us would end his shift, head to the bunker, wave as he passed the other guy, and crash into the bed.

Every night, at 2000 hours — 8 p.m. to you — the three of us, Bulldog, Buck and I, would gather in the bunker. I had scrounged a small table which I had placed against the wall to the side of the bed. I had lovingly pinned Miss June to the wall above the table. At the appointed hour, I would light the two candles I had placed on each side of the table under the pin-up. I would open the jar of Macadamia nuts, which occupied a special spot in the center of the table, and hand each of my comrades one nut, taking one for myself. We would then spend a moment in quiet reflection, meditating on the bounty before us, after which I would, with a red marker purloined from the Ops Center, X-out that day’s descending number on Miss June’s tantalizing body.

We would then eat the nuts.

We did that for 59 consecutive nights. Fifty-nine red Xs covered Miss June. We were down to ONE! On the final night, we held a special ceremony, inviting the Battalion XO, the other six staff officers, the Battalion Sgt. Major, and the Chaplain, Father McBride, into the bunker, which became almost as crowded as the stateroom scene in “A Night at the Opera.” We gave everyone a Macadamia nut that night, and, to great applause, I placed the last red X on Miss June. Even Father McBride smiled.

Then, in a service worthy of priestly ordination, I passed the jar of Macadamia nuts to Buck — who, because he still had six weeks to go, would later replace my centerfold with his and continue the tradition. We retired my centerfold to a place of prominence on the side wall of the Ops Center, where it looked down on all the guys, and where Bulldog could see it every day, its 60 red Xs pointing the way to his bit of heaven back in the U.S. Six weeks later, Buck’s would hang beside it.

The next day, I choppered south, boarded a chartered Pan Am plane with about three hundred other happy survivors, and flew home to what we called “the world.”

Since then, Macadamia nuts have occupied a special place in my heart.

Florida’s Governor Ron DeSantis Builds His Educational Petri Dish

Tuesday, January 24th, 2023

I know it’s masochistic, but I couldn’t help it. I found myself thinking about Florida Governor Ron DeSantis and his all-out assault on education, specifically education about racism, Wokism (if that’s a word), the LGBTQ+ community, and anything else he doesn’t agree with.

I began my long and winding journey down the DeSantis rabbit hole when I learned that yesterday was the day in 1964 when South Dakota became the 38th and deciding state to ratify the 24th Amendment to the US Constitution.

The 24th Amendment prohibits making the right to vote conditional on paying a poll tax, or any other kind of tax. It reads:

Section 1. The right of citizens of the United States to vote in any primary or other election for President or Vice President, for electors for President or Vice President, or for Senator or Representative in Congress, shall not be denied or abridged by the United States or any State by reason of failure to pay any poll tax or other tax.

Section 2. The Congress shall have power to enforce this article by appropriate legislation.

The 24th Amendment applied to Presidential and Congressional elections. Two years later, in 1966, the U.S. Supreme Court ruled 6–3 in Harper v. Virginia Board of Elections that poll taxes for any level of elections were unconstitutional. It said these violated the Equal Protection Clause of the Fourteenth Amendment.

Seven states never held a vote to ratify the Amendment. They are Wyoming, Arizona, Arkansas, Oklahoma, Louisiana, South Carolina and Georgia. One state voted to reject the Amendment altogether. That was Mississippi. Mississippi again. The state seems to rejoice in being the bottom of the country’s bird cage.

Four states, Virginia, North Carolina, Alabama and Texas, waited years to ratify the Amendment, with Texas being the last, in 2009.

If you don’t count Virginia, which enacted a poll tax in 1876, but repealed it six years later in 1882, Florida was the first state to make a poll tax a condition of voting, enacting the legislation in 1885. It became effective in 1889. In 1941, 52 years later, Florida repealed its poll tax.

Florida did not repeal the poll tax because its legislators were conscience-stricken and knew they had to do the right thing. No. The state repealed the tax because too many white legislative candidates (they were all white) were buying votes by paying the tax for poor black and white constituents (disproportionately black, of course) who couldn’t afford it themselves. In essence, the tax was no longer doing what it was intended to do: suppress black votes.

Florida had two other legislatively approved ways to suppress black voting. The first was the Literacy test. According to the Tampa Bay Times:

In 1915, the Legislature enacted a literacy test along with a companion grandfather clause. The clause, common throughout the South, declared that any person who had a relative who voted prior to a certain date did not have to take the test.

According to the proposed Florida law, if you had a relative who was eligible to vote on Jan. 1, 1867, you were exempt from taking the test. Since no black Floridian was voting prior to that date, all of them had to pass the test.

Blacks were frequently asked more technical and legal questions than whites. When one black applicant was asked what “habeas corpus” meant, he responded: “Habeas corpus means this black man ain’t gonna register today.”

The final way the legislature held down, and disenfranchised, the black vote in Florida was by means of the Criminal Disenfranchisement Law. This law, first enacted in 1868, reenacted in 1968, and in effect even today, bars anyone with a felony conviction from ever voting. Florida is one of seven states that still retain this disenfranchisement statute, which disproportionately affects blacks.*

Disproportionate imprisonment of blacks is not something peculiar to Florida. Nationwide, according to Bureau of Justice data, 18- and 19-year-old black men are 12.4 times more likely to be imprisoned than their white peers. And it doesn’t get much better as blacks age, as the chart below shows.

With this as background (and here are 24 more charts showing pervasive racism directed at blacks), Governor DeSantis insists there is no such thing as institutional racism, especially in Florida. And he’s gone to great lengths to make sure anyone in Florida who suggests otherwise will require divine intervention to escape punishment.

Ask Andrew Warren. Last August, DeSantis suspended Warren, the twice-elected Hillsborough County State Attorney, saying he had violated his oath of office and been soft on crime (remarkably, Florida’s Governor has the authority to do this). What had Warren done? Nothing, except for signing a group statement with other prosecutors saying “we decline to use our offices’ resources to criminalize reproductive health decisions.” In other words, Warren was suspended, not for something he did, but for something he said he might do at some time in the future.

Warren sued to get his job back. Yesterday, a federal judge ruled that, although DeSantis had violated the Florida Constitution and the First Amendment, the court lacked the power to reinstate Warren. In his 53-page ruling, U.S. District Judge Robert Hinkle, while grudgingly dismissing the case, excoriated DeSantis and his staff for attacking Warren for purely political reasons. Nonetheless, DeSantis won, which is usually the way things work in Florida.

And now, as we are smack dab in the second day of “Florida Literacy Week,” come the Florida Department of Education’s new rules to enforce the Governor’s Parental Rights in Education Act, known by critics as “Don’t Say Gay” or the Stop WOKE Act, and Florida law 1467, the Curriculum Transparency Law, which requires school districts to be transparent in the selection of instructional materials and library and reading materials.

Taken together, these two statutes limit what teachers can teach and what their students can read.

The two statutes are supposed to apply to what goes on in the classroom. Consequently, in federal court filings, lawyers representing DeSantis insist  the statutes do not apply to library books. In practice the opposite is true. A recent 23-slide librarian training program, approved by the Florida Department of Education, asserts: “There is some overlap between the selection criteria for instructional and library materials.” One slide says that library books and teacher instructional materials cannot include “unsolicited theories that may lead to student indoctrination.”

Good luck trying to understand what an “unsolicited theory” is, or what “student indoctrination” means. Indoctrination into what?

The rules are confusing for librarians, but they’re even murkier for classroom teachers, many of whom have created little classroom libraries over the years of their teaching. The Department of Education’s new rules require “media specialists” to vet every one of the non-curriculum books teachers may have in their classrooms, as well as all the books in the school libraries. In Florida, some school librarians earn “media specialist certificates.” These are the “media specialists” tasked with vetting all the books in Florida’s 4,202 K-12 public schools. In Popular Information, Judd Legum reports that Kevin Chapman, the Chief of Staff for the Manatee County School District, told him that county principals told teachers last week they are subject to a third-degree felony charge if unvetted books in their classrooms are deemed to violate the prohibitions contained in either of the two statutes.

Needless to say, those little classroom libraries are disappearing faster than the small piece of meat I dropped on the kitchen floor this morning right in front of my 80-pound dog, Lancelot (so named because he’s not Lance-a-little).

Florida law 1467 on Curriculum Transparency is particularly pernicious, because it prohibits teachers from exercising their own educated judgement regarding what is appropriate for their particular students. For Florida’s teachers, this is scary stuff. They are going to have to be very careful with what they say, or even suggest, in their classrooms.

Some teachers, perhaps many, will refuse to give up their intellectual freedom. It will be interesting to see how that plays out. As George Orwell said, “In a time of universal deceit, telling the truth becomes a revolutionary act.”

Nevertheless, it seems Governor Ron DeSantis has achieved in Florida what all autocrats crave. He has brazenly fastened iron bonds on what the next and future generations of Floridians are allowed to know. To my mind, he has also underestimated the youth of his state whose intelligence, curiosity, global involvement, and just plain desire to know and learn cannot and will not be inhibited by anything an autocratic governor, whose overarching goal in life is to rule the world, will ever do.

My money’s on the kids.

_________________

*Angela Behrens, Christopher Uggen, and Jeff Manza, Ballot Manipulation and the “Menace of Negro Domination”: Racial Threat and Felon Disenfranchisement in the United States, 1850-2002, 109 AMERICAN JOURNAL OF SOCIOLOGY 559 (2003).

 

 

Today, We Thank Our Veterans ― Ninety Years Ago, We Did Something Else

Friday, November 11th, 2022

Well, that was interesting. The midterm election, not yet over, showed once again experts and their predictions are worth about as much as a sneaker full of puppy poo. All the supposedly smart pundits who predicted the end of democracy as we know it are now doing their best to explain what happened, and why. They forgot, or ignored the fact, that, on the whole, Americans are decent, loyal citizens. Right now, regardless of how the election turns out, we can feel proud that most of our neighbors around the nation thoughtfully exercised their right and duty to vote, no matter how they voted or for whom.

It would be nice to think the Congress we are now still electing will be as thoughtful, decent and loyal to the oath it will collectively take in January. However, I can’t help fearing that in Ronald Reagan’s “shining city upon a hill,” internecine, malodorous warfare will remain on full display. Looking for all the world like a gussied-up version of the Hatfields and McCoys, Republicans and Democrats will once again assemble in a highly organized circular firing squad, seeming far more intent on annihilating each other than on devoting themselves to the moral imperative of governing.

But enough of that. This is Veterans Day, a day to give thanks to all the men and women who gave themselves to the defense of our country.

We’ve come a long way in that regard. It wasn’t always so. One hot summer, fourteen years after Johnny came marching home, he was persecuted, dehumanized and then cast into the darkness of the Great Depression. Here’s what happened.

1932 – Washington, D.C.

At the close of World War I, Congress decided to thank the war’s veterans for their service with some cash — $500, which, in today’s dollars, would be about $7,500. Quite a bonus. But there was a catch: The “bonus” authorized by the Adjusted Compensation Act of 1924 would not be paid until 1945. The veterans did not complain at the time. It was The Roaring Twenties. Everyone was flush.

But then along came the Great Depression. The economy descended from full employment in 1929, where the unemployment rate was 3.2 percent, into massive unemployment in 1933 when the unemployment rate reached 25 percent. From sitting on top of the world, plutocrats were suddenly seen jumping out of windows on Wall Street. Breadlines became the meal du jour. The word, “Hobo,” which had been around, but hardly used, since 1888, became a symbol for the forgotten man.

In the summer of 1932, 25,000 penniless, desperate veterans and their wives and children descended on Washington, D.C. They camped in District parks, dumps, abandoned warehouses and empty stores. These aging warriors had come to the nation’s capital to ask Congress, admittedly 13 years early, for their $500. Newspapers christened them “the Bonus Army” or “the bonus marchers.” They called themselves the “Bonus Expeditionary Force,” the BEF.

The men drilled, sang war songs, and, once, led by a Medal of Honor winner and watched by a hundred thousand silent Washingtonians, marched up Pennsylvania Avenue bearing flags of faded cotton.

The BEF had pleaded in vain with Congress for the money. They were ignored and left to wither. As a last resort they appealed to President Hoover to meet with them. He sent word he was too busy. Then, confronted with 25,000 squatters he would later label “communists,” while asserting less than 10% of them were veterans*, he isolated himself from the city, canceled plans to visit the Senate, had police patrol the White House grounds day and night, chained the gates of the Executive Mansion, erected barricades around the White House and closed traffic for a distance of one block on all sides of the Mansion. A one-armed veteran, attempting to picket, was beaten and jailed.

Conditions for the veterans were pathetic. The summer heat was severe. Lacking shade or screens, the BEF was beaten down by the climate’s fury. Since the founding of the city, Washington had been viewed as a place to be avoided in the summertime. In the words of an official guidebook, Washington was “a peculiarly interesting place for the study of insects.” The veterans and their families had arrived at the height of Cherry Blossom season, but by July they were debilitated, ghostly, dehydrated and hot. Very hot. The columnist Drew Pearson called them “ragged, weary and apathetic with no hope on their faces.” Downtown businessmen complained through the Chamber of Commerce that “the sight of so many down-at-the-heel men has a depressing effect on business.”

And that was the extent of their crime, their threat to the country. They weren’t good for business.

General Douglas MacArthur, the Army’s only four star general who, even then, referred to himself in the third person, had met with some of the men and assured them if he had to evict them he would allow them to leave “with dignity.” But when the end came for the BEF at 10:00 A.M. on 28 July 1932 there was no dignity to be found. Hoover had had enough, and he ordered “Mac” to get rid of them. Trouble was, he didn’t tell the General “how” to get rid of them. MacArthur, who never did anything small in his life, was unleashed.

First, Police Commissioner Glassford, who had been sympathetic to the men, was sent to tell them they had to leave, orders of the President. They refused, which was when MacArthur sent the Army in, led by then Major George Patton and his 3rd Mounted Cavalry — with him prancing at the front atop his privately-owned horse (he had a stable-full; he was rich) — followed by infantry and a World War I vintage Tank Brigade. Bullets began to fly. BEF men were killed. Two babies were gassed to death. And Joseph Angelo suffered a deep wound from Patton’s sabre-wielding cavalry, the same Joseph Angelo who, on 26 September 1918, had won the Distinguished Service Cross, the Army’s second highest medal, for saving the life of a young officer named George Patton.

By midnight that day, the Army had driven the BEF veterans, their wives and children across the Potomac and out of the city. But that wasn’t good enough for MacArthur and Hoover. The BEF was chased and harassed west and south, out past Ohio and all the way down to Georgia. Then, the veterans just folded into the vast transient population that roamed the land in 1932.

In 1936, overriding a veto by President Roosevelt, Congress voted to immediately pay World War I veterans their full $500 bonus specified in the Adjusted Compensation Act of 1924.

____________________________________________

*The Veterans Administration, which had the actual service records, would subsequently refute Hoover’s claim with an exhaustive study concluding that 94% of the bonus marchers were veterans of World War I.

“Where Is Nancy?”

Friday, October 28th, 2022

Last night in San Francisco, 42-year-old David Depape broke into the home of Speaker of the House Nancy Pelosi and went looking for her, yelling, “Where is Nancy?”

“Nancy” wasn’t home, but her octogenarian husband Paul was. He and Depape scuffled and Depape beat Mr. Pelosi with a hammer.

According to San Francisco Police Chief Bill Scott, “The suspect pulled the hammer away from Mr. Pelosi and violently assaulted him with it. Our officers immediately tackled the suspect, disarmed him, took him into custody, requested emergency backup and rendered medical aid.”  CNN’s Jamie Gangel scooped that the attacker was attempting to tie up Paul Pelosi “until Nancy got home,” and said he was “waiting for Nancy.” (She was in D.C. at the time.)

Although 82-year-old Paul Pelosi suffered serious injuries (a hammer will do that) he is expected to make a complete recovery. And there you are. An assassination avoided. Let’s all move on.

Well, not so fast. Let me tell you a story.

In the 1890s and early 1900s, Europe’s political tectonic plates began moving precipitously as wealthy elite landowners were being challenged by emerging socialism whose leaders demanded power for the working classes, something anathema to the elites. One of the issues intensifying the socialist movement in France was the Dreyfus case.

In 1894, Captain Alfred Dreyfus had been falsely accused and convicted of treason for delivering French military secrets to the German Embassy in Paris. He was sentenced to life in prison on Devil’s Island in French Guiana. Dreyfus was a 35-year-old Jewish Alsatian French artillery officer, and, yes, from its beginning antisemitism hung heavily over what came to be known as the Dreyfus Affair.

Born in 1859, Jean Jaurès was a highly respected leader of the socialist movement in the Chamber of Deputies. He, among others, took up the cause of Dreyfus and was instrumental in forcing a second trial once it became known that high military officers had conspired to frame Dreyfus. In 1906, after 12 years, Dreyfus was fully exonerated and went on to serve with distinction in the First World War. The Dreyfus Affair propelled Jaurès to the forefront of French politicians.

Between the end of the Dreyfus Affair and the war, Jaurès became the intellectual champion of the socialist cause as leader of the French Socialist Party. He was an antimilitarist who was able to pragmatically straddle France’s left and right political wings and could see both sides of an issue. However, in the feverish fervor of the run-up to World War I, while the vast majority of the French were convinced the ignominy of the Franco-Prussian War, with its mortifying defeat at Sedan and humiliating Treaty of Frankfurt, had to be avenged and Germany crushed, Jaurès advocated diplomacy. He continued tirelessly to prod the public to reject the calls for war.

On 31 July 1914, the day before both the Germans and French mobilized for battle, Jaurès, knowing he had lost the argument and worn out from his efforts, went to the Café du Croissant a little after 9:00 pm for dinner, sitting with his back to a window that looked out onto the street. That was when Raoul Villain, a 29-year-old French nationalist who’d been following him since the evening before, shouted “pacifist” and “traitor,” and fired two shots into his back. Jaurès slumped forward. Five minutes later, he was dead.

The killing of Jaurès stunned Paris. Everything came to a halt. He was buried on 4 August, the day the war began.

The death of Jaurès is what can happen when seemingly rational people lose their rationality and are magnetically drawn to violence. Last night in San Francisco we could have had a similar result.

The political rancor and hatred oozing throughout America today, with its tangible potential for violence, has been brewing for decades. Donald Trump did not begin it. He just made it fashionable.

Although both Mitch McConnell and Kevin McCarthy were quick to condemn last night’s home invasion, I’m wondering what they and the legislators they’re supposed to lead will do now. Will any of them condemn the former president for inciting this violence? Will they condemn any of the Republican leaders who, cult-like, mimic Trump’s incitement? The next 24 hours will be interesting.

On 6 January 2021, in the corridors of the Capitol, the traitorous insurrectionists were yelling, “Where’s Mike Pence?” I ask you what would have happened if they’d found him? And what would have happened last night in San Francisco if the Speaker of the House, the third-highest ranking person in US government, had been home?

Please. Think about that.

 

A Potpourri To Begin Your Week

Monday, September 12th, 2022

Ukraine changing history on the move.

It is 15 December 1937. Today’s international news section of the New York Times is dripping with stories that, nineteen years after World War I, are lighting the way to the next global conflagration. In two years it will begin and happen all over again. On this day we see reports of marches, riots, assassinations, street brawls, and arson. Political warfare. An overture to the real war coming.

In Spain, political warfare has flared into civil war, and, the Times reported, the Army of the Republic has attacked General Franco’s fascist forces at the Aragonese town of Teruel. In three months, Franco will counterattack, rout the Republican forces and capture most of Catalonia and the Levante. He will succeed with troops and warplanes provided by Germany and Italy.

Turn the page and find Hitler’s Nazi Germany issuing new  restrictions on the Jews, slowly squeezing the life out of them. On the facing page, a photograph of Benito Mussolini in his personal railcar giving  the stiff-armed fascist salute. Beneath, a photo of Stalin reviewing a parade of tank columns.

Is there anything that could be done, could have been done, to avert the coming catastrophe? Of course there was, but nobody did it. Mussolini? The Italians loved him; he resurrected the former glory of Rome, and Franco showed Spaniards what nationalistic power looked like. Hitler’s hate fueled the country’s hate. The Jews? Germany, with Hitler’s face, wanted them gone—forever. And Stalin, the man who, with a smile on his face, killed millions of Ukrainians by intentionally starving them? The Russians never blinked. Neither did the Americans. The Times’s Walter Duranty defended him and won a Pulitzer for his efforts.

And so it went. The world stumbled into six years of hell, with millions dead.

Today, in 2022, although it has taken much time, we have made progress. Inhumanity, still glowing bright in many places, is, nonetheless, dimmer than 80 years ago. Today, the Ukraine that Stalin starved is squeezing the Stalin wannabe Vladimir Putin into a box of his own making. The Ukrainian Army is moving ahead and, with tremendous help from a unified NATO, is forcing the Russian Army to retreat, although the Russians call it “regrouping.”

No one knows where this ends, or how, but it seems to me that at some point the people of Russia are going to wake up and see all the body bags coming home. What then?

The race to curb racism in the American Century: The mission of W. E. B. Du Bois.

This month’s edition of the journal Foreign Affairs contains a fascinating and illuminating essay on the charismatic and complicated life of W. E. B. Du Bois.

Written by Zachariah Mampilly, the Marxe Endowed Chair of International Affairs at the Marxe School of Public and International Affairs at Baruch College, this long-form piece details Du Bois’s lifelong, uncompromising mission to eradicate racism.

A sociologist by training, he helped found the National Association for the Advancement of Colored People (NAACP) in 1909. During the Jim Crow era, he became known for an uncompromising stance, demanding equal rights for Black Americans through his journalism and advocacy work while also making seminal contributions to various academic debates.

Du Bois was born in 1868 in Great Barrington, Massachusetts, about 20 miles from where I sit, and his lifespan overlaps almost exactly with the Jim Crow era, a period of persecution during which Black Americans faced severe restrictions on their ability to participate in political, economic, and social life.

Between the two World Wars, he focused more and more on international affairs, arguing that the colonial projects  European countries were pursuing in Asia and Africa had galvanized an envious United States to carve out its own colonies. In 1898, a year before Du Bois published his first major sociological study, The Philadelphia Negro, the United States’ imperial ambitions produced the annexation of Hawaii and the acquisition of Puerto Rico, Guam, and the Philippines as spoils of the Spanish-American War. Du Bois thought America’s imperialistic ambitions and actions fed into and enhanced the country’s racism at home. Consequently, his writings and lectures veered increasingly to the left.

In observing anticolonial struggles in India and elsewhere, Du Bois saw clearly how occupation of foreign lands would breed resistance in the colonized people. From this he concluded that colonial domination abroad often required the sacrifice of democracy at home. In his eyes, Mampilly writes:

Imperialism inevitably led to increased racial and economic inequality at home: military adventures and opportunities for extracting natural resources empowered the capitalist class (and its favored segments of the underclass) and stoked racial prejudice that justified further interventions in foreign lands.

Thus, Du Bois saw domestic racism as the tail of the internationally racist dog.

It was natural that as time went on Du Bois’s views evolved. He became more radical in his writings. He saw international capitalism as the cause of black exploitation. In his middle years he went from believing in “democratic socialism” to embracing communism. As a result, J. Edgar Hoover’s FBI began investigating him in 1942 and, despite concluding there was “no evidence of subversive activity,” continued to investigate him for the rest of his life. In 1952, the State Department revoked his passport. Six years later, in 1958, the Supreme Court declared the policy of denying passports to suspected communists unconstitutional.

His wholehearted support of Joseph Stalin, while inconsistent with his lifelong support for democracy, demonstrated his belief that democracy and Western liberalism were incompatible with racial and economic equality.

Mampilly concludes his essay about Du Bois with this insightful observation:

His work upends the liberal fantasy of the United States’ inevitable progress toward a “more perfect union” that would inspire a just global order and gives the lie to the realist fantasy that how the country behaves internationally can be separated from domestic politics.

My own conclusion is this: During his life, Du Bois made seminal contributions to academia, which, over time, cost him dearly. He was arguably black America’s leading intellectual of the 20th century. If that is at least close to being true, then here is a question for today: Why are so many people, for example governors of red states, fearful of allowing his story and teachings, as well as those of other Black intellectuals, to be taught in America’s classrooms?

The US Open Tennis Championship: In a word, Glorious.

Speaking of Race, I cannot end this Letter without a shout out to this year’s championship.

The three-week US Open is played at the Billie Jean King Tennis Center. The main events happen at Arthur Ashe Stadium. Ashe, an inspirational Black American, and King, an inspirational Lesbian American, embody inclusive diversity and are the best kind of examples we have for sincere and devoted yearnings for equality. It is more than fitting that Friday night Frances Tiafoe, a 24-year-old Black American, played 19-year-old Spanish phenom Carlos Alcaraz in a thrilling five-set, five-hour semi-final match on the Arthur Ashe center court. Tiafoe is the son of immigrants from Sierra Leone and spent much of his childhood at the Junior Tennis Champions Center in College Park, Md., where his father worked as a custodian. Sometimes he spent the night there, because his mother worked nights in a hospital. The stadium was full and loud, and, although he lost, Tiafoe had the crowd, had all of us, in the palm of his hand. He’ll be back.

Yes, we have a long way to go. But the US Open shows us how far we’ve come. Tennis now looks like America looks.

 

Once Again History Rhymes

Tuesday, September 6th, 2022

“History never repeats itself, but it does often rhyme.” – Mark Twain

In 1870, Germany ended the Franco-Prussian War by decisively defeating the French army in a battle of annihilation at Sedan. Germany’s overly greedy and needlessly cruel terms of surrender were excruciating for France and included the annexation of Alsace-Lorraine, a move against which the prescient Bismarck had advised. It became a constant, festering wound in the heart of every French man and woman. From that point on both countries, each of which knew they would meet again on the battlefield, prepared for the rematch that would become World War I.

Looking at the behavior of one of the two belligerents, Germany, over the next 45 years illuminates and instructs what is happening now more than a century later, as Vladimir Putin, who has been planning the conquest of Ukraine for nearly 20 years, is following the same unsuccessful, potholed road. We can learn a lot from the mistakes of the past. We can, but we don’t.

In the interval between Sedan and 1914, Germany’s Chief of the General Staff, Count Alfred von Schlieffen, devoted his entire tenure (1891 to 1906) to creating what would become the German Plan of Attack. The plan called for a huge, lightning-like strike through Belgium, which would result in the capture of Paris in nearly six weeks, 40 days. But there was a problem: Belgian neutrality, which had been created in 1831 at an international conference in London that recognized Belgium as an independent, neutral state, its neutrality to be guaranteed by the European powers. Forty years later, at the outbreak of the Franco-Prussian War, British Prime Minister Gladstone secured a treaty from France and Prussia that if either violated Belgian neutrality England would work with the other to defend Belgium, although without engaging in “the general operations of the war.”

Regardless of Belgian neutrality, Schlieffen’s plan devotedly followed the bible of Germany’s war oracle Baron Carl von Clausewitz, who wrote in the time of Waterloo. Clausewitz had ordained a quick victory by “decisive battle” as the primary object of an offensive war, the only kind Germany understood. He advocated the fast capture of the opponent’s capital above all else. Consequently, to conquer France quickly by taking Paris required ignoring Belgian neutrality.

Schlieffen edited and re-edited his plan over the course of his term, and in 1906, when he retired, the plan was complete.  It was exact in every detail, a model of precision, and it factored in every possible contingency.

The only thing it lacked was flexibility. That is, what to do if something went wrong. And many things did. As that great American philosopher Mike Tyson put it, “All your plans go out the window the first time someone punches you in the mouth.”

The Germans invaded Belgium on their way to Paris on 4 August 1914. In addition to misjudging the determination of the French to defend themselves and believing Britain would either stay out completely or join the battle late, Kaiser Wilhelm was certain the puny Belgians would simply roll over and play dead. However, Belgium’s King Albert, the Kaiser’s cousin, had other ideas and refused to follow the plan. In an act of heroic patriotism, he mobilized the Belgian army, primitive though it was, and fought. Belgian resistance disrupted Schlieffen’s precise timetable, and the Germans never did get to Paris. Instead, Germany was forced to settle for four years of trench warfare, attrition and ultimate surrender in November 1918. The terms of surrender forced on Germany were as bad as those it had forced on France after Sedan and led to Hitler’s rise and World War II. We never learn.

The German defeat in the First World War can be directly linked to the arrogance and hubris of its leaders in their certainty that King Albert would not object to the invasion of his country by an army an order of magnitude larger and more accomplished than his own. They did not take into consideration the hatred the taking of Alsace-Lorraine had spawned in the French, or that the British would do the honorable thing and come in on the side of France following the violation of Belgian neutrality. Neither did they appreciate that Russia, a signatory to the treaty for defending Belgium, would mobilize, join the war, and engage the German army weeks before Schlieffen’s plan anticipated. Schlieffen and the Kaiser, with their myopic tunnel vision, had never believed any of this would happen. They had refused to even contemplate that their perfect plan could be inadequate in any way.

Schlieffen died in January 1913, and never saw any of the debacle that was to follow. On 9 November 1918, the German high command, two days before the country’s surrender, forced Kaiser Wilhelm to abdicate. He retired to the neutral Netherlands where he lived in isolation for the rest of his life.

In yet another example of history rhyming, even repeating, we are now witnessing a new instance of military and dictatorial myopia. This time in Ukraine where Vladimir Putin, who seems to fancy himself the second coming of Peter the Great, has wildly miscalculated both the tenacity and determination of Ukrainian patriotism and the commitment and unity of NATO members who, like Gladstone’s Britain, are committed to defending Ukraine, although without engaging in “the general operations of the war.”

Here in 2022, we watch King Albert come to life in the actions of President Zelenskyy.

As with Schlieffen’s perfect plan, Putin’s hubris-driven quick victory was not to be. Like the Germans of August 1914, he failed to capture the Ukrainian capital in the early days of the war. Now he is facing a long, slow slog as victory ineluctably slips farther away. The recent Ukrainian counterattacks in the South and East are living proof of this.

Thinking about all this stupidity, I can only conclude that Schlieffen, the Kaiser, Putin and others who yearn for conquest always fail to appreciate, and seriously undervalue, the love of homeland coursing through the veins of all of us. History is full of examples that continue to be ignored. America, itself, has fallen victim to this many times, most recently in Afghanistan.

It would be less than fitting, but still desirable, if Putin’s generals would do to him what the German generals did to the Kaiser. But that, I fear, is where history will neither repeat nor rhyme.


A Few Weekend Thoughts On Biden’s College Loan Forgiveness Program

Saturday, August 27th, 2022

On Wednesday of this week, President Biden announced an executive action to forgive some of the debt owed by those who had received college loans. In doing so, Biden was attempting to fulfill a campaign promise to forgive undergraduate student debt for people earning up to $125,000 ($250,000 for a family). “I made a commitment that we would provide student debt relief, and I’m honoring that commitment today,” he said in remarks at the White House.

According to the Office of Federal Student Aid (OFSA), an office within the US Department of Education, Biden’s plan comes in three parts. The first part extends the loan repayment pause a final time (again) to the end of 2022. Part 2 is what’s getting all the attention at the moment. It says:

To smooth the transition back to repayment and help borrowers at highest risk of delinquencies or default once payments resume, the U.S. Department of Education will provide up to $20,000 in debt cancellation to Pell Grant recipients with loans held by the Department of Education and up to $10,000 in debt cancellation to non-Pell Grant recipients. Borrowers are eligible for this relief if their individual income is less than $125,000 or $250,000 for households.

Part 3 of the President’s plan is different in that it takes the form of a proposed rule “to create a new income-driven repayment plan that will substantially reduce future monthly payments for lower-and middle-income borrowers,” according to the OFSA. The proposal would:

  • Require borrowers to pay no more than 5% of their discretionary income monthly on undergraduate loans. This is down from the 10% available under the most recent income-driven repayment plan.
  • Raise the amount of income that is considered non-discretionary income and therefore is protected from repayment, guaranteeing that no borrower earning under 225% of the federal poverty level—about the annual equivalent of a $15 an hour wage for a single borrower—will have to make a monthly payment.
  • Forgive loan balances after 10 years of payments, instead of 20 years, for borrowers with loan balances of $12,000 or less.
  • Cover the borrower’s unpaid monthly interest, so that unlike other existing income-driven repayment plans, no borrower’s loan balance will grow as long as they make their monthly payments—even when that monthly payment is $0 because their income is low.

Part 3 is consequential, and the fourth bullet point of Part 3 even more so. Interest payments can easily double the size of a student loan, and anything that reduces the interest burden will reduce the size of the loan and, consequently, the time required to pay it off. But a proposed rule is not an Order and will take time before being finalized, perhaps a lot of time.
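For readers who like to see the mechanics, here is a minimal sketch, in Python, of how a payment under that proposed plan might work. The 5% rate and the 225%-of-poverty-level threshold come from the bullets above; the federal poverty guideline for a single person is my own assumed figure, and this is only an illustration, not the Department of Education’s actual formula.

```python
# Illustrative sketch only; not the Department of Education's actual formula.
# The 5% rate and 225% threshold come from the proposed rule described above.
# The poverty guideline below is an assumed figure for a single person.
POVERTY_LEVEL_SINGLE = 13_590                    # assumed 2022 guideline, single person
PROTECTED_INCOME = 2.25 * POVERTY_LEVEL_SINGLE   # ~$30,578 shielded from repayment

def monthly_payment(annual_income: float, rate: float = 0.05) -> float:
    """Monthly payment on undergraduate loans under the proposed plan (sketch)."""
    discretionary = max(0.0, annual_income - PROTECTED_INCOME)
    return rate * discretionary / 12

# A borrower earning $30,000 falls below the protected threshold and pays $0.
print(f"${monthly_payment(30_000):.0f} per month")             # $0
# A borrower earning $45,000 would pay about $60 a month at 5% ...
print(f"${monthly_payment(45_000):.0f} per month")             # ~$60
# ... versus roughly $120 under the existing 10% income-driven plan.
print(f"${monthly_payment(45_000, rate=0.10):.0f} per month")  # ~$120
```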

Right now we are in the knee-jerk phase of this issue. Republicans categorize Biden’s move as political and unfair to those who worked hard to pay off their loans. Why should their tax dollars now subsidize the millions who haven’t? The far right, the more rabid of the bunch, have been raining down tweet storms condemning the very idea of forgiving the loans, all the while forgetting to mention that their own Paycheck Protection Program loans, most well over $100,000, have all been forgiven.

In thinking about this, the first question one might want to ask is: Does the President have the authority to do it? House Speaker Nancy Pelosi doesn’t think so. “The president can’t do it,” she said in July. “That’s not even a discussion.”

We can expect this decision to be challenged in the courts. But, at the very least, it offers President Biden a chance to say he is honoring a commitment, a promise, even if the Judiciary ultimately won’t let him do it.

How and why has going to college come to this? I think the answer can be found in the long, winding, potholed road to higher education of the last 55 years. It’s complicated, and people have devoted entire careers to studying it.

I’m concerned, in a practical sense, with what changed from the time I and my peers affordably attended college in the late 1960s. For instance, how, and by how much, have costs increased? To what degree, and why, is a far greater percentage of high school graduates now attending four-year, or even two-year, colleges? Have wages grown commensurately with college costs, allowing parents and their children to afford it all? How has the for-profit boom in colleges contributed to the college loan crisis, if it has?

To begin to answer those questions, let’s first take a look at where we are now.

Adam Looney, a Nonresident Senior Fellow at the left-leaning Brookings Institution and the Executive Director of the Marriner S. Eccles Institute at the University of Utah, is one of our foremost experts on college loans and costs. He has long argued against across-the-board loan forgiveness, because a disproportionate amount goes to people who don’t need it: Ivy League-educated doctors, lawyers, and the like. He has produced the following table to demonstrate his argument. It categorizes all colleges and graduate programs represented in the College Scorecard by their selectivity, using Barron’s college rankings; from top to bottom, the schools are ranked by how hard it is to get accepted. The left panel of the table describes the debts owed by students at these colleges. The right panel describes their family economic background and their post-college outcomes. Note that the more selective the school, the greater the average debt (with the exception of the for-profits). The same holds true for the two far-right columns: the more selective the school, the greater the after-college earnings. Note also that, with the exception of the Ivy Plus graduates, the average after-college earnings for every other category are less than the President’s cap of $125,000 for loan-forgiveness qualification.

I’m going to ignore the harm done by for-profit colleges, except to say the largest single source of student debt in America is one of them—the University of Phoenix, the gigantic online for-profit chain. Students who graduated or dropped out in 2017-2018 owed about $2.6 billion in student loans; two years after graduation, 93 percent of borrowers had fallen behind on their loans, which caused interest owed to grow like festering weeds. These are people Looney agrees need to be helped—a lot.

I thought it might be instructive to look at this through the lens of one, typical, highly reputable, selective public university. As Looney’s table shows, graduates of selective public colleges and universities make up 33.7% of the total share of college debt. I’ve picked the University of Massachusetts. UMass is representative of all state universities, and, because I’m from Massachusetts and long ago was a Trustee at one of its foundations, I know the school better than, say, Penn State or Connecticut.

The UMass flagship campus in Amherst sits on more than 1,400 acres and has about 24,000 students. Out of more than 850 US public colleges, it is #68 in US News & World Report’s current rankings. Tuition, fees, room and board total $32,168 for in-state residents, about $50,000 for out-of-staters. The Commonwealth of Massachusetts currently contributes (subsidizes) 31% of the university’s total costs, or $14,287 per student, which means students’ tuition would be considerably more without that help, somewhere in the range of the cost of a selective private college, or of what an out-of-state UMass student pays. Every state subsidizes its selective public colleges to some degree.

Nationally, in 1967, 47% of high school graduates moved on to college. Seventeen percent of those would drop out: 15.4% of white students and 28.6% of Black students. Today, fewer than 10% drop out, and the rate for Black students is 10.7%. We are approaching equality in that regard.

That’s where UMass is now. Fifty-five years ago, when I was young, things were different. Facts And Figures 1967, from the then UMass Office of Institutional Studies, is a 163-page, deeply detailed report of the university as it was then, all of it in one spot. I do not think you’d find a similar study today.

In 1967, annual tuition and fees were $336; room and board, $939, for a total cost of $1,275. The university employed 729 full-time faculty for 9,439 students. Today, there are about 1,400 full-time faculty. In 1967, the Commonwealth of Massachusetts picked up 67% of the university’s operating costs (as opposed to the aforementioned 31% today).

What you bought in 1967 for $1.00 would now cost $8.87, with a cumulative rate of inflation of 787%. Over that time, tuition, fees, room and board at the University of Massachusetts have increased by a factor of more than 24. If the tuition at UMass had just grown by the rate of inflation, it would now be $11,310, not $32,168.
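As a quick sanity check, here is a minimal sketch of that arithmetic, using only the figures already cited above:

```python
# Back-of-the-envelope check of the UMass cost figures cited above (illustrative only).
INFLATION_FACTOR = 8.87   # $1.00 in 1967 buys what ~$8.87 buys today, per the post
COST_1967 = 1_275         # tuition, fees, room and board in 1967
COST_TODAY = 32_168       # the same bundle for in-state students today

inflation_only = COST_1967 * INFLATION_FACTOR   # what the bundle "should" cost
actual_factor = COST_TODAY / COST_1967          # how much it actually grew

print(f"Inflation-only cost: ${inflation_only:,.0f}")                            # ~$11,310
print(f"Actual growth factor: {actual_factor:.1f}x")                             # ~25x
print(f"Growth relative to inflation: {actual_factor / INFLATION_FACTOR:.1f}x")  # ~2.8x
```

That last ratio, roughly 2.8, is where the “nearly three times the rate of inflation” figure in the next paragraph comes from.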

So, extrapolating from current demographic and UMass data to the national picture, four things have been at work over the last 55 years. First, student costs have grown at nearly three times the rate of inflation. Second, the state has reduced its share of student costs by more than 50%, which is representative of the nation. Third, the percentage of high school graduates who go on to college has grown from 47% to nearly 62%. And fourth, wages have not even remotely kept up with the cost of college. According to the Congressional Research Service, real wages (wages adjusted for inflation) have grown only 8.8% since 1979 at the 50th percentile of all earners.

President Biden’s initiative will likely remain a political football at least until the mid-terms, probably beyond. My own conclusion is that it will help a lot of people who need it and will be unnecessary largesse, at taxpayers’ expense, for the many who don’t. And it does nothing to solve the real problem.

Unless and until we can control the cost of college, this crisis will continue for future generations.  College cost growth at three times the rate of inflation is unsustainable.

We need to do much more than forgive a slice of college loans. That’s like trying to save a sinking ship by tossing the first mate a rope of sand.


On Health, History And The Fine Art Of Fudging Data

Wednesday, August 10th, 2022

The cost of insulin, or, half a loaf is better than none

The Inflation Reduction Act (IRA), passed this past Sunday in the Senate and now sitting for certain passage in the House this week, will cap the cost of an insulin vial at $35 for Medicare beneficiaries with diabetes. However, for those not on Medicare, insulin costs will remain unchanged.

Of the 30 million Americans who have diabetes, more than 7 million require daily insulin. A Kaiser Family Foundation study released in July 2022 found that 3.3 million of the 7 million are Medicare beneficiaries and documented the rise in insulin’s cost since 2007.

Aggregate out-of-pocket spending by people with Medicare Part D for insulin products quadrupled between 2007 and 2020, increasing from $236 million to $1.03 billion. The number of Medicare Part D enrollees using insulin doubled over these years, from 1.6 million to 3.3 million beneficiaries, which indicates that the increase in aggregate out-of-pocket spending was not solely a function of more Medicare beneficiaries using insulin.
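The “not solely a function of more beneficiaries” point is easy to verify from the two pairs of numbers in that passage. A minimal sketch, using only the figures quoted above (the per-beneficiary values are derived here purely for illustration):

```python
# Aggregates are the figures from the KFF passage above; per-person values are derived here.
spend_2007, spend_2020 = 236e6, 1.03e9   # aggregate out-of-pocket dollars
users_2007, users_2020 = 1.6e6, 3.3e6    # Medicare Part D insulin users

per_person_2007 = spend_2007 / users_2007   # ~$148 per beneficiary
per_person_2020 = spend_2020 / users_2020   # ~$312 per beneficiary

print(f"2007: ${per_person_2007:,.0f} per beneficiary")
print(f"2020: ${per_person_2020:,.0f} per beneficiary")
# Out-of-pocket spending per insulin user roughly doubled even after accounting
# for the doubling of enrollment, which is the study's point.
```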

The IRA is great news for the Medicare beneficiaries who make up nearly half of the population needing daily injections of insulin to live, but a provision in the original bill that would have capped the cost at $35 for all diabetics, not just those on Medicare, never made it into the final bill. Left out in the cold are the 3.7 million diabetics who require insulin to keep living and who are privately insured or not insured at all. For Republicans, that expense was a bridge too far.

Will you permit a bit of cynicism here? The provision needed 60 votes to pass; 57 senators voted in favor of capping insulin at $35 per vial for all diabetics: 50 Democrats and seven Republicans. Americans overwhelmingly support this, as a recent Kaiser Family Foundation poll shows:

Eighty-nine percent consider this a priority, 53% a top priority. I suggest Republican leadership, never intending to allow this to pass, permitted those seven, standing for reelection this fall, to vote for the bill to give them cover in the upcoming election. Is that too cynical?

If that’s not bad enough, a study by Yale University researchers, published in Health Affairs, also in July, concluded that “Among Americans who use insulin, 14.1 percent reached catastrophic spending over the course of one year, representing almost 1.2 million people.” The researchers defined “catastrophic spending” as spending more than 40 percent of postsubsistence family income on insulin alone. Postsubsistence income is what’s left over after the cost of housing and food.

Nearly two-thirds of patients who experience catastrophic spending on insulin, about 792 thousand people, are Medicare beneficiaries. The IRA will help these people immensely. However, as it stands now it will do nothing to assist the non-Medicare diabetics who annually face catastrophic spending due to the cost of insulin. This group numbers about 408 thousand who need insulin just to go on living, and, yes, these are poor people with few resources.

Not to put too fine a point on it, but we should not forget that insulin isn’t the only medical resource diabetics use and need. There are also the syringes used to inject the stuff, not to mention the testing strips and glucose monitors that analyze the levels of blood glucose, which diabetics have to track religiously. Diabetes is an expensive disease, and insulin is only one part of the expense.

Every time I and others write about the cost and quality of health care in the US, it almost seems as if we’re all standing on the shore throwing strawberries at a battleship expecting some sort of damage. The Inflation Reduction Act contains the first significant health care move forward since the Affordable Care Act of 12 years ago. It’s progress at last, but so much more is needed.

A great historian and better American is now history himself

David McCullough has died. We have lost a giant.

McCullough had that special gift of telling stories of our past in ways that made us think we were there when they happened. He put us solidly in the shoes of the people he was writing about. For him, history is not about a was; it is about the is of the time. Like us, his subjects lived in a present, not a past. He never judged the choices made in the past; he just told the truth through stories meticulously researched and empathically written. That’s how he could win two Pulitzer Prizes, two National Book Awards and a Presidential Medal of Freedom.

I first met McCullough in the 1980s through his first book, The Johnstown Flood, published in 1968. I could not put it down; I read it through in one sitting. It was the start of his brilliant career, and its success gave him hope that he could actually devote himself to history and do well at it. But he never wrote for the money. What drove him was his love for, and curiosity about, where we came from.

In a 2018 interview for Boston Magazine with Thomas Stackpole, he discussed his latest, and last, historical work, The Pioneers, about a group of New Englanders who, in the late 18th and early 19th centuries, picked themselves up, headed west, settled Ohio, and courageously kept it an anti-slavery state. During the interview, he said:

There are an infinite number of benefits to history. It isn’t just that we learn about what happened and it isn’t just about politics and war. History is human. It’s about people. They have their problems and the shadow sides of their lives, just as we do, and they made mistakes, as we do. But they also have a different outlook that we need to understand. One of the most important qualities that history generates is empathy—to have the capacity to put yourself in the other person’s place, to put yourself, for example, in the place of these people who accomplished what they did despite sudden setbacks, deaths, blizzards, floods, earthquakes, epidemic disease. The second important thing is gratitude. Every day, we’re all enjoying freedoms and aspects of life that we never would have had if it weren’t for those who figure importantly in history.

Today’s Americans seem to think history began about ten years ago. It is a modern-day tragedy, and we own it. Consequently, humanity keeps making the same mistakes over and over again, never learning from those who showed us where the land mines were lying, hidden underfoot. McCullough did that for 50 years. He leaves a large hole in our American universe.

Fudging data with style

Heading back to diabetes for a moment. You may recall the old adage, “Figures lie, and liars figure.” Well, this is not about that. The fudging I’m going to show has not a lie in it. What it does have is deception on a grand scale, and it comes from our CDC, which, usually, I greatly admire. But not this time.

As we’ve all learned throughout the COVID pandemic, the CDC tracks and reports data — a lot of it.

One of the things the CDC reports on is Diabetes Mortality By State. It has been doing so since 2005, and it is in the last six years that we see, if we look, the deception.

Here is how the CDC reported this data in 2015:

The redder things are, the worse they are, so this looks bad, and it is.  The scale above shows the distribution of the colors for the states, starting at 13.4 in Colorado and Nevada and ending at 32.4 in West Virginia. Those are deaths per 100,000 people.

Now, here is how the CDC reported diabetic mortality six years later in 2020:

In 2015 there were three dark red states, eight almost-dark-red states, and 20 more just a shade lighter. Now we have only two dark red states, three almost dark red, and those 20 semi-dark states have turned to light tan. Wow! What an improvement.

One could be forgiven for going away happy… if one did not look at the actual numbers.

In 2015, Mississippi and West Virginia were the highest-mortality states, at 32.4 and 31.7 deaths per 100,000 people, respectively. By 2020, their numbers had soared about 30%, to 41.0 and 43.1. The states with the lowest mortality in 2015, Nevada and Colorado (13.4 and 15.9), were at 18.0 and 24.2 deaths per 100,000 in 2020. Wyoming now comes in with the second-lowest mortality, at 20.7.

But things look so much better. The distribution scale is different, but who looks at that?

The CDC has done something shameful: it has moved the goalposts and didn’t tell anyone. In reality, diabetic mortality has gotten much worse over the last six years, but unless you dug deep, not only would you not know that, you’d think there had been a big improvement.
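For the technically inclined, the remedy is simple and standard: when you map the same metric across years, anchor both pictures to one shared scale. Here is a minimal, hypothetical sketch using matplotlib and only the handful of state figures cited above; bars stand in for the actual CDC maps, which I can’t reproduce here, and the point is the shared scale, not the chart type.

```python
# Minimal sketch: compare years on one shared color scale so the pictures stay honest.
# Uses only the state values quoted above; bars stand in for the CDC's actual maps.
import matplotlib.pyplot as plt

rates_2015 = {"MS": 32.4, "WV": 31.7, "CO": 15.9, "NV": 13.4}
rates_2020 = {"MS": 41.0, "WV": 43.1, "CO": 24.2, "NV": 18.0}

# One scale anchored to the overall minimum and maximum across BOTH years.
vmin = min(min(rates_2015.values()), min(rates_2020.values()))
vmax = max(max(rates_2015.values()), max(rates_2020.values()))

fig, axes = plt.subplots(1, 2, figsize=(8, 3), sharey=True)
for ax, (year, rates) in zip(axes, [("2015", rates_2015), ("2020", rates_2020)]):
    # Map each state's rate onto the SAME 0-1 range before coloring it.
    colors = plt.cm.Reds([(v - vmin) / (vmax - vmin) for v in rates.values()])
    ax.bar(rates.keys(), rates.values(), color=colors)
    ax.set_title(f"Diabetes deaths per 100,000, {year}")
    ax.set_ylim(0, vmax * 1.1)
plt.tight_layout()
plt.show()
```

With one scale, 2020 reads darker than 2015, which is what the underlying numbers say. Rescale each year separately, as the CDC maps effectively did, and the worsening disappears from view.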

This is another reason why the insulin provision in the Inflation Reduction Act is a big deal.