Remaining Humane In The Fury of War

In the tenth year of the Trojan War, events have taken a terrible turn against the Greeks, or Achaeans as Homer calls them.  Achilles, the ferocious warrior, is sitting out the fighting after a quarrel with the chief commander Agamemnon over a war-prize woman.  The Trojans, led by their noble and brave leader Hector, son of King Priam, are closing in on the Greeks’ camp, poised to drive them into the sea.  At that moment of desperation for the Greeks, Achilles’s dear friend Patroclus, clad in Achilles’s armor, steps into the fray.  Unfortunately, the Greek comeback is short-lived, as Patroclus falls under Hector’s spear.

Achilles mourns his friend’s death and, to avenge his grave loss, decides to return to battle.  He eventually kills Hector, and then, in an infamous act of inhumane treatment of a fallen enemy, drags Hector’s body before the walls of Troy and hauls it back to his camp.  Up to that point, the Iliad is a story of brutal fighting, horrible deaths, and deceitful acts by the gods as they take turns cavalierly intervening in favor of the Greeks or the Trojans.  But in its closing chapters, Homer’s epic turns into a story of human drama.  It becomes a lesson about the capacity of humans to feel contrition for their acts and empathy for their foes.

In the darkness of night, helped by Hermes, Priam slips through the Greek camp and reaches Achilles’s tent.  There the white-haired old king implores Achilles to release Hector’s corpse so that he may bury his son as is proper for any man.  Priam reminds Achilles of what his own father would deserve if he, Achilles, had fallen in battle.  At that moment, the proud, arrogant, and vengeful Achilles breaks down and starts crying as visions of his father pass through his mind.  He realizes the banality of his act and, overcoming his lust for revenge and his sorrow over losing Patroclus, recaptures his sense of humanity.  Above all, he connects with the grief of the old man.  He grants Priam his request, and Hector finally receives the honorable funeral he deserves.

Three thousand years later, in another war, an American Navy SEAL officer leans over the scraggly teenage body of a wounded ISIS fighter, pulls a knife, and stabs the sedated captive in the neck.  Although that is not the cause of death, a military tribunal finds the SEAL officer, Edward (Eddie) Gallagher, guilty of posing in a photograph holding the dead captive up by the hair.  The officer is sentenced to confinement, demotion, and possible expulsion.  But by that time, he has become a hero to conservative crowds and media.  The President orders that the officer be restored to his rank and allowed to keep his Navy SEAL status.  The rallying cry of the officer’s supporters is that mercy toward the enemy is political correctness gone too far, and that conforming to military rules is weakness, a fool’s errand, when the enemy belongs to a band of ruthless fighters of a stateless and terrorist entity.  On the other side, military commanders and civic organizations protest that voiding military disciplinary action undermines the rules that should govern the conduct of war and the treatment of combatants.

What lessons about human conduct, rage and magnanimity can we learn from these two acts of war?  How can we as outsiders pass judgment on such events?

We first realize that several thousand years of human history have not changed human nature much when it comes to war.  Whether a war is just or not, it has a way of dehumanizing the individual.  Warriors engage in battle with the same rage and ferocity as ever against their opponents.  They often cannot resist subjecting their foes to what Achilles and officer Gallagher inflicted on their fallen enemies.  But by dehumanizing the enemy we dehumanize ourselves, because we eventually discover we have violated the other person’s dignity and right to mercy, however vilified that person may be in our eyes.  These are the same rights we would wish our enemies to grant us.

In the times before and for many centuries after the Trojan War, a warrior had only his personal sense of morality, magnanimity, and compassion to guide how he fought and how he treated the enemy.  But often, as with Achilles, none of these personal restraints mattered.  Achilles’s reckoning with his disrespect and brutality comes only when Priam pleads with him as a devastated father.  I have no way to tell whether officer Gallagher had a similar personal reckoning.

There is, though, something that has changed since the days of the Trojan War.  We finally realized that someone has to step in between the defeated enemy and the warrior’s rage and lust for revenge.  Someone has to prevent the individual from descending into the dark chambers of the soul where lurks the urge to deprive the enemy of his humanity.  Someone has to save the warrior from losing his own dignity and thus being dehumanized himself.  Thus, from the establishment of the Red Cross (and Red Crescent) in the mid-nineteenth century to the Geneva Conventions and the International Criminal Court, nations have come together to check and harness the aggressive instincts of human nature and to punish violators of the accepted norms.

Equally important, national armies have adopted their own military rules of conduct to control behavior that dehumanizes combatants on either side of the lines of conflict.  International treaties and military codes of conduct are our only defenses against letting the suffering of war extend into the annihilation of the human spirit.

It is in the context of this need for rules protecting against the dehumanization of those we send to war that we ought to reflect on and judge the events surrounding officer Gallagher’s actions, the President’s reversal of the military tribunal’s decision, and the outcry against it.

What If They Knew: A Different Thanksgiving Take

Because this is a short week – preparing, traveling, and celebrating Thanksgiving takes most of it – I thought I should take a break from the blog.  But an idea kept creeping into my brain that was too tempting to let go.

Thanksgiving is a story about the early experience of a group of people who had left their ancestral homes and sought a better future in somebody else’s land.  All human migrations play out the same way.  Almost always, some people move into a place already inhabited by others.  Sometimes the newcomers and the indigenous people find a way to live peacefully together.  More often, the newcomers use aggression to take over and subjugate the natives.  Many years later, the descendants of those newcomers declare the land to have been their own since time immemorial.  They become the new natives who feel they have to defend “their” land from new newcomers.

Since the time information started to travel ahead of and faster than people, the “natives” everywhere have had some idea of who the would-be newcomers are and where they come from.  Thus, the natives make judgments about the character and cultural makeup of the newcomers.  The Europeans know a lot about the Syrian, Iraqi, Afghan, and African refugees who cross the Mediterranean.  They know their religion, their culture, and the kinds of countries they come from.  So if, for example, the refugees happen to be Christian, the Europeans may be more welcoming of them than if they are Muslim.  And so on.

On this side of the Atlantic, we also know where the refugees and migrants come from and who they are.  Since the founding of America, we have judged that we prefer to let in more of the immigrants who come from Europe.  The further north and west in Europe, the better.  Not so good if you come from south of our border, or from the Middle East or Africa.

Now let’s think of those Native Americans who saw the Mayflower sail into Plymouth, Mass.  They knew nothing of the people who disembarked.  Nothing about their culture or the countries they had come from.  All they saw was that these newcomers looked like them, walked like them, and communicated with some medium that sounded like a language.  In short, the natives saw the Pilgrims as fellow Homo sapiens – not that those natives or anyone else knew at the time which human species we are.

What if the natives had known the Pilgrims sailed from war-torn Europe?  What if they had known the newcomers came from a continent in the middle of all-out religious wars, a continent whose people were willing to kill in the name of God?  What if they had known the newcomers carried deadlier weapons and diseases?  Would the Native Americans have been as welcoming as they were?  Would they have helped the Pilgrims survive their first winter, had they known the aggressive nature of the newcomers?

So, I wonder: might America as we know it have been the result of ignorance?

Capitalism, Wealth, and The Good Society

Consider the following order of things.  Any year your income exceeds $2 million, you pay 90% in taxes on the excess income.  Each year, the President and the Majority Leaders of Congress honor the top ten contributors to tax revenue.  They even give them a plaque, something similar to the annual honors for top achievers in the arts and humanities.  If you are Bill Gates or Warren Buffett, you are proud to show your guests all the plaques you have accumulated over many years of income creation and contributions to taxes.

You think this is a crazy fantasy.  It’s not.  It happened in the 1950s, minus the ceremony and the plaques.  The marginal tax rate for personal income over $200,000 (about $2 million in today’s dollars) was 91%.  And capitalism in America thrived.  But over the years, we started to read out of a different book, according to which high tax rates even for the very rich became anathema.  How did we arrive at this new notion about capitalism and taxes?  Was it because high taxes slow (a) the work ethic, (b) the rate of corporate investment, or (c) the rate of innovation?  In short, did we discover that high taxes could bring about the collapse of capitalism?
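
Since the key point is that the 91% applied only to the slice of income above the threshold, not to all of it, a toy calculation may help.  Here is a minimal sketch in Python; the 30% lower-bracket rate and the two-bracket schedule are simplifications of my own, not the actual 1950s tax tables:

```python
def tax_owed(income, brackets):
    """Progressive tax: each rate applies only to the income inside its bracket.

    brackets: list of (lower_threshold, rate) pairs, sorted by threshold.
    """
    total = 0.0
    for i, (lo, rate) in enumerate(brackets):
        hi = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if income > lo:
            total += rate * (min(income, hi) - lo)
    return total

# Hypothetical two-bracket schedule: 30% up to $200,000, then 91% on the excess.
brackets = [(0, 0.30), (200_000, 0.91)]
print(tax_owed(300_000, brackets))  # 60,000 + 91,000 = 151,000.0
```

A person making $300,000 under this sketch keeps about half of it; the 91% bites only the last $100,000.  Nobody’s entire income was ever taxed at 91%.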

The truth of the matter is that we never discovered any such tax effects.  Take the argument that work supply and effort diminish when incomes are limited (say, by taxes).  Major sports leagues in the US have total salary caps; no such limits exist in European sports.  Can anyone credibly argue that American athletes compete less vigorously?  Switzerland had a tax holiday, and there was no change in work intensity or supply.  In the US, changes in welfare benefits have been associated with insignificant changes in work habits.  Alaska pays $5,000 per household a year out of its oil fund.  No slackers there either.  As this year’s Nobel Prize winners in economics, Esther Duflo and Abhijit Banerjee, write: “Financial incentives are nowhere near as powerful as they are usually assumed to be.”  Why do people still work and excel in spite of lower financial rewards?  Because of personal pride, status in their communities, dignity, and the desire to demonstrate social solidarity and a cooperative spirit.

And what about boosting corporate investment by giving tax relief to corporate profits?  We have seen that the tax relief corporations got from the 2017 tax law was used primarily to buy back stock rather than to make new investments.  A NYT (Nov. 17, 2019) article featured the story of FedEx, which, despite going from a tax bill of $1.5 billion in 2017 to $0 in 2018 (a $1.5 billion tax windfall), made no appreciable addition to its investments.  The evidence for a dependence of the rate of innovation on lighter taxation is similarly elusive.  These examples do not totally invalidate the effect of taxes on economic activity and outcomes, but neither do they support the hysterical claims that taxing very high incomes and wealth is a fatal blow to capitalism.

Over the roughly two hundred years of our history with capitalism, people have succeeded in leaving behind lives of subsistence and building more equitable and prosperous societies.  For most of that time, economic gains and social progress moved on parallel tracks.  That co-movement has, after all, been the main reason behind the endurance and the political legitimacy and acceptance of capitalism.

Over the last thirty years, though, we have witnessed a serious loosening of the bonds between capitalism and society, especially in this country.  The drive for individual success, epitomized by the accumulation of wealth, has made us much less attentive to the imperative of social progress and cohesiveness.  Wealth accumulation is now treated almost like a sport.  Every year we are told how the rich rank in wealth.  There is far less interest in how the average Joe ranks in overall human wellbeing.

Thus, we have arrived at a state where rising aggregate national income and wealth are distributed with unprecedented abundance to the few and great stinginess to the many.  How else can we explain that the average tax rate for the 400 richest households dropped from 70% in the 1950s to 23% in 2018, whereas it rose from 16% to 26% for the bottom 10%; that the US minimum wage is 34% of the typical wage (the lowest in the OECD group) versus, for example, 62% in France; that the richest 1% of Americans earned 20% of the national income in 2014 compared to 11% in 1980; and that an estimated $1 trillion per year has been transferred from the bottom 80% to the top 1% since 1979, a result of the redistribution of wages and salaries away from the many and toward the few?

Such tectonic shifts in the distribution of incomes and wealth call into question the economic and moral underpinnings of wealth creation and distribution.  One culprit is, of course, the declining progressivity of the tax code, complemented by the preferential taxation of certain incomes (like capital gains and carried interest) and forms of wealth (estates, in particular).  Another culprit is that wages have not kept pace with the gains of productivity, resulting in a shrinking share of wages in the national income in favor of profits.  And a third culprit is the breakdown of meritocracy in education.  In a new book, The Meritocracy Trap, Daniel Markovits, a Yale Law professor, describes how the admissions process of top colleges has been corrupted to make room for the children of the upper classes and the connected at the expense of equally qualified but underprivileged students.  Thus, Markovits argues, we have the emergence of an oligarchic elite that can perpetuate its intergenerational advantage.  The fact that social mobility in the US is now below that of European nations adds credence to his argument.

The sense of inequity in economic rewards and opportunities, along with the deterioration of several critical indicators of human development, is reflected in the recent political choices of American voters.  The feeling of inequity also explains why a majority of Americans support a wealth tax and even raising the top marginal income tax rate to 70%.  The erosion of faith in the fairness of the capitalist system among younger Americans ought to be a warning.

Wealth creation is not the problem.  How wealth is created and distributed is the real problem.  Its ultimate purpose ought to be the betterment of society, not the extravagant aggrandizement of the individual.  Instilling social responsibility in the creation and distribution of wealth is the challenge we must meet.  Perpetuating policies that seek to create wealth at the expense of fairness, and then shield it from fairly sharing in society’s needs, is, in my opinion, what can ultimately undermine the moral foundations of capitalism and jeopardize its viability.

The State of Democracy

We all know what happened to the frog that failed to notice that the temperature of the water was rising until it was too late.  People in democratic states may face the same fate unless they take time from their everyday lives and pay attention to what is going on in the political systems of their countries.

Americans thought they were insulated from such worries.  After all, our government, under Republican and Democratic administrations alike, was the champion of liberal democracy around the world.  But while we were busy admonishing others about the rules of democratic governance, our own democracy had started to erode.  Freedom House, an agency partly funded by the US government, has been taking the pulse of democracy around the world for a long time.  Its 2019 report finds that the quality of democracy in the US has been on the decline over the last 8 years.  Across the globe, Freedom House finds that the quality of democracy has slipped over the past 13 years.  The Economist Intelligence Unit (EIU), an agency based in the UK, ranked the US 25th in quality of democracy out of 167 countries in 2018.  All three Scandinavian countries were in the top five places.  It was in 2016 that the US slipped in the rankings from Full Democracy to Flawed Democracy.  According to the EIU, of the 167 countries it ranks, 75 fall in the Full Democracy or Flawed Democracy categories and another 39 are ranked as Hybrid Regimes.

Notwithstanding the usual criticisms all rankings draw, there is no doubt that the sense that liberal democracy is slipping, here and abroad, has become more palpable in recent years.  Three factors appear to contribute the most to the decline of liberal democracy.  One is the feeling that “the system (i.e., democracy) is not working for me.”  In America, this sentiment correlates strongly, in both intensity and timing, with the growth of the lobbying industry, starting in the 1970s.  A 2015 study found that large corporations and their associations spend $34 for every dollar spent by labor unions and public-interest groups.  The Supreme Court ruling on Citizens United has certainly expanded the influence of big money.  Moreover, the need of members of Congress to raise campaign funds keeps them farther and longer away from their constituents and their everyday concerns.

The second factor is the diminished opportunity for social and economic advancement.  For example, nine out of ten Americans born in 1940 made more money at age 30 than their parents had at the same age.  That ratio is down to five out of ten for those born in the early 1980s.  Diminishing intergenerational mobility, along with eye-popping income and wealth inequality, feeds populist sentiments driven by economic trends.  Thus, societies that have experienced positive growth in the Human Development Index are more likely to favor global trade than societies that have stagnated in human development.  (And the evidence shows that while the HDI is rising in China, it is falling or stagnating in many parts of America.)

Finally, the third factor is the growing disregard for other people’s views and even their rights.  This tendency is usually fed by religious fanaticism, fear of diminished political control, and nativism (i.e., nationalism).

The sense among broad swaths of a country’s population that they are left out of the political decision-making process leads to political apathy and withdrawal.  The feeling of not sharing in the spoils of growth leads to populism.  And the disregard of other people’s rights and freedoms, most often coupled with a feeling of victimhood or persecution, leads such groups to seek protection by all means, even at the expense of constitutional rights.  All three sentiments can be, and oftentimes are, exploited by parties or by strongmen and demagogues, thus contributing to the erosion of the rule of law and civil rights.  And this is exactly what we have seen happening around the world.

What should be a sobering warning to western democracies, and certainly to America, which has traditionally vied for global influence, is the rising credibility of political systems that present themselves as alternatives to liberal democracy.  Consider, for example, the results of a recent survey conducted by the Global Network for Advanced Management among business students from 30 countries.  A majority of these students from developed (not including the US) and developing countries expressed the opinion that developing countries and emerging markets are looking more to China than to the US for guidance on how to organize the economy and society.  And the World Values Survey (a global organization of social scientists) found that in mature democracies the statement “It is essential to live in a democracy” was supported by 30 percent of millennials (those born after 1980) compared to 70 percent of respondents born around 1930.  This signals a receding belief among younger people in democracy as a successful political system.

What escapes many of us is that, over the millennia of recorded human history, democracy, and liberal democracy in particular, has been around for a relatively short period of time.  This is so because democracy is a fragile and demanding political system.  It is built on social trust and individual courage.  It takes both for those who lose a political contest to trust that they will not be treated badly at the hands of the winners.  Democracy endures when the constituents share common overarching values and ideas, the preservation of which offsets any potential loss from being in the opposition.

I am not alone in saying that this sharing of common values and ideas has been terribly fractured in America.  As two Yale Law professors, Amy Chua and Jed Rubenfeld, put it in an article in The Atlantic, “Americans on both the left and the right now view their political opponents not as fellow Americans with differing views, but as enemies to be vanquished.”

So, how do we step back from this point before it’s too late?  I would rather leave the task of answering this question to each one of us.

To Slow Time Run Fast and Stay Low

As the bus pulled farther away from its stop, the houses would become fewer and fewer until they were left behind; the olive groves where I would ride my bike would look less familiar; the coastline would no longer be the one with the beaches where I spent my time swimming and playing with friends.  That’s when I would recall the moment I had jumped out of the bus when it arrived at the village, the welcoming of my uncles, aunts, and cousins, and the anticipation of the pleasures of a whole summer ahead of me.  All that would come to an end two months later.  I would try to relive those first moments of arrival, but it was hard to really “live” them.  I could not replay my summer vacation.  That was, back then, my personal struggle with time.

We all struggle with time.  We want time to stand still, or the duration of something we enjoy to be endless.  Other times we wish something would last as briefly as possible.  So, given our human fascination and struggle with time, I read with great interest Carlo Rovelli’s short book The Order of Time.  Rovelli is an Italian physicist specializing in quantum gravity.  But don’t assume he is a storyteller of cold scientific facts.  His book is informative and poetic, close to science but never far from our human essence.

In this book, we learn that time as a physical variable is anything but stable, single, or always present.  Time passes faster if you live at the top rather than on the ground floor of the Empire State Building.  Time passes more slowly if you keep moving than if you stay still.  The rambunctious kids who run around on the ground floor age more slowly than the old folks who spend hours watching TV on the top floor.  The present, the now, is also not the same for all of us.  When my cousin in Greece hears my voice on the phone, I have already moved into the future, because my voice does not reach my cousin instantaneously.  There is a multitude of “presents” in our universe.  In the world of quantum physics, that is, the world of the very, very tiny things we call particles, time as a variable is absent.  Processes can be played back and forth without upsetting the laws and equations of quantum mechanics.  In this infinitesimally small world, the very foundation of our universe, there is no past, present, or future.
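
Rovelli wears the mathematics lightly, but for the curious, both claims follow from standard relativity.  A minimal sketch of the usual first-order formulas, where h is the height difference between two clocks, g the gravitational acceleration, v the speed, and c the speed of light:

\[
\frac{\Delta\tau_{\text{high}}}{\Delta\tau_{\text{low}}} \approx 1 + \frac{gh}{c^{2}},
\qquad
\Delta\tau_{\text{moving}} = \Delta\tau_{\text{still}}\,\sqrt{1 - \frac{v^{2}}{c^{2}}}.
\]

Because gh/c² and v²/c² are vanishingly small at everyday heights and speeds, we never feel either effect, which is exactly why our intuition never evolved to register it.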

By the end of that part of the book, we realize that time is not what we think it is: that line, divided into past, present, and future, along which we believe our lives unfold.  But after Rovelli has destroyed our everyday notions of time, he starts to reconstruct time as we humans experience it.  The ability to perceive the world at its smallest scale, the quantum scale, was never selected by nature as a trait necessary for our survival.  That is why we do not sense all these peculiarities about time.  What is necessary for us to survive and thrive is just enough (actually a coarse) perception of the world as a three-dimensional space plus a fourth dimension we experience as a single time, ordered from past to future.

Our human sense of time, as far as the outside world is concerned, starts with our incomplete perception of the world as a physical system that moves from low entropy to higher entropy.  Entropy measures how disordered things are.  The leaves on a tree are ordered in some arrangement.  Then the fall wind blows them to the ground, where they lie in a less ordered arrangement.  Later the wind scatters them into greater disorder.  The leaves on the tree had lower entropy than the leaves scattered around the yard.  Thus, in our human eyes, the world moves from low entropy (greater order) to higher entropy (greater disorder).  Entropy cannot move backwards; that is, entropy does not go from high to low.*  Entropy, that is, the order of the world as we perceive it, is the closest thing we have in nature that can sustain our sense of a past, present, and future.
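
For readers who want the physics made exact, the order spoken of here has a precise statistical form in Boltzmann’s entropy formula.  A minimal sketch, where S is the entropy, k_B is Boltzmann’s constant, and W counts the microscopic arrangements compatible with what we observe at our coarse, human scale:

\[
S = k_{B} \ln W.
\]

Leaves attached to a tree in one particular pattern correspond to few arrangements; leaves strewn across a yard correspond to astronomically many.  Larger W means higher entropy, which is why we only ever see the scattering, never the spontaneous reassembly.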

If it is so difficult to find time in the physical world, how does it emerge in our human lives?  Rovelli argues that time emerges in our brains as memory (a sense that there is a past) and anticipation (a sense that there is a future).  The clock is in our brain.  It helps us organize our lives along a line from past to future, gives us a sense of personal identity, and makes us conscious of our interactions with the external world.

Here Rovelli starts to sound like the French philosopher Henri Bergson (An Introduction to Metaphysics), for whom consciousness means memory.  Our memory is the repository of the past, so that the past lives in the present.  But the present is elusive, like the flow of a river.  “We cannot step into the same river twice,” is how the Ionian philosopher Heraclitus (6th century BCE) put it.  We sense time as duration, not as a string of still moments.  Bergson writes, “Without this survival of the past into the present there would be no duration, but only instantaneity.”  Therefore, becoming, not being, is true reality.

Once we accept that becoming (change), not being (stillness), is what the world is all about and what we intuitively sense, we can start coming to better terms with time.  We recognize that our sense of time, despite the load of unpleasant or sad past moments it carries, is what makes us live fuller lives, conscious of who we are.  That, to quote Rovelli, makes time a source of anguish but, in the end, a tremendous gift.

In those years when I spent my summers in the village, I knew nothing about Rovelli or Bergson.  I knew nothing about time in the external world or in our internal one.  All I wanted, as the bus pulled away, was to take one more glimpse of what I was leaving behind, anything that would keep me tied to the summer moments.  The moments that were now becoming melancholy memories, relieved only by the anticipation that another summer would arrive next year.  This is what time is all about.

* Entropy moves in one direction from low to high because of the second law of thermodynamics which states that the total entropy of an isolated system can never decrease over time.

God In Politics

Two weeks ago, NYT columnist Frank Bruni wrote a column titled “The Democratic Primary’s God Deficit.”  In it, Bruni argues that the Democratic candidates should talk more about God and religion.  This, according to Bruni, would help connect them to religious Americans and increase their appeal with such voters.  Moreover, it would be smart politics because it would not cede this block of voters to President Trump.  This is an interesting proposition, but one not without problems and challenges, notwithstanding certain opportunities.

First, should voters be interested in a candidate’s religious beliefs?  The Constitution does not permit any religious test for those seeking political office.  Why, then, would public knowledge of the religious beliefs of a candidate matter?  The Constitution also provides for a separation of church and state.  Therefore, voter awareness of a candidate’s religious affiliation should not matter, unless a voter expects favorable treatment of his or her church and faith from a coreligionist candidate, which in essence would be contrary to the Establishment Clause of the First Amendment.

Second, bringing God into political debates frivolously and selfishly might also offend Christians and Jews, in light of the third Commandment, “You shall not take the name of the Lord your God in vain.”  Finally, how are atheists, agnostics, and nones (i.e., those unaffiliated with any church) supposed to talk about religion?  These are some of the challenges I see in inserting God into politics.

Having said that, I recognize that, given the separation of church and state and the right of religious liberty enshrined in the First Amendment, the public has a legitimate interest in knowing a candidate’s views on these matters, regardless of the candidate’s religious persuasion.  The point of interest, then, is how politicians respond to this challenge.

In 2016, Christian conservatives (comprised mostly of Evangelicals) saw an opportunity to further their religious agenda by decidedly siding with Donald Trump, in spite of his revealing a character and tactics that would have little chance of passing the moral test “Would Jesus do this?”  Interestingly, the Christian Right saw in Donald Trump a stronger crusader for its cause than Ted Cruz, who also campaigned for that block’s support.  Even Trump’s unfounded and scurrilous accusations against Cruz’s father and wife were not enough to dampen conservative Christians’ enthusiasm for Mr. Trump.

Religions have a long history of seeking alliance with, and protection from, state leaders in order to survive or to dominate the spiritual domain.  As Ross Douthat (NYT, 9/16/2018) put it, Christian conservatives made a bet that Mr. Trump’s support of their religious agenda justified his personal moral failings, just as fourth-century Christians made a bet with Emperor Constantine in order to secure religious freedom, an emperor they even elevated to sainthood despite his involvement in family murders.  In a similar fashion, the leaders of conservative Christian churches have lost no time declaring that God’s will is behind Mr. Trump’s ascendancy.  Thus, they admonish their faithful to “render to God and Trump”; they declare that God “wanted Donald Trump to become president”; that President Trump is meant to be a new King Cyrus, sent by God to save Christians as the real Cyrus delivered the Jews from Babylonian captivity.  Many more such messianic pronouncements are on record.

The anxiety of the Christian Right, rational or not, has been masterfully manipulated by Mr. Trump and his closest associates.  Several examples prove the point.  “If you don’t win this election, you’ll never see another Republican, and you’ll have a whole different church structure,” candidate Trump said on the Christian Broadcasting Network.  Not to be overshadowed by the President, Vice President Mike Pence has not let up in his persistent support of religious causes, with little regard for the First Amendment.  More recently, Attorney General William Barr, in a speech at Notre Dame University, reportedly derided secularism and called it a threat to America aimed at destroying the traditional moral order.  This from the man who is supposed to enforce the Establishment Clause, the clause that guarantees both freedom for religion and freedom from religion.

Here, the invocation of God and religion is part of a political agenda that leaves many Americans dismayed and uncomfortable in its sectarian partisanship.  To accept a generalization of this approach across political parties and campaigns risks taking us all down a very slippery road.

So, how should politicians talk about God and religion?  I would argue that politicians can address the public’s interest in the First Amendment, while respecting the Constitution, by avoiding language that signals favoritism for this or that religious sect, or for people of faith over those without.  Expressions of personal religious belief should inform us about a person’s moral compass, not signal endorsement of a religious establishment.  In the same context, secularists can discuss how reason and universal humanism inform their morality and their views about religious liberty and the separation of church and state.

I believe the public service both Democratic and Republican politicians can render to their country is to educate their fellow Americans about the proper role of religion in politics, the rights of all regardless of adherence to faith or not, and the perils of letting sectarian politics dominate political discourse and competition.

And a final note.  Contrary to Bruni’s column, Democrats have been talking about God on the campaign trail.  See the Atlantic article at  https://www.theatlantic.com/ideas/archive/2019/06/2020-democrats-are-talking-about-religious-faith/592966/

The Special Case of Credit Unions

I have just come back from a conference for board directors of credit unions, and it’s time I talked about this type of financial institution.  Not because I want to make you a credit union member, although that would be great, but because, in some special ways, credit unions are the anti-paradigm of for-profit businesses and of what we have come to take for granted regarding human economic behavior.

First, a few things about credit unions.  They operate as financial cooperatives that belong to their members.  In contrast to earlier times, anyone can now join a credit union without the condition of a common bond among members.  Like commercial banks, credit unions are regulated by the federal government or their state.  Because credit unions are recognized as not-for-profit entities, they enjoy tax-free status.  Historically speaking, credit unions were a German invention, imported into the US early in the twentieth century.  The idea was to allow people of low economic means, with limited access to the banking system, to pool their savings so that they could borrow from the common pool of funds.  Today’s credit unions have grown in size and sophistication, rivaling banks in the range of retail banking products and services they offer, including financial technology.

One hundred fifteen million Americans – slightly more than one third of the US population – belong to 5,500 credit unions, which control $1.5 trillion worth of assets.  Of course, for-profit banks far outweigh credit unions in assets; JPMorgan Chase alone controls $2.3 trillion.  But the membership numbers clearly show that Americans are eager to embrace an economic institution with a social purpose.  And this is true in blue and red states alike.  Indeed, the US credit union movement is the strongest and biggest in the world.

Credit unions are a different business species in several important respects.  First, their members are both owners and consumers of their credit union’s products and services.  Profits remain within the organization or are distributed to members, which means that members never lose value to somebody else.  Whether members pay higher interest rates on loans or receive lower interest rates on deposits, in either case the resulting surplus remains with the credit union and, thus, belongs to the members.  This is entirely different from the case of for-profit businesses, where shareholders can benefit at the expense of customers.  Because of the dual status of members as owners-consumers, credit unions are very responsive to the needs of their members.  In fact, on average, credit unions charge lower loan rates and pay higher deposit rates than banks.
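
To make the point that members never lose value concrete, here is a toy sketch with made-up numbers; the figures and the single pool of members are illustrative, not actual credit union data:

```python
# Toy model of one year at a hypothetical credit union.
interest_collected_on_loans = 600_000   # paid by member-borrowers
interest_paid_on_deposits = 400_000     # received by member-depositors
operating_costs = 150_000

surplus = interest_collected_on_loans - interest_paid_on_deposits - operating_costs
print(surplus)  # 50,000: retained as equity or paid out as dividends

# Either way, the 50,000 stays with the member-owners.  At a for-profit bank,
# the same surplus would accrue to outside shareholders, who are generally
# not the customers who generated it.
```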

Credit unions are governed by boards of directors composed of volunteers.  Federal rules allow only one board member – usually the chair – to be paid a stipend.  Board volunteers (as well as supervisory committee* volunteers) are reimbursed for expenses to attend meetings and conferences.  Some states allow stipends for board members, but such payments are kept quite low.  In general, direct and indirect compensation of board and supervisory committee volunteers is meager when compared to the compensation packages paid to members of corporate boards.

Now, one may infer that since credit union volunteers receive very low compensation, credit unions attract low-skill volunteers.  Nothing could be further from the truth.  Credit unions are governed just fine by these – let’s call them – lay people.  Credit unions have fared much better than banks through various financial crises, including that of 2008.  Board members come from all professions and backgrounds.  By applying common sense and a high dose of diligence and loyalty to their fellow members, these volunteer directors provide competent and responsible custody of their credit union’s assets through booms and busts.

This experience raises two questions.  First, why do corporations lavish inordinately high compensation packages on their boards when the credit union governance model shows that successful governance can be had at much lower cost?  Second, why do credit union volunteers offer their services for no or very low compensation when this type of service commands hefty rewards in the for-profit sector?  One answer is that not every type of effort has to be monetized to attract takers.  The reason is the strong social altruistic instinct we see throughout the volunteer movement.  So, when we are told that monetary reward is necessary to induce effort, and that the greater the reward the greater the effort, this neglects the fact that not all things can be bought with money alone.  My hypothesis is that if corporate boards were open to all – as opposed to the members of “boys or girls” networks – we would see a lot of people stepping forward, willing to do the job at much lower compensation.

There is an important reason why credit union boards remain competent.  Board members (as well as supervisory committee members) attend one or two conferences a year.  This way, they stay updated on new laws and regulations, learn about developments in the financial sector, and learn how to be more effective directors or supervisory committee members.  Most importantly, volunteers learn from each other.  Because credit unions are part of a movement under the motto “People Helping People,” volunteers feel no competitive pressure to be secretive.  That is, the cooperative spirit extends to exchanging ideas among volunteers for the purpose of making credit unions as a whole successful.  This is unheard of in the world of for-profit businesses.

The overall success of credit unions is also driven by another advantage in governance.  Credit union boards are chaired by one of the volunteer directors, not by the CEO, as is typical in the corporate world.  That means there is a clear distinction between those who set policy (the board) and the top executive who executes it.  This way, the main stakeholders, i.e., the members/owners, are directly represented in the top echelons of governance, and the chief executive has less opportunity or power to pursue self-dealing, as we frequently see in the corporate world.**

Of course, financial cooperatives cannot replace the for-profit banks.  But by being part of the market, they enhance competition and help us better understand the pros and cons of different corporate governance models.  This is a gain for all – credit union members and bank customers.

* Supervisory committees exist to stave off fraud and lapses in operational rules within credit unions.

** Exhibit A here is the case of the blood-testing startup Theranos.  Its twenty-something founder and CEO was able to fool high-caliber board directors, including the former Secretary of State Henry Kissinger!  So much for lavishly-paid corporate boards.

Are We Watching America In Retreat?

The exit of American military forces from the Middle East has ended the same way the engagement started: with a failure of judgment and honesty.  Sixteen years ago, we entered the Iraq war under the cover of fake information and false evidence.  A week ago, we exited on the whim of a President under the cover of his “great and unmatched wisdom.”  The former was the result of collective deception.  The latter was the result of one individual’s self-delusion.  Neither served the interests of the country.

Last week’s decision to exit the Middle East, and especially the way it was done – abandoning the Kurds, the people who fought alongside us – is perhaps the last episode of what has been a seventy-year period of American preeminence on the global stage.  Or it can be looked at as the first step toward a future America that turns inward and isolationist.  No doubt it satisfies a good number of Americans who, tired of the unwise waste of blood and treasure on foreign fronts, would like to see priorities turn toward domestic needs.  It also satisfies those who believe that American power was abused to prop up undemocratic regimes or the narrow economic interests of US firms and their international collaborators.

Nonetheless, somewhere between the selfish abuse of power and arm’s-length indifference lies the space for a powerful and wealthy country like the US to play a constructive role in global affairs.  This is especially true today, as we recognize that to expand economic prosperity, eradicate illnesses, improve educational opportunities, and deal with the climate challenge in the interest of all of humanity, more, not less, international cooperation is needed.

This constructive role seemed to be what America would commit itself to in the aftermath of WW II.  The Marshall Plan, NATO, the World Bank, the International Monetary Fund, GATT (the General Agreement on Tariffs and Trade), and the UN had the purpose of establishing a world order that would foster international cooperation and peace and promote economic development.  In many of these initiatives, America was willing to absorb the cost (not necessarily without the anticipation of future benefits) because it had three exclusive advantages: military strength, economic power, and the soft power of a liberal democracy.  The result was 70 years of relative peace and economic growth that set the post-World War II period apart from the periods that had preceded it.  Robert Kagan (The Cost of American Retreat, WSJ, Sept. 2018) makes the point that the US-underwritten world order had rules that America often flouted.  But “[A]t the heart of the order was a grand bargain: The other liberal powers ceded strategic hegemony to the US, but in return the US would not use that hegemony to constrain their economic growth.”  That is what Mr. Trump’s transactional approach to foreign policy has never grasped.

Unfortunately, each of these advantages dissipated as time went on.  First, the legitimacy of military power came into question, either for being used in pursuit of dubious and self-serving objectives or for not being used enough.  Thus, Clinton was criticized for his inaction in Rwanda and his delayed intervention in the former Yugoslavia to end the inter-ethnic atrocities.  Similarly, Obama was blamed for his hesitation to deal with Assad and Sisi, which left the Arab Spring an unfulfilled dream under the forceful pushback of two dictators, one seasoned and the other newly minted.  And under the emotional weight of the 9/11 terrorist attacks, Bush listened to his neocon advisers and sought retribution against the wrong country, Iraq, instead of the country the perpetrators came from, Saudi Arabia.

Confidence in the ability of our economy to generate prosperity for all fell victim to domestic policies that opened an unjustifiable gap between the privileged few and the mass of working Americans.  With crumbling infrastructure, festering but underfunded social problems, and rising costs for the welfare net, America is a country immensely rich in the aggregate but unwilling to absorb the economic cost of pursuing its strategic interests through trade and climate agreements or defense treaties.  In years past, trade relations and agreements with other countries, including China, were part of a national strategy.  Eventually, working-class Americans came to see globalization as a project run by large corporations for their own interests, not for theirs.  Thus, globalization became the target of an inward-looking nationalist sentiment.

And finally, but most importantly, America’s soft power to promote liberal democracy has been dying by a million cuts under the current president’s continued onslaught against the free press, the justice system, and the electoral process, and his assiduous coddling of authoritarian strongmen and one-party rulers.

So, here we are.  American troops, not given even a presidential thought for an orderly and dignified exit, are retreating in haste from the Syrian front, reminiscent of the exodus from Vietnam.  Russian flags, hoisted on armored vans, are seen entering abandoned American military camps as Russia fills the void, a void that is both territorial and geopolitical.  And Pompeo and Pence are now in Ankara as supplicants, pleading for restraint.  All this must grate on the patriotism of Republican politicians.  But it is the price of their Faustian bargain with Mr. Trump.

Make America Great Again was just a slogan from another era.  Millions of Americans took it as a genuine precursor to something grand.  But now MAGA lies disfigured along the US-Mexican border, a victim of the grotesque and inhumane treatment of impoverished and frightened migrants.  It lies along the Syrian-Turkish border, full of shame for the betrayal of our army’s brothers-in-arms.  And it lies in the web of the incoherent utterances of a man without credibility.

It is up to the next president to pick up the pieces and restore America’s place in the world.

Our Tortured Establishment Clause

Right after former policewoman Amber Guyger had been sentenced for the murder of a black man in his own apartment in Dallas, the judge, Tammy Kemp, walked up to her, hugged her, and handed her a Bible.  In the words of the NYT article, “Some praised it [her gesture] as a rare and much-needed moment of humanity; others criticized it as potentially unconstitutional…”  Perhaps uniquely among nations, Americans have to live with such contradictory viewpoints thanks to the First Amendment, which, besides protecting free speech, states that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof”; the first part of this provision has come to be known as the Establishment Clause.

Of course, Judge Kemp, a representative of the state, was not making a law by giving out a Bible, but in the hotly contested wars over the intent and scope of the Establishment Clause, her gesture could be interpreted as supporting Judeo-Christian beliefs, contrary to the intent of the First Amendment.  This is so because court opinions have ruled that the separation of church and state also extends to actions by the government and its official representatives.

Clashes around the Establishment Clause are at the heart of the culture wars in American politics and courts.  To the combatants, secularists on one side and religious adherents on the other, how the Establishment Clause is interpreted and enforced is central to building a national narrative and unifying anchor.  So, I decided to revisit a book I had read more than ten years ago, Divided By God by Noah Feldman (then at the New York University School of Law and now at Harvard Law).  The book’s historical and legal account is very instructive.

Let’s start with the historical fact that separation of church and state has never been truly enforced in the US.  From the early decades of the Republic, publicly funded schools inculcated a Protestant viewpoint that glossed over differences across Protestant sects (the nonsectarian approach).  When Catholics came to the country, their requests to allow their pupils to receive a Catholic religious teaching were rebuffed by the Protestant majority.  The result was the establishment of Catholic parochial schools without any state funding.  Students of other non-Protestant Christian sects, as well as those of the Jewish faith, received the same treatment.

When the courts, and the Supreme Court in particular, finally started to take up cases regarding the separation of church and state in the wake of the Second World War, the decisions were premised on different legal arguments, often outside the strict purview of the Establishment Clause.  For example, the right not to salute the flag, claimed by Jehovah’s Witnesses on religious grounds, was rejected in 1940, but the same right was recognized shortly thereafter for different plaintiffs as protected by free speech.  This practice of basing court opinions on different legal premises has continued to our day, and it is a major reason why the public is so perplexed as to what is right and wrong under the Establishment Clause.

Over the years, Supreme Court decisions have bifurcated into two approaches.  Incursions of religious teachings and practices into public schools, like prayer, Bible reading, and the teaching of creationism in biology courses, have been struck down by the Supreme Court on the grounds that they favor religion and, thus, violate the separation of church and state.  On the other hand, the Supreme Court has decided in favor of the use of public funds and resources for religious purposes, like vouchers to attend religious schools or the use of school facilities.  The Court has also ruled to uphold the display of religious symbols in public places as long as no religious or non-religious group (like atheists) is excluded.  The courts have thus moved to interpret the Establishment Clause to mean not an absolute exclusion of religion from the public space, but rather a fair and neutral treatment of religious and non-religious expression, the so-called neutrality principle.

At the same time, we have witnessed a realignment in the rival groups and their approaches to the place of religion in public policy and discourse.  The original movement of Protestant fundamentalists aspired to run the US as a Christian polity, while pure secularists demanded an absolute separation of church and state.  Eventually, the Protestant fundamentalists morphed into a group Feldman calls Values Evangelicals.  They advocate that, irrespective of denominational differences, religious people share a common set of moral values, which they would like to be at the core of a national unity project for America.  For their part, the pure secularists evolved into a group Feldman calls Legal Secularists.  They espouse religious liberty and freedom of expression for all, secularists and religious people alike, with the caveat, however, that religious arguments must not inform the making of laws or government actions.  Thus, to them, national unity ought to be built on arguments informed by reason.

Feldman finds that neither group has a convincing case.  By demanding that religious beliefs be left behind before entering public discourse, Legal Secularists deny religious people the right to inform their positions by what is central to their thought systems.  But Values Evangelicals also face serious contradictions as they try to build a common base of values for their national unity project.  On such issues as the death penalty and divorce, Protestants and Catholics often disagree.  On abortion and gay rights, Christian values do not square well with values held by adherents of the Jewish faith.  And Judeo-Christian values do not necessarily align with those of Muslims, Hindus, and Buddhists.

In light of all the contrasting views and legal opinions, Feldman adopts a middle position.  He suggests that religious symbols (like displays) or gestures (like the one by Judge Kemp) be allowed as long as they do not appear to discriminate against other beliefs, but that we should walk back from government actions and policies that avail public funds and resources to religious purposes, which the Framers of the Constitution would have found much more objectionable.

What Feldman leaves out is the, by now, unmasked partisanship among politicians and jurists in interpreting the Establishment Clause.  If every citizen should have the right to inform the discourse on matters of church and state by secular or religious beliefs, who is to be the honest referee?  Shouldn’t it be those who mediate these debates, our politicians and, most critically, our jurists?  The public deserves an impartial and honest interpretation of the First Amendment from those who have this constitutional duty and responsibility.

Balancing Individual and Social Interests

Climate and the environment; gun control; the use of technology; private wealth and public needs.  These are some of the major issues that bedevil Americans these days.  With the exception of gun control, they are of major concern to the rest of the world as well.  All these issues have one common denominator: the rights of the individual versus the rights of society.  How we resolve them in the near future and beyond depends on which side we come down on in this very old conundrum, that is, the balance between the interests of the unit, i.e., the individual, and those of the collective, i.e., society or its political expression, the state.

Within the Western world, this question has been debated since the days of Plato and Aristotle.  Their ideas have been refined, revised, expanded, and pared back by Western scholars and philosophers over the intervening centuries without, however, arriving at a solid guiding conclusion.*  In the words of one writer, when it comes to the individual versus society, we are all either Platonists or Aristotelians.  Outside the Western world, the same question has been raised, but it has been resolved more decidedly in favor of the interests of society.  Under the influence of Confucius and the imperative of social harmony, China, Japan, and other East Asian societies prioritize the interests of society, and by extension those of the state, over those of the individual.  India and the Muslim world also place more value on traditional secular and religious customs and norms that keep individual discretion circumscribed.

So, what does it mean to say we are Platonists or Aristotelians?  For Platonists, each one of us attains goodness and excellence by serving society in the position we can perform best: as guardians, if we have leadership talents; as warriors, if we have bravery and physical strength; as artisans, if we have talents for business and industry.  As individuals we excel when we take our best-suited station in life and thus help our state to excel.

For Aristotelians, the individual attains goodness and excellence when each one of us fulfills his or her human potential, a potential as each of us sees it and chooses to actualize it, so that we live happy lives in the world as it is.  To this end, the state ought to offer individuals the means and opportunities to actualize this potential.

Both thought systems value the quality of society and state.  And both consider each individual to be critical to the success of society or state.  In Plato’s society, however, the individual has a more prescribed mission.  In Aristotle’s, the individual is more the master of his or her course in life.  Both, nonetheless, call on individuals to act as responsible and virtuous citizens who care about the collective good.

It is not difficult to understand, even from the above brief description, that Platonists are willing to live in more ordered societies, societies with a top-down organizational design.  Aristotelians, on the other hand, prefer to live in less rigid societies that follow a bottom-up organizational design.  Plato’s societies and states have the advantage of social cohesiveness and efficiency.  But too much of that and Plato’s model can lead to rigid dogmatism and the stifling of individual creativity and expression.  Aristotle’s system can avoid that, but too much of it and it can degenerate into individual aggrandizement and materialism.

The modern fields of evolutionary psychology and sociology confirm that the human species has been selected by nature to live as beings with individual identity and rights to friendship, love, and mating, within groups that rely on cooperation and on learning and teaching from each other in order to survive.  This set of traits is what Nicholas Christakis of Yale University calls the social suite.**  Research on involuntary communities (like those resulting from shipwrecks) as well as on voluntary and experimental communities shows that restricting too much either individual rights or cooperation among a community’s members most often leads to its collapse.

What makes the whole question of individual versus society so difficult is nothing else but the heavy emotionality and the fears, rational or not, that surround its polar outcomes.  Those who believe that societies ought to be the sum total of individual rights, no matter what, will not easily surrender to calls for collective action at the expense of individual rights.  And those who believe that society is more than the sum total of its members’ rights, and that protecting its interests enhances individual welfare, will not stop calling for collective solutions.

Societies have oscillated between the two polar ends of individualism and the social imperative.  America, for example, was founded on the rights of the individual to “life, liberty and the pursuit of happiness.”  When, however, economic disintegration threatened the state, Roosevelt did not hesitate to embark on the New Deal, which introduced social welfare institutions, like Social Security, and extensive regulations.  Modern Communist China was founded by putting the interests of a collectivist state ahead of the interests of the individual.  But by 1980 China was facing dire economic crises, so Deng Xiaoping made the bold move of loosening restrictions on individual economic activity to confront them.  The American case was one of moving from unfettered individual rights toward greater social solidarity.  The Chinese case was one of moving from rigid economic order toward individual economic freedom.  Both moves faced ferocious criticism and resistance.  Herbert Hoover decried Roosevelt’s New Deal as socialist and fascistic.  Deng’s economic liberalization also faced criticism from politicians of the old guard.

These and other examples suggest that, to reach a better balance between the rights of individuals and those of society, some crisis is necessary to compel citizens to overcome their emotional and ideological attachment to one or the other polar end.  So the question is: what kind of crisis do we need to suffer before we decide to do something about gun violence; or about climate and the environment; or about the impact of technology on our lives; or about excessive private wealth and neglected public goods?

* Arthur Herman, The Cave and the Light: Plato versus Aristotle, and the Struggle for the Soul of Western Civilization, 2014.

** Nicholas Christakis, Blueprint: The Evolutionary Origins of A Good Society, 2019.