What’s In Fifteen Dollars

Some numbers have the power to capture public attention and become symbols of fear, reality, or aspiration.  So 13 terrifies us; 1% reminds us of economic inequality; and $15 is the battle cry for the minimum wage in America.  This post is about that last number.  In fact, it’s more than a number.  It’s about real lives. 

First, I got curious about what the minimum wage is in other economically advanced countries.  I found that in 2020 the monthly minimum wage in the UK, Germany, France, Holland, Belgium and Ireland was over 1,500 euros, compared to just over 1,000 euros in the US.  That’s a considerable discrepancy, so I had to look at the other side of the equation: unemployment.  From 2017 to 2019 (prior to the pandemic), average annual unemployment was around 4% in Holland, 3.5% in Germany, and 4% in the UK, about the same as the 4% rate in the US.  Spain and Italy, with a lower minimum wage than those countries, had unemployment rates over 10%.  Denmark, Finland and Sweden have no statutory minimum wage, and yet their unemployment rate of about 6% was higher than in countries with one. 

Though simplistic, these data tell me that the often-heard argument of a direct relationship between minimum wage and unemployment rate is not a slam dunk.  Numerous economic studies also fail to come to a uniform conclusion regarding the link between minimum wage and employment.  Now that we have thrown cold water on this debate-killing argument, let’s proceed with the rest of the story. 

The term minimum wage usually means an administratively set minimum price for labor.  The federally set minimum wage has stood at $7.25/hr since 2009, and its purchasing power today is clearly below that level.  When we consider that tax brackets are adjusted for inflation to avoid pushing unchanged real incomes into higher tax rates, and that Social Security benefits rise with inflation, it becomes harder to argue against a minimum wage adjustment to protect its purchasing power.  So, that’s one point to keep in mind.  And here is another.  The Congressional Budget Office estimates that raising the minimum wage to $15/hr would cost workers 1.4 million jobs.  This translates to $10.15 million* of hourly income loss for workers.  But there are 27 million workers who make less than $15/hr and, hence, even a one-dollar raise in their wage translates to a $27 million hourly gain, much greater than the loss.  In a society that fetishizes aggregate income growth regardless of its distribution, the contemplated rise of the minimum wage sounds like a big winner.
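The arithmetic behind this comparison is simple enough to verify.  A minimal sketch in Python, using the figures cited above (the $1/hr raise is the post’s deliberately conservative assumption):

```python
# Back-of-the-envelope check of the CBO job-loss figure against the
# aggregate wage gain cited in the post.
jobs_lost = 1_400_000          # CBO estimate of jobs lost at a $15/hr minimum
min_wage = 7.25                # current federal minimum wage ($/hr)
workers_below_15 = 27_000_000  # workers earning less than $15/hr
raise_per_hour = 1.00          # assumed gain of just one dollar per hour

hourly_loss = jobs_lost * min_wage               # dollars lost per hour
hourly_gain = workers_below_15 * raise_per_hour  # dollars gained per hour

print(f"Hourly loss: ${hourly_loss:,.0f}")   # $10,150,000
print(f"Hourly gain: ${hourly_gain:,.0f}")   # $27,000,000
```

Even if every displaced worker earned well above the minimum, the aggregate gain dwarfs the aggregate loss under these assumptions.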

To critics, an administratively set minimum wage violates the law of demand and supply for labor.  But the labor market is full of distortions.  How do we explain, for example, barriers to entry into some professions (doctors, dentists and lawyers come to mind) that boost their wages by suppressing supply?  Or administrative requirements for an expert’s opinion (engineers, architects) even for small and mundane projects, which increase demand for such services and, hence, wages?  Not only do these arrangements violate competition; they aim at elevating lifestyles from middle-class to upper-middle-class or even upper-class status.  Compare that with policies that try to pull millions of workers out of poverty to bare subsistence.  Which policies win the moral argument?

And what about the formation of horizontal (same-industry) conglomerates, which by the laws of oligopoly produce at a lower level than under perfect competition and, hence, have less demand for labor, which in turn suppresses wages?  Let’s also ask this question.  If the current minimum wage is above what firms can afford, why then does the slice of profits in our national income pie keep getting bigger relative to the slice of wages?  Somewhere in our economy labor must be losing ground, and the most likely suspect is the low end of the labor market. 

There is another definition of minimum wage that can set us on a more promising road.  That’s the wage that allows a worker to meet basic needs in shelter, food, clothing, recreation.  It’s what a living wage is meant to be.  This definition is often undermined by attempts to associate the minimum wage with teenagers or college students who are just trying to supplement their parental allowance, or with those who need only some part-time work that is weakly consequential to their overall wellbeing.  The truth, though, is that in the US the minimum wage is the only income source for millions of people struggling to make a decent living.  The reality is that at the current minimum wage of $7.25/hr an American family of two lives in poverty.

Viewed from this perspective, the minimum wage is very meaningful because it helps sustain the physical and mental health of the lowest-paid workers as well as their participation in the labor force.  It also humanizes labor because it shifts the focus from jobs to employees.  As many economists argue, jobs are a statistic, but employees are the ones who bear the brunt of disruptions in the labor market.  The humane and socially responsible approach, then, is to decouple the living conditions of a worker from the lowest wage that equates demand and supply.  And this is all the more so in a country of extreme wealth for the few.

So, we now come to the crux of the problem.  To have jobs we need to have demand from firms at a wage they can afford.  To fairly compensate labor, the minimum wage ought to be a living wage.  The only force that can bridge the gap (when such a gap exists) is comprehensive public policy.  There are several alternatives.  To ensure a poverty-free minimum wage, the government could set a living minimum wage and then reduce the total labor cost, at least for smaller firms, through lower charges for programs like Social Security and Medicare.  Alternatively, the government could provide direct supplemental payments that allow workers a poverty-free living.  We already have such supplemental assistance programs, but poverty still persists for millions of Americans.

The main point is that the debate about the minimum wage ought not to be about jobs lost or gained but about working lives and what it takes to keep them out of poverty and in dignity.  This approach suggests that tackling the minimum wage calls for more comprehensive policies that support the demand for labor but also recognize the value of labor and protect the lowest-paid workers from the vagaries of the labor market. 

* If 1.4 million workers lose their minimum wage of $7.25/hr, they suffer a total hourly loss of $10.15 million.  Even if they lose more than $7.25/hr (because they are paid more than the minimum wage), it is still highly unlikely for the lost income to overtake the total gain the 27 million workers will enjoy from a higher minimum wage.

In Search of The Common Good

Invoking the concept of the common good as an organizing principle of a society is one thing; trying to define it, though, is a major challenge.  Like the Odyssey, setting out for the common good is a journey full of temptations that can throw you off course, full of risks of making wrong choices, full of adversaries that want to keep you from ever reaching Ithaca.  Since I raised the concept of the common good in my last blogpost, it’s now time to say a bit more about it.

From early on, the common good has been discussed through two different lenses.  One is that of the individual, the other is that of society.  The first approach defines the common good as the sum total of individual interests.  This is the way the common good is attained through the invisible hand of Adam Smith.  Self-interest and ambition checked and balanced in the marketplace produce the greatest good for society.  Adherence to unfettered markets, however, threatens attainment of the common good not because Adam Smith advocated that self-interest should come free of morality (actually the opposite) but because, as we know, markets can fail, and when they do, they serve neither the individual nor the common good.     

Eighty-three years after Adam Smith’s The Wealth of Nations appeared, Charles Darwin published On the Origin of Species.  Based on a false interpretation of Darwinian evolution, Herbert Spencer coined the unfortunate expression “survival of the fittest.”  This became the premise for a very charged individualistic approach to defining the common good.  A good society is one whose members are strong enough to meet the challenges of social survival.  Society should weed out weak and free-loading individuals.  Resistance against social safety nets and welfare programs is a modern echo of the Spencerian principle.

On the other end of the spectrum, we have the top-down approach that prescribes a common good for all in the interest of achieving salvation or state supremacy.  These are the conceptualizations of the common good by religious zealots or authoritarian political movements. 

It is between extreme individualism and top-down authoritarianism that the search for the common good becomes most challenging, because it requires an optimal balance that can be so elusive.  In this tradition, the common good is realized in societies and states where there is a mutual interdependence between the interests of the individual and those of society.  For Aristotle* (considered the father of the concept of the common good) the good society is one that enables its members to realize their full potential.  The common good is attainable only through society and yet it is individually shared by its members.  Each person should take ownership in the attainment of the common good and contribute to its enjoyment by fellow citizens, since enabling everyone to realize his or her potential is the essence of the common good. 

This conceptualization of the common good makes it the shared responsibility of the citizens and the state.  Realizing one’s potential depends on the means and opportunities to which one has access, and, hence, on how a society is organized.  It is here that a modern philosopher, John Rawls, has made an intriguing proposition.  Rawls invites each one of us to go behind a veil of ignorance and forget who we are, male or female, privileged or not, well-connected or not, physically or mentally gifted or not, and then choose the social organization within which we would like to live.  That choice would then determine how a good society ought to be organized, so that even its least fortunate and weakest members have a fair shot at realizing their potential and sharing in the happiness of life.  It is the value of potential self-actualization and the preservation of dignity even for the weakest of us that elevate education, health and avoidance of poverty to legitimate rights and part of the common good.

Attainment of the common good comes with the surrender of some private benefit or freedom of choice from each one of us.  Therefore, it is important to show that attaining the common good is worth this loss.  It is easy, for example, to see how a common defense or a public roads system provides private benefits.  It may not be as easy to understand that public financing of education generates private gains for all.  Only when the desire to attain the common good becomes part of the cultural fabric of a society do individuals count it as a source of satisfaction besides their own private accomplishments.

Charity and morality have been used for millennia to motivate people to subscribe to the idea of the common good.  But practical wisdom also needs science to draw the circle of common interests and to show how to manage them.  It is the science of evolution that has shown us how sociality has enabled humans to survive and become a more resilient species.  It is science that is alerting us to the risks of climate change.  It is science that exposes the harmful effects of poverty on the cognitive and psychological growth of children.

The unflattering fact in the search for the common good is that it takes a common threat or an unbearable indignity to make us coalesce and form a more socio-centric worldview.  In the last century, it took two devastating world wars and an economic catastrophe, with their respective fears of death and hunger, for people to become more aware of their common destiny.  It took the indignity of racial discrimination in America to enact laws in the sixties protecting the civil and voting rights of Black Americans and other groups.

But it took only twenty years to fall back to the individualistic conceptualization of the common good here in America.  The rise of stark inequalities in economic outcomes, health care, educational attainment, and child care, as well as our divisions in handling the risks of the pandemic and in understanding the climate challenge, are witness to how far we have veered from the sense of the common good.  

The common good is more than individual freedom and civil rights.  Actually, both are in peril without a social compact that gives citizens the basic means and opportunities so that they come to accept certain interests as common and worth striving for.

*Aristotle’s common good comes with the caveat that it was not all-inclusive.  It was only in reference to the interests of free male citizens at the exclusion of women and slaves.

Economics for The Common Good

I have borrowed this title from a book by the French economist Jean Tirole, winner of the 2014 Nobel Prize in economics.  Tirole’s goal is to show how a society can use the discipline of economics to pursue its common good, whatever that may be.  It’s like saying: let’s show how we can use the laws of aeronautics to fly from here to there.  In other words, Tirole reminds us that economics is a means to reach an end, not the other way around. 

That’s important because many, whether out of ignorance or calculation, identify economics very narrowly with institutions and practices that advance the interests of some people and ignore or hurt the interests of others.  It’s a social loss that most students leave secondary education with little understanding of economics.  This limited knowledge is largely responsible for the rise of populist economic ideas and for support of policies that worsen, instead of improving, the economic interests of society. 

Although economics can provide more informed and efficient answers to many practical problems, the road to employing it in the service of the common good is full of challenges and tough choices.  Knowing how markets work, being able to design economic contracts that optimize the interests of sellers and buyers, and having answers for the economics of climate change and the digital economy do not necessarily take us to the common good.

To grasp the potential and the limits of economics as a means to serving the common good, first, we need to understand the role of the market and the state.   Tirole reminds us that markets are mere mechanisms of exchange without an a priori purpose to serve this or that common good.  They have no inherent morality of their own; nor do they and by themselves produce the distribution of gains a society prefers.  Market failures and outcomes rather reflect the moral values of societies and the market rules they set. 

The economic roles of the market and the state are not mutually exclusive; they are complementary.  We rely on the state to guarantee contracts and property rights, to keep competition fair, and to correct market failures.  If we were all honest and had all the information we needed, transactions would be fair and the state would have less of a role to play.  Adam Smith believed that self-interest would make markets work well for both sellers and buyers.  But often self-interest veers into exploitation of other market participants.  Thus, a bank may engage in reckless lending and fail to redeem the savings of its depositors.  Or a firm may deliberately withhold vital information affecting the value of its stock and bonds.  In these and other cases where behavior and information are important, state regulation is the necessary remedy.

Just as the market is open to failures so is the state.  Political power can enable special interests to capture the state authorities that set and enforce regulation and hold individuals and firms accountable for the consequences of their economic actions.  The winners and losers of an economy are often determined by the political power special interests and groups can wield.

The main actors in markets are business organizations which operate under different organizational forms.  They may be non-profit entities, simple proprietary firms, cooperatives or corporations.  Each form serves the interests of a distinct set of stakeholders, the most dominant being the shareholders.  But do their interests serve the common good?  How do we align the interests of these organizations with the common good? 

Pursuing the common good is not cost-free.  We need to decide how the costs of negative externalities (like pollution, displacement of workers, community decline) are to be shared between private business and the state.  Society as a whole can also produce unwelcome externalities.  The more innovation-intensive and globalized a society prefers to be, the more turmoil will prevail in its industrial and labor fabric.  The more individualistic a society is, the more economic inequality will exist.  Again, the question is whether a society will ignore the negative effects of these choices or serve as a shock absorber and stabilizer.  The more of the burden that falls on the state, the more willing we ought to be to pay higher taxes. 

The group most affected by the structure and performance of an economy is the workers.  Tirole argues that a good economic policy should prioritize employees, not jobs.  Since we have very little control over jobs, it is the workers we need to protect as firms, industries, and even the whole economy transition to a new phase.  In the US, we have learned the hard way the costs of lacking a sound transition policy as the pace of offshoring of jobs intensified through the 1990s and beyond.  The anxiety of workers (coal miners for example) in industries in decline has a lot to do with this lack of a transition policy.

Tirole stresses that “Economics is not in the service of private property and individual interest, nor does it serve those who would like to use the state to impose their own values or to ensure that their own interests prevail.  . . . Economics works toward the common good: its goal is to make the world a better place.”

But after the impartiality of economics toward the market and the state is established, and its dedication to the pursuit of a better world has been declared, the challenge of defining a better world still bedevils us, along with the question of how we get there. 

As argued above, pursuing a common good requires that we accept a tradeoff.  Scandinavians trade high taxes for state services in education, health and retirement benefits.  The French trade stubbornly higher unemployment for job protection.  Many countries have minimum wage laws even if this may mean some unemployment for low-skill workers.  In the US, belief in the primacy of markets and private enterprise forecloses initiatives for universal health insurance.

Tirole’s book makes a persuasive case for the analytical rigor of economics and its ability to guide us toward more optimal solutions.  But at the end of the book our quest for the common good is still elusive.  It’s like we have been given a perfect airplane but we now have to choose our destination.   For this we need more than economics.

Child Poverty Is Everybody’s Problem

It is very encouraging and promising that there is a bipartisan movement to seriously address the scourge of child poverty in the US.  When I check international data to find where the US ranks on indicators like child poverty, I feel compelled to check and recheck the numbers and consult different sources.  I do this because I find it difficult to believe that a country this rich ranks so low in taking care of its young people and its future promise.

Let me say at the outset that there are different estimates of poverty, and child poverty in particular, so that one can come up with different numbers and international rankings.  For example, research out of the American Enterprise Institute disputes the US numbers used for international comparisons and contends that the US ranks close to other similar countries like the UK and Canada. 

Even so, in a country of extreme inequality, you can have mild national averages for a socioeconomic indicator that hide the very precarious state of considerable segments of the population.  Even after one adjusts the poverty levels by counting various government programs, the fact remains that there are pockets of significant child poverty in the US.  For example, the Children’s Defense Fund reports that one in six children in this country lives in poverty.  The ratio is one in three for Black and one in four for Hispanic kids.  Across the US, child poverty rates are significantly higher in lower-income states and states with significant numbers of people of color.  Even California and New York State have child poverty rates above the national average despite their overall prosperity.

The consequences of child poverty are grave in terms of economic impact, social mobility, health, cognitive and emotional development, and, of course, social adjustment and crime.  The Children’s Defense Fund estimates that the effects of child poverty amount to a loss of $700 billion of annual GDP.  Social mobility studies using the intergenerational earnings elasticity (IGE) have estimated it at about 0.5 for the US, meaning that roughly half of a parent’s earnings advantage or disadvantage carries over to the child.  The adult earnings of a child born into a poor household are thus substantially predetermined by the low earnings of the parents.  Unless we believe that forgoing part of a person’s potential for economic and social attainment is no waste, or that it does not matter for social harmony, we should have no difficulty acknowledging that investing in children can give a society the biggest payoff.
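The 50% figure is the standard intergenerational earnings elasticity: in the usual model, log child earnings equal a constant plus the elasticity times log parent earnings.  A minimal sketch of what an elasticity of 0.5 implies; the dollar figures below are hypothetical, chosen only to illustrate the mechanics:

```python
import math

# Illustration of an intergenerational earnings elasticity (IGE) of 0.5.
# Model: log(child earnings) = a + b * log(parent earnings), with b = 0.5.
b = 0.5
mean_earnings = 50_000    # hypothetical average parental earnings
parent_earnings = 25_000  # a parent earning half the average

log_gap = math.log(parent_earnings / mean_earnings)  # parent's log shortfall
child_log_gap = b * log_gap                          # half the gap is inherited
predicted_ratio = math.exp(child_log_gap)            # child earnings vs. the mean

print(f"Predicted child earnings: {predicted_ratio:.0%} of the mean")  # 71%
```

In other words, even starting at half the average, the child is predicted to recover only part of the distance to the mean; full mobility would require an elasticity of zero.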

It is important to understand that failure to fulfill one’s potential in economic and social attainment is the result of what poverty does to a child’s cognitive and emotional development.  The effects are the product of interactions between genetic and environmental factors that affect the brain and health of the child.  Adverse environmental conditions include poor nutrition and health care as well as problematic family and social situations.  

Thus, child poverty is very relevant to one’s adult life.  Two kids born with very similar genetic predispositions can have dramatically different adult lives.  The kid born into a favorable economic, family and social environment is a lot more likely to be successful later in life than the kid born into poverty and adverse family and social conditions.  Ignoring the effects of childhood experience on adult life impacts how we perceive and, more importantly, attribute success and failure in adulthood.  A much higher percentage of Americans than Europeans attributes success in adulthood to personal effort and merit, and more Americans than Europeans also agree with the notion that “people are poor because they are lazy or lack determination.”  When we fail to understand the link between childhood poverty and adversity and adult life, we are more inclined to oppose public support programs for adults.

The developmental effects of poverty stem from the fact that the frontal cortex is still developing during adolescence and is the last part of the brain to reach full maturity.  The frontal cortex is important for executive functions and for regulating our emotions.  Any impairment in its development impairs cognitive and emotional maturation.  It is known that poverty and an adverse environment during adolescence can hinder the development of the frontal cortex.  Because of its late-stage development, the quality of the frontal cortex depends less on genes and more on environment and nurturing.

Studies have shown that childhood adversity, including poverty, raises the odds of depression, anxiety, substance abuse, impaired cognitive capabilities, impaired impulse control and emotion regulation, antisocial behavior, and troubled relationships.  In other words, being born into poverty means you have a lot more barriers and challenges to overcome in order to succeed.  Since child poverty implies household poverty, inferior prenatal care and maternity conditions also contribute to possible problems in the development of the brain.

The negative effects of poverty on the physical and mental health of poor people are also accentuated under conditions of inequality.  What has been found to really matter is not the condition of being poor but rather the condition of feeling poor.  Children growing up in poor households and neighborhoods become aware of their low socioeconomic status, and this further contributes to their uneven development.

Beyond these negative effects afflicting American children, we need also to account for inequality in educational attainment due to more limited resources in school districts attended by poor children.  Therefore, the extent of the problem of child poverty is serious and complex.  Making progress in the war against child poverty requires investments that support robust educational opportunities and outcomes, good nutrition and good health care.

If we look beyond the US, the good news is that extreme poverty (which, of course, affects children) has declined from 50% of the global population in 1966 to 9% in 2017.  This is a tremendous improvement for mankind.  A lot of this achievement is due to the decline in births per mother from 5 in 1965 to 2.5 in 2017.  With fewer children, a household can take better care of its offspring.

A good society is not one that neglects its most vulnerable members.  The challenge for America is to become a global role model in line with its status as the richest country.

The Crumbling Wall of Separation

The United States has no formal religion.  It has no religious test for public office holders or an oath to divine authority.  And its Constitution (the First Amendment in particular) prohibits the state from favoring any church establishment.  If, however, you came from abroad, unfamiliar with this country, you would have to be excused for mistaking America for a country engulfed in symbols and practices more commonly associated with states steeped in religion.

When you convert your foreign currency into dollars, you will see the phrase “In God We Trust” emblazoned on the bills.  If you attend a public event, you will hear people pledging allegiance to a country “under God,” and if you witness the oath of public office you will hear it end with the words “So help me God.”  None of these religious manifestations existed at the creation of the United States, consistent with the letter and intent of the Constitution to keep matters of faith and state separate.  Had we lived back then, we would have noticed a lot of religious fervor and widespread practicing of religious duties among those early Americans, but no overt signs that the state and its civil servants were out to promote any religious creed.   

The fact that in this and other ways the US has moved away from its founding agnosticism is the second paradox about religion in today’s America.  In this, as in other respects explored in the previous blogpost, America differs from the countries its first colonists sailed away from to escape religious persecution and wars centuries ago.  What we are observing is that the separation of church and state is more and more interpreted by religious zealots as a way to keep the state out of religion, conveniently ignoring that the reverse is also part of this constitutional arrangement.  The more intense forays of religion into the “public square” are all the more interesting when we consider that the fraction of Americans affiliated with religious establishments, including Christian churches, has been shrinking.

Contrary to the complaints of religious activists that religious liberties are under attack, religious freedoms are very well protected, and well-coordinated litigation and political pressure have actually blurred the lines of separation between church and state.  No longer are religious establishments excluded from the allocation of public funds, even if those funds can be used to support direct religious activities (that’s exactly the case with the Paycheck Protection Program of the Covid-19 relief law).  Service to customers can be denied on the ground of freedom of expression and religion.  Health insurance coverage for contraceptives can also be denied to employees for religious reasons.  Government funds cannot be used for abortion, despite its legality.  At the behest of religious organizations, Republican administrations routinely deny aid to foreign agencies engaging in reproductive and abortion-related services to poor people.  The Trump administration went even further with its “Conscience Rule” that would have allowed medical professionals to deny care on the ground of religious or moral beliefs.

In general, we observe that religious activism has a two-pronged objective.  One is to strengthen the influence, if not the grip, of religious interests on judicial and government authorities.  The other is to shape the moral landscape of Americans in ways that conform to certain Christian beliefs.  Actually, the first objective is motivated by the second, which is also the one that should have us all worried because of its political and constitutional consequences.  A little history here is instructive.

For its first three centuries, operating within the Roman Imperium, Christianity grew on the strength of its moral and spiritual message with no state support.  Once, however, it was declared the official religion by Emperor Theodosius the Great at the end of the fourth century, Christian leaders sought to erase any religious competition.  By winning over or waging war on pagan rulers, Christianity succeeded in becoming the official religion throughout Europe.  This method of expansion to new peoples under the aegis of the state continued in the centuries of exploration and colonialism.  Evangelizing to others has been a time-honored mission of Christian churches. 

But capturing the state and using it as a tool to force the morals of any faith on others can undermine the principle of religious tolerance and eventually even the principle of democratic life.  No other bloc of American Christians has done this with greater determination than Evangelical Christians.  Despite Donald Trump’s serious moral flaws, his denigration of women and people with disabilities, and his harsh treatment of immigrants and Muslims, Evangelicals, and especially white evangelicals, embraced him as their champion and even savior in an almost messianic way.  In their desire to continue with a political regime that promised to advance their moral and religious agenda, they went so far as to forswear their allegiance to the democratic governance of the country by becoming perpetrators of the ‘Stolen Election’ lie. 

What is more worrisome, however, is the willingness of politicians, and even of a whole party, the Republican Party, to reciprocate the embrace of the Evangelicals.  Today, Evangelicals comprise the single largest religious bloc in the Republican Party.  A 2019 survey revealed that 78% of Evangelicals were registered Republicans, compared to 56% in 2000.  This strong party loyalty of Evangelicals is explained by the entreaties they see coming from Republican politicians.  Besides Trump, who assured them that “God is on our side,” former Secretary of State Mike Pompeo declared himself a “Christian Leader” on the homepage of the department’s website.  Other Republicans touting their loyalty to Evangelical priorities are Mike Pence, Ted Cruz, and Josh Hawley, all of them with presidential aspirations.

This party symbiosis with a single religious bloc is entirely new, at least in its intensity, in the recent history of American politics.  It is more reminiscent of those past alliances of political, government and religious leaders that led to intolerance, strife and violation of the political and civil rights of opponents.  The politicization of religion, if it continues, will gravely challenge the future of the American Republic as a multicultural, multi-faith, and open polity.  The end result will no longer resemble anything the Founding Fathers had in mind.

These trends, I believe, should have all democratic-minded Americans worried, irrespective of their religious or secular beliefs.  White Christian nationalism taking root in American politics is not just a paradox in a country in which the things that are Caesar’s ought to be separate from the things that are God’s.  It is outright dangerous, and, yes, un-American.

The Religion Paradox in America

One of Thomas Jefferson’s most prescient arguments for the separation of church and state was that, left alone to fend for themselves, religious establishments would gather strength from the solidarity and dedication of their members instead of growing complacent under the aegis of the state.  By arguing for separation of church and state, Jefferson (and his fellow Virginian James Madison) also hoped to distance the state from religious rivalries.

More than two centuries later, Jefferson’s argument appears to have been fully validated.  The US has strong and thriving religious establishments of all creeds, and religion is more prevalent in American society than in almost any other advanced industrialized country.  On the other hand, the expectation that separation would keep the state free from the encroachment of religion has hardly survived the test of time.  (More about this in my next post.)

Let’s start with religious adherence.  According to a 2018 survey, 41% of American Christians attended church services at least once a week, far ahead of their coreligionists in Western Europe.  A Pew Research Center survey also revealed that religion was more important in the lives of Americans than in the lives of Western Europeans.  When examined within the United States, these religious indicators are stronger in conservative than liberal states.  So, the question arises as to whether the more intensive religious commitment of Americans is matched with an equally strong performance in various social indicators that reflect the influence of moral and hence religious precepts.

To answer this question, I checked various international statistics of recent years.  UN data show the US with 20.8 abortions per 1,000 women, higher than in the more secular countries of Western Europe.  Do more religious states in the US have lower abortion ratios (abortions per pregnancy)?  The answer is yes.  Is this, though, due to religious attitudes or to stricter restrictions in these states?  The evidence I found suggests that abortions do not bear a significant relationship to religious creed in the US, whereas an international study revealed that abortion rates are lower in countries with more liberal policies toward abortion.

What about divorces and out-of-wedlock births?  In both, the US ranks ahead of almost all Western European countries.  Within the US, divorces and out-of-wedlock births are in general higher in the South, South-West and the Mid-West than in the more liberal states of the North East and West coast.

Next, I looked at suicide and drug death rates.  UN statistics show the US ahead of Western European countries in both causes of death.  With 314.5 drug deaths per 1 million people, the US is far ahead of second-place Sweden with 81.  CDC (Centers for Disease Control and Prevention) data show that both suicides and drug deaths are higher on average in the South, the Mid-West and the Rocky Mountain states.  New York ranks 23rd, with fewer drug deaths than 21st-place Florida.  West Virginia is number one in that sad statistic.

Poverty and incarceration are two closely related social ills.  A 2019 survey by the OECD (Organisation for Economic Co-operation and Development) places the US number 35 out of 37 developed countries in both overall and child poverty rates (that is, 34 countries scored better).  The US is also the world leader in incarceration, with a rate of 665 per 100,000 persons.  Poverty rates are higher in Southern and South-Western states, and incarceration rates are higher in Mid-Western and Southern states, regions ranking higher than the national average in religious adherence.

These results point to a paradox about religion in America.  Despite greater religiosity and closer affiliation with religious establishments, the US does not seem to perform better than countries known for their secular culture and politics.  More tellingly, even within America, states known for their religiosity do not seem to perform better than more liberal states.

What do these findings tell us?  Do they mean that stronger religious attitudes lead to worse moral behavior?  Can we argue that Americans are more morally challenged than the more secular societies of Western Europe? 

First, let’s put to rest one claim often heard from religious people: namely, that religious affiliation leads to a more moral life.  This has been an old canard against atheists, agnostics and secularists in general, though without any factual basis.  For example, in a speech given at the University of Notre Dame, William Barr, the former Attorney General of the US, denounced secularists for “moral chaos and immense sufferings, wreckage and misery” in the US.  The findings above show instead that many of the serious ills of American society originate in states with greater adherence to religion.  This association has already been established in the past.

But equally unfounded would be the claim that religious people are less morally inclined than others.  What if behind the association of moral outcomes and religiosity are other factors that explain this correlation?  Such well-established factors are less education, poorer economic and job conditions, and inadequate public services.  The statistics I looked at are better in Western Europe to no small degree due to wider and stronger safety nets that result in less poverty and social alienation.  These conditions in turn have a mitigating effect on poverty, crime, and suicides.  Better drug rehabilitation programs also result in lower incarceration rates and drug deaths.

In the US, some of the worst statistics are reported in more religious states which also happen to have significant pockets of lower educational attainment, weaker economic conditions, lower quality jobs and insufficient public services.  Many of these are the states where the “deaths of despair” have surged in the last 25 years (as explained in an earlier post).

What, then, are the really important conclusions we can draw from this analysis?  First, the virtue wars between religious and secular people are entirely futile and counterproductive.  Second, the road toward better societies runs through public policies that produce better educated citizens with more opportunities for economic advancement and greater support from the state in coping with the vicissitudes of personal life.

What Brexit Really Means

While America was gripped by the double anxiety of a raging pandemic and the desperate and unlawful attempts of an outvoted president and his die-hard and misinformed supporters to cling to power, the world also witnessed another sobering event: Brexit.  Great Britain at long last was leaving the European Union, making the English Channel once again more than a mere geographical divide.

I call Brexit a sobering event because to me it is one more reminder of how difficult it is for humanity to build inclusive and enduring bonds and stay together.  The tendency toward fragmentation reminds me of the biblical story of the Tower of Babel.  Men and women worked together to build it.  But then, as they were coming close to their goal, God decided to give them different languages.  Cooperation became impossible and the project collapsed.  Humankind would splinter into different factions, each going its own way.  The fact that God’s will was the culprit of this fragmentation does not make it any less unfortunate and, over time, destructive.

In today’s world, the role of a religious God is played by a host of humans playing god, equally determined not to let humankind come together.  These human gods take the form of ambitious politicians or selfish business people.  Fragmentation, that is, the “Us” versus “Them” divide, becomes for some the road to power and treasure.  Such gods wrought Brexit by telling an anxious working class of Britons lies and half-truths about Brussels bureaucrats, hostile immigrants and the promise of renewed glory for Old Albion.

The move to an untethered Great Britain harkens back to the idea of the nation-state: the idea that a country with greater national, religious, and cultural cohesion is a more effective administrative unit.  But the record is mixed.  The Greek city-states thrived as independent entities while external threats were effectively managed.  But disunited, they eventually fell to the armies of Macedon.  The disparate Hellenistic kingdoms became renowned cultural centers, but they, in turn, succumbed to the power of Rome.  The Roman Empire, first based in Rome and then in Constantinople, the Holy Roman Empire, the Ottoman Empire, the British Empire and the Soviet Union all ruled over dozens of peoples with different ethnicities, languages and creeds.

All these (and other) empires were militarily strong, kept land roads and sea lanes free, and protected their peoples from foreign enemies.  They did unify large parts of humanity, but under autocratic rule that did not always respect the rights of different ethnic and religious groups.  When dogmatic religious or political ideologies prevailed, these empires would also squelch cultural, intellectual, and artistic creativity.  When, years ago, I read Jacob Burckhardt’s history of the Renaissance in Italy, I could not help but realize how the Renaissance blossomed out of the independent city-states of Venice, Florence, Padua and Genoa, which let the arts and letters thrive by fending off Rome’s Papal power.  Soon after that, the creative explosion of the Renaissance emerged not in the cities of the Holy Roman Empire but in independent and more democratic Holland.  A century later, the political, social and commercial preconditions that led to the rise of free markets and capitalism first took hold in England, not in the rigid multi-ethnic monarchies of continental Europe.

So, the lesson of history is that large state conglomerations project power and stability but often stifle individual rights, creativity and innovation.  Nonetheless, this is not an argument one can raise in defense of Brexit.  Great Britain is not escaping an autocratic empire.  It leaves a union of democratic states, each with enough autonomy to foster creativity and innovation, and all dedicated to civil and individual rights.  The European Union is the first experiment in history in which independent democratic countries decided to cede some of their sovereign power in the interest of pan-European peace and a common future.  If the concept of the nation-state after the Treaty of Westphalia in 1648 was the right solution to bring an end to the religious wars of Europe, equally consequential was the Treaty of Rome in 1957, which established the European Economic Community as the solution for putting an end to disastrous intra-European national conflicts.  It is against this bigger purpose that any cost and friction of a unified Europe must be weighed.

It is, therefore, from this big-purpose project that Great Britain is walking away.  And what an irony this is!  The same Great Britain that had no qualms about ruling over half the world in the name of the Crown is now the country that balks at a European order in which it had an equal voice, a voice it denied its imperial subjects.

Around the time Great Britain was embarking on its empire-building project, here in America, a newly independent country was embarking on a novel experiment of forming a multi-ethnic democratic state within its borders.  Unlike the British project of joining foreign people from all corners of the globe under British rule, the American experiment was to become the home of people from around the globe governed by a constitution of the people.

As happens with all undemocratic empires, in time, the peoples that made up the British Empire split off in order to pursue their own national destinies, and Great Britain itself retreated to its geographical and national borders.  That’s a devolution not open to America.  Here we are destined to live together – multi-racial, multi-ethnic, multi-cultural and multi-creed.  We have no internal borders behind which we can retreat and live in racial, ethnic and religious purity.

That’s why the trends of racial friction and the rise of religious and white nationalism we have seen in recent years should be sobering to all Americans.  The American project, like the European Union project, is to teach people the possibilities of “We” in contrast to the fear of “Others.”

So, to me Brexit means walking away from building a “We” world just like the splintering of Americans by race, creed, or any other divisive idea is walking away from the original American project of building One out of Many.

Markets and Individual Rights

When Twitter permanently cancelled President Trump’s account and Facebook suspended it indefinitely, many Americans, most of them liberals, gave their resounding approval.  These drastic actions came, of course, after a long stream of lies and disinformation the incumbent president launched against the integrity of the presidential election outcome, and after his reprehensible and inflammatory speech exhorting his supporters to march on the Capitol and take back “their country,” with the shameful and insurrectional actions that followed.

Silencing someone like Donald Trump, who has masterfully used social media to corral millions of his followers into a parallel universe of concocted facts ranging from the coronavirus pandemic to voter fraud, surely has a salutary effect on all those who stand for science, facts, civil rights and democratic norms.  But before we celebrate Messrs. Mark Zuckerberg (of Facebook) and Jack Dorsey (of Twitter) as noble defenders of American democracy, we need to connect all the dots and see where their actions can take a democratic and pluralistic society.

Both Zuckerberg and Dorsey delivered a punch of extraordinary force because they have immense control over the market for information exchange.  So, let’s first disassociate the market as a mechanism for exchange from its users.  Markets are often denounced by the left as tyrannical and extolled by the right as liberating.  Both sides are barking up the wrong tree.  Focusing on the market is most often a convenient way to take our eyes off its users, the true suspects behind any good or bad thing markets bring to human societies.  Those who are truly responsible for the beneficial or harmful consequences of markets are their users: individuals, enterprises, and state authorities.

There is a misconception that, if left free, markets are innately capable of sorting out and facilitating the exchanges people (the consumers) like to have.  But the reality is that unregulated markets do not necessarily lead to the free exercise of individual choices.  Even in the absence of a state monopoly, a market can be cornered by one or a few private entities.  Facebook and Twitter, as well as Google, Apple and Amazon, which have denied services to Parler (another social media firm), are practically the owners of the information market.  Donald Trump is a victim of their huge market power.  Other times, individuals are locked out of a market for political, religious or purely discriminatory reasons (see below).

Because so much of one’s life is affected by the opportunities to use markets to satisfy material needs and wants as well as for self-expression, the right to participate in markets should be considered an individual right.  Of course, like other rights, market rights are subject to limitations when this is necessary for the common good.  The internet and social media, in particular, have made the navigation of market rights and the common good extremely challenging.  Disinformation and incitement can mislead people to take actions harmful to themselves and others or drive people to violent acts.  It is this concern that led Facebook and Twitter to exclude Donald Trump and thousands of conspiracy theorists from their platforms.  But this reaction raises legitimate questions about the enforcement of market rights.

Over a year ago, I wrote about the case of the Colorado baker who refused to bake a wedding cake for a gay couple.  Similarly, the firm Hobby Lobby refused coverage of contraceptives to its female employees as a matter of religious freedom.  In these cases, either a certain class of customers or employees were excluded from a particular market but the Supreme Court found the exclusions consonant with the First Amendment.  I then raised the question as to what would happen if a market was controlled by a single or a few players, all imposing the same exclusions.  How would the individual right to a market be protected in the absence of market alternatives or poor substitutes of these markets?

Well, this conjecture seems to have been realized in the case of Donald Trump.  And this poses a problem for conservatives and liberals.  Conservatives cannot celebrate the right of the baker and Hobby Lobby while they condemn Trump’s expulsion from the social media of his choice.  And liberals cannot rejoice at Trump’s rejection but deny the baker and Hobby Lobby the same right of refusal in their respective markets for cakes and workers.

And just because I like to provide historical perspectives, let’s not forget that prior to the overturning of school segregation in the 1954 case of Brown v. Board of Education of Topeka, Black Americans were excluded from markets that were reserved for white clienteles.  Very recently, Newsday, a local newspaper in Long Island, published an extensive exposé on how real estate brokers had systematically tried to keep Black home buyers away from predominantly white neighborhoods.  All these cases suggest that even a country with a professed adherence to free markets has engaged, and continues to engage, in the practice of market exclusions for reasons rooted in political, religious, or racial ideologies.

There are a couple of lessons in the above examples.  First, we have to recognize that markets can violate individual rights whether they are run by private or state interests.  Markets can be dominated by private interests and be as illiberal as markets dominated by state authorities.  The Chinese are already drawing an equivalence between their state control over information and that exercised by private entities in the US.  Second, individual rights are not necessarily protected when we allow unfettered market freedom.  The Trump administration, like previous Republican administrations, was particularly friendly to mergers and acquisitions.  (Democratic administrations, though to a lesser extent, can also be faulted in this regard.)  Now Republicans find that one of their own, in addition to thousands of their supporters, has fallen victim to the market power of a few mega-players.

The expulsions from social media, with all their ramifications for the right of people to exchange information, are bound to force a reckoning and, hopefully, a public debate that should lead to a new legal and social contract for the regulation of information markets.  It is hard to expect or accept that a democratic society would grant the power of information gatekeeper to a few private entities, some of them dominated by a single owner or a few owners.

The Fragility of American Democracy

January 6, 2021 will be a day of infamy in the annals of American democracy.  After four years of telling Americans not to believe in facts, not to trust anyone but him, and never accept electoral defeat as fair, Donald Trump unleashed his supporters so that they would march on to the Capitol, invade it and desecrate it in the name of fantastical, and yes, fascistic beliefs.

In such moments, emotions run high and the desire for immediate relief finds an easy outlet in naming Trump the villain and arguing that these events were to be expected given his behavior all along.  That he is the principal instigator is undeniable.  But there are millions of Americans who have clung to his egregious, untrue and cruel utterances.  Trump found fertile ground on which he sowed his seeds of hate and resentment of others, and he also found a constitutional order and electoral process that were ripe for abuse.  It is timely, therefore, to reflect on the fragility of American democracy and what it means for its future.

The foremost challenge to the application of democracy (the rule of the people, or demos) is who is accepted and trusted as a member of the demos.  From the emergence of democracy in the Greek city-state, admission to the demos was restricted and incomplete.  This democratic deficit continued in the Roman Republic and later in the modern reappearance of democratic institutions in England and continental Europe.  So, the first point I wish to make is that America, as a multiracial and multi-ethnic country, has experienced severe political convulsions each time its demos expanded due to immigration or voting rights laws.

To understand the last four years and the insurrection episode of this week, we need to understand the history of American democracy in its true record and not its idealized narrative.  It is a history of tortuous and painful expansion of its demos.  A history with periods during which the dominant fraction of its demos has been unwilling to trust the other side with a role in the public affairs of the Republic.  This became very clear in this last presidential election.

American democracy was launched with representational deficits of its own, since neither slaves nor women were admitted to the American demos.  The first crisis of democratic governance came after a civil war that abolished slavery, when freedmen were granted voting rights.  The transition to this more inclusive demos came, however, to an effective end in the South with the collapse of the Reconstruction effort in 1877.  What followed was a long period of vote suppression and compromised citizenship rights for Black Americans.  Fear of sharing political rights with less desirable migrants from Eastern and Southern Europe also triggered the immigration law of 1924.

The next political realignment and backlash to a renewed affirmation of voting rights came with the adoption of the Civil Rights and Voting Rights Acts in 1964 and 1965.  Southerners punished the Democratic party for sponsoring these Acts and turned to the Republicans.  It is no secret that the notorious Southern Strategy of Richard Nixon was based on racial fearmongering.  Its most outspoken early champion was George Wallace, Alabama Governor and presidential candidate.  The 1965 immigration law that opened up immigration to Latin American and Third World countries would further strain nativist attitudes regarding the composition of the demos, attitudes which would flare up many years later.  Systematic efforts to suppress the vote of undesirable voter blocs are the manifestation of the systemic resentment toward an expanded demos.

By the time Donald Trump appeared on the political stage, covert and overt resentment of the gains of minorities of color and immigrants was already at play.  Trump’s opening gesture to white conservatives was the denunciation of Latino immigrants as criminals and free riders on welfare benefits, soon followed by bans on the immigration of Muslims.  (However, whites from Norway were very welcome.)  Economic and social backsliding may be the visible catalysts of white alignment with Trump, but resistance to sharing political rights and control with people deemed inadmissible to a pro-white American demos is a better suspect.  The specter of an impending minority status for whites versus all other demographic groups together does not provide relief to the existential anxiety of white nationalist voters either.

The second weak point of American democracy is its constitutional order and the electoral process.  The Framers of the Constitution lived in an era that was steeped in Greek and Roman classicism and the admiration of virtue in public life.*  Their idea of a politician was that of a person of honor who placed the public good ahead of his/her own.  Since, however, the Framers were not hopeless idealists, they inserted a heavy battery of checks and balances, lest a branch of the government, most crucially the President, develop authoritarian urges.  This constitutional order presupposes, however, that public figures have the integrity and the courage to exercise the checks and balances granted to their office.  What if, though, these guardrails collapse under the weight of intimidation and abuse of authority as practiced by a President like Donald Trump?

The presidential electoral process also involves multi-stage checks and balances.  Between the point the people vote and the point the winning candidate assumes the presidency, the process provides a role to many actors and confirmation rituals: election boards to validate the election results; state legislatures to certify the electoral votes; and Congress to certify each state’s certification under the watchful eye of the Vice-President.

Along this road to the final conclusion of the electoral process, there are ample opportunities for lawsuits and court rulings, election board challenges, state legislature shenanigans, and finally congressional malfeasance.  Under the right combination of majorities (think of a Republican majority in the present House), Congress can void the electoral votes and choose its own candidate even if he/she lost the election.  Whether or not most of these challenges and irregularities are subject to the review of superior courts, the fact remains that a determined and malevolent presidential candidate has the opportunity to mount one challenge after another until the final decision has been sullied enough to cast doubt on the electoral outcome.  And that’s exactly what Donald Trump and his enablers did.

The Trump presidency has exposed the weaknesses of the American democracy in more and scarier ways than any other before.  Apart from reinforcing the integrity of the electoral process, not an easy project in itself, the most consequential need for all of us is to master the public virtue the Framers expected in order to restore commitment and trust in the idea of an inclusive demos, free of fear and resentment of the other.

In less than two weeks, America will celebrate Martin Luther King, Jr. Day.  What better occasion, then, than to invoke his summoning words as to what an all-inclusive America should look like?

“And so even though we face the difficulties of today and tomorrow, I still have a dream. It is a dream deeply rooted in the American dream.  I have a dream that one day this nation will rise up and live out the true meaning of its creed: “We hold these truths to be self-evident, that all men are created equal.”  I have a dream that one day on the red hills of Georgia, the sons of former slaves and the sons of former slave owners will be able to sit down together at the table of brotherhood.  I have a dream today!  …

America should let no one, least of all someone like Trump, take this dream away.

* In his book First Principles, Thomas Ricks reports that the word virtue can be found several thousand times in the writings of the founding fathers, even more often than the word freedom.

A Year in Search of A Name

What do you call a year like 2020?  The year of the pandemic?  The year of grief and death?  The year reason gave way to delusion?  The year American democracy almost died?  All we know for sure is that this is the year humanity really wants to forget.

My very first post of 2020 was about the first twenty years of the 21st century and how, for the first time in three centuries, these years were free of widespread wars and human carnage.  Little did I know that an invisible thing of nature we call a virus would spread out of China and torment, by sickness and death, the rest of the world.  But that was not the only scourge Americans had to endure.  Besides the coronavirus that chipped away at our health, we had to confront another virus that chipped away at our faith in truth and trust and, eventually, in democracy.  It is hard to tell which virus was the more damaging.

The study of history is informative not so much for satisfying our curiosity about events of the past, but rather because it teaches us how little human behavior has changed over time despite our advances in science and technology.  In 1347, Europe was hit by a plague that came to be known as the Black Death.  It was responsible for depopulating Europe by up to 50%.  Like the coronavirus, it had come from Asia.  In many ways, its victims experienced familiar emotions that we saw amongst us: grief, anger, denial and despair.  Instead of blaming China, Black Death-era Europeans blamed and scapegoated demons, witches, and a frequent target, the Jews.*  Instead of being treated with bleach, hydroxychloroquine and ultraviolet light, the Black Death was exorcised with maniacal public dances, rubbing the body with chopped onions or dead snakes if available, self-flagellation, or taking arsenic and mercury.

The similarities in reaction are stunning.  But how can we justify them in our contemporary world, which is light-years ahead in scientific, medical and technological knowledge?  What do we make of the denial in the face of so much evidence?  I have no other explanation but to admit that in super-tense situations emotions prove to be stronger than reason for too many people.  In the year 2020, large swaths of the population either refused or were unable to inform reason with knowledge and information from science to control their emotional instincts.  For many Americans, risk aversion and precautions were subordinated to the political ends of one person, Donald Trump, President of the US.

While the pandemic was raging through America, spreading sickness and death, another virus, political in nature, started to take hold.  Its symptoms were mistrust in the integrity of the upcoming presidential election and outright paranoid conspiracy theories of voter fraud.  Practices like voting by mail, not challenged heretofore, including in the recent primary elections, were now suspected as instruments of electoral malfeasance by American citizens and foreign operators, including a dead one!

Between election day, November 3, and December 14, when the electors cast their votes, Americans lived through weeks of one challenge to the presidential vote after another; challenges that engulfed lower state courts and state supreme courts, the Supreme Court of the USA itself, appeals to state legislatures to dismiss the duly elected electors, even demands that election officials undo what had come out of the ballot box.  At first glance, these challenges could charitably count as gestures of good faith.  But the ongoing and strenuous insistence on overturning the people’s verdict in the face of no supporting evidence, and even after all legitimate means had been exhausted, has to be called by its true name: a wishful call for a coup d’état.

It is tempting to draw a parallel between natural and political viruses.  A natural virus can infect healthy and unhealthy people, but it is most dangerous to those with underlying health conditions.  In the same way, a political virus can infect anyone, but it mostly thrives in people with underlying conditions, like financial decline, diminished political power and social status, identity crisis, and fear of the “other.”  There are millions of Americans with these underlying conditions, and there is much blame to go around.  But then there are the super-spreaders of division, conspiracies, and delusions that, like parasites, thrive on these underlying conditions.  As a result, many Americans have resorted to the easy comfort of conspiracy theories and fantastical promises of a coming restoration to their rightful place.

So, there we are at the end of this horrible year, facing these two viruses.  Fortunately, we have the science to fight the one that came out of nature.  By now, vaccines are available, and sooner or later Covid-19 will no longer threaten our physical existence.  What, however, about the political virus?  How will America inoculate itself and get rid of it?  What will it take for this to happen?  There are always voices of optimism that put their faith in the better angels of America, or the eventual return of common sense, or the yearning for reconciliation.  But do we have a catalyst for any of these beneficent forces to come to our rescue?

If there is a chance to restore the political health of America, responsible political and thought leaders have to step forward and restore reason and trust in the public square.  Paranoid, delusional and fraudulent ideas should not be given the oxygen to poison the body politic.  Failure in stemming the corrosive influence of the political virus will, in the end, turn out to be much more devastating than the coronavirus.

As I am headed toward the exit door of this year, I am happy to see that science triumphed, relieved that American democracy survived, and very aware that resilience and reason are worth keeping alive.     

* In the middle of the Black Death epidemic hysteria, Pope Clement VI, to his credit, tried, but to no avail, to tame the anti-Semitic sentiments by appealing to reason.

* The information about the Black Death comes from the book Apollo’s Arrow by Nicholas Christakis, a doctor and professor of social and natural science at Yale University.