Longing for Rootedness in a Globalized World

Since their beginnings, humans have moved about the globe.  “Out of Africa” is the phrase that epitomizes the origin and outward movement of the last surviving human species, Homo sapiens.  Even after farming and agriculture were invented, thus binding humans to the land, people kept moving to new places out of economic necessity and opportunity.  Yet however strong our drive to move and relocate has been, just as strong is our desire to grow roots and thrive in a place we can call home.

Despite periodic waves of mass migration, for thousands of years most people spent their lives very close to home.  This was mostly due to the way most economies were organized.  Production took place at a relatively small scale, so it took only small amounts of savings, and hence capital, to support it.  In other words, capital raising and capital use were mostly local.

All this changed with the advent of capitalism.  Mass production required significant investments in technology and, thus, capital.  Not all capital could be found locally.  The financial system had to grow to facilitate the movement of surplus capital to places where returns were higher.  The growth of colonialism made capital mobility more intense because it was also very profitable.  Colonial countries, like Holland, England and France, used their surplus capital to fund investments in their colonies.  Utilizing strong protectionist measures, colonial powers made sure these overseas investments were highly profitable.  They also made sure that critical production technologies and facilities remained in the motherland.  This latter tactic was crucial for keeping their working populations content.  As long as there was enough demand for labor and handsome profits were repatriated to the motherland, everybody was happy.

And then all this changed again.  And it changed in our lifetime (at least for those of us born in or before the 1980s).  First the UK and then the European Union joined the US in the liberalization of their financial systems and the movement of capital across the globe.  The dissolution of the Soviet Union and, most importantly, the opening of China to foreign investment and business created new profitable opportunities for capital to pursue.  In the ensuing decades it became possible for financial firms to amass capital from all corners of the world and redeploy it anywhere on the promise of hefty returns.  Not only were savings (surplus capital) competed away to the most profitable uses, but so were factories and labor.  As the new multinational economic order gathered steam, old industrial towns in the American Midwest, in England, France and elsewhere started to lose their factories, their economic vitality, and the institutions that made up their social fabric and identity.

We still live in a world in which rootedness is location- and people-specific.  It gives us shared values, common culture and shared identity.  These are not easily transferable from one place to another.  And they take years to take root in new locales.  So bringing rootedness into the picture makes us realize that the terms of mobility for capital and labor are not equal.  Labor mobility comes at a high price: the diminished communities it leaves behind and the struggle to grow roots in new places.  That’s why our desire to move with economic opportunities runs up against our desire for rootedness.

The mobility of capital clashes with rootedness in other ways as well.  Capital is owned by firms and investors.  Their interest is to get the highest returns wherever possible.  That, however, is not what serves the interests of rootedness.  In an economic sense (as told by Giridharadas in Winners Take All), capital owners belong everywhere whereas working people belong somewhere.  Corporations claim they consider their local communities among their stakeholders.  But when investment in a community lasts only as long as it secures acceptable returns, how serious is capital’s loyalty to that community?  If the interests of capital owners do not necessarily align with the interests of local communities, who is there to mediate the friction?

Next consider the divide between rural populations (mostly farmers and blue-collar workers) and urban and suburban populations (mostly educated professionals).  The former, by the nature of their occupations or the limitations of their skills, are homebound and, hence, very much interested in the stability of their communities.  In the cities and suburbs, educated professionals and cadres of workers have greater opportunities to share in the bounties of the global economy.  No wonder those who live in rural areas tend to be more nationalistic whereas city folk tend to be more cosmopolitan.  Again, we have a divide between those whose interests are somewhere and those whose interests can be everywhere.

All these thoughts suggest that despite the tacit and widespread approval of capitalism, even in communist countries like China and Vietnam, there are aspects of it, most notably unfettered capital mobility, that are not easily compatible with our human longing for rootedness. 

The mistake we often make in evaluating a political, social or economic system is to judge it on the basis of some distinct advantage while ignoring its full implications once its various workings come into full gear.  This is the case with economic globalization.  It sounds great in textbooks and economic models that treat resources like capital and labor as if they were pieces on a chessboard, while failing to consider the reality of human nature and, therefore, the collateral costs.

This piece cannot end, though, without considering the salutary effects of globalization.  Thanks to it, billions of people in previously undeveloped countries have been lifted out of poverty and misery.  Nonetheless, the people who have been displaced in the industrialized countries will continue to remind us that this human progress cannot be had at their cost, all the more so when the globalists are the last to pay a price.  Reconciling the interests of the people of everywhere with the interests of the people of somewhere should be high on the global agenda.

* This piece on rootedness was inspired by a book review of The Future of Capitalism: Facing the New Anxieties, by Paul Collier.

A World of Difference in Fighting the Pandemic

Earlier in the fall, I was asked to lead a discussion on the coronavirus pandemic among faculty associated with the Center of International Financial Services and Markets of the Zarb School of Business at my old academic home, Hofstra University.  Given the scope of the Center, I proposed to focus on questions with an international perspective.

Admittedly, the discussion was based more on information gleaned from news reports than on hard evidence.  But still the questions themselves were worth considering as a starting point.  The following were the questions we discussed.

Did the structure of health care systems (national vs. private) and health insurance (universal vs. optional) matter?  Broadly speaking, the answer appears to be no.  Countries with diverse systems of health care delivery and health insurance suffered devastating surges in coronavirus cases and deaths.  This was as true for Europe, with its emphasis on national health care and universal insurance, as it was for the US, where health care is mostly private and insurance, even after the Affordable Care Act (commonly known as Obamacare), is not universal.  There was, however, one area where the structure of the health care system made a difference: in the production of more or less equitable outcomes.  The lack of universal health insurance and a weaker emphasis on public health policies in the US have left significant numbers of Americans, especially minorities and poorer people, with severe comorbidities that resulted in a greater frequency of severe Covid-19 symptoms and deaths among these groups.

Did a national vs. a regional response matter?  Most countries in Europe and Asia, as well as Canada, adopted a national strategy in designing methods and tactics with respect to testing and tracing, wearing masks and practicing social distancing, and imposing restrictions, including lockdowns.  These countries managed to contain the first wave better than the US, where President Trump left this task to the states.  When the second wave came, we again observed that countries following a national policy were more successful in reining in the pandemic.  An area where the type of policy, national vs. regional, mattered a lot was the opening of schools.  Although a difficult and challenging feat, countries with a national policy provided a more consistent and coordinated approach, most often favoring in-person classes, than countries with regional approaches, like the US.  Countries that opened their schools proved to have made a more judicious use of the evidence that the infection rate is extremely low among the young.  They also seemed to appreciate more the role of socialization in the cognitive and emotional growth of children.  A cacophony of special and political interest voices bungled this issue in the US.  As a result, the US may carry a significant educational deficit for many years to come.

Did the individualistic vs. the socio-centric culture of a country matter?  Asian countries, with their greater emphasis on the interests of society over those of the individual, were better able to mobilize their populations to abide by government guidelines.  Interestingly, if we look across countries with a common cultural background, in this case an Anglo-Saxon one, we can see that the display of individualistic attitudes, which run strong in this culture, was not uniform.  In Canada, Australia, and New Zealand, people displayed greater compliance with state mandates than in the US and the UK.  Especially in the US, the zeal to exercise individual rights, often driven by political preferences, made the coordination of responses to the pandemic exceedingly challenging and in some states outright ineffectual, with devastating health results.

Did trust in the authorities matter?  Needless to say, trust in the government and the medical and scientific community is extremely important for an effective response to any health crisis.  It was an unfortunate coincidence that the pandemic hit the US during an election year.  The toxic political discourse, coupled with unusually low trust in government, made the public’s cooperation with federal and state authorities extremely fraught.  On top of that, a proliferation of conspiracy theories drove dangerous wedges into any attempt to corral public support for any policy response to the pandemic.  We don’t need to make any international comparison to conclude that mistrust played an enormous part in the health calamity experienced in the US.

An important point that emerged in the discussion was the role of policy and leadership.  Policy tools and effectiveness are, of course, affected by culture, institutions, and the availability of resources.  But coherence, steadiness, and execution can matter even more.  Countries like China, South Korea, New Zealand and Germany, to mention a few, responded with well-thought-out and well-executed policies.  Again, the US was a laggard in this respect.  The choice of policies also matters.  A case in point is the experience of the Nordic countries, which share a lot in culture and institutions.  Norway, Denmark and Finland formulated and executed more aggressive policies.  Sweden opted for a weak policy stance and paid a heavier price.  Leadership has to do with the clarity and steadiness of message, resoluteness, and leading by example.  None of that happened in the US.  If anything, the US leadership provided fodder for counterproductive behavior that exacerbated the negative health outcomes.

How different countries responded to the pandemic had serious implications for the meaning of “failed state” and, hence, for the brand image of a country.  Small countries, like New Zealand, buttressed their image as competent and successful states, whereas the US lost significant brand capital.  We can also say that soft power, like public trust and sound policy design and execution, proved superior to hard power measured in military and economic terms.

Finally, if there is a common international theme in fighting the pandemic, it is, first, the compassion and sacrifice of health care workers, and, second, the speed and ingenuity with which the scientific community across the globe responded to produce the vaccines needed to inoculate us against the coronavirus.

Disclaimer: The opinions expressed in this post do not necessarily reflect those of the Center, the Zarb School of Business, or Hofstra University. 

“Make Liberals Cry Again”

It was only a couple of weeks before the elections, and my wife and I were driving to a nearby town when a pickup truck came up next to our car.  We noticed it was emblazoned with pro-Trump banners.  But what we saw at the back of the truck when it pulled ahead of us was more telling.  In big letters the banner proclaimed “Make Liberals Cry Again.”  I turned to my wife and said, “Perhaps we are the reason Trump is so popular.”

In about two months, Donald Trump will no longer be president.  But those who voted for him, and more importantly, his loyal base will still be with us.  Although their candidate lost, I want to tell them that all Americans, conservatives and liberals, have reasons to cry, but not entirely for the reasons the banner intended. 

So, if Trump’s supporters want me to cry, although my candidate won, I will.  I will first cry for the moral bankruptcy of the Christian Right (mostly Evangelicals), which against all evidence chose to entrust its religious political agenda to a man whose personal behavior as a civilian and politician has little to do with Christian values.  If history books are any guide, it’s not the first time that religious zealots have succumbed to the rule that the end justifies the means in order to advance their agenda.  Trump promised to deliver the Supreme Court and support the Christian Right’s agenda, and to its members nothing else mattered.

I will also cry for the blue-collar, working-class, and rural Americans who have been abandoned by Democrats and Republicans alike.  The Democrats bought too much into the idea that opening markets and trade across the globe would be good for all workers.  They signed up to the global mobility of capital before they had a chance to build the safety net and the skills upgrading needed to ease workers into the new competition.  That was a policy failure.  The Republicans, though, deserve a lot more blame for deliberately distracting working-class and rural Americans with the vision of an old America and with moral and cultural wars while the world and social fabric around them were deteriorating into decaying towns, low-paying jobs, work without fulfillment, and social isolation and degradation.

The Republican establishment feigns surprise at Trump’s success, as if he were not the natural culmination of their Southern Strategy going back to their own Richard Nixon.  That was the strategy that played the race card to lure voters in Southern states away from their support of the Democratic Party.  Instead of celebrating the Civil Rights and Voting Rights Acts of 1964 and 1965, Republicans used them against the Democrats.  Then they doubled down by turning the Supreme Court decision in Roe v. Wade, which legalized abortion, into another cudgel to wield against Democrats.  By the time Donald Trump appeared on the political stage, race, religion, and guns (let’s not forget them) had already been served on his political platter.  All he had to do was add a few more scoops and mix in anti-immigrant slurs and fearmongering.

I will cry for the ideological blindness of the libertarian wing of the Republican Party that still preaches the virtues of unfettered capitalism, free of regulation and full of individualistic behavior unconcerned with any negative social spillovers.  Don’t they see that millions of working-class Americans (including their white base) are rejecting this strain of capitalism?  These economically disadvantaged Americans do not care about deficits and debt, nor do they care about the virtues of international trade.  What they care about is keeping their old jobs, the economic and social vitality of their towns, and access to good education and health care.

I will cry for the American plutocracy that is more interested in amassing personal wealth and competing in grandiose donations than in sharing more fairly the gains of economic growth with everyone else for the well-being of American society.  Why would a society need generosity out of concentrated wealth if it had the resources to take care of its own needs?  Imagine, for example, more support for living wages, for easing the transition of workers to new jobs, for promoting healthy habits and environmental safety, and for accepting labor unions as partners instead of strangling them out of existence.  With such socially minded policies, we would have less poverty and financial insecurity to worry about, a healthier populace, better-trained workers, and a greater consensus in pursuing the interests of all stakeholders.  If corporations and the wealthy did not resist paying the fair share of taxes that their peers of past generations paid, the government would have the resources to step in and take care of the detritus left by the creative destruction of capitalism.

And I will cry for the educated and professional elites, conservative and liberal, who often seem to have less in common with their fellow Americans in flyover country than with their peers in the global elite.  The sea of red dots we see in the voting landscape of America outside the urban and suburban areas is where the other America lives.  It is the country of rodeos and agricultural fairs.  It is the land of country music and square dance, and the country of Friday night high-school football games.  It is the land that feeds us and gives its sons and daughters to defend us.  It is the country of rural and working-class people who do not think a college degree is needed for a better life, as long as American capital keeps investing in people and not just profits.

Finally, I will cry for the fact that white America has yet to come to grips with what the legacy of slavery and the ongoing under-the-radar discrimination have done and are doing to the lives and opportunities of our fellow black Americans.  Coming from another land, I took years to educate myself about the racial past and the contemporary reality affecting minorities of color in America.  If we can understand how the lack of opportunities, adequate services and a safety net can bring personal plight and social disintegration to working-class whites in the American heartland, and how unfair it would be to make their condition the core of prejudice and stereotyping, then we should have no difficulty understanding how unfair it is to do the same to any other group of people, no matter their race or background.

And now I’ll wait for you, Trump voter, to tell me what you would cry for.

Our Lost Appreciation of the Communitarian Ethos

In the last post, I gave a historical overview of America’s path from the self-centered sentiment of the Gilded Age to the egalitarian and socio-centric sentiment and activism of the Progressive Era, which lasted all the way to the 1970s, and then the slide to a “me-first” attitude under the wave of individualism that has prevailed over the last 40 years.

Striking the right balance between the interests of the individual and those of the community has always been one of the greatest challenges of human societies.  Individualism ensures basic personal freedoms that enable each person to pursue self-actualization and preserve self-dignity.  Dedication to a communitarian ideal ensures the preservation of the social norms and institutions that enable individuals to thrive within a system of shared values, reciprocity and pooled resources.  It is ultimately about the balance between rights and responsibilities.

When the French diplomat Alexis de Tocqueville traveled through America in the first half of the 19th century, he noticed that Americans exercised a rugged and resourceful individualism that, at the same time, was mindful of what was good for their communities.  He called that ethos “individualism rightly understood.”  After the excesses of the Gilded Age had shattered this compact, the Progressive Teddy Roosevelt proclaimed, “We grudge no man a fortune in civil life if it is honorably obtained and well used.”  But “We should permit it to be gained only so long as the gaining represents benefit to the community.”

Today, we are hardly entitled to argue that we live up to Tocqueville’s description of Americans or Teddy Roosevelt’s admonition.  The living example of our failure to understand individualism the right way is the reckless refusal of large segments of Americans to take precautions (like wearing masks and socially distancing themselves) in the midst of a pandemic that rages uncontrollably, to a great extent, because of this very attitude.  And we fail to see that the inordinate accumulation of wealth persists while millions of Americans get by with high rates of morbidity, subpar health insurance and care, uneven educational attainment, unfulfilling jobs, and the scourges of homelessness, drugs, opioids, and suicide.

So how did we arrive at this disconnect between the interests of the individual and the interests of the community?  I would point my finger at a two-faced suspect: the exaltation of individual achievement along with a gradually diminishing awareness of the importance of the community and society at large.

Common-knowledge examples will suffice to make the point of our society’s excessive attribution of success to individual achievement instead of collective effort.  CEOs are held up as singularly responsible for their corporations’ success, as if all others do not matter.  And yet we often observe that corporate success and CEO compensation (a metric of their value) do not coincide.  Famous actors and actresses are lavished with super-generous contracts, as if all that matters is their performance and not that of the ensemble.  Individual players in soccer, football, baseball and basketball are paid stratospheric salaries without necessarily leading their teams to championships.  All these examples are indicative of a cultural attitude that awards a premium to individual effort and performance but ignores the collective body.

Once this attitude has taken hold, we are not far from the point where each one of us starts to believe in our unique value, supported by personal talents and meritorious effort, beholden to no one else’s help or to any stroke of luck.  Like Ayn Rand’s Atlas, we “shrug off the feckless takers,” in this case society’s weaker members, those undeserving “takers.”

What we ignore is that without successful clans, tribes, villages and societies we would not be here today.  The truth is that our distant ancestors, those often disrespected hunter-gatherers, are the unsung heroes of the human species.  We exist because of their wisdom to develop group survival attitudes.  They came to understand that without pooling and sharing their limited resources, especially food, they had very little chance of surviving.  They realized that an individual’s fitness to survive their very harsh and hostile environment depended on the fitness of everyone else in the group.  Better to have more good hunters than fewer.  Better, therefore, to share food with others and nurture all-around strength than to hog one’s entire kill.

Thankfully, our distant ancestors responded to our species’ strong pro-social instinct that fosters cooperation.   Research has shown that our species’ success does not lie in our raw intellect or reasoning powers but in our capacity to learn from one another and then spread our knowledge to others as well as to future generations.  Therefore, affording every member of a society the opportunities and means to remain engaged in the common effort gives a society the advantage to improve holistically and advance to the benefit of everyone.  Societies with high degrees of inter-connectedness and cooperation based on trust and mutual reciprocity create larger social capital and produce a greater social premium shared by all.

If social capital and the social premium are recognized as beneficial to all, the individual as well as the community, then it is not difficult to make the leap from a culture of I to a culture of We.  But this is not going to be easy.  There are political, corporate and individual interests that see no benefit in this cultural transition.  Beyond being merely reluctant, these forces are outright hostile to the idea of sharing control, status and wealth in order to support the investments and communitarian effort needed to serve the common good.  To overcome their entrenched power, we need a grassroots political movement reminiscent of the one that gave us the Progressive Era 120 years ago.

Despite all the mischaracterizations and disinformation, there is today a Progressive movement that offers hope.  It centers on fighting climate change and environmental decay, and on reversing inequalities in economic rewards, health conditions and health care, educational attainment, and the enjoyment of full citizens’ rights independent of race, ethnic background, sex, and gender choices.  The challenge for this movement is to translate its concerns into a political message that convinces a decisive majority of citizens that we must again make American individualism “rightly understood” and the driving force for the renewal of social trust and the recapture of the social premium we all deserve.

Is It Going to Be I or We?

When we try to understand the present outside any historical perspective, it is easy to grow either too complacent or too pessimistic.  Either way we fall victim to inertia, believing that life will continue as is because that “is” appears to be a self-explanatory possibility.  Several recent posts on this blog have described what is a general view, that is, that we are living in an era of individualism and personal gratification, fraying social cohesion and party tribalism.

But now comes a historical account of the last 125 years of American history that shows how American society has oscillated between individualism and social solidarity and, most importantly, what has driven each of them.  The book is The Upswing by Robert Putnam, a Harvard public policy professor.   What follows is a partial outline of this historical record.

In a nutshell, Putnam finds that, starting at the turn of the twentieth century, the Progressive Era movement set in motion a series of economic, institutional and social changes that moved America from the politics of economic inequality and polarization of the Gilded Age (1870s – 1890s) toward a long period infused with egalitarian policies, civic responsibility and a stronger sense of “We.”  These trends lasted until the 1960s – 1970s before taking a turn and gradually descending toward individualism with its resultant economic and social inequities.  The path of these developments follows an inverted U curve (Ո) that starts with a society favoring the I, ascends to a state favoring the We, and then reverts to our present state of I.

Start with the economic curves.  The notorious top 1 percent hogged 20% of national income in 1910, but only 6% in 1975, and up to 21% by 2013.  The same with wealth: 42% in 1910, down to 23% in 1980, and back up to 41% by 2013.  The corporate tax rate: 1% in 1909, 53% in 1968, and down to 21% today thanks to the 2017 tax law.  The progressivity of federal tax rates (i.e., the difference between the highest and lowest marginal tax rates): 25% in 1913, up to 70% between 1940 and the late 1950s, and down again to 27% in 2017.  The top estate tax rate: 5.5% in 1915, up to 75% in 1940-1975, and down to 40% today.  Also relevant here is the tax exclusion for estates, which kept falling until 1977 and then gradually rose to its highest point after 2017.  (A higher exclusion means less of an estate is subject to tax.)

I am throwing out this bunch of numbers, annoying as that may be, to convey an irrefutable fact:  how out of step with recent past experience we are today when we decry higher taxes as confiscatory, ignoring that past generations of Americans had a wholly different sense of what constitutes a fair distribution of the tax burden.

How were these more egalitarian and socially-centric results accomplished in the first three quarters of the 20th century?  First, by a spate of reforms and initiatives, including anti-trust laws, financial regulation with the establishment of the Federal Reserve System, the introduction of personal income taxes, and the growth of labor unions.  Interestingly, the reforms and the spirit of the Progressive Era survived through Republican and Democratic administrations until Roosevelt’s New Deal added to that legacy with Social Security, national labor laws, further regulatory reforms, and housing policies.  The arc of social-mindedness continued to rise through the Eisenhower, Kennedy and Nixon administrations.  Its high point was the Great Society programs enacted during the Johnson presidency.

The progression of America toward a sense of “we are in this together” was not just the product of government activism.  It coincided with a strong engagement of citizens in secular and religious projects of social solidarity that extended voting rights to women and established mutual-support organizations and cooperatives (chief among them the credit unions), all of which aimed at providing solutions from the ground up.  Civic engagement thrived among long-time Americans and new immigrants, among white and black communities.  But as with the economic indicators, the civic movement, church membership and attendance, and the labor union movement reached their apogee in the 1960s and started to wither in the ensuing decades.

As social attitudes changed from I to We, the political tribalism of the Gilded Age also started to surrender to greater political comity, cross-party collaboration and political speech that aimed to unify rather than divide.  But around 1970-1980 all that started to sink back toward tribalism.  For example, ticket splitting between candidates of different parties, approval of a president by voters of the opposite party, and citizens’ warmth toward the other party all started to decline after 1970-1980, whereas party loyalty started to rise.  Along with these negative trends came a significant decline in public trust in government, epitomized by President Reagan’s declaration that government is the problem, not the solution.

The period from the Progressive Era of the early 20th century to Lyndon Johnson’s Great Society has come to be known as the Great Convergence.  This is the period during which broad segments of the American public and the political parties, notwithstanding disagreements about the means, came to accept that civic initiatives and government policies were indispensable for generating wider opportunities, more fairly shared results, and the advancement of the overall well-being of society.

Then, from the 1970s onward, America started to experience what has come to be called the Great Divergence: the split between the fortunes of rich and poor, educated and less educated, executives and the rank-and-file, Democrats and Republicans, blue and red states, and so on.  Generations that came of age after 1960 started to lose their sense of social trust (“most people can be trusted”) and became less immersed in the culture of engaged citizenship.  Furthermore, public policy became less sensitive to the needs of disadvantaged citizens and, thus, social ills and economic inequities were left to fester.

That’s where we find ourselves now.  So, the question is: Are we going to continue down the road of caring first and foremost for the I?  Or are we going to be inspired by the lessons and achievements of past generations and move toward caring for the We?

The New American Plutocracy

As American workers started to face a new grim reality in their job markets and social lives in the late 1990s, the rapid accumulation of wealth within a tiny fraction of the population was also starting to give rise to the second wave of American plutocracy.

Plutocracy, defined as the rule of the wealthy derived from extreme concentration of economic and political power, is not new to America.  Thanks to the second industrial revolution, America experienced extraordinary economic growth in the last thirty years of the 19th century, a period also known as the Gilded Age.  Along with rising living standards for workers, America was producing a class of tycoons who led companies with dominant positions in their markets and were amassing immense amounts of wealth.  

As the 20th century dawned, these magnates (the Rockefellers, Carnegie, Mellon, Frick and others), collectively known as the robber barons, started to funnel their wealth to foundations, museums, universities, libraries, and various charities.  Their giving did not come, however, without severe criticism of the provenance of their fortunes (trusts and cartels, poor working conditions, exploitation of new immigrants).  President Theodore Roosevelt captured the mood when he said, “No amount of charities in spending such fortunes can compensate in any way for the misconduct in acquiring them.”

What is more important, though, to our appreciation of the current state of the new plutocracy is the dogma that came to epitomize the original plutocrats’ view of their place in the American economy and society.  That dogma was articulated by Andrew Carnegie in his essay “Wealth.”  Carnegie opined that his ilk should be given free rein by the government to produce as much wealth as possible (even at the expense of the working class); wealth that they, the robber barons, would then spend through philanthropy to redress deficits in the economic, social, health, and cultural fabric of the nation.  Interestingly, and in contrast to today’s plutocrats, Carnegie advocated the steep taxation of inherited wealth as an incentive to spend it instead of hoarding it.  Carnegie also held that wealth inequality was the inevitable price society had to accept in the interest of economic growth.  Does this sound familiar?

Why should we worry about plutocracy?  Let’s start with an aphorism from Daron Acemoglu and James Robinson, who explored how nations fail.  They write: “Societies decline when narrow elites organize society for their own benefit at the expense of the vast masses of people.”  There are several ways plutocrats can organize society.

One is through philanthropy.  But philanthropy can serve the vanity and priorities of the givers.  More money goes to elite institutions, say Harvard, than to underfunded community colleges.  More money goes to elite museums and opera houses than to neighborhood theater groups.  Thus, philanthropy may end up benefiting the privileged rather than the underserved.  This concern is reflected by Rob Reich in Just Giving: Why Philanthropy Is Failing Democracy and How It Can Be Better.  Accepting plutocratic giving as a substitute for public funding, Reich laments, limits the scope of public policy, especially when wealth is derived from tax favoritism that crowds out government revenues.

Another way of influencing society is through political money.  Although big corporations and their leaders proclaim devotion to sustainability and social justice, the evidence says otherwise.  The record shows that they also fund political forces that resist increases in the minimum wage, environmental regulations, and the rights of minorities and LGBTQ people.  The well-organized efforts by corporations and wealthy donors to fund political campaigns would have no reason to exist if there were no payoffs to be reaped.

A third way is through the economy, by controlling markets, competition, labor demands, and taxation.  We are currently witnessing a worrisome concentration of market power in the financial, pharmaceutical, and information technology sectors.  The goal of economic power is rent seeking, that is, the extraction of super profits over and above what is feasible in competitive markets.  The socially damaging consequences are the retardation of innovation and fewer opportunities for new entrepreneurs.

Based on his experience as a former McKinsey analyst, Anand Giridharadas gives a fascinating and illuminating description of the plutocratic world in his book Winners Take All.  This is the world of an international network of the wealthy and their surrogates, politicians and thought leaders, who come together in glamorous gatherings organized by such outlets as South by Southwest, TED talks, the Aspen Ideas Festival, the Davos World Economic Forum, the Clinton Foundation and others.  When Karl Marx summoned the workers of the world to unite, he could never have guessed that it would be the plutocrats who would heed his call.

According to Giridharadas, at the core of the plutocratic dogma is the belief that all problems faced by societies can be solved through private initiative and the magic of markets.  He blames plutocrats for enriching themselves from corporate activities that cause health and environmental problems, social disintegration and poverty, and then coming to the rescue by proposing private market solutions.  Thus, Giridharadas argues, big corporations and wealthy individuals manage to keep governments at bay and reduce the role of public policy while at the same time expanding opportunities for profiteering from social problems.

If we are serious about tackling the corrosive effects of extreme inequality and plutocracy, we need to distinguish between creating and hoarding wealth, as Andrew Carnegie did 100 years ago.  Creating wealth through fair market practices and socially responsible behavior is beneficial to society.  Hoarding wealth, however, through rent-seeking and low taxes that deny governments the means to enact publicly desirable policies runs counter to the interests of society.

Most crucially, we also need to understand that plutocracy is incompatible with democracy and free markets.  Sooner or later extreme inequality and concentration of power can lead to social and political unrest.  The only way for an economic oligarchy to survive is a political order akin to that in Putin’s Russia.  Is this what the current American plutocracy aims for?  Is this what we the citizens are willing to accept?

When Education Is a Matter of Life and Death

Textbook economics teaches that people are self-interested agents who respond rationally to economic changes.  They change locales as jobs move, retool themselves to match the skills of new jobs, and adjust their social lives and psychological moods to the new realities.  Other than shaping overall fiscal and monetary policy, government has little or no role to play as the economy moves from one state to the next, adjusting to changes in commerce, technology and consumer preferences.

In reality, this is a narrative that rarely ends well.  Consider the following data.  In 2019, there were 70,980 deaths due to drug overdose (of which 50,042 from opioids) in the US, 3.5 times the average of 17 high-income countries.  In 2018, there were 48,000 suicide deaths in the US, which, adjusted for population differences, placed the US first among wealthy nations.  But that’s not the whole story.  Deaths attributed to suicide and drug overdose were a price paid mostly by middle-aged non-Hispanic white Americans without a college degree.

Less educated Americans also happen to be those whose incomes have not just stagnated but actually fallen.  Between 1977 and 2017, real wages for non-college jobs were down by 13% while GDP grew by 83%.  Between 2010 and 2019, the economy created 16 million new jobs but fewer than 3 million were suitable for non-college workers.

A few years ago, population statistics showed that the number of middle-aged (45-54) white men was declining.  The economists Anne Case and Angus Deaton (Nobel, 2015) dug deeper into this new phenomenon.  What they discovered, along with their thoughts about the possible causes, is laid out in their book Deaths of Despair and the Future of Capitalism.  If you are not convinced that, left alone, the economy can produce sad and deadly outcomes, reading this book will convince you.  And let me hasten to add that Case and Deaton are not capitalism deniers.  Far from it.  What they try to do is foster public awareness of the inextricable connection of the economy to the social fabric and, by extension, to the human condition.

Throughout the twentieth century, mortality rates of middle-aged white Americans kept declining due to advances in medicine and health care.  Then, starting in 2000, mortality rates for middle-aged whites started to creep up while they continued to drop in other advanced Western countries.  Notably, the rise of the mortality rate in this cohort came primarily from a sharp rise in deaths from alcohol, drugs and suicide among men (and, less so, women) without a college degree.  Moreover, fair or poor health, mental illness, and pain were being reported at a higher rate by less educated middle-aged white men.   Thus, this cohort experienced both greater mortality and lower quality of life.

What about race?  Until the late 1990s, white men, with or without a college degree, had lower mortality rates than black men.  Black men had gone through their own phase of “deaths of despair” in the 1980s, when drug abuse among this group was rampant.  But starting around 2000, white men without a college degree overtook black men.  Death rates for black men without a college education started to climb again only after 2012.  As Case and Deaton note, the much higher death rates of less educated black and white middle-aged men are not a race problem but a class problem arising from inadequate educational attainment and low wages.

Case and Deaton identify several causes for this bifurcation of mortality rates.  One is lower wages for low-skill jobs.  Even worse, though, is the lower quality of the new jobs, which are mostly in services as opposed to manufacturing.  Many of the new jobs are also less stable and involve less bonding with the firm.  The sense of belonging and prestige, and hence the work pride, is no longer there.  Additional causes are social isolation, family problems and break-ups, and withdrawal from church attendance and activities that provided social support.  The demise of labor unions has also deprived low-skill workers of an outlet for social engagement and solidarity.

Interestingly, Case and Deaton find that differences in poverty levels and income inequality across states and demographic groups (for example, white vs. black Americans) do not always explain the higher mortality and morbidity rates of middle-aged white men without a college degree.  It is rather the abrupt loss of a traditional way of life and the loss of meaningfulness that have brought this cohort of middle-aged white men to painkillers, alcohol and drugs, and eventually to poor health and death.  From a political standpoint, the loss of social status and living wages has contributed to the resentment less educated white workers feel toward other groups, immigrants, Hispanics and black Americans, whose living standards, though still lagging, have improved over time.  This resentment finds, of course, its way to the ballot box.

Case and Deaton ask: Who has failed American blue-collar workers?  First, they point the finger at the government and Congress.  Inadequate regulation of opioids, low funding of public healthcare programs to fight mental illness and alcohol and drug abuse, as well as anti-labor laws, have failed to protect vulnerable groups.  Second, they are particularly critical of the US healthcare system.  Weak regulation of the pharma industry, lack of monitoring and evaluation of healthcare providers, and a hands-off policy toward the consolidation of hospital systems have produced a socially unacceptable reality of very expensive healthcare with only mediocre results to show and tens of millions of uninsured and under-insured people.

Finally, they argue that American capitalism has devolved into a reverse Robin Hood paradigm in which corporate governance, market regulation (or the lack thereof) and taxation help concentrate incomes and wealth within a narrow slice of the population without regard to those left behind.

There is an instructive detail in this calamity.  The people who bore the heaviest brunt are Americans living in the middle of the country and in conservative states.  People who by culture and upbringing are self-reliant and independent.  And yet, when the job market collapsed around them, these resilient Americans were not able to withstand the challenge of change.  This shows that an economic and business system left to its own dynamic, without active and socially minded government engagement, does not deliver the textbook results its advocates, who are not surprisingly also its greatest beneficiaries, tell us to believe in.

How Labor Was Left Behind: An (Incomplete) Story

The major trends in labor income in the US over the last 50 years are wage stagnation and a yawning gap between workers with and without a college degree.  These developments are not just matters of economic interest.  They have serious social and political repercussions.  Here I give a partial explanation of how they happened.

First the data.  Until the early 1970s, the US economy (GDP) and wages grew pretty much together and labor got its fair share.  Things changed after the seventies.  From 1979 to 2018, productivity rose by 70% but real wages by only 12%.  In the period after 1970, labor’s share of GDP, relative to profits, declined from 67% to 60%.  By the 2000s, automation and outsourcing had robbed non-college workers of well-paying jobs, especially in manufacturing.  The economy created lots of jobs, but most were for skilled, educated workers.  As a result, wages for low-skill jobs fell and these workers were left behind with respect to income, social status and integration, and healthcare.  Very tellingly, these disparities were unique to the US among advanced market economies.

Several factors drove these negative developments.  For one, the demise of labor unions and unfavorable labor laws and rulings cut down the bargaining power of labor.  But another place to look is the change in the financial objective of the firm, which played out in corporate boardrooms and the financial markets.

We can say that from the fifties to the seventies American business was ruled by managerial capitalism.  Shareholders held nominal power but were mostly considered just another class of capital suppliers, like bondholders and banks.  Executives governed with little oversight from shareholders, whom they tried to appease with merely adequate returns.  Executives were compensated mainly by salary and modest, by today’s standards, bonuses.  Whether because they held views of social fairness toward labor or because they aimed at maintaining labor peace with unions and an assertive workforce, they were willing to split the gains in productivity (and profits) quite fairly between shareholders and workers.  Being compensated mostly by salaries, executives also had an incentive to seek higher wage increases, which would justify commensurate salary raises for themselves.

But closer inspection of corporate performance by academics revealed that this style of corporate governance had left too much shareholder money on the table.  That is, excesses, whether in the form of generous labor contracts, managerial perks, or the building of corporate empires, were pursued at the expense of the shareholders.  (Just for the sake of illustration, think of the purchase of a jet to fly the CEO around; it adds to the CEO’s prestige and ego but its cost comes out of the shareholders’ profits.)  To rein in such managerial waste, a new theory, called “agency theory,” proposed that shareholders had to monitor managers more diligently and, even better, make managers behave like shareholders.  The theory argued that treating executives as agents of the shareholders and compensating them with fixed salaries did not incentivize them to create maximum value.  If, however, they were granted shares and/or options to buy shares in their firms, they would behave as principals (i.e., shareholders) and think twice before wasting shareholder value.

Although the goal was to foster better management of resources, turning executives into shareholders created new unintended effects.   In order to increase the value of their stock, executives now had an incentive to squeeze expenses, including labor costs.  That gave us more part-time and temp jobs with little or no fringe benefits, all the way to gig jobs a la Uber.  No longer were the interests of executives aligned with those of labor, as they had been in the earlier era.

The second harmful effect came from the incentive to inflate stock prices, even by unethical means.  This reached scandalous and eventually criminal levels around 2000.  Mighty corporations, like Adelphia, Enron and WorldCom, along with Arthur Andersen (the renegade auditing firm), were prosecuted for using misinformation and accounting irregularities to inflate stock prices.  Adding insult to injury, executive compensation packages blew through the roof to levels unseen in the US or other market economies.  To this day, even failing executives are rewarded with stratospheric severance packages to relieve a firm of their incompetent presence.  Whereas thousands of workers are laid off with minimal notice and severance pay, top executives are sent into retirement with millions of dollars to presumably ease the pain of their aborted tenures.

The most fertile ground for pursuing shareholder value maximization, with significant consequences for labor nonetheless, is corporate restructuring, that is, the sale and purchase of corporate assets, divisions and whole firms.  This is exactly the arena where Schumpeter’s “creative destruction” takes place.  Very soon, the new term “takeover premium” was coined to denote the additional value a firm had as a potential takeover target.  New laws and regulations were passed with the objective of loosening the grip of various stakeholders on firms and making them easier targets for corporate raiders.

This newly minted market for corporate control, as it was named, got a big boost when private equity funds and hedge funds entered the market in earnest in the nineties.  The tax loophole of treating carried interest as capital gains turbo-charged these funds.  Consistent with the relentless emphasis on shareholder value maximization, an academic study found that between 1952 and 1988 stock prices grew in sync with GDP; from 1989 to 2017, GDP growth accounted for only 24% of stock price growth.  The rest came from “reallocated rents to shareholders and away from labor compensation.”

To be sure, corporate restructurings and takeovers help cleanse the economy of losing projects and incompetent managers and thus improve efficiency.  However, when academics and consultants measure the positive value created by such activities, they do it from the perspective of shareholders under the financial objective of shareholder value maximization.  What they leave out are the costs of worker displacement, the unraveling of local economies, and negative social effects.  The neoclassical economic view is that workers and capital holders smoothly navigate the upheaval caused by “creative destruction” until everybody lands in a better place.  After all, it is the government’s job to pick up the cost of smoothing the transition.

But that’s not what happened.  The experience of blue-collar workers in the American Midwest tells a more dire story.  And their story had a lot to do with who entered the White House in 2016.

Rethinking American Capitalism

When we are invited to discuss possible reforms to the current economic system, many Americans suspect the talk is about overthrowing capitalism in order to introduce some socialist system.  Needless to say, this is not a good way to start a public dialogue about the state of the present system and the potential it holds for very unpleasant social and political outcomes down the road if it continues as is.

There are several ways one can jump-start the discussion.  I will choose one that appeals to me because it relates to how well we in the US care about social growth and cohesion.  Let’s start with the Social Progress Index, which measures the standing of a country with respect to nutrition, health care, education, shelter, personal safety, personal rights, freedom and choice, access to information and other indicators of citizens’ wellbeing.  In 2020, the US slipped to 28th place out of 163 countries.  Almost all of the Western European countries, capitalist countries themselves, were ahead of the US.  Next take poverty.  In 2017, the US ranked second worst among the 37 countries of the Organisation for Economic Co-operation and Development (OECD).  Finally, judged by social spending as a percent of GDP, in 2018 the US ranked 16th from the bottom, spending less than the OECD average of 20.1%.

A healthy and well-functioning economic system should meet certain criteria.  It should support the production of output and wealth, it should support the fair distribution of incomes and wealth, and it should do no harm, that is, not contribute to social ills and disparities among the population it serves.  The US system does very well with respect to production, but not with respect to the other two criteria.  Income and wealth inequality are among the highest in the world.  A recent study by the German insurer Allianz shows US wealth growing by an average of 13% in 2019 but, as in past years, with the most unequal distribution, since most of the gains went to the top 1% of the population.  Similarly, the Federal Reserve System recently reported that family income, wealth and stock holdings continued to grow faster for the top 10% of households than for any other percentile bracket.

Income and wealth inequality are not the only pathologies of American capitalism.  Wages after inflation have stagnated since the 1970s, most markedly for workers without a college education.  Black and Hispanic Americans significantly lag in income and wealth due to long-standing racial inequities.  Unfortunately, economic disparities beget various negative outcomes.  Low-income Americans are less likely to have health insurance.  Lower incomes are associated with less education, and the latter is, in turn, associated with more abuse of alcohol, drugs and opioids as well as higher suicide rates.  Less education and lower incomes also lead to more broken families and more children living in single-parent homes.  And, of course, social mobility, that is, moving to higher income brackets, is lower for today’s Americans than it was for their parents and grandparents.  No wonder, then, that the majority of Americans do not think their country is on the right track.

If this is what the data tell us, then it behooves us to ask (a) what is the cause? and (b) what is a possible way out?  My short answer is that we have produced a system without a social purpose.  Advocates of pure capitalism might object to this proposition by reminding us of Milton Friedman’s famous pronouncement of sixty years ago that business has one social responsibility, and that is to maximize profits.

There is no doubt that profits are necessary for businesses to stay viable and, thus, be able to meet consumers’ needs and wants.  But we have also seen that despite the stellar record of profitability achieved by American business over time, this country is afflicted with many serious social problems emanating from the uneven distribution of economic opportunities, incomes and wealth.  The notion that the private interest is aligned with the social interest is just an assumption.  In 2008, it made sense for every trader to dump financial assets as they were losing value fast.  But it was not in the interest of the overall market and society.  The Federal Reserve System and the government (political and social institutions) had to step in to avert the collapse of the whole system.  Chemical companies and fast-food restaurants can be more profitable if they care less about the quality of the environment and their customers’ health.  It is society that pays the spillover costs.

Capitalism thrives in markets.  But markets, being mechanisms of exchange, are agnostic when it comes to the social good or purpose.  When a person buys a gun, the market does not care (or know, for that matter) whether the buyer will use the gun for a legitimate defense purpose or to commit a crime.  When the labor market facilitates a firm’s hiring of a worker at some wage, it neither knows nor cares whether the worker can live on that wage.

At its beginning, capitalism was a new system primarily concerned with the production of goods.  Labor rights and the fair distribution of output were addressed much later, under the pressure of progressive economic and social thinking and movements.  In other words, we cannot divorce the functioning of the economic system from the political order and the social compact of a society.  If the economic system could operate like a physical system, independent of the influence of the political order, we would not see businesses spending enormous amounts of money and effort to shape government policies.

Being honest about what markets can and cannot do is essential for creating a good society.  Markets not only lack a social conscience; they can also succumb to endemic imperfections and collapse.  At other times, markets are so concentrated in the hands of a few powerful corporations that the benefits of market competition are negated.  Markets can also fall prey to special interests uninterested in the social good.  It is equally unrealistic to rely on markets to meet the needs we associate with the social safety net.

The reason the US socio-economic system leaves significant numbers of people behind is two-fold.  First, the economy has been allowed to develop distortions that benefit the few at the expense of the many; and second, public policy, under the mantra of small government, has failed to take responsibility for the needs of those who lack the financial means to access markets for essential services such as health care, education, a healthy environment, and retirement.

Faith, Reason, and the Vetting of Federal Judges

The nomination of Amy Coney Barrett, a devout Catholic, to the Supreme Court has renewed the debate about the role of personal faith in the discharge of public duties, and about the place of faith and reason-based secularism in America’s founding documents, the Declaration of Independence and the Constitution.

To be a citizen, a politician, or a jurist in the multi-ethnic and multi-cultural US is truly a challenge, as you are called to transcend your personal religious or secular beliefs and learn to live with others whose moral laws and philosophies of life spring from different theologies and worldviews.  When we read the Declaration of Independence and the Constitution, we sense that the Founders struggled with this very problem and eventually settled on language that balances the interests of faith and reason.

To justify the pursuit of freedom and independence from the oppression of the English King, the Declaration invokes the “Laws of Nature and Nature’s God.”  The notion of laws of nature, or natural laws, has been around since the times of the ancient Greek and Roman philosophers.  These are laws of behavior with which nature endows humans to guide their moral conduct.  The Declaration also invokes laws defined and granted by Nature’s God.  The Declaration is neutral, though, as to which faith’s God it has in mind.  Thus, no God of a particular religion is identified as the supreme grantor of human rights.  The language implicitly accepts two sources for the discernment of laws:  rational thinking for the laws of nature, and faith for laws emanating from Nature’s God.  This suggests the Declaration of Independence is open to both secular and religious approaches to constructing and interpreting laws.

In keeping with this premise, the Constitution makes no mention of God or religion.  It starts with the words “We the people . . .”  The powers of the government come from the people and not from a Divine Authority, as was the case in the European monarchies of that time.  And as the First Amendment stipulates, the government should not favor any particular religion.  We are free to profess any religious or non-religious beliefs but should not expect any helping hand or validation from the government in any of its three branches.  Respect for this constitutional edict implies we should not seek, or worse, apply undue influence to obtain such assistance from the government.

What is of interest to us who live in the here and now is the realization of two important points.  The first is the historical evidence that both secular and religious moral laws have changed over time as humans have become more tolerant and more knowledgeable about our human and physical nature.  The second and more important point is that secular and religious moral laws vary across different secular and religious systems as well as peoples.  Therefore, peaceful coexistence under a Constitution that invokes no particular religious dogma or secular belief, and a Declaration that appeals to both the Laws of Nature and Nature’s God as sources of moral laws, requires neutrality in the writing and interpretation of laws.

This understanding of the founding documents then shows how we should approach the nomination and vetting of candidates to the Supreme Court.  Knowledge of a candidate’s adherence and degree of devotion to a particular faith or to secular principles should not necessarily preclude nomination.  Vetting, however, should ascertain the candidate’s adherence to constitutional neutrality with respect to religious dogma and secular views.  It seems to me the candidate bears the burden of proving his or her impartiality.  Consequently, rigorous vetting of a candidate’s impartiality, far from being an expression of hostility toward, or suspicion of, the candidate’s beliefs, emerges as a basic responsibility of the Senators who confirm Supreme Court justices.

A different approach is to argue that a candidate’s discharge of his or her duties as a jurist does not necessarily follow from his or her religious or secular beliefs.  Here the premise is that most of us, and, hence, Supreme Court candidates, live compartmentalized spiritual and professional lives.  This is what David Brooks argued in a recent New York Times column.  Specifically, he makes the point that faith has little or no influence on one’s political ideas, including legal ideas.

I would like this to be true, but I have reasons to believe it is more wishful conjecture than fact.  The three Abrahamic religions, Judaism, Christianity, and Islam, are strict moralistic faiths that prescribe a certain way of life.  Moreover, since their founding, Christianity and Islam have pursued active evangelization and conversion policies in order to bring an ever-expanding number of people to live by the tenets of these religions.  For centuries, Church law was also state law, and to this day Sharia (Islamic law) is still practiced in parts of the Muslim world.  Therefore, the notion that religious adherents merely believe in the creation and other holy stories of their religion while bringing universal moral values and beliefs to their professions (the practice of law included) is just an assumption that requires empirical confirmation through vetting.

Such vetting is important when confirming jurists to the federal courts, and especially when filling seats on the Supreme Court, for obvious reasons.  Members of Congress serve finite terms; hence, if the people are dissatisfied with a member’s performance in matters relating to the application of religious and secular principles, they can vote that member out of office.  By contrast, Supreme Court justices serve for life, without recourse to replacement if a justice demonstrates systematic bias in favor of the tenets of a particular faith or secular dogma.  Besides, Supreme Court justices are the ultimate arbiters in interpreting the letter and intent of our basic governing charter.

Precluding anyone as a candidate for a seat on the federal bench on the basis of religious or secular beliefs would be prejudicial; vetting a candidate’s ability to be impartial and neutral in interpreting the law, on the other hand, is a necessity if we wish to do justice to America’s founding documents.