Education and The Human Question

Imagine we are at the early stages of the agricultural era. People live in communities close to the lands they cultivate, and they have started using technologies that differ from those of their hunter-gatherer days. Let's also imagine that there are some primitive schools that teach students how to survive in this new era of the human condition. We can surmise that the curriculum would put more emphasis on how to till the ground, how to sow and harvest, how to watch for changes in the seasons, and how to follow patterns of rain, drought, and flooding. It's also likely that this curriculum would put less emphasis on how to hunt animals and gather edible plants and fruits.

If there were such a primitive school system, we can imagine the protests from those who were still fond of the hunting-gathering lifestyle. They would claim that the new education put people in peril of losing skills that had kept them alive for millennia, and that the new curriculum showed no respect for the old culture and way of life.

The tension between new and old educational content and purpose came to my mind as I was reading "The Elite College Students Who Can't Read Books" by Rose Horowitch in The Atlantic. The author laments the fact that even students who attend top colleges arrive unprepared to read whole books in the humanities. It is not that the students are weak in reading comprehension. What they lack, rather, is the will and patience to read. This, according to the author, is the result of our new digital world, where our attention is contested by a myriad of news outlets and social media.

What this essay brings forth is the tension created when changes in technology and the socio-economic environment affect the content and purpose of education. Currently, the study of the humanities is in retreat as students and colleges place greater emphasis on turning education into a stepping stone to professional careers. Fewer students major in the liberal arts and sciences now, and, as The Atlantic essay argues, the teaching of the humanities is being diluted to match students' preferred ways of studying.

Since ancient times, the study of the humanities, and to a lesser degree that of math and science, has been at the center of education. The purpose was to give students a well-rounded education that would prepare them for public life and service. For the medieval Church, education served to prepare church leaders and to defend its dogmatic theology. In the newly founded United States, John Adams believed that the purpose of education was to prepare students to function as citizens of a democratic nation. But the ultimate goal of the study of the humanities has been to cultivate an appreciation of human creativity and what it tries to tell us about ourselves and others, and to nurture further exploration of the human condition.

Starting toward the end of the last century, the purpose and content of education again entered a period of redefinition. Professional fields in technology and business required more formal and specialized study. Just getting a solid liberal arts and sciences education was no longer sufficient to prepare students for such professional careers. Thus, the emphasis shifted to the study of STEM fields (science, technology, engineering, and math), business, and the health professions. The main victim of this professionalization of education has been the study of the humanities.

Technology has also had a profound effect on learning. Students no longer have to be the repositories of information and knowledge, nor do they have to develop the cognitive capabilities that process knowledge and information. All that can now be outsourced to machines and, increasingly, to Artificial Intelligence algorithms. We have been gradually surrendering the learning task to machines, and we have already started to surrender the creative task as well. In the much-discussed play McNeal, the playwright Ayad Akhtar has his main character lean on AI to produce a novel.

So how should we approach the content and purpose of education? I believe an answer can be found in our nature as creatures of biology and meaning. That is, somewhere along our evolutionary path, we found an advantage in instilling life with meaning. It is in art, stories, and ideas that we find answers to the question of what it means to be human. The answers come in the works of literature, art, and thought produced by fellow humans from diverse places, times, and circumstances. It is in our life's chosen meaning that we find personal agency and a measure of autonomy. In the pursuit of meaning, both the freedom to produce different versions of meaning and the freedom to study them are therefore essential.

Therefore, the question I think we ought to ask is "Will we keep our humanity if we surrender these freedoms?" Until now, the threat of extinguishing these freedoms came from authoritarian secular and theocratic regimes. Now, however, the threat is also coming from intelligent machines to which we may outsource the production of meaning as well as its dissemination. Should this happen, humans run the risk of becoming creatures without the capacity to express the human experience in all its diversity. The diverse expressions of meaning we have today will start to collapse, and our minds will eventually close. The end point would be the Singularity* of the human mind.

By that I mean that just as everything had collapsed into one singular point before the cosmological Big Bang, so our diverse emotions, sentiments, and thoughts may never escape the Singularity of mind. It is an apocalyptic scenario, but one worth keeping in mind.

*The Singularity is a possibility often discussed in relation to AI. Most often, it is defined as the point at which artificial intelligence surpasses human intelligence and humans no longer have control and agency over their affairs. Here I use singularity to mean the point at which what it means to be human collapses into one meaning set by a dominant AI algorithm.

To Defend Democracy Look at FDR’s Progressive Presidency

It is no secret that over the last thirty years the Democratic Party gradually alienated and eventually lost a big segment of working-class Americans, especially those without a college degree. How this happened is quite well known. In a nutshell, Democratic politicians adopted the Republican mantra that "the market works for all" and let global free trade, labor-replacing technology, and unchecked corporate concentration rule without much thought about their consequences for the industrial and social fabric of the country.

By doing so, Democrats walked away from the most essential and progressive ideas of their own party, which ninety years ago had successfully faced two existential threats to the nation: the Great Depression and the assault on freedom by Nazism and Fascism. These ideas were that people matter more than economic principles and that people will defend their democratic liberties when they know democracy works for them. In the words of Harvey Kaye (The Fight for the Four Freedoms), these ideas, promulgated by none other than Franklin Delano Roosevelt, turned the "Greatest Generation" into the most progressive generation in America's history. Not only did that generation embrace FDR's social and economic message and policies, it also moved ahead of him in the quest for a more inclusive and fair social and economic order. Through civic associations, marches, and unionization, New Dealers promoted and defended FDR's agenda in the face of strong and outright resistance from the conservatives, reactionaries, and corporate interests of the day.

From the standpoint of policy conception and execution, FDR's great insight was that when a nation faces a dire crisis which private institutions and enterprise cannot fix, it is the duty of the government to pull the levers of the state and provide solutions. It was that realization that led to the creation of numerous programs, agencies, and regulations that boosted incomes, provided security to retired people, and curbed the anti-competitive and abusive impulses of the market. Americans from all walks of life understood the importance of these initiatives and elected FDR president four consecutive times. More remarkably, Kaye writes, FDR's accomplishments were so much on the minds of American troops that in letters home, soldiers would write that they fought to defeat totalitarianism and to defend the benefits and security the New Deal had given them.

What expressed the progressivism of the New Deal were two proclamations. The first was the articulation of the Four Essential Freedoms, which FDR declared in January 1941, months before the Japanese attack on Pearl Harbor. The first is freedom of speech and expression. The second is the freedom of every person to worship God in his or her own way. The third is freedom from want for the necessities of a healthy peacetime life. The fourth is freedom from fear of war. Remarkably, FDR declared these freedoms in the name of all peoples of the world, not just the American people.

The second FDR declaration was his Second Bill of Rights. This came toward the end of his presidency and was meant to be an encapsulation of his vision and a guiding manifesto for future initiatives. This social bill of rights aimed at securing for all Americans the right to a useful and remunerative job; the right of workers and farmers to earn a living wage; the right to do business in competitive markets free from undue domination by a few firms; the right to decent housing; the right to medical care and good health; the right to a decent and secure retirement and to protection from sickness, accident, and unemployment; and, finally, the right to a good education. Unfortunately, a less cooperative Congress refused to act on this bill of rights, and to this day it remains the unfinished project of the progressive movement.

Roosevelt was able to inspire Americans because he put the common man and woman front and center in his politics and policies. He resisted and fought the strenuous objections of rival interests and wore their hostile language and actions as a badge of honor. That is why he earned the respect and admiration of the American people.

There are lessons to be learned from this extraordinary period of reform in American history. First, the efficacy and popularity of the New Deal proved so enduring that even Republican presidents like Eisenhower and Nixon didn't dare to dismantle it. Instead, they built on it with infrastructure projects, like Eisenhower's interstate highway system, and new regulatory initiatives, like Nixon's creation of the Environmental Protection Agency. This shows the broad political appeal of the progressive agenda. Second, the promotion of the state as a facilitator, mediator, and regulator in the interest of the common good not only did not lead to economic atrophy but instead laid the foundations for the extraordinary growth and the thirty years of shared prosperity after World War II.

The most important lesson we can take from FDR's progressivism is how well he grasped the insidious link between the lack of economic inclusivity, populism, and the demise of democracy. This is particularly relevant to us today. First came the erosion of the balance of power between labor and corporations that was part of the post-war social contract. Then came the philosophy of laissez-faire capitalism that subordinated labor to capital. Then came the unprecedented inequality of incomes and wealth, along with the financial and social decline of the ordinary American worker. And finally came the populist backlash that found its standard bearer, however questionable, in Donald Trump. And now we cry out about the threats to democracy. FDR would tell us, "You ignored me at your own peril."

Unfortunately, in today's culture of individual comfort and aggrandizement, the classes on the left and the right that benefit the most from the status quo decry progressivism as radical and impractical. However, ninety years ago large majorities of the American people found in progressivism not only a powerful promise to uphold the dignity of every citizen through social and economic inclusivity, but also the inspiration to stick with democracy and defend it when the world around them was succumbing to the forces of authoritarianism.

The Anti-Immigrant Frenzy: What the Past and the Future Can Teach Us

We know what they say about the past: those who forget it are condemned to repeat it. Knowledge of the past, though, is useful for another purpose as well. It gives us perspective on matters that have a historical record. This can both inform our current attitudes and actions and serve as a benchmark of our honesty as we become actors in history.

This opening came to my mind as I tried to grapple with current Western attitudes toward immigration, which has become one of the hottest political issues in Western countries. Negative attitudes toward immigration helped drive Brexit and the rise of nationalist parties in France, Germany, Italy, and other European countries. The anti-immigrant message has brought to power far-right parties in Hungary and Austria. The same message galvanized the successful campaign of Donald Trump in 2016 and keeps fueling his present comeback.

The anti-immigration message of politicians and parties has been embraced by significant numbers of constituents. This stand rests on claims that immigrants contribute to criminality and unemployment, no matter how badly these claims fail the test of good-faith scrutiny. What sounds more visceral, however, are the cries of fear that immigrants are a threat to Western lifestyles and culture and, worse, to the purity of the West's blood and ethnic makeup. It is here that we need to summon the knowledge of the past and judge ourselves in historical context.

In the fifteenth century, Western European countries embarked on a sustained and expansive campaign of exploration and conquest that changed the world radically and irrevocably. The native civilizations of Central and South America were vanquished, their populations were decimated by disease, and their demographic composition was altered by the infusion of enslaved Africans and white Europeans. Eventually these vast regions emerged into the modern world as Christian, westernized, and white-dominated countries. The same conquests and transformations took place in North America. They were soon followed by wholesale cultural, religious, and societal changes as well as colonial rule in Africa and Asia.

These radical transformations were accomplished by the force of guns, coercion, and economic subordination. Of course, at the time these acts were rationalized as being committed in the name of human progress. The pervasive and underlying conviction held by the conquerors was that Western culture, religion, statecraft, and economic models would help pull the conquered peoples out of primitive and degrading lifestyles and beliefs and into the modern world.

We now know that whatever the aims of those transformations were, they left behind half of the planet, the so-called Global South, in a state of under-development and political dysfunction. Many of these countries struggle with poor economies, political instability and corruption, crime, and a bleak future. For all these reasons, they are the sources of migration toward Europe and America. These immigrants are so determined to escape destitution and violence in their home countries that they are willing to risk their lives crossing the Mediterranean Sea on unsafe and overcrowded boats operated by cruel human smugglers. They are so desperate for a better life that they choose to cross the perilous and deadly jungle of the Darien Gap from Colombia to Panama on their way to America.

These foreign people come to our borders and shores not on gunboats, not aiming to conquer or dominate us in any way. Instead, they throw themselves at our mercy and that of our laws and make themselves available for any type of work, no matter how onerous or demeaning it might be. It is ironic that while neither their intentions nor their situation poses a real threat to us, we present them as potential perpetrators of the very crimes our own ancestors committed against theirs.

I do understand that many of us feel untouched by the events of history and are unwilling or unable to relate them to our own lives. Immigration, though, is not a problem that will easily go away, no matter how restrictive our laws become or how high our border walls rise. The population projections are not in favor of the developed countries. Nearly all population growth will be in Africa and in countries like Pakistan, whereas the populations of Europe and China will shrink. The U.S. will escape population shrinkage only because of new immigrants and the higher birth rate of its Latino population. This imbalance of population growth rates will create severe shortages of labor and young people in the developed world and a big population surplus in poor countries.

For the hundreds of millions being added to the populations of poor regions, fast economic growth is the only way to keep them at home. But for economic growth to materialize, considerable progress must be made in law and order as well as in political and social institutions. And this is not all. Climate change already forces people to migrate. Absent significant progress in containing climate disasters and lifting people out of poverty, these regions will continue to generate waves of immigrants. Therefore, we need to ask: "Do we have any strategies to better their conditions so they stay home? Are we willing to spend treasure and effort to this end? Can we humanize and rationalize our immigration attitudes, laws, and policies so we can develop win-win solutions? Shouldn't our troubling past record toward the rest of the world compel us to be compassionate toward the immigrants of the world?" That's the real debate we ought to have if we are honest about the immigration problem.

I will close this post with a story of how Westerners have affected other peoples' lives. In the mid-sixties, Great Britain was preparing to end its colonial rule of Mauritius, an island nation off the east coast of Africa. However, that being the period of the Cold War, the U.S. needed a naval base in the Indian Ocean. Great Britain then exploited an international law that allows a country to lay claim to a territory empty of people. The British authorities chose to apply this law to the cluster of islands that formed the Chagos archipelago, which was part of Mauritius. However, there was a problem. The islands were inhabited. This did not stop Great Britain from establishing the British Indian Ocean Territory after expelling the local population. Thus, Great Britain could rule this dot of land as it pleased. That's how the U.S. base of Diego Garcia was built. A recent article in the NYT (October 4, 2024) announced that, following a decision by the World Court and a UN vote, the British government has finally decided to cede control of Chagos to Mauritius, conditional on a 99-year lease allowing the U.S. to operate Diego Garcia. (A detailed article on the plight of the Chagossians and the British machinations was published in The Atlantic in its July/August 2022 issue.)

Shared Prosperity: A Better Economic Paradigm

Les Trente Glorieuses, or The Glorious Thirty, is how the French named the thirty years from 1945 to 1975 when, in the aftermath of World War II, the economy grew fast and millions of families entered the middle class. This pattern of economic performance was experienced in practically all free economies, not least in the United States. The most common characteristic was that productivity growth and real (after-inflation) wage growth moved in almost perfect unison.

And then in the 1970s this harmonious relationship came to an end. Capital investment and corporations started to claim the lion's share of productivity gains, leaving wages behind. The result was a significant hollowing out of the middle class. In 1970 the fraction of the population considered middle class by income was 61%. By 2021 it had dropped to 50%, a fall significant by the numbers but even more consequential for its deleterious socio-political effects.*

The divergence in the paths of the rewards of capital and labor was not the product of the invisible hand of the market.  It came about because both political power and intellectual attitudes shifted in favor of capital investors and business owners and away from labor and the working class.   

First, a series of laws and regulations started to limit the rights and powers of labor unions and their ability to represent labor's interests. (Here, President Reagan's firing of more than 11,000 striking air traffic controllers was a pivotal moment in the demise of labor unions.) Next, shareholder-wealth-maximization scholarship promoted the creation of corporate value in the name of shareholders as the ultimate standard of success for corporate executives. Quickly this scholarship turned into an operating philosophy that failed to see its full consequences for the other stakeholders of business and for the economy in general.

The creed of shareholder value maximization promoted the importance of executives and devalued that of labor. The old contract between management and labor had allowed American workers to improve working conditions and claim higher wages. In the new model of shareholder value maximization, labor income became just another cost component, one that could be controlled through workforce consolidation or lower wages. Even solid academic work in favor of raising the minimum wage came under attack by business interests and their ideological allies in academia. Angus Deaton gives a good account of the efforts to discredit the work of David Card (Nobel Prize in Economics) and Alan Krueger on the minimum wage (Economics in America).

Two more factors contributed to the stagnation of labor income and the decline of middle-class households. One was the automation of the production process. The other was the offshoring of production enabled by international agreements like NAFTA and the WTO. Both allowed businesses to reduce their demand for American workers and, in an environment of diminished labor union power, to avoid sharing the gains with labor.

Neither automation nor offshoring had to produce these negative effects for labor income.  They did, nonetheless, because business, and especially big business, had the power and the acquiescence of the political class to refuse a fairer distribution of the corporate gains earned from technological progress and global expansion.  And most importantly, there was no countervailing power from labor to bargain for a more equitable distribution. 

After four decades of stagnant wages and immense economic inequality, it is time we looked for a better economic paradigm. Thus far, we have prioritized economic policies that we assume will produce the desired economic results. Emblematic of this approach was the trickle-down model of the neoliberal order. It promised that fattening corporate profitability through deregulation and tax incentives would produce better real wages. In the absence of a potent labor movement, it turned out to be an empty promise for the bulk of the working force. Even worse has been the tendency of U.S. governments (administrations and Congress) to socialize corporate losses by bailing out financial and industrial firms without also socializing the rewards, that is, spreading the gains businesses amass thanks to government policies. In other words, the faith in the magic of markets stops at the door of corporate failure, at which point the state picks up the losses. However, no equivalent state role is envisioned to ensure labor gets its fair share of the economy's gains.

Now we hear that imposing tariffs and import restrictions will boost the production and profitability of American corporations and consequently raise American wages. It is not at all clear why this should happen. For one, the American economy has become more monopolistic than before, and economics suggests that monopolies restrict output rather than expand it. The concentration of jobs within a smaller number of huge firms has also turned those firms into monopsonists with the power to push wages down rather than up.

Another idea is to create an opportunity economy, in which all have an equal chance to participate in the gains of the economy. As American and laudable as this sounds, it still focuses on means rather than ends. Equal opportunities do not necessarily produce fair outcomes. The underlying pathologies of the current American economy (concentration of corporate and market power, weak labor unions, and lagging productivity) will also prevent opportunities from translating into fair outcomes. Leveling the playing field of opportunities is also challenging in the presence of vast economic inequality. Studying a group of countries, the economists Thomas Piketty and Emmanuel Saez have found that where inequality is high, opportunities are fewer.

Therefore, if we wish to make real progress toward a more equitable and thriving society, we need to set our eyes on goals, not tactics. Shared prosperity is such a goal. If it happened before, in the post-World War II decades, it can happen again. But to bring shared prosperity back we need to follow policies that lay the foundations for its attainment. Thus, we need to create more competitive markets and curtail the dominance of a few corporations. We need to restore the power and voice of labor. We need to boost educational opportunities for the types of skills (not all requiring a college degree) the economy needs. We need to incentivize investments that boost worker productivity instead of only incentivizing the replacement of workers by machines. (In their book Power and Progress, Acemoglu and Johnson report that despite the enormous investments of American corporations in technology, total factor productivity growth is lower now than in previous periods of less mechanized production.) And we need policies that boost and protect wages.

The goal of shared prosperity will put a stop to the present mindset that the economy is a zero-sum game of winners and losers.  It can also restore the sense of solidarity a sustainable society needs.  Finally, goals are not only aspirational; more importantly, they can be inspirational.

*Pew Research Center

Note: These ideas have been developed from what I have read in several books in recent years.  So, I need to give them credit by citing them below.

Power and Progress, Acemoglu and Johnson

Economics in America, Deaton

Deaths of Despair, Case and Deaton

The Rise and Fall of the Neoliberal Order, Gerstle

The Middle Out, Tomasky

And a book I know only in summary and haven't read yet: People, Power, and Profits, Stiglitz

Blaming Educated Elites Misses the Point

In trying to explain the political divide between rural and working-class Americans on the one hand and urban Americans on the other, many have placed the blame on elites for alienating rural and non-college-educated Americans.

This past year, though, the critique of the elites has targeted a particular segment, namely, the educated elites. One part of this elite comprises college administrators, faculty, and students. The reasons behind this critique vary. Some take issue with the conduct of pro-Palestinian students and faculty. Others resent the push for uncompromising political correctness that comes out of campuses, especially those of elite universities. Another reason is the application of double standards that favor liberal causes and speakers. To some extent the criticism is justified, and at any rate it is useful as part of our checks and balances. In a democratic society, we all have a great deal at stake in the functioning of universities as institutions that support free speech and inquiry and safeguard everybody's right to partake in these privileges without exclusion or harassment.

There is also criticism directed in general against educated Americans, who seem to do a lot better than those living in rural parts of the country and those employed in blue-collar jobs.  This overall educated elite allegedly treats rural and less educated Americans with derision and scorn and fails to recognize their values and cultural identity.  That the paths of these two groups have diverged in the past several decades is an undeniable reality.  However, making sweeping generalizations about the responsibility of educated elites for our current political maladies grossly misses the point. 

The declining economic and social status of rural and less educated Americans has much less to do with the squabbles taking place on campuses or the attitudes of educated people, and a lot more to do with economic and social developments for which both conservatives and liberals are responsible. These developments not only affected the relative power and status of rural and blue-collar workers, they also shifted a greater share of the costs onto rural and less educated Americans.

Globalization and automation have been the primary factors in the elimination of well-paying manufacturing jobs, which in turn contributed to the decay of social and family life in rural areas and former industrial hubs. Most consequential, however, has been the transformation of the economy into one that relies on knowledge and information, with its attendant demand for more sophisticated technical skills and education. The failure of the decision makers who steered the country toward the new order was their inability or unwillingness to foresee the economic and social costs these transformations would impose on those ill-equipped to survive under the new conditions.

More importantly, the new economic order of neoliberalism accentuated the trend toward increasing income and wealth inequality. By one important measure of income distribution, labor compensation as a percent of GDP declined from 49.8% in 1973 to 43.6% in 2021.* With a smaller piece of the pie to go around, it is not surprising that frictions among working-class people intensified. Given their previously privileged position, white workers felt particularly diminished in economic and social status after wages stagnated in the early 1970s and then again in the 1990s as manufacturing jobs were offshored. No wonder their feelings toward minorities of color and immigrants, whom they consider competitors for the smaller income slice, have hardened.

What, then, was the role of educated elites in these adverse developments? Well, some of the strongest voices in favor of open borders and unregulated markets came out of elite institutions, like the University of Chicago. The neoliberal order was also supported by overoptimistic expectations that drawing adversaries, like China, into the web of capitalist markets would promote national security and world peace. The experience of the last two decades has dashed these hopes. That's why we now see a remarkable bipartisan retreat from neoliberalism in favor of industrial policies and tariffs.

The intellectual and pragmatic underpinnings of the neoliberal order also favored education, innovation, and entrepreneurship, which in time would place less educated workers at a disadvantage. Let's not forget that the modern Western world was built on these three pillars. That's how the scientific revolution that began in the sixteenth century put Europe on a divergent path from other contemporary powers, chiefly China, and led to the West's world dominance. In addition, conservative and liberal thought has also emphasized individual responsibility, reason, and free agency. That implies that people respond to changing circumstances to maintain their well-being, and, therefore, that economic rewards and losses are distributed according to one's effort and merit. This thinking assumes that safety nets are redundant and that people will accept their diminished fortunes as the result of their own choices.

The reality, though, is that historically what we call progress has been driven by the interests of a minority of innovators, entrepreneurs, and learned people. The expectation that the rest of society will adapt to changes ushered in by the few reflects the aspirations and assumptions of those few. The fact that even in our times close to two-thirds of working-age people in the Western world lack a college education should tell us something. We should recognize that for a variety of economic, social, and political reasons a lot of people are not given the opportunity to move over to the winning side. For others, it is a lifestyle choice that should not condemn them to substandard living.

That's why Acemoglu and Johnson, whose book I highlighted in my last blogpost, argue that innovations and changes in general do not represent true progress unless they ensure that we all participate in their benefits. In other words, only when new directions into the future are charted with the goal of shared prosperity do we make real progress. Our political malaise these days has a lot to do with our failure to chart our future path with this goal in mind. For this failure, we need to look well beyond the behavior of educated elites.

*Labor's share of GDP, or of Gross Domestic Income, can vary by source. However, all sources show a decline over the past fifty years.

The Direction of Technology and The People’s Voice

In past posts, I have pointed to the consequences of technological change and how some of its outcomes leave a lot of people behind or pose a danger to our human nature. Now a book has arrived that makes a very comprehensive case for the interplay of power and technology and for why it is important to harness technology for the benefit of all, not just the few. This book is Power and Progress by Daron Acemoglu and Simon Johnson.

This and similar books are all the more important right now because, first, we have a rich historical record to draw from (so there is no excuse on the basis of ignorance) and, second, we stand close to crossing a threshold unlike any other as we move toward the end stages of AI development.

Humankind has followed a relentless march of technological innovation because of favorable biological and cultural factors. We have developed highly intelligent brains and cultures that can establish the institutional scaffolding for technological advancement. These primary factors do not, however, preordain which direction technological developments will take. This, the book argues, depends, as history shows, on two other forces.

The first is power, mostly state power. For example, the first industrial revolution in the late 18th century would not have taken hold, at least not as fast, without favorable government laws that pushed farm workers to the cities and provided abundant and cheap labor to factories. Nor would the industrial capitalist model have later succeeded as it did without the exercise of colonial power and the slave-holding plantations of the New World.

The second force is persuasion, that is, the ability to steer a society down a technological path through the vision and tenacity of gifted or powerful individuals. An example is the Frenchman Ferdinand de Lesseps, who against all odds pushed for the building of the Suez Canal and later of the Panama Canal. He succeeded, but at the expense of many fellow human beings. Thousands of workers were employed at starvation wages and under atrocious working conditions. The first attempt to cut a canal through Panama resulted in more than 20,000 fatalities. Acemoglu and Johnson use this example to argue that the vision of a few individuals may produce spectacular results but, in these and other cases, at a huge human cost that we tend to ignore in the name of so-called progress.

Power and persuasion are at work right now, and if left unchecked they will soon determine the next direction of human history. In authoritarian countries it is the dominance of a strongman (like Putin) or of the undisputed party (as in China) that sets the direction. But we in democratic countries should not comfort ourselves with the illusion that we the people have full control of our future direction. Although we do rely on elected representatives to influence our direction, as in the case of climate-friendly laws, a lot more is decided by the huge market power of a few mega-corporations that have practically cornered the markets that matter most for the development and use of influential technologies. Primary examples are the market for the acquisition and aggregation of personal data and the funding of AI research and development. Right now, it is the vision of a few executives and corporate owners that is poised to set the direction of our future. As Acemoglu and Johnson write, "Vision is power and power is vision." But then what about the rest of us? How do we ensure our vision is put on the grand table on which our human destiny is negotiated?

The authors warn that technological changes should not be judged solely on their intellectual and scientific merits or on how much they thrill us as manifestations of human achievement. Instead, they should be judged by whether they promote shared prosperity and enhance the human condition. Technologies that merely replace human labor and push workers down to low-paying jobs are not friendly to shared prosperity. That was the case during the first part of the First Industrial Revolution. Human-friendly technologies are those that improve human productivity and share the productivity gains with labor. This is what the authors call the productivity bandwagon that carries us to shared prosperity. However, as Acemoglu and Johnson show, this happens only when there are countervailing forces that compel those who control the new technologies to boost workers' skills and accept a more equitable distribution of the gains. This happened when labor unions and the Progressive movement emerged in the latter part of the 19th century in the U.S. And it happened again in the period between World War II and the 1970s, when labor unions were still strong. By contrast, the tremendous pace of technological breakthroughs over the last forty years has yielded meager results for working-class people and exacerbated income and wealth inequality, the exact opposite of shared prosperity.

How technological change impacts human work is one part of the equation. The other is how it affects human relations, our emotions, our sense of facts and falsehoods, our relations to authorities, and, in general, our control over information and privacy. In riveting detail, Acemoglu and Johnson describe how social media and internet search firms manipulate emotions and information to increase clicks and attention in order to maximize ad revenues, with disregard for the privacy and well-being of their users. Despite warnings from governments and politicians as well as from groups of experts, research and development in AI also appears to be succumbing to the profit motive. Just days ago, we became aware of that when concerned scientists from OpenAI accused its executives of directing the firm toward profit opportunities at the expense of human safety, even survival.

The situation is no better in countries where the government controls private initiatives. The book offers a chilling account of how China has prioritized the surveillance and monitoring of its citizens as the primary goal of AI development. The use by several Western governments of spyware like Pegasus, which can spy on civilians, shows that democracies are not immune to the malicious use of technology.

The lessons we can draw are critical for how we manage our future direction.  Technological change is not equivalent to progress unless it promotes shared prosperity and human well-being.  Machine intelligence is not equivalent to machine usefulness.  The direction of technological change is not destiny but a matter of choice.  And choices are made mostly by those who control the agenda and the vision.  Unless we are willing to surrender control and vision to an oligarchy, we the people must claim our seat at the table.  This means that we the people have the right to share in the control and vision of technological change.  The bottom line is not to block technological change but to choose what benefits humankind and earns our consent.

The Legacy of the Neoliberal Order

In the previous post I referred to the neoliberal order from the perspective of finance.  Here I will offer a wider review of this era because it helps explain where we are and it also raises inevitable questions about our future direction.  The source of my thoughts is again Gary Gerstle’s The Rise and Fall of the Neoliberal Order.

The first point of interest is that a political and economic movement becomes a prevailing order when it gathers enough force and persuasive power to be adopted across the political spectrum. Thus, the New Deal was conceived and put into action by Franklin Roosevelt and the Democrats, but it found support and further enhancement in the policies of the Republican presidents Dwight Eisenhower and Richard Nixon. Similarly, the neoliberal order was ushered in by Ronald Reagan, but it was then further facilitated by Bill Clinton and remained effectively unchallenged by Barack Obama.

The reason the neoliberal order survived under both parties is that it appealed to both conservatives and liberals. Conservatives had always harbored a deep apprehension about the New Deal, with its heavy reliance on the state to repair market failures, the worst of which, of course, was the Great Depression. The fiscal strains caused by the Johnson administration's simultaneous pursuit of guns (the Vietnam War) and butter (the Great Society and the War on Poverty) and the ensuing loss of economic momentum gave conservatives the opening they needed. Their remedy was a return to free markets and as little government interference as possible. Neoliberalism was thus the restoration of economic liberalism, freed of the statist approach of the New Deal.

For their part, liberals had started to lose confidence in managerialism, the model that favored corporate stability through entrenched leadership and stable labor relations. The potential and promise of computers, and later of the internet, appealed to liberals as the means to foster innovation and individual freedom. By embracing Silicon Valley and Wall Street, liberals believed a new age of tech-based creativity and prosperity was possible on a global scale. Both conservatives and liberals came to believe that restoring shareholders to the center of corporate power and relying on the market would open up new vistas of economic and individual growth.

But unleashing market forces has its dark side. Creative destruction means displacement and uprooting as jobs move, which tests the stability of cultural traditions, a stability central to conservative ideology. To justify these costs, Gerstle argues, conservative neoliberals had to invent a neo-Victorian ethic that called for self-restraint, self-responsibility, and self-reliance to counter the excesses of the market and its materialistic temptations. On the other side, New Left liberals would continue to emphasize the need for a safety net to catch the human and social fallout from disruptive new technologies and globalization.

The neoliberal economy produced economic agility and innovations but failed to sustain the well-paying jobs of the disappearing manufacturing sector. Factory automation reduced the demand for labor and offshoring expanded its supply, both putting downward pressure on wages. At the same time, labor unions came under a withering attack by politicians and corporations that weakened their bargaining power. Absent any countervailing force, cutting labor costs became business orthodoxy, and wages inevitably stagnated. Without a coordinated policy to upgrade workers' skills, the rate of deindustrialization was too fast for a smooth transition to the new economy. Workers without a college education were hit the hardest by the new direction of the economy.

Against the background of industry deregulation and financial liberalization, the conservative expectation that self-restraint and responsibility would be effective antidotes to market excesses proved to be another idealized construct of economic behavior, just like that of the rational individual. The 2007-08 housing crisis proved that the lure of easy credit and material gratification was too powerful for individuals to resist. Liberals also failed to put in place an effective safety net. Without control of Congress, Bill Clinton was forced to adopt welfare policies that worsened the terms of assistance to the poor. Open-border and free-trade policies, exemplified by the North American Free Trade Agreement, the World Trade Organization, and the economic opening to China, proved, without the simultaneous adoption of pro-labor policies, detrimental to working-class Americans.

By the turn of the century, the unravelling of the pre-1980 social contract had started to erode important dimensions of human development in the U.S. Marriage and birth rates declined, out-of-wedlock births rose, and deaths from drug and alcohol abuse, along with suicides, reached unprecedented levels. Views on important policy issues also shifted toward more self-centered choices under the ideological weight of neoliberalism. Tax cuts acquired legitimacy at the expense of welfare programs, which became synonymous with individual failure. Inordinate income and wealth inequality were accepted as signs of meritocracy. Nonetheless, fiscal probity and the "less government is good" doctrine did not stop Republicans and Democrats from saving financial behemoths and the auto industry from the self-inflicted wreckage of the Great Recession. At the same time, millions of households were left unprotected from home foreclosures and financial ruin.

Forty years of neoliberalism left us with some useful lessons. A series of market and corporate crises, and especially the Great Recession, proved once more that markets cannot self-correct without state intervention and regulation. Self-restraint and self-reliance are not resilient enough in the presence of tectonic shifts in the economic order. Unstable and unrewarding jobs undermine social cohesion and cultural traditions. A sharp rise in inequality and a loss of faith in the prospect of upward mobility erode the commitment to democratic principles of governance. In the end, the neoliberal order met its nemesis: populism.

The signs that the neoliberal order is in retreat are all around us. Globalization is under attack. International trade treaties are no longer popular. Industrial policies, once anathema to liberal economics, are being adopted by Republicans and Democrats alike. Cosmopolitanism and elitism are vilified by working-class Americans. Recent surveys show that majorities of Americans across parties are dissatisfied and anxious about the country's direction.

So, what is next? The transition is complicated by two new factors. One is the concentration of enormous economic (and political) power in the hands of a few mega-firms, some of which answer to a single person. The other is the rapid advance of Artificial Intelligence. Neither of these developments has given us signs that those behind them believe in open governance and shared prosperity. The challenge for democratic states, as I see it, is to democratize the process by which societies choose the direction of technology and allocate power and responsibility in ways that look after the interests of all citizens, not just those privileged by class, wealth, or corporate power.

Finance In The Neoliberal Order

If I had to summarize the essence of the neoliberal order, the prevailing economic thinking of the past forty years, I would argue that it allowed private, and especially shareholder, profits to become its central organizing principle. In this mission it had no more loyal handmaiden than finance, both as an academic discipline and as a practiced profession.

These thoughts came back to mind after I learned of the passing of Michael Jensen, one of the most influential finance academics and a driving force behind placing finance at the center of the neoliberal order. The other driving force was Eugene Fama, the 2013 Nobel Prize winner in economics. Fama has been the high priest of capital market efficiency, that is, the theory according to which the prices of financial securities, like stocks and bonds, reflect the aggregate information of all buyers and sellers in these markets. As such, prices give reliable signals to firms and investors about the values of firms and other assets. The Great Recession of 2007-2008 put this theory to its toughest test, and many economists found it wanting.

For Jensen, the villains were entrenched executives who, when given the choice between an option that aligned with their own interests and one that enhanced shareholder value, would opt for the former. In 1976, Jensen developed the so-called agency theory, which exposed the conflicting interests of managers and shareholders and suggested that the most effective way to align the interests of the two parties was to turn managers into shareholders. This could be accomplished by granting executives stock and options to buy stock. That way, value maximization would be the common goal of shareholders and executives. By the early 2000s, in the wake of the Enron debacle and other managerial abuses, Jensen had come to regret the corruptive effect of stock options, which he called managerial heroin.

As with many "great" ideas, the law of unintended consequences spoiled expectations. Although agency theory was sound, it became the springboard for executive compensation packages that blew past any measure of moderation. The result was exorbitant compensation packages, which remain central to our current discontent with inequality. In the 1960s, executive compensation averaged around 20 times the average worker's salary. By 2000 it had reached its highest level, at 366 times. Was there any credible evidence that the new industry captains were that much superior to their predecessors? Hardly. In the earlier decades the American economy had thrived, and the productivity gains were shared fairly between profits and wages. More corrosive, though, was the moral hazard problem the new compensation schemes created for executives. If their wealth depended on the value of the firm's shares, why not attempt to boost that value by unethical, even unlawful, means? Misinformation, lack of transparency, and the pursuit of irrational risks could all be employed to boost stock prices and executive wealth. The Enron and similar cases around 2000 and the housing crisis of 2007-2008 bore the fingerprints of such moral failings.

Creating corporate efficiency by aligning the interests of executives and shareholders was not sufficient for value maximization. What about firms that continued to operate under inefficient managers? Wouldn't it be better if these firms were put under the control of more successful executives? Michael Jensen was again influential in shaping and promoting the theory of corporate control and its business applications. Corporate takeovers, facilitated by investment bankers and funded by private equity firms, hedge funds, and the infamous junk bond market, became the tool that held the promise of new value creation. Hundreds if not thousands of finance research papers provided the empirical justification, amassing evidence that this type of creative destruction did indeed create new value.

Finance practitioners and academics started to view firm balance sheets as if they were made of Legos. Move these pieces here and there, spin off or sell some businesses, go offshore, take on more debt, pay hefty dividends: this was the new language of finance. The problem with these restructurings was not that they did not produce profits, at least in the short run. The problem was that they were evaluated with only one beneficiary in mind, namely the shareholders. No wonder, therefore, that the empirical studies found supportive evidence.

What was left out, though, were the negative consequences for other stakeholders: the laid-off workers, the hollowed-out towns, the decayed social fabric of once-bustling industrial hubs, the deaths of despair from alcohol, opioids, and suicide. From the 1950s to the 1970s, under the bargaining power of labor unions, corporate executives had accepted a social contract of reasonably shared prosperity. The neoliberal order would have none of that. Markets were supposed to operate with as few rules as possible, and everyone was supposed to fend for himself or herself. Entrepreneurs, the so-called producers, were responsible for organizing businesses for maximum profit, and workers had to retool themselves and follow businesses to their new, more profitable locations. Of course, that was impossible when the new locations were in Mexico, India, China, or Southeast Asia.

The question is why finance academics followed this narrow perspective. By the early 1980s, finance professors had soured on managerial power, having concluded that post-World War II executives were sacrificing value to build corporate empires and accommodate generous union contracts. For conservative academics the choice was easy: economic efficiency had to be restored, and unfettered markets were the best instrument to that end. Liberal finance academics simply rode the wave of the New Left (a term found in Gary Gerstle's The Rise and Fall of the Neoliberal Order), which also counted on markets and entrepreneurship to set creativity free for the new digital age. In addition, creating a global economy and lifting all boats around the world was a commendable economic project.

With hindsight, we can contemplate the flawed consequences of putting so much faith in capital markets and in the ability of managers to strike a fair balance between the private and the common good. The reality, though, is that we all followed in the intellectual footsteps of Fama and Jensen and the mantra of shareholder value maximization. I am not sure how today's finance academics think about the social responsibility of their craft, but I hope they have learned what we missed.

Anthropocene Epoch: Are We Ready for It?

The anxiously anticipated decision from the International Commission on Stratigraphy is finally in. We are not there yet. The Anthropocene epoch is not yet upon us. It was supposed to be adopted as a new epoch marked by the explosion of the atomic bomb in 1945, although some set its birthday at the explosion of the hydrogen bomb in 1952. The Anthropocene epoch would set us apart from the Holocene epoch, which began at the end of the last ice age and in the wake of the first Agricultural Revolution some 11,000 years ago.

This whole controversy around the validity of the Anthropocene epoch made me curious, so I set out to educate myself about our geological timeline. First, some definitions. Anthropocene means a new (Greek kainos) epoch in Earth's history shaped by humans (Gr. anthropos). The name of the preceding epoch, Holocene, means a wholly (holos) new epoch, marked by the first serious impact of humans on nature through agriculture and the domestication of animals.

Right now, we are in the Meghalayan age (marked by an extended drought 4,200 years ago) within the Holocene epoch, of the Quaternary period (beginning about 2 mya, that is, million years ago), of the Cenozoic era (beginning about 65 mya, with the end of the dinosaurs), of the Phanerozoic eon (beginning about 540 mya, with the appearance of complex animals and plants), on planet Earth, born about 4.6 billion years ago.

No matter how interesting the geological timeline is, I find the lineage of humans even more so. Our genus Homo starts with our ancestor H. habilis about 2.8 mya. It is followed by H. erectus 1.7 mya, H. heidelbergensis 0.7 mya, H. sapiens 0.3 mya, and here we are, modern Homo sapiens, for the last 50,000 years or so. Since the human line split from the other primates, a good many other human-like species came and went, including the Neanderthals and Denisovans, whose DNA we still carry in tiny amounts. But Homo sapiens is the one human species that has managed to survive all adversity to this day. No wonder we are full of ourselves.

Now, this succession and extinction of human species along the way reminds me of the myth about the creation of humans in Hesiod’s Works and Days (c. 8th century BCE).  First, the gods created the Golden race (the best of all), followed by the Silver, the Copper (Bronze), the Heroic, and finally the Iron race.   Just like the offspring of Adam and Eve in the Book of Genesis, the Iron race is destined to live with worries and suffering.  I find it interesting that in both Greek mythology and the Hebrew Bible, the better days of humans lie in their earlier, not their later, stages.

That’s not how we modern humans think.  No matter what cards we are dealt, we believe our present world is the best humanity has ever had and that even better days await us in the future.  I suppose this confidence comes from our triumphs in science and technology.  But not everybody agrees with this assessment.  For example, the transition to agriculture has been held responsible for eliminating the egalitarian life of hunter-gatherers and for ushering in class hierarchies, administrative bureaucracies, and the regimentation of work.  

So, is it possible that our own human intelligence will take us down a path toward a post-human species?  As a matter of fact, brilliant minds have been actively speculating about this possibility.  On one side, we have the anti-humanist futurists who believe that we will degrade the environment to the point where it can no longer sustain us, and we will go the way of the Copper race.  Then, once we are gone, the Earth will no longer suffer at our hands.    

On the other side, we have the Transhumanists, who believe that by harnessing the power of AI humans will find ways to solve their environmental problems by becoming all mind and data and no physical matter.  By uploading our minds onto digital platforms, our environmental impact will come to a halt, and we’ll live peaceful and prosperous lives like the Golden race, or Adam and Eve before their fall from grace.  Sadly, neither school of speculation envisions a human future in which a physical body and an ethereal mind interact to produce the human experience of our present state.

So, I asked myself what name we could give the next species if the Transhumanist prediction came to pass.  I came up with Homo Artificialensis, before a Google search revealed that the next human (?) species had already been named Homo Artificialis. 

It is really worth asking what kind of species contemplates its own extinction, or its transformation into an artificial entity, as the only solution to preserve what is left of nature.  Both views reveal our anxiety about our impact on nature and our resignation that we cannot stop moving, in the words of one thinker, “from living with what nature gives us to living with what we want to have from nature.”

The more we want to bend nature so that it gives us what we want, however, the more we set ourselves outside nature.  But this is not a way of living or thinking shared by all humankind.  Most of humanity has lived, and continues to live, within limits set by nature, and thus bears very little responsibility for the environmental damage we experience.  That is what the climate researcher Stephen Lezak correctly points out (NYT, 3/26/24).  And he adds that, as a group of scientists noted in a statement published in Nature Ecology & Evolution, “our impacts have less to do with being human and more to do with ways of being human.”

That’s an important point to heed.  It implies that we have choices about how we wish to live as humans, and that we are not locked into a path entirely determined from outside ourselves.  Which brings us back to the proposed reason for naming our epoch the Anthropocene: the explosion of the atomic bomb.  Did it come about by happenstance?  What if the terms imposed on the Germans in the Treaty of Versailles had not been so onerous as to breed humiliation and misery?   What if economic policymakers had listened to voices like that of John Maynard Keynes on how to address the threats of inflation and unemployment?  What if, in the absence of these adverse consequences of the peace, there had been no World War II and no race to develop nuclear weapons? 

Resorting to historical “ifs” may sound naive.  But historical retrospection can also inspire cautious reflection as we choose our steps into the future.  That is, after all, the value of history.

*mya means million years ago.

Poverty As An Economic Policy Failure

Imagine if we looked at poverty not just as a personal failure or the result of bad luck, but as a sign of malfunction in the economic system, the same way we take a fever as a sign of illness.  What if we scrutinized the economic system for its failure to reduce poverty the way we scrutinize it for not yielding higher growth or higher employment? 

These are the questions you take away from reading Matthew Desmond’s book Poverty, By America, which I introduced in my last post on this blog.  As the title implies, Desmond argues that poverty in America has some special features: the way our economy and public policies work makes poverty endemic to our socioeconomic system.

Various statistics on poverty in America paint a very inconvenient picture.  Before covid, in 2019, the OECD ranked the U.S. worst in poverty among 26 developed nations.  So, Desmond asks why this is happening in the richest nation on earth.  He builds a strong case that poverty in America is not accidental, not due to laziness, and not born of reliance on welfare, but rather the product of an economy that disadvantages the poor, an ill-designed safety net, and limited opportunities to avoid or exit poverty.

There are many reasons a person can fall into poverty: growing up in a poor and/or dysfunctional family; dropping out of school; having encounters with the law that escalate into greater economic and social displacement; being unable to meet medical bills.  Worse, though, is what keeps poor people trapped in poverty: low wages, no access to affordable housing, unaffordable child care.  The question, therefore, is how we arrange things so that individuals can either be caught before they fall into poverty or be empowered to jettison themselves out of its orbit. 

Desmond identifies three market sectors that work against the interests of people with limited means and push them further into poverty.  One is housing, where insufficient supply imposes higher-than-fair rents on the poor.  Poor people have to accept high rents for poorly maintained houses in predominantly depressed areas.  Besides spending an inordinate share of their income on rent, poor people are also trapped in communities with limited opportunities for socioeconomic development and advancement. 

The second economic sector that can contribute to poverty is the labor market.  The persistent stagnation of wages since the 1970s is a well-known part of this story; a recent Brookings Institution study found that 44% of working-age American workers earn low wages.  The frozen federal minimum wage is one reason.  The dramatic shrinkage in the number of unionized workers has also deprived workers of bargaining power in setting living wages.  In addition, the nature of employment has changed to the disadvantage of workers: the share of full-time jobs with full benefits has been shrinking in favor of part-time and temporary gig jobs that shift the burden of health insurance costs and saving for retirement from employers to workers.  Ironically, Desmond notes, the welfare programs the government offers low-wage people work as a complement to the low wages paid by many corporations.  No wonder large corporations support welfare programs. 

Finally, poor people are severely underbanked and left to manage their finances through payday lending and check-cashing outlets.  Even when poor people transact with banks, their perennial inability to cover checks and pay debts on time exposes them to overdraft and late-payment fees.  Desmond estimates that poor Americans spend $61 million a day on check-cashing and credit-related fees.  With little access to credit, their buying power is limited, and so are their opportunities to better their lives.

The overall result is that those among us who earn the lowest wages are compelled to pay the highest prices.  Very aptly, in this connection, Desmond quotes James Baldwin, who remarked on “how expensive it is to be poor.”

Our lack of success in the war on poverty persists despite the fact that funding for welfare and safety-net programs has expanded significantly over time.  The reason is that, unlike in other Western countries, a great deal of the funds nominally earmarked to fight poverty is diverted to other uses.  Most responsible for this are states with a tradition of aversion to supporting poor people.  A good example is the refusal to expand Medicaid to provide health insurance to low-income people.

Further pushing poverty to the fringes of economic and public policy are prejudices built on myths divorced from factual reality.  Here are some of the myths Desmond debunks.  Poverty is deserved because poor people are lazy or have a weak work ethic.  Both claims are wrong: you can work full-time and still be poor in America.  Poor people spend welfare funds on non-essential goods like tobacco and alcohol.  Wrong: better-off people spend twice as much on alcohol as poor people do.  Poor people get trapped in welfare programs and stop seeking work.  The evidence shows otherwise: poor people want better lives like the rest of us and grab opportunities to become independent of government assistance.  The poor like to live on the dole.  Also wrong: Desmond shows that the poor are less successful at utilizing government support programs than middle- and upper-class families.  In other words, poor Americans have lower take-up rates for public benefits than well-off Americans.

There is also a widespread belief that the poor are a burden to society.  Yet when we account for government revenue lost to mortgage-interest tax deductions, the untaxed health insurance benefits employees get from employers, various tax breaks and loopholes, and a host of other government support programs, the better-off classes cost the government more than the poor.  Thus, in 2018, a middle-class family received $7,100 more in overall government aid than it paid in federal taxes.  More recently, the bottom 20% of the income distribution received $25,733 in government benefits, but the top 20% received even more: $35,363.  If anything, we are all on the dole.  We can bemoan the widening budget deficits, but we cannot blame the poor. 

If we are serious about reducing or, even better, abolishing poverty, we need to treat it as part of a comprehensive economic policy.  Poverty is not inconsequential to the health of a society.  It breeds less educated citizens, less skilled workers, broken households, underdeveloped children, unhealthy social habits, and crime.  Above all, it deprives people of their dignity and pride.  Poverty is a negative externality of the economic system that touches all of us.

It also steals the potential for a better society.  Desmond asks: “How many artists and poets has poverty denied us?  How many nurses and engineers and scientists?  Think of how much more vibrant and forward-moving our country would be?”