Friday, December 20, 2013


by Charles Snow

A reader responded to my post on Great Business Books by asking, “Are there any great books on business ethics? Or is that term an oxymoron?”

Yes, I've heard business ethics referred to as an oxymoron, along with jumbo shrimp, military intelligence, and British cuisine. But, oxymorons aside, behaving ethically in business is hugely important. Just look at what happened in the U.S. before the financial crisis in 2008 and the Great Recession that followed when some businesspeople behaved unethically, illegally, and immorally -- often at the same time!

Your comment caused me to chat with a colleague who does research on business ethics as well as conduct some armchair research of my own. I learned that "Ethics books have never been well received in the marketplace," according to HarperBusiness Executive Editor Dave Conti. That probably explains why you couldn't think of a "great" business ethics book. Therefore, given your particular objectives, you might try reading one or more of the following books:

1. Dan Ariely, The (Honest) Truth About Dishonesty (HarperCollins, 2012). Ariely is a behavioral economist and is considered to be one of the best experimentalists in the social sciences. This is an evidence-based book that describes how we all cheat and lie -- but only up to the point where we still feel good about ourselves.

2. Linda Trevino and Katherine Nelson, Managing Business Ethics: Straight Talk About How to Do It Right (John Wiley & Sons, 2011). These authors, both business school professors, have done a lot of research on codes of ethics. You can learn about the experiences of various organizations that have formulated and used ethical codes.

3. Joseph L. Badaracco, Jr., Leading Quietly: An Unorthodox Guide to Doing the Right Thing (Harvard Business School Press, 2002). Badaracco, another business school professor, aims his book at middle managers.


By Ronald Fox
On December 4th, American news outlets reported the results of the Organisation for Economic Co-operation and Development (OECD)-administered 2012 Program for International Student Assessment (PISA) test. Once again it showed American teens to be lagging in global education rankings. Our students scored below average in math and roughly average in reading and science compared with teens in the 65 nations that participated in the test. Among the 34 OECD countries, which include most Western industrial countries plus Japan and Korea, the U.S. ranked 26th in math, slipped to 21st in science (from 17th in 2009), and slipped to 17th in reading (from 14th in 2009). Already the finger-pointing has begun.

The PISA test is far different from the standardized tests students take in the U.S. Administered every three years to 15-year-olds, the PISA test is designed to measure a student’s ability to think critically and solve problems in math, reading and science (the program was first administered in 2000). PISA demands fluency in problem solving and the ability to communicate, skills that its framers believe are necessary for working, thinking and adapting to the world beyond school, a world choked with information and subject to rapid economic change. I find the PISA test a valuable instrument for cross-national comparisons of what teens know.

I don’t claim to know why our students test so poorly internationally, nor do I want to get into a debate on the subject; I’ll leave that to the finger-pointers, who will undoubtedly have their say in assigning blame for our latest educational black mark. What I want to do in this post is talk about what one high-scoring country does differently from the U.S. in educating its kids. That country is Finland. Up until a few decades ago, Finland was a largely illiterate farming and logging nation with failing schools. A concerted effort to turn its education system around, however, has resulted in a remarkable success story. Finland now regularly ranks near the top in PISA scores in math, science and reading. What changes did Finland make? How is that country’s education philosophy different from ours?

Wednesday, December 18, 2013

Response to a Subscriber Regarding my Goodbye to the A-10 Posting

Subscriber Larry Slayen wrote: “Why has nothing changed? You could have written this article 10, 20, 30 and 40 years ago. The military continues to overspend at the expense of the taxpayers and the troops.”
You’re exactly right, Larry. The Pentagon’s addiction to gold plating, which often makes weapons too expensive to purchase in planned numbers, goes back a long time. The weapons acquisition system is structured to virtually guarantee cost overruns, biased testing, poor combat readiness, and a great deal of waste (see my 8/22 post on why our weapons cost so much). Seymour Melman wrote about structural problems in our procurement system back in 1970 in his book Pentagon Capitalism. He followed this work up with a broader treatment of the problem in The Permanent War Economy (1985). The addiction to excessively high technology was covered in a 1982 book by Mary Kaldor called The Baroque Arsenal. Over the years, many works on procurement problems have been published. So, yes, there is a long history of concern about the kind of practices I wrote about.
What is most remarkable is that nothing seems to change significantly. The military-industrial-Congressional complex, which Gordon Adams referred to as “The Iron Triangle” (The Iron Triangle: The Politics of Defense Contracting), is a powerful symbiotic alignment that is self-promoting and impervious to reform. Politicians on key military appropriations committees, and other Members who just love to take military contracts back to their home districts (the bigger the better), are all in on the procurement charade. Few politicians have had the courage to take the system on. On this there is bipartisan consensus.



By Ronald Fox

In a previous post, I focused on the case of the Air Force’s new super-airplane, the F-35, as yet one more example of our military’s persistent tendency to understate the cost and overstate the performance of major weapons systems. Specifically, I questioned the veracity of F-35 Program Manager Lt. General Christopher Bogdan’s boast that the unit cost of the F-35 will settle in at about $85 million per plane. Well, with the cost per plane now over $160 million and climbing, that prediction appears pure fantasy, if not outright duplicity. But that’s not the worst of it. It appears that to free up funds to purchase the overpriced F-35s, the Air Force will phase out the A-10 Warthog, the best airplane ever made for the task of close air support (CAS) for combat troops on the ground. In the end, the costly F-35 will degrade our CAS capability, making our soldiers more vulnerable on the battlefield. This is yet another example of the military’s addiction to high technology trumping good old common sense.

Monday, December 16, 2013


by Charles Snow

John Jordan, a colleague of mine at Penn State, writes a periodic newsletter called Early Indicators. It’s about the latest techie stuff and trends, and I find it quite interesting. John’s latest newsletter, however, was entitled What Makes a Great Business Book?

In this post, I’d like to summarize John’s ideas on great business books and then conclude with my own. John’s thought process was kicked off when he finished reading Brad Stone’s book on Jeff Bezos and Amazon entitled The Everything Store. Although he enjoyed the book, he concluded that it lacked “greatness.” After making his own list of great business books (shown below), he began to eliminate the categories of business books that, in his opinion, were not worthy of the label “great.” These are:
  • “Self-help” books such as Stephen Covey’s The Seven Habits of Highly Effective People.
  • “Patterns of success” books such as Tom Peters and Robert Waterman’s In Search of Excellence or Jim Collins’ Good to Great. (John points out that successful companies today are often unsuccessful companies tomorrow.)
  • “Strategy” books such as Michael Porter’s Competitive Strategy, Hamel and Prahalad’s Competing for the Future, or Kim and Mauborgne’s Blue Ocean Strategy. (These seem like “exercises in hindsight” rather than “scientific discovery.”)
  • “First-person tales” (too numerous to mention). 

Tuesday, December 10, 2013


By Ronald Fox

I’ve had countless arguments with conservative opponents of the Affordable Care Act (i.e., Obamacare). I’ve done so even though I’m not a big fan of the Act. It is extremely complex; key provisions are vague, which will probably lead to inconsistent and arbitrary enforcement; and, worst of all, it contains no public option, making it unlikely that escalating healthcare costs can be controlled. These legitimate concerns, however, are rarely topics of discussion among the conservative ACA critics I’ve encountered.

Almost without exception, I’ve found these people to be uninformed, misinformed, and downright duplicitous debaters. It’s the same old ideologically dogmatic, hyper-partisan story: ignore facts and accept as truth only what conforms to one’s ideological beliefs and prejudices. I don’t think, however, that disagreement over facts lies at the heart of the current Obamacare divide; rather, the bitter divide seems to me to follow from differing philosophies about whether health is a basic human right and whether public authorities have an obligation to ensure realization of this right.

Friday, December 6, 2013


By Ronald Fox 

Numerous polls show that President Obama’s job approval rating has dropped to a new low. Over 50% of Americans now disapprove of his job performance. The recent decline in public approval appears largely a result of criticism over the ACA rollout debacle, but also reflects bipartisan revulsion over revelations about the mind-boggling extensiveness of NSA spying. Dissatisfaction with Obama encompasses the full political spectrum, from right to left. I’ve been thinking about what Obama’s declining popularity portends for Democrats in 2014 and his legacy as president.

The disastrous rollout reinforced dislike of the ACA among Republicans, but also antagonized many independents and pulled some Democrats away from supporting the plan. More importantly, perhaps, it diverted attention from the government shutdown gambit that alienated many Americans and gave a bump to the President and the Democrats. NSA spying revelations have angered liberals and conservatives alike. Those on the left are particularly vexed in light of Obama’s 2008 campaign rhetoric to conduct the war on terror with transparency and due respect for civil liberties. He promised to do things differently from his predecessor. Little did we know that this difference would mean more spying, less transparency, and more drone attacks. This comes on top of Obama’s weak response to conservative economic challenges, and his wholesale embracing of the tax cutting/austerity framework, which has also bothered many on the left.

Sunday, December 1, 2013


by Charles Snow 

Normally, we think of firms competing against one another; however, a growing number of successful companies are working together to find solutions to common challenges. For instance, Innocentive uses a global network of millions of problem solvers to help companies overcome R&D business challenges that they can’t overcome themselves. Several member firms have together developed an innovative computer server application for the Asian Art Museum in San Francisco, California. Along with major research universities and the national governments of six countries, IBM is using its “collaboratories” to develop solutions to such complex problems as electricity distribution on the island of Malta and traffic congestion in Moscow, Russia. Lego helps entrepreneurs start their own businesses by providing a toolkit featuring its famous building blocks. The My M&M’s web site enables chocolate lovers to design their own candies just as NIKEiD allows sports enthusiasts to design their own shoes.

All of these firms – and many more around the world – are using some form of collaborative innovation to expand and improve their business. In knowledge-intensive industries like biotechnology or computers, the ability to collaborate is a must because the knowledge base required to innovate is complex, growing, and widely diffused. But even in less dynamic industries, collaboration can be useful to firms for a variety of reasons, including lowering the costs of product development, starting new businesses, retaining customers, and building brand equity.

Saturday, November 16, 2013


By Charles Snow

In his book The (Mis)management of America, Inc., Lawrence G. Hrebiniak uses the metaphor of a large corporation and applies it to the U.S. economy. He examines how top management -- the White House and Congress -- runs the country and artfully describes a "perfect storm" of irresponsible top executives managing a corporation whose customers (U.S. citizens) are largely disengaged and misinformed. The consequences are potentially catastrophic. Although the book is five years old, its thesis is still valid -- perhaps even more so given the events that have unfolded over the past few years.

From a management perspective, consider how closely the U.S. resembles a large corporation. America, Inc. is a $3.5 trillion business. It is diversified, operating businesses in defense, health care, banking, science, education, and many other industries. It hires people, fires people, has retirement programs, makes money, loses money, and so on. The company called America, Inc. is larger, more diversified, and more powerful than Wal-Mart, IBM, Exxon Mobil or any other company in the world.

Wednesday, November 13, 2013


By Ronald Fox

A reader, responding to my posting on economic inequality and institutional corruption, sent me a note asking whether I thought expanding civil and criminal probes into the financial dealings of Wall Street firms represent a turning point in holding large corporations accountable for criminal misconduct. These investigations, which have touched on such misdeeds as mortgage and derivative fraud, interbank interest rate manipulation, energy trading practices, insider trading, rigging global interest and currency rates, and passing out bribes to secure business, have already resulted in some record fines. The federal government promises there’s more to come. Globally, the cost to banks of cleaning up their misdeeds is expected to soar to over $125 billion. I’m far from an expert on human behavior, business practices, and complex legal maneuverings, so in responding to the reader all I can offer is an educated opinion. I will draw on practical wisdom to argue that I don’t believe fines alone will be enough to deter Wall Street malfeasance.

The current legal troubles of JP Morgan Chase (henceforth, JPM), long considered one of our strongest and best managed financial institutions, are illustrative of recent government efforts to investigate and punish financial institution malfeasance. JPM is currently facing dozens of federal and state legal probes. As reported in the New York Times, a couple of foreign governments have also launched investigations. The company has already paid over $5 billion to resolve federal claims that it misled Fannie Mae and Freddie Mac about risky home loans and securities they bought before the housing market collapsed, nearly $1 billion for failure to oversee trading that led to a $6 billion loss in its “London Whale” caper, and $410 million in a settlement with the Federal Energy Regulatory Commission (FERC) for its alleged manipulation of California’s electricity market from 2010 to 2012. Among other legal probes, Morgan is presently being investigated for its connection to the Bernard Madoff Ponzi scheme and is in the final stages of negotiations with the U.S. Justice Department on a multi-billion dollar settlement for activities related to the financial crisis. Our “best managed financial institution” now appears to be possibly our biggest corporate crook.
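A rough tally of the settlements just listed helps convey the scale. The sketch below is my own illustration, not an official accounting; the dollar figures approximate the amounts described above:

```python
# Approximate JPM settlement figures, as described above (U.S. dollars).
settlements_usd = {
    "Fannie Mae / Freddie Mac mortgage claims": 5.0e9,  # "over $5 billion"
    "London Whale trading-oversight fines": 0.9e9,      # "nearly $1 billion"
    "FERC electricity-market settlement": 410e6,        # "$410 million"
}

# Sum the disclosed payments and report the total in billions.
total = sum(settlements_usd.values())
print(f"Disclosed settlements so far: ${total / 1e9:.2f} billion")
```

Even before any Justice Department deal, these disclosed payments alone come to roughly $6.3 billion.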

Thursday, November 7, 2013


By Charles Snow

When I was a third grader in San Diego in the 1950s, my teacher was Leta Lipp. She was a wonderful teacher who, among other things, taught us how to think about and interact with the natural environment. Ms. Lipp loved the American Indian way of life, and every summer she would spend some of her time visiting the Hopi reservation in Arizona. When she returned to teach in the fall, she would introduce American Indian philosophies and practices to her students in the form of stories and projects (e.g., carving kachina dolls out of balsa wood). Ms. Lipp taught us that the relationship between humans and Mother Nature was sacred. The earth’s God-given resources are precious and should be respected. When someone uses a resource, such as removing a bucket of river water for drinking or killing a buffalo for food, he should not be wasteful. The environment should not be despoiled in any way and should be preserved for future generations. Essentially, Ms. Lipp taught us to be in awe of Mother Nature’s wonders and to do everything we could to preserve them.

If she were alive today, I believe Ms. Lipp would be pleased with the attention the environment has been receiving lately. I’m certain she would be appalled at how we’ve let the environment degrade, and probably she would be angry at the U.S. for being one of the world’s biggest polluters. A few years ago, the scientists of the U.N.’s Intergovernmental Panel on Climate Change shared a Nobel Peace Prize for summarizing the research done to date and concluding that climate change is real and that it is largely due to the actions of mankind (using fossil fuels for energy, emitting carbon dioxide into the air from automobiles, etc.). A recently released U.N. report, which was held up as scientists agonized over its wording, stated these same conclusions even more forcefully. Just last month, the head of the Organisation for Economic Co-operation and Development sought to make climate change a higher priority on the global agenda, urging the world to eliminate all emissions from burning fossil fuels sometime in the second half of this century.

Wednesday, October 30, 2013


By Ronald Fox

The U.S.-Russian diplomatic initiative to induce Bashar al-Assad to abandon chemical weapons and join the Chemical Weapons Convention (CWC), which voided Washington’s threat to use military force, inspired me to reflect on the legacy of the Iraq and Afghanistan wars. Does the U.S. opting for diplomacy rather than force in Syria mean that Washington learned valuable lessons from its policy debacles in Iraq and Afghanistan? Could it be that the Syrian crisis represents a turning point in the U.S. proclivity to project global power through the use of military force? Unfortunately, there is no reason to believe, Syria notwithstanding, that our recent experiences in the Middle East will produce any significant moderation in American militarism. Just as the Vietnam War failed to produce any lasting dovish tendencies in U.S. security policy, neither will the wars in Iraq and Afghanistan.

There are, to be sure, a number of positives to draw from the apparently successful diplomatic effort in Syria. We worked closely with the Russians and through the U.N., rather than in our usual unilateral way; President Obama went to Congress for the authority to order military strikes, rather than acting imperially, as all recent Presidents have done; and the American public sent a clear message that it was in no mood for yet another military adventure in the Middle East. A deeper reflection, however, leads me to believe these positive developments will be fleeting. There is little evidence to conclude that President Obama and the national security elite have lost any appetite for using force first, rather than as a last resort, as the operational code in our approach to peace and security. Expect Washington to continue to pursue global dominance in the name of peace, framed in the lofty ideals of promoting democracy, civil society, economic development, and rebuilding failed states.

Thursday, October 17, 2013


By Ronald Fox

In my essay on economic inequality and the cheating culture, I admonished our governmental regulatory system for aiding and abetting the growing prevalence of institutional corruption and individual cheating in America. I would be remiss if I did not also include the changing face of political journalism in my indictment. Political journalism in the United States has suffered a serious decline in the last three decades. This is primarily the result of greater concentration of media ownership in the hands of large corporations, which have adopted market-driven business models mandating, among other things, the downsizing of news divisions, which they no longer consider profitable. A number of once great newspapers and news magazines have even ended daily publication. The result has been less coverage of political issues and elections and a narrower definition of newsworthiness, one tied to the priorities of the owning corporation’s bottom line. These priorities no longer include watching out for, and going after, institutional corruption and wrong-doing.

The corporate media have become full-fledged members of the power structure. Rather than speaking truth to power, as was the press tradition for most of American history, today they largely echo the priorities and policies of the economic elite. Free-market theories and prescriptions are the corporate media’s guiding light, which in practice leads them to take a generally conservative, pro-business slant on economic and fiscal issues. They use their financial and communication power to consolidate the economy in the service of the economic elite. This makes off-limits and unquestioned those areas that people in power agree should be left alone. As business conglomerates with vast arrays of diverse holdings, the media giants tend to avoid presenting any bad news about topics near and dear to their own financial hearts (they certainly wouldn’t want to bite the hand that feeds them).

Friday, October 11, 2013


By Ronald Fox

The notion that each American has the right to pursue happiness and the freedom to strive for a better life through hard work and fair aspiration lies at the heart of the “American Dream.” This idea, which drove the hopes and aspirations of Americans for most of the country’s history, began to be transformed in the late 1970s, when wealth began to steadily ascend to the top of the economic hierarchy. According to David Callahan, in his book The Cheating Culture: Why More Americans are Doing Wrong to Get Ahead, the soaring income gap has given rise to a fundamental value shift in America: the time-honored commitment to community, self-reliance, fair play, truthfulness, compassion for the less fortunate, and rule-following has morphed into selfishness, hedonism, greed, jealousy, and an excessive preoccupation with materialism. It has also inspired a growing number of American institutions as well as individual citizens to cut corners to get ahead.

The growing income gap has divided Americans and weakened our social fabric -- undermining the notion that we’re all in this together and that no person is above the law. The lavish lifestyles of the super rich, ostentatiously displayed every day on TV and in movies and magazines, have transformed perceptions of what it means to live the good life. Instead of aspiring to a standard of living relative to one’s peer group, the proverbial Joneses, a growing number of Americans now aspire to emulate the lifestyles of the rich and famous. Everyone wants not just a better life, but one filled with jet-setter luxuries. In an earlier time, when there was less wealth at the top, Americans aspired to get a “fair share”; now they seem to want it all, whatever it takes. America is now engulfed in a greed-driven money culture that has reshaped the moral climate of corporate America as well as the personal ethics of American citizens.

Thursday, September 26, 2013


By Ronald Fox

Part II: America’s Diminished Credibility and the Decline of Its Global Power

Foremost in the current debate over how the U.S. should respond, if at all, to the probable chemical weapons attack by the Bashar al-Assad regime has been the question of U.S. credibility. It is said that America’s credibility is on the line, a matter made more acute by President Obama’s poorly thought-out drawing of a game-changing red line on Syria’s use of chemical weapons. This idea suggests a connection between our national credibility and the use of force to back up a threat; only thus will bad guys be deterred. For those disposed to this line of thinking, it is the use of overwhelming force, rather than negotiation aimed at a peaceful settlement, that best serves American national interests. America does indeed have a credibility problem, but it isn’t because we haven’t been sufficiently tough in deploying force; rather, just the opposite: Washington’s credibility problem stems precisely from its heavy military response in the war on terror.

Wednesday, September 25, 2013


The Nazca lines are a series of ancient geoglyphs in the Nazca Desert in southern Peru. The hundreds of individual figures range in complexity from simple lines to stylized hummingbirds, spiders, fish, and lizards. Archeologists, ethnologists, and anthropologists have studied the ancient Nazca culture to try to determine the purpose of the lines and figures, but no single theory or explanation has come to be widely accepted. One interesting hypothesis is that the construction of the lines kept the Nazca people busy. In addition to farming and hunting, ongoing construction of the lines gave the community a sense of purpose and structure so that the people could move forward in a productive way and not get bogged down in competition and conflict.

The American Dream may serve a societal purpose similar to that of the Nazca lines. Numerous opinion polls conducted since the 1980s indicate that the majority of Americans believe that working hard is the most important element of getting ahead. These same polls also indicate that achieving the dream through fair means is becoming increasingly difficult for future generations. The main cause of the disappearing American Dream is economic inequality. The product of a variety of decisions and actions, income inequality has grown at an alarming rate over the last several decades. According to the Congressional Budget Office October 2011 report “Trends in the Distribution of Household Income Between 1979 and 2007,” overall real average (after-tax) household income grew 62% over this 28-year period. However, for the top 1% of earners, household income grew 275%, while for the bottom 20% of earners it grew only 18%. Today, the 400 wealthiest Americans have more wealth than the bottom 150 million Americans combined.
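The CBO growth rates above can be turned into a single gap measure. The sketch below is my own arithmetic on the report's figures (the variable names are mine); because the ratio of growth factors is independent of starting incomes, the result holds whatever the 1979 baseline levels were:

```python
# Real after-tax household income growth, 1979-2007, per the CBO report.
growth = {"top_1pct": 2.75, "overall": 0.62, "bottom_20pct": 0.18}

def ratio_widening(g_top, g_bottom):
    """Factor by which the top-to-bottom income ratio widened over the period.

    Starting incomes cancel out: (top * (1+g_top)) / (bottom * (1+g_bottom))
    divided by (top / bottom) leaves only the growth factors.
    """
    return (1 + g_top) / (1 + g_bottom)

factor = ratio_widening(growth["top_1pct"], growth["bottom_20pct"])
print(f"Top-1% vs. bottom-20% income ratio widened by a factor of {factor:.2f}")
```

In other words, whatever the top-to-bottom income ratio was in 1979, by 2007 it had more than tripled.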



In the coming days, Phronesis will present a series of pieces on economic inequality in America. We are specifically interested in addressing the question: Does inequality really matter, and if so, in what ways? We will draw on practical wisdom derived from our respective academic training and research as well as years of first-hand experience as American citizens living in an increasingly unequal world. Accordingly, we will strive to offer evidence-based commentaries on the issue of inequality. We hope to stimulate awareness and discussion of this important topic.

As an introduction to the inequality posts to come, let me say something about the main battle line in the inequality debate. Americans have widely varying beliefs and opinions about inequality. Those who embrace a free-market philosophy, usually people of conservative persuasion, tend to see inequality as a natural and healthy feature of an efficiently running economy. Inequality is natural because in a free society with a free market system, individual differences in intelligence, education, talent, ambition, and work ethic will produce varying levels of achievement. Simply put, the best and brightest will earn and accomplish more than those who lack attributes for success. In this view, the recent spike in income inequality was an inevitable result of the globalization of finance and advances in technology in an increasingly complex world.

Free-market advocates cite two reasons why inequality is a natural thing. First, they believe it’s the people at the top, the accumulators of capital, who possess the necessary smarts and wherewithal to invest wisely in productive enterprises. This will stimulate growth, create jobs, and generally enhance national prosperity. Not surprisingly, they tend to oppose government meddling and other obstructions to capital accumulation. The second reason is the so-called goal-gradient phenomenon: the existence of a social hierarchy serves as an incentive for self-improvement. People will strive to climb the ladder of success in order to, as the saying goes, “keep up with the Joneses.”

Other Americans have less faith in the capacity of the free market to promote the general welfare. Liberal in their worldviews, these people see a necessary role for government to intervene in the economy to protect consumers, stimulate growth, smooth out the boom and bust cycles inherent in a capitalist society, and generally ensure fairer economic outcomes for all citizens. This school of thought views inequality, when it reaches extreme proportions, as harmful to society. Extreme inequality confers disproportionate economic and political power on the wealthy, who without effective constraints will prioritize their own interests to the detriment of the public good. On this view, the “trickle-down theory” of economics is just that -- a theory. Without government intervention, the trickle is more likely to be up than down. Those at the top of the economic pyramid will be prone to conspicuous consumption, which will instigate a value shift in American culture toward an excessive preoccupation with money and materialism. Rather than striving to live like one’s peer-group Joneses, money-driven consumers will endeavor to emulate the lifestyles of the rich and famous, a preoccupation that will lead to all sorts of destructive behavior.

It won’t be surprising to those who follow Phronesis that we are sympathetic to the liberal perspective on inequality, propagated most passionately by British economist John Maynard Keynes. Our posts on inequality will reflect this worldview. Nevertheless, the overall objective is to encourage readers to think about the meaning of inequality and its effects on the American economy, culture and political system. Has it shaped your life for better or worse?


By Ronald Fox

America’s most powerful and wealthy elites have always shaped our politics, economy and culture, sometimes for the good and sometimes not so good. The worst behavior of American elites has occurred historically when income has been highly concentrated at the top of the economic hierarchy: around the turn of the 20th century, when corrupt robber barons dominated the American economy; in the last few years leading up to the Great Depression; and from the late 1970s to the present. Elites were at their best when income inequality was only modestly skewed. The period from the end of World War II up to the late 1970s was a golden era in the United States. With the effective tax rate for the highest income earners over 50%, and government spending as a percentage of GNP high, the economy grew, wages multiplied, economic opportunities flourished, and most Americans shared in the steadily growing national prosperity. Economic inequality was the lowest since the Progressive Era.

This period also brought out the very best in elite behavior. The upper class took seriously the responsibility that comes with their power and visibility. They were engaged in community life and committed to public service; they tried to be good citizens. They enjoyed the good life, but often railed against conspicuous consumption. In their money-making ventures, they relied on an innate wisdom and savvy in their investment choices and in choosing the people with whom they worked. They tended to choose people like themselves, with breakaway independence of mind, wisdom, good judgment, empathy, imagination, and, to be sure, intelligence, though this was not the main thing they looked for. Upward mobility in America guaranteed the elite strata would be cross-fertilized with young minds bringing new ideas. Yes, they accumulated great wealth, but their share of the nation’s wealth was only modestly greater than that of the majority of working-class wage earners. To be sure, the elites I’m describing were not all angels, but in comparison to those leading America today, their behavior was exemplary.

Tuesday, September 17, 2013


By Ronald Fox 

The possibility that the United States might intervene militarily yet again in a Middle East country got me to thinking about the legacy of Osama bin Laden. The war of terror he unleashed over a decade ago has caused a sea change in the Middle East and throughout the world. It has also radically changed the America I grew up to care deeply about. I am profoundly concerned about the path my country has taken in pursuing its global war on terror. In our response to the terrorist challenge, the United States has lessened domestic freedoms, compromised democratic values, squandered economic resources, violated the rule of law, helped sow chaos in the Middle East, and diminished the good will and respect we once enjoyed as the undisputed leader of the free world. In this two-part essay, I will discuss the heavy price we Americans are paying for Washington’s global war on terror.

Tuesday, September 3, 2013


By Ronald Fox

It’s that time of year: public school academic test scores are in and are being reported in the news. The Sacramento Bee reported on August 30 that 70% of schools in the Sacramento region saw their Academic Performance Index (API), a composite of student test scores, decline from 2012 to 2013. Curses! Already fingers of blame are being pointed. Teachers blame larger class sizes, reduced school funding, and the inappropriateness of standardized tests for measuring academic performance; principals cite disruptions associated with implementing new California state curriculum standards (called Common Core); district chiefs emphasize that scores have been rising over the last dozen years or so, so perhaps the schools have “topped out” and a slight slip was inevitable; many school “reformers” and politicians insist poor teaching is at fault; and free-market fundamentalists would have us believe it's those damned teachers' unions. What's with all the ado over test scores?

I don’t know who or what's at fault for the API decline, or even if the falling scores, in fact, really mean anything. You see, standardized school tests, as administered in the U.S., are notoriously flawed instruments for measuring student knowledge and learning; they're even worse as an indicator of teaching effectiveness. What I do know is that standardized test scores can have monumental consequences. Falling scores predict trouble for students, teachers, principals, schools, and even states.

Friday, August 30, 2013


By Ronald Fox

Best Years of Our Lives

Occasionally Phronesis will offer film commentaries. Though we are not film critics, sometimes a film with a strong political message may inspire a commentary. Such is the case with the 1946 film, The Best Years of Our Lives. I’ve seen this film many times before, but only after watching it again recently did I come to fully appreciate what a great and enduring film it is. This isn’t just my opinion, as the film won seven Academy Awards, including best picture, and earned an astonishing, for the time, $11 million.

Best Years tells the story of three soldiers returning to the same town from World War II: Captain Fred Derry (played by Dana Andrews), an Army Air Force bombardier; Army sergeant Al Stephenson (played by Fredric March); and Homer Parish (played by Harold Russell), who was in the Navy. Each is gripped with fear and uncertainty about returning to “normalcy.” Fred returns to a beautiful wife (Virginia Mayo) who has been working in clubs while he was away, Al to a loving wife, family, and position in a bank, and Homer to his family and fiancée. Each, however, has been scarred by the war. Fred has recurring nightmares about his bombing missions. Al has taken to drinking, and Homer has lost his hands in an explosion aboard ship and now has two hooks for hands. Their chance meeting after the war turns into a close friendship and a sharing of their respective post-war difficulties.
The film centers on the struggles of Fred, Al and Homer trying to readjust to civilian life. Fred has difficulty finding work because his skills, “killing Japs,” are not relevant to a new workplace environment that now places a premium on education, training, specialized skills, and experience. He can’t please his party-animal wife, who, he finds, has developed a taste for the good life, and for other men.

Al returns to his family and job at the bank, but he is uneasy re-connecting emotionally with his faithful wife (Myrna Loy) and with his son, who doesn’t seem to appreciate the war relics he brought home and pesters him with questions about Hiroshima and atomic energy, which his high school teacher had told him needs to be controlled, “or else.” At the bank, Al discovers that it has adopted a policy of requiring collateral before granting loans, something most of his fellow returning soldiers lack. He is pressured to deny loans to veterans whom he believes are of strong character and can be trusted to repay.

Homer’s fiancée and parents are uneasy with his disfigurement, though they try hard not to show it. All three men are returning to an America they find cold, unwelcoming, and less optimistic about the future. They all feel like misfits, unneeded relics of the past, a point powerfully illustrated in a scene near the end of the movie when Fred strolls through a junkyard of dismantled B-17s, the planes he flew during the war. Like Fred, they had served their purpose, but were now unfit for the new era.

Monday, August 26, 2013

Response To "Anonymous" Regarding Reagan and Contemporary Republicans

By Ronald Fox

I received the following comment from "anonymous" on my posting on Reagan and contemporary Republicans:

“I have heard in the past that it was actually Gorbachev who proposed dismantling all nukes (?) Jeb Bush offered some similar thoughts; stating that neither Reagan nor his father could earn the GOP presidential nomination in today's world. Sad. On the other hand, could a moderate Democrat like Bill Clinton get his party's nod in today's world? I have my doubts.”

Below is my response to what he/she had heard about who proposed dismantling all nukes:

Thanks for your response. I’m not surprised you’ve heard that it was actually Gorbachev who proposed the total elimination of all nuclear weapons. Many people, including those who like Reagan and those who dislike him, continue to believe that when the two leaders met in Reykjavik, Iceland, in October 1986, Gorbachev manipulated Reagan into agreeing to abolish all nuclear weapons. In this telling, the naïve, idealist Reagan was simply no match for the knowledgeable and sophisticated Gorbachev, who led him along like a dog on a leash. It has been said that Reagan didn’t fully understand the magnitude of what he almost agreed to do. The historical record, however, does not support the view that Reagan was an unwitting bystander in the abolition drama, or that Gorbachev was the driving force behind the proposal to eliminate all nuclear weapons. Below is the true story. (For an in-depth account of what occurred in Reykjavik, I strongly recommend Jonathan Schell’s The Seventh Decade.)

Saturday, August 24, 2013


By Ronald Fox

Among the Republican faithful, Ronald Reagan is “The Man”: wise, omniscient, inspiring, infallible, an ideologically pure true believer in conservative principles; the founder of modern conservatism. What Republican candidate for high office wouldn’t want to invoke the Reagan name as his guiding light? Referring to oneself as a Reagan Republican is often all that is needed to establish conservative credentials. But what does it mean to be a Reagan Republican? Would Ronald Reagan himself fit in with today's Republican mainstream?

Thursday, August 22, 2013


By Ronald Fox

My skeptical juices rose when Lt. General Christopher Bogdan, Program Manager for the Air Force and Navy’s F-35 fighter aircraft, proudly announced recently that unit cost for the F-35 “continues to come down” and will likely settle in at about $85 million per plane when in full production. His optimistic prediction of reduced unit cost was echoed by Defense Secretary Chuck Hagel in Congressional testimony. The Navy’s commander of the Naval Air Systems Command, Vice Admiral David Dunaway, chimed in that the F-35 was a “fairly mature air vehicle,” suggesting that most of its bugs had been worked out. The GAO also gave the F-35 (there are differing Air Force and Navy versions) a thumbs-up for progress, as did the DOD in its most recent Selected Acquisition Report (SAR). The cost savings and good performance news are supposed to result from economies of scale in larger production runs as well as the "learning curve" that comes from experience. Isn’t it good to hear that our tax dollars are being well and carefully spent? Or are they?

In a recent report, the Center for Defense Information (CDI) exposes such optimistic assessments as pure fiction. Unit costs doubled from $81 million in 2001 to $161 million in 2012, and are estimated by the CDI to average $219.3 million in 2014 (higher for the Navy version). The boast by General Bogdan that the cost per aircraft will decline to $85 million in 2018 stretches credulity to the limit. The real question, according to the CDI, is how much more the planes will cost than the projected $219+ million for 2014. Nor is there any evidence to conclude that the F-35 is a “mature airplane,” implying that it has been thoroughly tested and is on course to be put into operation in the near future. In fact, the aircraft is less than halfway through its developmental flight testing, which when completed will have assessed only 17% of its capabilities. The more important and rigorous battlefield testing will not start until 2018, meaning that no meaningful appraisal of its performance can be made until after that testing is completed and reported.


By Ronald Fox

I grew up politically, meaning I developed my political consciousness, in the turbulent 1960s. This experience, along with values inculcated in my early family life, left an expansive progressive mark on my political ideology. In those formative years, I frequently stood on principle, or so I positively labeled it; more correctly, I was an impatient idealist who saw any compromise of progressive principles as an ultimate sin. Neither Republicans nor Democrats appealed to me: the former being hopelessly out of touch with, and callously insensitive to, the struggles of poor and minority peoples, and the latter timid and overly willing to compromise progressive values. But compromise they did, enough to form bipartisan coalitions that passed many pieces of landmark legislation on such important issues as civil, voter and labor rights, consumer and environmental protection, and medical care for the poor and aged. Nevertheless, entrenched in my dogmatism, I derided the unholy alliances that produced legislation falling short of my principles and expectations, and took special pleasure in lambasting the Democrats for selling out.

As I grew older, and more cognizant of political realities and possibilities, I became more comfortable with half-loaf compromises. Rather than seeing compromise as evil, a sellout of principle, I began to see it as a necessary condition for effective governing. Such is political maturity, I suppose. Like many others, I took compromise for granted. This is simply what legislators necessarily do, or should do: fight it out on principle and policy, but in the end, find common ground. Oh, how things have changed. The gulf between Republicans and Democrats today is deeper and more rigid than I can remember. Both parties seem to believe that the opposing party is always wrong; facts are irrelevant, or just something to be manipulated for partisan gain. Compromise is a dirty word to today’s extreme partisans. Policy positions aren’t seriously debated; partisan advantage, not problem-solving, is the driving ethos. A discourse that was once relatively civil has become as vehemently adversarial as that of the European parliamentary parties I once found humorous. Governing in America has become hopelessly gridlocked in a sea of partisan vitriol, a gridlock that has strangled the capacity of our government to address urgent matters and undermined public faith and trust in our political institutions. This is a recipe for disaster.


By Ronald Fox

In this second part of my essay on the partisan divide, I will discuss several factors I believe have contributed to America’s extreme polarization, beginning with those most frequently mentioned and ending with what the evidence suggests is the most important causal factor.

It is conventional wisdom, shared among politicians, media sorts, and a large segment of the American public, that the sharp polarization that plagues American politics is a result of gerrymandering. This is when state legislatures, which in most states are responsible for drawing district boundaries after the decennial census, craft districts that guarantee victory for one party or the other. These so-called “safe districts” dominate our electoral landscape. By making districts more homogeneous and less competitive, it is argued, gerrymandering frees candidates to take more extreme positions, pandering to their bases while ignoring moderate and independent voters.

Pandering to one’s ideological base has become mandatory; to do otherwise runs the risk of inspiring a spirited and ideologically “purer” primary challenger. So, it is argued, with competition reduced, gerrymandered districts reward the extreme wings of both parties; moderates are squeezed out and the two parties become more deeply partisan and polarized. In this environment partisanship rigidifies and compromise becomes taboo. This theory sounds so plausible it is no surprise so many embrace it. Unfortunately, it is not supported by evidence. Research by Nolan McCarty and other political scientists has shown that gerrymandering has, at best, had only a small effect on polarization.

Wednesday, August 21, 2013


Most Americans would agree that the U.S. has a problem with guns. I don’t believe I need to cite any statistics to support this statement – the recent horrific shootings in Colorado and Connecticut should be enough for anyone to support a reasoned debate on the issue of gun violence and gun control.
What is the nature of this problem, and how should we as Americans go about solving it? [Full disclosure: I am a gun owner (shotguns). I was in the military where I received rifle training, and I currently belong to a gun club where I enjoy recreational shooting. I used to go bird and rabbit hunting with my father and brother both as a child and an adult. I do not belong to the National Rifle Association.] As described in one of my earlier postings (Wicked Problems), gun violence is a “wicked” problem, meaning that it is connected to other complex problems, so any proposed solution to gun violence can only help us make progress on solving this problem, not actually eliminate it. Efforts to reduce gun violence, such as those currently underway in the Congress and some state legislatures, are unlikely to result in the systemic solution that is needed.


While listening to President Obama’s State of the Union Address last February, I was struck by the long list of problems he wants the country to tackle during his second term. It is clear that the United States faces many daunting and even dangerous challenges, and it was heartening to see Obama speaking optimistically about both our opportunities and chances of success. Sadly, however, I don’t believe that we will make much progress in solving our biggest problems, no matter how hard the President, Congress, and others try. This is because our country faces a large number of “wicked problems.”

A wicked problem is a social problem that is difficult if not impossible to solve because of incomplete or contradictory knowledge, the number of different interests and perspectives involved, and the problem’s interconnectedness with other problems. Horst Rittel, one of the first to formalize a theory of wicked problems, cites ten characteristics of these complicated social issues. When you read them, you get the sense that no governing body can solve the kinds of problems mentioned in the State of the Union Address: gun violence, immigration, unemployment, war, international cooperation, and so on. Yet wicked problems are the very problems we need to solve, or at least mitigate, if we want to enjoy our future rather than merely survive it.


Once more we are witnessing public displeasure with the minimum wage. Low-wage fast-food and retail workers from eight cities who staged walkouts earlier this year are calling for a national day of strikes on August 29. The workers are demanding a wage of $15 an hour and the right to form a union.

Is a minimum wage good or bad for the U.S. economy? Free-market adherents believe that there should be no minimum wage at all. According to this view, employers should offer wages that cover their marginal costs, and they should only raise wages when it is necessary to attract or retain high-quality workers. Employers should not have to pay their workers an arbitrary minimum wage set by the government. On the other hand, those who believe in the value of a minimum wage argue that employers are obligated to pay wages that afford workers a "decent" living.