Sunday, December 13, 2009

Strictly business.

Just imagine how angry the American public would be if they knew the whole story.

For months, we have listened to the whining from Wall Street. U.S. banks are having a record year, and they want to be paid a lot of money. Billions and billions of dollars.

Public indignation is deep. After all, over the past year, we have watched as hundreds of billions of dollars of public money have been poured into bank balance sheets. We have—we are assured—taken steps that were necessary to bring our financial system back from the brink. We may not have liked it, but we had no choice.

But now that we have stemmed the tide, now that the Great Panic of 2008 has abated, we have been forced to watch these same institutions moan about how bad they have it. Citigroup—the one that received $45 billion in taxpayer funds, plus a couple hundred billion extra in public underwriting of bad assets—wants to wipe the slate clean by paying the money back and calling it even. So they can pay themselves billions of dollars in bonuses.

Wells Fargo, the arriviste among the financial elite, is complaining about the competitive disadvantages that they face as a consequence of federal compensation constraints. Constraints that prevent them from paying themselves billions of dollars in bonuses.

Goldman Sachs—caught in a lie by a federal Inspector General, who refuted Goldman’s sanctimonious claim that it would have been fine even if the world had collapsed—is trying to fend off accusations of unwarranted hubris and greed, accusations that reached a pinnacle when the firm announced plans to pay out $21 billion in bonuses, by announcing that its senior partners will take their share of the billions in stock.

But what if the public understood the whole story? How is it that the banks are now having one of their most profitable years ever? Given that there is not much lending going on, and that the newly increased credit card fees have only just begun to flow into bank coffers, where is all that money coming from?

It is coming from proprietary trading. “Prop trading” is the kind of betting with the bank balance sheet that was made illegal for commercial banks back during the Great Depression, when the FDIC and deposit insurance were created. The price of having the federal government guarantee bank deposits was separating the lending and depositary functions of commercial banking from the trading and risk activities of investment banking. Thus, in 1935, the commercial bank J.P. Morgan & Company shed its securities business, which was spun off as the new investment firm Morgan Stanley.

But this separation was undone in 1999 to facilitate the creation of the megabanks that we have today. However, while the Financial Services Modernization Act of 1999 ended the separation of activities, FDIC deposit insurance remained in place. And this year, the elite of the financial world—JP, Citi, Wells, BofA, Goldman and Morgan Stanley—have finally emerged as what they are: gigantic hedge funds backed by the full faith and credit of the United States of America. Wall Street bankers making big bets with our money, content in the knowledge that if they win their bets, they will pocket the cash. And if they lose, we will all pick up the mess.

But it really does get better. So exactly how did they make all that money this year?

Well, the trade of the moment has been the U.S. dollar carry trade. A foreign currency carry trade is simple in concept. Borrow money where interest rates are low, and invest where interest rates are high. Or simply stated: Short the U.S. dollar. Buy the currency of a country where interest rates are higher. The beauty part is that by continually assuring the world that U.S. interest rates will remain near zero for the foreseeable future, the Federal Reserve has assured traders that they can keep the trade in place for some time.
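To make the mechanics concrete, here is a minimal sketch of the arithmetic, using purely hypothetical rates and currency moves rather than actual 2009 market figures:

    # Illustrative U.S. dollar carry trade. All numbers are hypothetical,
    # chosen only to show how the pieces of the return combine.

    def carry_trade_return(usd_borrow_rate, foreign_yield, fx_move):
        """One-year return per borrowed dollar.

        usd_borrow_rate: annual cost of borrowing dollars (0.0025 = 0.25%)
        foreign_yield:   annual yield earned in the higher-rate currency
        fx_move:         change in the foreign currency vs. the dollar
                         (positive means the dollar weakened)
        """
        return foreign_yield - usd_borrow_rate + fx_move

    # Borrow dollars at 0.25%, earn 4% abroad, dollar slides 10%:
    print(round(carry_trade_return(0.0025, 0.04, 0.10), 4))   # 0.1375, a 13.75% gross return
    # If the dollar instead snaps back 10%, the same trade loses money:
    print(round(carry_trade_return(0.0025, 0.04, -0.10), 4))  # -0.0625

Leverage multiplies that spread, which is why the trade is so profitable on the way up, and so violent when it unwinds.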

So the Wall Street elite, just months removed from their near-death experience, are now making a fortune shorting the U.S. dollar. One year ago, faced with the greatest financial panic in generations, the American people swallowed hard and bailed out the banks. Today, the banks have moved on, and are tearing down the currency of the nation that saved them.

But it is nothing personal. It is strictly business.

And the carry trade will work out fine. Until it doesn’t. Then the trade will unwind quickly, and those who do not get out in time will get hurt badly.

But the banks are not worried. If the unwinding of what NYU economist Nouriel Roubini has labeled “the mother of all carry trades” takes a bank or two down with it, everything will be all right. Because the bank deposits are still insured, and we now know to an absolute certainty that if one of the elite institutions fails, we will bail it out. Again.

It is time that we come to grips with the depravity of the current situation, and the potential damage that continuing down this path may yet do to the financial system and to our economy.

Our commercial banks are not, and should not be, hedge funds. U.S. dollar carry trades and writing credit default swaps are not core commercial banking functions. They are not necessary to the efficient functioning of our financial system.

The U.S. dollar carry trade is destructive to our currency, and is creating asset bubbles across the world, as leverage is transferred from our markets into others. For their part, credit default swaps serve no useful purpose in proportion to the systemic risks they create.

It is time to go back to basics. Commercial banks provide essential services in our economy. They are the channel through which the Fed influences the distribution and pricing of capital to the productive sectors of the economy. They provide secure depositary and asset management services.

Unfortunately, pending Congressional legislation has done nothing to address the central risks that the new financial landscape presents to our economy. Rather than reinstitute restrictions on bank activities or restrain institution size, Congress is looking to regulatory solutions that hold little promise of success when the next crisis emerges. And rather than recognizing the problem of moral hazard, this week Congress took the first step toward embracing it in statute.

This year, Wall Street has shown its true colors, but the public has yet to understand the depth of the betrayal. It is not the continuing absence of lending, or the jacked-up credit card fees, or the hiked consumer interest rates, or even the constant refrain of complaints about limitations on executive compensation. No, the greatest betrayal is that with the American economy as weak as it has been in years, and with dollar weakness threatening to unravel the international commitment to the dollar’s role as the reserve currency, Wall Street has shown no shame about attacking the currency of the nation that came to its aid.

If this is the path that the elite commercial banks have chosen, if they have been fully seduced by the lucre of trading, Congress needs to revisit the fundamental rules of the game, and reexamine the central rationale for deposit insurance and the structure of the commercial banking system.

Sunday, December 06, 2009

Bankers leaving town?

So what was this image from the New York Times this week?

Healthcare lobbyists crossing the Potomac for a meeting on the Hill?

Goldman Sachs bankers heading to the Hamptons to spend their bonus checks?

No, it turns out it was a group of uninvited guests crossing the reflecting pool on their way to a reception on the White House lawn. You can see Tareq Salahi, standing fourth from the left, in his formal hoodie.

Thursday, November 19, 2009

Hell hath no fury.

Just when I had convinced myself that behind the curtain, hidden from public view, Barack Obama had a plan—for Afghanistan, for the Middle East, for Iran, for Russia, for education, for energy, for financial regulation, for health care… or at least for some of them—I saw Obama Campaign Manager David Plouffe pitching his book on The Daily Show.

Plouffe was in full campaign mode, selling the successes of the first year of the Obama presidency, as well as his new book—The Audacity to Win. Grinning and determined, he spoke with an evangelical fervor.

The Audacity to Win. Coming from the Obama campaign manager, the title is at best an ironic commentary on hope as a political strategy, and at worst a blunt mockery of the electorate that invested its hopes and dreams in the Obama campaign.

Electoral losses this month in New Jersey and Virginia provided a grim reminder to Plouffe and the Democrats of the fragility of their electoral victories of just one year ago. If 2008 was an election year when young and independent voters set their cynicism aside and embraced the hope that a different tenor might come to national politics, 2009 saw young voters abandon politics and independent voters abandon the hope that had briefly flickered a year earlier.

The voters in New Jersey and Virginia did not get it wrong. They were not impatient. They were not premature in their assessment. By all accounts, the hope Obama offered—the belief that Washington can shift to a new trajectory and engage the real and deep challenges that threaten our nation’s future—is, if not dead, on life support. The past year has been one of deep and unremitting partisan rancor. A year has been lost with nothing to show for it but growing evidence that national politics is indeed a rigged game.

The easy response—and we have heard it for months—is that the Republicans are responsible for the intransigence in Washington. After all, it takes two to tango. But at a defining moment in the healthcare debate, John Boehner threw down the gauntlet. Healthcare reform, he stated, would be Obama’s Waterloo. Defeat healthcare legislation and you defeat Obama. Game on.

But at that moment, President Obama failed to engage the overarching issue of politics and partisanship, and instead the healthcare debate devolved into little more than another Washington food fight. Obama abdicated his commitment to reframe political debate and ceded the field to Congressional leaders with no interest or inclination to keep hope alive.

Harry Reid and Nancy Pelosi had no interest in Obama’s pledge to young and independent voters to change the tenor of politics in Washington, because it is their politics. For Reid and Pelosi, the political price of setting aside the interests of the SEIU and the Trial Lawyers and others in favor of a bi-partisan deal was too high to pay. Rather than reaching across the aisle, Democratic leaders preferred to mute industry opposition to healthcare legislation by bringing the industry heavyweights—big pharma, the hospital associations and ultimately the insurance companies—inside the tent. After all, that would only cost money.

The result is legislation that makes a mockery of sensible healthcare reform. It is expensive. It preserves deeply entrenched incentives to overspend. Its financing is dishonest. And it protects those industry interests—promising expanding markets and limited cost controls—along with the interests of unions and lawyers heavily invested in the status quo, proving once again the power of lobbyists and contributors to take any major piece of legislation and bend it to their benefit.

So was it all just words? One year in, where is the evidence that the campaign that was designed to win by building on the hopes and dreams of the electorate was something more than just tactics? Where is the courage to take the long view? Where is the courage to take on your friends and occasionally accommodate your adversaries? And where is the courage to take on the contributors and lobbyists who neuter and manipulate one legislative initiative after another?

Have we seen any of that?

This year, we have watched events unmatched perhaps since the Gilded Age a century ago, as bankers have dipped their hands deeply into our pockets and those of our children to protect and enrich themselves, while our elected representatives offer nothing but whimpers, telling us that this is the way it has to be—even as they take some of that very same money for their own campaigns.

But it is not just the bankers. The pharmaceutical and insurance industries will dig deeper into the federal trough as subsidized drugs and insurance mandates are enacted. Energy companies and traders are eagerly ogling the new carbon trading bonanza that looms under the cover of cap and trade legislation. And out in the heartland, Monsanto is rewriting the rules of the farm economy under the protection of intellectual property laws that give it greater and greater control over the agricultural economy and farm incomes.

November’s results were not haphazard. Voters have not forgotten or forgiven the Republican sins and profligacy of the Bush years. But that was then and this is now, and Dick Cheney is not on the ballot. 2010 looms large, and 2012 not long after that, and for all the mocking of the Tea Parties, Democrats are skating on thin ice. There is real anger out there, and disgust and disappointment.

In all likelihood there will be no healthcare bill this year, and that may well be for the better. The Democratic strategy of relying on a narrow, partisan margin was undone in the House when anti-abortion Democrats upended the political calculus, and it may leave Democrats to deal with their own internal battles. Then, perhaps, David Plouffe and his associates will look in the mirror. And perhaps they will not like what they see. Barack Obama is not doing well, despite what Plouffe insisted to an incredulous Jon Stewart.

If Barack Obama wants to get reelected, and win votes one more time from the young and independent voters who put him over the top, perhaps it is time he stopped playing politics and governed like a president who is willing to lose. Nancy Pelosi and Harry Reid are not doing Obama any favors, and will not win a single young or independent voter to his side next time. Unless he redeems his campaign slogans about changing our politics and demonstrates the courage to do what he promised to do the first time around, Obama will lose anyway, and have nothing to show for it but words.

Sunday, November 01, 2009

What are they thinking?

With the announcements of record Wall Street bonus pools, and rising credit card fees, it is time to sit back and see where we go from here.

In the wake of the near collapse of the US financial sector one year ago, Hank Paulson and Ben Bernanke took extraordinary measures to stave off catastrophe. Throwing caution to the wind, they arranged shotgun mergers, decided who would live and who would die, and brought the word trillions into our everyday vocabulary. By the time they were done, the landscape of American banking had been transformed. Today, the six banking organizations that received $160 billion between them—JP Morgan, Bank of America, Citigroup and Wells Fargo, and the former investment banks Goldman Sachs and Morgan Stanley—are looking to a future in which they can dominate the financial services landscape.

But perhaps the term financial services is misleading in this context. After all, as bank earnings reports were rolled out for the most recent quarter, and news headlines announced the record bonus pools that the banks were preparing to pay, it became clear that these earnings derived from trading activities, rather than from traditional commercial bank lending. Before our eyes—and with the full support of the Federal Reserve and the US Treasury—we have witnessed not the conversion of major investment banks such as Goldman Sachs and Morgan Stanley into commercial banks, but rather the transformation of each of these firms into government-guaranteed hedge funds.

I readily concede that I am using the term hedge fund loosely. After all, hedge fund is a generic term for a relatively unregulated investment vehicle that is permitted to invest in a wide range of unregulated derivatives and other instruments, and whose returns are dedicated to a limited universe of investors. And certainly, the practices of JP Morgan or Goldman Sachs—firms that undertake massive proprietary trading activities, run huge derivatives books, and dedicate the preponderance of their earnings to senior employees—should not be lumped into the same category.

But on the other hand, if it walks like a duck…

Today, the commercial banking world is sharply divided. Among the more than eight thousand commercial banks and savings institutions in the country, these six firms hold less than a 50% market share. Therefore, by traditional measures of market concentration, they are far from monopolistic. But as individual firms, their size dwarfs that of their peers, even allowing that two of them, Goldman and Morgan Stanley, are not traditional depository institutions. Together, the six boast total deposits of $2.7 trillion, or an average of $444.8 billion per firm. This compares with an average of $107.2 billion for the next six largest banks, and $76.0 billion for the following six. The fiftieth largest—well within the top 1% among all banks—Associated Bank of Wisconsin, has deposits of $16.4 billion.
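A quick back-of-the-envelope check, using only the figures quoted above, shows how steep the tiering is:

    # Deposit figures as quoted above, in billions of dollars.
    big_six_avg   = 444.8   # average of the six giants ($2.7T total / 6)
    next_six_avg  = 107.2   # banks ranked 7-12
    following_avg = 76.0    # banks ranked 13-18
    fiftieth      = 16.4    # Associated Bank of Wisconsin, 50th largest

    print(round(big_six_avg / next_six_avg, 1))   # ~4.1x the next tier
    print(round(big_six_avg / following_avg, 1))  # ~5.9x the tier after that
    print(round(big_six_avg / fiftieth, 1))       # ~27.1x the 50th-largest bank

However the market-share arithmetic comes out, these are not remotely comparable institutions.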

At the same time, as JP Morgan and its brethren have increasingly concentrated on derivatives trading, loan securitization, securities underwriting and proprietary trading—and as these activities have contributed disproportionately to profitability—the share of bank assets dedicated to traditional commercial bank lending—the type that is most directly linked to the local economy in towns across the nation—has similarly decreased. Therefore, it is not a stretch to suggest that even as the Federal Reserve and Treasury have concentrated for the past year on addressing the risks to the financial system that largely emanated from the largest firms, these firms have at the same time migrated the farthest from the traditional public mission of the commercial banking industry.

It may be hard, in the face of the drumbeat of stories about the banks and their problems and their bonuses, to remember that commercial banking is an industry with a specific public mission: to take deposits and make loans. It was in the wake of the Great Depression that the Glass-Steagall Act was passed to restore confidence in the commercial banking industry. Glass-Steagall forced the separation of commercial banking (lending) and investment banking (trading, underwriting), and created the FDIC to insure the deposits of commercial banks.

Beginning around 1980, the banking industry began a steady assault on Glass-Steagall, as investment banking firms sought access to the large pools of commercial bank deposits and commercial banks sought to expand into trading activities, each seeking larger profit and bonus opportunities. These efforts culminated in the Financial Services Modernization Act of 1999, which finally ended the Glass-Steagall restrictions and allowed the complete merger of investment and commercial banking organizations. However, the 1999 Act left FDIC insurance in place. The result was a hybrid creature, able to attract government-insured deposits and to deploy those deposits across a range of lending, securities trading, and newly emerging derivatives trading activities.

Today, the financial policy brain trust of Ben Bernanke, Larry Summers and Tim Geithner has rejected calls for structural reforms to the banking system and for reinstatement of the Glass-Steagall restrictions. Despite the experience of the past several years, culminating in the financial crisis one year ago, they are suggesting that the concentration of power represented by these six firms is acceptable and desirable, and that reform efforts should focus instead on the creation of a single systemic risk regulator to oversee those institutions deemed too big to fail.

Standing alone against Bernanke, Summers and Geithner within the Obama administration is former Fed chairman Paul Volcker. Volcker continues to call for the reinstatement of the Glass-Steagall restrictions, and recognizes the imperative of maintaining the link between deposit insurance and commercial bank lending.

It is hard to imagine what Bernanke, Summers and Geithner are thinking, and how they can look at the devastating experience of the past two years, and not conclude that something is fundamentally wrong. Financial modernization did little to help those thousands of commercial banks who have stuck to their knitting, and who now have been sorely disadvantaged by the federal bailouts of their large competitors. Proponents of financial deregulation argue that Volcker and other advocates for turning back the clock are recalcitrant Luddites, yet they have been hard pressed to demonstrate how the creation of the new class of hybrid commercial-investment banks and unregulated derivatives trading have added value to the economy.

Can Bernanke, Summers and Geithner seriously believe that a systemic risk regulator can control the risks that are embodied in these massive firms? Recent history suggests that the risks entailed in the trading strategies, quantitative models, complex derivatives and contractual exposures were never fully understood even by the risk managers, bank CEOs and directors of the firms themselves. Bank regulators were captive of the banks as they struggled to understand the information that was provided to them. Capital requirements and other traditional tools for containing risk proved to be of only marginal value in the face of derivatives with nearly unlimited leverage, and collateralization requirements buried deep in unregistered and unregulated contracts.

Furthermore, political influence over regulators is a fact of life in Washington, and over time will undermine whatever independent structure these three wise men might have in mind. One need only point to Summers’ own success in 1998 in silencing Brooksley Born—the head of the independent Commodity Futures Trading Commission—when she argued the inconvenient truth of the growing systemic risks presented by the unregulated derivatives market, at a time when Summers and the Clinton administration were arguing the merits of financial deregulation.

It is mind-boggling that we can continue down this road. Paul Volcker must be applauded and supported for his unflagging efforts to bring attention to this issue. He is a wise man standing against smart men. And he is right.

Saturday, October 03, 2009

Heads I win. Tails you lose.

Thirty years ago, Salomon Brothers and Goldman Sachs were two of the “bulge bracket” underwriting firms that dominated Wall Street. Both were partnerships, with trading cultures that characterized their organizations. It was a time when Wall Street firms were looking far and wide for ways to increase their access to capital. Trading firms make money by making bets. More capital meant bigger bets. Bigger bets meant more money.

In 1981, in pursuit of a bigger balance sheet, Salomon CEO John Gutfreund negotiated the sale of his firm to Philipp Brothers, then the largest commodity trading firm in the world. The sale was not without controversy. Within Salomon, bond traders—led by Salomon family member William Salomon—opposed the sale. How, they asked, would traders be paid what was their due in the event the new firm lost money in other far-flung commodity businesses? As partners, they had reason to be concerned by over-expansion into business lines that they neither understood nor controlled. They did not yet appreciate the benefits of trading with Other People’s Money.

But the sale of Salomon went through—John Gutfreund pocketed his $30 million bonus—and in the years that followed, the new firm, Phibro-Salomon, was acquired by Travelers Insurance. Travelers, in turn, merged with Citicorp to create Citigroup, the financial supermarket that was supposed to give American banking a global dominance to match its well-capitalized Asian and European counterparts.

The Salomon story was part of the evolution of Wall Street over the past thirty years, as the storied Wall Street firms succumbed to the lure of capital, giving up their partnership status to merge into commercial banks or become publicly traded corporations. And while the Wall Street investment banks did achieve their goal of increased access to capital—and ultimately won back access to the massive pools of depositor money that they lost with the passage of the Glass-Steagall Act in 1933—the cost to the rest of us has been significant.

Where, after all, was William Salomon when Lehman Brothers decided to bet the ranch on the collateralized mortgage securities that would ultimately bankrupt the firm? Where was William Salomon when Bear Stearns increased its leverage to thirty times, based on financial models that few in the firm really understood? And where was William Salomon when Joseph Cassano, the head of AIG Financial Products, took the insurance giant headlong into the credit default swap business?

There was a moment when Cassano made his case to the AIG Board of Directors. The credit default swap contracts that AIGFP was providing to financial giants such as Goldman Sachs had no risk to AIGFP, argued Cassano, and therefore all of the annual receipts paid to AIGFP under those credit default swap contracts could be taken as current income—and used to pay very large bonuses—rather than held as reserves against future risk. CDS contracts are essentially insurance contracts provided to guarantee against defaults on corporate bonds, but Cassano argued that the bonds were so strong that there was no credit risk, and therefore the money paid to AIGFP was essentially free money.
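The flaw in that argument is easy to see once you reserve against even a small probability of default. Here is a stylized sketch, with hypothetical contract terms rather than AIG’s actual book:

    # Stylized CDS economics: premium received vs. the reserve a prudent
    # insurer would hold against expected losses. Numbers are hypothetical.

    def annual_carry(notional, premium_bps, default_prob, loss_given_default):
        premium = notional * premium_bps / 10_000            # annual receipts
        expected_loss = notional * default_prob * loss_given_default
        return premium - expected_loss

    notional = 10_000_000_000   # $10B of protection written

    # Cassano's premise: zero chance of default, so premiums are pure income.
    print(annual_carry(notional, 25, 0.00, 0.6))   #  25,000,000.0
    # Allow even a 1% annual default probability with 60% loss severity,
    # and the same book is deeply underwater:
    print(annual_carry(notional, 25, 0.01, 0.6))   # -35,000,000.0

If a counterparty is willing to pay tens of millions a year for the protection, the implied default risk is anything but zero, which is precisely the question the imagined William Salomon poses below.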

But there was no William Salomon on the AIG Board of Directors. Unlike the old Wall Street partnerships, directors of corporations are largely insulated from the financial consequences of their decisions. Had AIG been a partnership like the old Salomon Brothers, a William Salomon would likely have asked the logical question of Joseph Cassano:

Goldman Sachs is paying us tens of millions of dollars a year, but you are telling us there is zero risk. One of us is wrong. This is a game of poker, and there is an idiot at the table. And you are telling me that Goldman Sachs is the idiot? I don’t think so. I think we are the idiots at this table. If Goldman Sachs is paying us tens of millions of dollars a year, we are taking risk, and we sure better know what that risk is, because we are betting our future on it.

But, of course, AIG was not a partnership, and the rest is history.

But the Phibro-Salomon story had one chapter left. This summer, Citibank—the failed financial supermarket that is now a ward of the State—sought approval from the US Treasury to pay bonuses in order to keep a group of highly profitable traders from leaving the bank. The bonuses—the most famous being the $100 million for Andrew Hall—were to be for traders in its Phibro commodity trading subsidiary.

William Salomon saw the writing on the wall. The partnership trading culture that was critical to Salomon Brothers’ success—a culture that combined incentives and accountability—would not survive the evolution into a corporate model. What we have learned is that the incentives to make big bets and take big risks have survived, but without the accountability. Andrew Hall made $2 billion for Citigroup placing energy bets, and was due to be paid $100 million. But what of those whose bets lost Citigroup $2 billion? They have not even lost their jobs.

The trading firms gained the access to capital that they sought in the 1980s, and they found the joy of playing with Other People’s Money. And for twenty years, the game has gone on.

Heads I win, tails you lose. Or in David Einhorn's more elegant formulation, Private Profits, Socialized Risk.

Today, the US Treasury and the Fed are trying to hold the pieces together. AIG. Citi. Bank of America. GMAC. Fannie Mae. CIT Financial. But why? Where is the evidence that large financial corporations are more efficient at allocating capital than smaller banks? Surely, they have not been sound custodians of depositor funds or of the public trust. Neither have they proven they can deliver more predictable returns on shareholder equity than smaller, more nimble financial institutions, who themselves are increasingly disadvantaged by each bailout. Whose interest has conglomeration served but that of insiders seeking greater compensation with less risk?

One central question in all of this is whether the corporate model itself lies at the heart of the problem. Today, absent prosecution for fraud, the CEOs and directors of all of these failed firms will walk away with much of their wealth intact, insulated from the consequences of the decisions they made. For years now, they have been playing with our money.

New regulatory regimes will not be adequate to control this systemic risk. Controlling banker compensation might have a populist appeal, but no one should imagine it constitutes systemic reform. Regulatory bureaucracies cannot control systemic risk in massive financial corporations, because the systemic risk is the massive financial corporation.

Thirty years ago, William Salomon was suggesting a simple truth: Sound decision-making, incentives and accountability require that those who are making decisions and placing bets have their own capital at risk.

Sunday, September 27, 2009

They're back.

It is not clear how foreign policy strategy is being set in the Obama administration. But the execution has the appearance of a well-considered and orchestrated dance. And when the music stopped this week, standing together on the stage, united in common purpose, were the Big Four of wars gone by—the U.S., Great Britain, France and Russia.

The surprise this week was not the disclosure of a second, secret Iranian uranium enrichment site. Nor was it the ensuing condemnation and threats of collective action. What was surprising were the distinct voices that were heard. It was French President Sarkozy and British Prime Minister Brown whose declarations were strongest, with Russian President Medvedev joining shortly thereafter. Finally, an American President was able to speak a bit more softly—and by the demonstration of common purpose suggest a bit more stick on behalf of the international community.

For the first time in a while, Iranian President Ahmadinejad seemed caught off guard. His normal swagger was muted, perhaps with the realization that his days of manipulating Russia against the West have ended. Perhaps more with the cold fear that it was he who was manipulated by Russia, and that his miscalculations may weaken him considerably in his battles to retain power at home.

Perhaps American foreign policy is coalescing around some basic realities of the world. There are real threats out there, and we do not have the capacity to fight them alone. The unilateralism of the past decade was defined less by our determination to go it alone into war than by the belief that we could fight all battles and recast all nations in our own image. Almost without exception—perhaps China, as our lead banker, was the exception—we demanded fealty to our image of democratic progress from all of our antagonists.

But when you are fighting on all fronts, your ability to build enduring coalitions on any one of them is diminished. Russian Foreign Minister Sergey Lavrov has long articulated this view. Yes, as he has suggested for the better part of a decade, Russia and the United States have more issues that unite them than divide them. And yes, when presented with the top five issues of concern facing the U.S. in the international arena—perhaps including among them Afghanistan, Iran, Iraq, Islamic fundamentalism, drug trafficking, nuclear proliferation—Russia was a potentially valuable ally in all of them.

But the problem was that Russia had their own top five list, and Lavrov has long complained that if there was to be a partnership, it could not be one-sided. Russia’s concerns had to matter as well. Yes, Russia was prepared to be an ally in the Global War on Terror, but the Russian list had to be on the table. And they had a different list. Chechnya. Georgia. NATO. Missile defense. Encirclement. Status.

Russia’s list was fundamental to the continued integrity of the Russian nation. Russians may be paranoid, but the simple fact is that people are out to get them. U.S. official policy has been and continues to be one of encirclement, while many prominent voices go well beyond that—most notably Carter-era National Security Advisor Zbigniew Brzezinski—and argue that U.S. policy should be the dismemberment of the Russian state.

The dismemberment of the Russian state is not so far fetched. Before the fall of the Soviet empire, the Soviet Union claimed a population of nearly 300 million people. Today, Russia is a nation of just over 140 million, and it is shrinking rapidly. With low birth rates, high infant mortality, short life expectancy, and minimal immigration, by mid-century Russia’s population is projected to decline by more than 20%, to approximately 110 million.

The prospect of Chechen independence—and the demands for independence that would likely ensue from other minority groups should Chechnya succeed—further threatened the future of Russia. This fear explained in large measure Russia’s vociferous objection to the Western-backed declaration of independence for Kosovo, and Russia’s steadfast claim that the international community can grant nationhood only through the legal powers granted to the United Nations.

Russia’s intransigence in dealings with the United States is rooted in its defense of national self-interest. For several years, Putin and Medvedev have used their actions in international affairs—from supporting Iran to instigating the Ukrainian natural gas crisis—to force the United States to deal with them and their issues.

U.S. actions over the past nine months indicate that U.S. policy has evolved, and that we may finally be paying attention. The nuance is the distinction between what we say and what we do. The Bush administration talked about partnership and an alignment of interests, but took every opportunity to dismiss Russian concerns on the ground.

Now, the process seems to have been inverted. Vice President Biden—an early and vociferous backer of the Kosovo action that was so objectionable to Russia—has emerged as the voice of American support for the process of democratization and continued support for Ukraine and Georgia.

But Putin and Medvedev are realists, less moved by words than action. At the same time as Biden was talking the talk, the administration was walking a different path. During the early months of the administration, Russia threatened U.S. resupply routes into Afghanistan, and U.S. access to a key air base in Kyrgyzstan. One can imagine that at that moment the administration looked down the road at the real threats that loomed, took a hard look at the facts on the ground, weighed the real impact on the ability of the U.S. to pursue its strategic goals, and determined that Russia was—as Lavrov long suggested—better to have as an ally than to face as an obstacle and an adversary.

It really was never a question. After all, for all the rhetoric—whether from Biden, Bush or Cheney—about U.S. support for Georgia or a common defense of Ukraine, neither we nor our European allies have had, or likely would ever have, the willingness to go to war with Russia in their Near Abroad. Our actions may have been designed to tweak them and continue the great game wherever possible—but never with the intention of real escalation.

One question this week has been how long ago the U.S. learned of Iran’s second enrichment facility. Was it many months ago, and were the strategic moves to bring ourselves closer to an effective alliance with Russia—such as shifting our policy on strategic missile defense in Poland—in preparation for this next phase of the confrontation with the Iranian regime? Or was it simply fortuitous that the steps had been taken, and the groundwork had been laid, that would allow Russia and the U.S. to stand together against a common threat?

Perhaps it doesn’t matter. But it does matter that our foreign policy may be built less on rhetoric, and more on our capacity to build effective alliances against real, and common, threats.

Sunday, September 13, 2009

The grand illusion.

Most of us have lived through, or will live through, the painful years of watching our parents’ health decline. Behind the ugly partisan rancor of the town halls and the healthcare debates is the simple truth of that common experience.

Whether our parents have cancer or Alzheimer’s or dementia, or are simply dying of old age, we watch as their bodies become frail and their minds fade. These are our parents, once our providers and protectors, sapped of the energy and vitality that we for so long took for granted.

The medical bills. The residential communities. The in-home care. The drugs. They drain our parents’ savings and ultimately strain our family resources. We may have thought that Medicare would suffice, until one day a bill arrives from a rehab facility or a hospital, or a new drug is prescribed. From that day, the emotional pain of end of life care is compounded by the financial strains that bleed outward, undermining sibling comity, and threatening the resources set aside for kids' education, for family vacations, or for retirement.

There is no easy solution to this. We are all living longer, and the advances of technology and science now offer us the ability to fend off diseases that years ago barely existed—largely because we used to die younger and never contracted them. As a close friend put it—a Jesuit priest with a way with words—the longer any machine works, the more the maintenance costs go up.

People like to compare Medicare with Social Security, but the challenges facing Social Security are manageable by comparison. When Franklin Roosevelt created Social Security in 1935, it was a stroke of political—if not financial—genius. Social Security offered retirement security at age 65 to American workers whose average life expectancy at the time was 59. It therefore offered an entitlement to people who—on average—would be dead before they were eligible.

But even with longer lifespans, Social Security is a controllable and predictable program. The mortality curve shifts slowly and we can—at the end of the day—choose to change the parameters of the program that affect cost: the retirement age, the cost of living adjustments, the basis of pay, and the basis of taxation.

Healthcare has no such certainties. Unlike information technology, which offers greater and greater power at less and less cost, investments in healthcare technology and pharmacology that increase longevity and cure rare diseases may be moral victories for humanity but only exacerbate the financial strain on society and families. This is the dilemma of healthcare: The better we get at it, the faster the costs will escalate.

Dr. Andrew Weil, and many others, have pointed out that the solution to our healthcare crisis lies in how we choose to live our lives, and ultimately how we choose to die. In a similar vein, Atul Gawande suggests that the solutions to the cost and quality of healthcare lie in large part in the choices and conduct of the providers themselves. If physicians turn the practice of healthcare into an exercise in profit maximization, they will do better as individuals, but their patients and the system itself will suffer.

But as in most areas of life, good choices and ethical practices cannot be compelled or overseen by government. Regulatory regimes can enforce measurable standards—such as the concentration of melamine in dog food, or rat hairs in cola. But the federal government has no capacity to regulate the conduct of individual medical practitioners, or the quality of collaboration among them.

The person who cried out at one town hall meeting to not let the government get its hands on Medicare has been duly castigated for the irony and ignorance of the remark. But at a deeper level, the remark encapsulates the problem we face.

Medicare is a government program. While many proponents of single payer healthcare point to Medicare as a model, it is a program that pays providers far less than the cost of services, and therefore results in substantial cost-shifting that exacerbates the medical insurance costs paid across the rest of society.

But Medicare is the lifeline of the elderly, and of the families of the elderly. That person may want to believe that Medicare exists above and apart from government, but of course it does not. Each Medicare patient—and each patient's family—relies on Other People's Money for care, yet feels entitled to have it with few strings attached. The truth is that it is one more tax-funded program. Just like the stimulus money. Just like the wars. Just like everything else.

Medicare is our cushion. It insulates us from painful decisions that otherwise would be ours. But it is an illusion.

The fear and rage evinced by the person at the town hall presages the pain to come as that illusion is laid bare. If Medicare is Other People’s Money, then those other people are surely entitled to set the rules. But even worse is the realization of what would happen if it were not there. Without Medicare, we would have to be paying those costs for our parents’ end of life care ourselves.

Based on Dartmouth research data, per patient Medicare costs for the last two years of life range from approximately $50,000 to $100,000 across the country. And this is just the part paid by Medicare, which, as we all learn, is only part of the puzzle. These costs stand in stark contrast to the Federal Reserve data on household finances, which indicate that median family net worth fell from $120,000 in 2007 to $99,000 as of October of last year. It is not a stretch, therefore, to suggest that with other people's money we are spending far more than we could if it were our own.
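Set the two figures side by side and the scale of the gap is clear. A back-of-the-envelope comparison, using only the numbers cited above:

    # Figures as cited above, in dollars.
    medicare_last_two_years = (50_000, 100_000)  # per-patient range, last 2 years of life
    median_family_net_worth = 99_000             # Federal Reserve estimate

    # If families paid these costs themselves, end-of-life care for one
    # parent would consume this share of the median family's net worth:
    for cost in medicare_last_two_years:
        print(round(cost / median_family_net_worth, 2))  # 0.51, then 1.01

On the median family balance sheet, the last two years of one parent's care would consume somewhere between half of everything and more than everything.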

The person who cried out at the town meeting may have been voicing a fear we all hold deep inside. What if it is all an illusion? What if Medicare is not an impenetrable wall that protects us from those decisions that are most painful?

For many years we have accepted the illusion and comfort that Medicare offers. Spending Other People's Money has changed the decisions that we make, and our assumptions and expectations about the care our loved ones receive. We now demand assurance that they will receive all of the care a physician might recommend—with little consideration of the cost to the system of which we are a part.

But federal government resources are no more than the pooling of our collective family resources. In the end, we will return to the questions that for many years we have been able to avoid. How would we choose what steps to take—and what procedures to forego—if it was our limited family resources that would be drained away by each of our decisions? And what choices would our parents make if they understood the magnitude of the impact of each decision on their children and grandchildren?

The question of how we are going to spend scarce resources is with us. We confront it around the kitchen table, and it is time that we accept that it is the central question of the healthcare challenge. It is the question that will consume our politics in the years ahead, because the years of free money and free choices have come to an end.

Thursday, September 10, 2009

The specter.

A specter is haunting America—the specter of debt.


In the dying years of the Cold War, the American polity lost its way. Public policy, as encapsulated in the Federal budget, was always about making hard choices among competing priorities and constituencies. The notion that resources were limited was a critical discipline, and the ability to navigate the process of allocating resources was the stuff of which Congressional leaders were made. Slinging arrows is easy. Building a budget in a democracy is hard stuff.

Traditionally, Democrats were the party that believed in spending more—and taxing more—while Republicans were once the grownups of the American political system, sternly cautioning against the political urges toward deficit spending and international adventurism.

But the world changed over the last quarter century. Faced with the realities of survival in a competitive world economy—and the exigencies of political fundraising—Democrats brought corporate America inside their tent and muted their hostility to the private sector. For their part, ever since George H.W. Bush uttered the words Voodoo Economics in his failed effort to derail the Reagan Revolution, the unholy alliance of tax-cutting Republicans and big-spending Republicans has sounded the death knell of that party’s claim to the moral high ground in matters of fiscal propriety, while Neo-conservatives brought to the GOP an evangelical fervor to change the world that was once a Democratic credo.

The numbers are stark. Over the past twenty-five years, Democrats and Republicans alike forswore their allegiance to the central responsibility of elected legislators: to make choices, balance priorities and pass budgets with integrity. Perhaps they were not entirely to blame; after all, East Asian countries led by China continued to fund our deficits by buying our bonds, offering cheap money as an alternative to the more painful options of cutting spending or raising revenues. These foreign purchases of our debt were not an act of faith in the almighty dollar as much as a simple expedient of the export-driven model of economic development that has become the norm across the world.

Over the past quarter century, China, the Asian Tigers of South Korea, Hong Kong, Taiwan and Singapore, and more recent converts such as Vietnam, pursued a successful economic development strategy built on selling manufactured goods into the U.S. consumer market. As these countries took in massive amounts of dollars, they faced two options: They could recycle those dollars back into the U.S., or watch the value of the dollar decline and their own currencies rise. There really was no choice, as the export-driven development model that was lifting the Asian nations out of poverty required that their currencies not rise in value relative to the dollar, so that their low-cost goods remained attractive in the U.S. market. Accordingly, U.S. Treasury securities became the preferred investment for Asian trade-surplus dollars, and our financial markets became flush with liquidity that kept long-term interest rates low.


What was lost in the orgy of low cost debt that ultimately engendered the securitization boom in credit card and home equity lending—and enabled growing deficit spending at the federal level—was that the American economy, like the American household, was living on a chimera of growth that belied the underlying damage that was being done to our economy.


Over the past twenty-five years, our economic growth has increasingly been driven by imported capital. In the same way that the average American household saw no real income growth during the past decade, but increased its spending through borrowing, so too the national GDP was flat, but for the growth realized through externally borrowed dollars.

Today, we are faced with stark choices. But if the healthcare debate is any measure, it is evident that our political establishment has lost much of its capacity for honest debate and real decision-making. Twenty-five years of free money and no discipline has made a mockery of the federal budget process, as we now are accustomed to avoiding choices and accepting the false notion that there are obligations that are non-negotiable.


For two decades now, we have become accustomed to justifying any manner of spending, from education to tax cuts, as an investment in our future. This rationale is a direct outgrowth of the availability of low-cost capital that has itself undermined the ability to weigh and make choices. This has undermined as well the notion of a national consensus on foreign policy, as we now go to war with little regard for the financial cost. With no fiscal consequences and no universal service, war has become a sideshow of American political life.


Today, the generation-old paradigm may well be shifting. It is with no small amount of irony that even as our Republican and Democratic representatives have lost all but a rhetorical commitment to the traditions of responsible budget policy, it is the Chinese Communist Party—the largest holder of our debt and the most at risk from the consequences of a devalued dollar—that is becoming insistent that we pay attention to our cascading fiscal mess.


Surely, as the source of the free capital to which we have become addicted, the Chinese have little more standing to scold us than the crack dealer who declares to the destitute customer that it is time to stop. The true dividend to the Chinese is not the return on their investments, but rather the economic growth that has lifted the livelihood of hundreds of millions of Chinese out of poverty over the past two decades—paid for graciously by the American factory workers whose livelihoods were lost.


But ironies aside, we have to listen. A new economic model could be beneficial to us—over the longer-term. Increased domestic savings and a declining dollar in the short-term could make overseas manufacturing less competitive, and allow America to begin building things again. But the near-term pain will continue for some time, as the process of paying down the debts that we have accumulated will take years. And we have become a very impatient nation.


The problem, however, is not our ability to listen. The problem is that after twenty-five years, the very skills required to build a federal budget that faces up to real facts, weighs priorities and makes real choices, may be gone from our political DNA. The Death Panel debate, while fraudulent on its face, offered the first inkling of the challenges to come when capital becomes scarce once again, and we are confronted with competing priorities for limited budget dollars.


The truth is that there is a Death Panel, charged to sit and decide how limited resources should be allocated. To weigh the needs of the elderly against the needs of the young, the costs of healthcare against the costs of war. But it is not the faceless panel of bureaucrats of Sarah Palin's imagination. It is Congress. It is time they get used to it.

Wednesday, July 15, 2009

The greening of Goldman Sachs.

The US economic turnaround may not be complete. The AIG turnaround may not be complete. The GM turnaround may not be complete. But Goldman Sachs is back.

“A Swift Return to Lofty Profits,” proclaimed the New York Times, as Goldman Sachs reported that it earned $3.44 billion in the second quarter, and is preparing its largest bonus payout in history. And without doubt, those lofty bonuses are well earned.

Consider how effectively Goldman has navigated the roiling waters of the global financial crisis. First, Goldman received a $10 billion injection of TARP funds to help it weather the market turmoil. Next, it swiftly converted itself into a commercial bank and member of the Federal Reserve system, gaining access to low or zero cost capital at the Fed Discount window and to federally guaranteed borrowing through the FDIC Temporary Liquidity Guarantee Program. Finally, it garnered a $13 billion payout at one hundred cents on the dollar on its outstanding credit default swap contracts with AIG.

Now, we are told, Goldman’s profitability stems from its trading prowess in global markets. Really? A $3.44 billion profit in the second quarter could be accounted for simply by a 25% run-up in the value of the CDS portfolio from its value when AIG stood as a bankrupt counterparty: 25% of the $13 billion payout is roughly $3.3 billion.

No, Goldman may have trading prowess, but that pales against its political prowess.

Thirty years ago, most of the major Wall Street investment banks were partnerships, and those with the greatest prestige and market power—Salomon Brothers, Goldman Sachs, Lehman Brothers and Morgan Stanley—eschewed retail brokerage in favor of institutional relationships and proprietary trading. Only Merrill Lynch prided itself on retail brokerage and being a member of the New York Stock Exchange.

Then, the world changed, as investment banking firms looked far and wide for new ways to strengthen their balance sheets and access new pools of capital. One by one, the old-line partnerships fell by the wayside, casting aside their culture and independence for the lure of other people’s money. Salomon merged first with Phibro, and then was subsumed into the emerging Citibank colossus. Lehman was acquired by American Express. Morgan Stanley suffered the ignominy of merging into the Sears Roebuck/Dean Witter/Discover financial services company.

Only Goldman Sachs retained its culture and identity, even though it too tossed aside its partnership heritage in exchange for the lucre and capital offered through a public stock offering.

As one watches the evolution of Goldman, it is hard not to become a conspiracy theorist. After all, Goldman’s rise from merely the top of the heap into the stratosphere has come after years of growing influence in Washington, as one Goldman partner after another was appointed to senior positions in the Cabinet or White House—John Whitehead, Robert Rubin, Josh Bolten, Hank Paulson, to name a few—and tens of millions of dollars of political contributions found their way from Goldman Sachs into the campaign war chests of members of Congress, of Senators and Presidents, Democrats and Republicans alike.

Perhaps the public interest and the private interest just happened to coincide with the passage of the Financial Services Modernization Act in 1999 and the Commodity Futures Modernization Act of 2000. Perhaps the conversion of Goldman Sachs—a non-depositary institution—into a commercial bank, with access to Fed Funds and the Discount window, and eligible for FDIC guarantees on its debt offerings was in the public interest. And perhaps the public interest was somehow served when Goldman and others jumped to the front of the line of AIG creditors and were made whole on their credit default swap contracts with a bankrupt counterparty.

Perhaps. But we must conclude—because we believe in truth, justice and the American way—that Robert Rubin, Josh Bolten and Hank Paulson influenced and guided public policy in ways that were truly in the public interest, and that there was no nefarious connection between all of those campaign dollars and the direction of our national policy in any manner that unduly benefitted Goldman Sachs over the years.

Perhaps. But this year, appearances matter. And this is the year that has seen $10 billion of TARP money and $13 billion of AIG money and who knows what amount of additional Federal Reserve funds or federal guarantee benefits flow into the coffers of Goldman Sachs.

So perhaps, this year, Goldman Sachs employees should be content with the tripling in value of their stock—surely a direct result of all of the financial largesse that has flowed Goldman’s way—and perhaps this is a year when $3.44 billion of Goldman Sachs profits should not turn into bonuses without due consideration for how all of that was possible, and where that money came from.

From the rest of us.

Saturday, July 11, 2009

After the fall.

We have yet to see what the Iranian regime will be prepared to do in the face of real opposition. After all, the leaders of the opposition questioning the election results—Mir Hussein Mousavi, Mehdi Karroubi and Hashemi Rafsanjani, and others who have emerged as fellow travelers, including Ali Larijani and Mohammad Khatami—are each deeply rooted in the Islamic revolution, and each has served as speaker of the parliament, prime minister, or president of the Islamic republic.

More to the point, each rose to the top of Iran’s tightly controlled political apparatus, gaining personal power through a political system that excludes ex ante any candidate deemed to be a threat to the ruling regime. Therefore, one can fairly wonder why the Supreme Leader Ali Khamenei jumped the gun in declaring a winner, since the system was rigged before the vote. But apparently that was not enough.

Here in the realm of the Great Satan, we tend to view things through our own eyes. So before Michael Jackson, Mark Sanford and Sarah Palin drove Iran from our TV screens, we were fixated on the images of street protests in the wake of the Iranian election. For us, in Iran—as in Florida—the question was, “Who really won the vote?”

But as the images of Tehran have faded, debates over who won have given way to a clear understanding that the integrity of the election in Iran is not the measure of democracy there. At the same time, the Iranian regime is coming to realize that the integrity of the election, or lack thereof—whether perceived or real—may be its undoing.

From the moment the polls closed, when Supreme Leader Ayatollah Ali Khamenei declared the victory of President Ahmadinejad a “divine assessment,” Khamenei undercut his own credibility as a dispassionate ruler committed to the integrity of the electoral process. While Iranians may have come to accept limitations on what candidates are allowed on the ballot, fundamental Shia principles of fairness and justice demand that the integrity of the process be respected.

Instead of showing patience and respecting the process, Khamenei squandered that credibility. But more important, he opened the door for the narrative that soon emerged: Those who questioned the results were guilty of apostasy. And in Islam, apostasy is a mortal sin, and such accusations have justified the most extreme incidents of Islamist violence.

Today, even though the demonstrations in the streets have disappeared from cable news, the debate in Iran has been elevated from vote counting and ballots to treason and apostasy. It doesn’t get much clearer than that.

The issue is no longer about the election results. The issue now is about the core principle of the Islamic Revolution—velayat-e faqih—the doctrine that Islamic law requires that power over civil society lie with the clerical order of Islamic jurists.

This debate is deeply rooted in the Islamic Revolution of 1979. At the time of the Revolution, Ayatollah Khomeini was the most vocal proponent of velayat-e faqih among the senior Shia clerics, while he was opposed by his peer and rival Ayatollah Abul-Qassim Khoi, who disagreed with that interpretation of Islamic law and dissented from the urge to assert clerical dominion over civil society. While Khomeini won the day and dominated the revolution against the Shah of Iran, velayat-e faqih has never been accepted across the senior Shia clerical order as settled law.

The debate over velayat-e faqih has reemerged as the central issue in Iran. Today, even as the Revolutionary Guard—the Praetorian Guard founded by Ayatollah Khomeini in 1979 to defend the clerical regime—is asserting its control over the streets of Tehran, Supreme Leader Ayatollah Ali Khamenei’s impatience in handling the election may ultimately cost the regime its legitimacy.

A central figure in the debate over velayat-e faqih will be the leading protégé of Ayatollah Khoi, Ayatollah Ali Sistani, the Iranian cleric who is demonstrating the principles of his mentor in his patient oversight of civil society and the emerging democracy in Iraq. For Iranians in the streets, as well as clerics in the holy city of Qom, Sistani is among the most revered religious figures, and a cleric of greater authority and stature than Ali Khamenei himself.

The irony is that none of the leading actors in the Iranian drama—Mousavi, Karroubi, Rafsanjani, Larijani or Khatami—has identified himself with Sistani or with opposition to the existing order of clerical dominion over civil society. They are each products of the existing system. And yet the principle of velayat-e faqih is what is at stake and will emerge as the issue at hand.

The prospect of change—counterrevolution by any reasonable definition—in Iran poses real dangers, as any evolution to a more open democratic process and easing of clerical dominance will yet face many hurdles, and may take many years. As Ali Khamenei loses stature due to his mishandling of the post-election period, the winner over the near term may well be President Mahmoud Ahmadinejad, whose ties to the Revolutionary Guard may allow him to assert greater power in Tehran, even as religious and legal arguments are debated in Qom. How those debates play out in Qom may determine the long-term direction of the Iranian revolution, but how control over the Revolutionary Guard and the military evolves will likely determine whether the opposing camps in the post-election era reach a near-term accommodation, or whether Iran devolves instead toward a traditional dictatorship.

Sunday, March 01, 2009

All in.

Even as the Obama administration may be consumed by efforts to stem the depth and duration of the recession, we appear to be at a tipping point in foreign affairs that can lead to positive new directions or a new downward spiral in regional conflicts.

The opportunities at hand are complex and interconnected. And unlike the high-stakes, three-handed game among Russia, China and the United States of the Nixon era, which subsumed all of the smaller countries into bit roles, the diplomatic world today involves a wide range of actors, each of whom has real interests, has signaled a readiness to play, and can affect the potential outcomes for the others.

The historical background is important in several regards. First, the global economic collapse has illustrated the interdependence of national economies, while at the same time demonstrating the risks to individual states that flow from that interdependence. Accordingly, many national leaders find themselves at a point where they have to choose—both as a matter of policy and politics—whether they are in or out, whether they accept the rules of globalization, free trade, and interdependence, or whether they will opt for a return to economic protectionism and political self-preservation.

Second, the election of Barack Obama signaled the end of the Neoconservative era in US policy, and portends a renaissance of realism in foreign affairs and diplomacy based on national self-interest. As much as war might be the proven solution to depressions past, Americans have grown weary and cynical over calls to arms and regime change over every looming international confrontation, and the rest of the world seems ready to embrace new directions as well.

Russia, for one, has been pushing for an alignment of interests since Vladimir Putin first called George Bush to pledge Russia’s support after the 9/11 attacks. Putin sought—but ultimately failed—to build a new strategic relationship with the US around a number of specific areas of common interest—stemming the Jihadist threat emerging in Chechnya and Muslim former Soviet republics, defeating the Taliban, controlling Iran’s nuclear ambitions, and controlling drug trafficking—for which Russia could leverage the reinstatement of the Bush ‘41 and Clinton-era US commitments to curtail NATO expansion toward Russia.

Now, after several years of declining relations, and with the ruble in free-fall, Putin is signaling a desire to try again. After years of trying to swing a big stick to get our attention—cutting off natural gas supplies to Ukraine and Europe, sending arms to Iran and Venezuela, and sending tanks into Georgia—Russia is trying a bit of carrot—opening its territory for the US to resupply its troops in Afghanistan and delaying the deployment of missiles in Kaliningrad.

Iran, meanwhile, is looking to get into the carrot-and-stick game. Like Russia, Iran chafes at being disrespected and has sought out strategies that might force the US to bargain on equal terms. Certainly, as President Obama announced his plans for withdrawing from Iraq, it was not lost on many that Iran—almost single-handedly—can determine whether those plans succeed.

Iran has much to offer—enabling an exit from Iraq, moderating the role and conduct of Hezbollah, and, of course, addressing the nuclear issue—and has much to gain—recognition of its role as a regional power, reduction of the threat of American troops on both its eastern and western borders, avoidance of being frozen out in a US-Russian rapprochement, and an end to American threats of regime change.

Like Russia’s, Iran’s economy is in shambles, and the June presidential election looms as a critical moment. The entrance of former president Mohammad Khatami into the presidential race in early February may signal that Iran’s Supreme Leader, Ayatollah Khamenei, is willing to move toward a moderation of Iran’s hard-line direction and rhetoric, embodied in current president Mahmoud Ahmadinejad, and to substantively address the concerns of the international community.

Syria, another long-time target of US regime change, also needs to demonstrate its bona fides at this moment of political change. Just as Russia reached out to Syria over the past several years to demonstrate its continuing ability to stir the always-simmering Middle East pot, Syria can on its own significantly influence the next trajectory in the politics of the Middle East.

Like Iran, Syria controls one long border with Iraq, and can influence the outcome of President Obama’s exit strategy. Similarly, as the home of the Hamas political leadership, and as the long-time suzerain of Lebanon, the Syrian intelligence apparatus can directly control the direction and temperature of the Palestinian-Israeli conflict. Like others, Syria has its interests—in territory and regime survival—for which it will play its cards.

Achieving our foreign policy goals requires that each of these key nations change its approach to us and to others. We have tried threats of regime change and war, and we are broke and tired. Ironically, however, this has led to a moment of opportunity where each country may be motivated to move in a new and positive direction.

For Russia, Iran and Syria, this is a moment of opportunity. Their leaders, Putin, Khamenei and Assad, are rational and cunning adversaries. Each has demonstrated the ability to work with us when it served his own and his country’s interests, or to resist our threats and recriminations when it did not. Each of them has a hand to play, and yet each knows that he risks being left behind if he fails to seize the moment.

For Barack Obama, as well, this is a moment of opportunity. But as he has suggested when speaking of the economic challenges we face, in the world of foreign policy, our major challenges are interconnected. He cannot put any aside for another day.

Iraq. Afghanistan. Iran. Pakistan. Al Qaeda. Israel-Palestine. Lebanon. Energy. Venezuela. For each of these, Russia, Iran and Syria—in one combination or another—can be the fulcrum for success or failure.

This is the President’s moment.

Reality bites.

This Sunday, the New York Times asked a panel of economists, “When Will the Recession Be Over?” A few panelists offered hopeful words, ‘Perhaps later this year… if there are no more surprises.’ The eternally pessimistic Nouriel Roubini suggested three years… or more. One sage observer offered the wisdom of bubbles past: You don’t reach the bottom until people stop asking.

We are having a hard time accepting that recovery will take time. Leveraging, or getting into debt, is a lot of fun. For twenty years or so, as interest rates declined and lending standards loosened, America went on a debt-funded spending spree. Across the country, as housing prices rose and home-equity lending came into vogue, Americans used their access to money to live beyond their current incomes, creating an illusion of prosperity and growth.

Deleveraging, on the other hand, is not fun. It ultimately requires reducing debt. Actually getting rid of it. For American households—whose real incomes have been flat for a decade or more—it means returning to the standard of living that they could afford before the borrowing spree started, adjusted further downward to allow them to pay off the debts they accumulated during the boom years.

So far, our public policy responses to the housing collapse and banking crisis have largely amounted to various strategies for shifting the debt burden around. In the name of stability, the TARP program socializes the losses from our financial sector. Now, in a similar vein, we are proposing to tackle the problem of home foreclosures. But unlike the TARP program, which puts the bank losses on the broad shoulders of the Federal government, the strategies to boost the housing market will shift the losses experienced by current homeowners onto the next generation of homebuyers.

Consider this. In 1981, the median home price was $62,000, and the annual cost of financing the purchase of that home at the then-current 16.6% mortgage rate, with a 20% down payment, was $8,900. That $8,900 was 47% of the median family income at the time, $19,000, indicating that the median-priced home was not affordable for most families.

As interest rates declined through the 1980s and 90s, home prices escalated as affordability increased. By 1998, the cost of carrying an 80% mortgage on a $128,400 median-priced home had dipped to $8,228, or just 21% of the 1998 median family income.

By 2007, the median home price had increased a further 70%, to $217,800. Thirty-year mortgage rates dipped only another 1% or so, but home price increases were aided by the advent of all sorts of “creative” mortgages that continued to reduce buyers’ monthly payments.
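For readers who want to check the arithmetic, here is a minimal sketch in Python using the standard formula for a 30-year amortizing loan. The 1998 and 2007 rates are assumed approximate market averages (the text does not give them), and the 1981 figure above appears to include costs beyond bare principal and interest, so the outputs land near, but not exactly on, the numbers cited.

    def annual_mortgage_cost(price, rate, down=0.20, years=30):
        """Annual principal-and-interest cost of a standard amortizing mortgage."""
        principal = price * (1 - down)   # loan amount after the down payment
        r = rate / 12                    # monthly interest rate
        n = years * 12                   # number of monthly payments
        monthly = principal * r / (1 - (1 + r) ** -n)
        return monthly * 12

    # The three data points discussed above; the two later rates are assumptions.
    for year, price, rate in [(1981, 62_000, 0.166),
                              (1998, 128_400, 0.0707),
                              (2007, 217_800, 0.0634)]:
        print(year, round(annual_mortgage_cost(price, rate)))
        # approximately: 1981 -> $8,290; 1998 -> $8,260; 2007 -> $13,000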

For more than two decades, the growth in home prices was made possible by the long-term decline in mortgage interest rates, and at the late stage of the bubble by interest only, variable rate, and teaser-rate mortgages. Despite all hopes for a revival of the real estate market, and particularly a new period of growth in home prices, this is not likely to happen.

Current Federal strategies to re-stimulate the housing market to address the foreclosure problem are ill-advised. Over the past several months, the Federal Reserve has initiated efforts to push long-term mortgage rates down toward 4.5% by purchasing mortgage-backed securities. In addition, the newly enacted stimulus package included an $8,000 first-time homebuyers tax credit.

The problem with these efforts is that they will not fix the fundamental problem, but instead will simply push the problem—the loss of home equity—onto the next generation of homebuyers.

Consider this example. Take the median US home that was worth $220,000 during the years 2005 to 2007, but which might be worth $180,000 today, reflecting a loss in value of nearly 20%. This reduced home price, with a market-rate 6% mortgage and 20% down, would cost the new owner around $10,500 annually. However, with a 4.5% mortgage rate and the $8,000 tax credit, this new owner can afford to pay $215,000 and still pay only around $10,500 annually.
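A similar sketch, inverting the same annuity formula, shows the price a payment-focused buyer can carry at each rate; with the round numbers in the example, the two scenarios come out within a percent or two of $180,000 and $215,000. This is an illustrative calculation under those assumptions, not a model from the text.

    def price_for_payment(annual_payment, rate, down=0.20, years=30):
        """Home price a buyer can carry for a given annual payment (principal and interest only)."""
        r = rate / 12                    # monthly interest rate
        n = years * 12                   # number of monthly payments
        loan = (annual_payment / 12) * (1 - (1 + r) ** -n) / r  # present value of the payments
        return loan / (1 - down)         # gross up for the 20% down payment

    budget = 10_500                                   # annual payment from the $180,000, 6% case
    print(round(price_for_payment(budget, 0.06)))     # about $182,000 at the 6% market rate
    print(round(price_for_payment(budget, 0.045)))    # about $216,000 at the subsidized 4.5% rate

Note that the $8,000 tax credit then covers most of the larger down payment, which is how a buyer with the same annual budget ends up paying roughly $35,000 more for the same house.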

This is the same game that we have watched for the better part of two decades. The buyer—who has been taught to focus on the monthly payment as the measure of “affordability”—is willing to pay the higher price for a home because of the availability of low-cost financing. The seller is happy, because they receive close to the 2005-2007 price of their home. For two decades, this logic worked, because interest rates were continuing to drop and home prices were continuing to rise.

But the situation today is different, creating two very real problems. First, these policies constitute deliberate inducements, as a matter of public policy, for homebuyers to pay over-market prices for homes. It is reasonable to expect that once the Federal actions that induced the purchase cease—the artificially low mortgage rates and the tax credit—the market price of the home the buyer purchased for $215,000 in the example above will fall back to its current value of $180,000.

Therefore, the impact of these policies will be to benefit—or “bail out”—the current homeowners who are facing substantial losses, by passing those losses on to the new homebuyers.

Second, and equally important, new homebuyers should be on notice that the “great deals” that they might see in the real estate market today are only great in comparison to prices at the high point of the real estate bubble. The implied suggestion is that once the current mess is behind us, home prices will once again begin to rise. But that is not likely to be the case.

There are two simple reasons for this. First, tightened rules governing mortgage banking will end the lending practices that artificially lowered the carrying costs of purchasing a home and supported the run-up in home prices. Traditional conforming mortgages with real down payments and more conservative underwriting standards will once again tie home affordability to household incomes and long-term mortgage costs.

Second, long-term mortgage rates are more likely to rise than fall, once the Federal Reserve curtails its market intervention to suppress mortgage rates, and particularly if Congressional action allows judicial rewriting of mortgage contracts, which will undermine the security of—and therefore increase the cost of—mortgage loans.

Many will argue that since we have chosen to bail out the banks, it is only fair that we bail out homeowners. That is a fair argument, and one that Hank Paulson and Ben Bernanke and Congress should have considered before we began our long walk down this path.

But Federal actions to artificially boost home values will not socialize the losses in home values, but instead will literally pass one family’s loss on to the next. Like the TARP program, the fundamental problem is that the losses are real, and try as we might to shift them around to avoid the pain, they will not go away.

Thursday, February 26, 2009

Learn from Google. Make the federal budget free.

Why tax people?

Really. We now know that no one likes paying taxes. The presumption was that Democrats liked taxes and that Republicans were opposed to them. But clearly that hypothesis was proven wrong. First, when the man who now sits atop the IRS, Tim Geithner, was nominated to be Treasury Secretary, it turned out that he preferred not to pay taxes.

But that failed to prove the point, as for many it was unclear whether Geithner was actually a Democrat. But Tom Daschle turned the trick. It finally was clear that Democrats, like Republicans, do not like to pay taxes.

Back in the day, taxes were not the issue. Spending was the issue. Back then, when everyone presumed that balancing budgets was among the sole tasks that our members of Congress were charged to perform—that and trashing the UN—Republicans liked to spend less and tax less. Democrats liked to spend more… and were somewhat agnostic on taxes. It was not that they liked taxes per se, but taxes were a necessary step to getting to spend more.

Then Ronald Reagan changed everything, and all assumptions were cast to the wind. Since the Reagan Presidency, Republicans learned that spending really was not so bad, as long as taxes didn't have to pay for it. At first they voiced horror at the fiscal consequences of tax cuts, but, in time, they got over it.

And in time it really annoyed the hell out of Democrats. Ronald Reagan had led the Republican Party to the Promised Land. Cut taxes, spend money and let the chips fall where they may.

The premise was simple. It was unarguable. At any level of taxation, there is a lower level that will put more money back in the hands of taxpayers, and that will provide more resources for businesses to hire people and spur the economy onward.

It has become axiomatic. At every level of taxation, there is a lower level that if achieved will spur on the economy.

Therefore, following the logic to its natural conclusion, the optimal tax rate is zero.

Unless you need money. For stuff. Guns. Butter. You know. Stuff.

Or so we thought.

Today, we are approaching political Nirvana. In the final great leap of bipartisanship, the new administration is reaching for a new middle ground. Cut taxes in a nod to Republicans (and, it turns out, to everyone else, who also prefers not to pay taxes). And increase spending. Because… Well. Because we can. Because we must.

And forget all those arguments about the expiring 2001 and 2003 Bush tax cuts. That is not a tax increase. That is just reality once again coming back to bite us.

Pardon the digression, but those tax cuts marked the beginning of the end of any integrity in tax policy. The scoring rules at the time required that tax legislation be budget neutral over a ten-year horizon. Congress was unable to pay for the tax cuts with other increases or spending cuts, so they paid for them by having them expire in year eight or so. So they complied with the ten-year scoring rules. Kind of. Lots of cuts for eight years. Lots of revenue to pay for them in years nine and ten.

Back in 2001, as they contemplated years of tax cuts that would suddenly expire, people jokingly referred to 2010 as “the year we push momma from the train,” because in 2011 the estate tax would rise dramatically back to its 2001 level. Well, here we are in year eight, and there is no need to worry about the estate tax. The estates were invested with Bernie Madoff.

Everyone gives lip service to debt being the problem that got us into our current mess. Not passing on to our children “a debt they cannot pay” was the great bipartisan applause line of the President’s speech the other night. They applaud fiscal responsibility. They just don’t believe in it. Or know what it is.

Look at the record over the past two decades. Our economic performance has been flat, other than the growth that we have literally purchased with debt. As a nation, we are like households whose real income has been flat for a decade, but who fund an increasing standard of living—new electronics, cruises, home improvements—through more and more borrowing. For years now, as a nation, our GDP growth has increasingly been purchased with imported capital.

Really.

Take a look at the new federal budget. $3.55 trillion of spending. A $1.75 trillion deficit. Maybe we have reached the tipping point. Finally, our revenues, at roughly $1.80 trillion, are barely half our budget, and we can begin to migrate our tax rates to their optimal level.

Zero.

That does not mean we will have a 100% deficit. Far from it. We will still have cattle grazing fees.

And we will have the loan guarantee fees that the Federal Reserve charges for guaranteeing private debt. Those should be growing.

Oh. Sorry. Those aren’t in the budget.

My bad.

The return of the business cycle.

Twenty years or so ago, when I was planning a move to California for a new job, I listened to an interview regarding a study of the psychological effects of recessions on individuals and communities. Specifically, the study compared the incidence of mental illness, depression and suicide in Los Angeles during the recession there in the early 1980s with New Hampshire during the 1970s.

Apparently, mental illness, depression and suicide were more prevalent in southern California at the time than had been the case in New Hampshire during the previous decade. The study attributed much of the difference in the experiences of the affected communities to differences in family and community structure. In New Hampshire, people continued to live in extended families and extended communities. During the economic downturn, older members of the community would tell stories about earlier recessions, and pass on the wisdom of the elders:

Economic cycles are part of life, like the seasons. Families need to cut back and save as the downturn approaches. During the downturn, workers need to be patient and improve their job skills. And, like the seasons, this is normal, and like a hard winter, this too will pass.

Los Angeles was a very different place, where people had moved to a new world of optimism and opportunity. But in the face of an economic downturn, that perspective on life and the economic seasons was missing. Instead, lacking the wisdom of the grandparents and elders in the community, the fear and pessimism that come with job losses and economic decline were exacerbated and reinforced.

Over the past months, our country has responded to the recession with fear and pessimism that has been largely unchecked by an historical perspective on economic cycles. We have responded as Los Angeles responded, and the politicians and the pundits are exacerbating the fears expressed in conversations around kitchen tables, in beauty salons and at Starbucks.

Looking back, it is apparent that the depth and severity of our current economic downturn is due in large measure to the success of the Federal Reserve in forestalling significant periods of economic downturn for much of the past twenty-five years. We forgot—as individuals, as families and as businesses—that the economy is cyclical.

But even worse, we lost the value of periodic recessions as a cleansing and humbling time. For individuals and families, recessions are a time to take stock, to cut back on our materialist tendencies, to save, to pay down debts, to retool our skills. For businesses, recessions are a time to rethink strategy, to close marginal operations, and improve attention to costs and excess. For bankers, it is time to learn to write off bad debts and tighten up lending standards, and perhaps teach young associates the basic principle that when there are no profits, there are no bonuses.

In 1997, Foreign Affairs magazine published “The End of the Business Cycle,” trumpeting the success of the West in conquering the business cycle.

“Business cycles -- expansions and contractions across most sectors of an economy -- have come to be taken as a fact of life. But modern economies operate differently than nineteenth-century and early twentieth-century industrial economies. Changes in technology, ideology, employment, and finance, along with the globalization of production and consumption, have reduced the volatility of economic activity in the industrialized world. For both empirical and theoretical reasons, in advanced industrial economies the waves of the business cycle may be becoming more like ripples.”

Lost in the triumphalism of the article was recognition of the important role that periodic economic downturns play in stemming the exuberance, the hubris and the bad habits that build up during the expansionary phase of the economic cycle. For the past twenty-five years, the Federal Reserve has managed to forestall the regular economic downturns that previously characterized the post-war years. The dramatic increases in labor productivity that came about through computerization and changes in information technology, and the suppression of wage inflation that resulted from globalization and outsourcing, combined to suppress inflationary pressures.

As a result, the economy ploughed forward through crisis after crisis, through failure and fraud. The collapse of Continental Illinois. The savings and loan crisis. The bankruptcy of Drexel Burnham. The Asian financial crisis. The Russian financial crisis. The Internet bubble. The collapse of Long-Term Capital Management. September 11th. Enron. WorldCom. At each point of crisis or threat to the financial markets and investor confidence, the Fed was able to forestall downturns and spur continued economic growth by flooding liquidity into the system or pushing down interest rates, with little concern for the normal inflationary consequences.

Ten years later, in 2007, Business Week Chief Economist Michael Mandel articulated the new economic paradigm on his Economics Unbound blog.

“We now may be in a world of mini-recessions—sharp falls in one or two sectors which do not pull down the whole economy… A sharp drop in one sector—say, housing—may pull down a couple of adjacent sectors, such as furniture. But the rest of the economy steams on, and maybe even accelerates, as resources are transferred from the weak sectors to the strong sectors.

“This picture of the world actually fits very well with neoclassical economics. We may get a couple of quarters of negative GDP growth, but deep economy-wide recessions may be an anomaly rather than the norm.”

Now, we have learned that there is no new paradigm, and the business cycle is still with us. But this time, without periodic downturns to temper our exuberance and stem our excesses, we are paying a heavy price.

Consider this. Since Paul Volcker stepped down as the Chairman of the Federal Reserve twenty years ago, median family income has increased ten percent in real terms. During that same period, household mortgage debt increased almost six-fold and consumer credit more than tripled, and financial institution debt grew more than eight-fold. Together, household and financial institution debt increased by over $24 trillion.

Even now, over a year into this financial crisis, we have yet to fully accept the depth of pain and dislocation that deleveraging may require. As we face the consequences of the boom years, we are going to need old wisdom as much as we need new policies. We are going to need to remain calm in the face of 24-hour cable shows playing on our fears and trumpeting every moment of our economic travails. Last night, President Obama tried to move beyond the position of policy wonk-in-chief, toward the role of the grandfather in New Hampshire. He scolded us for our excesses, while reminding us that the economy is cyclical and will rebound in time.

But what will we have learned when this moment is past? Will students of the Dismal Science no longer see the “end of the business cycle” as the Holy Grail of economic policy? Will we accept that the cycles of economic life are a necessary—and ultimately productive—check on human tendencies toward excess and exuberance? Or will we quickly fall prey to the hubris of policymakers and pundits who, as we ride the next wave, will assure us that this time, once again, things will be different?

Saturday, February 21, 2009

Take a deep breath, and let go.

This week, the Federal Deposit Insurance Corporation took over Silver Falls Bank in Silverton, Oregon. Silver Falls Bank was the 14th bank taken over by the FDIC this year. This compares with 25 banks that failed in 2008 and 3 in 2007. Bank failure is not unheard of. And up until a week or so ago, the term “nationalization” was not invoked.

Since its creation in 1933, the role of the FDIC has been to prevent runs on banks by insuring bank deposits and to oversee the orderly disposition of failed banks. For the better part of a century, it has done its job quietly and effectively. And today, we would all be well served to let the FDIC and its capable leader, Sheila Bair, do their job.

From the beginning of the current financial crisis, one of the problems has been the failure of the leading agents of the government, embodied by Hank Paulson and Ben Bernanke, to establish clear rules and follow them. Instead, we have plodded along, from crisis point to crisis point. From Bear Stearns to Fannie Mae to Merrill Lynch to Lehman Brothers to AIG to Washington Mutual, each collapse engendered a unique response by the Treasury and the Federal Reserve.

In a similar manner, the focus of the $700 billion Troubled Asset Relief Program—the federal bailout—to address the insolvency of the nation’s largest banks has veered from the purchase of toxic assets to injections of capital to guarantees of assets. Now once again, toxic asset purchases are back in vogue as the strategy of choice, this time under the “good bank, bad bank” rubric.

This week, the stock market broke through its technical support levels, and now appears headed toward 6,000 as its next support level. Some observers have suggested that the decline reflected the market response to looming plans for the “nationalization” of the banking system and one more step down the road to socialism, as trumpeted on the cover of Newsweek.

But the market decline was not a result of fear of nationalization, and nationalization would not mark the next milestone on the road to socialism. Quite the contrary. Investors are running away from banks—good banks and bad banks alike—precisely because the federal efforts to date have obscured the true financial condition of the banks. Faced with uncertainty and poor information, investors will always pull back and wait for the fog to clear.

The takeover of insolvent banks by the FDIC is the way the process is supposed to work, and the way it has always been allowed to work—up until now. For all of the debates over the “Swedish Model”—where banks were taken over, balance sheets reconfigured, and then spun back out to private ownership—the way they did it in Sweden is not actually all that different from the way they do it at the FDIC, when the FDIC is allowed to do its job. Insolvent banks are seized. Assets are sold off and the depositors are paid or, if possible, the balance sheet is cleaned up and the bank is sold off to a new owner.

The problem today is that a small number of our insolvent banks, notably Citi, are very big and very visible. But the problems they face are the problems that the FDIC was created to fix. This is not nationalization; it is essentially a bankruptcy-style receivership in which the FDIC takes control of the failed institution.

If left to do its job, the FDIC would do what the banks resolutely refuse to do: sell their bad assets, accept the price of their business decisions, and move on. The banks refuse to do it because it would force them to face up to what the markets, and increasingly outraged taxpayers, have known for a while: They are insolvent.

For years, America has told other countries how to deal with financial crises: Cut your losses. Clean up your balance sheets. Get on with it.

This week, the stock market said the same thing.

On a side note, the Ford Motor Company—the one that is not taking federal money—has seen its market share rise steadily for the past four months. This is the way markets are supposed to work. Saving one company or another—or one bank or another—is not an inherent public good, however politically compelling. And pumping public money into one company serves to dramatically disadvantage their competitors.

One of the biggest mistakes that Hank Paulson made was demanding that banks take TARP money, even if they didn’t want to—or need to—in order to remove the stigma from those who did. Somehow, this was supposed to be a way of maintaining confidence in the system. But instead of protecting the bad banks, by letting them hide among the good, it has achieved the opposite. That is why investors have turned their back and are walking away.

The banking industry and the markets would be better served if the politicians and the pundits quieted their politically loaded rhetoric about nationalization, and if the Treasury and the Fed let the process work, as it has been designed to work. Forget about creating good banks and bad banks. Let insolvent banks take their medicine. Management, shareholders and bondholders will pay a steep price for business failure. And let the banks that are healthy take market share from those that are not.

It is time to let the process work. Punish failure. Reward success. This is not nationalization; it is the way the system is designed to work. Time to take the banks off the dole, and let Sheila Bair do her job.