Tuesday, October 28, 2008

A 21st Century Bretton Woods

Source: WSJ

There wasn't much to see in Bretton Woods in July 1944, when delegates from 44 countries checked into the sprawling Mount Washington Hotel for the United Nations Monetary and Financial Conference. Almost a million acres of New Hampshire forest surrounded the site; there were free Coca-Cola dispensers, but few other distractions.

In this scene of rustic isolation, 168 statesmen (and one lone stateswoman, Mabel Newcomer of Vassar College) joined in history's most celebrated episode of economic statecraft, remaking the world's monetary order to fend off another Great Depression and creating an unprecedented multinational bank, to be focused on postwar reconstruction and development.

At the Final Plenary, a sea of black-tied delegates gave a standing ovation to British economist John Maynard Keynes, whose intellect had permeated the three weeks of talks. Lord Keynes paid tribute to his far-seeing colleagues, who had performed a task appropriate "to the prophet and to the soothsayer."

The Bretton Woods conference has acquired mythical status. To economic-history buffs, it's akin to the gathering of the founding fathers at the constitutional convention. To politicians anxious to make their marks upon the world, it's a moment to be richly envied. The recent calls from British Prime Minister Gordon Brown and French President Nicolas Sarkozy for a new Bretton Woods conference, to which the Bush administration has acceded, have caused TV crews to descend upon the old hotel, which has undergone a $50 million facelift. But Bretton Woods revivalism is nothing new. Indeed, it's a long tradition.

After the onset of the Latin American debt crisis in 1982, U.S. Treasury Secretary Donald Regan floated the idea of a new Bretton Woods to steady the hemisphere's currencies. The following year, reeling from three devaluations of the franc, French President Francois Mitterrand declared, "The time has really come to think in terms of a new Bretton Woods. Outside this proposition, there will be no salvation." Mitterrand persisted in this grandiloquence over the next two years. He finally quieted down in 1985, when Margaret Thatcher dismissed his proposal as "generalized jabberwocky."

In the wake of the emerging-market crises of 1997-98, Bretton Woods nostalgia broke out again -- this time in post-Thatcher Britain. "We should not be afraid to think radically and fundamentally," Tony Blair opined. "We need to commit ourselves today to build a new Bretton Woods for the next millennium." The precise content of Mr. Blair's millennial ambition was, shall we say, vague. But no fellow leader was rude enough to say so.

Among acts of international economic statesmanship, perhaps only the Marshall Plan has been invoked more frequently. There have been calls for a Marshall Plan for postcommunist eastern Europe, a Marshall Plan for Africa, a Marshall Plan for the inner cities. Indeed, anybody wanting Washington to splurge finds Marshall exceedingly convenient.

But Bretton Woods has a richer and more rarefied cachet. It was about reordering the international system, not just mobilizing money for an enlightened cause. And whereas the Marshall Plan was an example of the unilateralism for which the U.S. is known, the Bretton Woods conference was a triumph of multilateral coordination. It featured countries as diverse as Honduras, Liberia and the Philippines (Keynes spoke disdainfully of a "most monstrous monkey-house"), though it did not include South Korea or Japan, important voices in today's economic summitry.

Both sides of the Bretton Woods achievement seem alluring today, yet both may be chimerical. The conference rebuilt the economic order by creating a system of fixed exchange rates. The aim was to prevent a return to the competitive devaluations best illustrated by the "butter wars." In 1930 New Zealand secured a cost advantage for its butter exports by devaluing its money; Denmark, its main butter rival, responded with its own devaluation in 1931; the two nations proceeded to chase each other down with progressively more drastic devaluations.

This beggar-thy-neighbor behavior added to the protectionism that brought the world to ruin, and the Bretton Woods answer was simple. In the postwar era, the dollar would be anchored to gold, and other currencies would be anchored to the dollar: No more fluctuating money, ergo no competitive devaluation. To undergird this system, the Bretton Woods architects created the International Monetary Fund, which was far more central to their ambitions than their other legacy, the World Bank. If a country's fixed exchange rate led it into a balance of payments crisis, the IMF would bail it out and so avert devaluation.

Today the idea of another monetary rebirth has much to recommend it. The credit bubble that has wreaked havoc on the world's financial markets has its origins in a two-headed monetary order: Some countries allow their currencies to float, while others peg loosely to the dollar. Over the past five years or so, this mixture created a variation on the 1930s: China, the largest dollar pegger, kept its currency cheap, driving rival exporters in Asia to hold their exchange rates down also. Thanks to this new version of competitive currency manipulation, the dollar-peggers racked up gargantuan trade surpluses. Their earnings were pumped back into the international financial system, inflating a credit bubble that now has popped disastrously.

Persuading China to change its currency policy would be a worthy goal for a new Bretton Woods conference. But currency reform is low on the agenda of the summit that the Bush administration plans to host on Nov. 15. (The administration styles this gathering a "G-20 meeting," ignoring the European talk of a Bretton Woods II.) The British and French leaders who pushed for the meeting want instead to talk about financial regulation -- how to fix rating agencies, how to boost transparency at banks and so on. But many of these tasks require minimal multilateral coordination.

If the Europeans shrink from demanding that China cease pegging to the dollar, it's perhaps because they anticipate the concession that would be asked of them. China isn't going to give up its export-led growth strategy for the sake of the international system unless it gets a bigger stake in that system -- meaning a much bigger voice within the International Monetary Fund and a corresponding reduction in Europe's exaggerated influence. When you strip out the blather about bank transparency and such, this is the core bargain that needs to be struck. Naturally, the Europeans aren't proposing it.

It will be up to the two great powers -- the U.S. and China -- to fashion the deal that brings China into the heart of the multilateral system. Here, too, is an echo of the first Bretton Woods, for underneath the camouflage of a multilateral process there was a bargain between two nations. Britain, the proud but indebted imperial power, needed American savings to underpin monetary stability in the postwar era; the quid pro quo was that the U.S. had the final say on the IMF's design and structure. Today the U.S. must play Britain's role, and China must play the American one.

There's a final twist, however. In the 1940s the declining power practiced imperial trade preferences; the rising power championed an open world economy. When Franklin Roosevelt told Winston Churchill that free trade would be the price of postwar assistance, he was demanding an end to the colonial order and the creation of a level playing field for commerce. "Mr. President, I think you want to abolish the British empire," Churchill protested. "But in spite of that, we know you are our only hope."

Today it is the rising power that pursues mercantilist policies via its exchange rate. China's leadership, which sits atop an astonishing $2 trillion in foreign-currency savings, could trade a promise to help recapitalize Western finance for an expanded role within the IMF. But China may simply not be interested. The future of the global monetary system depends on whether China aspires to play the role of Roosevelt -- or whether it prefers to be a modern Churchill.

Monday, October 27, 2008

What History Tells Us About the Market

Source: WSJ

July 9, 1932, was a day Wall Street would never wish to relive. The Dow Jones Industrial Average closed at 41.63, down 91% from its level exactly three years earlier. Total trading volume that day was a meager 235,000 shares. "Brother, Can You Spare a Dime?" was one of the top songs of the year. Investors everywhere winced with the pain of recognition at the patter of comedian Eddie Cantor, who sneered that his broker had told him "to buy this stock for my old age. It worked wonderfully. Within a week I was an old man!"

The nation was in the grip of what U.S. Treasury Secretary Ogden Mills called "the psychology of fear." Industrial production was down 52% in three years; corporate profits had fallen 49%. "Many businesses are better off than ever," Mr. Cantor wisecracked. "Take red ink, for instance: Who doesn't use it?"

Banks had become so illiquid, and depositors so terrified of losing their money, that check-writing ground to a halt. Most transactions that did occur were carried out in cash. Alexander Dana Noyes, financial columnist at the New York Times, had invested in a pool of residential mortgages. He was repeatedly accosted by the ringing of his doorbell; those homeowners who could still keep their mortgages current came to Mr. Noyes to service their debts with payments of cold hard cash.

Just eight days before the Dow hit rock-bottom, the brilliant investor Benjamin Graham -- who many years later would become Warren Buffett's personal mentor -- published "Should Rich but Losing Corporations Be Liquidated?" It was the last of a series of three incendiary articles in Forbes magazine in which Graham documented in stark detail the fact that many of America's great corporations were now worth more dead than alive.

More than one out of every 12 companies on the New York Stock Exchange, Graham calculated, was selling for less than the value of the cash and marketable securities on its balance sheet. "Banks no longer lend directly to big corporations," he reported, but operating companies were still flush with cash -- many of them so flush that a wealthy investor could theoretically take over, empty out the cash registers and the bank accounts, and own the remaining business for free.

Graham summarized it this way: "...stocks always sell at unduly low prices after a boom collapses. As the president of the New York Stock Exchange testified, 'in times like these frightened people give the United States of ours away.' Or stated differently, it happens because those with enterprise haven't the money, and those with money haven't the enterprise, to buy stocks when they are cheap."

After the epic bashing that stocks have taken in the past few weeks, investors can be forgiven for wondering whether they fell asleep only to emerge in the waking nightmare of July 1932 all over again. The only question worth asking seems to be: How low can it go?

Make no mistake about it; the worst-case scenario could indeed take us back to 1932 territory. But the likelihood of that scenario is very much in doubt.

Robert Shiller, professor of finance at Yale University and chief economist for MacroMarkets LLC, tracks what he calls the "Graham P/E," a measure of market valuation he adapted from an observation Graham made many years ago. The Graham P/E divides the price of major U.S. stocks by their net earnings averaged over the past 10 years, adjusted for inflation. After this week's bloodbath, the Standard & Poor's 500-stock index is priced at 15 times earnings by the Graham-Shiller measure. That is a 25% decline since Sept. 30 alone.

The Graham P/E has not been this low since January 1989; the long-term average in Prof. Shiller's database, which goes back to 1881, is 16.3 times earnings.

But when the stock market moves away from historical norms, it tends to overshoot. The modern low on the Graham P/E was 6.6 in July and August of 1982, and it has sunk below 10 for several long stretches since World War II -- most recently, from 1977 through 1984. It would take a bottom of about 600 on the S&P 500 to take the current Graham P/E down to 10. That's roughly a 30% drop from last week's levels; an equivalent drop would take the Dow below 6000.

Could the market really overshoot that far on the downside? "That's a serious possibility, because it's done it before," says Prof. Shiller. "It strikes me that it might go down a lot more" from current levels.

In order to trade at a Graham P/E as bad as the 1982 low, the S&P 500 would have to fall to roughly 400, more than a 50% slide from where it is today. A similar drop in the Dow would hit bottom somewhere around 4000.
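The Graham P/E arithmetic running through the last few paragraphs can be sketched in a few lines. This is a toy illustration, not real market data: the index level is an assumed round figure chosen only to reproduce the 15x multiple and the target scenarios quoted above.

```python
# Toy sketch of the Graham P/E arithmetic described above.
# The index level is an assumption for illustration, not actual S&P 500 data.

def graham_pe(price, real_earnings_10y):
    """Price divided by the average of ten years of inflation-adjusted earnings."""
    return price / (sum(real_earnings_10y) / len(real_earnings_10y))

def implied_index_level(current_price, current_pe, target_pe):
    """Index level that would pull the Graham P/E down to target_pe,
    holding the 10-year average of real earnings fixed."""
    avg_earnings = current_price / current_pe
    return target_pe * avg_earnings

sp500 = 900.0  # assumed index level at the article's 15x multiple
print(round(implied_index_level(sp500, 15, 10)))   # 600: the P/E-of-10 scenario
print(round(implied_index_level(sp500, 15, 6.6)))  # 396: roughly the 1982-low scenario
```

Under these assumed inputs, a fall to a Graham P/E of 10 implies an index level of 600 (about a 30% drop), and matching the 1982 low of 6.6 implies roughly 400, consistent with the declines the article cites.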

Prof. Shiller is not actually predicting any such thing, of course. "We're dealing with fundamental and profound uncertainties," he says. "We can't quantify anything. I really don't want to make predictions, so this is nothing but an intuition." But Prof. Shiller is hardly a crank. In his book "Irrational Exuberance," published at the very crest of the Internet bubble in early 2000, he forecast the crash of Nasdaq. The second edition of the book, in 2005, insisted (at a time when few other pundits took such a view) that residential real estate was wildly overvalued.

The professor's reluctance to make a formal forecast should steer us all away from what we cannot possibly know for certain -- the future -- and toward the few things investors can be confident about at this very moment.

Strikingly, today's conditions bear quite a close resemblance to what Graham described in the abyss of the Great Depression. Regardless of how much further it might (or might not) drop, the stock market now abounds with so many bargains it's hard to avoid stepping on them. Out of 9,194 stocks tracked by Standard & Poor's Compustat research service, 3,518 are now trading at less than eight times their earnings over the past year -- or at levels less than half the long-term average valuation of the stock market as a whole. Nearly one in 10, or 876 stocks, trade below the value of their per-share holdings of cash -- an even greater proportion than Graham found in 1932.
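The Graham-style screen the paragraph above describes reduces to a one-line filter: keep any stock whose price sits below its per-share holdings of cash. A minimal sketch, with company names and figures invented purely for illustration:

```python
# Toy data: (name, share price, cash & marketable securities per share).
# All names and numbers here are invented for illustration.
stocks = [
    ("Alpha Mfg.", 4.00, 6.50),
    ("Beta Rail", 12.00, 3.00),
    ("Gamma Oil",  2.50, 2.40),
]

# The Graham-style screen: flag stocks trading below their cash per share.
below_cash = [name for name, price, cash_ps in stocks if price < cash_ps]
print(below_cash)  # ['Alpha Mfg.']
```

Run across a real universe of 9,194 stocks, the same comparison is what produces the 876-stock count the article reports.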

Those numbers testify to the wholesale destruction of the stock market's faith in the future. And, as Graham wrote in 1932, "In all probability [the stock market] is wrong, as it always has been wrong in its major judgments of the future."

In fact, the market is probably wrong again in its obsession over whether this decline will turn into a cataclysmic collapse. Eugene White, an economics professor at Rutgers University who is an expert on the crash of 1929 and its aftermath, thinks that the only real similarity between today's climate and the Great Depression is that, once again, "the market is moving on fear, not facts." As bumbling as its response so far may seem, the government's actions in 2008 are "way different" from the hands-off mentality of the Hoover administration and the rigid detachment of the Federal Reserve in 1929 through 1932. "Policymakers are making much wiser decisions," says Prof. White, "and we are moving in the right direction."

Investors seem, above all, to be in a state of shock, bludgeoned into paralysis by the market's astonishing volatility. How is Theodore Aronson, partner at Aronson + Johnson + Ortiz LP, a Philadelphia money manager overseeing some $15 billion, holding up in the bear market? "We have 101 clients and almost as many consultants representing them," he says, "and we've had virtually no calls, only a handful." Most of the financial planners I have spoken with around the country have told me much the same thing: Their phones are not ringing, and very few of their clients have even asked for reassurance. The entire nation, it seems, is in the grip of what psychologists call "the disposition effect," or an inability to confront financial losses. The natural way to palliate the pain of losing money is by refusing to recognize exactly how badly your portfolio has been damaged. A few weeks ago, investors were gasping; now, en masse, they seem to have gone numb.

The market's latest frame of mind seems reminiscent of a passage from Emily Dickinson's poem "After Great Pain a Formal Feeling Comes":

This is the Hour of Lead --
Remembered, if outlived,
As Freezing persons recollect the Snow --
First -- Chill -- then Stupor -- then the letting go.

This collective stupor may very likely be the last stage before many investors finally let go -- the phase of market psychology that veteran traders call "capitulation." Stupor prevents rash action, keeping many long-term investors from bailing out near the bottom. When, however, it breaks and many investors finally do let go, the market will finally be ready to rise again. No one can spot capitulation before it sets in. But it may not be far off now. Investors who have, as Graham put it, either the enterprise or the money to invest now, somewhere near the bottom, are likely to prevail over those who wait for the bottom and miss it.

Corrections & Amplifications:

Charles Schwab Corp. had $600 million of cash freely available at the parent-company level as of June 30. An earlier version of this story incorrectly said the total was $27.8 billion.

Friday, October 17, 2008

Fed Rethinks Stance on Popping Bubbles

Source: WSJ

The Federal Reserve and academics who give it advice are rethinking the proposition that the Fed cannot and should not try to prick financial bubbles.

"[O]bviously, the last decade has shown that bursting bubbles can be an extraordinarily dangerous and costly phenomenon for the economy, and there is no doubt that as we emerge from the financial crisis, we will all be looking at that issue and what can be done about it," Fed Chairman Ben Bernanke said this week.

The bursting of this decade's housing bubble, which was accompanied by a bubble of cheap credit, has wrought inestimable economic damage. The U.S. economy was faltering before the crisis in credit markets recently intensified, rattling financial markets and sending home prices down further. Even if the government's decision to take stakes in major banks works, it could take weeks for money to flow freely again.

"A recession at least of the magnitude of 1982 is quite likely," said ITG economist Robert Barbera. The recession that ended in 1982 lasted 16 months -- twice as long as the 1991 and 2001 recessions -- and saw the unemployment rate rise to 10.8% from 7.2%.

While it is too soon to pronounce an about-face in Fed thinking, policy makers' views clearly are evolving. The Federal Reserve's longtime line on financial bubbles has been that they were impossible to identify. Even if the central bank could identify a bubble, policy makers said, trying to lance it would be far worse for the economy than letting the bubble run its course and dealing with the consequences.

Economists' view that central banks shouldn't meddle with financial bubbles was informed by the Fed's disastrous efforts to pop the stock-market bubble in the late 1920s, which led to the 1929 stock-market crash and contributed to the Great Depression. That view was reinforced when the Bank of Japan's pricking of the late 1980s' stock-market bubble ushered in a decade of economic stagnation.

"[T]he degree of monetary tightening that would be required to contain or offset a bubble of any substantial dimension appears to be so great as to risk an unacceptable amount of collateral damage to the wider economy," former Fed Chairman Alan Greenspan said in 2002.

The Fed's view on bubbles helped fuel what became known as "the Greenspan put" -- the conviction among investors that the Fed would let them take excessive risks and step in to clean up the mess if the bets they made went awry. By giving market participants an incentive to assume greater risk than they would have otherwise, the Fed's laissez-faire position on bubbles may have contributed to the surge in credit that helped push housing prices skyward in the first half of this decade.

Part of the problem was that the Fed applied the lessons of the dot-com bubble to housing and credit, says Harvard University economist Jeremy Stein. When Internet stock prices collapsed in 2000, the economic fallout was contained, because the use of leverage -- borrowing money to magnify bets -- was limited. The housing market is far more dependent on credit, and therefore leverage. As the issuance of mortgages expanded, and investors plunged money into complex securities based on those loans, matters got dangerously out of hand.

Identifying bubbles is tricky, with some seemingly irrational price spikes turning out to be justified. Policy makers need to be careful of valuing their judgment over the collective judgment of the market, because efforts to quash prices could interfere with the crucial role markets play in relaying information and allocating capital.

In recent years, economists have made headway in identifying incipient bubbles. Princeton University's Jose Scheinkman and Wei Xiong have shown how bubbles lead to overtrading -- whether day trading dot-com stocks or flipping condos -- and this might be a useful alert. Researchers at the Bank for International Settlements have flagged excessive credit growth as a warning sign of a bubble.

Once authorities identify a bubble, the next step is figuring out how to deal with it. Fed officials appear uncomfortable with the idea of raising interest rates to prick a bubble, because rates affect a wide swath of economic activity, and a bubble may be confined to just one area.

"Monetary policy, for which we in the Federal Reserve are responsible, is a blunt instrument with economy-wide effects," said Federal Reserve Bank of Minneapolis President Gary Stern. "We should not pretend that actions taken to rein in those asset-price increases, which seemingly outstrip economic fundamentals, won't in the short run curtail to some extent economic growth and employment."

Fed officials are leaning toward regulating financial firms with more of a focus on how they are contributing to risk throughout the financial system. This approach could also have drawbacks, said Princeton economist Hyun Song Shin.

"These Wall Street people are very intelligent, and their incentives are so vast that they're going to find a way to go around the rules you set down," he said. "Leaning against the wind by raising interest rates in the face of what seems like a credit boom is one way of at least damping down on potential excesses."

Wednesday, October 15, 2008

A Short Banking History of the United States

Source: WSJ, by John Steele Gordon

We are now in the midst of a major financial panic. This is not a unique occurrence in American history. Indeed, we've had one roughly every 20 years: in 1819, 1836, 1857, 1873, 1893, 1907, 1929, 1987 and now 2008. Many of these marked the beginning of an extended period of economic depression.

President Andrew Jackson destroying the Bank of the United States. Lithograph, 1828.

How could the richest and most productive economy the world has ever known have a financial system so prone to periodic and catastrophic breakdown? One answer is the baleful influence of Thomas Jefferson.

Jefferson, to be sure, was a genius and fully deserves his place on Mt. Rushmore. But he was also a quintessential intellectual who was often insulated from the real world. He hated commerce, he hated speculators, he hated the grubby business of getting and spending (except his own spending, of course, which eventually bankrupted him). Most of all, he hated banks, the symbol for him of concentrated economic power. Because he was the founder of an enduring political movement, his influence has been strongly felt to the present day.

Consider central banking. A central bank's most important jobs are to guard the money supply -- thereby regulating the economy -- and to act as a lender of last resort to regular banks in times of financial distress. Central banks are, by their nature, very large and powerful institutions. They need to be in order to be effective.

Jefferson's chief political rival, Alexander Hamilton, had grown up almost literally in a counting house on the West Indian island of St. Croix, managing the place by the time he was in his middle teens. He had a profound and practical understanding of markets and how they work, an understanding that Jefferson, born a landed aristocrat who lived off the labor of slaves, utterly lacked.

Hamilton wanted to establish a central bank modeled on the Bank of England. The government would own 20% of the stock, have two seats on the board, and the right to inspect the books at any time. But, like the Bank of England then, it would otherwise be owned by its stockholders.

To Jefferson, who may not have understood the concept of central banking, Hamilton's idea was what today might be called "a giveaway to the rich." He fought it tooth and nail, but Hamilton won the battle and the Bank of the United States was chartered in 1791. It was a big success and its stockholders did very well. It also provided the country with a regular money supply via its own banknotes, and a coherent, disciplined banking system.

But as the Federalists lost power and the Jeffersonians became the dominant party, the bank's charter was not renewed in 1811. The near-disaster of the War of 1812 caused President James Madison to realize the virtues of a central bank and a second bank was established in 1816. But President Andrew Jackson, a Jeffersonian to his core, killed it and the country had no central bank for the next 73 years.

We paid a heavy price for the Jeffersonian aversion to central banking. Without a central bank there was no way to inject liquidity into the banking system to stem a panic. As a result, the panics of the 19th century were far worse here than in Europe and precipitated longer and deeper depressions. In 1907, J.P. Morgan, probably the most powerful private banker who ever lived, acted as the central bank to end the panic that year.

Even Jefferson's political heirs realized after 1907 that what was now the largest economy in the world could not do without a central bank. The Federal Reserve was created in 1913. But, again, they fought to make it weaker rather than stronger. Instead of one central bank, they created 12 separate banks located across the country and only weakly coordinated.

No small part of the reason that an ordinary recession that began in the spring of 1929 turned into the calamity of the Great Depression was the inability of the Federal Reserve to do its job. It was completely reorganized in 1934 and the U.S. finally had a central bank with the powers it needed to function. That is a principal reason there was no panic for nearly 60 years after 1929 and the crash of 1987 had no lasting effect on the American economy.

While the Constitution gives the federal government control of the money supply, it is silent on the control of banks, which create money. In the early days they created money both by making loans and by issuing banknotes; today they do so by extending credit. Had Hamilton's Bank of the United States been allowed to survive, it might well have evolved the uniform regulatory regime a banking system needs to flourish.

Without it, banking regulation was left to the states. Some states provided firm regulation, others hardly any. Many states, influenced by Jeffersonian notions of the evils of powerful banks, made sure they remained small by forbidding branching. In banking, small means weak. There were about a thousand banks in the country by 1840, but that does not convey the whole story. Half the banks that opened between 1810 and 1820 had failed by 1825, as did half those founded in the 1830s by 1845.

Many "wildcat banks," so called because they were headquartered "out among the wildcats," were simple frauds, issuing as many banknotes as they could before disappearing. By the 1840s there were thousands of issues of banknotes in circulation and publishers did a brisk business in "banknote detectors" to help catch frauds.

The Civil War ended this monetary chaos when Congress passed the National Bank Act, offering federal charters to banks that had enough capital and would submit to strict regulation. Banknotes issued by national banks had to be uniform in design and backed by substantial reserves invested in federal bonds. Meanwhile, Congress got the state banks out of the banknote business by putting a 10% tax on their issuance. But national banks could not branch if their state did not allow it, and could not branch across state lines.

Unfortunately state banks did not disappear, but proliferated as never before. By 1920, there were almost 30,000 banks in the U.S., more than the rest of the world put together. Overwhelmingly they were small, "unitary" banks with capital under $1 million. As each of these unitary banks was tied to a local economy, if that economy went south, the bank often failed. As depression began to spread through American agriculture in the 1920s, bank failures averaged over 550 a year. With the Great Depression, a tsunami of bank failures threatened the collapse of the system.

The reorganization of the Federal Reserve and the creation of the Federal Deposit Insurance Corporation hugely reduced the number of bank failures and mostly ended bank runs. But there remained thousands of banks, along with thousands of savings and loan associations, mutual savings banks, and trust companies. While these were all banks, taking deposits and making loans, they were regulated, often at cross purposes, by different authorities. The Comptroller of the Currency, the Federal Reserve, the FDIC, the FSLIC, the SEC, the banking regulators of the states, and numerous other agencies all had jurisdiction over aspects of the American banking system.

The system was stable in the prosperous postwar years, but when inflation took off in the late 1960s, it began to break down. S&Ls, small and local but with disproportionate political influence, should have been forced to merge or liquidate when they could not compete in the new financial environment. Instead Congress made a series of quick fixes that made disaster inevitable.

In the 1990s interstate banking was finally allowed, creating nationwide banks of unprecedented size. But Congress's attempt to force banks to make home loans to people who had limited creditworthiness, while encouraging Fannie Mae and Freddie Mac to take these dubious loans off their hands so that the banks could make still more of them, created another crisis in the banking system that is now playing out.

While it will be painful, the present crisis will at least provide another opportunity to give this country, finally, a unified banking system of large, diversified, well-capitalized banking institutions that are under the control of a unified and coherent regulatory system free of undue political influence.