The Statesman’s Yearbook Online

edited by Dr Barry Turner



FOCUS Archive

The Eurozone Crisis – A Tale of North and South

The North–South divide on the future of the euro is damaging the EU, says Ansgar Belke. In this article, he suggests a compromise by which the two sides might settle their differences to mutual advantage.

After the European summit of June 2012 decided to break the vicious circle between banks and sovereign states, it seemed that political leaders were at last ready to deal with the threat to the euro. But optimism was soon lost in the cacophony of rival interpretations about what had been agreed. Still, the leaders had identified the critical issue: weak banks and weak sovereign states are like two bad swimmers that are pulling each other under water.

But which one should be saved first? Advocates of the Southern view say we should start with the sovereign states, by throwing them the lifejacket of joint-issued debt. In effect, richer countries would guarantee at least part of the debt of weaker ones.

Representatives of the Northern opinion, especially Germany, reckon instead that it is better to start by saving the banks. This would be done through stronger central supervision and the mutualisation of some liabilities in the banking sector, for instance through a joint fund to wind up failing banks and provide a Europe-wide guarantee of bank deposits. In effect, depositors in solid banks would be guaranteeing the savings of those in more fragile ones.

The Southern view is held by countries including Greece, Italy, Portugal and Spain and, since François Hollande took office, France. The Northern approach is taken by Germany, Austria, Finland and the Netherlands and was taken, while Nicolas Sarkozy was president, by France. Both sides recognize the danger that debt mutualisation could bring moral hazard (when protective measures remove the incentive to curb risky behaviour) and higher costs for creditor countries. For the North there is no getting around these problems. For the South these risks can be removed, or at least mitigated, by a careful design of the system. For instance, the Eurozone could impose conditions on countries seeking the benefit of jointly issued debt.

The South considers the panic that can increase borrowing costs and push countries into insolvency as the main threat to the Eurozone. The North reckons that the principal menace stems from removing this market pressure too quickly, dampening the need to reform.

Both speak of the political backlash. For the South it is excessive austerity in debtor nations that should be resisted; for the North it is excessive liabilities in creditor states that can cause resentment.

In some ways, though, the two sides are not so far apart. The North concedes that it is necessary to have some mutualisation of debt, if only to recapitalize banks. The South accepts that debt mutualisation must be limited to avoid moral hazard.

The Southern view: some basics
The main argument of the South runs as follows: since the 1970s economists have warned that a budgetary union would be a necessity for a sustainable monetary union. But the founders of the Eurozone ignored this warning. It is now clear that they were mistaken and that the governments of the euro area member countries face a hard choice. Either they move to a budgetary union or they abandon the euro. A disintegration of the Eurozone would produce huge economic, social and political upheaval. If euro area governments want to avoid this they have to look for strategies that move us closer towards a budgetary union.

A budgetary union, such as that of the US states, appears to be far off. But perhaps there is a strategy of small steps that lead in the right direction. The Southern argument starts with the basic insight that Eurozone governments issue debt in euros, a currency they cannot control. In contrast, standalone countries like the UK, whose central banks can always create the money needed to redeem their bonds, endow bondholders with a guarantee that the cash to pay them at maturity will always be available. Because the governments of the Eurozone cannot deliver such a guarantee, they are vulnerable to upsurges of distrust and fear in the bond markets. These can trigger liquidity crises that drive countries towards default, forcing them to apply austerity programmes that lead to recession and a collapse of weaker banks. This is not to say that countries that have overspent in the past can avoid austerity. It is rather that financial markets, when driven by panic, force austerity on these countries with an intensity that can trigger major social and political backlashes. The effects are there to see in Greece, Italy, Spain and Portugal.

Proponents of the Southern view argue that some form of pooling of government debt is necessary to overcome this vulnerability. In this way, the weakest members of the union would be shielded from destructive upsurges of panic in the financial markets.

They acknowledge two obstacles. The first is moral hazard: countries that profit from the creditworthiness of the strong may exploit it by failing to reduce their debts and deficits. The second is that the strongest countries will pay a higher interest rate on their debts once they become jointly liable for the debts of governments with lower creditworthiness. Debt pooling must therefore be designed in such a way as to overcome both obstacles.

Moderate proponents of the Southern view agree, apparently in line with the Merkel government in Germany, that three principles should be followed. First, debt pooling should be partial – that is, a significant part of the debt must remain the responsibility of the national governments, so as to give them an on-going incentive to reduce debts and deficits. Second, an internal transfer mechanism between the members of the pool must ensure that the less creditworthy countries compensate (at least partially) the more creditworthy ones. Third, a tight control mechanism on the progress of national governments in achieving sustainable debt levels must be an essential part of debt pooling.

The Northern view: some basics
The North holds that the mutualisation of the Eurozone's debt to bring about the convergence of interest rates will not, in the long run, tackle the root of the problem. Instead it has the potential to sow the seeds of an even larger crisis. This is what happened in the early years of the euro. A lack of discipline in countries such as Greece and Portugal was matched by the build-up of asset bubbles in other member countries, such as Spain and Ireland. Structural reforms were delayed, while wages outstripped productivity growth. The consequence was a huge loss of competitiveness at the periphery which cannot be resolved by the mutualisation of debt.

Debt mutualisation can take different forms. One is to mutualise new sovereign debt through Eurobonds. Another is to absorb part of the old debt, as advocated by the German Council of Economic Advisors, into a partly gold-backed European Redemption Fund. A third means is to activate the Eurozone's 'firewall' by using rescue funds (either the temporary European Financial Stability Facility or the permanent European Stability Mechanism) to buy sovereign bonds or to inject capital directly into distressed banks. Indeed, the ECB is already engaged in a hidden form of mutualisation—of risk if not (yet) of actual debt—through its programmes of sovereign bond purchases and its long-term refinancing operations for banks.

The view of the North is that almost all of these approaches are bound to fail, for economic or political reasons, or both. Even financially strong countries cannot agree to open-ended commitments that could endanger their own financial stability or, given that they are the main guarantors, the stability of the bailout funds. And the danger of moral hazard is ever-present.

Moreover, any form of debt mutualisation involves an element of subsidy, which severely weakens fiscal discipline: the interest rate premium on the bonds of fiscally weaker countries declines, while the premium for stronger countries increases. Fiscally solid countries are punished, and less solid ones are rewarded for their lack of fiscal discipline and excess private and public consumption.

If yields are too low, there is no incentive for private investors to buy sovereign bonds. The countries risk becoming permanently decoupled from the capital markets, and their debt problems become increasingly structural.

This is true also of the ECB's bond-buying activities. The credit risk is shifted from the weaker countries to the stronger ones, with the ECB assuming the liability on its own balance sheet. Over time, the ECB's measures might even prove inflationary. Having the rescue funds buy bonds is little different, except that they lack the lending capacity to be credible. If they were given a banking licence, as demanded by France's President Hollande, the effect would be no different from having the ECB buy bonds directly.

What about the European Redemption Fund (ERF) from the Northern perspective? This type of fund could be of particular help to Italy, which could unload about half of its debt into it. But Italy's partners could not force it to tax its citizens to ensure that the transferred debt is paid back. And with the assumption of debt, the credit rating of Germany might drop, owing to the increase in the German interest burden. The pressure on Italy and Spain to consolidate their budgets sustainably would be reduced. Meanwhile, the problems of Greece, Ireland and Portugal would not be solved, since these countries are unlikely to qualify for the ERF.

In addition to moral hazard, there are political obstacles, which would be most acute in the case of Eurobonds. Germany demands political union before Eurobonds can be considered. But it is sometimes said that this is putting the cart before the horse: a political union cannot be created simply to justify Eurobonds. Advocates from the Merkel government, like Finance Minister Wolfgang Schäuble, say treaty changes and high-level political agreements would be sufficient to make sure that euro area member countries comply with all decisions taken at the euro area level. This became clear when Schäuble came up with a plan to bolster the power of the EU’s economic and monetary affairs commissioner. Even Mario Draghi, President of the European Central Bank, has supported this German scheme to allow the EU to intervene in countries’ budgets and propose changes before they are agreed in parliaments. But the experience with Greece’s adjustment casts severe doubt on the practicality of such a proposal.

The differences between Eurozone members—on everything from respect for the rule of law to administrative capacity—are so great that political union is unlikely to work, at least in the next couple of years. It follows from the perspective of the North that the basis for Eurobonds is extremely thin.

According to the Northern or German view, the introduction of Eurobonds would in principle have to be backed by tight oversight of national fiscal and economic policies. But there is no true enforcement as long as the individual Eurozone members remain sovereign.

Intervening directly in the fiscal sovereignty of member states would require a functioning pan-European democratic legitimacy, but we are far from that. Voters in Southern countries can reject the strong conditionality demanded by Brussels at any time, while those of Northern countries can refuse to keep paying for the South. And either can choose to exit the Eurozone.

The emphasis on pushing through a fiscal union as a precondition for debt mutualisation means the debate, at least in Germany, has become a question of 'all or nothing': either deeper political union or deep chaos. This narrows the strategic options for the players and reinforces the North–South divide.

However, there is an alternative to cooperative fiscal federalism involving bailouts and debt mutualisation. This is competition-based fiscal federalism, of the sort successfully operating in the USA, Canada and Switzerland, among others. These countries have largely avoided serious and sustained public debt in their component states. Sub-federal entities faced with insolvency have the incentive to take early corrective action, without having to engage in centralized fiscal policy coordination. This could be a compromise between the Southern and Northern views.

To achieve this sort of federalism, it is necessary to separate the fate of the banks from that of the sovereign states. What is needed in the first instance is not a fiscal union but a banking union. It should be based on four elements: a European banking supervisor with far-reaching powers to intervene; reformed banking regulations with significantly higher equity capital standards; a banking resolution fund; and a European deposit insurance scheme.

A less comprehensive, more clearly delineated banking union should be more acceptable for the North than the Europeanization of fiscal policy as a whole. This is because it touches upon only a small fraction of the fiscal policy areas which have to be subordinated to central control in a fiscal union.

Obviously, a central resolution authority has to be endowed with the resources to wind up large cross-border banks. Where does the money for this come from? In the long run, the existence of a resolution authority goes along with a deposit insurance scheme for cross-border banks. This should—according to the German view—be funded partly by the banking industry.

With the banking system and the debt crisis thus disentangled, banking sector losses will no longer threaten to destroy the solvency of solid sovereign states such as Ireland and Spain. Eurobonds will then not be needed, and neither will the bailout of sovereign states. The debt of over-indebted states could be restructured, which means that the capital market could exert stronger discipline on borrowers.

Some questions are yet to be resolved. If the banking sector is really to be stabilized, a solution will surely have to deal with the devalued sovereign debt that some banks are holding. Would the banks not be better off holding at least some Eurobonds instead of, say, Greek or Spanish bonds? If so, Southern economists who advocate Eurobonds still need to find a way of making them politically acceptable. How much political union is feasible, or even desirable, just for the sake of a single currency that many never loved? And, critically, where does the burden end up?

Extracted from The New Palgrave Dictionary of Economics, Online Edition, edited by Steven N. Durlauf and Lawrence E. Blume; 2013; Palgrave Macmillan

From China with Love

For the Western democracies, the Chinese economic miracle is a mixed blessing. In his new book, Myths, Politicians and Money, Bryan Gould puts up some warning signals.

In the West, of course, dominated as we are by the Anglo-American model of capitalism, a close relationship between government and the private sector is regarded as anathema. The Western view has been that the best thing government can do for industry is to 'get off our backs'. Government intervention is almost invariably seen as unhelpful; second-guessing an infallible market, it is said, will always produce worse results than if it had been left to itself.

There can hardly be a starker contrast than with the approach followed by the Chinese government. To explore that contrast, and to ask the obvious questions, is not to endorse or commend all the Chinese have done and are doing. But it is surely prudent to recognize that the Chinese have achieved an economic performance that is already world-beating and is likely to overwhelm us and that they have done so while pursuing a very different political and ideological approach from our own. No dispassionate observer, comparing the West's recent history and immediate prospects with those of China, could possibly say that we have nothing to learn from the Chinese.

So how have the Chinese done it? In many respects, there is no mystery. A government that has virtually guaranteed stability and continuity is able to take a long strategic view. A government that sees little need to curry favour with voters or with particular interest groups has been free to pursue a single-minded objective—the economic development of the country. A government that can take decisions irrespective of the civil or property rights of individual citizens has been able to plan solely in accordance with those economic goals.

They have used that freedom of decision and action to be quite ruthless, and have accordingly attracted severe criticism from trade partners. A case in point has been their policy on the foreign exchange value of their currency. The Chinese renminbi is still not fully convertible and its value is accordingly established at the direction of the Chinese government. By pegging its value to the US dollar for a long period, they have been able to take advantage, in terms of the competitive pricing of their exports, of the fall in the dollar's value.

There can be little doubt that the renminbi is substantially undervalued and that this is a deliberate element in Chinese trade policy. The size and persistence of the Chinese trade surplus is incontrovertible evidence of that undervaluation. The situation is reminiscent of the German and Japanese trade surpluses before the Second World War, which Keynes and others correctly characterized as a powerful and aggressive assault on the economic power of the USA and the UK. Keynes was clear that the creditor countries were as much to blame as debtor countries for the trade imbalances that threatened world peace. The Chinese government has a well-developed strategic view regarding where the national economy can and should develop, and literally every economic actor in China is required to comply with that strategy.

The Chinese government takes full responsibility for macroeconomic policy. It determines monetary policy (principally interest rates) and controls exchange rates and capital flows in and out of the country. It relies greatly on fiscal policy, principally public spending levels and taxation, to control inflation and to target sustainable growth rates. It exercises close control over the banking system and directs it to create large volumes of credit which are then channelled into investment in new productive capacity.

This is all quite different from the attitude of Western governments. In line with the general antipathy to allowing or recognizing either the actuality or possibility that governments might be able to help strategically in identifying what is needed for economic success, macroeconomic policy is almost totally ignored in Western countries. What passes for macroeconomic policy is limited to delegating to unelected and therefore unaccountable bankers the responsibility for fixing interest rates as part of a narrowly focused emphasis on controlling inflation. Everything else is left to the market.

But if the Chinese are to achieve the living standards they want, comparable with those in the West, they are going to need access to a much larger share of the world's resources than they currently command. The resources they need now and will need even more in the future if they are to achieve their goals are in most cases finite—minerals and agricultural land, to name but two examples.

The best way to achieve that access and guarantee it into the future is to buy it now, while assets in most Western economies are relatively cheap and China itself is cash-rich. But Chinese leaders will calculate that it is not enough to sign trade deals or conclude contracts to buy the products they need. If their future development is to be guaranteed, they need control over and ownership of the means of production. In other words, China's goals can only be achieved at the expense of others. A greater share of the world's finite resources for China means a smaller share for others. Chinese outward foreign direct investment is rising fast, and will go on rising. Nearly 20 per cent of the US$227bn. total of Chinese outward FDI was made in 2009 alone, all the more remarkable in view of the overall fall in global FDI in that year. It is targeted at industrial capacity—particularly high-tech capacity—in the USA.

It is even more obviously focused on mineral resources in Australia, where Chinese investment has increased dramatically. The Australians have become increasingly wary of such investment. For example, a Chinese bid to gain control of the world's largest deposit of rare earths was blocked in 2009 by the Australian Foreign Investment Review Board. Rare earths are an essential element in much modern electronic communications technology and China already controls some 95 per cent of the world's rare earth production.

And it extends beyond Australia (where Chinese purchases of Australian farms have risen tenfold) even to a small economy such as New Zealand, where Chinese interest in food production has risen significantly. Some Chinese efforts to buy up not just dairy products but also dairy farms and production processes in New Zealand have aroused public anxiety.

The purchase of the farms was just one element in a total process which would take dairy production from Chinese-owned farms, process it in Chinese-owned factories in New Zealand and then transport it directly to be marketed to Chinese consumers. The farms would remain physically in New Zealand, and some local labour would be employed; but, to all intents and purposes, that element of New Zealand's dairy production would have been integrated into the Chinese economy. The farms might as well have been relocated to Zhejiang province.

Some firms are set up specifically to obtain overseas contracts as a means of extending Chinese influence, particularly in developing countries. The telecommunications and IT giant Huawei, which has set itself the goal of becoming a world leader in the field, has made huge strides in that direction. It is headed by a former senior officer in the People's Liberation Army, and it is reasonable to assume that it has close links to the senior echelons of the Chinese government. Concerns in the USA and Australia are such that Huawei has been excluded from participation in sensitive contracts for fear that national security might be compromised.

Chinese firms are often able to offer favourable prices and financing arrangements because their commercial operations are in effect guaranteed by cheap funding, which, in turn, is guaranteed by a virtually inexhaustible government purse. Chinese firms are already by far the biggest international infrastructure contractors, with strongly entrenched dominant positions throughout Africa and in Eastern Europe in particular. The Chinese government is able to combine these contractual arrangements with claims to be a significant aid donor to poor countries, particularly in regions such as the South Pacific.

The West should understand that a bid for a strategic asset by a Chinese firm may not be just a matter of a private firm taking advantage of a commercial opportunity. It may be part of a much wider picture in which the firm and its bid are to be seen as elements in a government-directed strategy to secure national goals. The economies of other, smaller countries could in effect be absorbed into the greater Chinese economy and be directed from Beijing.

None of this means that we should regard Chinese development as unalloyed bad news. Our concern should be to make sensible and prudent responses in our own interests while encouraging the Chinese to develop in a direction of mutual benefit rather than conflict.

The signs in this regard are not entirely discouraging. By contrast with the experience of the Great Depression, when American protectionism helped to drive the world economy into reverse, the Chinese economy has remained—through the current recession—dynamic, open and increasingly market-driven.

Moreover, the Chinese have their own reasons for changing course, at least to some extent. The Chinese economy is at present seriously unbalanced. Odd though it may seem to Westerners accustomed to concerns about a damaging emphasis on consumption rather than exports, the Chinese have the opposite problem: too much emphasis on exports and investment, too little on domestic consumption.

If the Chinese continue to refuse an appreciation of their currency, which would reduce their extreme and unfair competitiveness, they are likely to face an inflationary problem that will do the job for them. One way or another, we are likely to see a rebalancing of the Chinese economy, towards domestic consumption and away from exporting, over the coming years.

At the same time, the Chinese face a problem that is familiar in the West but takes a more virulent form in China. Western countries recognize that care for the elderly will become increasingly expensive in the future, as the elderly constitute a higher and higher proportion of the population; in China, that problem is greatly exacerbated by the impact of the 'one child per family' policy of recent decades. A smaller workforce in future years will have to shoulder the burden of looking after an increasingly long-lived older generation.

The future may not, in other words, be as clear cut as we assume, and we should not forget the example of Japan. I recall spending time in Japan in 1980, at a time when the Japanese economy looked very much like today's Chinese economy, albeit on a smaller scale. The air was thick with predictions that Japan would overtake the USA as the world's largest economy by the turn of the century. We now know that those predictions came to nought—and it may be that China, as it emerges from the rapid growth of its initial development phase, will also find the going increasingly tough. With Chinese wages and raw material costs rising fast, and major firms such as General Electric reviewing the comparative advantages of manufacturing in the USA rather than China, it may be that the era of easy growth for China will come to an end.

This suggests that we should concentrate on trying to ensure that China is drawn into global efforts to regulate the world economy, by reforming the international monetary system and dealing with trade imbalances, and to achieve environmentally sustainable development. We need to strike a balance between accepting China's legitimate claim to a fair share of the world's scarce natural resources and our right to manage our own affairs.

While China is the most successful and significant of the new Asian powerhouses, India, Korea, Taiwan, Singapore, and increasingly Malaysia, Indonesia and Thailand, have all begun a rapid transformation. In most cases, economic success has been achieved by building a powerful partnership between government and private enterprise—something that runs counter to most current Western opinion as to the best way to produce economic efficiency.

Brazil, Russia and South Africa are also making impressive progress, notwithstanding both actual and threatened global recession—and they, too, have depended less on finance from Western financial institutions, following Western prescriptions less slavishly than might have been expected twenty years ago. We can now see a number of regimes enjoying wide popular support for policies which give priority to sharing resources more equally, raising living standards for the poor, strengthening public services and defying the power of multinational corporations.

The successful pursuit of policies that are affronts to the basic tenets of neo-liberalism has done little to shift opinion among Western business and political leaders, who prefer to avert their gaze from the evidence before them. But the rise of successful economies and healthier societies which pay little attention to the supposed triumph of Western ideology is at the very least a profound shock to the Fukuyama prediction that history is now behind us.

Extracted from Myths, Politicians and Money by Bryan Gould; published by Palgrave Macmillan

How China Became Capitalist

Ronald Coase and Ning Wang examine China's embrace of capitalism.

When Mao Zedong, founder of the People's Republic of China, died on 9 Sept. 1976, China was in the midst of the Cultural Revolution, which was meant to rejuvenate socialism, ridding it of capitalist corruption and bureaucratic rigidity. Mao believed that China could shrug off poverty and jump on to the 'golden highway' to socialism if the Chinese people, united in thought and action, threw all their talents and energy behind the collective cause. Instead, Mao's deeply flawed ideology reduced enterprising people to lifeless cogs in the socialist machine.

So it was that China started its post-Mao journey with no roadmap and no destination in mind.

The need for reform was urgent but, since eradicating communism and starting afresh could not be contemplated, the policy was to adjust the existing system while learning from different models of capitalism.

Once the Chinese people were freed from the shackles of ideology, they were able to catch up quickly. The setting up of Special Economic Zones and the inflow of foreign direct investment quickly pulled millions out of poverty and raised living standards for a quarter of humanity. These remarkable outcomes have convinced other countries, including India and Vietnam, of the benevolence of the market and the folly of state planning.

Moreover, the Chinese market transformation has opened up new horizons for global capitalism. As a rising economic power, China is now contributing to the development of many countries in Central and Southeast Asia, Latin America and Africa, whose economies have been increasingly integrated with the Chinese market. The operation of a vibrant and distinctive market economy in China makes a compelling case that capitalism can take root and flourish in an ostensibly non-Western society. By breaking the West's monopoly on capitalism, China helps to globalize capitalism and fortifies the global market order by adding cultural diversity.

Drawing upon its rich and long traditions in commerce and private entrepreneurship, capitalism with Chinese characteristics will continue to strike out on its own way. But what exactly is capitalism with Chinese characteristics? Most commentators have focused on the visible hand of the Chinese government and the remaining monopoly power of the Chinese Communist Party as the defining features. While these are undeniably important, they do not hold the key to understanding capitalism in China.

First, the role of the Chinese state in the economy has become progressively less significant. Before the economic reforms, the Chinese people had little economic freedom and the state controlled every aspect of the economy, from production, to retail and even consumption. Today, private entrepreneurship is the primary driving force of the Chinese economy.

Second, the Chinese Communist Party today no longer identifies itself as a revolutionary vanguard. The 'mandate of heaven' has replaced communism; the party-state rests its legitimacy on effective governance and the improvement of living standards for the people.

There is a downside. While China's manufacturing sector now produces almost all types of consumer goods, Western consumers would be hard pressed to name any Chinese brands, even though their houses are full of products made in China. Short on innovation and lacking their own distinctive products, many Chinese firms depend on taking orders from overseas markets and selling their products under foreign brand names. This does not bode well for an economy aiming to top the world. As late as 2009, the United States manufactured more goods (US$1.7trn. in manufacturing value added) than China (US$1.3trn.). Despite a decades-long decline in employment (fewer than 12m. workers as of the second quarter of 2010), the US manufacturing sector still enjoys a significant lead in output over China, where manufacturing employs over 100m. workers. Moreover, given the large presence of foreign firms and joint ventures in China, the growth of domestic capacity in manufacturing in China is far less impressive than the name 'workshop of the world' might suggest.

The education system makes it all too clear that growth in quantity will not compensate for a lack of progress in quality. The fatal organizational flaw of Chinese universities is their lack of autonomy. The majority remain under the strict control of the Ministry of Education and, as a result, have become more skilled in currying favour with the ministry than in offering innovative research and educational programmes.

Chinese law and politics have also suffered severely from the lack of an active market for ideas. Although Chinese economic performance has surpassed the wildest expectations, progress in political reform has been disappointing. China is nowhere near the point where the constitutional rights of citizens are resolutely protected. The Chinese legal system is still far from being able to 'guarantee the equality of all people before the people's laws and deny anyone the privilege of being above the law'.

Without a forum to express their views, people of critical thinking and independent thought—the most valuable human assets in any society—find themselves labelled political dissidents. In turn, political dissidents are often deemed to be 'anti-Party' or 'anti-socialism', a charge that can end careers, if not lives.

The lack of a market for ideas is responsible for the lack of innovation in science and technology, the Achilles' heel in China's growing manufacturing sector. The dearth of innovation and remaining state monopolies gravely reduce the range of investment opportunities that Chinese entrepreneurs find profitable.

The post-Mao economic reform of the past few decades has transformed China's economy and society. At Mao's death in 1976, China was one of the poorest countries in the world, with a GDP per capita below US$200. By 2010, China was the world's second largest economy, with a GDP per capita of more than US$4,000. Over the same time span, China's share of the global economy rose from below two per cent to about nine per cent. Private entrepreneurship now thrives throughout the country and forms the backbone of the Chinese economy. With the world's largest population of internet and cell phone users and the largest car market, Chinese society is open, energetic, mobile and well informed, full of dynamism and aspiration.

But a vibrant market for ideas is an indispensable foundation for an open society and free economy. During the past decades of reform and opening up, the introduction of the market for goods has brought prosperity back to China and fortuitously led the country back to its own cultural roots. With the development of a market for ideas China will stand not only as a world manufacturing centre but as a lively source of creativity and innovation.

Extracted from How China Became Capitalist by Ronald Coase and Ning Wang; published by Palgrave Macmillan

Branding the City

Gjoko Muratovski examines the role of architecture and integrated design in branding a city, concluding that originality is the key to establishing a distinctive identity.

For thousands of years, architecture has been used to promote the power of the state. One has only to think of ancient Greece and imperial Rome to bring examples to mind. Closer to our own time, Napoleon embarked on a reconstruction of medieval Paris, a cause taken up by Napoleon III and Baron Haussmann to build the 'capital of capitals' to glorify the French empire. To flaunt the power of the Third Reich, Hitler visualized—though failed to realize—a mighty Berlin to dwarf Paris, while in Italy Mussolini resolved to make Rome greater than the Rome of Augustus. Red Square in Moscow was adapted to accommodate the grand ceremonial of the May Day military parades. Likewise, in the United States, George Washington commissioned the French architect, Pierre Charles L'Enfant, to design Washington, D.C. as a model for American city planning and as a symbol of world power. Other more recent American attempts at architectural propaganda include the Lincoln Memorial, the Jefferson Memorial and the Vietnam Veterans Memorial.

Every world-renowned city has benefited from the construction of landmark monuments including London (the Tower of London), Paris (the Eiffel Tower), New York (the Statue of Liberty) and Rio de Janeiro (Christ the Redeemer). Canberra and Brasilia, both planned twentieth-century cities, were artificially created around governing bodies and national institutions.

More recently, architectural propaganda has evolved into branding, acting as a billboard to convey its message by the choice of style, material, technology or historical reference. For example, the iconic works of Charles Rennie Mackintosh in Glasgow and Antoni Gaudi in Barcelona play a central role in marketing those cities. Urban centres that are defined by their rich architectural and cultural heritage are seen as unique, attractive and lively. They need good architecture and design to develop their aesthetic and innovative values and to meet business and public needs. Architecture can contribute to overall well-being and can function as a source of civic pride.

Cities continue to use architecture to promote their image. Frank Gehry's Guggenheim Museum in Bilbao is a prime example. While it does not function particularly well as an exhibition space, it is an exceptionally effective marketing tool for Bilbao.

The 'Bilbao effect' inspired Dubai, Abu Dhabi and Qatar, all of which have ventured into hugely ambitious architectural projects to raise their profile from desert communities to urban oases. As oil revenues decline, they have turned towards real estate and tourism. Dubai has set a target of attracting 15 million visitors to some of the world's most ambitious architecture, including the world's tallest skyscraper, the first luxury underwater hotel and an artificial archipelago of residential islands. With its aggressive pursuit of bigger and better, Dubai now rivals Las Vegas as the leading desert city.

The economically stronger Abu Dhabi has been quick to establish its own vision of the future. Over the next decade, it aims to become one of the greatest cultural centres in the Middle East. The 'latter-day Xanadu', as the New York Times dubs it, will boast four museums, a performing arts centre and 19 arts pavilions designed by the likes of Frank Gehry, Zaha Hadid, Tadao Ando and Jean Nouvel. The plans include franchises of the Guggenheim and Louvre as well as an arts institute created by Yale University.

With its successful bid for the 2022 World Cup, Qatar has unveiled its own extravagant urban aspirations. With the help of Foster & Partners (led by British architect Lord Norman Foster) and Albert Speer & Partners (headed by the son of Hitler's chief architect), Qatar has vowed to build 12 carbon-neutral stadiums that can be disassembled and shipped to other locations. There is also a plan to build seven solar-powered satellite cities.

This is all hugely impressive but there are drawbacks. One strong possibility is that these copycat 'reimagining' strategies could defeat the whole purpose of branding. This has already been recognised in the use of overly similar logos, visuals and slogans. Now we see the same thing with architecture. Cities hire the same 'brand-name' architects who produce the same signature buildings. This, in turn, results in a uniform, amorphous city image with no distinctive sense of place. The renowned Dutch architect and theorist, Rem Koolhaas, who is himself part of the Middle East renaissance with his master plan for a Waterfront City in Dubai, fears that the growing use of high-end architecture as a promotional tool will reduce cities to homogeneous architectural theme parks.

If architecture is to promote cultural values that respect heterogeneity, it must align city marketing with social and economic realities. Architects and developers need to adopt innovative models that acknowledge and build on a firm cultural and social foundation.

Extracted from Place Branding and Public Diplomacy, edited by Gjoko Muratovski. Volume 8, Number 3, August 2012; Palgrave Macmillan



Opening Windows of Opportunity

Nicholas J. Cull assesses the Hillary Clinton effect on US Public Diplomacy.

American diplomacy is highly susceptible to changes in its leadership; other Western states are not nearly so volatile. Whatever the quality of diplomats in the field, in the United States the scramble for resources from the legislature, the battle to be heard at the policy-making table and the ability to corral one's own bureaucracy or manage an ever-tricky inter-agency process all rest on the personality at the top.

In the old days of the United States Information Agency (USIA)—the one-stop shop for American public diplomacy from 1953 to 1999—fortunes rose and fell with the choice of leader. The last golden age of that agency rested on Charles Z. Wick and, more particularly, on his friendship with President Reagan. Equally, USIA's decline can be tracked to the leadership problems of his successors. In the years since USIA was taken into the Department of State, the leadership of American public diplomacy has flowed from the Secretary of State and from whoever is Under Secretary for Public Diplomacy. Hence, the departure of any Secretary of State is a good moment to take stock. When that Secretary has the global public profile of Hillary Clinton, the argument for so doing is overwhelming.

It is hard to overestimate the public diplomacy problems inherited by Hillary Clinton when she took office in 2009. The Bush administration's approach to foreign policy, and most especially its war in Iraq, had alienated much of the world. While the White House soon learned to pay attention to public diplomacy, US diplomats faced an uphill battle rebuilding an effective apparatus out of the mess left by the merger of the USIA with the State Department. Rapid turnover and long gaps between the appointment of Under Secretaries raised as many problems as the relative merits of those who served. The best of Bush's Under Secretaries—James K. Glassman—was in office for only six months. The Secretaries of State left their own stamp. Colin Powell led a charge towards digitization, finally pushing old-school technophobes to embrace websites, email and personal data devices. Powell's successor, Condoleezza Rice, touted what she termed transformational diplomacy, which amounted to a deployment of resources away from Europe to the Middle East. Judging by the scant attention given to public diplomacy in her memoirs, the subject was hardly a preoccupation. Hillary Clinton could not afford any such luxury.

The international agenda for the Obama administration was set in the President's inaugural address in which he pledged to 'extend a hand' to enemy governments 'if you are willing to unclench your fist' and promised the people of the Muslim world to 'seek a new way forward, based on mutual interest and respect'. The departure of the Bush administration gave the image of the United States an immediate boost: the country bounced to the head of the Anholt-GfK Nation Brands Index in one of the very few dramatic movements yet seen in that surprisingly stable run of data. Hillary Clinton's job was to maintain the momentum and to deliver on the President's pledge of outreach. She embarked on a frenetic round of overseas visits, including meetings with the ordinary people of America's target countries. Her celebrity profile gave her a head start but it was not always easy. She became the focus for popular indignation at aspects of US policy. In Pakistan in 2011 she bore the brunt of public anger over civilian deaths from drone strikes. But under her guidance the US image changed to that of a country prepared to, in the preferred terminology of the moment, 'engage' international opinion. Clinton became the public face of 'Smart Power', a concept that blended hard power (military and economic) with the soft power of American values and culture.

Clinton's key lieutenants were her Under Secretaries of State for Public Diplomacy: first the TV executive and campaign contributor, Judith McHale, and then the former journalist and veteran of the Bill Clinton National Security Council staff, Tara Sonenshine.¹ McHale improved the administrative structure created in 1999, building a solid foundation for the future practice of US public diplomacy. Her effectiveness, however, was disputed within the Department and few mourned her early departure. Sonenshine, who took office in early 2012, has worked to re-energize diplomacy in the field and has maintained an impressive speaking schedule to explain the important role that public diplomacy plays in US foreign policy.

While the Under Secretary's direct province is limited to International Information Programs and Educational and Cultural Affairs, public diplomacy has to involve everyone in the Department and in the wider bureaucracy. Hillary Clinton's allies to this end included Robert Gates, Secretary of Defense for most of her tenure. Since the end of the Cold War, and more especially the attacks of 11 September 2001, the Department of Defense had become a central player in the projection of America's image. Gates saw the danger of this and went so far as to argue that resources be diverted away from his own department to strengthen the State Department's diplomatic capacity, including its approach to public diplomacy. Within the wider bureaucracy Hillary Clinton built on the precedent established during the tenure of Under Secretary Karen Hughes (2005–07) to maintain the Department's civilian leadership in a number of important cross-government initiatives, including inter-agency counter-radicalization. This effort has evolved into the Center for Strategic Counterterrorism Communications (CSCC), located at State, which pools resources from State, Defense and the CIA to push back against Islamic extremism over new media. Under the leadership of Ambassador Richard LeBaron, the CSCC wisely de-coupled its work from any attempt to sell the United States in order to focus on connecting the potential targets of radicalization with online materials that undermine extremist claims and keep open other political approaches.

Clinton's public diplomacy initiatives have highlighted gender and women's issues, with the creation of the post of Ambassador-at-Large for Global Women's Issues. In the days immediately before Secretary Clinton's departure from office this post and its associated supporting office became permanent. No less significant, Clinton gave close attention to partnerships with non-governmental and corporate sectors. The threads of this work were drawn together in the Global Partnership Initiative, led by a Special Representative for Global Partnerships, Kris Balderston. This unit built on Bush-era partnership activity, such as the President’s Emergency Plan for AIDS Relief (PEPFAR). A global campaign for clean cooking stoves was among projects that made a real difference to millions of lives around the world. By 2012 the department had 800 partners including the Chinese and Indian governments. But while Clinton placed great emphasis on issues of gender and partnership it was a third area which would become most closely associated with her term in office: new technology.

From the outset Clinton emphasized what her innovation advisor Alec Ross touted as 'Twenty-First Century Statecraft'. This had two prongs. The first was a diplomatic emphasis on the need for open internet connectivity and free international exchange online. The second was an attempt to integrate new technology into the practice of diplomacy. Within the State Department, initiatives which had been small-scale and experimental during the final months of the Bush era now became large-scale. Most embassies acquired Twitter feeds and the Bureau of International Information Programs launched Facebook platforms dedicated to 'getting the message out'.

New technology can create communities of interest that transcend geographical space and challenge the primacy of states. Perhaps future generations will look back and see online public diplomacy not as 'Twenty-First Century Statecraft' but as post-state craft. But even now the State Department’s digital outreach often misses the opportunities for a two-way flow of information. If the United States is to flourish in the digital realm it must allow its diplomats in the field to build on their main strength, which is to create open-ended relationships.

The final months of Hillary Clinton's tenure were overshadowed by the deaths of Ambassador Christopher Stevens and members of his team in Benghazi, Libya, in September 2012. In the overheated atmosphere of a Presidential election campaign the Benghazi killings became a political football. Secretary Clinton's ability to stay calm prevented the affair from taking off in quite the way the opposition hoped. But the Benghazi tragedy may lead to demands to move America's diplomats back from forward positions like the Benghazi consulate to the relative safety of embassy compounds. This would undermine the public diplomacy successes of the Hillary Clinton era.

Whatever the strengths of new technology and the value of a charismatic leader at the helm, public diplomacy will always rest on the work of Foreign Service Officers in the field working to build relationships across what Edward R. Murrow termed 'the last three feet' separating one individual from another. Hillary Clinton embodied this lesson and led the way by example.

Hillary Clinton's departure provoked the inevitable flurry of media speculation on whether she intends to run for president in 2016. Her contribution to the field of public diplomacy as Secretary of State augurs well. Plainly, were she to attain that office it would be no bad thing for America's conversation with the world, on which so much of our shared wellbeing depends.

¹ Subsequent to the time of writing, the White House announced that Under Secretary Tara Sonenshine would leave office in June 2013, apparently as part of an effort to open vacancies to reward contributors to the 2012 election campaign. The decision was not applauded by the public diplomacy community and is a sad testament to the degree to which the cost of US elections corrodes wider aspects of US statecraft.



The Iron Brand

Nicholas J. Cull assesses the impact of Margaret Thatcher as an international communicator.

Margaret Thatcher's tenure as Prime Minister was a milestone in the evolution of political communication in Britain. Unlike her predecessors, she was closely tutored to make the best of television. She worked closely with leading lights in the British advertising industry to develop a string of potent messages for the 1979 general election. Her press secretary was an essential member of her team. Her advisers believed in the necessity of controlling the message, and this was never clearer than during the 1982 Falklands/Malvinas war, when the British government was able to limit reporting and thereby establish a model of media control. The United States adopted the model for its operations in Grenada, Panama and the first Gulf War. Thatcher's government worked equally hard to control the news from Northern Ireland. Thatcher understood terrorism as a form of communication and (borrowing a metaphor from the Chief Rabbi of the day) spoke of the need to deny the terrorist the 'oxygen of publicity'. The policy involved heavy-handed treatment of the British media, including a ban on broadcasting the voices of terrorists on British airwaves. News organizations were obliged to hire actors to quote paramilitary spokesmen.

Thatcher was a powerful voice for Britain. Her famous tag of 'Iron Lady' was in itself a testament to her international impact, having been coined by a Soviet military newspaper in January 1976 in response to a confrontational speech. She immediately embraced the soubriquet as a badge of honour. Her abrasive style was not always welcome, particularly in the European Community, where she was inclined to lecture her opposite numbers. Yet she raised the profile of her country in world affairs. As Simon Anholt has put it, she, like Churchill before her and Blair after her, 'paid the rent' on the UK's international profile. Margaret Thatcher was impossible to ignore. She certainly made an impact on the newly democratic countries of Eastern Europe. For them she was the voice of freedom, an essential element in the final phase of the West's ideological assault on the Communist Bloc. Her rhetorical legacy in the other great democratic battles of the era is less clear. South Americans recall her support for the Pinochet regime in Chile. South Africans remember her opposition to the application of sanctions against the Apartheid regime and her jibes against the 'terrorists' of the African National Congress.

Her wider contribution to British public diplomacy is mixed. While Thatcher's American analogue, Ronald Reagan, ushered in something of a golden age in American public diplomacy—expanding the budget for international engagement, boosting exchanges, modernizing Voice of America and launching multiple initiatives to promote democracy—the same cannot be said of Thatcher. The Foreign Office launched its Chevening Scholarship programme on her watch (1983) but there were few other major initiatives. The Westminster Foundation for Democracy, which built on Thatcher's commitment to democratization around the world, did not appear until 1993. It was an era of soldiering on.

The British Council spent a decade scrambling for resources. The BBC World Service was sucked into the Thatcher government's cuts: the White House archives from the time of President Jimmy Carter include an appeal from the BBC, routed via the US embassy in London, begging the US to press the new Prime Minister not to cut the World Service budget by 30 per cent. There were some backward steps in the fields of dialogue and cultural exchange. The Thatcher period saw Britain's withdrawal from UNESCO in protest against alleged mismanagement. It was a low point in the UK's international cultural relations.

Thatcher's public diplomacy was strongest in her defiance of Soviet hegemony in Eastern Europe. Star turns included the BBC Russian service's DJ, Seva Novgorodtsev. Another notable campaign was directed at the United States to contest the negative publicity around the Northern Ireland conflict. Tactics included upgrading the press relations apparatus originally established in the Second World War to ensure that the major consulates across the USA had resident experts on Northern Ireland. These spokesmen were regularly rotated through visits to the province and were hence equipped with the authority of recent first-hand experience. In concert, the Thatcher government channelled a flood of material unrelated to Northern Ireland across the Atlantic—making excellent use of the new British Council office in Washington DC, royal visits and the blockbuster show 'Treasure Houses of Britain', which displayed masterpieces from British country houses. The effort succeeded in preventing Britain from being branded by the Troubles. In retrospect this was probably a mixed blessing. Britain might have done better to listen to international concern over the Troubles rather than drowning it out. The day was to come when the UK had to negotiate with terrorists. Meanwhile, a further round of violence kept the issues alive.

While the Thatcher period did not revolutionize representation of Britain it is possible to see the Blair-era interest in Britain's image as an extension of the Thatcher era. The 'New' Labour Party with its fixation on the power and priority of image had learned much from Thatcher. Yet just as Thatcher played a part in shaping public diplomacy, public diplomacy was critical to the image of Margaret Thatcher.

At the beginning of her career Margaret Thatcher was, by her own account, profoundly shaped by public diplomacy: that of the United States. In 1967 she was a rising star of the Conservative opposition backbenches. Although she was getting noticed for her spirited forays into financial affairs, she had no experience in international matters and displayed no urgency to acquire any. Her approach to conservatism was patriotic and parochial. At this point the public diplomacy team at the US embassy selected her for a six-week 'international visitor program' tour to meet US opinion leaders. Her visit included a meeting with White House adviser Walt Rostow and a visit to NASA's Mission Control in Houston. Looking back on the experience in 1995, she commented: 'the excitement which I felt has never really subsided. At each stopover I was met and accommodated by friendly, open, generous people who took me into their homes and lives and showed me their cities and townships with evident pride'. It was a personal revelation and the beginning of an ideological convergence between her take on British conservatism and a reinvigorated American conservatism. Her view of America certainly stood in contrast to that of her predecessor as leader of the Conservative Party, the Europhile Edward Heath, who had been an Atlantic sceptic since his own first visit to the US as a student debater in 1938. The result was that when the American swing to the right reached the White House in the form of Ronald Reagan, its British equivalent was already in place and open to collaboration. It was a handsome payoff for a tiny investment back in 1967, and one which US diplomats have not been slow to trumpet in the years since.

What then is the bottom line? That Thatcher shaped and was shaped by public diplomacy is a testament to the significance of both. She made the image of Britain overseas as surely as last year's Diamond Jubilee or the London Olympics. Not everybody approves of Margaret Thatcher, and there has been a fierce debate over her legacy. But this has served to remind the world that Britain remains a country in which people expect to have a say, however controversial the subject. Britain's first woman prime minister would have approved.


Us Before Me

Poverty is a global problem of enormous magnitude. In her latest book Us Before Me, Patricia Illingworth explores the scope of the problem and the role of the individual in overcoming it.

For more information about Us Before Me and other titles in the Philosophy list visit Palgrave Macmillan.

Poverty is a global problem of enormous magnitude. According to the World Bank, 1·4 billion people live on $1·25 or less a day. Over 1 billion people in developing countries have inadequate access to water and 2·6 billion lack basic sanitation. Not only is there extreme poverty worldwide, there is also vast inequality. The wealthiest 20 per cent of the world's population control 76 per cent of the world's goods, leaving the other 80 per cent to share what remains.

There is global homelessness as well. There are over 26 million internally displaced persons living in about 52 countries across the globe. In addition, there are over 10 million refugees. In a world of plenty, we do not care enough about other people to provide for their basic needs, despite the existence of human rights to food, shelter and medicines.

Sometimes we don't help others because we think it is morally wrong to do so. Some people believe they shouldn't help because it is not their responsibility to help; it is the responsibility of another person. Or they may believe that people should take personal responsibility for their own misery. Some people, such as those with 'donor fatigue', are weary of helping. But very often people don't help others because they are indifferent to the suffering and misery of other people or are simply preoccupied with their own affairs. They don't care.

It would be wrong to suggest that people never help others. People are especially attentive to their families, friends and—sometimes—communities. In some jurisdictions, people are required by law to help those with whom they have a close relationship, such as a child or spouse. People come to the aid of others when there is a highly publicized disaster. But given the persistence of severe poverty, there is much more that people could and should do. People need to care more about others. Unfortunately, care and concern for others is not the kind of emotion one can produce at will, or upon command.

Caring behaviour seems to arise among people who interact with one another. To build concern for others, the norm of self-interest needs to be counteracted with a new norm that shifts the focus from 'me' to 'us'. As the normative world now stands—in the West, at least—liberty, autonomy, self-determination, privacy, self-interest and sovereignty come together to shield people from the harm they cause others, directly and through their institutions, making them morally complacent and indifferent.

Pro-social norms are needed to transform the current culture of self-interest and indifference to one committed to helping others. Helping others, whether it is shovelling an elderly neighbour’s snow, donating money to Oxfam, inventing drugs for neglected diseases or working with AIDS patients in Africa, needs additional support from ethics. With a moral commitment to build social capital people may come to be more fully engaged in local and global communities. They may be less tempted to analyze their generosity in terms of self-interest, but might instead celebrate it as an example of their desire to help the less fortunate and build strong communities. Additionally, if the data of social psychology is accurate, they will probably also be happier.

People are drawn to those whom they perceive to be similar to themselves. People may feel safe with like people. Trust is strengthened when there are frequent face-to-face interactions. It is also not surprising that social capital flourishes in homogeneous communities. People gravitate to communities where there are like people. Proximity helps here. But surely with rapid transit and communication, it is now possible to encourage face-to-face interactions among people from many places.

The potential to forge a new 'global' identity may be aided by changing demographics. There is growing evidence that racial, national and cultural classifications are changing, and that a mixed race identity is far more common than in the past. In the United States, for example, the number of Americans who count themselves as mixed race has grown dramatically owing to increased immigration and intermarriage over the last two decades. At present, one in seven marriages in the United States is either interracial or interethnic. Those with mixed backgrounds have a 'fluid sense of identity', and reject the idea that they must choose one racial identity over another.

Both international personal relationships and international business relationships create and depend on global social capital. Individuals have international friendships and family and may belong to international organizations. International business networks are widespread and growing. They include not only transnational enterprises but also contractual and commercial networks. Global governance organizations such as the United Nations, World Bank, International Monetary Fund, General Agreement on Tariffs and Trade, and World Trade Organization all create and rely on extensive global social capital, as do the countless NGOs that exist worldwide. Most of these networks involve face-to-face interactions, but internet networks and email are often used to sustain social capital in transnational interactions.

Social capital's global dimension is enormously important. Today, more than at any other time in history, international relations have the potential for great benefit, especially for the most vulnerable people in the world.

With a Little Help from the Law

For the most part, laws that foster generalized reciprocity will be helpful in sustaining social capital. When law encourages people to act for the sake of others, it nurtures social capital. When it encourages them to act for people who may not reciprocate, it nurtures generalized reciprocity. When the law encourages inclusion rather than exclusion, integration rather than segregation, it creates the social structures necessary for social capital.

Because international law is based on negotiation, cooperation and consent, it relies on transnational social capital and constitutes an opportunity to create social capital. Nations must interact with one another in order to come to agreements. To this end, they build countless networks. The need for agreements, which will support treaties, creates networks and trust as different countries collaborate to find mutually agreeable arrangements. Negotiation involves the kind of give and take that creates trust.

This process of cooperation is strongest in multilateral agreements. Prior to the General Agreement on Tariffs and Trade, bilateral agreements, which invite power exchanges to win privileges, were dominant. They often led to conflict among trading partners. In contrast, a multilateral approach, combined with Unconditional Most Favored Nation status, as adopted by the World Trade Organization, treats all trading partners alike and invites cooperation. In an ideal world, if we want to use global giving as a way to cultivate global social capital, we would encourage people to build networks with people from other countries and perhaps cultivate a more expansive identity, so that they see themselves as members of a wider world.

The more people travel, study, work and marry abroad, the more they are likely to want to contribute directly to distant countries and to control where their money goes. As they learn more about the world, and experience some of its deprivations as deprivations of their own global community, their desire to give globally is likely to grow, and with it social trust and global social capital.

It is difficult for human beings to watch people suffer. Shifting our moral focus to us brings people who were previously invisible, whether the distant poor or the local homeless, within our range of vision, making it easier to treat them morally. It is much harder to be indifferent to the suffering of those with whom we are connected than those with whom we have no connection. A moral duty to promote social capital underscores the moral value of connection to others. Being mindful of the moral obligation to create global social capital will be an important step toward realizing our global obligations and creating global social justice.

Extracted from Us Before Me: Ethics and Social Capital for Global Well-Being by Patricia Illingworth; published by Palgrave Macmillan.


Winning the Global War Against Corruption

Corruption in government and business is widespread. But the tide is turning. Nick Kochan and Robin Goodyear explain what is being done to enforce minimum standards in their book Corruption.

For more information about Corruption and other titles in the Business and Management list visit Palgrave Macmillan.

As a proportion of world trade, the value of bribes is phenomenal. The World Bank estimates that more than US$1trn. in bribes is paid each year, amounting to 3 per cent of the world's economy.
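As a rough back-of-envelope check (a sketch only, assuming the 3 per cent share is taken against gross world product), these two figures together imply a world economy of about:

\[ \frac{\text{US\$1trn.}}{0.03} \approx \text{US\$33trn.} \]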

High-profile cases involving major companies have turned the spotlight on the extent to which bribery distorts global markets and destroys communities. With campaigning groups voicing demands for greater transparency, the corporate world is under growing pressure to adopt anti-corruption practices.

Transparency International (TI) is the foremost activist group in the field. This non-governmental organization (NGO) has chapters in over 90 countries. Founded in Berlin in 1993 by Peter Eigen, a former regional director of the World Bank, TI campaigns to promote transparency in elections, public administration, procurement and business, while lobbying governments to bring about change. The organization also develops and distributes practical guidance to help businesses operate ethically.

Its biggest achievement so far has been to raise awareness in governments, business and the general population—in its own words, to 'challenge the inevitability of corruption'. TI is best known for its annual Corruption Perceptions Index, a worldwide survey that ranks countries according to the perceived level of corruption in public life. TI also publishes a Bribe Payers Index and, since September 2010, a Global Corruption Barometer, the only worldwide public opinion survey on corruption. TI has been directly involved in international anti-corruption agreements; it played a major part in establishing the Organization for Economic Cooperation and Development (OECD)'s Anti-Bribery Convention and in drafting the United Nations Convention against Corruption and the African Union Convention on Preventing and Combating Corruption.

Also established in 1993, Global Witness (GW) is an NGO based in London and Washington, D.C. It is an environmental group concerned with detecting and exposing the corruption and social problems associated with the exploitation of natural resources. The group campaigns for financial transparency as a way to expose corrupt relationships.

According to GW, in 2008 exports of oil and minerals from Africa were worth roughly £242bn. (US$393bn.)—over ten times the value of exported farm products (£23bn./US$38bn.) and nearly nine times the value of international aid (£27bn./US$44bn.). If used constructively this wealth could lift millions of people out of poverty. However, the main benefits of resource extraction are diverted by political, military and business elites in producer countries, and oil, mining, timber and other companies overseas.

The growing importance of transparency in government and business is illustrated by Publish What You Pay, a global network of over 600 civil organizations that lobby for disclosure in the extractive industries. Transparency is also a principal concern for the Center for Public Integrity, an independent investigative organization whose mission is concerned with the interaction between private interests and government officials and its effect on public policy.

The internet has provided a new way for people to tell their stories of corrupt practices. Websites such as www.ipaidabribe.com expose corruption at the grassroots level while at the same time providing a forum for discussion on how best to counter demands for bribes. Public resentment of corruption is expressed through social initiatives such as Bribebusters in India, currently piloted by lawyer and entrepreneur Shaffi Mather. Individuals and companies can hire Bribebusters, for a fee, to act on their behalf. Mather is particularly keen to combat individual demands for bribes, such as street vendors paying officials or drivers stopped for traffic offences paying police officers: 'It might be a small amount that each individual has to pay, but ... there are multiple studies that estimate the bribes paid in 18 common services such as electricity, water and civic services ... are around $4·5bn. And we're not talking about the big scams or scandals, these are just bribes paid by the common man in his daily life.'

Local schemes are complemented by organizations representing global business such as the International Chamber of Commerce, which has adopted a programme to combat bribery.

DOMESTIC RESPONSES WITH INTERNATIONAL IMPLICATIONS

Two pieces of legislation with international implications are the 1977 US Foreign Corrupt Practices Act (FCPA) and the 2010 UK Bribery Act.

These laws are feared by businesses because of their extraterritorial jurisdiction and potential for aggressive interpretation. The US FCPA, for example, though restricted to US citizens and nationals or those whose principal place of business is within the United States, is expanding to bring in more corporates. The UK Bribery Act has, potentially, an even wider reach; a Serious Fraud Office official explained that, 'In practice, a company registered anywhere in the world and having a part of its business in the UK could be prosecuted for failing to prevent bribery on its behalf wherever in the world that bribe is paid.' Though recent guidance has excluded from the reach of the Bribery Act foreign companies whose only business in the United Kingdom is a stock exchange listing, businesses are taking the Act seriously because it marks a trend in corruption legislation by introducing a new corporate responsibility. Previously, 'knowledge' (however narrowly or widely interpreted) had been a key component in determining corporate liability for bribery. Robert Amaee, former head of the Serious Fraud Office (SFO)'s anti-corruption team, explains how the UK Bribery Act changes the rules of the game:

The new Act sweeps away this requirement and introduces a new corporate offence of failing to prevent bribery. This is a novel concept under English law and one which we are likely to see more of in the years to come. It makes a commercial organization criminally liable if one of its employees, agents or subsidiaries bribes another person, intending to obtain or retain business or an advantage in the conduct of business for the company.

The law firm McDermott Will & Emery suggests that national legislation such as the UK Bribery Act will drive up standards in company policy. The Bribery Act's broad scope, harsh penalties and narrow defences may be seen as a harbinger of other minimum standards for businesses.

TRANSNATIONAL INITIATIVES

Fighting corruption is often on the agenda at summits, for example at the G8 and G20. The thrust of the argument is that corruption is a global evil and that countries need to coordinate their efforts. Links between terrorist financing and money laundering have accelerated transnational attempts to address a whole range of financial misconduct.

In the EU Procurement Directive of 31 March 2004, Article 45 regulates the supply of goods and services to government bodies by companies and individuals. The Directive requires EU member states, for the first time, to exclude companies and individuals convicted of corruption from being awarded public procurement contracts.

Governments in other regions have also pledged to crack down on bribery. Twenty-five countries have now adopted the 1996 Inter-American Convention Against Corruption, which criminalizes the paying or accepting of bribes by public officials. The Association of Southeast Asian Nations (ASEAN) is also taking increased interest in tackling corruption, particularly in relation to 'gifts' to public officials.

In Africa, the New Partnership for Africa's Development (NEPAD) has started to focus more closely on corruption and how it adversely affects poverty relief. The 29 countries that signed its memorandum of understanding are committed to 'just, honest, transparent, accountable and participatory government and probity in public life'. Recognition in African governments of the impact of corrupt politicians is illustrated by the transnational African Parliamentarians' Network Against Corruption (APNAC), which works to 'strengthen parliamentary capacity to fight corruption and promote good governance'. In Dec. 2001, the 15 members of the Economic Community of West African States (ECOWAS) signed the Protocol on the Fight against Corruption, which requires all signatories to criminalize the paying and receiving of bribes, and provides an international cooperation framework to improve mutual law enforcement and facilitate asset confiscation.

The OECD has been increasingly active on corruption and bribery, particularly since 1997 when an OECD Convention established 'legally binding standards to criminalize bribery of foreign public officials in international business transactions', along with monitoring and enforcement measures to make it effective. It is the first and only international anti-corruption agreement to focus on the 'supply side' of the bribery transaction. While it has no direct powers of enforcement, the 38 signatories to the Convention commit themselves to enforcing this expression of intent through domestic legislation.

The OECD Working Group on Bribery in International Business Transactions is responsible for monitoring the implementation and enforcement of the 1997 Convention, in addition to the more recent 'Recommendation for Further Combating Bribery of Foreign Public Officials in International Business Transactions' and other related instruments. The OECD's country reports detailing the current state of anti-bribery measures, and data collected from participating countries, are available on its website (www.oecd.org).

The OECD has also played a role in attempting to improve the anti-corruption regimes of specific geographic regions. To date, 28 countries have endorsed a joint plan by the Asian Development Bank/OECD Anti-Corruption Initiative for Asia and the Pacific, which aims to set minimum standards and safeguards to prevent corruption and bribery.

The growing consensus that corruption is a threat to development is evident in other policy initiatives at the highest level. After Resolution 55/61 in 2000, the United Nations Convention Against Corruption entered into force in Dec. 2005 with 145 signatories, stressing the importance of the principles of prevention, criminalization, international cooperation and asset recovery in countering the threat of corruption.

THE UN GLOBAL COMPACT

Corporate social responsibility (CSR) initiatives reveal commitment from major companies to combat corruption. However, not all the principles lauded by the United Nations have received equal attention. This is changing rapidly. The UN Global Compact, a strategic policy initiative comprising 5,300 businesses and six UN agencies, describes itself as the 'world's largest corporate citizenship and sustainability initiative'. The network aims to embed ten universal principles in business philosophy and everyday operation, including ‘action against corruption in all its forms’.

BUSINESS ATTITUDES TO BRIBERY

Some Western companies dismiss corrupt practices as intrinsic to the culture of certain countries, especially in the developing world. Bribery is seen as a necessary evil or as an informal tax on operating in certain regions. Faced with deliberate stalling by corrupt officials or simply with inefficient public administrations, corporations might turn a blind eye to the activities of their agents or subsidiaries in an effort to speed up transactions.

Despite an increase in the number of companies striving to eliminate corruption, the necessary cultural shift has not yet occurred. Thus the consequences of the UK Bribery Act might come as a shock to many boards. For example, when the Financial Services Authority (FSA) investigated 17 wholesale insurance intermediaries operating in the London market it found that they have:

approached this area of their business far too informally (especially higher-risk business) and that, at present, many firms would not be able to demonstrate that they have in place adequate procedures to prevent bribery.

FSA concerns are echoed by other research. Over a third of respondents to a TI survey in 2010 thought that City of London businesses perceive bribery as standard procedure in some environments, and a survey by a City law firm showed that 20 per cent of City businesses had no policy in place to address corrupt practices. These findings are not unique to British companies. All businesses that operate in a competitive environment may have an interest in tacitly condoning bribery in pursuit of profit.

CULTURAL ATTITUDES TO BRIBERY: IT'S JUST HOW THINGS ARE DONE

Some forms of the Chinese concept of 'guanxi', defined as ongoing 'relationships between or among individuals creating obligations for the continued exchange of favours', are closely associated with bribery and corrupt public officials; in other countries, this may be known as 'blat', 'bakshish' or 'relationship marketing'. In light of the extraterritorial jurisdiction of the UK Bribery Act, much effort has gone into determining the extent to which 'one culture's favour is another's bribe'.

Lord Woolf dismissed as a thin excuse 'made for not doing things that you know you should' the claim that robust anti-corruption procedures hand an advantage to less scrupulous competitors. Yet there is evidence to suggest that (in the short term at least) the claim is not unfounded. It is estimated that from 1977 to 2004 American companies lost 400 major contracts because of bribes given by competitors to foreign government officials. From a superficial perspective, bribery can be perceived as a good thing, maintaining market dominance and increasing efficiency for a particular company. But there are strong reasons against turning a blind eye to corruption, however it is dressed up.

Research tends to focus on the economic consequences of bribery in the countries in which it is received, rather than on the consequences of being caught paying bribes. This is a serious omission, as it is clear that this impact can be grievous. The news that a firm or subsidiary is being prosecuted, or has been convicted of bribery, can damage the share price. In late 2010, shares in Panalpina dropped 4·1 per cent on the news of its admission of guilt in a deferred prosecution agreement concerning the bribery of officials in at least seven countries. According to Sam Eastwood, a partner with lawyers Norton Rose:

As international anti-corruption policies impact increasingly on global companies and their dealings with other companies, a dynamic of 'corporates policing corporates' is beginning to emerge. In order to protect themselves from liability, commercial organizations are increasingly requesting details of the anti-corruption policies and procedures of the companies with which they enter into business. This means that commercial organizations are becoming increasingly concerned with compliance with the Bribery Act and other jurisdictions' anti-corruption legislation, even if those laws do not directly apply to them.

The 2010 UK Foreign Bribery Strategy states that the government will actively support 'transparent companies with robust anti-corruption procedures' in order to 'ensure that ethical business will not be undercut by unscrupulous competitors or disadvantaged in access to [government] support'. Demonstrating a robust bribery risk management strategy will surely become an essential component of all corporate bids for government contracts.

Debarment for a period of years from tendering for future lucrative contracts can have serious consequences. The German construction company Lahmeyer International suffered severe losses when it was banned for seven years from tendering for World Bank-funded projects following a conviction for bribery in Lesotho. The World Bank's Sanctions Committee found that Lahmeyer engaged in corrupt activities by bribing the Lesotho Highlands Development Authority's Chief Executive.

While in some cases debarring is discretionary and might be avoided if promises to rectify the situation are made (for example, with World Bank contracts), in others (for example, under the European Union Procurement Directives), a purchasing body must exclude from tendering any company that has been convicted of corruption. The risk of mandatory disqualification from tenders is something boards need to take seriously.

Countries, as well as companies, that acquire a reputation for receiving bribes are also damaged. A developing country that is so stigmatized will jeopardize its eligibility to receive future aid payments. The Millennium Challenge Corporation only gives aid to states judged to be implementing principles of good governance, which includes an evaluation of their control over corruption. The UK investigation into BAE's activities in Saudi Arabia and elsewhere damaged the UK's reputation for transparency.

The reputation for corruption serves as a bar on any form of business activity. Individuals with a reputation for corruption will be excluded from the job market. They will also burn their bridges with influential public officials. Companies perceived as corrupt may be precluded from tendering for lucrative contracts. Developing nations' governments have particular reason to fear being deemed corrupt; a reputation as a country where bribery of public officials is encouraged or tolerated does nothing to help attract foreign direct investment and international aid.

Extracted from Corruption: The New Corporate Challenge by Nick Kochan and Robin Goodyear (Palgrave Macmillan).


China and America: The Perils of Dependency

In his book The Reckoning, Michael Moran explores the intricate nature of arguably the most important economic relationship in the world today.

For more information about The Reckoning and other titles in the Economics list visit Palgrave Macmillan.

The most important economic relationship in the world—that between the United States and China—is widely regarded as a one-way street, benefiting China and undermining the United States. But this is a simplistic view. In fact, China's dependency on the American market and commercial innovations represents a huge vulnerability for Beijing as it makes its way toward the centre of the global stage. Along with its increasing dependence on imported energy, persistent internal unrest and a susceptibility to bouts of inflation, China's pushmi-pullyu relationship with America presents terrible dilemmas for Beijing's communist leadership.

In spite of the alarmist headlines, China's rise as an economic giant, and eventually a military and diplomatic competitor to American power, is taking place under terms that the United States can influence and even harness to its own advantage. But the United States, in the three decades since China opened its economy to capitalism and began its breakneck sprint toward world-power status, has failed to develop a coherent strategy to leverage these advantages—particularly in innovation, technology and intellectual creativity. Such a strategy would require Washington to invest heavily in its own economic and creative strengths, as well as adjust its military posture in the Pacific to accommodate Chinese interests without sparking conflict. It would also require Washington to insist on greater Chinese participation in international diplomacy and peacekeeping, and prepare American allies in Asia for the realignment of power that looms ahead.

A deep freeze enveloped US–China ties after the 1989 Tiananmen Square massacre. Gradually, isolation gave way to a paternalistic approach as the United States offered 'rewards' like most-favoured-nation trade status or membership in the World Trade Organization (WTO) as incentives for China to continue opening its economy to competition. All of this made sense early on, when China's home market was largely closed to foreign products and foreign direct investment (FDI) was limited to joint ventures with state firms. However, the approach lost its effectiveness after China absorbed Hong Kong's dynamic banking sector and after increasing numbers of Chinese trained in US and other foreign universities returned to their homeland. Powered by a mix of long-term economic planning, first-class financial acumen and cheap credit, China's growth accelerated, and it became clear that its emergence as an economic giant would happen no matter what the West thought.

The exact shape of the 'new geopolitical order', however, remains unclear. Some view the United States and China as rival standard-bearers of competing ideologies—democratic market capitalism and repressive state capitalism—bound to clash in the twenty-first century just as fascism, communism and democracy did in the previous one. But clinging too fervently to this belief risks creating a self-fulfilling prophecy, particularly in a world where the balance of power in Asia and elsewhere is in transition, and where none of the old talking shops, from the United Nations to the Association of Southeast Asian Nations (ASEAN) to the Asia-Pacific Economic Cooperation (APEC), are ready to offer a credible forum for mediating disputes.

The issues that receive obsessive attention from each side—the under-valued Chinese currency, disputed territorial claims in the South China Sea and China's gradual modernization of its military—raise the risk of a sudden miscalculation that could have tragic global consequences. What's more, the black-and-white view of this relationship obscures the fact that the United States retains the upper hand militarily and economically.

Washington should be plotting a future based on the awesome strengths of American free society and economic productivity. Instead, it scapegoats China, its chief competitor, for striving to lift its population out of destitution. Since the 2008 financial crisis, the United States has acted like a football team that can only play defence. Case in point is the running dispute over China's currency, the renminbi (RMB), also known as the yuan. The People's Bank of China—the country's central bank—has kept the RMB low in order to maximize the competitiveness of Chinese products. The dispute pits those who think a strong RMB would make US manufactured goods more competitive globally against those who see it as a relatively small issue in a very complicated relationship.

The larger problem is not about exchange rates but 'global imbalances'—the fact that China's consumers save at enormous rates and consume little compared with the spend-crazy, credit-addicted West. Getting Chinese consumers engaged in the global economy would do far more to employ American workers than a readjustment of its currency rates. Rather than building a mountainous surplus based on export earnings that it ploughs into US Treasuries and other investments, China could be recirculating that money domestically and stimulating a consumer boom that would produce a positive effect for every major manufacturing power on earth.

Of course, there are enormous structural impediments to this readjustment—for instance, Chinese banks effectively operate as funnels, sucking up much of the savings in Chinese households and transferring them (via bank balance sheets) to the corporate-state sector, underwriting its investments in massive infrastructure and other projects, and keeping consumer spending weak. Reforms to this system, which barely figure in US-China joint communiqués, would mean that more Chinese-made products would stay in China, with fewer dumped cheaply into foreign markets.

But such subtleties are lost on American legislators. Rather than focusing on ways to maintain and extend the US lead in high-end manufacturing, US politicians have waged a doomed battle to protect manufacturing industries like textiles and furniture that stand no hope of supporting what the American workforce considers a decent life. As a result, Congress has pressed for an appreciation of the RMB to be the featured 'task' of recent US financial talks with the Chinese. Granted, the United States has also established an annual bilateral economic summit that attempts to broaden the conversation to such areas as energy and climate change, but with US jobless rates hovering near 9 per cent, the pressure to scapegoat isn't letting up anytime soon. The 'all eggs in one basket' foolishness of this is clear, and it seems unlikely that China will relent, given the potential cost to its fragile domestic stability.

Like all powers emerging from long periods of backwardness, China wants to make the most of its labour cost advantages while they last—something that giants of nineteenth-century American capitalism understood very well. The idea of fair play, which figures high in the rhetoric of contemporary American politicians, did not feature at all in the development of American capitalism's march to global dominance.

Imagine, now, a China run by a democratically elected government accountable to Chinese voters. Given the likelihood that an appreciation of the RMB would cause mass layoffs as foreign factories switch to other, even cheaper Asian producers like Bangladesh or Vietnam, any government claiming to act in the name of its citizens would resist outside pressure to allow the RMB to appreciate.

Meanwhile, neither the Chinese nor other foreign holders of US Treasuries are under any illusion about the ultimate aim of US policy: a covert devaluing of the US dollar, which would drastically reduce the value of the enormous investment China and other US creditors hold as their shares of the US national debt.

This lack of a 'grand strategy' in America's approach to China shows up in minor trade and commercial disputes. Recently, a bid by the Chinese telecommunications company Huawei Technologies to buy the American firm 3Leaf was dropped under pressure from the US Committee on Foreign Investment in the United States, a Cold War-era body founded to prevent sensitive technology from falling into Soviet bloc hands. In 2007, the same panel refused to allow Huawei to buy an Internet routing company, 3Com, which was ultimately purchased by US giant Hewlett-Packard instead.

There will be times when such actions make sense. Some of China's largest firms remain deeply entangled with the state and China's military. And not all of China's proposed investments rose to the level of political fights—China's Lenovo bought IBM's PC business in 2005, and the huge construction firm China State Construction Engineering Corporation is a major contractor on the reconstruction of the San Francisco Bay Bridge and has won contracts for work on New York City's subway and other large infrastructure projects.

But China's FDI in US corporations remains tiny—a paltry $791 million in 2009, compared with over $43 billion invested in China by American firms that same year. With China's appetite for such investments likely to increase by up to eight times in the next decade, the United States would be foolish to continue warding off such money, particularly given that these investments would help, over time, support US exporters and offset the enormous trade deficit the United States has been running with China for decades (amounting to $273 billion in China's favour in 2010).
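To put that projected growth in perspective, a rough calculation on the figures just quoted (a sketch only, treating 'up to eight times' as a simple multiple of the 2009 flow):

\[ 8 \times \text{\$791m} \approx \text{\$6.3bn}, \]

still a small fraction of the $43 billion that American firms invested in China in 2009 alone.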

China's great manufacturing complexes now dominate global markets across vast product lines, including appliances, consumer electronics and consumer durables like sporting goods, clothing, toys, furniture and textiles. Yet China lacks something that Trenton and its nineteenth-century peers had in spades: innovators. Even in 2010, the year China officially overtook Japan as the world's second largest economy, no Chinese brand could viably be called a household name in any Asian market, let alone in the wider world. Something is retarding China's transition from copycat manufacturer to innovative top dog. The kind of manufacturing that accounts for nearly all of China's export earnings relies on low-cost inputs, including labour, as opposed to the quality and technology that underpin an advanced economy's manufacturing sectors, notably in Japan, Germany and the United States. The combination of a nineteenth-century business model and twenty-first-century pseudo-communist political repression does not foster a climate of original innovation.

The problem might be solved in the long term by investment in R&D and reforms to China’s economic incentives and education system—indeed, Japan suffered from precisely these problems early on in its emergence from the depths of destruction after World War II into a postwar economic powerhouse. But economists also suspect that the centralized nature of China's government will prove a lasting hindrance, allowing the United States and other advanced economies to maintain their lead in high-tech goods for far longer than might otherwise be the case.

This underscores a deeper dilemma. China's brute strength in manufacturing is based on the simple, and possibly unsustainable, deal that the Communist Party made with its urban elites—it will keep incomes rising and leave the urban elites alone to make money as long as they keep their political aspirations to themselves. But as more and more Chinese in the vast, poor interior clamour for their own piece of the pie, wages will rise and demands for safety and environmental codes will erode competitiveness. When this happens and jobless workers get angry, the urban elites may renege on the deal and demand a greater say in their own government. Even then, the rural millions may not have much patience left.

China is by no means doomed to remain a smokestack power. Increasing investment in science, technology and other innovation sectors, currently about 1·5 per cent of GDP, puts China at the top of the table among emerging economies in terms of R&D spending, and fourth overall behind only the United States, Japan and Germany. But the country's mediocre higher educational system, demographic and political challenges and corruption suggest that this will be more of a Long March than a Great Leap Forward.

Take the problem of demography. China certainly cannot be described as suffering from a shortage of people, but it is suffering from an acute shortage of a certain generation of people, thanks to its enforcement of one-child population control policies since 1979. China has enjoyed a larger ‘demographic dividend’ (extra growth as a result of the high ratio of workers to dependents) than its neighbours. But the dividend is near to being cashed out. Between 2000 and 2010, the share of the population under 14 (future providers for their parents) slumped from 23 per cent to 17 per cent. China now has too few young people, not too many. China has around eight people of working age for every person over 65. By 2050 it will have only 2·2. Japan, the oldest country in the world, now has 2·6. China is getting old before it has got rich.
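Restated as an old-age dependency ratio (a rough illustration only; the extract does not specify the exact age bands behind its worker-per-retiree figures), the numbers quoted above work out as:

\[ \frac{P_{65+}}{P_{\text{working age}}} = \tfrac{1}{8} \approx 12.5\% \ \text{(China today)}, \qquad \tfrac{1}{2.2} \approx 45\% \ \text{(China, 2050)}, \qquad \tfrac{1}{2.6} \approx 38\% \ \text{(Japan today)}. \]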

The US figure, according to Census Bureau projections, is 3·7 workers per retiree, and the trend ahead is a gentle one. Indeed, of the countries that will rank as the world’s largest economies later in this century, only the United States, Brazil and Turkey have managed to avoid acute ‘dependency ratio’ problems. For the United States, a relatively high birth rate and a much more open approach to immigration have saved it from the worst of the crisis.

In 2011, the Boston Consulting Group (BCG) reported that, owing to a number of changing economic realities—including rising salaries and economic expectations among Chinese workers, new labour, environmental and safety regulations abroad, the higher price of energy affecting transportation costs to the US market, and the uncertainties of political risk in these places—the cost benefits of producing in Asia no longer automatically outweigh the risks. Indeed, the BCG report predicts a 'renaissance for U.S. manufacturing' as labour costs in the United States and China converge around 2015. Anecdotally, the effects can already be seen in new plants created in the United States and in some instances in which plants set up abroad a generation ago to leverage lower labour costs have relocated back to America.

If the main factors in these decisions were labour costs and the weak dollar, the victories would be Pyrrhic. 'Inputs'—energy, transportation, raw materials and other production costs—will fluctuate. But the decisive advantages include innovative management and production techniques, savings on transportation costs, lower political risk and corruption, and the productivity and relatively high skill levels of the American workforce.

The US economy, with its potent, creative corporate sector; transparent bureaucracy; world-beating universities; and highly skilled labour force, does not have to bow down in surrender to China. It can compete and adjust to the arrival of China and to the billion-odd other middle-income workers being added to the global economy in other emerging-market countries by stretching its lead in high-end, knowledge-based commerce. The true danger for the United States does not lie in predatory Asian sweatshops. Instead, danger lies in bumbling, for political reasons, into an overtly hostile stance that puts the world's two most powerful nations on course for a war so terrible that no nation, regardless of its choice of economic or political models, would survive it.

Extracted from The Reckoning: Debt, Democracy, and the Future of American Power by Michael Moran; published by Palgrave Macmillan.



The Financial Crisis and its Lessons, Sir Howard Davies – an extract

In his article 'The financial crisis and the regulatory response: An interim assessment', Howard Davies asks if we have done enough to prevent history repeating itself.

There is nothing quite like a financial crisis to focus political minds on how we regulate our affairs. In times of economic calm, politicians are not much concerned with supervisory agencies. The subject bores them. There is far more interest in pork barrel spending Bills, or in going to war with a country without too many voters at home. Only when markets go into spasm, and the public authorities have to step in with their cheque books, do legislators bend their minds to the issues. At that point it becomes clear that 'something' must be done, and that 'something' is usually either a raft of legislation giving regulators new powers to secure the doors of all the empty stables, or structural reform, or a combination of the two.

How, then, should we judge the latest raft of reforms? We cannot answer this question without offering a view on how far regulation was at fault, and how significant regulatory failings were, in the run-up to the crisis. If 'human error' was at issue, then there is in principle no need to change the law or the institutional structures. Instead, we should change the people, and hope the new crew do better next time around.

REGULATORY FAILINGS PRE-CRISIS

My view is that regulatory failings did play a part, but that those failings were by no means the only or the most significant factors. The crisis came about because global imbalances, combined with relatively loose monetary policy, created the conditions in which leverage expanded rapidly. The monetary authorities on both sides of the Atlantic focused attention on retail price inflation, and assumed that control of inflation was a sufficient condition to maintain financial stability. In that environment, the incentive structures within financial firms pushed them to take on greater risks. In some cases, senior management had a poor understanding of the risks they were taking on, blinded by the complexity of new and dangerous products. As a result, when asset prices began to fall, and a liquidity squeeze developed, a number of markets collapsed.

At that point it became clear that financial regulators had not been tough enough, particularly in their approach to capital reserving, to constrain risk-taking or to ensure that institutions were sufficiently robust to cope with a period of severe stress.

It is wholly unrealistic to expect regulation to be the front line defence against booms and busts. Monetary policy is a far more effective, though still imperfect, weapon in that fight. However, it is reasonable to expect regulators to act as speed bumps when the traffic is accelerating too rapidly. They did not perform that function.

Most regulators now accept that there was too little capital in the banking system, and especially that capital requirements in the trading books of the investment banks were far too light. The regime assumed the effectiveness of hedging strategies, which proved of little value as previous price relationships broke down. It also assumed continuous liquidity, an assumption that proved dramatically false in 2007. Regulators, as much as the banks, failed to identify the damage that could be done by a collapse of confidence in highly complex over-the-counter deals, which were extremely difficult to price, even in normal market conditions.

The failure of regulators to identify dangerous trends and to warn against them was shared with the boards of the international financial institutions. The IMF was particularly weak in that respect, proclaiming, until just before the crisis hit, its belief that risk transfer innovations had made the financial system more robust, and bank failures less likely. Although individual institutions warned against specific trends and imbalances—the Bank for International Settlements (BIS) can probably claim the best record in the pre-crisis years—no entity pulled the pieces of the jigsaw together.

STRUCTURAL WEAKNESSES

These weaknesses point to some structural issues, two of which stand out.

If we look at the pre-crisis global regulatory architecture, we see a spider's web of interlocking relationships—with the Financial Stability Forum (FSF), set up in the wake of the Asian financial crisis of the late 1990s, sitting awkwardly in the centre. But while the FSF included the heads of the international standard setters—the Basel Committee, International Organisation of Securities Commissions (IOSCO) and so on—it had no authority to tell them what to do or when to do it. Each of them operated to its own leisurely timetable, dictated by a preference for domestic fire-fighting over international issues.

Thus, the Basel Committee spent over a decade producing the new capital rules known as Basel 2, even though serious flaws in the original Capital Accord had been identified. Draft after draft was produced, of ever greater complexity, but no one asked the big question of whether there was enough capital in the banking system overall. By the time the crisis hit, there was broad agreement on Basel 2, but the United States had not resolved to implement it, and various versions were in existence, many of them relying excessively on banks' own internal models to determine risk.

The second obvious flaw in the global architecture was the lack of representation of the developing world. The membership of the financial bodies was mainly G7-based, at a time when the centre of the world's economic gravity was shifting rapidly to the East. The Basel Committee provided perhaps the most egregious example. In 2006, 10 of its 13 members were from Europe, and the European Commission and the European Central Bank (ECB) also attended. The most recent addition to the Committee, five years earlier, had been Spain.

Important countries, notably China, were becoming reluctant to be 'price-takers', simply accepting standards set by others, on which they had not been consulted. That created the risk of uneven application of global standards.

There were structural flaws elsewhere, too. It was already clear that the EU was living uncomfortably in a halfway house. Since member states were reluctant to adopt common standards, most formal authority still rested with national regulators.

The weaknesses were soon revealed all too starkly. The Icelandic bank case was the first severe test. Iceland is not a full member of the European Union, but it is part of the European Economic Area (EEA). According to EU law, a bank authorised in any country of the EEA is entitled to take deposits in all other countries, without needing authorization from the host regulator. When they began to run short of funds to fuel their aggressive expansion, Icelandic banks chose to seek retail deposits in the United Kingdom and the Netherlands, by the simple expedient of offering deposit rates slightly higher than those of the competition. When the crisis hit, and the three big Icelandic banks were revealed to be seriously overextended, they were unable to refund those deposits, and the Icelandic central bank was too small to be able to help. Thus, British and Dutch taxpayers were the only effective source of compensation for depositors in a bank over which their own regulators had had no authority. They paid up to the tune of several billion pounds.

Europeans drew two opposing conclusions. Those inclined to favour greater European integration used the experience to argue that the system of mutual recognition, on which the single financial market was originally constructed, was no longer viable, and that a system of pan-European regulation was clearly needed. Sceptics took the opposite view, maintaining that the real lesson was the need for host regulators to have the power to reject incomers from elsewhere in the EEA. That would begin to dismantle the Single Financial Market. Hedging his bets somewhat, the Chairman of the FSA noted that the episode clearly showed that we needed either 'more Europe or less Europe', and that the status quo was not tenable.

The crisis also revealed structural problems or regulatory gaps in individual countries. The United States was an obvious case in point. Critics have pointed to the lack of regulation of the mortgage market, to the existence of a multiplicity of banking regulators creating scope for regulatory arbitrage, to dysfunctional disputes between the two securities regulators, the SEC and the Commodity Futures Trading Commission (CFTC), and to the lack of a body charged with oversight of systemic risk. The Dodd-Frank Act has made headway in some of these areas, but it is too early to say how effective those changes will prove to be.

In the United Kingdom, an early challenge to the regulators was the failure of Northern Rock, an almost exclusively domestic mortgage bank. The authorities’ initial response was hesitant and for the first time in 150 years there was a fully-fledged run on a bank, with queues of depositors outside branches trying to withdraw their funds. It was widely argued that the fault lay in the reforms carried out by the Labour government in the late 1990s, and especially in the removal of banking supervision from the Bank of England. There was a political dimension to this argument, of course, but it certainly did seem that the so-called Tripartite system, involving the Treasury, the Bank of England and the FSA, had worked poorly.

This catalogue of regulatory failure is depressingly long. In Germany tough questions have been asked about the oversight of regional banks. The Dutch Central Bank has been widely criticized for presiding over the almost total collapse of its banking system. However, I will limit myself to asking whether the reforms agreed so far are a sufficient response to the crisis.

POST-CRISIS REFORMS

Global

If we begin with the structural changes, the first and most rapidly agreed change was the switch from the G7 to the G20 as the basis for membership of the key financial oversight bodies. It was so obvious that an adequate response to the crisis needed cooperation from the large surplus countries that the convening of a G20 summit in Washington by President Bush in November 2008 was accepted by all countries without demur. Changes in the membership of the FSF and the Basel Committee followed quickly, after the April 2009 summit.

There are those who argue that even this broader membership is inadequate. Joseph Stiglitz and others have advocated a system built on more comprehensively global lines. However, it seems unlikely that further expansion will be agreed in the near future.

Will this broader membership contribute to making the financial system safer? It is hard to say. We do not know what the new countries want to achieve. So far, the signs are that China sees advantage in implementing tougher capital standards, and is committed to their enforcement. However, the Chinese are determined to exclude from the agenda any discussion of currency misalignments and global imbalances. Thus, for now, I would view the expansion of membership as an overdue change, reflecting the new economic realities, but not one that will necessarily promote the coordination of macroeconomic policies that would help avoid a recurrence of the catastrophic events of 2007–2009.

Also at the London summit, the G20 agreed to strengthen the centre of the system, by renaming the FSF the Financial Stability Board (FSB). What's in a name? Not necessarily a great deal, but the G20 finance ministers look to the FSB to present progress reports on the reform agenda. That gives the Board some purchase on the standard-setters and others, and it is reasonable to believe that it has had an effect on the working practices of the Basel Committee, which produced a new capital regime, Basel 3, in little more than 10 per cent of the time it took to gestate Basel 2.
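On the timetable implied here (a rough illustration only, taking 'over a decade' for Basel 2 as, say, twelve years):

\[ 0.10 \times 12 \ \text{years} \approx 14 \ \text{months for Basel 3}. \]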

However, the FSB remains an informal body. There is no treaty basis for its existence. Its chair is a part-timer: for four years Mario Draghi combined the role with the governorship of the Bank of Italy. Its capacity for independent action is strictly limited. The commitment of some of its members, notably the United States, is doubtful.

The Council on Global Financial Regulation (CGFR), a group of former regulators, central bankers and academics (of which I am a member), has advocated reforms to strengthen the position of the FSB. Although the output of the FSB has been disappointing so far, the weakness, in the opinion of the CGFR, is more the consequence of its uncertain status than of its structure. It remains the only body that includes representatives of all the agencies needed to coordinate effective action at global level. But the institutional backing is still lacking. One cannot therefore give this area of reform more than a modest grade so far.

We can be more optimistic about the changes under way as a result of the Basel Committee's supercharged work on Basel 3. They have produced a new framework, with far tougher requirements. Banks will in future be required to hold significantly larger capital reserves, and a larger proportion of those reserves must be in the form of tangible common equity.

The Committee has also proposed a new resolution regime, which aims to allow banks to be wound up without causing severe disruption to the wider economy. Systemic institutions must prepare 'living wills', or 'funeral plans'. However, the details remain sketchy and, as the Lehman case demonstrated, there remain many obstacles to a rational cross-border insolvency regime.

The framework also allows for a 'countercyclical buffer', an additional reserve, which might be varied depending on regulators' view of the state of the business cycle, or of potential misalignments of asset prices. In addition, for the largest banks, there is a kind of 'too big to fail' supplement.

These reforms will undoubtedly make the banking system safer. However, the behaviour of bank shares also seems to be telling us that they will markedly reduce banks' return on equity. That may be appropriate, as banks will in future look more like regulated utilities, with tight controls on capital and indeed on dividends.

But what of the impact on the cost of bank borrowing, and thereby on investment in the economy more generally, and on economic growth and job creation? On that crucial question, there is no consensus whatsoever. The Basel Committee has argued that the impact would be very modest indeed, and that growth would be less than half a per cent lower over five years. The OECD has estimated the impact at about twice that size. However, economists at the Institute of International Finance, the trade association for the biggest international banks, argue that growth will be fully 3 per cent lower over five years. If they are right, this would prove to be a very costly reform indeed.
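For a rough sense of scale, these cumulative estimates can be converted into an implied annual drag on growth. The sketch below simply restates the three figures quoted above; the compounding arithmetic, and the assumption that the loss accrues evenly over the five years, are illustrative only.

```python
# Back-of-envelope conversion of the three quoted estimates of the
# cumulative output loss from Basel 3 into a rough annual growth drag.
# The percentages restate the figures in the text; nothing else is assumed.

estimates = {                      # cumulative GDP loss after five years
    "Basel Committee": 0.005,      # 'less than half a per cent'
    "OECD": 0.010,                 # 'about twice that size'
    "IIF": 0.030,                  # 'fully 3 per cent'
}

for source, loss in estimates.items():
    annual_drag = 1 - (1 - loss) ** (1 / 5)   # per-year growth shortfall
    print(f"{source:15s}: {loss:.1%} over 5 years ~ {annual_drag:.2%} a year")
```

Even on the pessimistic IIF reading, the implied drag is roughly 0.6 percentage points of growth a year for five years; the dispute is less about the arithmetic than about how far banks pass higher funding costs on to borrowers.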

On the countercyclical question, while there is agreement on an additional capital buffer, we do not know how decisions on its implementation will be made. How do we assess when markets are out of line, or when credit growth is too rapid? It was the failure to react pre-emptively to credit expansion that contributed as much as anything to the bubble which burst so dramatically in 2007. And who is to assess the appropriate response? In principle, one can respond to excess credit growth by raising interest rates, or by lifting capital requirements by expanding the countercyclical buffer. However, the first response is the province of monetary policymakers, whereas the second is a matter for regulators. These may seem arcane arguments at a time when the Federal Reserve has promised to maintain short-term interest rates at close to zero for the foreseeable future, but one day the problem will arise again.
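The mechanics of such a decision rule can be sketched simply. The thresholds below follow the Basel Committee's published guidance, which maps the credit-to-GDP gap (the deviation of the credit-to-GDP ratio from its long-run trend) linearly onto an extra requirement of 0–2·5% of risk-weighted assets between gaps of 2 and 10 percentage points; the sample inputs are illustrative, and in practice national authorities retain discretion over the final setting.

```python
# A minimal sketch of a countercyclical buffer rule. Thresholds follow
# the Basel Committee's published guidance; sample gaps are illustrative.

def countercyclical_buffer(credit_to_gdp_gap: float,
                           low: float = 2.0,
                           high: float = 10.0,
                           max_buffer: float = 2.5) -> float:
    """Extra capital (% of risk-weighted assets) implied by the gap."""
    if credit_to_gdp_gap <= low:
        return 0.0
    if credit_to_gdp_gap >= high:
        return max_buffer
    return max_buffer * (credit_to_gdp_gap - low) / (high - low)

for gap in (1.0, 6.0, 12.0):   # deviations of credit/GDP from trend, in pp
    print(f"gap {gap:4.1f}pp -> buffer {countercyclical_buffer(gap):.2f}%")
```

The rule itself is mechanical; the hard questions in the text remain, namely who estimates the gap and whether the response should come through capital requirements or interest rates.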

In principle, the FSB could take a view, but so far members have been reluctant to stray into that territory. In Europe, the European Systemic Risk Board, chaired by the President of the ECB, could do so, but interest rates remain the jealously guarded province of the ECB's Governing Council. In the United States, the new Financial Stability Oversight Council might opine, but once again control over interest rates lies elsewhere, with the Federal Open Market Committee. In the United Kingdom, there is a new Financial Policy Committee (FPC), sitting alongside the Bank of England's Monetary Policy Committee (MPC), but with very different membership and procedures. It is hard to escape the conclusion that these structural reforms have not resolved the problem. We will, as before, depend on the judgement of the individuals in positions of influence—some of them the same people as before.

European

In Europe it is difficult to be optimistic about the response to the 'more or less Europe' question. We are a long way short of a single European regulator, or even an optional federal regime for pan-European institutions, on the American model. We now have a European Banking Authority, a European Securities and Markets Authority (ESMA) and a European Insurance and Occupational Pensions Authority. But they are located in three different cities (London, Paris and Frankfurt), reflecting a purely political deal. That does not facilitate cross-authority coordination.

In addition, they are barely 'authorities' in the normal sense of the term. Their powers are quite limited. ESMA has direct authority over credit rating agencies, but with that exception these bodies operate through national authorities. They are charged with preparing, over time, a single European rule book, and they have the ability to arbitrate in the implementation of directives. However, we remain a long way from a federal system of regulation, and it is not even clear that the new arrangements would prevent a recurrence of the Icelandic bank problem.

This aspect of European integration has taken a back seat during 2011, as bigger issues relating to the future of the eurozone have come to the fore. Will the eurozone move towards a fiscal union, as many believed would be the inevitable consequence of the single currency? Will there be a eurozone finance ministry? Would that ministry issue eurozone bonds, guaranteed collectively by all governments? If these changes come to pass, and commentators increasingly see them as necessary to maintain the integrity of the euro itself, then the structure of financial regulation may come back onto the political agenda. My forecast would be that a genuinely pan-European system of regulation will eventually be set up, at least for major cross-border firms, and at least in the eurozone. That will pose an interesting challenge for the United Kingdom and other non-eurozone members. In the meantime, Europe has decided to lodge in yet another halfway house, although one slightly closer to the federal model.

Already we are seeing the drawbacks of yet another interim solution. In the summer of 2011, as markets reacted badly to continued uncertainty about the recovery and to the weak fiscal positions of southern European countries, some countries wished to introduce short-selling bans and appealed to ESMA to organize a pan-European solution. ESMA was unable to do so, lacking the power to oblige any state to act against its own domestic preferences. Four countries wanted to go ahead with a ban; the rest did not. So that is what happened, and the four bans were themselves slightly but significantly different in form. That was not a positive omen for the future.

United Kingdom

The United Kingdom's reforms are still work in progress. In what is described as a 'new approach to financial regulation', the prudential functions of the FSA have been carved out into a new Prudential Regulation Authority (PRA), which will be a wholly owned subsidiary of the Bank of England.

There is also a new Financial Conduct Authority (FCA), responsible for conduct-of-business regulation, while the Financial Policy Committee, located in the Bank of England and chaired by the Governor, is charged to 'contribute to the Bank's financial stability objective by identifying, monitoring and taking action to remove or reduce systemic risks'. So there are now four entities likely to be involved in crisis management: the Treasury, the Bank of England, the PRA and the FCA, with the FPC sitting between them. The effectiveness of the arrangements will depend crucially on the skills and wisdom of the participants, rather than on the particular structure within which they work.

CONCLUSIONS

In the four years since the crisis erupted, much has been done to correct the regulatory flaws it revealed. For a time, it seemed that the political obstacles, which had bedevilled earlier attempts at reform, would be blown away. Thus, there was talk of a global body with genuine power to enforce regulations—a World Financial Authority.

Now, although the crisis is far from over, the grander ideas have disappeared from the agenda. Tentative moves to strengthen the central nervous system of global finance have been made, but they fall well short of a revolution. In the United States, there have been only modest structural changes, but a barely digestible wave of new legislation. In the EU, we have the 'form' of European-wide regulation in the three new authorities, but not the substance. In the UK, we have once again shuffled the regulatory pack, and put the Bank at the top of the pile, from which it had been dislodged a decade or so earlier. What goes around comes around.

So, has a good crisis been wasted? The wise commentator would say that it is too soon to tell. However, overall, it is hard to escape the conclusion that there has so far been less in the way of significant reform than meets the eye.

Extracted from International Journal of Disclosure and Governance, Volume 9, Issue 3, 2012; Macmillan Publishers Ltd.


Greek Crisis in Perspective: Origins, Effects and Ways-Out, Nicos Christodoulakis - an extract

Breaking the cycle: Nicos Christodoulakis explains the Greek economic descent and what the country must do to recover.

In the aftermath of the global financial crisis of 2008, several European countries were engulfed in a spiral of rising public deficits and explosive borrowing costs that eventually drove them out of markets and into bail-out agreements with the International Monetary Fund (IMF), the European Union (EU) and the European Central Bank (ECB). Greece was by far the most perilous case, with a double-digit fiscal deficit, an accelerating public debt which in GDP terms was twice the Eurozone average, and an external deficit near US$5,000 per capita in 2008, one of the largest worldwide.

In the wake of an EU bailout and two elections the situation remains critical. Unemployment is rocketing, social unrest undermines the implementation of reforms and the fiscal front is not yet under control, despite extensive cuts in wages, salaries and pensions. The possibility of Greece exiting the Eurozone is widely anticipated.

Greece joined the European Community in 1981. Membership inspired confidence in political and institutional stability but fed uncertainties over the economy. After a long period of growth, Greece faced recession not only as a consequence of worldwide stagflation, but also because on its way to integration with the common market it had to dismantle its system of subsidies and tariffs. Soon after accession, many firms went out of business and unemployment rose for the first time in decades.

The government opted for fiscal expansion, including the underwriting of ailing companies. The effect was predictable: a chronic haemorrhage of public funds without any supply-side improvements. Similarly, the expansion of demand simply led to more imports and higher prices. The external deficit approached 8% of GDP in 1985, a level at which several Latin American economies had collapsed. A stabilization programme in October 1985 involved a 15% devaluation, a tough incomes policy and extensive cuts in public spending. The programme achieved a rise in revenues by curbing tax evasion and replacing less effective indirect taxes with the VAT system. Public debt was immediately stabilized, but the programme was opposed from within the government and was abandoned in 1988.

Despite looming deficits, in 1989 the coalition government decided to abolish prison sentences for tax arrears, which was taken as a signal of relaxed monitoring, thus effectively encouraging further evasion.

Another bizarre measure was to cut import duties for repatriates buying luxury cars, thus depriving the budget of badly needed revenues and leading to black-market abuses.

As a result, revenues collapsed and the country suffered a major fiscal crisis until a majority government elected in 1990 enacted a new stabilization programme.

Although Greece signed the Maastricht Treaty in 1992, it was far from obvious how or when the country could comply with the convergence criteria. Public deficits and inflation were at double-digit levels and there was great uncertainty about the viability of the managed exchange rate.

In May 1994, capital controls were lifted in compliance with European guidelines and this prompted fierce speculation. Interest rates rose sharply and the Bank of Greece exhausted most of its reserves to stave off the attacks. This proved to be an incentive to join the European Monetary Union as a way of warding off further speculation. Soon after, the 'Convergence Programme' set time limits for satisfying the Maastricht criteria and included a battery of reforms in the banking and public sectors.

However, international markets continued to doubt the viability of the exchange rate. With the advent of the Asian crisis in 1997, spreads rose dramatically and Greece finally chose to devalue by 12·5% in March 1998 and subsequently to enter the Exchange Rate Mechanism. The country was not ready to join the first round of Eurozone countries and was granted a transition period to 1999 to comply with the convergence criteria.

After the devaluation, credibility was enhanced by structural reforms and reduced state borrowing, so that when the Russian crisis erupted in August 1998 the currency came under very little pressure. Public expenditure was kept below the peaks it had reached in the previous decade and was increasingly outpaced by rising total revenues. Tax collection improved with the introduction of a minimum turnover scheme for SMEs, the elimination of a large number of tax allowances, the imposition of a levy on valuable property and a reorganization of the auditing system. With the privatization of public companies, public debt fell to 93% of GDP in 1999. Although still higher than the 60% threshold required by the European Treaty, Greece was said to be on track 'to lean toward that level', a formula used by other countries to enter EMU.

Market reforms, introduced for the first time in 1986, aimed at modernizing the outmoded banking and financial system in compliance with European directives. A major reform in social security in 1992 curbed early retirement and excessively generous pension/income ratios.

Throughout the 1990s, reforms were aimed at restructuring public companies whose deficits had contributed to the fiscal crisis in 1989. State banks were privatized or merged, several outmoded organizations were closed down and initial public offerings (IPOs) provided capital and restructuring finance to several utilities. Other structural changes included the lifting of closed-shop practices in shipping, the entry of more players into the mobile telephone market and efforts to make the economic environment more conducive to entrepreneurship and employment.

After 2000, the reform process gradually slowed. Proceeds from privatization peaked in 1999, but subsequently remained low as a result of the contraction in capital markets after the dotcom bubble and the global recession in 2003.

An attempt in 2001 to reform the pension system led to social confrontation and was finally abandoned, to be replaced by a watered-down version a year later. Two other reforms followed in 2006 and 2010, but the social security system was still burdened by inequalities, inefficiencies and structural deficits.

Reform fatigue spread more widely after the Olympic Games in 2004. Since then, reforms have been concentrated on small-scale IPOs, the important exceptions being the sale of Greek Telecom and the privatization of the national air carrier.

Despite the primary surpluses achieved throughout 1994-2002, public debt fell only slightly. There were three reasons. First, the government had to issue bonds to qualify for joining the euro, a capital injection which led to an increase in public debt without affecting the deficit.

Second, after a military stand-off in the Aegean, Greece increased defence expenditure to well above 4% of GDP. In line with Eurostat rules, the burden was fully recorded in the debt statistics at the time of ordering, but entered current expenditure only gradually, according to the actual delivery of equipment. This practice created a lag in the debt-deficit adjustment, which was removed in 2004 when the government switched to accounting at the date of ordering. Though a decision by Eurostat in 2006 made the delivery-based rule obligatory for all countries, Greece did not comply. The result was that deficits were augmented for 2000-04 and scaled back for 2005-06.

The third reason was the strong appreciation of the yen against the euro, by more than 50% between 1999 and 2001, which increased the value of Greek public debt on loans in the Japanese currency contracted during the 1994 crisis. To alleviate this, Greece entered a currency swap in 2001 by which the debt to GDP ratio was reduced by 1·4% in exchange for a rise in deficits of 0·15% of GDP in subsequent years, so that the overall fiscal position remained unchanged in present value terms. Although the transaction had no bearing on the statistics for 1999 on which EMU entry was assessed, critics mistook it for a ploy to circumvent proper evaluation.
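The present-value claim is easy to check with a little discounting. In the sketch below, the upfront relief and the extra annual deficits are the figures from the text, while the discount rate and horizon are illustrative assumptions:

```python
# Present-value check on the 2001 swap described above. The upfront
# relief (1.4% of GDP) and the extra annual deficits (0.15% of GDP)
# are the figures in the text; the 6% discount rate and the 15-year
# horizon are illustrative assumptions, not figures from the article.

def annuity_pv(payment: float, rate: float, years: int) -> float:
    """Present value of a constant annual payment received for `years`."""
    return sum(payment / (1 + rate) ** t for t in range(1, years + 1))

upfront_relief = 1.4          # % of GDP: immediate debt-ratio reduction
extra_deficit = 0.15          # % of GDP: extra deficit each later year
rate, years = 0.06, 15        # assumed discount rate and horizon

pv = annuity_pv(extra_deficit, rate, years)
print(f"PV of extra deficits: {pv:.2f}% of GDP vs {upfront_relief}% upfront")
# ~1.46% of GDP, close to the 1.4% upfront relief, so the overall fiscal
# position is broadly unchanged in present-value terms, as the text says.
```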

After the Eurozone became operational, hardly any attention was paid to current account imbalances of Greece or any other deficit country. It was only in the aftermath of the 2008 crisis that the European Union started emphasizing the adverse effects that external imbalances might have on the sustainability of the common currency.

The reason for this complacency was not merely that devaluations were ruled out by the common currency. A widespread view held that external imbalances were mostly demand-driven and, as such, they would sooner or later respond to fiscal adjustment. This proved to be misguided optimism.

The deterioration in the Greek current account accelerated after 2004 as domestic demand rose in the post-Olympics euphoria, inflation differentials with other Eurozone countries widened and the euro appreciated further. A similar erosion of competitiveness took place in all the other European countries that are currently in bailout agreements (Ireland by 12% and Portugal by 8%) or considered to be at risk of seeking one (Spain by 9% and Italy by 8%).

However, Greece was particularly vulnerable. Accelerating labour costs, the poor quality of the regulatory framework, corruption practices and weak government were all crucial in shaping productivity and competitiveness. These factors explain the poor performance of Greece in attracting foreign direct investment in spite of the substantial fall in interest rates and capital flows within the Eurozone. While FDI remained almost static, its composition changed with inflows directed to non-manufacturing sectors, notably to real estate. Investments in real estate boost aggregate demand, raise prices, cause the real exchange rate to appreciate and hinder competitiveness. These developments manifest a major failure of Greece—and for that matter of other Eurozone countries—to exploit the post-EMU capital flows in order to upgrade and expand production.

The fiscal decline started with the disappearance of primary surpluses after 2003 and culminated in rocketing public expenditure and the collapse of revenues in 2009. Revenues declined as a result of a major cut in the corporate tax rate from 35% to 25% in 2005 and inattention to the collection of revenues.

It was becoming evident that stabilizing the economy was not a policy priority. Concerned over the rising deficits in 2007, the government sought a fresh mandate to redress public finances, but—despite securing a clear victory—no such action was taken. Only a few months before the global crisis erupted, the government claimed that the Greek economy was 'sufficiently fortressed' and would stay immune to international shocks. Even after 2008, the government hesitated either to implement measures to stem the fiscal deterioration or to expand public spending to fight off the prospect of recession. A compromise included a consumption stimulus at the end of the year, combined with a bank rescue plan of €5 billion and a pledge to raise extra revenues. The first two were quickly implemented, whilst the third was forgotten.

Weakened by internal divisions, the government opted for a general election in October 2009 as a new opportunity to address the mounting economic problems. The fiscal consequences were stunning: total public expenditure was pumped up by more than 5 percentage points of GDP, exceeding 31% of GDP at the end of 2009. (In absolute terms it exceeded €62 billion, twice its 2003 level.) The rise was entirely owing to consumption, as public investment remained unchanged at 4·1% of GDP.

Total receipts in 2009 fell by another 4% of GDP as a result of widespread neglect in tax collection and emergency capitalization of Greek banks. The deficit was revised from an estimated 6·7% of GDP before the elections to 12·4% in October 2009, and finally widened to 15·4% of GDP by the end of the year.

Even then, the budget for 2010 included an expansion of public expenditure while excluding privatizations, rather than the other way around. Rating agencies downgraded the economy, sending credit default swap spreads soaring in international markets.

But instead of borrowing cheaply at short maturities as a means of gaining time to redress the fiscal situation, the government continued to issue long maturities, despite the escalation of costs. This had dramatic consequences in the international markets, where a Greek liquidity problem (having the cash to meet the next interest payments) became a solvency problem (a fear that Greece would never be able to repay its existing debt).

The borrowing capacity was further undermined when the ECB threatened to refuse collateral status for downgraded Greek bonds, fuelling fears that domestic liquidity would shrink and precipitating a capital flight from Greek banks. In early 2010, borrowing costs started to increase for both short- and long-term maturities, Greece had become a front-page story worldwide and the countdown began.

The global financial crisis in 2008 revealed that countries with sizeable current account deficits are vulnerable to international market pressures because they risk a 'sudden stoppage' of liquidity. As Krugman (2011) recently suggested, the crisis in the southern Eurozone countries had rather little to do with fiscal imbalances and rather more to do with the sudden shortage of capital inflows required to finance external deficits.

This explains why, immediately after the crisis, sovereign spreads peaked, mainly in economies with large external imbalances, such as Ireland, Spain, Portugal and the Baltic countries, which were under little or no pressure from fiscal deficits. It is worth noting that countries with substantially higher debt burdens, such as Belgium and Italy, experienced only a small increase in their borrowing costs at that time.

Since Greece had a dismal record on both fiscal and external deficits, its exposure to the credit stoppage soon became a debt crisis. The current account was in free-fall after 2006, when domestic credit expansion accelerated, disposable incomes were enhanced by tax cuts and capital inflows from the shipping sector peaked as a result of the global glut. The external deficit exceeded 14% of GDP in 2007 and 2008, but no warning was raised by any authority, domestic or European. In fact, the government acted pro-cyclically and decided to reduce surcharges on imported luxury vehicles, responding to the pleas of car dealers. This opened the way for the pre-electoral spree.

Two facts emerge. One is that in periods of recession counter-cyclical activism usually takes the form of increased consumption, not public investment, and this has detrimental effects on public and external deficits without contributing to higher growth. The other recurring characteristic is the propensity of governments to increase public spending and to tolerate lower revenues in election years.

EU authorities were unprepared for the Greek problem and took action only when they recognized the risks it posed for the banking systems of other European states. A joint loan of €110 billion was finally agreed in May 2010 by the EU and the IMF to substitute for inaccessible market borrowing. The condition was that Greece follow a Memorandum of fiscal adjustments to stabilize the deficit and structural reforms to restore competitiveness and growth. In the event of success, Greece would be ready to tap markets in 2012 and then follow a path of lower deficits and higher growth.

Faced with a deepening recession and a failure to produce fiscal surpluses sufficient to guarantee the sustainability of Greek debt, the European Union intervened twice to revise the terms of the Memorandum. In the first major intervention in July 2011, the amount of aid was increased by €130 billion and repayment extended over a longer period.

Crucially, the EU recognized the perils of recession and allowed Greece to withdraw a total amount of €17 billion from structural funds without applying the fiscal brake of national co-financing. The plan looked powerful, except for the typical implementation lags. The agreement was only voted through by all member-state parliaments in late September 2011 and the release of structural funds was approved by the European Parliament in late November. Participation in the Private Sector Involvement had reached only 70% of institutional holders amid speculation that post-agreement buyers of Greek debt from the heavily discounted secondary market were expecting a huge profit.

Thus, a new intervention looked inevitable and in October 2011 a revised restructuring was authorized, envisaging cuts of 50% of nominal bond value that would eventually reduce Greek debt by €100 billion. Greek debt was expected to stabilize at 120% of GDP by 2020. The agreement was hailed in Greece, but euphoria turned sour when the government surprised everybody by seeking a referendum for its approval. In the ensuing furore, the decision was annulled, the prime minister resigned and a coalition was formed in November 2011 to implement debt restructuring and to negotiate terms for the new round of EU-IMF loans.

With Greece routinely considered the habitual wrongdoer, especially when compared with the other countries (Ireland and Portugal) undergoing similar adjustment programmes, a Greek exit from the Eurozone started to attract attention both at home and abroad.

Although the complications and costs in the banking sector would be enormous, the exit of Greece could prove attractive to some European politicians who get angrier every time a new round of aid is discussed. However, they overlook the fact that a Greek exit would lead to an aggravation of the crisis. If the result were a two-tier model of Economic Governance, based on an inner core of surplus economies in the north and a weaker periphery in the south, competitiveness could only be restored by a so-called 'internal devaluation' of labour costs, thus perpetuating the gap that is already widening between the Eurozone countries.

For Greece, exit would trigger an economic catastrophe. As the entire Greek debt would remain in euros, the rapid depreciation of the new national currency would make its servicing unbearable and the next move would be a disorderly default. Isolation from international markets would drive away investors while financial panic would drain domestic liquidity on a massive scale. The creditor countries of the EU would start demanding repayment of their aid loans, and this would soon deprive Greece of its claim on the EU cohesion funds. Tensions would produce further conflicts with EU agencies and the pressure to consider complete disengagement from the European Union would gain momentum.

The only option for Greece is to complete the fiscal adjustment and become reintegrated into the Eurozone as a normal partner. To gain credibility, Greece must achieve key fiscal targets quickly in order to be able to revise some of the pressing—although so far unattainable—schedules and ensure greater social approval. To ensure that there will be no spending spree in future elections, the best option for Greece is to adopt a constitutional amendment on debt and deficit ceilings, just as Spain did in September 2011, alleviating market pressures, at least for the time being.

Greece needs a fast-track policy for exiting the long recession. The €17 billion of structural funds could be routed immediately to support major infrastructural projects and private investment in export-oriented companies. This growth bazooka should be followed by structural reforms and privatizations to attract private investment as market sentiment is restored. In addition, instilling growth will help to control the debt dynamics and reduce public deficits without ever-rising taxes.

The Greek economy has cumulatively shrunk by nearly 15% since 2008, social tensions are multiplying and the future of Greece in the Eurozone is in jeopardy. Some consider such an outcome as a due punishment for past excesses, while others see it as an escape from further unemployment and recession. Both are illusory. The only viable way out of the current crisis is to restore growth and to adopt a realistic plan for privatizations and reforms. The lesson of the past two years is that the deep recession will otherwise continue to hinder any exiting from the crisis. Greece, and perhaps other Eurozone countries, need a 'corridor of confidence', to use Keynes' famous phrase, in order to put things in order.

Nicos Christodoulakis is a professor at the Athens University of Economics and Business. He was the Greek Minister of the Economy and Finance between 2001 and 2004. During the Greek EU presidency in the first half of 2003 he chaired the Eurogroup and the Economic and Financial Affairs Council (ECOFIN).

Extracted from The New Palgrave Dictionary of Economics March 2012; Palgrave Macmillan.


The End of Isolation, Nicholas J. Cull - an extract

In his article Pearl Harbor and public diplomacy: 70 years on, Nicholas J. Cull explores the British influence on the US entry into World War II.

Last December marked the 70th anniversary of one of the great turning points in international diplomacy: the entry of the United States into the Second World War following the Japanese attack on Pearl Harbor on 7 December 1941. Major US public diplomacy initiatives followed, including the Voice of America and appointment of the first wave of cultural attachés at American embassies. However, the public diplomacy preceding this momentous event also deserves to be revisited: specifically, the subtle campaign undertaken by Britain to wean the United States away from its profound historical attachment to neutrality and to secure its participation in the war.

Re-examined through the lens of contemporary understanding of the importance of public diplomacy and 'soft power', the British campaign now seems like the moment that foreign policy through engagement with a foreign public came of age.

The story begins in 1939 with Britain in trouble. The country faced a war for its existence, knowing that its only hope for survival—let alone victory—lay in securing help from rigidly neutral America. Although the United States had rescued the British 20 years previously in the Great War, the chances of this happening again were slim. The Great War now seemed futile and morally ambiguous, and the United States was in no hurry to be hoodwinked once again. In the days following the outbreak of the new war in Europe, the British wisely rejected a heavy propaganda campaign in the United States. They needed to find some way other than an overt appeal to rally America to their cause. The activities deployed extended across five areas: listening, advocacy, cultural diplomacy, exchange diplomacy and international broadcasting.

Britain certainly listened to American public opinion during the run-up to Pearl Harbor. The government not only paid keen attention to Gallup and other polls, but also established a substantial apparatus to survey the American print media. Britain also listened to a wider range of contacts within American society, working through regional information offices attached to the consular network. The steady supply of reports ensured that the British government dodged some of the pitfalls of American opinion and was able to craft its messages to fit America's willingness to help. Hence, months after the Cabinet acknowledged that Britain needed American belligerence to have any chance of surviving, Churchill was still promising the US public that material aid was all that was needed. 'Give us the tools,' he pledged to the United States in January 1941, 'and we will finish the job.'

Although the British Ambassadors to Washington, Lord Lothian (architect of the strategy of gentle persuasion) and his successor Lord Halifax, spoke often in public, they avoided direct appeals for American aid. They knew that the best strategy was simply to make it possible for Americans to hear Britain's leaders appealing to their own people and trust the Americans to draw the appropriate conclusions. The American media obliged with thorough press coverage and domestic rebroadcast of speeches by the Prime Minister and the King and Queen. The best example of this was the relaying of Winston Churchill's 'Finest Hour' speech, which included plenty of passages aimed squarely at a 'New World' audience. American exhibition of British official documentary films aimed at boosting morale at home worked similarly well. The approach was spot on. Contemporary studies have shown that information that is overheard is given much greater credibility than a direct hard sell: hence the genre of American television commercials in which the point is made by a conversation between two authoritative characters—often doctors discussing a new medication.

Cultural diplomacy was already part of Britain's approach to the United States. There had been a spectacular British contribution to the New York World's Fair of 1939, with a pavilion that included a display of the Crown Jewels and the common anchor of the British and US legal systems, Magna Carta. All American schools received a facsimile and a translation in the mail. An art exhibit showcased Henry Moore and Graham Sutherland, while musical events included the premiere of a special work by Ralph Vaughan Williams (Dives and Lazarus). The 1940 season of the Fair included more art and a poem by T. S. Eliot (The Defence of the Islands).

But once the war was underway, one cultural forum surpassed all others: the British presence in Hollywood. Perhaps it was the excellence of British theatrical training that gave British actors such prominence in the American film industry during its first decade of sound. Ambassador Lothian urged the stars of the era—David Niven and C. Aubrey Smith among them—to stay in place, telling stories that showed Britain at its best. Two great British directors soon joined them in the California sun. Alexander Korda and Alfred Hitchcock both relocated to the United States and began telling pro-British stories. Korda's historical allegory, That Hamilton Woman, which told the story of Nelson's resistance to Napoleonic tyranny, and Hitchcock's anti-neutrality caper, Foreign Correspondent, both did their bit. The process was led by American demand, but Britain's Ministry of Information was happy to help show business allies along. Examples ranged from supplying authentic sound effects for a Broadway play about a theatre during the Blitz, Lesley Storm's Heart of a City, to script advice for MGM's Mrs Miniver, which went on to be one of the most successful films of the war.

The pre-war non-governmental Anglo-American exchange network supplied much of the leadership of the wartime work. The best example was the ambassador Lord Lothian, who as secretary to the Rhodes Trust had overseen the operation of the Rhodes scholarships in the inter-war period. The director of the Ministry of Information's American Department—Sir Frederick Whyte—had earlier headed the English Speaking Union. The network of Rhodes scholars provided a ready-made set of advocates for the British cause, while the reciprocal Harkness scholarship programme, established in the 1920s to expose the best and brightest of the British Empire to the United States, furnished talented and informed Britons ready to interpret the United States for British audiences. They included the BBC's young correspondent in America, Alistair Cooke.

International broadcasting provided the final dimension. Here the British effort was split. First and foremost, the British understood that the most credible voice to Americans would always be American, and thus worked to facilitate American coverage of events in Britain. Edward R. Murrow of CBS became a particular confidant of the government, but the entire American press corps had access to interviews and stories quite beyond anything available to British or Commonwealth reporters. Murrow was eventually allowed to commentate live on the London Blitz as though it were a sporting event—a privilege that brought the sounds of war directly into every American living room.

The second front was the direct broadcasting over the shortwave North American service of the BBC. Programming included talks by J. B. Priestley, whose Yorkshire accent belied the US stereotype of the British 'toff'. Learning that American women were more likely to be isolationist than men, the BBC sought to explain the war to a female audience through the medium of soap opera. Frontline Family—an everyday story of London life in the Blitz—became the first ever soap opera created by the BBC. It was rebroadcast in the United States by the Mutual network.

Britain's radio news strategy was important. Unlike the Germans, the British resolved to tell the truth even when the news was bad (and opened processes such as the calculation of losses during the Battle of Britain to American media scrutiny). The strategy paid off: honesty about the damage suffered during the Blitz built credibility, so that in the years ahead the good news would also be believed.

Britain's public diplomacy strategy required a substantial bureaucracy divided between the Ministry of Information in the United Kingdom and the specially created British Information Services in the United States. The British hit on some of the great staples of persuasive communication, not least an understanding of the 'soft power' of victimhood. The British realized that children have a special potency as guileless innocents caught up in a war that could not be of their making. Images of suffering British children were widely shown—Cecil Beaton's photograph of the wounded child Ellen Dunne was a classic case—and the British persuaded NBC to carry a programme called Children Calling Home, in which children who had been evacuated to the United States spoke over a BBC relay to their families.

Today's communicators speak of the value of social media and the power that comes from relationships with 'people like oneself'. Britain sought to mobilize a social network decades before its digital descendant. There was the British official in New York—Major Berkeley Ormerod—whose job was wandering around making new friends for Britain in the media and urging the old ones to keep in touch, or the genial Irishman Angus McDonnell, who arranged small parties in Washington, D.C. to introduce the rather austere Ambassador Lord Halifax at his best.

More broadly, the British Ministry of Information urged ordinary Britons with American contacts to use their pen-pal relationships to help the British cause, sending out suggestions of useful themes to include in outbound mail. The British were able to create and facilitate networks around their cause—what would now be called civil society or non-governmental organizations—such as labour unions or the aid organization Bundles for Britain, which channelled the American volunteer spirit into the collection of clothes and blankets for shipping to Britain.

But perhaps the most interesting development was the bid to redefine the 'meaning' of Britain. Culture and values are a resource as real as military and economic leverage. The case of wartime Britain reminds us that the soft power audit for any country will include elements that attract and elements that repel. Britain's achievement was to accentuate the positive aspects of the British 'brand' and hit on a plausible story to minimize the drag of the negative.

The obstacles were formidable. The British were the bad guys in the epic of the American Revolution. Americans defined themselves in opposition to Britain: they were classless in opposition to British class-consciousness, republican in opposition to British monarchy, anti-imperial in opposition to British imperialism.

There were positive images in the mix as well, especially the familiarity that flowed from the shared language and literature, but the intimacy could be a mixed blessing. America seemed to get angrier at British missteps than at those of other nations, and the British appeasement of Hitler in 1938 was seen as a massive misstep by many Americans.

Wartime Britain's achievement was to generate a different kind of relevance for the American imagination: a heroic image that gave America something to admire. The moment of transition was the battle for Dunkirk in May 1940. American reporters framed the story as a death and resurrection. The old classist Britain was said to have perished in the fires of the battle for France and a new Britain had emerged with a coalition government and a dynamic new Prime Minister. Americans could retain their cherished stereotypes as true of the past but overlay them with a portrait of the new Britain engaged in a people's war.

Of course, the shift of image would have meant little had not the British people delivered on the claims made about them by the American commentators. The spectacle of all classes working together in the face of the German blitz on London bore out the new narrative. The Americans suppressed incidents that flew in the face of this narrative without being asked. There was no doubting the impact of the David and Goliath spectacle of Britain fighting on against the odds. Hitler had long since given the American people something to hate. Now the British people gave them something to love.

Historians seeking to trumpet the impact of the British campaign are denied an outright victory. It was the Japanese attack that finished the job and pitched the United States into war. Yet that pre-emptive attack is not wholly disconnected from the undeniable shift in US opinion during 1940 and 1941. As Britain's well-considered approach moved American feeling and thereby allowed the sympathetic President Roosevelt to take incremental steps to aid Britain—swapping destroyers for bases; granting lend-lease aid; escorting British convoys—US relations with the Axis powers deteriorated. In the summer of 1941, Roosevelt all but declared naval war on Germany with an order to shoot U-boats on sight, and the American public was in no mood to appease the Japanese in their attempts to secure their holdings in Asia. Resigned to American belligerence sooner or later, the Japanese opted to pre-empt matters and struck first. The rest is history.

All campaigns have their unintended consequences. Some of the positive notions of a new Britishness were swiftly disproved. In the autumn of 1942, Churchill made it clear that his Britain still hoped to retain its Empire, and America's Anglophobia snapped back into play in some quarters. But other elements in the new British 'brand' endured. The BBC's reputation for truth-telling and balance survived to be built on in post-war broadcasting.

Less helpful was the durability of the idea that Britain had resisted the Blitz because of a specific quality of the British people. That idea implied that other people would behave differently under bombardment and inhibited the United States from learning what now seems to be the demonstrable fact: that all humans tend to work together under external bombardment, given a reasonably cohesive government structure around them.

The United States committed lives, material and political capital to the task of inflicting Blitz-style devastation on Germany, Japan and later North Vietnam, Iraq and the Belgrade of Slobodan Milosevic on the assumption that those countries' citizens would somehow behave differently from Churchill's people and crumble under bombardment rather than rallying to their government.

Contemporary communicators can draw many lessons from Britain's campaign against US neutrality. The power of cultivating what would now be called 'the journalism of attachment' has seldom been clearer. The value of foregrounding the experience of children, of working with culture and empowering local partners became obvious, as did the resources that flowed from the pre-existing elite exchange programmes. No less significantly, the entire operation rested on an essential foundation of listening: investing substantial resources in close monitoring of what would now be called 'open sources' and especially the press. Yet the limits of public diplomacy and place branding are also apparent. Britain's messages had to be based on demonstrable facts for the shift of reputation to take effect: sage advice for any era.


We Are All Diplomats Now, Nicholas J. Cull - an extract

In his article Wikileaks, public diplomacy 2.0 and the state of digital public diplomacy, Nicholas J. Cull explores the all-embracing potential for digital public diplomacy.

It happened in November. The world was weary of war and crisis when he stole the headlines. He was charismatic. He was radical. He had a point to prove. He defied years of diplomatic convention and laid the secrets of great power diplomacy before the world. His revelations captured the headlines and shocked the establishment. In laying bare these secrets he proclaimed a new approach to international affairs and – implicitly – the arrival of a new power. Julian Assange? November 2010? No. That vignette describes events in the now distant autumn of 1917 and the actions of Leon Trotsky, then the newly appointed People’s Commissar for International Affairs for the Bolshevik government of Russia.

In November 1917 Trotsky published a number of secret treaties which had been found in the archives of the Tsar in the aftermath of the Russian revolution. In a statement of 22 November 1917 Trotsky argued that: ‘The abolition of secret diplomacy is the primary condition for an honest, popular, truly democratic foreign policy. The Soviet Government regards it as its duty to carry out such a policy.’ The documents that Trotsky published revealed a sorry tale of the backroom deals in which belligerent powers of the entente (Russia, France and Great Britain) promised various concessions of territory to the neutral nations that they sought to draw into the Great War.

Diplomacy swiftly adjusted to compensate for Trotsky’s gambit. On 8 January 1918 the US president Woodrow Wilson replied in kind. His ‘fourteen points’ on which he claimed an equitable peace could be based included ‘open covenants openly arrived at’. Others tried to give the new regime a taste of its own medicine with revelations of collusion between the Bolsheviks and the Kaiser’s Germany. In time the Soviet state learnt the value of secrecy in diplomacy and reinstated the traditional approach, reaching new heights of double dealing in the pact with Hitler of August 1939, but for a season it enjoyed the fruits of its defiance of convention. Trotsky’s revelation was a symbol that the Bolshevik state represented something new in world affairs. It was public diplomacy by leak.

The analogy with 1917 is not wholly academic. The coming (or is it a springing?) of WikiLeaks is just as indicative of a 'game change' as Trotsky's gambit was ninety-three years earlier. As then, while the information itself is important, what is crucial is the context. In 1917 the leak required an earth-shattering revolution. In 2010 all it took to challenge the diplomatic order of the day was a single individual with a well-placed accomplice and a little technical know-how.

Now, technical know-how is at the heart of the revolution in communications technology. WikiLeaks not only required a flash drive and surreptitious data dump to acquire its trove of material, but also needed the facilities of an easily accessible worldwide web to make it instantaneously available. Technology has given one individual the communication power that was the monopoly of the nation state in the previous century.

In the wake of the Trotsky leak the great powers faced a prolonged struggle to reassert their legitimacy and did so in part by shifting to greater openness with institutions such as the League of Nations.

In the wake of WikiLeaks the powers of our own time will have to consider again the dangers of double dealing, and work to ensure that there is a minimal gap between what is claimed in public and what is practised in private. For all its regrettable corrosion of the principles of confidentiality on which so much diplomacy rests, the shadow of WikiLeaks may play the classic role once memorably claimed for an Australian opposition party and thereafter embraced by the investigative press: ‘keep the bastards honest’.

THE FOUNDATION: PUBLIC DIPLOMACY 1.0

The web-based revolution in public diplomacy has been a long time coming. As far back as the late 1960s some public diplomats had been anticipating a golden age of communication made possible by a network of computers. In February 1968 America’s chief public diplomat, the director of the United States Information Agency, Leonard Marks, predicted that a world information grid of linked computers would be ‘a fundamental step toward lasting world peace … The culture of all lands must be circulated through the houses of nations as our technology permits.’

The dawn didn’t really break until the mid 1990s when the Mosaic browser system made it possible for the personal computers which had spread in the 1980s to access data platforms in the rapidly growing worldwide web.

For public diplomats the implications of this were slow to sink in. Web technology where it was used was a platform for press releases and one-way top-down communication. The pride of US public diplomacy was its system for making Voice of America available online, initially in script form but eventually as an audio stream. Journals became available online (cheaper than print) and the so-called ‘wireless file’ anthology of ‘useful’ American news and views which had been sent to embassies since the early 1930s became a website.

Amazing as it sounds now, the terrorist attacks of 9/11 found a number of US embassies still without websites. Other players around the world were more canny, realizing the value of a well-managed online identity. Cyber-image and, by extension, cyber-diplomacy became a tool in the public diplomacy toolbox. What seems to have been largely missed was the shift of power inherent in the new technology. Governments focused on how swiftly they could do what they had always done. Militaries looked to occupy cyber-space as if it were simply the modern equivalent of the prized high ground of old. Treasuries looked to save money by going ‘paperless’. But the new technology meant more than that. It was empowering the individual in a new way.

While the great powers continued (and continue) to broadcast their speeches, press releases and so forth into the ether and across the web, the audience was no longer as likely to listen. Part of the change was rooted in the sheer number of voices suddenly speaking online and the range of choices available. But as the number of websites proliferated it became possible to seek out a source closely matching one’s own sense of identity, and even to develop an identity based on an online connection. Many different imagined communities emerged online built from shared interests. Some had the potential to supplant national identity. Online communities based around radical Islamism were a case in point.

This proliferation of communities had one massive implication for public diplomacy, and that was in the area of credibility. Public diplomacy relies on being credible to an audience, but in this new environment polls revealed that credibility now rested not with the traditional generators of information – governments and news organizations – but with whoever seemed to be 'someone like me'.

THE COMING OF WEB 2.0

By 2004 it became clear that the internet was changing and that a new term was needed to describe the quantum leap from the old world of web pages and email to that of social media and sites based on user-generated content. The English-speaking internet community seized on a term first coined in the specialized literature in 1999, which drew an analogy with the release of a new version of a program: a version 2.0 (two-point-oh). The term Web 2.0 was used fairly loosely to discuss the explosion of user-generated content online, including blogs, the crowd-sourced encyclopaedia Wikipedia (founded 2001), social media sites such as Facebook (launched 2004), photo-sharing sites like Flickr (launched 2004) and YouTube (launched 2005). By the end of 2006 the new trend was sufficiently established for Time magazine perceptively to name 'YOU' as the person of the year – an honour whose previous recipients include a parade of American presidents and world statesmen.

As the web became a domain for user-generated content, a variety of sectors coined variants of the Web 2.0 formulation for their own use, including Library 2.0, Medicine 2.0, Government 2.0 and even Porn 2.0. Public Diplomacy 2.0 – a subset of Government 2.0 – was a late entrant in the field and owed its genesis to James K. Glassman, an American journalist and commentator who served as Under Secretary of State for Public Diplomacy for the second half of 2008. Although Glassman had a relatively short tenure in Washington DC, he was a great enthusiast for the new media. He spoke of a unique opportunity to engage world opinion as never before and challenged the US Department of State to embrace the new technology.

Other ministries around the world underwent similar awakenings. Some were early adopters. In the spring of 2007 the Maldives and Sweden opened the first 'embassies' (or cultural centres) in the 'virtual world' of Second Life (launched in 2003). Many more people read about it in the old newspapers than actually visited online, but that hardly undermined the public diplomacy objective. An energetic Israeli diplomat posted to New York City named David Saranga dealt with the unenviable task of selling his nation's offensive against Gaza in late 2008 by organizing a press conference on the social networking site Twitter (then just two years old). The herd thundered in behind them, but to what effect is still not clear.

THE FACE OF PUBLIC DIPLOMACY 2.0

The essential challenge of the Web 2.0 world is that it enables the preferred source of 'someone like me' to become the principal point of contact for all information. In this regard it is a return to a village environment, where one's key interlocutors and sources were the hundred or so people 'like you' who made up the village. Now each person can gather their personal 'village' of friends in cyberspace without regard for the limitations of geography.

This poses a problem for the public diplomacy agency seeking to utilize new media channels. As each individual’s cyber domain becomes more tailored to their own tastes and settled into a comfortable niche, the intervention of an outsider will seem increasingly incongruous. Both the US Department of State and Department of Defense have digital engagement teams participating in online discussions and ‘correcting’ misunderstandings of their interlocutors. This may be counterproductive if the intervention is judged to be at odds with the identity of the site.

The scale of success is difficult to gauge. The number of friends on an organization's Facebook page became the immediate measure, rather than any consideration of whether real engagement was taking place as a result of the link.

More interesting was the use of YouTube. The Bureau of Educational and Cultural Affairs launched a number of competitions for user-generated films, including a contest for the best short film on the theme of 'Democracy is'. Young filmmakers from around the world took part and their films were seen and circulated online. Winners of the first year's competition included filmmakers from Iran and Nepal. The strategy empowered voices that could be 'someone like me' and hence credible to the audiences that the US really needed to influence.

In a similar vein, James Glassman launched a project to assist young international activists. He drew together a remarkable range of young people, the most famous being Oscar Morales, who in the spring of 2008 had used Facebook to initiate what became an international wave of protests against the FARC guerrillas in Colombia. Their conference resulted in the creation of the Alliance of Youth Movements: a support structure for those seeking to use new technology to transform their world. Its activities include a website with clear instructions on how to set up a blog or social media campaign.

The most elaborate use of the new media was the creation of full-blown joint projects in cyberspace. The State Department funded a remarkable collaboration between a school of architecture in California and one in Cairo, whose students worked together on shared designs. When they eventually met they already knew and trusted each other. It was an indication of what was possible. Yet there have been problems. They are clearest in the US official use of Twitter.

TWITTER

Twitter swept to prominence in 2008. The micro-blogging site seemed to offer an ideal technology for engaging foreign audiences. Its 140-character format required the discipline of brevity but was – by design – short enough to be read on the sort of handheld devices that much of the developing world used to access the internet. The United States and many other public diplomacy actors hurried to be part of the Twitter revolution.

The first problem that the US ran into was the question of exactly how its personnel would conduct themselves online. Would they establish a feed in a formal capacity and use it as a platform to post links to press releases and statements, or would they seek to use the site to present themselves to the world, as a way to humanize US foreign policy?

A notable public diplomat who took the second course was Colleen Graffy, the Deputy Assistant Secretary of State with responsibility for US public diplomacy in Europe, a political appointee with a reputation for stridency on such issues as the defence of conditions at the Guantanamo Bay prison. Her tweets, however, seemed trivial when set against the events happening in the world. An infelicitous message in which she mentioned purchasing a bathing suit in the midst of a meltdown in the Middle East drew particular scorn.

The new-media experts who joined the Obama administration’s foreign policy operation ran into similar problems. Jared Cohen was criticized after he memorably tweeted about a wonderful Frappuccino in Syria in June 2010. The real problem with the explosion of State Department Twitter sites was not their personalization but their neglect of a key dimension of the platform.

The essence of Twitter is that it opens a space not only to speak in 140-character bursts but also to listen in the same way. The State Department has paid close attention to how many people are following its postings but has generally neglected to follow anyone itself. Those who were ‘following’ others – the new technology superstars Jared Cohen, Alec Ross and Katie Stanton – turn out to have been following each other, which is to say other people tweeting in the US new media community, rather than the wider world that they were supposed to be engaging online.

The fixation with ‘broadcast mode’ in US online diplomacy is a major faux pas. It is the equivalent of going into a party, shouting about oneself and leaving: behaviour which is intolerable even if one is buying all the drinks, which the United States no longer is.

The first duty of a public diplomat is to listen, and the new media have an amazing ability to make that listening both easier and more visible. Suppose one of the US embassy Twitter accounts were to begin to survey the online environment and click to follow selected writers and sites in its assigned country. Each of those writers might then receive a message saying ‘Ambassador X or US embassy Y is now following you on Twitter’. This could encourage writers to follow in return, raising the possibility that they would re-tweet an embassy message or two and pass it further along their network with the added boost of their local credibility. It would certainly create a lineup of go-to feeds to help the embassy understand its country, one that could easily be taken up by others in the embassy and beyond: the embassy’s Twitter-roll would be visible to anyone who cared to scroll down and click through. Fortunately, there are some embassies which have realized the potential of Twitter as a tool of listening. The US embassy in New Delhi, for example, is actively following feeds in its region.
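To make the mechanics concrete, what follows is a minimal sketch of that follow-and-listen routine in Python, assuming the Twitter API of the period and the third-party tweepy library; the credentials and writer handles are hypothetical placeholders, not real accounts or feeds.

import tweepy

# Authenticate as the embassy's account (placeholder credentials).
auth = tweepy.OAuthHandler('CONSUMER_KEY', 'CONSUMER_SECRET')
auth.set_access_token('ACCESS_TOKEN', 'ACCESS_SECRET')
api = tweepy.API(auth)

# A curated roster of local journalists and bloggers (hypothetical handles).
local_writers = ['example_local_journalist', 'example_local_blogger']

# Following each writer triggers Twitter's 'is now following you'
# notification - the nudge towards reciprocity described above.
for handle in local_writers:
    api.create_friendship(screen_name=handle)

# The listening half: survey each writer's recent posts so the embassy
# can build its picture of the country.
for handle in local_writers:
    for tweet in api.user_timeline(screen_name=handle, count=5):
        print(handle, ':', tweet.text)

The point of the sketch lies in the second loop: the value of the follow list comes less from the notifications it generates than from the daily reading it makes possible.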

A tweet – like any other piece of information – is welcome to the extent that it is actually of interest to its recipient. There is a great danger that a Twitter feed will become ‘spam’ if it has too much to say on subjects beyond the precise interest of the reader. It is a mistake to insist on one-size-fits-all in a made-to-measure world. Twitter offers the potential for an unlimited variety of feeds and, rather than expecting various diplomats to become providers of wisdom on every subject under the sun, it makes more sense to use discrete feeds to distribute information on discrete issues that will be of relevance to an audience. ‘Tweet the issue’ should be the mantra of public diplomats.

As already noted, the great strength of Web 2.0 is its ability to connect people to others like themselves. With this in mind it is not wholly surprising that the greatest strides in Web 2.0 at the State Department have been internal to the department. Closed sites provide a platform for tasks as diverse as accumulating and disseminating best practice, the construction of a ‘Diplopedia’ wiki with background and policy discussion on particular countries, a mechanism called ‘communities@state’ to bring diplomats together around shared interests, and online sounding boards to feed back to management on ways to improve conditions within the department. Richard Boly, director of the Department’s Office of eDiplomacy, proudly revealed how online suggestions had yielded the brilliant insight that more people would cycle to work if showers were located adjacent to the bike storage.

THE ILLUSIONS OF PUBLIC DIPLOMACY 2.0

The world of communications technology continues to evolve at an exponential rate. Science fact outstrips the science fiction of just a few years ago. At the edge of this vortex of innovation we find the practitioners of Public Diplomacy 2.0 in the foreign ministries of the world typically struggling to pull their risk-averse and information-protective agencies into the new era.

The achievements of Public Diplomacy 2.0 are notable and worthy of scrutiny, but they must not be mistaken for a mechanism for mastering the new environment. To think so would be to confuse a surfer with the wave he rides and to ignore the impact of the wave as it reshapes the shore.

The traditional diplomatic actors are attempting to get their message out and to engage with the world, but their competitors are doing precisely the same – often with the advantage of a local affinity – and the world is in flux, fragmenting and regrouping into new networks.

Secretary of State Clinton has argued that connectivity is an absolute good and has pledged the United States to work to make the blessings of the information society as widely available as possible. But the voters of the United States will have to accept that the voices they empower will be diverse and will include some that are critical and even openly hostile.

RULES TO LIVE BY

How then should practitioners of public diplomacy – large and small – respond to this world of WikiLeaks and the wider Web 2.0 environment? The first step is to acknowledge the transformation of the world, of which this winter’s online shenanigans are just one example. Whether in Tunisia or in Tunbridge Wells, individuals are inherently more powerful than they have been at any time in history, especially as they connect across networks. This global and wired public cannot be ignored, and communication aimed only at its leaders will necessarily fall short.

The new technology opens a frightening prospect of chaos – the response of the diplomatic establishment to WikiLeaks had all the hallmarks of panic – but it also offers the opportunity for a new kind of politics and a new kind of diplomacy.

Communicators must also acknowledge that they cannot be all things to all people. The task of public diplomacy should evolve into one of partnering around issues with those who share the same objectives and of empowering those who will be credible with their target audience. Some nations are recognizing that being seen to help build a network can be a valuable act of public diplomacy in its own right; hence the Swiss government has established its chain of SwissNex offices at strategic locations around the world to connect innovators with one another.

In planning new technology ventures I would propose the following.

Rule 1: Be relevant. Don’t assume that what is important to you will matter to your audience: tweet the issue.

Rule 2: Be cooperative. Look for partners and be ready to pass on messages from others; by the same token, craft your messages so that they are easy for others to pass on.

Rule 3: Know your audience. Understand the ways in which they use social media and be consistent with that culture, as you would if you were physically entering a conversation. Understand the credibility that comes from being ‘like’ your online interlocutor.

Rule 4: Be realistic. Public Diplomacy 2.0 can’t make a bad policy good any more than its 1.0 variety could. The prime need is not to say the right thing but to actually be the right thing, especially in an era of growing transparency.

Rule 5: Listen. Do not let the 1001 new ways to speak that you have discovered online keep you from exploring 1002 new ways to listen. In the old media or the new, public diplomacy begins with listening.

Extracted from Place Branding and Public Diplomacy, edited by Simon Anholt. Volume 7, Number 1, May 2011; Palgrave Macmillan.


The Popular Image of North Africa and the Middle East, Keith Dinnie - an extract

Will the popular image of North Africa and the Middle East change after the Arab Spring? Keith Dinnie examines the possibilities in his article for Place Branding and Public Diplomacy journal.

The recent dramatic upheavals in North Africa and the Middle East have gripped the world’s attention in a way that has unmistakable echoes of the collapse of communism in 1989. What effect, if any, will these developments have on the reputation and image of the countries concerned?

Much will depend on whether real reform occurs or whether the old regimes manage to hold onto power. Should the pro-democracy movements peter out and authoritarian rule return, it is unlikely that there will be any positive change in these countries’ images. Real, radical change is the basis of any improvement in a country’s image.

If democracy does take root in the region, then such a historic shift can be expected to lead to significant changes. Instead of being submerged by a somewhat negative ‘Middle East region brand’ effect, individual countries will begin to assert their own unique identity. Instead of being monopolized by the image of one political leader, countries will be able to project the full richness and diversity of their respective cultures, as Spain has done in the years following the end of the Franco dictatorship in 1975.

Spain’s transition to democracy and its subsequent cultural renaissance paved the way for it to become one of the countries most often cited as an example of a successful nation brand. If real political change materializes, there is no reason why the countries of North Africa and the Middle East should not now follow a similarly positive trajectory. The obvious caveat is that these countries must avoid the post-dictatorship political vacuum and chaos of Iraq.

There are no limits to the creativity with which nations can attempt to project their identity to the rest of the world. On the other hand, the range of uncontrollable image determinants is very wide, running from word-of-mouth and national stereotypes to export brands and the behaviour of a country’s citizens.

Unfortunately for most of the countries of North Africa and the Middle East, country image perceptions held by foreign audiences have been dominated and distorted by politics, whether projected by the personal image of a military dictator or a more diffuse regional image of extremism, terrorism and so on. All the other factors have been overshadowed, resulting in country images that are incomplete, inaccurate and grotesquely skewed in a negative direction.

For Egypt and Tunisia, the situation is redressed to some extent by their tourist industries. Indeed, it is unlikely that many of the foreigners who visited Tunisia in the past twenty years were even aware of the country’s leadership and political regime.

Visiting as a tourist may provide only a superficial impression of a country, but at least it allows personal interaction with locals and the host culture. In the absence of a significant tourist industry, external perceptions of other Middle Eastern or North African countries are mediated to an unhealthy extent by the international media. This phenomenon is exacerbated by the striking absence of alternative image-formation factors such as sports performances or export brands, image determinants that can play a hugely significant role in country image perceptions. The country image of New Zealand, for example, is powerfully amplified by the All Blacks rugby team, whereas the country image of Japan is tightly linked with globally successful corporate brands such as Sony, Toyota and Toshiba. Most foreign audiences, however, would struggle to associate anything comparable with the countries of North Africa and the Middle East.

The closed nature of one-party states tends to be reflected in a lack of support for the promotion of cultural activities. The countries of North Africa and the Middle East lack influential cultural organizations such as Germany’s Goethe-Institut or the United Kingdom’s British Council, both of which play an important role in downplaying those two nations’ imperialistic pasts and in supporting a more cosmopolitan image. This type of soft power projection through public diplomacy has not yet been embraced by most countries in North Africa and the Middle East. The opportunity to do so now beckons, provided that the revolutionary impulse towards more open societies does not fade away.

Extracted from Place Branding and Public Diplomacy, edited by Simon Anholt. Volume 7, Number 2, May 2011; Palgrave Macmillan.


Ireland's Boom and Bust, David J. Lynch - an extract

Few countries have been as dramatically transformed in recent years as Ireland. It emerged as the fastest-growing country in Europe, yet just a few years after celebrating its newly won status among the world’s richest societies it is now saddled with a wounded, shrinking economy, soaring unemployment and ruined public finances. In his book When the Luck of the Irish Ran Out, David J. Lynch offers an insightful, character-driven narrative of how the Irish boom came to be and how it went bust.

For more information about When the Luck of the Irish Ran Out and other titles in the History list visit Palgrave Macmillan.

For much of the twentieth century Ireland was the odd man out in Europe. While other countries rebuilt and modernised, Ireland stagnated. In the 1980s one-third of the population lived below the poverty line. Incredibly, fewer people held jobs in 1987 than had been working in 1926. Ireland was long on charm and short on almost everything that mattered to a modern economy: jobs, roads, telephone lines.

Making a phone call in Ireland required time, patience and a bit of luck. One-quarter of the country’s telephone exchanges were creaky manual museum pieces; one dated to the nineteenth century. As late as 1984 calls routinely failed to connect or endlessly rang busy. And tens of thousands of Irish men and women could only dream of such frustrations. In Greater Dublin alone, the waiting list for a telephone held forty thousand names.

Still, the telephone network was positively futuristic compared to the roads. Highways in modern Ireland were all but unknown. The lack of by-pass roads skirting town centers meant that motorists journeying between any two major cities—say, Dublin and Cork or Waterford and Galway—had to pick their way through interminable local traffic in dozens of small villages. To travel from a town in the midlands to the capital—a distance of perhaps 75 miles as the crow flies—consumed a soul-crushing four hours.

And then, over the span of a decade, everything changed. A sclerotic economy, freed by bold policies and ample investment imported from the United States, roared into a growth miracle dubbed the Celtic Tiger. The culture, too, long dormant under the censorious hand of the Catholic Church, erupted in a fountain of creativity. Even the open wound of Northern Ireland healed, thanks to a peace midwived by American diplomats. Suddenly, the Irish, long on the periphery of global affairs, were at the center of everything. As 1989 dawned, growth was percolating at an annual rate of 5·6 percent versus almost nothing three years earlier. An August 1986 devaluation of the pound effectively cut the price of Irish goods on global markets, giving exporters a boost. Critically, interest rates were on the decline as well, making it easier for businesses to invest in new factories. But the job market remained becalmed: total employment in 1989 was no higher than it had been in 1974.
         
To create an adequate number of jobs, Ireland needed to attract the world’s best companies to its shores. It was a mark of the pragmatism and utter absence of ideology at the heart of Irish politics that Fianna Fáil, architect of the failed protectionist “Little Ireland” model throughout the party’s history, transformed itself into a fierce advocate of free trade.

By the late 1980s, foreign investment, especially from the United States, had brought manufacturers such as Fruit of the Loom, Bausch & Lomb and Digital Equipment to Ireland. But the country remained a minor-league economic player. Then in October 1989, Intel, the US multinational, chose a 55-acre site on a former stud farm in Leixlip, about 15 miles west of Dublin, for its new plant. The Silicon Valley giant was drawn to Ireland by its well-educated, English-speaking workforce, low corporate taxes and generous state grants. The three-phase development promised 2,600 total jobs. The decision gave Ireland a sort of globalization seal of approval, one that elevated a chronically ill economy into a place worth a second look.

A succession of tax-cutting budgets aimed at stimulating enterprise culminated with Fianna Fáil’s election victory in 1997. The standard and top rates of personal tax fell, from 26 percent and 48 percent to 24 percent and 46 percent, respectively, as did corporate taxes, cut from 40 percent to 32 percent. Finance Minister Charles McCreevy made his biggest splash, however, by halving the capital gains tax from 40 percent to 20 percent. McCreevy slashed the gains levy over the objections of his department’s senior professionals, who feared a plunge in revenue. Instead, the government’s take soared: from IR£84 million in 1996 to IR£609·2 million by 2000.
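A rough calculation shows the scale of the shift those figures imply. Assuming the receipts were taxed at the headline rate in each year, the taxable gains declared would have risen from roughly IR£84 million ÷ 0·40 = IR£210 million in 1996 to IR£609·2 million ÷ 0·20 = IR£3,046 million by 2000: a more than fourteen-fold increase in the gains brought forward for taxation, even as the rate was halved.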

Ireland’s robust performance, meanwhile, was beginning to revive fears of inflation. As the country entered its fifth consecutive year of strong growth, there were signs that annual price increases would near 3 percent. The currencies of Ireland’s trading partners, including the US dollar, had strengthened and a tight labor market threatened to push wages up. In the first quarter of 1998, new home prices were up 25 percent from one year earlier. But Irish officials weren’t overly concerned; the rising house prices, they said, could be explained by strong economic growth, favourable demographics—including the annual arrival of thousands of immigrants—and low interest rates. The government concentrated on growing employment while preparing for what it hoped would be an eventual ‘soft landing’ for the hard-charging economy. The chief impediment was the white-hot housing market. In April 1999, the Central Bank sent a letter to all Irish credit institutions reminding them of the dangers of ‘a lending policy that is excessively flexible’. Complicating the policy challenge, the Central Bank was about to lose one of the principal tools for managing an economy: control over its money supply. Ireland was proud to be among the first 11 countries that would participate in the planned single European currency.

Joining the euro meant surrendering control over both the country’s interest rates and the value of its currency to the planned European Central Bank (ECB). Moreover, the process of joining the euro would involve a massive jolt of adrenaline for the already supercharged Irish economy. To converge with Germany, Europe’s dominant economy, Irish interest rates needed to drop sharply.

The economy might be overheating, but at least it was finally producing jobs. By the end of 2000, the number of those working was 40 percent higher than it had been just six years earlier. But the economy was running above capacity. Unemployment was now ‘significantly below’ the level associated with stable prices. Sure enough, inflation in 2000 hit a disturbing 5·5 percent, more than double the eurozone average. Property prices also were getting out of hand. Where once the economy had grown thanks to exporting, it was now deriving three quarters of its forward momentum from domestic demand. That was a sign that the nature of the Irish boom was changing, shifting more toward consumption than production, and that the government needed to either raise taxes or cut spending to cool the economic engine.

The Irish boom was living on borrowed time. The bursting of the internet bubble, rising oil prices, and a synchronized slowdown in the United States, Japan, and Germany all combined to halve global growth, producing ‘the sharpest slowdown in global economic activity in two decades’. As an extremely open economy dependent upon global trade, Ireland was especially hard hit. By the end of 2001, growth had ‘effectively ceased’.

But the Irish were about to get some help from friends in America. To prevent the 9/11 upheaval from capsizing the US and global economies, the Federal Reserve slashed interest rates to 1 percent and kept them there. The decision of the ECB to lower rates from 4·75 percent in 2001 to barely 2 percent two years later gave a massive financial stimulus to Ireland, at times making borrowed money effectively free.

The government poured fuel on the economic fire with its own free-spending ways. The number of workers on the public payroll jumped by 22 percent in just four years. By the late 1990s, public-sector pay was spiralling out of control. In 2000, the government introduced a process called ‘bench-marking’ which was intended to align compensation for government workers with prevailing rates for similar jobs in the private sector. In theory, the new pay system would be coupled with improvements in public-sector efficiency. In practice, the head of the major teachers’ union gleefully compared the results to ‘going to an ATM’. In mid-2002, the first report of the new pay-review board recommended salary increases of up to 25 percent.

Despite a nod to prudence, the 2002 budget was a veritable laundry list of giveaways: more generous old age pensions, fatter child benefits, a 20 percent increase in provisions for free electricity for qualifying households, and even a sharp cut in the betting tax. Along with those goodies came personal tax cuts worth US$568 million as well as reductions in corporate levies valued at US$311 million. In the event, government spending in 2002 would increase 6·3 percent after taking inflation into account, on top of an even more lavish 12·1 percent real increase the previous year. If this were prudence, it was hard to conceive of profligacy. At a time when prices were rising in Ireland at a pace more than twice the eurozone average, the government was stepping on the economic accelerator.

The economy grew by only 2·9 percent in 2002, its weakest performance in a decade, and what growth took place was predominantly in the construction and housing sectors. Easy money from the banks coupled with stimulative government policies encouraged limitless building. In 2002, for the first time in any 12-month period, Irish builders threw up more than 50,000 homes. In 2003, more than 62,000 were built, a record quickly surpassed the following year, when more than 72,000 went up. It was as if the engine of construction, once started, could not be stilled. Despite the supply increase, prices kept rising too. They were up 14 percent in 2003 alone and had roughly tripled since 1996. The pace was insane, clearly speculative and unsustainable. And yet the building frenzy roared on. Buying and selling homes became a national obsession.

In February 2007, housing prices wobbled and then turned down for the first time in a decade. Once property prices started to slide, the Irish economy was like a movie running in reverse. Everything had grown with property. Just as it had once made sense to buy the house today because tomorrow’s prices would be higher, now the smart move was to wait. Prices would only be cheaper next week, next month, or even next year. Once that essential truth took hold, Irish banks were doomed.

The Irish recession that began officially in mid-2008 was the steepest downturn in any advanced nation, far outpacing that of the United States. The Irish housing bubble was three times as big as that of the United States. Real house prices in the United States rose roughly 50 percent in the decade preceding 2006; in Ireland they rocketed 172 percent. So when the bubble popped—Irish house prices dropped by one-third from their February 2007 peak and kept sinking—the damage was commensurately greater. As if to prove the point, Ireland’s output in the fourth quarter of 2009 was nearly 17 percent below its peak production in the same period two years earlier. (Over the same period, US quarterly output fell by about 7 percent.) The number of unemployed jumped quickly from 101,000 at the end of 2007 to more than 267,000 two years later.

Not everything about the Celtic Tiger, however, was illusory. Much in Irish life genuinely has changed for the better. Outside of the People’s Republic of China, in fact, few societies in the closing years of the twentieth century transformed themselves so quickly. The number of Irish people at work was almost 70 percent higher than it had been in 1984. Set against the long sweep of Irish history, that was no small achievement. The world’s best companies, especially in the software and pharmaceuticals industries, now consider the island an important part of their global operations. Irish artists, musicians and poets remained able cultural ambassadors. The influence of the once-omnipotent Catholic Church has receded and, despite a handful of isolated killings involving dissident republicans, the North is at peace.

But if it is wrong to exaggerate the scale of retrenchment amid the global financial crisis that began in 2007, it is equally ill-advised to minimize either the blow that has been absorbed or the challenges that lie ahead. Ireland is not going back to the misery of the 1980s. Neither can it return to the easy affluence of the Celtic Tiger. Gone is the romance of The Quiet Man Ireland of old. Gone, too, is the high-octane, consumption-first ethos of the Tiger. Neither was sustainable. Neither was real. And good riddance to both. The current crisis may put an end to any notion of Irish exceptionalism, but with a little luck, it will leave Ireland on a sounder footing.

For those hoping for a Celtic revival, the greatest misfortune would be if the world economy rebounded so powerfully that Irish elites believed they could stick with business as usual. If the politicians and financiers aren’t compelled by circumstances to adapt, they will not. The world has changed since the 1980s, with the collapse of the Berlin Wall and the rise of new competitors in Asia, eastern Europe and Latin America. The global bar is being set higher, and Ireland must adapt.

Extracted from When the Luck of the Irish Ran Out by David J. Lynch. (Palgrave).

With Clearer Heads and Clearer Lenses, What Might We Learn?, Professor Leif Johan Eliasson - an extract

Do Europeans have anything to teach their American cousins? Most certainly, argues Professor Leif Johan Eliasson of the University of Pennsylvania. In his book America’s Perceptions of Europe he makes the case for a clearer appreciation of European achievements as a way of strengthening the US in meeting the challenges of globalization.

For more information about America’s Perceptions of Europe and other titles in the Politics list visit Palgrave Macmillan.

Americans are told from a young age that they live in the best country in the world, that others are envious and that they can do anything they want because they have all the best schools and technology. By the time youngsters begin high school, let alone college, they have joined their elders in believing that if someone somewhere else is producing a better car or TV, a more sophisticated phone or plane, or carrying out new life-saving surgery, they must in some way be cheating. The American labour unions promulgate their favourite mantra of being able to compete with anyone as long as the playing field is level, only thereafter to espouse a million excuses for other countries’ superior productivity, including dismal labour standards.

Meanwhile, pundits blame foreign governments’ currency manipulation or industrial subsidies, while proclaiming lower American taxes as the solution to all ills. To crown the blame-game, ideological factions whose wealth and prosperity stems from the free flow of goods, services and ideas subscribe to that self-defeating folly called protectionism.

The truth is that most European countries, the northern ones in particular, have succeeded in part by emulating America’s strengths, while, and this is crucial, avoiding its failings; by substituting rigorous curriculum for feel-good education standards and embracing globalization, openness and adaptability as the twenty-first-century way of life.

The quintessential American question—‘what can we do to improve our competitiveness and prosperity?’—might be answered by following the European policy of lowering corporate taxes, removing the burden of health care costs (eight times more costly for an American automobile manufacturer such as Ford than for BMW, Fiat or Peugeot), while improving the delivery of social programmes. A tougher high school curriculum to prepare students for vocational training or college is also essential.

You are probably thinking that there are thousands of bright, talented Bill Gateses, Warren Buffetts and brain surgeons-to-be across America, and you are correct. But young wizards’ enthusiasm fades when they are not challenged, when prevailing norms reflect declining standards. American high school students score lower than half of their European peers on international student assessment tests in math, reading and particularly science (in Europe, Finland comes out top in all three categories). If this trend continues, well-paying jobs will be lost to more competitive environments. Half of the science graduate students at American universities are now foreigners, who are finding better-paying jobs elsewhere and lending their talent to other markets. There are not enough American students interested in science to fill domestic gaps, and while Europe has similar problems, it is now more open to attracting skilled labour than America.

It is true that European start-ups struggle to find venture capital, face more business bureaucracy and have higher first-decade mortality rates than American firms. But university-business research hubs to improve innovation, research, business and competitiveness are popping up across Europe, with good results. Investments are flowing in and businesses are benefiting from more American-inspired, business-friendly bankruptcy laws. Regarding small business regulations, it is clear that Europe is emulating the United States to improve competitiveness.

Fiscal responsibility matters. Twelve European countries, led by the Germans and Dutch and including economic ‘bad boys’ Italy and Greece, save more of their earnings than Americans do, leaving them better able to weather downturns. This provides greater purchasing power without racking up debt and helps explain why, although seemingly overtaxed and underpaid, they import lots from America. American exports bound for Europe rose 60 per cent from 2003 to 2007, and it is not 60-cent rubber ducks in these shipments; rather, high-value goods such as transportation equipment, chemicals and computers topped the list!

Many European countries have lower annual deficits than the United States and also more realistic assumptions of economic growth over the next decade. Admittedly Latvia, Ireland, Greece, Portugal and Great Britain will suffer severe budget problems until at least 2014, but American states such as New Jersey, New York and California (with roughly the same combined population) also suffer humongous deficits, all in addition to the US federal debt. Both Americans and Europeans will have to endure tax increases and spending cuts; yet European citizens will still have health care and schooling for all citizens, while their American peers may not.

European trouble spots include Greece, a laggard in the information society, financial liberalization and sustainable development, and Italy and Bulgaria, where corruption remains the major concern. Italy and Greece also have huge national debts and low birth rates, leaving the land of Parma ham and the cradle of democracy the closest we have to ‘sclerotic’ western European states. Then again, the states of Louisiana, Illinois and New Jersey can give the Europeans a run for their money.

In today’s global economy, where nanosecond transactions zip across the globe, transatlantic interests overlap. The engine of much new and existing business depends in different ways on high-tech components and gadgets in nanoform, as in iPods, laptops or magnets in automobile manufacturing, wind turbines or solar cells. These all depend on access to rare-earth metals, 97 per cent of which are in China. This means Europe and the United States may soon be exchanging dependency on Middle East oil for Chinese metals. This is an area where co-operation and joint pressure against an authoritarian-led monopoly that threatens to hamstring both economies are critically important. While China owns a quarter of US foreign-owned debt, Europe is less beholden to Chinese economic adventures.

Throughout Europe cell phones are cheap, ubiquitous and cutting-edge – the same applies to internet access. Estonia, Finland, Sweden and others have most government services online and available to all citizens. Though getting better, American federal, state and local governments could learn a thing or two about information access and web page organization from their European counterparts.

The costly patent system in Europe is nothing to envy. Years of business pleas to improve competitiveness have led to no more than an agreement to use a common language in applying for patents. A one-stop shop to approve and enforce a Europe-wide patent would lower costs by half. Naturally, the entire European process applies to American firms operating in Europe, so they would benefit as well. Despite these problems, a third of global technology and electronics patents are European, proving that innovation is not lacking.

This extends to the environment. Norway has a 10-year project capturing carbon emissions from factories and sinking them into deep-sea repositories, soon to be extended to Britain. Europe has extensive experience with cap-and-trade initiatives aimed at lowering greenhouse gas emissions. By 2009 twenty-seven American states, with more to follow, were involved in some form of regional trading scheme, so American and European experiences can be mutually beneficial in improving effectiveness, thereby setting global standards.

Europeans are convinced that welfare assistance helps stave off many social ills, and since European welfare systems were institutionalized before the continent became as racially diverse as the United States, and before the globalized economy took off, they are far less likely to be dismantled in the face of economic turbulence or massive immigration. During the 2008–09 recession, when more people needed help with housing, child care and education loans, no one was thrown out on the street or forced into bankruptcy because they needed a kidney but lacked insurance.

American unemployment in 2009 was higher than the European average, and roughly 25 million American citizens were using food stamps. It is worth pondering how the Danes and others combine a flexible, easy-hire-easy-fire job market—ranked as competitive as the American—with extensive social safety nets.

At the same time critics are wrong to dismiss the American welfare system as a mere skeleton, where individualism reigns supreme and inequality and poverty abound. US public assistance is far more extensive than commonly believed even among Americans. Combining private and public spending on social programmes, the United States is a ‘middle of the pack’ country, spending more than Spain, Finland or Austria, but less than Britain, France or Sweden.

International Issues

The strongest possible endorsement of globalization is found in northern and eastern Europe, in countries such as Sweden, Finland and the Czech Republic. Citizens in the five largest European countries and America are less welcoming of free trade, foreign investment and the internationalization of culture and other areas of life; this despite being the greatest recipients and providers of trade and business investment and having invented and driven the system for decades.

Americans’ views of Europeans as less willing to fight are correct. Nation building and peacekeeping remain their forte, and Europe will never match US fighting power. But a transatlantic division of labour, even if unspoken, may be mutually beneficial. Economic reality prevents the United States from having the resources to fight every war. Cuts in new weapons and equipment over the coming decade make the case for multinational production lines and increased co-operation. Both conservatives and liberals have testified that success in war, including the fight against terrorism, requires a ratio of nine-to-one non-military to military means. Yet US commitments in the twenty-first century are just the opposite. This is where Europe’s expertise complements America’s.

In 2004, I attended a debate between two four-star generals, one British, one American. The British general argued that experience in Northern Ireland, Bosnia and parts of Africa showed that confidence-building through community patrols and involvement was the only way to win hearts and minds and sustain long-term peace. The American general countered that this was too dangerous, while adding that Americans should not be involved in nation-building. Interestingly, the successful 2007 ‘surge’ strategy advocated and implemented by General Petraeus in Iraq, and subsequent American community involvement, have mimicked the British line.

Round and Round We Go

We see increased transatlantic harmonization of views and policies, from economics to finance, technology and international conflicts. Leaders on both sides of the pond are moving closer ideologically and responses in various surveys show similar trends. But public distortions continue, as was evident in the 2009 American debate on health care. Calling universal care socialist because it exists in a European country, disregarding all the facts, shows the enduring strength of prejudice.

Transatlantic efforts remain critical to political stability and economic growth across the world. Former German Foreign Minister Joschka Fischer’s argument that ‘Europe is weak and the US is blind’, implying a Europe limited in its ability to back economic power with military force and an America ignorant of cultural forces and the benefits of diplomatic endeavours, is slowly being replaced with greater EU capabilities and a diversification of American foreign policy.

The narrowing of ideological and practical differences is clear in America’s move toward the European model of a larger role for the state, visible in expanding American social assistance programmes and intervention in the economy. At the same time northern and continental European countries have adopted features of Anglo-Saxon capitalism (e.g., freer labour markets and lower business taxes). Furthermore, a 2009 survey revealed that more Americans support higher taxes on the wealthiest citizens than do French, Italian and British citizens. European and American citizens also share corresponding views on terrorism, global warming, energy, Islamic extremism and Iran’s nuclear programme, even if Europe is less concerned about China’s ascendancy.

Europeans and Americans have shared interests. In 2009, roughly 60 per cent of Europeans held favourable views of Americans and the proportion of Americans wanting closer ties to Europe was roughly the same, while suspicion of China’s intentions was rising on both sides of the Atlantic. For Americans it takes time to adjust to not being the sole superpower in all areas (economic, social and military). But America should welcome a strong, influential and competitive Europe that embraces many of the same values Americans hold dear, yet is not afraid to assert its will and push its agenda in ways inherently conducive to capitalism, democracy and prosperity. In an ever more interdependent world, the potential for successful American foreign engagements can only increase when citizens’ perspectives are not clouded by myths, misperceptions and distortions of our closest allies.

Extracted from America’s Perceptions of Europe by Leif Johan Eliasson (Palgrave Macmillan)


The Secret History of Democracy, Benjamin Isakhan and Stephen Stockwell - an extract

Events in the Middle East have raised expectations for a democratic agenda. Benjamin Isakhan and Stephen Stockwell detect signs of an emerging democracy in their book The Secret History of Democracy.

For more information about The Secret History of Democracy and other titles in the Politics list visit Palgrave Macmillan.

The tendency of Western media to emphasize the daily atrocities of post-Saddam Iraq has obscured success stories of Iraq’s fledgling democracy. Yet there is much evidence to suggest a return to a civic culture in Iraq, where the streets have become a locus for deliberation and debate.

Following the fall of the Ba’athist regime, a complex array of political, religious and ethno-sectarian factions formed political parties and civil society movements, many of which have written policy agendas, engaged in complex political alliances and debated key issues. They also sponsor their own media outlets, which are enthusiastically read by a people thirsty for uncensored news, even if it is partisan. This was particularly true in the lead-up to the elections and referendum, when citizens were provided with a rich assortment of information on key policies, politicians and parties.

The subsequent elections saw millions – young and old, Sunni and Shia, Kurd and Arab, Christian and Muslim – risk threats of violence to line the streets, patiently waiting to take part in the first truly democratic elections held in Iraq for many decades. It was the same for the January 2009 provincial elections which saw colourful campaign posters glued to walls all over the country while party volunteers handed out leaflets at security check-points. Other volunteers used more traditional tactics, such as going door-to-door, giving radio interviews or calling public assemblies where ordinary citizens were invited to grill candidates on their policies.

The story of democracy in Iraq begins immediately after the fall of Baghdad in April 2003, when the nation witnessed a series of spontaneous elections. In northern Kurdish cities such as Mosul, in majority Sunni Arab towns like Samarra, in prominent Shia Arab cities such as Hilla and Najaf and in the capital of Baghdad, religious leaders, tribal elders and secular professionals summoned town hall meetings where representatives were elected and plans were hatched for reconstruction projects, security operations and the return of basic infrastructure.

Such moves were initially supported by the occupying forces. But fearing that the people of Iraq would elect ‘undesirables’ such as military strongmen or political Islamists, the United States was quick to quell these drives towards democratization and to exert its own hegemony. Members of the Interim Iraqi Government were appointed by the head of the coalition authority and, at the end of June, all local and regional elections were stopped. Decisions made by local councils were revoked, and the mayors and governors who had been elected by their own constituents were replaced by hand-picked representatives. Not surprisingly, these moves met with opposition across Iraq and prompted some of the earliest protests of the post-Saddam era.

When the coalition attempted to install a puppet government in Baghdad, senior religious figures such as Grand Ayatollah Ali Al-Sistani were able to mobilize thousands of Iraqis to call for a general election prior to the drafting of the Iraqi constitution. Al-Sistani took the unprecedented step of issuing politically motivated fatwas, urging his clergymen into local politics and encouraging the faithful, including women, to vote in elections.

In mid-January 2004, more than 100,000 Shia marched through Baghdad, while a further 30,000 took to the streets of Basra. They called on the US occupation to conduct free and fair national elections.

However, if it was Al-Sistani who had the most impact on the political landscape during the first months of the occupation, it was the younger, more radical Moqtada Al-Sadr who gained both notoriety and political influence in the years that followed. This began when the coalition forced the closure of two publications produced by Al-Sadr: Al-Hawza (the name of a particular Shia seminary in Najaf where a number of leading clerics teach) and the quarterly journal Al-Mada (The View). Both advocated an Islamic republic for Iraq and featured vitriolic attacks on Israel and on the American-led occupation. Thousands of protestors gathered at the paper’s office in central Baghdad vowing to avenge Al-Hawza’s closure. In a twist of irony, it was the forced closure of Al-Hawza, rather than anything printed in its pages, that incited Al-Sadr’s Mahdi Army to violence.

Indeed, throughout 2004 Al-Sadr led several military uprisings against the occupation. These events helped to refine Al-Sadr’s mastery of anti-occupation rhetoric and to distinguish him from Al-Sistani as a strong militant religious leader who had both the strength and the gall to take on the United States. However, when his military campaign failed, Al-Sadr switched to (mostly) non-violent political struggle, with calls for tolerance, national unity and social inclusion, and the transformation of the Mahdi Army from militia to social welfare organization. On the second anniversary of the invasion of Iraq, Al-Sadr orchestrated massive protests in Baghdad. Thousands travelled from all over the nation to attend one of the largest political rallies in Iraqi history.

What was particularly interesting was that Al-Sadr ordered his followers to wave only Iraqi flags, and not flags of the Mahdi Army or of other Shia Arab organizations. This was a self-conscious attempt to move the protests beyond the level of a pro-Al-Sadr, Shia-backed movement, into more of a nationalist struggle against occupation, something which would appeal to Iraqis of all persuasions. In the event, a number of Sunni Arabs attended the Baghdad protests, as well as a small contingent of Iraqi Christians.

These anti-occupation protests have become an annual event. In addition, the followers of Al-Sadr have organized other demonstrations against the lack of basic infrastructure and public services such as electricity, fuel and potable water, against the high cost of ice and against the increasingly bleak employment market.

Following up on the strength of these protests, Al-Sadr has further demonstrated his political instincts and knowledge of democratic mechanisms. For example, in 2005, he instructed his followers to collect the signatures of one million Iraqis in a petition that asked the US and coalition troops to leave the country. More recently, he launched a nation-wide civil disobedience campaign in response to raids on the cleric’s offices and to the arrest of members of his organization. In several key Baghdad neighbourhoods such as Mahmoudiya and Yusufiya, members of the Mahdi Army marched in a show of force, while in Abu Disher the streets were emptied and the stores and schools closed. Then, in October 2008, thousands of Iraqis took to the streets of Sadr City and in the south-eastern province of Missan to object to a new draft of the US-Iraqi Security pact, which would extend US troop presence until 2011. When the Iraqi government ignored the protests and signed the deal, Al-Sadr’s followers reappeared in the streets.

The key reason why the Shia Arab protests have been so effective is that Shia Arabs make up the majority of Iraq’s population. The minorities in Iraq, such as the Sunni Arabs (around 20 per cent), the Kurds (around 20 per cent) and the Iraqi Christians (around 3 per cent), cannot command such impressively large demonstrations. Nonetheless, these minorities have also been able to utilize the power of the streets to air their concerns and advocate political change. For example, the Sunni Arab minority conducted general strikes in resistance to US blockades of Sunni cities. In Ramadi, the entire town shut down for two days as US troops launched a major offensive across the Sunni region. Sunni Arab protests gathered increased momentum as members of the former ruling minority found themselves increasingly ostracized by the Shia Arab and Kurdish dominated central government. In 2005, Sunni Arab demonstrations were held in the towns of Hit, Ramadi, Samarra and Mosul to protest against the new constitution, which had been drawn up without their approval.

In addition, the Sunni Arab population of northern cities such as Kirkuk and Mosul has frequently taken to the streets to protest against what it sees as the Kurdish domination of Nineveh’s regional administration. Most recently, 2008 saw the Sunni Arab population of the Baghdad suburb of Adhamiyah protest against moves by Kurds to incorporate the oil-rich province of Kirkuk into the autonomous Kurdish region.

At around the same time, the Kurds were conducting their own protests regarding Kirkuk. Thousands gathered in cities such as Sulaymanyah, Arbil, Kirkuk and Dohuk after the Iraqi Parliament passed a law that would see a power-sharing arrangement devised for Kurdistan’s multi-ethnic cities. The Kurds have also rallied against the inequities they see across their own region. During March and August 2006, and more recently in August 2008, largely peaceful demonstrations broke into angry protest against the regional governor’s failure to provide basic public services.

Caught in the political and sectarian cross-fire of post-Saddam Iraq, smaller ethno-religious minorities such as the Turkomans, the Faili Kurds (Shiite Kurds) and the Christian minority of Iraq (made up mostly of Syriac-speaking Assyrians and Chaldeans) are often forgotten alongside the three larger ethno-sectarian groups. While they have been the victims of violence and harassment, they have nonetheless been politically active, scoring minor successes in coalitions with the larger groups and with their own political protests. In 2008, hundreds of Iraqi Christians demonstrated across key towns in northern Iraq to express their indignation at not being able to elect their own representatives. They also called for autonomy in their ancestral homeland.

Iraq has also seen a variety of civil movements emerge that are concerned not so much with ethno-religious rights, resistance to occupation or the rejection of state policy as with the plight of ordinary Iraqi citizens – people who demand better working conditions, higher salaries, safer environments and better infrastructure. While many of these protests have occurred in specific ethno-religious areas and have often been organized by a single ethno-religious group, their common element is the people’s struggle for a more inclusive and equitable future. For example, the Iraqi people have repeatedly protested against corruption and nepotism in their local and national governments and called for the resignation of senior officials.

Women’s rights have become a particular concern in post-Saddam Iraq. Iraqi women of all ethnicities and religious persuasions mounted protest campaigns after the invasion in 2003. Women’s rights and social justice activists joined forces in a group known as ‘Women’s Will’, which has organized a boycott of the US goods that have flooded the Iraqi market since the invasion. In June 2005 protests were organized by Islamic human and women’s rights organizations in Mosul to press for the immediate release of all Iraqi women in US custody. So effective was this campaign that the United States was forced to release twenty-one Iraqi women who had been held as a bargaining chip against relatives suspected of resistance.

Iraq has also seen the emergence of powerful workers’ movements. Iraqi doctors, nurses, taxi drivers, university staff, police, customs officers and emergency service personnel have repeatedly engaged in non-violent protests, strikes, sit-ins and walk-outs. They have done so to draw attention to poor working conditions, the pressures under which they work, unfair dismissals, ineffectual government regulation and the dangerous nature of their jobs. The nation’s largest and most powerful independent union, the General Union of Oil Employees, later renamed the Iraqi Federation of Oil Unions (IFOU), began to flex its political muscles in May 2005 when it came out against the privatization of Iraq’s oil industry.

In June 2005, around 15,000 workers conducted a peaceful twenty-four-hour strike, cutting oil exports from the south of Iraq. This was in support of demands made by Basra Governor Mohammad Al-Waili that a higher percentage of Basra’s oil revenue be invested in infrastructure. The IFOU also demanded the removal of fifteen high-ranking Ba’ath loyalists in the Ministry of Oil as well as pay increases for the workers.

In May 2007, the IFOU threatened to strike again, but this was postponed when a meeting with Iraqi Prime Minister Nouri Al-Maliki resulted in efforts to find solutions acceptable to both sides. However, when the government failed to deliver on any of its promises, the oil workers went on strike across southern Iraq. A few days later, the Iraqi government responded by issuing arrest warrants for IFOU leaders. In the face of intimidation the union held firm, taking the further step of closing the main distribution pipelines, including supplies to Baghdad.

These indigenous, localized and highly coordinated movements reveal the strength of the Iraqi people’s will towards democracy. When given the opportunity, they are more than capable of utilizing democratic mechanisms independently of foreign interference. The movements also indicate the degree to which democratic practice and culture are familiar to the people of Iraq. The Iraqi people implicitly understand that, by taking to the streets, they force their government to take their opinions into account. Another important point is that the actions of key religious figures such as Al-Sistani and Al-Sadr contradict the common belief that Islam is incompatible with democracy. Similarly, the protests conducted by the Sunnis, the Kurds and the Christians reveal that Iraqi culture, in its many rich and divergent guises, is open to democracy.

The Iraqi protest movements have revealed the strength of feeling against the United States and its self-proclaimed status as a harbinger of democracy in the Middle East. That the United States was so determined to shut down the original grassroots democratic impetus is also revealing, in that it demonstrates the US administration’s desire to exert its hegemony over the Iraqi people via an installed government rather than to foster and encourage genuine democratic reform. When the United States attempted to sidestep democracy in favour of a puppet government, it was the power of the Iraqi people that set in motion the series of events that led to the formation of an Iraqi government elected by the people in free and fair elections.

While the Iraqi citizenry’s participation in, and engagement with, democratic mechanisms such as elections, an independent press and mass demonstrations do not themselves qualify Iraq as a robust and stable democracy, they are positive milestones towards this end. Specifically, a strong protest culture is not only crucial in re-establishing a participatory and engaged public life but can also help to abate the many conflicts across Iraq and thereby to aid the shift towards a free, egalitarian and democratic nation.

Extracted from The Secret History of Democracy, edited by Benjamin Isakhan and Stephen Stockwell (Palgrave).


The Failure of Democratic Nation Building, Albert Somit and Steven A. Peterson - an extract

Events in the Middle East have raised expectations for a democratic agenda. But as Albert Somit and Steven Peterson show in their book The Failure of Democratic Nation Building, recent experience of democracy building by the US suggests that these hopes may be misplaced.

For more information about The Failure of Democratic Nation Building and other titles in the Politics list visit Palgrave Macmillan.

Taking the oath of office for his second term, George W. Bush promised ‘to seek and support the growth of democratic movements’, declaring that democracy around the world ‘is the urgent requirement of our nation’s security’. He dreamed of transplanting Americanized democracy first in Iraq and then the greater Middle East. This new manifesto penetrated deep into the US military and civilian bureaucracies.

Announced in late 2005, a little-noticed Pentagon directive placed stability operations on a par with combat missions. In another shift toward a democracy-crusading agenda, the US Department of State unveiled the Office of the Coordinator for Reconstruction and Stabilization ‘to help stabilize and reconstruct societies’. Nation building in broken countries became a State Department priority because the ‘security challenges’ are ‘threatening vulnerable populations, their neighbors, our allies, and ourselves’.

In places as far afield as Liberia, Haiti, Ukraine, Kyrgyzstan, Lebanon and the Republic of Georgia, the United States advanced its agenda by ushering out despotic regimes or protesting authoritarian power grabs in fraudulent elections. One of the first tests for America’s initiative came from the former American colony of Liberia, which had endured more than a decade of misrule and barbarity in a multisided civil war. Fuelled by ethnically based rival rebel factions, the conflict engulfed the countryside until capped by an agreement in 1997. As part of the settlement, Charles Taylor, the biggest warlord, was engineered into the Liberian presidency by a dubious election. His ascension brought neither peace nor progress. Liberia soon slipped back into anarchy, as a smouldering second countryside civil war converged on Monrovia. By late summer 2003, the West African nation had become the archetypal failed state, chaotic and impoverished.

The US insisted on a cease-fire among the warring parties and the abdication and exile of President Taylor before deploying troops from a 2,300-strong US Marine task force. Just 200 Marines actually disembarked on August 14, 2003, to supply logistical support to a larger contingent of Nigerian peacekeepers. The exercise represented a less than overwhelming display of US power. But it sufficed to change the regime and led to a democratic election well after the US armed forces departed.

Like Liberia, the Republic of Haiti shared a tortuous history with the United States. During its 19-year US military occupation in the early twentieth century, Haiti had been an American colony in all but name. The Caribbean nation captured more of Washington’s attention in the 1990s when conditions became especially onerous on the island republic. President Clinton’s displacement of the junta to return Jean-Bertrand Aristide to the presidency did not bring a happy ending. After winning a second term in a 2000 election, Aristide incited mobs to intimidate and assassinate political opponents, politicized the police and robbed the government coffers.

The Bush administration cringed at the thought of a militarized intervention. Nonetheless, it grew apprehensive at the prospect of Haitian boat people washing up on Florida beaches. Washington intensified pressure on the defrocked slum priest to resign. On February 28, 2004, Aristide boarded a US military aircraft and made a dawn departure. The United States landed 200 Marines as the lead contingent of international peacekeepers from France, Chile and Canada before a UN force arrived in June. Compared with the violent overthrow of the Afghan and Iraqi dictatorships, Haiti was a velvet-gloved operation that many Bush officials favoured.

Libya, a terrorist-sponsoring rogue state, decided to come in from the cold on the heels of the US invasion of Iraq. Pressures on the Qaddafi regime had mounted as international sanctions stifled oil production. In December 2003, Libya agreed to give up its WMD, ratify the nuclear test ban treaty and open its arms sites to international inspection. The regime also renounced terrorism. In turn, Washington and the United Nations dropped their restrictions on Libyan commerce and travel. But the US refrained from pressing for democracy or even regime change. It swallowed its democracy promotion rhetoric because deposing Libya’s authoritarian rule might play into the hands of Qaddafi’s Islamic theocratic opponents. For American interests, it was a wisely pragmatic choice.

Elsewhere, America’s diplomatic squeeze resulted in political changes that satisfied aspiring democratic populations within each of the countries but did not require a US occupation. On the Eurasian landmass, the three ‘color revolutions’ in Georgia, Ukraine and Kyrgyzstan offered positive outcomes for America’s prudent non-military tack. As the Republic of Georgia approached its November 2, 2003 elections, the US government transferred its loyalties from President Eduard Shevardnadze, the defunct Soviet Union’s last foreign minister, to the opposition. When the Columbia University-trained lawyer Mikheil Saakashvili disputed the election results as fraudulent, his followers took to the streets and seized the parliament building in the so-called Rose Revolution.

Though Russo-American tensions mounted over Georgia, Shevardnadze resigned the presidency and defused the powder keg. Rescheduled elections were held in January 2004, and the pro-Western reformer became president. He demanded the dismantling of Russian army bases, welcomed Western oil companies to construct a pipeline from Azerbaijan across Georgia to Turkey’s seaports and joined other Black Sea states in training exercises with a US destroyer.

Another significant democratic transition occurred in Ukraine. In November 2004, the Bush administration refused to accept the result of a tainted election. Seventeen days and nights of demonstrations followed, sponsored by US and European governments.

The Orange Revolution that led to an election victory for the Western-leaning reformer Viktor Yushchenko in late 2004 owed its nurturing, not its birth, to $58 million spent by the United States in the two previous years to train democratic activists, conduct public opinion surveys, maintain a website and broadcast independent radio news.

In Kyrgyzstan, the United States joined with European governments to fund and tutor the democratic opposition. Washington alone pumped in $12 million to underwrite civil society centres, which trained pro-democracy cadres, disseminated materials and broadcast Kyrgyz-language programmes. It was small money shrewdly spent.

The Kyrgyz pro-democracy movement staged anti-regime rallies in what became known as the ‘Lemon Revolution’ that ousted the repressive President Askar Akayev. That a democratically mobilized population turned out its dictator in an Islamic country encouraged US officials to take heart in their promotion of representative government in Afghanistan, Iraq and elsewhere in the Middle East. But the real lesson was lost: the democracy arose from within the country, not from the imposition of a non-Muslim occupation army, as in Iraq.

It was in Lebanon that Washington boasted of the first regional example of the ‘demonstration effect’. The January 2005 elections in Iraq, according to this view, set off a ‘Baghdad spring’ that rippled across the Near East, particularly in Lebanon. Syrian military and intelligence units had occupied the Mediterranean country since 1976. Sceptics interpreted Syria’s intervention as an attempt to restore Lebanon to Greater Syria as it had been under the Ottoman Empire.

When a bomb killed former prime minister Rafik Hariri, an opponent of Syria’s presence, in mid-February 2005, his countrymen demanded a return of their sovereignty, genuine democracy and freedom from Syrian hegemony in what the international media dubbed the ‘Cedar Revolution’.

Internationally isolated, Syria relented. Parliamentary elections followed, delivering a majority of seats to an anti-Syrian coalition. Washington saw this as evidence that the Bush strategy was bearing fruit. But this reading ignored the fact that the Lebanese had had elections and parliaments long before the US-led invasion of Iraq. Moreover, only months later Lebanon stood at the brink of civil war as the Syrian- and Iranian-sponsored radical Shiite movement Hezbollah (‘the Party of God’) consolidated its political position as a player within the Lebanese government and then provoked conflict with Israel in July 2006.

Lebanon’s largely passive ousting of Syrian rule marked a high point for the Bush White House’s democracy campaign. At the American University in Cairo in June 2005, Condoleezza Rice delivered a direct political appeal to Egypt and Saudi Arabia, two of America’s closest Arab allies, to hold genuine elections, empower women and tolerate free expression. Asserting that democracy does not lead to ‘chaos or conflict’, she added, ‘Freedom and democracy are the only ideas powerful enough to overcome hatred and division and violence.’ The secretary’s claims for democracy went too far, however. It is not a panacea for all the world’s violent and dysfunctional nations. Elections, referendums and elected officials in Iraq and Afghanistan delivered neither peace nor security to their electorates.

Democratic inroads irked Russia, which interpreted Washington’s programme as a means to serve America’s geopolitical priorities. Moscow pushed back by standing behind the dictatorial Aleksandr Lukashenko in his rigged re-election to the presidency of Belarus. The Kremlin also courted Kazakhstan and other former Soviet Republics in Central Asia with the aim of imposing a Cold War-style exclusion of the United States. Strikingly, it joined with China, a sometimes hostile neighbour, in forming the six-member Shanghai Cooperation Organization (SCO), a quasi-alliance. Along with Uzbekistan and Tajikistan, the seemingly pro-American governments in Kazakhstan and Kyrgyzstan also joined SCO, in which Beijing and Moscow promoted regional military cooperation and an ‘energy club’ that invited no membership from the United States. Washington’s democratization, in short, created a Sino-Russian backlash. There were other ominous developments.

While the Middle East witnessed a big shake-up in its political dynamics, the downside was the realization that it enabled regimes hostile to US interests to come to power through the ballot box. Rather than bringing harmony to Turkey, a vital American regional ally, the autonomy of the Iraqi Kurds reinspired the Kurdish minority in Turkey to resume the guerrilla war that had claimed 35,000 lives in the 1980s. In this case, democracy gave rebirth to territorial conflict. Elsewhere in the Middle East, the US-backed democracy push worked against American interests.

From Washington’s perspective, Egypt’s late 2005 election painted a worrisome picture. Its parliamentary contest produced substantial gains for the Muslim Brotherhood, a fivefold increase from its previous showing, despite officially organized harassment of its candidates. In neighbouring Gaza, there was another dramatic example of unintended consequences when, five months after Israel’s disengagement from the territory, the parliamentary elections in January 2006 saw a democratically elected terrorist movement, Hamas, come to power. A spin-off from the Muslim Brotherhood, Hamas won its political victory at the expense of the Palestinian Fatah, the chosen partner of Tel Aviv and Washington. Gazan militants fought each other and fired rockets into Israel.

In reaction to the rise of Shiite political forces in Iraq and Lebanon and to the electoral gains by Islamic fundamentalists in Egypt and Gaza, other Middle East states either slowed their reform process or cracked down on democracy. Qatar postponed parliamentary elections; Bahrain backtracked and imposed a constitution calling for a second appointed legislative house to curtail the elected house’s power; Jordan placed democratization on the back burner; Yemen clamped down on the media; and Syria suppressed the political opposition.

Washington took note. Non-democratic stability and cooperation came back into vogue. Ilham Aliyev, Azerbaijan’s corrupt and autocratic but friendly leader, was welcomed to the White House in spring 2006. The promise of honest elections in Kazakhstan turned sour. But neither state was punished by Washington, which understood their importance as Iran made a bid for nuclear arms and Russia reasserted its influence in the Caucasus and Central Asia. Moreover, Azerbaijan and Kazakhstan were oil exporters.

As a new ‘great game’ dawned between the United States, Russia and China for advantage in this crucial hydrocarbon zone, America had less latitude to advance democracy. Instead, it had to revert to its former policy of accommodating friendly dictators.

Meanwhile, the risk of a two-sided or even three-sided civil war loomed over Iraq despite the acknowledged success of two elections. The Middle East stood apart from other arenas by virtue of its religious-based civilization and unremitting hostility to colonialism, Western cultural penetration and non-Muslim occupying forces. Washington wrongly discounted these factors in Iraq as it clung for too long to the glimpses of voters going to the polls. An election in no way guaranteed an acceptable government strong enough to govern a deeply divided society.

Other threats emerged to confront America’s stabilization goals. Chief among the immediate destabilizing powers is Iran, which oddly enough benefited most from America’s removal of Saddam Hussein and the Taliban regime. Instead of showing gratitude, the Islamic Republic worsened America’s Iraqi predicament by bolstering Shiite insurgents. Like the provocative North Korean nuclear test in October 2006, Iran’s growing nuclear capability and frequent threats to Israel unsettled the Middle East.

As the world’s fourth-largest producer of oil, Iran presented a tough adversary. Its hoard of petrodollars and its untapped crude reserves gave Tehran the financial strength to resist international pressure and to fund proxy wars through terrorist-linked Hamas in Gaza and Hezbollah in Lebanon.

Another major destabilizing element radiating from the Middle East remains Al Qaeda and its clones. The loss of Afghanistan was a setback to these terrorist networks. But they have carved out safe havens in weak states such as Somalia, Sudan and the anarchic belts along the Afghanistan-Pakistan border. In Europe, they have taken advantage of immigrant enclaves and open societies to launch bombings as in Madrid and London. Along with the objective of destroying Israel and putting an Islamic regime in its place, they seek to overturn apostate governments in Muslim countries such as Egypt, Jordan, Saudi Arabia, Yemen, Pakistan and Bangladesh, whether through the ballot box or subversion. At this juncture, the Middle East and parts of South Asia look far less than secure, casting into doubt future US exertions to achieve friendly, harmonious governments through the spread of democracy.

Extracted from The Failure of Democratic Nation Building by Albert Somit and Steven A. Peterson (Palgrave Macmillan, 2010)


Under the Shadow of Defeat, Karine Varley - an extract

Karine Varley's book Under the Shadow of Defeat is the first wide-ranging analysis of how memories of the Franco-Prussian War shaped French political culture and identities. Examining war remembrance as an emerging mass phenomenon in Europe, it sheds new light on the relationship between memories and the emergence of new concepts of the nation.

For more information about Under the Shadow of Defeat and other titles in the history list visit Palgrave Macmillan.

ONE HUNDRED AND FORTY YEARS AGO

Aftermath of the Franco-Prussian War

In the wake of military defeat, nations have a talent for reinventing themselves. The pain of transition from humiliation to self-respect can be softened by a thick layer of historical reinterpretation, a retelling of the story that minimizes blunders and builds on myth. By way of example, Germany after the Great War immediately springs to mind, but in a revealing study of 1870–71, ‘l’année terrible’, when proud France was crushed by the upstart Prussia, Karine Varley (Under the Shadow of Defeat: The War of 1870–71 in French Memory) shows that a recognizable pattern of self-justification and myth creation was well in place by the end of the nineteenth century.

It is one of the paradoxes of nineteenth-century France that while most of the country wanted to forget the Franco-Prussian War, it gave rise to one of the greatest waves of commemorative activity the nation had ever seen. Within only seven years of the war ending, some 460 memorials had been erected while crowds of thousands faithfully honoured the anniversaries of the defeat. In 1899 one journalist complained that there were more German corpses in French paintings relating to the war than there had ever been lying on the battlefields.

If Bismarck had been looking for an occasion to launch war against France, France in turn had been looking to cut a rising Prussia down to size. Few in France doubted that the war would see a rapid French victory. Yet within a few days of the outbreak of war, French forces were forced onto the defensive, suffering heavy losses. By the end of the month, they had been pushed back to the two cities that were to symbolize the ruin of the Second Empire: Metz and Sedan. At Metz, Marshal Bazaine allowed his men to become encircled, only to surrender two months later along with 137,000 men of the Army of the Rhine. The battle at Sedan on 1 September was a disaster waiting to happen; encircled, exhausted and demoralized French forces faced an enemy twice as numerous. Physically drained by ill health, and having lost all hope of victory, Napoleon III surrendered 83,000 men, 6,000 horses and himself.

News of the defeat at Sedan brought insurrection in Paris, the overthrow of the Second Empire and the proclamation of a republican Government of National Defence ready to consider peace but refusing to surrender one inch of French soil. With Bismarck continuing to demand Alsace-Lorraine, German forces began to march towards Paris to begin a siege that was to last until 26 January 1871. As food supplies in Paris ran out, the French government sued for an armistice.

The peace divided the country, with some of the major cities, including Paris, calling for the war to continue. The government’s decision to relocate to Versailles, rather than return to the capital, fuelled suspicions that reactionary rural elements were deliberately targeting republican and left-wing Paris. Socialists and left-wing extremists backed by radicalized guardsmen, artisans and workers rose up to proclaim a new Paris Commune. Clashes between forces supporting the Commune and the reconstituted army based at Versailles on 2 April marked the beginning of the civil war that was to bring violent suppression and around a further 22,000 dead.

Had the fighting ended at Sedan it might have been written off as the collapse of the Second Empire, but because the war had resumed under the republican Government of National Defence, it became a defeat not only of the regime but of the nation as well. The defeat cast a dark shadow over France’s political and cultural development. After the humiliation and anger came soul-searching and a widespread conviction that something must have been fundamentally rotten at the very core of the nation. It was a time when every political, cultural, religious and social group competed to offer their own panacea. The war dead lay at the heart of ideas on the regeneration of France. The post-1871 cult of the fallen placed unprecedented emphasis on the mass of common soldiers, invoking their patriotic self-sacrifice to lift and unify the nation after its collapse.

In the period between 1871 and 1873, when France was under occupation and still reeling from the suppression of the Paris Commune, a wave of memorial building spread across the nation, concentrated particularly in the areas directly affected by the war. After 1873 the state’s burial of all fallen soldiers on French territory triggered bellicose nationalism and the construction of further monuments. Then the political fault lines shifted again. Revenge became secondary to tensions between radical and moderate republican memories of the war and expressions of Catholic patriotism. The early 1900s inaugurated a new phase in the construction of war memorials, inspired partly by the effects of the Dreyfus Affair, the nationalist revival and fears of the rise of socialist internationalism.

Between 1871 and 1914, military painting overshadowed impressionism for a public eager to consume images of patriotism and heroism. Reproductions in the illustrated press, postcards and prints furthered the dissemination of paintings relating to the war and the recovering army, at once responding to and fuelling a market for images of patriotism. Popular literature presented the war as a test of moral strength. As with military art, literature portrayed real and fictional tales of glory rather than the wider picture of collapse, transforming the war into an adventure story with francs-tireurs as the intrepid heroes. Spies, barbaric Germans and precociously patriotic children filled the pages of novels.

The republican consolidation of power ushered in a wave of reforms designed to instil patriotism within the population. Maps of France were introduced into classrooms so that pupils would develop a sense of national belonging; they featured Alsace-Lorraine shaded in a different colour to Germany to sustain hopes that the annexation was only temporary. Gymnastics became compulsory in schools in 1880, while in July 1882 the government introduced school battalions to improve the physical fitness of the nation’s future soldiers. Recreational gymnastics, rugby, rifle and military education societies sprang up across the nation, their members seeking to reverse the physical and skills deficiencies of French soldiers in 1870. Even the Tour de France, which began in 1903, perpetuated revanchist memories of the war. During the period 1906–11, the race crossed into Alsace-Lorraine, allowing spectators the opportunity to voice their feelings on the annexation as they lined the route singing the Marseillaise. Music hall performances often harked back to the war and to aspirations of revanche, with song sheets easily available to purchase from street vendors. There was no escape even on holiday. Tourists were encouraged to forgo the pleasures of the seaside in favour of a fortifying trip to the battlefields of 1870–71.

For some historians, the military parade at Longchamps on 29 June 1871 symbolizes the magnitude of French delusions. The army marched for four hours before crowds of around 9,000. Newspapers reported the event as a tremendously uplifting boost for the nation, nothing less than the beginning of the national recovery. Coming a mere six months after the war had ended, at a time when eastern areas remained under German occupation, the parade appears somewhat premature, inaugurating a cult of the army that seemed bizarrely impervious to its recent collapse. Yet such interpretations result from separating memories of the war from memories of the Paris Commune. The army at Longchamps was thus not the army that had been routed by Germany, but rather the Army of Versailles that had successfully defeated the insurgency.

The Catholic Church articulated a very clear theological explanation for the recent misfortunes; it defined republican experiments as a betrayal of the nation’s divinely ordained mission as eldest daughter of the Church and an assault on the very heart of French national identity. Whereas republicans traced their vision of the nation back to the Revolution of 1789, Catholics traced theirs back to the fifth century. Thus not only was Catholicism the religion of the sovereign for over 1,300 years, but it was a defining element in the creation of the nation as well. There was a widely held view that France had been divinely selected to perform God’s will and that any deviation from this vocation would incur due punishment. Thus in the eyes of the Catholic Church, cataclysmic events such as wars and revolutions were not the products of shifting political and social forces but God’s punishment for national infidelity.

If republican claims over the character of the nation were rooted in more recent French history, republicans nonetheless laid claim to a deeply rooted concept of the nation. Having been called upon to defend the revolution in the call to arms of 1792, the French people were no longer subjects but citizens who had earned themselves a stake in the French nation. While the Catholic Church maintained that the cult of the dead provided confirmation of immortality, republicans held that immortality was achieved and sustained through the cult of the dead. They could not offer the reward of eternal life, so they offered eternal memory instead.

The most spectacular and striking aspects of the republican campaign were in the cultural sphere. The creation of Bastille Day as a national holiday in 1880 mobilized communities across the nation in an overt attempt to redefine France along republican lines. The celebration suggested that if storming the Bastille had brought political and social liberation, then moral and spiritual liberation could only be achieved once the ‘clerical Bastille’ had been demolished.

The construction of the Sacré-Coeur basilica at Montmartre was one of the most prominent manifestations of the rivalry between the Church and the republican state. Even during its construction, it represented an aggressive and unambiguous assault on republicanism, one that culminated most strikingly in the erection of a luminous cross on top of the basilica’s scaffolding on 14 July 1892. Visible across much of Paris, the basilica rivalled republican architectural symbols such as the Eiffel Tower, which was built for the centenary of the Revolution, and of course the Panthéon.

Every political, religious and social group hunted for scapegoats to exonerate themselves from responsibility for the defeat. A moderate republican interpretation of the war sought to constitute itself as the dominant memory; yet because its parameters were so narrow, excluding the experiences of the Second Empire and Paris Commune and insisting upon an entirely glorious image of the army, they were easily infringed by all those who sought to challenge this political vision of the nation.

The myths of the French resistance in the Second World War and their role in the restoration of national pride are now a familiar subject of enquiry; much less, however, is known of the representations of resistance and martyrdom that emerged with the Franco-Prussian War. The two concepts embody an idea of the nation as one of ideas, not aggressive military might, and of the resilience, patriotism and intelligence of the people. In their democratic qualities, the concepts are implicitly republican, but in their glorification of suffering, they are also implicitly Christian. Notions of a racial and cultural German ‘other’ served further to reinforce this newly created self-image.

Extracted from Under the Shadow of Defeat: The War of 1870–71 in French Memory by Karine Varley (Palgrave Macmillan, 2008)